I now have a pretty robust and dynamic test generator for XQSuite tests that are executed on the server and then "parsed" by Mocha; see generators/app/templates/tests/xqs/xqSuite.js. Should we include that here?
The only thing we would need to adjust is adding a method to provide the location of the XQSuite runner (and/or suite) file.
The downside: it would add Mocha as a dependency.
The upside: you could run XQSuite tests from inside node-exist.
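For context, the pattern in question is ordinary dynamic test generation in Mocha: run the suite on the server, then register one Mocha test per reported testcase. A minimal sketch, assuming Mocha's `--delay` mode and a hypothetical `loadReport()` helper that fetches and parses the server-side report (none of these names are taken from the actual template):

```js
// Minimal sketch of the generator idea, not the actual xqSuite.js template.
// Assumes mocha is started with --delay (so the global run() is available)
// and a hypothetical loadReport() helper that runs the suite on the server
// and resolves with [{ name, tests: [{ name, failure }] }].
const assert = require('assert')
const { loadReport } = require('./load-report') // hypothetical helper

loadReport().then(suites => {
  for (const suite of suites) {
    describe(suite.name, function () {
      for (const test of suite.tests) {
        it(test.name, function () {
          // a failed XQSuite test carries a failure message; surface it here
          assert.ok(!test.failure, test.failure)
        })
      }
    })
  }
  // with --delay, Mocha waits until run() is called, so the dynamically
  // registered suites above are picked up
  run()
})
```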
I do like your approach - it is also very similar to https://github.com/line-o/xbow/blob/master/test/mocha/xqSuite.js. This would add Mocha as a dependency, too.
I do like your approach of directly running the test runner instead of uploading it. My version tests more metrics, though. I think we should pick the best of both.
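To make "directly run the test runner instead of uploading it" concrete: the standard XQSuite entry point can be sent as an inline query from node-exist rather than stored in the database first. A rough sketch, assuming node-exist's `connect()`/`queries.readAll()` as shown in its README and an illustrative suite path:

```js
const { connect } = require('@existdb/node-exist')

// illustrative path to a stored test module; in the generator this would come
// from the proposed "location of the XQSuite runner (and/or suite) file" option
const suitePath = '/db/apps/my-app/tests/test-suite.xqm'

// the standard XQSuite entry point, sent inline instead of being uploaded first
const runner = `
  import module namespace test="http://exist-db.org/xquery/xqsuite"
    at "resource:org/exist/xquery/lib/xqsuite/xqsuite.xql";
  test:suite(inspect:module-functions(xs:anyURI("${suitePath}")))
`

const db = connect({ basic_auth: { user: 'admin', pass: '' } })

// test:suite() returns a JUnit-style XML report; the result shape here follows
// node-exist's README and may need adjusting to the client version in use
db.queries.readAll(runner, {})
  .then(result => console.log(result.pages.toString()))
  .catch(console.error)
```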
Yes, great minds think alike … I have these three fixtures of different JUnit reports I encountered (and slimmed down) in the wild. Most of my work went into making sure that all three kinds of runs are reported properly and that repeated runs stay consistent.
How does your code deal with repeated runs, and with those variations in report format?
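The three fixtures themselves are not reproduced here, but JUnit-style reports typically vary in whether a failing testcase carries a `<failure>`, an `<error>`, or a `<skipped>` child, and in where the message lives; normalizing that early is one way to keep the generated tests consistent across repeated runs. A small, purely illustrative sketch (the testcase shape is hypothetical):

```js
// purely illustrative: `testcase` is a hypothetical already-parsed object,
// not the shape used by either implementation
function normalize (testcase) {
  const problem = testcase.failure || testcase.error
  return {
    name: testcase.name,
    skipped: Boolean(testcase.skipped),
    failed: Boolean(problem),
    // the message may live in an attribute or in the element text,
    // depending on which producer wrote the report
    message: problem ? (problem.message || problem.text || '') : ''
  }
}
```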
@line-o what do you think?