
Post results table for popular promise implementations #40

Open
2is10 opened this issue Oct 4, 2013 · 8 comments

Comments
@2is10

2is10 commented Oct 4, 2013

How about running the test suite on popular promise implementations and posting the results in a table format like a kangax compatibility table?

This would help developers choose between promise implementations and perhaps educate/motivate library maintainers.

@joonhocho

+1

@domenic
Member

domenic commented Oct 6, 2013

Since passing is a binary yes/no proposition, we simply add passing promise libraries to https://github.com/promises-aplus/promises-spec/blob/master/implementations.md (usually via pull request). That list might not be completely up to date since the 1.1 spec bump (and the accompanying 2.0 test-suite bump), so re-testing those libraries would be a good idea, but the concept remains the same.
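
For reference, re-testing a library against the 2.0 suite means pointing the promises-aplus-tests package at a small adapter file. A minimal sketch, assuming a hypothetical my-promise-lib module whose MyPromise constructor takes the usual (resolve, reject) executor:

```js
// adapter.js: the shape promises-aplus-tests expects from an adapter
var MyPromise = require("my-promise-lib"); // placeholder for the library under test

module.exports = {
  resolved: function (value) { return MyPromise.resolve(value); },
  rejected: function (reason) { return MyPromise.reject(reason); },
  deferred: function () {
    var resolve, reject;
    var promise = new MyPromise(function (res, rej) {
      resolve = res;
      reject = rej;
    });
    return { promise: promise, resolve: resolve, reject: reject };
  }
};
```

Running `promises-aplus-tests adapter.js` then reports pass/fail per spec section, which is the kind of detail a results table would aggregate.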

@briancavalier
Member

I agree with @domenic. It seems like this might encourage people to choose between "partially broken" implementations, and even listing implementations that aren't fully compliant might indicate that we endorse partial compliance. Our goal with this spec and the tests is for implementations to be fully compliant.

That said, one thought (which literally just popped into my head as I wrote the above) is that we could encourage people to include their Travis badge in the table. I'm betting that most libs already display it via their README, so it's only one click away, but figured I'd throw it out here for discussion. Thoughts?

@domenic We should probably set some sort of grace period for 1.1 spec compliance.

@wizardwerdna

Indeed. I think the best plan would be to keep two lists simultaneously for a grace period: retain the present list, stated as being verified for 1.0, and add a list on top of it for those currently verified to comply with 1.1. The understanding is that the bottom list will be deleted by some time certain.

If you want to keep a single list, put an asterisk by all libraries only verified for 1.0, stating they will be removed unless verified with the new suite by such-and-such a date.


@2is10
Author

2is10 commented Oct 7, 2013

I see. Just for the record, my motivation for suggesting this was that I was uncertain whether jQuery 2 passed, and it would have saved me time to have found it listed as “known failed” on your website. I also would have found it very interesting to see which test cases failed, to know what to avoid if I’m stuck using jQuery 2. Finally, upon discovering that jQuery 2 (or whatever library I happen to be using) fails, having a shareable link that lists exactly which tests it fails would save me, or any other library user, time when raising the issue with the library’s authors.

So I’d ask you again to consider listing failed implementations with their test results somewhere. Feel free to put it in a different section or on a different page and preface it with whatever disclaimer / caveat language you feel is necessary to accurately describe your position on failing libraries.

If you decide you don’t think this would be helpful, then feel free to close this issue. Thanks.

@lbdremy

lbdremy commented Dec 4, 2013

Yes, it would indeed be really nice to have that.

With this gist, the test suite for Q does not run correctly; it seems to get stuck in the middle, and I wonder why. Any explanation? By contrast, this When adapter just works.

I have done something similar for jQuery in this gist, but most of the tests fail. Any idea why? Is the promise support really that bad in jQuery 2.0.3?

Thanks
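
For what it’s worth, a jQuery adapter for the suite would look roughly like the sketch below. This assumes `$` is available (jQuery 2.x needs a browser window, or something like jsdom, to load under Node), and since jQuery has no spec-shaped resolved/rejected helpers, only `deferred` is provided:

```js
// adapter.js: minimal jQuery Deferred adapter sketch for promises-aplus-tests
module.exports = {
  deferred: function () {
    var d = $.Deferred();
    return {
      promise: d.promise(),
      resolve: function (value) { d.resolve(value); },
      reject: function (reason) { d.reject(reason); }
    };
  }
};
```

Even with the adapter wired up this way, a large number of failures is expected; see the next comment for why.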

@domenic
Member

domenic commented Dec 4, 2013

jQuery is not a Promises/A+ compliant implementation; its "promises" aren't really promises. See http://domenic.me/2012/10/14/youre-missing-the-point-of-promises/

Q has not been updated for Promises/A+ 1.1 support, although it is Promises/A+ 1.0 compliant.
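
To make that concrete, here is a small sketch of one divergence (assuming Q's `Q.fcall` entry point and jQuery 2.x Deferred behavior; the jQuery half is left commented out because it needs a browser window):

```js
var Q = require("q");

// Promises/A+ 2.2.7.2: if onFulfilled throws, the promise returned by then()
// must reject with that error. Q behaves this way:
Q.fcall(function () { return 1; })
  .then(function () { throw new Error("boom"); })
  .then(null, function (err) { console.log("rejected, as required:", err.message); });

// jQuery 2.x's then() does not catch the throw: the exception escapes synchronously
// and the rejection handler never runs.
// $.Deferred().resolve(1).promise()
//   .then(function () { throw new Error("boom"); })
//   .then(null, function (err) { /* unreachable in jQuery 2.x */ });
```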

@lbdremy

lbdremy commented Dec 4, 2013

Thanks @domenic

781 failing tests seemed like a lot, so I thought something was wrong with my adapter for jQuery.
