How should Discourse handle GitHub pull requests?


(Jeff Atwood) #1

We’ve gotten some fantastic GitHub pull requests from the community, and we look at them all closely.

We are prioritizing support of those developers willing to actively contribute to the Discourse code base. Because open source!

However, we have had a few problems where we pulled in stuff that broke Discourse in various ways.

We’d like to improve our GitHub process so we:

  • Are timely in responding to pull requests

  • Give good feedback to pull requests, whether we end up using the code or not

  • Do not pull in things that accidentally break Discourse

We’ve already incorporated some community-suggested changes to help us better process pull requests.

What else can we do, and how can we improve in this area?


(Jeff Atwood) #2

I believe we should be a bit stricter about accepting pure CSS and HTML changes. Before accepting, we should ask the author of the pull request: have you tested these changes in…

  • IE?
  • Firefox?
  • Chrome?
  • Safari?
  • your smartphone?
  • whatever tablet device you have available, if any?

I am not suggesting that every person who sends in a CSS pull request go to the nearest Apple Store first to buy testing devices, but I think it is reasonable to expect a good-faith effort to test CSS and HTML changes in the most common browsers and whatever smartphone / tablet the contributor has ready access to.

We need your help on these HTML and CSS changes. We are happy to help test, but it’s not fair to put all the burden of testing on us.


(Jeff Atwood) #3

This almost kinda goes without saying, but… whenever we do end up merging a pull request that breaks Discourse, we need to think long and hard about whether we can add tests that would have prevented the break.

I am a fan of unit testing, but I am also a pragmatist and realize it’s not feasible or practical to have tests for everything in the system, nor is 100% code coverage even a goal worth having.

However, for those areas in the code that are

  1. Critical to basic forum operation (login, posting, reading)

  2. Easy to test

  3. Frequently broken, either by ourselves or others

We should redouble our efforts to add tests in those areas.

Whenever you are tempted to type something into a print statement or a debugger expression, write it as a test instead.
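For instance, here is a minimal sketch of that habit in Ruby with Minitest (the slug helper and its name are made up for illustration; it stands in for whatever code you were about to poke at):

```ruby
require "minitest/autorun"

# Hypothetical helper; a stand-in for the code you were about to
# inspect with a print statement or a debugger expression.
def title_to_slug(title)
  title.strip.downcase.gsub(/[^a-z0-9]+/, "-").gsub(/\A-|-\z/, "")
end

class TitleToSlugTest < Minitest::Test
  # The expectation you would have eyeballed in a console once,
  # captured so it runs on every build instead.
  def test_strips_punctuation_and_collapses_spaces
    assert_equal "hello-world", title_to_slug("  Hello, World! ")
  end
end
```

The one-off check costs about the same keystrokes either way, but only the test version keeps protecting you after the debugging session ends.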


(Jeff Atwood) #4

Also, we are in the process of writing a CLA bot. This is kind of specific to Discourse, but every potential contributor should sign the CLA before we can even look at their code.

http://www.discourse.org/cla

The CLA bot will look at each pull request and check whether the submitter’s GitHub username is on the list of CLA signers in Google Docs. If not, it will automatically post a message asking them to please sign it.
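A minimal sketch of that check in Ruby (the signer list, usernames, and message wording here are all illustrative; the real bot would fetch the list from the Google Docs spreadsheet):

```ruby
# Usernames from the CLA spreadsheet (hard-coded here for illustration).
SIGNED_USERNAMES = %w[codinghorror SamSaffron].freeze

# Returns the comment the bot should post on the pull request,
# or nil if the author has already signed.
def cla_comment_for(pr_author)
  return nil if SIGNED_USERNAMES.any? { |u| u.casecmp(pr_author).zero? }

  "Thanks for the pull request! Before we can review it, " \
  "please sign the CLA at http://www.discourse.org/cla"
end
```

The case-insensitive comparison matters because GitHub usernames are case-insensitive, but people rarely type them the same way twice.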


(Régis Hanol) #5

Could that maybe be automated using http://testling.com?


(Donnie Berkholz) #6

Isn’t that what Sauce Labs is for? In particular, Open Source Software Testing | Sauce Labs


(Alexander) #7

I had been thinking about this today – projects I’ve worked on at $dayjob have both JS tests (a mixture of functional and unit) and integration tests (acting out use cases). It doesn’t look like Discourse has either. If I am trying to fix a bug in the JS app, where do I start testing?


(Daniel Watkins) #8

If you want to get really crazy with it, you could spin up a new (sandboxed) Discourse instance with the PR code running to allow crowd-sourcing of testing.

Probably not especially feasible (particularly given the 1GB minimum memory constraint).


(Jeff Atwood) #9

Isn’t this what http://try.discourse.org is for, though? We’ve gotten lots of great test results from there!


(Daniel Watkins) #10

Unless I’ve misunderstood, try. is running master (or at least merged code)? I was proposing spinning up an instance for PRs, to allow testing of the code before it’s merged. Probably not every PR, but for large changes (particularly UI changes), it could be extremely useful.

On the other hand, it’s better to ask forgiveness than it is to get permission, so this may be overkill. :wink:


(Sam Saffron) #11

Keep in mind we already have Travis set up; I think it may have this kind of stuff built in.


(Alexander) #12

Is there anything in the test suite that looks at actual response HTML, or JavaScript integration?


(Sam Saffron) #13

Working on that today; going to automate it with PhantomJS.


(Alexander) #14

Great!


(Daniel Watkins) #15

Travis presumably doesn’t do the sort of “spin up a publicly-accessible VM for longer-term testing” stuff that I was talking about, so it would be difficult to use it for Selenium tests (for example).


(sparr) #16

So many web developers aren’t aware of tools like http://browsershots.org for testing across many platforms.


(sparr) #18

This is the first I’m hearing that Discourse uses a CLA, and I am quite saddened by it.


(Kevin P. Fleming) #19

We have an open source GitHub-attached CLA Manager that will be released in the next few weeks; if you can hold tight, we may save you quite a bit of work.


(Jeff Atwood) #20

We already have a bot which checks that GitHub pull requestors have filled out this form:

http://www.discourse.org/cla

You’ll notice it in action on the pull requests.


(Jeff Atwood) #21

Can you be more specific than “saddened”?