QA checklist for when new versions are released?

Does anybody have a QA checklist they’re willing to share that they use when testing a new release? Our QA team created their own unofficial QA checklist back when we used vBulletin, adding to it as they tested or when a new plugin was installed, but Discourse does many things differently. Rather than start all over with a brand new list, perhaps you all can make this task easier for us. Crossing my fingers.

Thanks in advance!

2 Likes

Log in
Create a new topic
Reply to existing topic

3 Likes

Excellent list, very extensive. Thanks!

This is what we use at SP to test our upgrades.

The Site Look Feel section is there because we have a custom theme via a plugin; we like to make sure the plugin is still functioning as expected.

5 Likes

This is great, thanks for sharing! We too have a custom front end, so the Site Look Feel section fits us perfectly as well. Great to see what others are doing with QA. Thanks again!

I know you are being a bit sarcastic here, and that is OK; it was indeed a short answer :wink: But I actually believe those three actions are the most important and essential ones to test, beyond “does the site actually load in a browser”.

2 Likes

I appreciated your short test. And as a developer who constantly works with QA, I completely agree that those are the most essential items for any Discourse instance.

The one I posted primarily focuses on a moderator’s perspective: can I continue to perform the duties I carry out on a weekly basis?

2 Likes

Haha, I should have added a smiley face to my reply to lessen the sarcasm. Feedback much appreciated. We run some user account integration with WordPress and custom user management in general, so our QA has to cover some non-standard processes related to anything within the user registration/verification process and user account status changes (being assigned a mod/admin role, deletion as a spammer, account anonymization, impersonation, etc.).

1 Like

You could add more monitoring checks, e.g. checking whether the site is up and whether its content serves a specific string (I do that with Icinga on monitoring-portal.org).
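
For illustration, here is a minimal sketch of that kind of up-and-content check in Python. The URL and the expected string are placeholders, and a real Icinga setup would typically use an existing check plugin such as check_http rather than a hand-rolled script:

```python
import sys
import urllib.request

# Placeholders: point these at your own forum and a string you expect on the homepage.
SITE_URL = "https://forum.example.com/"
EXPECTED_STRING = "Latest"

def check_site() -> int:
    try:
        with urllib.request.urlopen(SITE_URL, timeout=10) as response:
            body = response.read().decode("utf-8", errors="replace")
    except Exception as exc:
        print(f"CRITICAL - request failed: {exc}")
        return 2  # Nagios/Icinga plugin convention: 2 = critical

    if EXPECTED_STRING not in body:
        print(f"WARNING - expected string {EXPECTED_STRING!r} not found in response")
        return 1  # 1 = warning

    print("OK - site is up and serves the expected content")
    return 0  # 0 = ok

if __name__ == "__main__":
    sys.exit(check_site())
```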

If you enable the REST API, for example, there are many things you can test or check, even gaining insight into the applications currently running. Since Discourse runs in Docker, there is a variety of APIs and checks available to go further with Redis, PostgreSQL, Nginx, Ruby on Rails, and so on.
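
As a rough sketch of what an API-level check could look like: the endpoints and the Api-Key/Api-Username headers below are the standard Discourse ones as far as I know, but the exact field names are worth verifying against your own instance, and the URL and key are placeholders.

```python
import requests  # third-party: pip install requests

# Placeholders: generate an API key in your Discourse admin panel.
BASE_URL = "https://forum.example.com"
HEADERS = {
    "Api-Key": "YOUR_API_KEY",
    "Api-Username": "system",
}

# /about.json and /latest.json are standard Discourse JSON endpoints; a 200
# response with the expected structure is a reasonable "application is alive" signal.
about = requests.get(f"{BASE_URL}/about.json", headers=HEADERS, timeout=10)
about.raise_for_status()
print("Discourse version:", about.json()["about"]["version"])

latest = requests.get(f"{BASE_URL}/latest.json", headers=HEADERS, timeout=10)
latest.raise_for_status()
topics = latest.json()["topic_list"]["topics"]
print(f"/latest returned {len(topics)} topics")
```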

You could also go the route of full application and UX monitoring with end-to-end tests. There are frameworks around like CasperJS, though I’m not sure whether that works with Discourse. It’s probably better to have such things automated via the API.

Still, you could go the route of triggering events and expecting something in return from the live site.
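
For example, here is a sketch of a “trigger an event and expect something in return” check, assuming the standard Discourse endpoints for creating and fetching a topic. The URL, key, user, and category id are placeholders, and you would want to point this at a hidden staff/QA category:

```python
import requests  # third-party: pip install requests

BASE_URL = "https://forum.example.com"   # placeholder
HEADERS = {
    "Api-Key": "YOUR_API_KEY",           # placeholder admin API key
    "Api-Username": "qa-bot",            # placeholder user the action runs as
}
QA_CATEGORY_ID = 4                       # placeholder: a hidden staff/QA category

# Trigger an event: create a topic via the API.
create = requests.post(
    f"{BASE_URL}/posts.json",
    headers=HEADERS,
    json={
        "title": "Automated QA smoke test topic",
        "raw": "This topic was created by the post-upgrade QA script.",
        "category": QA_CATEGORY_ID,
    },
    timeout=10,
)
create.raise_for_status()
topic_id = create.json()["topic_id"]

# Expect something in return: the new topic should be retrievable right away.
fetch = requests.get(f"{BASE_URL}/t/{topic_id}.json", headers=HEADERS, timeout=10)
fetch.raise_for_status()
assert fetch.json()["title"] == "Automated QA smoke test topic"
print(f"Topic {topic_id} was created and fetched successfully")
```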

I don’t do that yet, but it is on my list to increase monitoring so I always know whether the site is fully operational or whether I have hit a bug or a regression.

2 Likes

Note that we have extensive automated testing before any build is marked ‘tests-passed’; this includes creating topics, uploading images, and more via headless Chrome.
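
(For anyone who wants a small headless-browser smoke test of their own on top of that, a rough sketch using Playwright follows. The URL, expected title, and CSS selector are assumptions and depend on your site and theme, so treat this as an illustration rather than part of the official test suite.)

```python
# Requires: pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright

SITE_URL = "https://forum.example.com/latest"   # placeholder
EXPECTED_TITLE_FRAGMENT = "My Forum"            # placeholder

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto(SITE_URL, wait_until="networkidle")

    # The page title should look right and the topic list should have rendered.
    assert EXPECTED_TITLE_FRAGMENT in page.title()
    topic_rows = page.locator(".topic-list-item")  # selector from the default theme; verify on yours
    assert topic_rows.count() > 0, "no topics rendered on /latest"

    browser.close()
    print("Headless smoke test passed")
```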

If you are running standard Discourse with official plugins, every build out there that is tests-passed will work for all the basic stuff.

6 Likes

That’s a good point to make. I should probably elaborate on why SP has its own testing document, then. We have it because we’re one of the very few instances that run off a fork, so merge issues can occur, and we don’t necessarily run all the tests to ensure functionality still passes after the merge is done.

So let that be a warning not to use a fork: it adds complexity and requires additional resources to ensure everything is functioning as it should.

6 Likes

The Google Sheet file got removed. I’m in the same situation of needing a QA checklist; I would then test the items manually as a normal user. Would it be possible to share the Google Sheet file again?