I was just writing up some documentation and taking screenshots of the anonymization process when I saw something I had never noticed before: anonymization apparently keeps the signup and last-login IP addresses. Those should really be included in the anonymization process.
I fully agree with this, but perhaps the procedure should be made transparent (not sure if this is legally necessary, but it would surely help if both users and admins understood the distribution of responsibilities). What I mean is: I would want to assume that it is the user who has to point out each individual post that needs to be sanitized. In other words, it’s not enough to request “deletion” (a.k.a. anonymization) and assume that this will cover any personal information in any post.
Perhaps the default ToS could be clarified in relation to deletion requests. Currently, the elaboration of the CC user-content license seems preoccupied with the site owner being allowed to remove content. How about also mentioning that the site owner can refuse to remove content? I’m not sure whether it should say “within the limitations of applicable law” or something like that, but with or without that clause it would help make people aware of what they’re agreeing to.
I had breakfast this morning with someone who works at a major ad company and predicts that they’ll basically shut down a bunch of their services when 25 May hits, because they don’t quite know what to do.
Yes, I know a few companies as well that will shut down some of their applications on May 24, just because it’s too big a problem to fix and the liabilities will be too high.
Just checking for an update on acceptance of a change to the ToS or other policies.
At the moment we have a compulsory field, so all new users have to tick “I have read and agree with the ToS”, and this is stored in the db.
But this does not cover changes to the ToS, where a user would have to accept them again before logging in.
Is there a way of doing this in Discourse?
For those of you still confused about GDPR, I thoroughly recommend watching this talk: https://youtu.be/zU3GZyO_E4g
I attended this conference and it was one of the most concise talks on GDPR I’ve seen so far.
Careful, using boolean values for this can be incorrect. Use a DateTime instead. The reason is ToS versioning: say someone agreed to your privacy policy on 2017-03-01 and you released a new version on 2017-04-01. You could just set the flag back to false for everyone, but then you have no “proof” (aside from backups) that those users ever agreed to your earlier policy.
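A minimal sketch of the difference in Rails terms (the column name here is just an example):

```ruby
# Minimal sketch, Rails-style migration. Column name is hypothetical.
class AddTosAcceptedAtToUsers < ActiveRecord::Migration[5.1]
  def change
    # Instead of a bare boolean (add_column :users, :tos_accepted, :boolean),
    # record *when* the user accepted:
    add_column :users, :tos_accepted_at, :datetime
  end
end
```

If `tos_accepted_at` predates the release of a new policy version, you know re-consent is needed, while the earlier acceptance stays on record.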
I see your point, but I could not see any other way of recording users’ acceptance of the ToS.
Any suggestions welcome
I am actually not sure you can do that without rewriting core Discourse elements at this point in time. If you were to go with it, however, I would first create a DateTime field for each kind of consent, track whether or not each one is required (maybe you have some agreements that are not mandatory), and add one more boolean field named revalidation_required. When a user registers, they click a checkbox for each consent and cannot proceed until they do; after registration you write Time.now into each field.
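As a rough sketch (all field names here are invented for illustration):

```ruby
# Rough sketch of the fields described above; names are hypothetical.
class AddConsentFieldsToUsers < ActiveRecord::Migration[5.1]
  def change
    # One DateTime per kind of consent: when (if ever) the user agreed.
    add_column :users, :tos_accepted_at, :datetime
    add_column :users, :privacy_policy_accepted_at, :datetime
    add_column :users, :marketing_accepted_at, :datetime # a non-mandatory one
    # Flipped to true whenever a policy changes and the user must re-consent.
    add_column :users, :revalidation_required, :boolean, default: false, null: false
  end
end
```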
Next, the posts containing your ToS text should be mapped to database records, so that whenever you change one of them (probably via an after_save callback) revalidation_required is set for every user.
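Hypothetically, something like:

```ruby
# Hypothetical model wrapping a ToS post; flags everyone when it changes.
class PolicyDocument < ActiveRecord::Base
  after_save :require_revalidation

  private

  def require_revalidation
    # update_all writes in one SQL statement and skips per-user callbacks,
    # which is what we want when touching every row.
    User.update_all(revalidation_required: true)
  end
end
```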
Then you add a redirect straight into ApplicationController for users whose revalidation_required is true; you will also need a brand new controller and view for consents. Basically, you look at the updated_at field of each consent document and, if it is newer than what the user last agreed to, you render a checkbox for that part of the ToS which they have to accept before proceeding.
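In outline (assuming the hypothetical fields above, plus a /consents route you would add alongside the new controller):

```ruby
# Sketch of the ApplicationController guard; consents_path is assumed
# to be a route added alongside the new ConsentsController.
class ApplicationController < ActionController::Base
  before_action :check_consent_revalidation

  private

  def check_consent_revalidation
    return unless current_user&.revalidation_required?
    # Don't redirect while the user is already on the consent page,
    # or they'd loop forever.
    return if controller_name == "consents"
    redirect_to consents_path
  end
end
```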
This isn’t perfect, but it does sound doable in a day. Heck, if it were plain Ruby on Rails, I have already done it and released a sample project that attempts to be GDPR-compliant. Unfortunately, I am not familiar enough with Discourse’s front end and Ember.js to attempt it here; I could write all the migrations and logic needed, but I would fail horribly at presenting it correctly and making it togglable in the Discourse admin panel.
Thanks @ziptofaf
Oh dear! At least I know it cannot be done with Discourse as it is. We’ll have to figure out something else…
Thanks for sharing this. A really great overview with some good depth at the same time!
Yeah really good talk, thanks for sharing +1
@Adam_Prescott thanks for sharing this talk, have shared it onward to my USA-based team as it has some great examples of the cultural differences in understanding and legislation approaches!
The minimum amount of data is the minimum you need to carry out the activity the user signed up for.
For example, a business wanting to contact customers might want:
- name
- town
- country
- email address
However, the following is possibly not required to operate the business, and thus should not be collected or stored:
- street address
- phone number
- age
Obviously every business and function is different. The aim is that you keep the minimum necessary to operate the service, and you tell users what you are collecting and why.
Pretty sure that was a spammer, here to post a spam link, not a legit request.
This is a helpful and comprehensive thread. It’s great to see how seriously Discourse is taking GDPR compliance.
I wanted to write in because I noticed that the recently installed Discourse instance my community uses is still running the default Privacy Policy, which was last edited in 2013.
Because it hasn’t been edited since GDPR came under consideration, I am concerned that this default is not compliant. Among other things, I think GDPR changed the way privacy policies are formatted in ways that are now internationally standard.
You can edit that Privacy Policy to say anything you like.
I understand that.
It is likely that I will edit it on my community’s instance. However, it is important that these notices are accurate; otherwise, somebody hosting the software could be sued for deceptive practices (by the FTC, in the US).
If I’m fixing the privacy policy on our own instance, I’m happy to also prepare it as a contribution to the core repository. However, since getting the details right requires a rather in-depth understanding of the state of the software (as this thread shows), that will be hard to do without asking the team a lot of questions.
I may add: I’m a software developer by training and currently a privacy and security researcher by profession. I’m asking about this because I think it could be a meaningful way to contribute, and I’m wondering about the best way to go about it.
Gotta be careful with statements like this.
Our updated privacy policy covers all the necessary areas so anyone is welcome to use that as a template for their own.
That said, we’re happy to answer any questions you have.
Ah, excellent, thank you.