California enacted a law that takes effect on 1 January 2024 and adds reporting requirements for “social media companies”: Bill Text - AB-587 Social media companies: terms of service.
It asks for what is effectively a mildly complex set of database queries about flags and deleted posts, along with a copy of the current Terms of Service and content policy.
There are a few things that can’t currently be automated, such as the number of appeals users made against content moderation decisions and how many reversals those appeals produced.
Discourse should ship a report builder tool that pulls together all the easy numbers and leaves blank spots for the hard ones!
(The law does not apply to forum owners that received less than $100 million USD in yearly revenue! But that’s still a lot of corporate support forums, and I think being aware of these numbers is still good for the health of communities that don’t generate that kind of revenue.)
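A minimal sketch of what such a report-builder output might look like, assembling the automatable numbers and leaving explicit blanks for the ones that need manual entry. All field and function names here are hypothetical illustrations, not part of the statute or of any actual Discourse API:

```python
# Hypothetical sketch of an AB-587-style report payload.
# Field names and the build_report() helper are illustrative only.

def build_report(flag_counts, deleted_post_count, terms_of_service_url):
    """Pull together the easy numbers; leave None for metrics
    (e.g. appeal counts) that can't currently be automated."""
    return {
        "terms_of_service": terms_of_service_url,
        "flags_by_category": flag_counts,        # e.g. {"spam": 120, ...}
        "actioned_posts": deleted_post_count,
        # Blank spots for the hard ones -- filled in manually:
        "appeals_filed": None,
        "appeals_reversed": None,
    }

report = build_report({"spam": 120, "off_topic": 45}, 37,
                      "https://example.com/tos")
print(report["appeals_filed"])  # None until completed by hand
```

The point of the explicit `None` fields is that a generated report makes the non-automatable gaps visible instead of silently omitting them.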
Those do not qualify as “social media companies”, which is what this law is targeting.
No, I think it’s fairly straightforward:
(d) “Social media company” means a person or entity that owns or operates one or more social media platforms.
(e) “Social media platform” means a public or semipublic internet-based service or application that has users in California and that meets both of the following criteria:
(1) (A) A substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application.
(B) A service or application that provides email or direct messaging services shall not be considered to meet this criterion on the basis of that function alone.
(2) The service or application allows users to do all of the following:
(A) Construct a public or semipublic profile for purposes of signing into and using the service or application.
(B) Populate a list of other users with whom an individual shares a social connection within the system.
(C) Create or post content viewable by other users, including, but not limited to, on message boards, in chat rooms, or through a landing page or main feed that presents the user with content generated by other users.
- This chapter shall not apply to a social media company that generated less than one hundred million dollars ($100,000,000) in gross revenue during the preceding calendar year.
The only really debatable qualification in my opinion is 22675(e)(2)(B). It could easily go either way.
I don’t exactly disagree that a good lawyer could make a case to wiggle out of it, but it does seem like Discourse installations would qualify for this pretty robustly by default:
Given that profiles display Most Liked, Most Liked By, and Most Replied (all lists of users with whom a given user has some particular “social connection within the system”), this seems pretty straightforwardly applicable.
Interesting to see this; thanks for bringing it to our attention.
From tiny beginnings this will expand to other geographical territories, the threshold for inclusion will shrink, and as the use of forums by special-interest groups like charities serving vulnerable communities is recognised, the requirement will become mainstream.
At some point alternative developers will see it as an opportunity to capture market share if Discourse is not adroit enough to stay ahead of the curve.
Social media moderation is an aspect of community that needs institutions like courts and governance oversight; naturally this activity will get incorporated into legal systems.