They do, actually. Spamming is big business, as is scamming. Market (financial) incentives are largely at odds with what drives healthy relationships and communities, be they online or offline, because they introduce a conflicting agenda (profit) into people’s behaviour.
One need only look at what a great success (sarcasm) capitalism has been at driving and supporting community values. There’s a reason that pretty much everything related to public service is provided either by non-profits or by the state: the market economy doesn’t incentivise it.
You can also look into the studies that have been done in behavioural economics that show how financial reward skews behaviour and can even reduce performance. It’s a fascinating field!
We need to distinguish the behavior of ordinary users from scammers.
Scammers would likely attack this feature by setting up lots of fake accounts with fake credit cards and tipping each other’s posts, hoping to generate a payout before the chargebacks start to come back on their credit cards. I already explicitly foil that scheme by delaying payouts by six months, which greatly exceeds the 90 days the cardholder has to file a claim. You could develop other anti-scam strategies.
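To make the timing concrete, here is a minimal sketch of how such a payout hold could work, assuming a hypothetical `Tip` record and a six-month hold constant; the names are illustrative and not part of any existing Discourse API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical hold period: tips are only paid out once they are older than
# this window, which comfortably exceeds the ~90-day card chargeback period.
HOLD_PERIOD = timedelta(days=180)

@dataclass
class Tip:
    amount_cents: int
    received_at: datetime
    charged_back: bool = False  # set to True if the card issuer reverses the charge

def payable_tips(tips, now=None):
    """Return only the tips that have cleared the hold and were not charged back."""
    now = now or datetime.now(timezone.utc)
    return [
        t for t in tips
        if not t.charged_back and now - t.received_at >= HOLD_PERIOD
    ]

# Example: a tip from seven months ago is payable, one from last month is not.
old_tip = Tip(500, datetime.now(timezone.utc) - timedelta(days=210))
new_tip = Tip(300, datetime.now(timezone.utc) - timedelta(days=30))
print(sum(t.amount_cents for t in payable_tips([old_tip, new_tip])))  # prints 500
```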
That leaves ordinary users, and my comment applied to ordinary users. Ordinary users won’t reward large numbers of low quality posts.
How many pencils would manufacturers produce if every N pencils generated a new game badge? I understand the value of gamification for building engagement with new users, but ultimately many people burn out on generating hundreds of pages of content. And the better their content is, the more of it people start to demand. Giving them a few dimes for their time is going to be a more meaningful motivator than a new badge.
I’ve moved this to a separate topic to discuss paid contributions; it’s an interesting one and shouldn’t be snowed under by the discussion about maintenance.
Let’s also try and keep things on that particular topic.
Something not mentioned here is the power dynamic that would necessarily develop between posters who frequently get paid and the site admins/owners. A funding model like this puts the individual first, not the community, and the resulting consequences are likely to become extremely unhealthy for all involved.
If the admins are dependent on these payments, it is reasonable to believe they will want to ensure that those payments continue. Let’s say there is a prolific poster who nets a fair amount of money from their posts, such that if the poster stopped creating content, it would be felt by the admins. What would happen if the poster made a demand of the admins that would be considered unreasonable by any other ordinary user? This puts the admins in a difficult position - do they capitulate to the poster in order to preserve the income that poster brings in? Even if there is not an explicit threat from the poster that they’d leave, and even if the poster isn’t making demands that are unreasonable, this dynamic still exists. Whether they like it or not, the admins become incentivized to begin catering to the highest income posters.
The same could be said for posters who are paid primarily by one or a small number of high income users. How could that impact the content of the poster? What if users begin to doubt the content that the poster is creating, thinking they’re some sort of paid shill? What if the poster really does become a paid shill? These are very real things that can tear a community apart, even if we think ours is built different and no one would possibly do such a thing. Money changes things, deeply!
Fundamentally, if a community runs on a core principle of rewards, it eventually shifts to inorganic content, derived and tooled specifically to optimise for maximum revenue. This is a textbook case, seen and documented time and time again; the most basic examples are people offering AdSense-optimised content on various review websites. Reviews used to be honest, transparent, and a good source of truth; nowadays it is all about who pays the most and then easily gets a favourably worded review with no personality of its own.
What additional power does this give posters who are paid? The people who produce the best content already have power, just by virtue of the fact that the community relies on their content production. It’s easy to make up theories about how the world works, but it’s all very theoretical, and not a shred of evidence has been introduced to show these ideas are true.
Your best posters can do this anyway, and the power referenced here has nothing to do with getting paid. I run a probiotic ferments group that does DNA testing on ferments. One of my posters is extremely smart and extremely prolific. Every six months he quits and then disappears for long periods. This has all the impacts you reference, but I get no warning it is going to happen, and on top of that he has no incentive to stay. If he were getting paid something for his many fantastic posts, maybe he would think twice about leaving. If he made a demand of me, at least I would know what his issues were. I might say yes to his demand and I might say no, but at least there would be communication.
The other point is that no one is going to be dependent on receiving a few dollars every month for their posts. When it gets to that point, you create a subscription service and make running the group a lifestyle business. Discourse already supports subscriptions.
This is not a valid argument, because any group might have paid shills. It’s up to the admin to sniff out such situations and stop them if they interfere with the group’s integrity.
Facebook is the place where humanity is turned into slaves of a corporation, never paid for their many hours of work, and then killed off randomly at any time and for any reason. In no way, shape, or form does paying someone something for their efforts resemble Facebook.
Your example is one where the advertisers are charged to sell products, but the humans they advertise to curse them all the way. My proposal is one where the end consumers alone decide the worth of a post. If money is paid by end users, they were pleased to see the content.
It makes no sense to say that this incentivizes advertising. An advertiser could already post in a forum - FOR FREE - and there is no reason to complicate things by adding a mechanism by which people could pay them to advertise. It also makes no sense that anyone would pay the advertiser to advertise. The advertiser is also not going to pay himself.
While there are certainly going to be use cases with negative outcomes, the administrator is still, at the end of the day, the decision maker about which types of content are allowed.
If an interested party knows that aligning their content to the audience means they will be tipped or rewarded, there will be a flood of purposely crafted content designed to appeal to the audience, most of it either mass-generated by AI or written by someone who specialises in content optimisation.
Your idea may sound novel on the surface, and you argue that admins will weed out the content, but the problem it presents is that your staff (admins/mods) and the whole community would then be burdened with identifying where to draw the line: what to consider organic and what not.
The only way a healthy community can build is when people are genuinely interested in participating, without any expectation of reward.
I don’t think more than 2% of the posts would get financial rewards, and I don’t think that this 2% would get enough money to cause anyone to chase the token rewards. These tips are a symbolic way to tell people they are appreciated.
We already have AI-crafted content chasing eyeballs everywhere. That kind of thing will target Google search rankings because they can monetize it much more effectively. But to the extent that AI-crafted content starts to get financial rewards in a discussion forum, the admin can step in and stop it. It’s unproven that this would even happen.
There is a separate discussion to be had about AI, because it is very clear that AI will increasingly start to do all the things that humans do, and even better than humans do them. We aren’t far from the point where people on either side of a discussion may start creating fake AI contributors who end up attracting a lot of conversation and helping a person reinforce their side of a story. That is a concern on its own, and it doesn’t require any kind of token system. This kind of AI intrusion into conversations will happen no matter what the financial rewards are.