Maybe it is time to make providers responsible for the content they carry. Give them some lead time to mark content as inappropriate and moderate it (if necessary) but, if it isn't taken down, the provider gets fined.
(And I clearly remember that Goog was hugely against such a measure back when they were working on G+.)
Uh, that's how it is already.
Providers must take content down in response to valid DMCA requests, but they can't possibly monitor or moderate all content running through their platforms.
@juliobiason I appreciate the sentiment, and I sometimes wonder the same. But I've seen how crap automated systems are at moderating at scale, and it's just got to be impossible.
I don't see how it could be made to work. Or maybe it could, provided there are no large systems with a "social network" aspect to them.
But would you run a Masto instance if you were responsible for everything posted on it?
The social network of the future: No ads, no corporate surveillance, ethical design, and decentralization! Own your data with Mastodon!