

Another article confirmed it was payment processors again. This is why we can’t have nice things.
Kobolds with a keyboard.


I really have no idea, and this would probably be jurisdiction dependent anyway.
They do allow things like food, which seems like it would carry even more liability, if they can be held liable for Kickstarted products at all.


It sounds like it’s just insertable toys, so it might be a liability thing if they’re afraid people are selling unsafe toys? Who knows. That’s a really weird distinction and definitely one that payment processors don’t make.


I guess I’m suggesting that if you got kids used to using Linux, they’d be more likely to keep using Linux after graduating, on their personal computers. Nobody’s using ChromeOS for anything serious.


Imagine the technological shift we could see if they were similarly powered Linux laptops instead of Chromebooks. Get kids used to Linux in school and I bet we’d start seeing Windows lose much more significant market share.


There’s also endless content of them confidently presenting these arguments to judges when they’ve broken laws and being immediately shut down by said judges.


Even just like… hitting a speed bump would be pretty bad.


It had the same benefit that cassette tapes did: It was trivially easy to record things from live TV to watch later, or to copy VHS tapes you rented. My parents were not wealthy by any stretch when I was a kid, but we did have a dual-tape VHS player for that express purpose.


Parents giving their children these devices, observing excessive attachment, and not cutting them off bear considerable responsibility.
While I do agree that parents should bear the brunt of the responsibility here, you must realize that kids are resourceful and no amount of parental oversight will stop a determined kid from accessing this content. Parents aren’t in their presence 24/7, and just like a kid whose parents deny them candy can find plenty of ways to obtain it without their parents knowing, the same is true for social media use. It’s the old adage that the more you tighten your grip, the more slips through your fingers.
liberty
You keep using that word, but this isn’t really about personal freedoms at all. It’s about companies that saw that their product was causing harm, and actively made the decision to continue promoting that harmful product in the name of profits. Their products were specifically engineered to cause these outcomes, and you’re defending their right to do that. Do you just propose we allow companies to do whatever they want in the name of profits, no matter the cost to society? If not, where do you draw the line? How much harm do they have to knowingly cause before you think it’s too much?
When risks are open & obvious, such as the overconsumption of certain foods & legal substances, that’s generally viewed as a matter of personal choice rather than unreasonably dangerous product defect.
We restrict alcohol and cigarette use by underage people, too, actually, because their effects are known to be harmful, so I’m not sure what point you’re trying to make here. Nobody’s talking about making social media use illegal for adults.
Basically, I think you’re arguing against social media restrictions for kids, which is fine, but that’s a completely different discussion. It’s related, but it’s not what this article is about - this article is about holding corporations responsible for bad behavior. If that isn’t what you want to discuss, why are you here?
However, even supposing such features defectively make the system unreasonably dangerous in a reasonably foreseeable manner, that only demands that service providers provide fair warning. Once duty to warn has been met, users are reasonably aware of risks and responsibility shifts to risk-takers or parents who give children access despite reasonably knowing the risk.
Okay, I think you’re just not understanding the situation here. Meta did research on the effects of social media. They found that it was harmful. Even after determining that, they continued to promote it as not harmful. Zuckerberg even testified that evidence of social media being harmful didn’t exist, after the company had already found that very evidence. This all came to light because of whistleblower testimony. So even if we accept your premise here, that duty to inform was not met, and that’s part of what’s at issue.


Addictive Personality is a proposed set of traits that makes sufferers more vulnerable to developing addictive behaviors around things like gambling or social media. Does it help to frame it in a different light for you if you think of it as those companies exploiting vulnerable people’s disorders to extract money from them?
Telling those people to just have self control is like telling someone with depression to just stop being sad.


It’s like if someone had a forum where insurrectionists were discussing how to build bombs and where they were going to use them, and the owners had an internal meeting where they said, “Hey, we’re hosting some pretty awful people, should we maybe report them or shut this down?” and the answer was, “Nah, they’re paying users, and we want their money.”
Pretty sure Section 230 wouldn’t protect them, either.


Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?
Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.
This feels like an awful argument to make. It’s not the presence of those things that makes Meta and co so shit; it’s the fact that they provably understood the risks and the effects their design was having, knew that it was harming people, and continued to do it anyway. I don’t care if we’re talking about a little forum run by a grandma and grandpa talking about their jam recipes; if they know that they’re causing harm and don’t change their behavior, they should be liable.


The rest of the conversation, though, was about a (mostly) exclusively American thing, relating to lobbying and legislation against Wikipedia and IA. I’ve got no problem with shitting on the US for things we’re actually doing, but saying the public doesn’t support Wikipedia when we’re actually the #1 supporter worldwide of Wikipedia feels kind of disingenuous.


sane Americans largely take Wikipedia for granted
North America is Wikipedia’s largest funding source by a factor of more than 2. I’m not sure why you’re calling Americans out here.
Are you supposing that IA is better known in other countries than in the US? Are you basing that on anything?


Well, this is a fresh kind of hell…


I honestly love this. I find it interesting to spot the times when games do this. I think Mass Effect, with its elevators, was the first game I really noticed it in, but some games are really good at hiding it.


Already has, but loading screens are too quick now to make it worth actually doing.


What if it’s a collage of AI generated art pieces? Technically the artist did the same amount of work as someone making a collage of human-created things.


I mean, you could make the same argument about paintbrushes for traditional art. Or pencils. There’s a really big difference between someone using a tablet and an Undo hotkey to draw something digitally vs. someone making something with AI. One of those clearly requires a ton of skill; one does not require any.


The distinction is that NIMBYs only object to the infrastructure when it’s in their back yard. I think the majority of people object to these data centers anywhere, but only have voting power to directly oppose them in their own back yards, so that’s where their effort is spent. I haven’t seen anyone say “I definitely want another massive data center to go up, just not here.”