

Except that we’re going to start asking for more pay to clean up after AI slop.


they actually don’t own the game
This is a fact that has been made apparent repeatedly. We know.
However, you were replying to this:
Make us individual game owners pay license every time we download and install the game?
With this:
This is how it’s been done for decades now?
…which is patently false.


Every time a company has tried to restrict the number of downloads/reinstalls for a game you buy a license for, the backlash has been so catastrophic that they've walked it back.
So it has happened, yes, but there is currently no game distributor where you pay a license per download. They're all pay once, download in perpetuity.


Yup.
There are very niche situations where a free market actually works - situations where there is no hidden information and no barrier to entry, where monopolies can’t arise due to the nature of the specific market. By the nature of these restrictions, nothing of any importance will ever be supplied by these markets.


And there never will be. Not so long as it's possible to hide information from the consumer, and not so long as barriers to entry keep new market competition from springing up.


If it’s something that you don’t really care about others seeing, that’s a prime candidate for cloud storage and more power to you.
This topic is about password lockers. I’m pretty sure you don’t want some schlub who happens to work at Cloud Password Lockers Inc. to be able to get at your PayPal account.


The partnership will happen later on when the noise has died down, and it won’t be publicized.


Don’t store your stuff in the cloud unless you don’t mind someone else accessing it.
If you store things in the cloud that you don't want other people to access, you'd better be encrypting them yourself and only decrypting them locally.
This has been a cardinal rule since day one.
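The "encrypt it yourself, decrypt it locally" flow can be sketched in a few lines. This is a toy one-time-pad example just to show the shape of it (key never leaves your machine, only ciphertext goes to the cloud); in practice you'd use a vetted library like `cryptography`'s Fernet, not hand-rolled XOR.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: only sound if the key is random, as long as the
    # message, and never reused. Real tools use AES; this shows the flow.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

secret = b"my paypal password"
key = secrets.token_bytes(len(secret))  # stays on your machine, never uploaded
blob = encrypt(secret, key)             # this is all the cloud provider sees
assert decrypt(blob, key) == secret     # only you, holding the key, can read it
```

The point: the schlub at Cloud Password Lockers Inc. only ever holds `blob`, which is useless without the key.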


GDPR laws allow you to download the contents of a Discord server that you have never been on? I thought it only required access to your own data.


You can only access it if you install Discord.


It’s not so bad in comments that don’t have a significant amount of ‘th’ in them, but this one was like hitting a speed bump every five feet.


The trouble is that it becomes far too easy to fall into the pattern of power being used to garner more power. All power needs to be held in check to avoid that very simple flaw. The moment the people are convinced that power of any sort shouldn’t be held in check is the moment that they are doomed.


It’s not the query that burns through electricity like crazy, it’s training the models.
You can run a query yourself at home on a desktop computer, as long as it has enough RAM and compute to support the model you’re using (think a few high-end GPUs).
Training a model requires a huge pile of computing power though, and the AI companies are constantly scraping the internet to ~~steal~~find more training material.
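The "enough RAM and compute" bit comes down to simple arithmetic: memory needed is roughly parameter count times bytes per weight, plus some headroom. A back-of-the-envelope sketch (the function name, the ~20% overhead factor, and the example model sizes are all illustrative assumptions, not exact figures):

```python
def model_memory_gb(params_billions: float, bytes_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Rough memory to load a model: weights plus ~20% headroom for
    KV cache and activations. Purely a ballpark estimate."""
    return params_billions * bytes_per_weight * overhead

# A 7B model quantized to 4 bits (0.5 bytes/weight) fits on one consumer GPU:
print(round(model_memory_gb(7, 0.5), 1))   # ~4.2 GB
# A 70B model at fp16 (2 bytes/weight) needs several high-end GPUs:
print(round(model_memory_gb(70, 2.0), 1))  # ~168 GB
```

Same math explains the gap with training, which has to hold gradients and optimizer state on top of the weights, across thousands of GPUs for weeks.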


“We have to find a compelling use case so we can keep tragedying the commons!”


The implication is that they’ll have to jump ship to Linux, and thus become a member of the unixsocks community.


They utilize these tools because they have no morals. They are willing to lie and cheat their way into whatever they want.
The left does not have that advantage. We don’t want to lie, we don’t want to cheat, we want the liars and cheaters to be removed from power. Lying harder isn’t going to work for us because unlike the right, we will call out our own for it.


Except LLMs are absolutely terrible at working with a new, poorly documented library. Commonly-used, well-defined libraries? Sure! Working in an obscure language or an obscure framework? Good luck.
LLMs can surface information. It’s perhaps the one place they’re actually useful. They cannot reason in the same way a human programmer can, and all the big tech companies are trying to sell them on that basis.


It’s not interpretation, it’s extrapolation.


I mean, originally they thought they had come upon a magic bullet. Turns out it wasn’t the case, and now they’re going to suffer for it.
You still lose the internal state between each token in the database output. It would let it plan, but it would still be externalizing that planning one token at a time. Condensing all of that internal state down to a single token at each step still means huge losses in detail, as well as fragmented responses, resulting in all the problems you see with LLMs.
Somehow the actual internal state needs to not only be preserved, but fed back into itself. That’s how brains work. Condensing it into tokens isn’t enough.