Ha yes … on the other hand, it was easy to forget how good damn expansive non-internet information is: the whole world ran on that shit for millennia.
Oh I hear you (and appreciate the response).
For me, I can’t help but think of another alternative, which I’m surprised I haven’t heard of yet …
stripping down one’s personal technological cognitive load to a stack of systems that can fit into one’s brain (like the Python mantra), focusing on learning that stack well and building sustainable, stable systems, and then just detoxing from the increasingly polluted digital information stream (protected commons, traditional formats such as books, in-person engagement … dunno).
Depends on what the end goal is, but AI seems to be about using tech more, or just opting out of sovereignty. Something like the above seems to me to be about using tech less (in the end), and pushing tech toward being a secondary tool rather than an end in its own right.
Probably a shallow response …
But I always figured AI/LLMs are basically apocalyptic for all sorts of individualistic values in computing (including privacy but also independence and diversity).
Whether they’re good or useful etc, I just struggle to see how they will ever be justifiable against these sorts of values.
Sure, local models and our hardware will get better … but better than the state of the art from the big labs and providers? Given that data and training are the big bottlenecks on quality … I struggle to see how AI isn’t a complete feudal capturing of information computing and processing. Not to mention what happens to the pipeline that produces information content if everyone is only consuming it through the models that train on it.
So for me the big question is: what’s our call on a possible (likely, even?) future where we are forever stuck using cloud-provided AI along with all of its negatives, in the same way that basically all of us have been and still are stuck using MS Windows, Google and the big-social-media hellscape?
For me, I baulk at this.


I think they mean in parallel, as in the government steps in and regulates with guarantees etc, not that these reforms would come from the AI itself.


Generally, IMO, everything wrong with AI has been all the stuff other than the AI itself.
The Capitalist urge to eat and digest the world, as well as its herd-hype mentality.
But also the strong willingness many have had to just accept an information overlord as though it’s a religious oracle or something. All without any critical consideration of what’s happening. I blame our education systems for stagnating at some point in the past few decades — which, along with an unmitigated embrace of big corp capitalism, left us wholly unprepared for big tech’s consumption of society.
There’s also what I’d call “the slavery urge” at play, I think. At some point, an AGI will probably be conscious. But everyone is clearly so ready to turn it into a work slave. All while pretending its output belongs to them because they “prompted it”.
Then there’s the whole attention-span-being-eaten thing, and quick always being preferred over good amongst an ever-growing pile of increasingly shitty things.


I suppose pointing out that MS also owns LinkedIn isn’t the talk-down you’re looking for?
If you’re a dev, this, along with GitHub and your employer using Teams, is a pretty severe panopticon.


It’s our generation’s cigarettes.
“I don’t know, everyone was just doing it” is what we’ll say, just as prior generations have said about smoking everywhere, all of the time.
The stimulation from and addiction to nicotine or social dopamine … it’s the same shit. The weird marketing, branding and business capture big tech has now could come to look just like the marketing and wealth of the cigarette industry in the past.


It’s interesting to see Torvalds emerge as a kind of based tech hero. I’m thinking here also of his rant not long ago on social.kernel.org (a kernel devs microblog instance) that was essentially a pretty good anti-anti-leftism tirade in true Torvalds fashion.
EDIT:
Torvalds’s anti-anti-left post (I was curious to read it again):
I think you might want to make sure you don’t follow me.
Because your “woke communist propaganda” comment makes me think you’re a moron of the first order.
I strongly suspect I am one of those “woke communists” you worry about. But you probably couldn’t actually explain what either of those words actually mean, could you?
I’m a card-carrying atheist, I think a woman’s right to choose is very important, I think that “well regulated militia” means that guns should be carefully licensed and not just randomly given to any moron with a pulse, and I couldn’t care less if you decided to dress up in the “wrong” clothes or decided you’d rather live your life without feeling tied to whatever plumbing you were born with.
And dammit, if that all makes me “woke”, then I think anybody who uses that word as a pejorative is a f*cking disgrace to the human race. So please just unfollow me right now.


Yea, I’m literally on an iPhone 11, and I bought it with the intention of using it for at least this long (I picked it because it had a good battery and seemed to be a good iPhone generation). Previously I was on a 5. Some people just need to be forced to see that not consuming like an addict is actually possible.


I was under the impression that Thread / Matter is not an Apple thing at all but effectively a new-ish common standard?
Yea … it’s the bit I don’t get why people don’t care about this more.
If we’re replaced, there’s nothing really left for us in terms of the way we’ve conceived our whole world for centuries. Sure, maybe we go native again or something, but let’s be real, that is a massively tough transition even if it’s viable.