

Nah, they’ll just push for streaming compute power and charge you once more!
Not much to say


I don’t get this.
Someone posts an article, someone else enjoys it and tells them not to get discouraged by the crowd that doesn’t like the content and gets downvoted?
AFAIK “luddite” is not an insult, but maybe I’m wrong. Anyhow, I feel a vocal section of the anti-LLM crew is just looking to discourage anyone who isn’t aligning 100% with their battle.


Yeah, I hope I am cautious enough. I use strict DB models that were hand-written and have type checking and input sanitization. That, along with unit tests that cover everything I’ve been able to think of that can go right or wrong, combined with the classic “obscurity === security” motto.
Of course there are always vectors one hasn’t thought of, but that goes for human-written projects as well. If I decide to bring it live and scale up, I’ll probably order a pen test.
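For what it’s worth, here’s a minimal sketch of what I mean by a strict, hand-written model with type checking and sanitization, plus a unit-test-style check. It assumes Python with only the standard library; the `User` model, its fields, and the email regex are all hypothetical examples, not anything from an actual project.

```python
# Hypothetical strict model: hand-written, with type checks and sanitization.
from dataclasses import dataclass
import html
import re

# Deliberately simple email shape check (illustrative, not RFC-complete).
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

@dataclass(frozen=True)
class User:
    name: str
    email: str

    def __post_init__(self):
        # Type checking: reject anything that isn't a string.
        if not isinstance(self.name, str) or not isinstance(self.email, str):
            raise TypeError("name and email must be strings")
        # Sanitization: strip whitespace and escape HTML in the display name.
        object.__setattr__(self, "name", html.escape(self.name.strip()))
        # Validation: require a plausible email shape.
        if not EMAIL_RE.match(self.email):
            raise ValueError(f"invalid email: {self.email!r}")

# Unit-test-style checks: happy path plus an expected rejection.
u = User(name="<b>Ada</b>", email="ada@example.com")
assert u.name == "&lt;b&gt;Ada&lt;/b&gt;"  # HTML escaped on construction

try:
    User(name="Bob", email="not-an-email")
except ValueError:
    pass  # bad input rejected at the model boundary, as intended
else:
    raise AssertionError("bad email should have been rejected")
```

The point is just that validation lives in the model itself, so anything that slips past other layers still gets rejected at construction time.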


I’ve been writing a slightly larger project with a frontend, BFF, and backend, and I need to take it in small batches so that I can catch when it misunderstands or outright does a piss job of implementing something. I’ve been focusing a lot on getting all the unit tests I need in place, which makes me feel a bunch better.
The bigger and more complex the project gets, the harder it is for the LLM to keep stuff in context, which means I’ll have to improve how I chunk out smaller scoped implementations, or start writing code myself, I think.
All in all I feel pretty safe with my project and pleased with the agent’s work, but I need to increase testing further before bringing anything live.


Jeez, you really hit a nerve here with your pretty sane concept of sharing resources communally.
I guess some people really don’t like the word wasteful or something.


Well yeah, it is, but it’s most likely much harder to solve co-living like that in a way that’s acceptable to most people. Whereas what was suggested here is that people pool their resources and lend/rent to each other.
Nothing about forcing anything on anyone, and people who want to have exactly the CPU they need at any given time would probably not be interested.


As someone who’s been using it in my work for the last 2 years, it’s my personal observation that while the models aren’t improving that much anymore, the tooling is getting much, much better.
Before, I used GPT for functions that were easy in concept but tedious to write. Today I hardly write any code at all. I review it all and have to make sure it’s consistent and stable, but holy has my output speed improved.
The larger a project is, the worse it gets, and I often have to wrap things up myself, as it shines when there’s less business logic and more scaffolding and predictable work.
I guess I’ll have to attribute a bunch of the efficiency increase to the fact that I’m more experienced in using these tools: what to use them for, and when to give up on them.
For the record, I’ve been a software engineer for 15 years.


The Nothing Phone has a pretty average repairability score, so I’d assume so.


Been using Zed lately. Pretty similar UI, wildly different performance.

If child sex dolls are readily available, it risks normalizing the concept of sex with children, both for potential pedophiles and for children who browse Shein, who might get the impression that adults having sex with children is a thing.