

I have. I quit after it was decided our department should migrate to it. Half a year later I heard everything was on fire and the SNow migration had been paused indefinitely.


It can likely untangle all the jumps an obfuscator makes with relative ease. After that it should be easier to decompile into something meaningful.
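The jump soup an obfuscator emits can be hard to picture, so here's a minimal sketch (in Python, with made-up function names) of control-flow flattening, one common technique: straight-line logic is rewritten as a numbered-state dispatcher, which is exactly the kind of structure that has to be untangled before decompiled output reads as anything meaningful.

```python
def plain_abs_sum(values):
    # The original, readable logic: sum of absolute values.
    total = 0
    for v in values:
        total += v if v >= 0 else -v
    return total

def flattened_abs_sum(values):
    # The same logic after flattening: a dispatcher loop over
    # opaque numbered states instead of visible structure.
    state, i, total = 0, 0, 0
    while state != 99:
        if state == 0:                        # loop header
            state = 1 if i < len(values) else 99
        elif state == 1:                      # sign test
            state = 2 if values[i] >= 0 else 3
        elif state == 2:                      # positive branch
            total += values[i]; state = 4
        elif state == 3:                      # negative branch
            total -= values[i]; state = 4
        elif state == 4:                      # loop increment
            i += 1; state = 0
    return total
```

Both functions compute the same thing; a real obfuscator would also shuffle state numbers and derive them at runtime, but the shape of the problem is the same.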


DRM is basically just layers of obfuscated code to hide the “trap” code paths that render the game inoperable if you don’t have a license. I truly hope LLMs can provide some good in this area; DRM is a black mark on digital rights and ownership.
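To make the “trap code path” idea concrete, here's a toy sketch (not modelled on any real DRM scheme; every name here is invented for illustration): the license check doesn't fail loudly, it silently poisons later behaviour, which is why vendors bury these paths under layers of obfuscation.

```python
def load_save(data: bytes, licensed: bool) -> bytes:
    # Trap path: an unlicensed run gets the wrong key, so the
    # "decrypted" save is subtle garbage rather than an error.
    # The game keeps running and only misbehaves later.
    key = 0x5A if licensed else 0x00
    return bytes(b ^ key for b in data)
```

The point of the trap is deniability: a crack that only patches the visible check still trips over the corrupted data minutes later, far from the code that caused it.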


Way ahead of ya. Graphene is mostly working well. Some small issues, like MitID not working due to Play Integrity, but hopefully UnifiedAttestation and the focus on removing Big Tech dependence will solve that.


I saw that on an app about a year ago. I’ve never uninstalled an app that fast before.
E: Yeah, exact same message



What’s that smell in the air? Is it the smell of Schadenfreude?


That’s what I have an entire laptop for. I call it the Dust Collector. Its sole purpose is to collect dust and maybe once or twice a year be ready for a proctored exam. It’s not even allowed on my home network.


I guess the UK authorities think they’re impervious to the Streisand effect.
This would have been hilariously ironic, especially given the name of the ad campaign, if it wasn’t so grim.


When the DDoSes started and I said Wikipedia should take it seriously and rip off the bandaid as quickly as possible, a few people didn’t believe me when I said there was no reason to trust the content anymore once archive[.]today decided malicious activity using their visitors’ traffic was okay. The owner’s ethics (or lack thereof) showed that nothing stopped them from maliciously altering the content either, making any reason to hang on to the archive site null and void.
To those people doubting my perspective: Called it.


When Trump threatened tariffs I went ahead and bought 50 TB of storage. With that expansion it would easily last me until the end of Trump’s term, and maybe a decade if I rationed.
Turns out that was one of my best judgement calls to date, just not for the reason I thought.
We’re currently using waste heat in Denmark from data centres and other industry, including CHP (Combined Heat & Power) plants, in district heating. Heat up the water and pump it down pipes to people’s homes.
My home, for example, is heated primarily by a cement factory, a CHP plant, and a waste incinerator.


at least unless you look at the actual sources submitted
You can’t check the source for information that’s been entirely omitted. In any case, never assume Wikipedia provides the full story, or even a condensed and accurate one. What has been mentioned might be correct, but the devil is in what’s been left out.


Fun fact: Google was supposed to be named Googol, but the guy who was tasked with registering the domain name misspelled it. As history tells, they just decided to stick with Google.


“As quickly as possible” pulls a lot of weight in my statement. Just like when the EU is trying to cut our dependence on US payment providers, Wikipedia can’t do it overnight. The best time to plant a tree was 10 years ago; the next best time is right now.
Cutting ties with archive[.]today takes a long time, but the longer the decision to cut takes, the longer until the ties are actually cut. It’s all about “make haste slowly”, i.e. do a lot of planning on how to actually cut the ties with minimal impact, so you can do it when forced to (for example, if the FBI were to seize the servers one day) or when you decide that independence from archive[.]today is more valuable than the remaining impact of cutting the dependence. This could take half a year, a year, or more.
But indecision will at some point put you in a worse position: you are funneling your traffic to a malicious website that actively participates in DDoS attacks by using its users’ traffic (including traffic coming from Wikipedia) to carry out the attacks. Indecision can open you up to serious litigation and reputational damage by proximity. Given that archive[.]today crossed the line into malicious activity by misusing its traffic, what’s to stop it from malicious activity by misusing its content? IMO, even if you think the integrity of your content and its sources is too valuable (and trust me, I think it’s very valuable), you need to treat this as a warning sign and realise that nothing is stopping archive[.]today from losing the editorial integrity you rely on.
So my suggestion: brainstorm ideas that would make you independent. Make agreements with IA to improve retention, roll your own archiver, or make a deal with news orgs to show their articles as citations (I actually like this last one more the more I think about it; a good negotiator can frame it as advertising for the news org, and at the same time you won’t infringe on copyright the way archive[.]today does). If you wait until the point of no return, the choice has already been made for you whether you like it or not. And the worst part is that you’d scramble to find a solution instead of the best solution.


I don’t really see it as a complicated issue. Archive[.]today is now an unreliable source that uses its user traffic to engage in malicious activities. By using it, Wikipedia will become unreliable by proxy.
The best course of action is to distance yourself from it as quickly as possible.


So just like M365 Copilot (actual name for the office app).


I’ve still got a 4670K in my server. I’d thought of upgrading in Q1. I can forget all about that now… Unfortunately my mobo is slowly dying, so there’s a limit to how long I can push it.


Oh, really? Can you tell me more about that?


100% of LinkedIn
Yeah, we didn’t have those… We had one guy on our team with prior experience working in SNow, and none on the implementers’ team. He called the chaos, and boy was he right. For reference, it was a company with 1,000 people that was supposed to get SNow because the parent company wanted it (20k people in total IIRC across all subsidiaries). No one stopped to think that an agile software house would require quite a lot of changes to the SNow setup to fit together with all the old-school waterfall people it was designed for.