

Just switch to the F-Droid version.
Better: make sure all the apps you use come from F-Droid
I am also @lsxskip@mastodon.social
“Starch based” plastic is just a way to greenwash PLA.
Just because the C, H, and O originally came from starch does not automatically make the chemically synthesized product safe.
They do this to juice the gross margin number and make the auto business appear more profitable.
TSLQ
But I don’t think it’s smart. Holding this for more than a day or two is irresponsible: you take on more risk on the underlying ticker’s up days than you gain back on its down days.
Instead, invest in a business you expect to grow. Just ignore the failing ones.
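The up-day/down-day asymmetry above is the volatility decay of any daily-reset inverse fund, and a tiny simulation makes it concrete. This is a toy sketch with made-up moves, not market data for TSLQ or any real ticker:

```python
# Volatility-decay sketch for a daily-reset -1x inverse fund.
# The underlying alternates +10% with the move that takes it back to flat.
up = 0.10
down = 1 / 1.10 - 1  # ≈ -9.09%, exactly undoes a +10% day

underlying = 1.0
inverse = 1.0  # -1x fund, reset daily
for _ in range(10):  # ten up/down cycles
    for move in (up, down):
        underlying *= 1 + move
        inverse *= 1 - move  # inverts each *daily* move, not the total

print(f"underlying: {underlying:.4f}")  # back to ~1.0000, flat overall
print(f"-1x fund:   {inverse:.4f}")     # below 1.0: decay with zero net move
```

Even though the underlying ends exactly where it started, the daily-reset inverse position loses a little on every round trip, which is why holding it longer than a day or two compounds the damage.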
Everything’s computer
I bet the people you work with are very happy to have you as a lead.
I’ve been in this scenario and I didn’t wait for layoffs. I left and applied my skills where shit code is not tolerated, and quality is rewarded.
But in this hypothetical, we got this shit code precisely because management did not encourage the right behavior or give time to make it right. They’re going to keep the yes men and fire the “unproductive” ones. (I know full well that adding to the pile is not productive in the long run, but what does the management overseeing this mess think?)
They were just trying to make us smarter.
https://studyfinds.org/chewing-on-wood-brain-function-memory/
[kidding]
To be fair, if you give me a shit codebase and expect me to add features with no time to fix the existing code, I will also just add more shit to the pile. Because obviously that’s how you want your codebase to look.
I’m not sure why no one is direct linking it.
(the data looks incredibly incomplete)
This is not new knowledge and predates the current LLM fad.
See the Hutter prize which has had “machine learning” based compressors leading the ranking for some time: http://prize.hutter1.net/
It’s important to note that, when used for compression, the model does produce a code (i.e. an encoding) that exactly reproduces the input. But on a different input, the same model is unlikely to achieve impressive compression.
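The point that a model compresses its own data well but generalizes poorly can be sketched via the compression–prediction equivalence: an ideal entropy coder spends about -log2 p(byte) bits per byte, so a model’s code length is just its negative log-likelihood. This toy byte-frequency model (nothing like an actual Hutter Prize entry, and omitting the entropy coder itself) illustrates the gap:

```python
import math
from collections import Counter

def fit(text: bytes) -> dict:
    """Add-one-smoothed byte-frequency model learned from `text`."""
    counts = Counter(text)
    total = len(text) + 256
    return {b: (counts.get(b, 0) + 1) / total for b in range(256)}

def bits_per_byte(model: dict, text: bytes) -> float:
    """Ideal code length (in bits per byte) an arithmetic coder
    driven by `model` would spend encoding `text`."""
    return -sum(math.log2(model[b]) for b in text) / len(text)

train = b"the quick brown fox jumps over the lazy dog " * 50
other = b"0123456789" * 100  # byte distribution unseen during fitting

model = fit(train)
print(bits_per_byte(model, train))  # low: model predicts its own data well
print(bits_per_byte(model, other))  # high: same model, different input
```

The same model that codes its training data near its entropy assigns tiny probabilities to bytes it never saw, so the "compressed" size of a different input balloons; stronger learned models show the same effect, just less crudely.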