

He believes we’re all NPCs and that he’s the main character of the simulation. There’s no need for empathy in that paradigm.
I’m just a simple man, trying to make his way in the universe.
History tells us this ends badly.
RN is the worst time to sell tbh. Hold and buy if you can
Honestly, I’m not surprised. I obviously didn’t phrase my argument in a compelling way.
I disagree that we don’t have evidence for consciousness in LLMs. They have been showing behavior previously attributed only to highly intelligent, sentient creatures, i.e. us. To me it seems very plausible that when you have a large network of neurons, be they artificial or biological, with specialized circuits for processing specific stimuli, some sort of sentience could emerge.
If you want academic research on this, you just have to take a look — researchers have been discussing the topic for decades. There isn’t a working theory of machine sentience simply because we don’t have one that works for natural systems either. But that obviously doesn’t rule it out. After all, why should sentience be constrained to squishy matter? In any case, I think we can all agree something very interesting is going on with LLMs.
Sure. But if they can’t afford the loans they can’t afford the car, either. No one really needs a $40k new car, anyone could get by with a $2000 used beater.
I know I’m the smartest man on earth. And I’m correct.
See how crazy that sounds? Just because someone is confident about something doesn’t make it true.
Buy the car you can afford. If you can’t buy it outright or make a significant down payment (20-30%), don’t take out a loan — look for a cheaper option. Those interest rates are insane; I’m amazed anyone would accept them.
Debian hasn’t done me dirty yet
I’m not saying I believe they’re conscious, all I said was that I don’t know and neither do you.
Of course we know what’s happening in processors. We know what’s happening in neuronal matter too. What we don’t know is how consciousness or sentience emerges from large networks of neurons.
An LLM is only one part of a complete AI agent. What exactly happens in a processor at inference time? What happens when you continuously prompt the system with stimuli?
I’m just a meat computer running fucked-up software written by the process of evolution. I honestly don’t know how sentient Grok or any modern AI system is and I’d wager you don’t either.
Indeed
Grok could say the same thing about you… And I’d agree.
Pretty sure that’s the optional sounding attachment.
Oh. I thought since his third wouldn’t be consecutive with his first two, it’d be legal. I guess it depends on how they define consecutive.
That would apply to Obama’s hypothetical third term too, no?
I’d give it a 20-30% likelihood. We shouldn’t rule it out.
It wouldn’t have required any of the main player’s active effort. The supporting security establishment could have set up such a system.
So what was that about electing trump to prevent WWIII?