return2ozma@lemmy.world to Technology@lemmy.world · English · 3 days ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
Echo Dot@feddit.uk · 2 days ago
Or literally just buy some fertiliser. We've all seen what happens when some ammonium nitrate catches fire; if you have enough of it in one place, it's practically a nuclear-bomb-level detonation.
MeThisGuy@feddit.nl · 1 day ago
like this guy? https://wikipedia.org/wiki/Oklahoma_City_bombing