misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
68 comments · 442 upvotes · 13 downvotes
Syrus@lemmy.world · English · 9 points · 1 year ago
You would need to know the recipe to avoid making it by accident.
Echo Dot@feddit.uk · English · 5 points · 1 year ago
Especially considering it’s actually quite easy to make by accident.