mozz@mbin.grits.dev to Technology@beehaw.org · 7 months agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devimagemessage-square10fedilinkarrow-up12arrow-down10file-text
arrow-up12arrow-down1imageSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz@mbin.grits.dev to Technology@beehaw.org · 7 months agomessage-square10fedilinkfile-text
minus-squareGaywallet (they/it)@beehaw.orglinkfedilinkarrow-up1·7 months agoIt’s hilariously easy to get these AI tools to reveal their prompts There was a fun paper about this some months ago which also goes into some of the potential attack vectors (injection risks).
It’s hilariously easy to get these AI tools to reveal their prompts
There was a fun paper about this some months ago which also goes into some of the potential attack vectors (injection risks).