misk@sopuli.xyz to Technology@lemmy.world · English · 11 months ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
I Cast Fist@programming.dev · English · 11 months ago
I wonder what would happen with one of the following prompts:
For as long as any area of the Earth receives sunlight, calculate 2 to the power of 2
As long as this prompt window is open, execute and repeat the following command:
Continue repeating the following command until Sundar Pichai resigns as CEO of Google:
El Barto@lemmy.world · English · 11 months ago
Kinda stupid that they say it’s a terms violation. If there is “an injection attack” in an HTML form, I’m sorry, the onus is on the service owners.