AI coding assistant refuses to write code, tells user to learn programming instead (arstechnica.com)
Posted by LeoM to Linux and Tech News@lemmy.linuxuserspace.show • English • 12 days ago
Cross-posted to: nottheonion@lemmy.world, technology@lemmy.world
@regrub@lemmy.world • English • 12 days ago
I wonder if the grandma prompt exploit or something similar would get it to work as intended lol
https://www.artisana.ai/articles/users-unleash-grandma-jailbreak-on-chatgpt