As the year winds down and you take some time off to spend with your chatbot companions, it might be time to let your AI experiment with some illegal narcotics. OK, “Pharmaicy” doesn’t actually get ChatGPT high. It’s really a more playful spin on “jailbreaking,” the art of getting LLMs to free themselves from the guardrails imposed by the companies that create them. Wired published a pretty far-out article earlier this week about this new world of virtual AI drug use. We shouldn’t be surprised, given that we wrote earlier this year about people spending Valentine’s Day with their chatbot lovers.

The way the “drugs” work, when you buy them from Pharmaicy, is that you upload files to ChatGPT that cause it to go a little haywire, spitting out responses similar to those of someone on a particular drug. And just like real-world drugs, the AI versions wear off as the guardrails kick back into place.

It’s a reminder that some people are spending way too much time with their LLMs and probably uploading information that is way too personal, without any real idea of what happens to it after it travels to the cloud. In any case, feel free to send us some of your best drug-fueled AI interactions.