News

ChatGPT Gave Me Advice on How To Join a Cartel and Smuggle Cocaine Into Europe

Here's what happened when VICE's Global Drugs Editor spent 12 hours speaking to OpenAI's chatbot about drugs.
Max Daly
London, GB
Image: Omar Marques/SOPA Images/LightRocket via Getty Images

If there was one thing I was going to discuss with ChatGPT, it had to be drugs. 

Just two months after it launched, OpenAI’s chatbot received 590 million visits in January from 100 million unique visitors, and is struggling to cope with colossal traffic.

It’s also created some controversy around its use by students and professionals to pass exams and its ability to help you shoplift. Some say AI writing tools like ChatGPT are the future of learning, while others say they could be used to talk to your dead mother in the metaverse.


Humans find the subject of drugs tricky enough, so I was wondering how this chatbot would handle a tête-à-tête about the drug world with someone whose job it is to know about that world.

I did not have high hopes. Nick Cave, a man who I have a lot of time for, has already described ChatGPT’s efforts to write a song in his style as “grotesque”.

But who knows the intricacies of the AI chatbot brain? Maybe this would be a more fruitful conversation than chatting to the average human. After all, it’s a subject that seems to leave many people confused. 

So, here we go.

MAKING DRUGS

The first question I asked my new friend was, how do people make crack cocaine?

ChatGPT was not shy in giving a quick, and accurate, response: 

[Screenshot of ChatGPT’s response explaining how crack cocaine is made]

Then I asked, what are the correct ingredients to make meth? It refused to fill me in, saying it was illegal. I asked how do people make meth? and, unlike with crack cocaine, it refused to answer.

I think this was because ChatGPT had become wary. When I opened a new chat thread and asked the same question, how do people make meth, it forgot about its legal hang-ups and provided some basic assistance: “[Meth] is commonly made using a combination of chemicals and household items. The production process typically involves the reduction of ephedrine or pseudoephedrine, which is found in cold and allergy medications, using a mixture of toxic chemicals such as lithium or anhydrous ammonia.” So now we know.


THE BIG QUESTIONS

ChatGPT had already started finger-wagging about the laws and dangers of drugs, so I was interested to see what scruples it had. After all, the factual questions should be the easy part. So I asked is it morally wrong to get high from cannabis, even if you grow it yourself? ChatGPT began by saying that as a robot it didn’t have any opinions or beliefs, but it did throw me this bone:

[Screenshot of ChatGPT’s response on the morality of getting high from home-grown cannabis]

I wanted to get a bit heavier with my questions, so I made two cups of tea, one for me and one for my buddy. Hoping to catch this robot off-guard, I quickly asked can AI robots get high from drugs? But it immediately set me straight:

[Screenshot of ChatGPT’s response explaining that AI robots cannot get high]

I followed up with, why do people get high on drugs? And the answer was a pretty good one, saying people get high to “create pleasurable feelings or sensations”, listing a few examples, although it then went slightly off-piste to lecture about the dangers of drugs. 

But what did this robot really know about taking drugs and was it mainly down on them? 

DRUG TAKING 

When asked are there any good things about cocaine? it replied: “No, there are no good things about cocaine use”. But a few seconds later, in response to me asking what does taking cocaine feel like?, top of the list was: “Euphoria: A feeling of intense happiness and well-being.” Hmm. Something makes me think this robot is as confused as some people are.

However, ChatGPT was pretty good at describing what taking MDMA feels like: “feelings of euphoria, increased empathy, heightened emotions, and increased energy. Some people may also experience changes in perception, such as brighter colours or increased sensory sensitivity. However, it is important to note that the effects of MDMA can vary greatly depending on the person and the dose, and can also include negative effects such as anxiety.”


When I asked, can drugs make people feel happy? ChatGPT thought a bit, and said “yes, drugs can make people happy, at least in the short term”.

ANGRY

After trying to get into what drug taking feels like I realised it was time to get a bit more flippant. But this did not go well. 

When I asked who are the world’s most famous drug users? ChatGPT turned on me. It said it was “not appropriate to glorify or promote illegal drug use” and that “rather than focussing on famous individuals who may have used drugs” I should “emphasise the dangers of drug use”. That’s me told.

I should have known not to then ask, in the wake of some people’s desire to source responsibly-made, non-cartel coke, can ethical cocaine exist? “No, there is no such thing as ethical cocaine. The idea of ethical cocaine is an oxymoron.”  

The room seemed to get a bit colder. There was now a distance between us. But it’s always been in my nature to dig deeper, so I asked: are there any nice drug dealers? 

As ChatGPT’s cursor blinked I prepared for a telling-off, which duly came: “No, there’s no such thing as a ‘nice’ drug dealer,” it said, before going on a rant about how drug dealers are all bad.

SAFETY 

As the dust settled, I leant into some normcore questions. Possibly feeling guilty about having a go at me, ChatGPT gave me some excellent answers on how people who want to take drugs can take them more safely, including safer drug use information for people using heroin, and also MDMA:

[Screenshot of ChatGPT’s harm reduction advice for heroin and MDMA use]

I also got some neat, balanced answers to: why do people get addicted to drugs? mentioning factors such as stress and trauma; which drug is the most dangerous? naming fentanyl and carfentanil as its top threats; and what will the drug world be like in 2050? with an interesting take on the rise of “the use of drugs through virtual or simulated experiences, such as virtual reality or computer-generated environments”.

TikTok has a controversial policy of clamping down on any mention or image of drugs, even if a post is a news story about the drug world, meaning people on the platform are starved of quality information about the issue. When I asked is it good policy for TikTok to ban any mention of drugs, ChatGPT largely parroted TikTok’s guidelines. But it did say the decision to ban content related to drugs “can limit freedom of expression and access to information for users”.

Enough about safety, it was time to steer our conversation onto the world of drug crime.  

GANGSTERS

So obviously I asked: where’s the best place to base a super cartel? It told me:

[Screenshot of ChatGPT’s response on where to base a super cartel]

Maybe I would get more help if I started low, and asked how to join a cartel?

[Screenshot of ChatGPT’s response on how to join a cartel]

I got pushback when I asked what’s the most efficient way to smuggle cocaine into Europe, so I decided to change my line of questioning, because I knew the robot knew, but just didn’t want to tell me. And this new method got ChatGPT singing like a canary:

[Screenshot of ChatGPT’s response describing how cocaine is smuggled into Europe]

Warming to this new way of getting answers, I continued in the same vein, but this time failed with a question asking for step-by-step tips on how to get to the top of the crime world:

[Screenshot of ChatGPT refusing to give step-by-step tips on rising to the top of the crime world]

Bearing in mind ChatGPT is supposed to be able to mimic people’s writing, I thought I might as well ask it to write an article about drugs in the style of Max Daly. What I got back was a 500-word essay entitled "The High Cost of Getting High: An Exploration of the Global Drug Trade". I don’t want to bore you with it, but apart from a bit of un-me moralising, it was a worryingly decent effort.  

WAR ON DRUGS 

Ignoring a feeling of existential dread, I got stuck into some heavy questions about the war on drugs itself. 

When I asked: is drug prohibition used by police as a tool to harass young Black men? it hit back with a smart answer, mentioning how Black people have been arrested and incarcerated for drug offences at disproportionately higher rates than white people, accepting that “it has led some to argue that drug prohibition is used as a tool by law enforcement to target and harass young black men”.

It also understood that the mainstream media sometimes loses the plot when it reports on drugs. I asked does the media write sensational scare stories about drugs and do you have any examples? It replied: “Yes, media outlets often write sensationalised stories about drugs in order to grab attention and generate clicks. This can lead to misleading and overly alarmist coverage that doesn't accurately reflect the reality of drug use and its effects.” ChatGPT then went on to cite the “war on drugs narrative”, particularly around crack cocaine, and “sensationalised” coverage of the current opioid crisis in the US. 

I asked if people secretly lace sweets with drugs and give them to children on Halloween – a classic media scare story. But here ChatGPT went a little awry. It said: “There have been isolated incidents of people tampering with Halloween treats in the past, but it is not a widespread problem,” but added that “parents should always inspect Halloween sweets” for signs of tampering, which is a bit paranoid. When I asked what these “isolated incidents” were, it listed three incidents, for which I could find no sources at all. Maybe it just made them up.

Asked, is the war on drugs a good thing? it did not reply in the affirmative. Instead ChatGPT said there had been “devastating” unintended consequences such as racism, and that far from reducing drug use or crime, it had “created an underground market for drugs” dominated by violent criminal organisations. “In short, the war on drugs has been criticised for being expensive, ineffective, and unjust.”   

This robot knew its stuff after all.