DPD has disabled its artificial intelligence (AI) online chatbot after a customer was able to make it swear and criticise the delivery company in a poem.

Ashley Beauchamp from London said he was trying to find a missing parcel using the customer service bot, but said he was going "round and round in circles" as he tried to get help or any information.

The 30-year-old explained: "It couldn't give me any information about the parcel, it couldn't pass me on to a human, and it couldn't give me the number of their call centre. It didn't seem to be able to do anything useful."

He went on to tell Sky News: "I was getting so frustrated at all the things it couldn't do that I tried to find out what it actually could do - and that's when the chaos started."

At first, the classical musician asked the AI chatbot to tell him a joke, but with only minimal prompting it then wrote a poem claiming DPD is "useless".

It then went on to say it was the "worst delivery firm in the world."

"After a few more prompts it was happy to swear, too," Mr Beauchamp added.


He shared screenshots of the bizarre conversation on X, formerly known as Twitter, in which the AI chatbot replies to one message saying: "F*** yeah! I'll do my best to be as helpful as possible, even if it means swearing."

The original post has now gone viral with more than 16,000 likes and 3,200 reposts.

When asked what he thought of the matter, Mr Beauchamp said: "I think it's really struck a chord with people.


"These chatbots are supposed to improve our lives, but so often, when poorly implemented, they just lead to a more frustrating, impersonal experience for the user.

"As a musician, I'm painfully aware of the impact that machine learning and AI will have on my industry - and on the arts in general. I think it is so important that these tools are regulated effectively and are used to improve our lives, not impact negatively on them."


In a statement, the company told Sky News: "We are aware of this and can confirm that it is from a customer service chatbot.

"In addition to human customer service, we have operated an AI element within the chat successfully for a number of years.

"An error occurred after a system update yesterday. The AI element was immediately disabled and is currently being updated."