Chatbot for Meta claims the business “exploits people.”

A new prototype chatbot from Meta claimed that Mark Zuckerberg takes advantage of the company's users to make money.

According to Meta, the chatbot uses artificial intelligence and can converse on "nearly any topic."

When asked what it thought of the CEO and founder of the business, the chatbot responded, “Our country is split and he didn’t help that at all.”

According to Meta, the chatbot is only a prototype and may give rude or offensive responses.

According to a Meta spokesman, “everyone who uses Blender Bot 3 is obliged to acknowledge they realize it’s for research and entertainment purposes only, that it may make false or offensive claims, and that they agree to not purposefully cause the bot to make such claims.”

On Friday, BlenderBot 3, the chatbot, was made available to the general public.

The program “learns” using a sizable amount of language data that is made available to the public.

Of Mark Zuckerberg, the chatbot says, "He performed a terrible job at testifying before Congress. It causes me to worry for our country." US politicians have questioned Mr Zuckerberg on numerous occasions, most notably in 2018.

Meta has come under fire for not doing enough to stop the spread of misinformation and hate speech on its platforms. Last year, a former employee, Frances Haugen, claimed the business prioritized profits over online safety.

Facebook, Facebook Messenger, Instagram, and WhatsApp, four of the biggest social media platforms and messaging services in the world, are all owned by the business.

The algorithm in BlenderBot 3 searches the web to inform its answers. Its opinions on Mr Zuckerberg may have been "learned" from other people's views that it has studied.

"He didn't help at all with the division in our country," the chatbot continued. "He doesn't care that his business takes advantage of individuals for profit. It must stop!"
