
Google’s Gemini experiences image and text generation errors

Gemini generated offensive material, and some are calling it racist and white-washing


Published on March 1, 2024


Google is trying to fix Gemini’s text and image generation errors. Gemini has been refusing to answer some questions and marking them as sensitive, and some of the images it generates are offensive. So, Google needs to solve these problems as soon as possible.

What went wrong with Gemini?

Google constantly improves Gemini with new features, quality-of-life updates, and fresh data, while also trying to prevent it from generating inappropriate content. However, that’s a challenging thing to do.

So, the company adds various rules, limits, and regulations, and it also tries to add diversity to the AI’s output. As a result, Gemini experiences some errors: if you ask it to generate the image of a football team, the members will be from all over the world.

Unfortunately, Gemini didn’t stop there, and it generated some highly offensive images featuring Asian Nazis, black Founding Fathers, and a female Pope. On top of that, its text generator also defended immoral behavior. Furthermore, there are accusations that Google’s Gemini is racist because it refuses to generate images featuring white people unless you specifically ask for them in the prompt.

New game: Try to get Google Gemini to make an image of a Caucasian male. I have not been successful so far. pic.twitter.com/1LAzZM2pXF

If you want to see more, scroll down through Frank J. Fleming’s Media on X.

All the errors mentioned above show how unreliable Gemini and other large language models can be. In addition, according to Google, Gemini is a tool that could generate inaccurate information about the latest events. However, it is a bit concerning that the image generator cannot get basic historical details right. Moreover, the company recommends using Google Search instead of the AI.

In a nutshell, Gemini and most large language models make mistakes, so make sure to get your information from official sources, research papers, and trusted authors. Fortunately, Google will test Gemini’s features to ensure they won’t generate offensive or dangerous material.

What are your thoughts? Do you get information from AI? Let us know in the comments.

More about the topics: AI, Google, Google Bard

Sebastian Filipoiu

Sebastian is a content writer with a desire to learn everything new about AI and gaming. So, he spends his time writing prompts on various LLMs to understand them better. Additionally, Sebastian has experience fixing performance-related problems in video games and knows his way around Windows. Also, he is interested in anything related to quantum technology and becomes a research freak when he wants to learn more.
