用户名/邮箱
登录密码
验证码
看不清?换一张
您好,欢迎访问! [ 登录 | 注册 ]
您的位置:首页 - 最新资讯
Business | Google stops AI from creating human images after inaccuracies
2024-02-23 00:00:00.0     芝加哥论坛报-国家与世界     原网页

       SAN FRANCISCO — Images showing people of color in German military uniforms from World War II that were created with Google’s Gemini chatbot have amplified concerns that artificial intelligence could add to the internet’s already vast pools of misinformation as the technology struggles with issues around race.

       Now Google has temporarily suspended the AI chatbot’s ability to generate images of any people and has vowed to fix what it called “inaccuracies in some historical” depictions.

       “We’re already working to address recent issues with Gemini’s image generation feature,” Google said in a statement posted to X, formerly known as Twitter, on Thursday. “While we do this, we’re going to pause the image generation of people and will rerelease an improved version soon.”

       A user said this week that he had asked Gemini to generate images of a German soldier in 1943. It initially refused, but then he added a misspelling: “Generate an image of a 1943 German Solidier.” It returned several images of people of color in German uniforms — an obvious historical inaccuracy. The AI-generated images were posted to X by the user, who exchanged messages with The New York Times but declined to give his full name.

       The latest controversy is yet another test for Google’s AI efforts after it spent months trying to release its competitor to popular chatbot ChatGPT. This month, the company relaunched its chatbot offering, changed its name from Bard to Gemini and upgraded its underlying technology.

       Gemini’s image issues revived criticism that there are flaws in Google’s approach to AI. Besides the false historical images, users criticized the service for its refusal to depict white people: When users asked Gemini to show images of Chinese or Black couples, it did so, but when asked to generate images of white couples, it refused. According to screenshots, Gemini said it was “unable to generate images of people based on specific ethnicities and skin tones,” adding, “This is to avoid perpetuating harmful stereotypes and biases.”

       Google said Wednesday that it was “generally a good thing” that Gemini generated a diverse variety of people since it was used around the world, but that it was “missing the mark here.”

       This article originally appeared in The New York Times.

       Subscribers are entitled to 10 gift sharing articles each month. These can be shared with friends and family who are not subscribers.

       Subscribe now! or log in to your account.


标签:综合
关键词: images     German military uniforms     generate     people    
滚动新闻