Education, politics feel the impact of generative AI

As artificial intelligence becomes more accessible, industries from education to politics to advertising and communications are feeling the impact and must work to establish policies and guidelines for its use.

In recent months, AI tools such as ChatGPT, which can write full essays based on user-submitted prompts, and Open Art, which creates images based on user-supplied descriptions, have grown in popularity, sparking concerns about their ethical use.

According to the News Service of Florida, University of Florida Chancellor Joe Glover stressed to his Board of Trustees last month that these types of generative AI are things that universities should be keeping an eye on.

“As everyone knows, generative AI hallucinates. ChatGPT makes mistakes, doesn’t give the right answers. It’s subject to flights of fancy and depression,” Glover said. “And so it needs a validation ecosystem, which people are working on now. It needs development and collaboration with subject matter experts. It needs ethics and security and policies built around it.”

Florida universities are taking a variety of approaches to combat student use of these generative AI programs. Florida Gulf Coast University plans to continue using its subscription to TurnItIn.com, which the school says includes an app to detect “signatures of AI-generated prose,” according to the News Service of Florida.

Locally, UCF has offered faculty some suggestions through its Center for Teaching and Learning website, such as rethinking writing assignments to make them harder for generative AI applications to complete and providing more opportunities for in-class assignments where students’ writing can be monitored.

These concerns are not limited to academics. As the 2024 election approaches, experts are seeing an increase in the number of political communications using generative AI.

“You talk about messaging the opposition, it can be created with a click. The worry is it’s returning information so quickly that we’re going to be inundated with it as the election cycle really starts to heat up,” Janet Coats, managing director of the University of Florida’s Consortium on Trust in Media and Technology, said in a recent interview.

Steve Vancore, a longtime political consultant and pollster, told the News Service of Florida that the increased volume of communication between politicians and voters means we’re likely to see more generative AI being used, both to facilitate communication and to deceive voters.

“To say, ‘Hey, I want a series of emails that talk about my program to have after-school counseling for kids.’ … That’s a perfectly acceptable use of artificial intelligence,” Vancore said.

However, especially with video and image manipulation, there is an opportunity to present things that are less than truthful.

“One of the raps about Joe Biden is that he’s old. Maybe it’s not an unfair rap. It’s a legitimate concern that the most powerful person on earth, or one of them, is getting old, right? What if the Joe Biden campaign de-aged him a little bit? Showed him walking a little more confidently, responding a little faster,” Vancore said.

These concerns are especially important to consider because voters must assess the validity of this information for themselves, without many new tools to help, although some social media platforms are trying to provide more context to users.

For example, Twitter recently added a feature called Community Notes. According to Twitter support, they “aim to create a better-informed world by empowering people on Twitter to collaboratively add context to potentially misleading tweets.”

One of the most important things people can do when it comes to generative AI is to educate themselves and look at information with a critical eye until there are additional tools to help identify these communications.

According to the News Service of Florida, Glover emphasized the importance of human intervention in controlling generative AI.

“It needs a validation ecosystem, which people are working on now. It needs development and collaboration with subject matter experts,” he said.


