What can we expect from AI and Chatbots in the next few years?

March 31, 2023

On March 15, Newswise hosted an expert panel on how artificial intelligence and chatbots are changing the landscape of journalism and the transfer of knowledge (watch the video and read the transcript here). Panelists included Sercan Ozcan, Associate Professor of Innovation & Technology Management at the University of Portsmouth; Jim Samuel, Associate Professor of Public Informatics at Rutgers University-New Brunswick; and Alan Dennis, Professor of Information Systems at Indiana University.

We learned that AI tools offer exciting possibilities for science writers and communicators. How awesome is it to have a program summarize a study you might be struggling to put into coherent words, with just a few carefully worded commands? ChatGPT can help with some of the routine work of a journalist and science communicator: searching for information, gathering it, and possibly even turning it into a first draft. But with that benefit come significant challenges. The biggest challenge is sussing out the bullshit (bullshit is a technical term, according to our panelist Alan Dennis. Honest!). Artificial intelligence in the form of large language models (LLMs) such as ChatGPT gives you information that looks very realistic, as if a real person wrote it. But this is an illusion. Sercan Ozcan refers to the deceptive output of chatbots such as ChatGPT as “hallucinations.”

The creation of misinformation is what worries Alan Dennis the most. “Deep fakes (artificial videos of real people) and other tools like it is going to change everything, particularly for journalism because we’ve created digital puppets of several different celebrities and I can make them say anything that I want them to say.”

Deep fakes are one thing, but what about the dangers of media relying on AI to generate news content? Panelist Jim Samuel says that “we need to treat AIs as some kind of very smart, but inexperienced and probably not very, not comprehensively knowledgeable teenager.” The output that AI produces requires supervision. Samuel says that we have a responsibility [as educators, media, and science communicators] to educate the public so they can develop internal mechanisms to deal with misinformation.

Newswise, March 31, 2023
