Note: The following article was written as part of the 2024 Washington Journalism Education Association State Contest. Students listened to and asked questions during a live keynote panel presentation on the use of AI in journalism, then had 70 minutes to write an article about the presentation.
This article was awarded "Honorable Mention" in the Editorial Writing category.
Artificial intelligence is evolving continuously, reshaping daily life and the prospects of future generations. As society encounters AI in the classroom and the workplace, there is an obligation to consider its impacts and to implement regulations that inform users of ethical and moral boundaries and of the risks of inaccuracy. Within journalism, regulation and user education are vital to preserving critical thinking and preventing an eruption of misinformation.
Generative AI inevitably increases the amount of bias and misinformation in the news. These systems work by interpreting information from across the internet and producing the response they predict the prompter will approve of. AI forms a consensus from sources that may themselves carry biases into the information it is "learning" from. Beyond repeating misinformation, AI has been known to fabricate dialogue and other "facts." This blinds readers to reality and creates a massive problem of fake news.
Various age groups are being exposed to AI and are grappling with its ethical implications, especially within journalism. Journalists may intend to use AI for harmless tasks, only for that reliance to fester into diminished critical thinking and unfulfillment in the workplace. At the Washington Journalism Education Association panel, Professor Brett Atwood passionately stated, "If you turn to ChatGPT as a crutch not to think more deeply… that stops your critical thinking." Curiosity and critical thinking are essential to the growth and development of journalists and readers alike. Writers have their own styles that bring diversity to our news, and AI cannot substitute for individual creativity.
Regulations must be implemented to ensure accurate reporting. Left unchecked, AI can multiply the biased and skewed information that infects the media people consume. Journalists must learn to use AI responsibly and meaningfully, and newsroom leaders and staff must be trained on the guidelines such a risky tool demands and held accountable for following them.
Journalists must remain transparent and accountable while using AI. News is meant to report on events that shape readers' reality, and it must remain accurate and authentic. AI can be a healthy tool when used alongside journalists' creativity and accuracy rather than as a substitute for them.