News

ChatGPT prompts: a real, non-AI generated handbook for journalists

After being duped into paying $2 for a prompt handbook that was clearly generated with ChatGPT, Joe Amditis created his own comprehensive resource to help local journalists and newsrooms leverage generative AI and automation – and enhance their news production.

by Lucinda Jordaan lucinda.jordaan@wan-ifra.org | March 29, 2023

Joe Amditis specialises in local news ecosystem management and support, focusing on local news innovation, collaborative journalism, project management, “and the chronically online”.

He is the associate director of products and events at the Center for Cooperative Media, a grant-funded program of the School of Communication and Media at Montclair State University.

He took time out to answer 5 questions on the release of his comprehensive Beginner’s prompt handbook: ChatGPT for local news publishers.

How long have you been using ChatGPT; how often do you use it now, and how has it impacted your daily/working life?
I’ve been using it since it first came out in late 2022, but I’ve been paying attention to, and playing around with, GPT-3 and other LLM and AI tools for about a year now.

What concerns you most about it now – apart from the ‘hustle-culture grifters’ you mention in your blog post?
Big picture: I’m worried about how it might change our (read: my) relationship with text from a writer perspective to more of an editor-only perspective. Nerdwriter1 has a great video essay about these concerns.

I’m also worried about a rapid increase in pink-slime journalism* that becomes even harder to spot and combat.

Finally, I’m worried about the impact of LLMs (large language models) that have been fine-tuned by right-wing activists and neo-Nazis, making it even easier to harass and intimidate marginalised people at much larger scale and frequency.

What advice regarding its usage would you most want to convey to journalists, editors?

First, last, and always: transparency, honesty, and authenticity. ChatGPT and AI tools like it are just that: tools. Treat them as such and remember that tools are only as good as the people who use them.

What are your thoughts on the ethical considerations (bias, accuracy, copyright, plagiarism, etc) of using ChatGPT and how newsrooms can counter this?
Again: Transparency, honesty, and authenticity are key here, as with any new technology or industry development. Don’t wait until after you publish an article to involve the community in the process. Bring them along with you and they may even help you figure out better ways to use these tools and incorporate them into your journalism.

A survey in UK newspapers this weekend said 50% of journalism jobs would be affected by AI. Do you agree – and how do you balance hope, opportunity and reality?
I would have to agree that journalism jobs and journalism as an industry will be affected by AI, but the ways in which those jobs are affected are much more tied to the relationship between journalism and capitalism than they are to the introduction of a new tool.

This happens every time there is a significant advancement in technology. Instead of using these tools as a way to elevate and improve journalism, the companies and executives in charge of these decisions inevitably use them to improve profit margins and the bottom line, often at the expense of the public and the newsroom employees.

At the same time, when these advancements and innovations are accessible to small newsrooms and freelancers, there is at least an opportunity for them to be used to reduce barriers to entry and allow smaller operations to punch above their previous weight.

Whether or not this becomes the case with AI will largely depend on how we as journalists and workers choose to respond to these advancements. Will we work together and collaborate to protect our profession and the public we serve, or will we allow large monied interests to squeeze even more surplus value out of an already squeezed and struggling sector?

I’ve also written about some of the untapped potential of LLMs for newsrooms here, and about the dangers of right-wing chatbots here.

* The term “pink-slime journalism” was coined by journalist Ryan Smith in 2012 to describe the practice in which news outlets publish poor-quality reports that appear to be local news, often to push left-wing or right-wing agendas and to gather user data. The reports are either computer-generated or written by poorly paid outsourced writers, often under fake names.

Download the Beginner’s prompt handbook: ChatGPT for local news publishers
