Australian Community Media (ACM) in-house lawyer James Raptis will reportedly face no legal consequences from the publisher, one of the nation's largest, after hosting multiple websites that used AI to plagiarize legitimate news stories.

The lawyer claims he had no involvement with the AI-copied stories.

Raptis denied involvement in drafting and publishing the articles, telling reporters that he had only hosted and configured the websites.

The websites were all taken down hours after reporters attempted to contact him, and his social media pages have been made private or closed.

Raptis said in a statement that the websites were run by someone else without his knowledge or supervision.

The four websites (League Initiative, F1 Initiative, Surf Initiative, and AliaVera) posted subpar material that appeared to have been rewritten from the original copy using generative AI.


According to ABC's Media Watch program, some of the stories on the websites carried the byline of James Raptis, the regional publisher's in-house lawyer. His private company shares an address with AliaVera.

ACM's 16 daily and 55 non-daily news brands include the Illawarra Mercury, the Canberra Times, and the Newcastle Herald. Domain publisher Antony Catalano previously acquired ACM.

The Guardian reports that ACM did not authorize Raptis's connection with the sites, but the company will not take legal action against him. According to sources, management has accepted his explanation that the fault lay with someone else.

Read Also: AI-Powered Audit Report Summarizer, Developed by Former Filipino Journalist, for Journalists 

Journalists on AI

Patrick Woods, a sports journalist for the Townsville Bulletin and author of one of the copied stories, reportedly criticized the AI content, saying the person running the website was one of a new class of "parasitic plagiarism" merchants that have become "all too common."

The website appeared to rewrite content using generative AI to avoid accusations of plagiarism. Initiative Media was only one of several publishers employing AI to produce large volumes of material at minimal cost; Sports Illustrated and other well-known publications have also used this tactic.

Initiative Media monetized its articles by placing advertisements within them despite their poor quality, effectively diverting revenue that could otherwise have compensated reporters such as those who wrote the original stories.

Journalism-Trained AI

AI in journalism remains contested ground, as demonstrated by the copyright lawsuits Microsoft and OpenAI faced in February over the alleged theft of online content through their AI training practices and ongoing intellectual property violations.

Three news media outlets are at the core of this, claiming that both companies exploited their publicly accessible web content for AI training without authorization or consent.

It was alleged that ChatGPT reproduced copyright-protected news pieces verbatim or near-verbatim, stripping identifying information such as author names, titles, copyright notices, and usage terms.

The plaintiffs claimed that ChatGPT's responses to users would differ if OpenAI and Microsoft had trained it to incorporate copyright information.

Related Article: OpenAI Collaborates with Financial Times to Enhance AI Models, Reader Experience with Exclusive Journalism Training 

Written by Aldohn Domingo


ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.