Introduction
With the rapid development of the 21st century, science and technology are being updated at a startling pace, and artificial intelligence is one of the most prominent products of this era. At a macro level, it is an emerging technology that uses big data as raw material for learning and computation, allowing intelligent machines to perform tasks once reserved for humans (Marconi, 2020). Its emergence has had a profound impact on the world, and its rapid integration into various disciplines and social fields has inevitably brought new problems and challenges even as it plays an important role. Ray Kurzweil, a director of engineering at Google, has predicted that within the next two decades machines may reach a singularity through self-learning and refinement, that artificial intelligence will fundamentally reshape human life, and that in the near future we will have to learn to coexist with machines. It is therefore important to confront the problems of AI in a controlled manner, to think about effective solutions, to abandon blind arrogance as the inventors of technology, and to refuse to become the slaves and subordinates of machines. AI technology has already intervened deeply in the process of information dissemination, profoundly changing the media industry, extending the role of the journalist in more intelligent ways, and bringing intelligent robots within the scope of journalism. The new media ecology therefore faces new ethical challenges at several levels. This article focuses on the application of AI in journalism, covering information gathering, copy writing, and news distribution, in order to demonstrate objectively both the advantages and the ethical loopholes of AI and to propose feasible solutions based on previous scholarship and my own observations, so that technology genuinely serves human purposes rather than leading us by the nose and stripping us of the initiative.
Intelligent Information Gathering: The Bias that Lies Within
The collection of news clues and information is the starting point of news writing. In the traditional process, gathering is done by professional journalists who rely on news sense and personal experience, which narrows the channels through which clues arrive, invites subjective bias, and depends heavily on the individual journalist's competence; it is therefore somewhat irrational and unstable. The intervention of artificial intelligence in news gathering relieves human labor and enriches the sources of information. Drones can enter places that journalists cannot explore in depth, such as natural disaster sites, obtain data from an overhead, more macroscopic perspective, capture the full picture of an event at the first moment, and transmit live images and video in real time, an advantage that human reporters can hardly match. As McLuhan observed, media are extensions of the human body; sensors applied to news gathering greatly extend the human senses, collecting more objective and credible data and reaching information beyond the range of human perception. It is undeniable that artificial intelligence has made significant contributions to information gathering, but it has also generated problems, the most representative of which is the bias and discrimination hidden behind seemingly objective and impartial technology.
BuzzFeed, a U.S. news and entertainment aggregator, has in recent years moved toward serious reporting and entered the ranks of mainstream media. The company developed a chatbot called BuzzBot, which gathers useful news clues from users' feedback by chatting with them and asking questions about the top stories of a given period, such as "Were you there when the news happened?", "Can you provide us with more information about it?", and "What do you think about the news?" Such chatbots collect data very widely, and because the barrier to entry on the Internet is low and regulation is weak, the users who provide data come from different regions and classes, and the information they give the bot is likely to be emotionally and subjectively inclined, so the material the bot collects is not objective and rational. Audiences, lacking adequate awareness, see the bot as a representative of objectivity and impartiality and absorb the resulting news indiscriminately, so the objectivity and neutrality at the core of news values are violated (Leppänen, Tuulonen, & Sirén-Heikel, 2020).
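To make the problem concrete, the sketch below shows how a BuzzBot-style tip-collection flow might work: the bot asks scripted questions and stores whatever the user types, with no check on accuracy or emotional slant. The question texts are taken from the examples above, but the function names and flow are illustrative assumptions, not BuzzFeed's actual implementation.

```python
# Hypothetical sketch of a BuzzBot-style tip-collection flow (not BuzzFeed's
# actual implementation): scripted questions, raw answers, no verification.

QUESTIONS = [
    "Were you there when the news happened?",
    "Can you provide us with more information about it?",
    "What do you think about the news?",
]

def collect_tip(user_id: str, answer_fn) -> dict:
    """Ask each scripted question and record the raw, unverified answers."""
    tip = {"user_id": user_id, "answers": {}}
    for question in QUESTIONS:
        tip["answers"][question] = answer_fn(question)  # free text, possibly subjective
    return tip

if __name__ == "__main__":
    # Simulated user whose answers are emotionally charged rather than factual.
    canned = iter([
        "Yes, it was terrifying!",
        "Everyone was panicking, it felt like chaos.",
        "I think the authorities completely failed us.",
    ])
    print(collect_tip("user_42", lambda q: next(canned)))
```

Nothing in such a pipeline distinguishes an eyewitness fact from an emotional judgment, which is precisely why the collected material cannot be treated as objective by default.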
Robot Writing Lacks Humanistic Thinking
With the deep integration of artificial intelligence into journalism and the arrival of the information explosion, traditional journalists can no longer meet audiences' growing demand for sheer volume of articles, and robots have therefore been brought into news writing. This frees professional journalists from the drudgery of routine interviewing and writing and allows them to focus on more demanding, more intellectual journalism, leading to a more rational allocation of resources (Torrijos, 2019). Media outlets around the world are experimenting with automated news writing, with the Associated Press, Reuters, and The Washington Post among the first practical pioneers. The Washington Post used robotic reporting at the 2018 PyeongChang Olympics to improve the accuracy and speed of event coverage; its robotic reporters could produce customized pieces on demand and perform text error checks quickly and reliably, with humans and machines working together to reshape the mechanism of news content production, as the sketch below illustrates.
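At its core, this kind of automated coverage is data-to-text generation: a structured result record is poured into a fixed sentence template. The following minimal sketch illustrates the principle; the field names, template wording, and sample data are illustrative assumptions, not The Washington Post's actual system or data.

```python
# Minimal sketch of template-based, data-to-text news generation in the spirit
# of automated Olympic coverage (wording and fields are illustrative only).

RESULT_TEMPLATE = (
    "{winner} of {country} won gold in the {event} at the PyeongChang Olympics "
    "on {date}, finishing with a result of {score}."
)

def write_result_story(result: dict) -> str:
    """Fill the template from a structured result record; no reporting involved."""
    return RESULT_TEMPLATE.format(**result)

if __name__ == "__main__":
    sample = {
        "winner": "A. Athlete",          # placeholder data, not a real result
        "country": "Country X",
        "event": "women's 500m speed skating",
        "date": "February 18, 2018",
        "score": "74.31 seconds",
    }
    print(write_result_story(sample))
```

The speed and consistency of such a pipeline are obvious, but so is its dependence on whatever structured feed supplies the fields.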
Although robotic writing has considerable advantages in quantity of output, it also has limitations that need to be addressed. First, robotic writing is driven by objective data, and reliance on a single, isolated data source leaves news stories without the background information and supporting material they need (Thurman, Dörr, & Kunert, 2017), producing reports that are monotonous, thinly argued, and unconvincing. Moreover, robots lack the human capacity for active thinking: they cannot trace data back to its source or dig deeper, and instead mechanically generate homogeneous reports from algorithms and templates, so data-driven synthetic news has no humanistic perspective and lacks vividness and emotional impact. Artificial intelligence first entered the news industry in 2014, when the Los Angeles Times' report on a magnitude 4.7 earthquake in Southern California was completed by the intelligent robot Quakebot within three minutes of the quake; despite its rapid response, the report lacked moral concern, offering only a cold factual description that showed no empathy for the victims of natural disaster and could hardly resonate with readers. Robot writers are therefore currently confined to a few genres such as finance and sports and cannot yet produce comprehensive, in-depth reporting. In addition, robotic content production has squeezed the living space of traditional journalists, weakened their leading role, and caused social problems including unemployment among journalists; people must therefore confront the ethical failures that the use of machines inevitably brings and think about solutions.
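The homogeneity criticized above follows directly from the template mechanism: the same fixed sentence frame applied to different alerts yields factually correct but near-identical, emotionless text. The sketch below makes this visible; it is an illustration in the spirit of Quakebot, not the Los Angeles Times' actual code, and the second alert is an invented example.

```python
# Illustrative sketch (not Quakebot's real code): one template, many events,
# every story reads the same apart from the numbers.

QUAKE_TEMPLATE = (
    "A shallow magnitude {magnitude} earthquake was reported {time} "
    "{distance} miles from {place}, according to the U.S. Geological Survey."
)

def write_quake_report(alert: dict) -> str:
    return QUAKE_TEMPLATE.format(**alert)

if __name__ == "__main__":
    alerts = [
        {"magnitude": 4.7, "time": "Monday morning", "distance": 5,
         "place": "Westwood, California"},
        {"magnitude": 3.2, "time": "Tuesday evening", "distance": 9,
         "place": "a town in Southern California"},   # hypothetical second event
    ]
    for alert in alerts:
        print(write_quake_report(alert))
```

Whatever does not appear as a field in the alert, such as the experience of the people affected, simply cannot appear in the story.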
Algorithmic Distribution: Audiences in the Information Cocoon
Applying the algorithms of artificial intelligence to news distribution makes it possible to build user portraits and locate user needs through cloud computing over the massive data generated by behavioral feedback, accurately predicting which kinds of content each audience member will be interested in. The audience is no longer ignored but elevated to unprecedented importance, and users' interests become the main starting point of distribution, so that information can be pushed precisely and a good communication effect achieved. Audience satisfaction with the pushed news generates more behavioral data, which in turn generates more interest-based recommendations, creating a virtuous circle for news distributors (Hallinan & Striphas, 2016). The Chinese news app Jinri Toutiao ("Today's Headlines") is representative of this kind of news push: under the slogan "What you care about is the headline," it customizes news to the characteristics of each audience member to meet personalized needs, allowing users to filter out uninteresting content and save time in an era of information overload while also creating opportunities for news producers.
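In its simplest form, this push mechanism amounts to reinforcing an interest profile from clicks and ranking candidate stories by how well they match it. The sketch below illustrates that logic; it is a deliberately simplified assumption about how such a system could work, not Toutiao's actual algorithm.

```python
# Simplified sketch of interest-based news push (not Toutiao's real algorithm):
# clicks raise topic weights, and the next batch is ranked by profile match.

def update_profile(profile: dict, clicked_topics: list) -> None:
    """Reinforce the topics of a story the user chose to read."""
    for topic in clicked_topics:
        profile[topic] = profile.get(topic, 0) + 1

def rank_candidates(profile: dict, candidates: list) -> list:
    """Sort candidate stories by overlap with the user's interest profile."""
    def score(story):
        return sum(profile.get(t, 0) for t in story["topics"])
    return sorted(candidates, key=score, reverse=True)

if __name__ == "__main__":
    profile = {}
    update_profile(profile, ["sports", "football"])   # user clicked a football story
    update_profile(profile, ["sports"])               # and another sports story
    candidates = [
        {"title": "Transfer rumours roundup", "topics": ["sports", "football"]},
        {"title": "Parliament passes budget", "topics": ["politics"]},
        {"title": "New phone released", "topics": ["technology"]},
    ]
    for story in rank_candidates(profile, candidates):
        print(story["title"])
```

The efficiency of the match is exactly what makes the next problem possible: whatever the user has not clicked slides toward the bottom of every subsequent ranking.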
Sunstein (2006) proposed the concept of the "information cocoon," which refers to the way a narrow source of information leads audiences to receive only the content they are interested in, to the exclusion of everything else, until over time they become trapped inside a cocoon of their own making. To cater to audience taste and improve uptake and economic returns, news distributors deliberately push similar information, which violates the audience's right to decide freely and to receive diverse information. News consumers inside the cocoon gradually lose their capacity for judgment and are left at the mercy of algorithms, and irrational clustering phenomena such as group polarization can emerge. Short-video platforms, represented by TikTok, make full use of data mining to push videos that match users' needs by collecting their likes, viewing history, and other behavioral and demographic signals. A TikTok user who loves food will find food dominating their recommendations, and amid this flood of homogenized content it is difficult to escape the customized feed; information from outside one's interest profile rarely comes into view.
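The cocoon is the outcome of a feedback loop: the feed favors the most-clicked topic, the user clicks what the feed shows, and the preference compounds. The toy simulation below shows how a slight initial preference can come to dominate the feed within a few rounds; it is an assumption-laden illustration of the dynamic, not TikTok's recommender.

```python
# Toy simulation of the information-cocoon feedback loop (not TikTok's system):
# recommendations track past clicks, and clicks track recommendations.

import random

TOPICS = ["food", "politics", "science", "travel"]

def build_feed(interests: dict, size: int = 10) -> list:
    """Sample a feed where each topic's share is proportional to past clicks."""
    weights = [interests[t] for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=size)

if __name__ == "__main__":
    random.seed(0)
    interests = {t: 1 for t in TOPICS}
    interests["food"] += 2                  # a slight initial preference for food
    for round_no in range(1, 6):
        feed = build_feed(interests)
        for topic in feed:                  # the user clicks whatever is shown
            interests[topic] += 1
        share = feed.count("food") / len(feed)
        print(f"round {round_no}: food makes up {share:.0%} of the feed")
```

Nothing in the loop is malicious; the narrowing follows automatically once exposure and preference are allowed to reinforce each other.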
Conclusion
Technology is a double-edged sword: it promotes social development and reshapes the social ecology while generating unprecedented problems that cannot be avoided or ignored. In the foreseeable future we will enter an era in which everything is a medium and humans and machines coexist and evolve together, so living in harmony with artificial intelligence is a proposition that must be faced. In journalism, the wide application of AI has changed the structure of news production, but algorithms lack objectivity and a humanistic perspective and suffer from opacity, which troubles the profession; governance should therefore be shared among government, news disseminators, and audiences. The government has the power of supervision and can enact laws and regulations that constrain AI behavior by mandatory means and impose severe penalties on those who do not comply. News disseminators should adhere to professional ethics, not abandon their original purpose in pursuit of profit, assume social responsibility, hold to the guiding principles of journalistic professionalism, and present the real world to the audience. Audiences themselves should improve their media literacy, seek out multiple sources of information, learn to distinguish true news from false, step out of the comfort zone of information reception, develop the ability to think independently, and break through the information cocoon.
Tim Cook, Apple's chief executive, once said: "What I worry about is not that AI can think like humans, but that people will think like computers." Humans should handle their relationship with AI rationally, maintaining a sense of reverence while keeping hold of the leading position, so that AI becomes a tool that drives social progress rather than a dagger that causes harm.
(Word count: 1926)
References:
Hallinan, B., & Striphas, T. (2016). Recommended for you: The Netflix Prize and the production of algorithmic culture. New Media & Society, 18(1), 117–137. https://doi.org/10.1177/1461444814538646
Leppänen, L., Tuulonen, H., & Sirén-Heikel, S. (2020). Automated journalism as a source of and a diagnostic device for bias in reporting. Media and Communication, 8(3), 39–49. https://doi.org/10.17645/mac.v8i3.3022
Marconi, F. (2020). Newsmakers: Artificial intelligence and the future of journalism. New York, NY: Columbia University Press. https://doi.org/10.7312/marc19136
Sunstein, C. R. (2006). Infotopia: How many minds produce knowledge. Oxford: Oxford University Press.
Torrijos, J. L. R. (2019). Automated sports coverage: Case study of the bot released by The Washington Post during the Río 2016 and PyeongChang 2018 Olympics. Revista Latina de Comunicación Social, 74(74), 1729–1747. https://doi.org/10.4185/RLCS-2019-1407en
Thurman, N., Dörr, K., & Kunert, J. (2017). When Reporters Get Hands-on with Robo-Writing: Professionals consider automated journalism’s capabilities and consequences. Digital Journalism, 5(10), 1240–1259. https://doi.org/10.1080/21670811.2017.1289819