If the exchange of images and information in the form of data and algorithms becomes the currency of the digital world – who is holding the purse strings? Are you always aware of how your data is being used or what rights you have to the images you upload online?
In an article published by the Economist (N.C., 2019), the debate over the individual's right to privacy versus the giant algorithms created by governments and global companies was brought into focus. While having a mobile phone and internet coverage may seem a rite of passage to many, there is limited education on privacy regulations and the rights of individual users. Few stop to question the mechanics that drive the algorithms and news stories broadcast across digital media. Surveillance is often framed as a positive aspect of modern technology, allowing platforms to show you selections personalised to your individual preferences.
Artificial intelligence is a friend you never knew you needed until now. E-commerce platforms like Amazon will help you shop, recommending products you might like based on your search history. This same search history and data can be sold to third parties to deliver targeted advertising across social media, or used in political campaigns.
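As a rough sketch of how this kind of recommendation can work, the example below scores products by the overlap between their tags and the terms in a user's recent searches. The catalogue, tags and scoring rule are invented for illustration; commercial recommenders rely on far richer signals and models.

```python
# A toy content-based recommender: score each product by how often its
# tags appear in the user's recent search terms. All data is hypothetical.
from collections import Counter

CATALOGUE = {
    "trail running shoes": {"running", "shoes", "outdoor"},
    "yoga mat": {"yoga", "fitness"},
    "camping stove": {"camping", "outdoor"},
}

def recommend(search_history: list[str], top_n: int = 2) -> list[str]:
    # Flatten the user's searches into a bag of lower-cased terms.
    terms = Counter(
        word for query in search_history for word in query.lower().split()
    )
    # Score each product by the summed frequency of its matching tags.
    scores = {
        product: sum(terms[tag] for tag in tags)
        for product, tags in CATALOGUE.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [p for p in ranked if scores[p] > 0][:top_n]

print(recommend(["running shoes for trails", "outdoor gear"]))
# -> ['trail running shoes', 'camping stove']
```

The same search terms that power the recommendation are, of course, exactly the data that can be passed on to advertisers.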
The question remains whether the individual right to privacy is a valid part of the ‘new social contract’ (Suzor, 2019) that underpins all digital transactions. In an age of camera phones and smart technology, the data collected on our personal lives has never been more invasive. Yet while Australians are recorded among the highest users of social media in the world, we rank among the lowest for user rights, informed consent and privacy regulation of data usage and collection (Australian Competition and Consumer Commission, 2019).
How to balance the rights of the individual against the rights of the people has become a topic of hot debate. Scholars, governments, and corporations all have a different view of the best interests of their audience.
The need for appropriate regulation stems from the idea of protecting the public interest, that is, the interests of citizens at large, whereas the corporations driving the digital revolution, including technology giants and multinational corporations, are driven by a profit motive. For many digital platforms, selling private information for profit is standard business practice. This often occurs without conscious acknowledgement from users, who are enticed by the offer of free services and entertainment in exchange for the collection of their personal data.
Even the terms and conditions shown on websites such as Facebook provide only a token approach to privacy. They offer users a ‘take it or leave it’ set of conditions for accessing the platform's services and conforming to wider social norms, such as belonging to popular social media platforms or remaining informed via digital news articles. Large corporations and digital organisations often pay lip service to consumer concerns over privacy, offering greater control over information in benign ways that do not limit company profits, such as allowing users to choose who can view their answers to a pop quiz or see the details of an event they created online.
Key Takeaway:
Large corporations are often content to allow users the illusion of control, with targeted privacy settings and the ability to request that content they dislike be removed, without acknowledging users' legal rights or exposing the barriers to enforcing these regulations.
Drawing a line in the sand to identify legal rights and content ownership is not always straightforward in the digital age.
Suzor (2019) highlights the importance of understanding the legal implications and policies of digital platforms, specifically when users come to understand the legal immunity and copyright arrangements that are a fundamental part of the new social contract of internet governance. However, the balance of power favours the digital giants, whose inbuilt legal immunity and regulatory protections keep the details of their algorithmic rankings and selection criteria shielded from public view as trade secrets. Under these protections, the information a company collects, and how it uses that information to further a commercial agenda, is treated as proprietary.
Yet who owns these copyrights, and how they protect your privacy, isn't always clear cut. Pre-existing media regulations were designed for a more tangible media landscape with a well-defined chain of communication and distribution (Goggin et al., 2017).
Focus groups conducted by the University of Sydney in 2017 identified a ‘perception gap between people’s beliefs that harmful social media content was easy to take down, and the procedural reality that this is not always straightforward’ (Goggin et al., 2017). This lack of media literacy and understanding of how digital rights play out in a real-world environment creates a power imbalance, where one party's ignorance becomes another's commercial gain (illustrated in the case study below).
Who owns the rights to your images?

Photo (left) of McDonald’s employee taken without consent and sold online by Getty Images. Photo (right) shows an image supplied by the defendant.
Huntsdale, J. (2020, May 16). Former McDonald’s worker wins ongoing battle over stock photo appearing on negative news stories. ABC News. Retrieved March 29, 2022, from https://www.abc.net.au/news/2020-05-16/former-mcdonalds-worker-in-ongoing-battle-over-stock-photo/12224330
A young female McDonald's worker was photographed at her place of work, and the legally obtained image was sold online for use in commercial publications. The photo was taken without her knowledge and used in news media that was damaging to her chosen career in law.
Artificial intelligence, facial recognition and data matching allowed the image to keep resurfacing in searches connected with her law career, creating a negative backlash, as co-workers and potential employers could easily find the image of her previous work in a fast-food chain. When she sought to have the image removed, it became apparent that, despite her being the subject of the photograph, the copyright belonged to a commercial entity that could legally continue to sell it.
“It took three years, and multiple requests, but law student Kennedy Reese has finally convinced Getty Images to stop selling a photo of her” (Huntsdale, 2020).
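The ‘data matching’ at work in this case can be pictured as a similarity search over numeric image fingerprints (embeddings): once a face is reduced to a vector, every new upload can be compared against it. The sketch below is a minimal illustration with invented vectors and an arbitrary threshold; real systems use learned facial-recognition models.

```python
# Compare a stored image fingerprint against new uploads using cosine
# similarity. Vectors and the match threshold are invented for illustration.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

stock_photo = [0.9, 0.1, 0.4]  # hypothetical embedding of the stock image
uploads = {
    "news_story.jpg": [0.88, 0.12, 0.41],  # near-duplicate of the stock photo
    "unrelated.jpg": [0.10, 0.90, 0.20],
}

MATCH_THRESHOLD = 0.95  # arbitrary cut-off for "same person/image"
for name, embedding in uploads.items():
    if cosine_similarity(stock_photo, embedding) > MATCH_THRESHOLD:
        print(f"{name} matched: image resurfaces in search results")
```

Once such a match exists, no human decision is needed for the old photo to reattach itself to new stories about the same person.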
Having the image removed became a costly legal battle, and this was not an isolated incident. Terry Flew (2018) examines other threats to privacy online, including misleading data collection for use in government and political campaigns, online harassment, hate speech, bullying and the distribution of private images in the public domain, issues that led to a 2018 inquiry by the Australian Competition and Consumer Commission (ACCC) into regulating privacy on digital platforms.
Digital service providers seeking to operate within the new operational guidelines advised by the ACCC created user-controlled privacy settings and the option to report perceived safety violations directly to the platform. However, these strategies are limited in effectiveness by the sheer volume of requests for content to be reviewed and by inadequate regulations that fail to keep up with an ever-changing digital landscape (Goggin et al., 2017). Other barriers to addressing threats to privacy include the structure of the complaints process, where the rules about what is allowed are unclear, and the risk that over-regulation will impinge on users' rights to free speech and political expression.
Adverse Impact of Regulatory Rules
“As significant numbers of Australians face new forms of risky and harmful speech online, government needs to explore law reform to address new privacy and speech rights breaches” (Goggin et al., 2017).
Giant transnational digital companies often use opaque and misleading techniques to obtain personal information. Risks to privacy can be attractively packaged and presented as ‘free’ services that allow users to connect with others, share photos and be entertained without paying subscription fees. This business model runs on the collection of personal data and on broad licences over user-generated content: content uploaded to Facebook, for example, remains nominally the user's, but the terms of service grant the company sweeping rights to use and distribute it. The data is then de-identified, aggregated and sold to third parties to generate revenue for the business.
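A minimal sketch of that ‘de-identify and sell’ pipeline is shown below: direct identifiers are dropped and records are reduced to coarse audience segments. The records and fields are hypothetical, and researchers have repeatedly shown that this kind of aggregation alone does not guarantee anonymity.

```python
# Strip direct identifiers, bucket ages into bands, and count interests per
# band: the kind of "de-identified" segment data that can be sold on.
# All records are hypothetical.
from collections import Counter

user_records = [
    {"name": "A. Citizen", "email": "a@example.com", "age": 34, "interest": "fitness"},
    {"name": "B. Jones", "email": "b@example.com", "age": 36, "interest": "fitness"},
    {"name": "C. Smith", "email": "c@example.com", "age": 52, "interest": "travel"},
]

def deidentify_and_aggregate(records: list[dict]) -> list[dict]:
    # Names and emails are simply never copied across; ages become decades.
    segments = Counter((r["age"] // 10 * 10, r["interest"]) for r in records)
    return [
        {"age_band": f"{band}-{band + 9}", "interest": interest, "count": n}
        for (band, interest), n in segments.items()
    ]

print(deidentify_and_aggregate(user_records))
# -> [{'age_band': '30-39', 'interest': 'fitness', 'count': 2},
#     {'age_band': '50-59', 'interest': 'travel', 'count': 1}]
```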
Emerging from this business model have been several well-documented cases of deceptive conduct, with digital giants offering personalised quizzes designed to identify users' political orientation, the results of which were then sold to political campaigns seeking to influence election outcomes (Flew, 2018). Privacy breaches, and the way personal information is used to serve targeted advertising that plays on previous search results to influence buying decisions, have created a sense of distrust. Many Australian consumers describe the latest forms of targeted advertising as ‘beyond the pale’ and find the intrusion into their personal lives uncomfortable (Digital Advertising Services Inquiry, 2021).
A young intern lost her role at a prestigious organisation when an ill-thought-out post on the popular social media platform Twitter went viral, bringing negative attention to her potential employer and displeasing senior figures posting under anonymous handles. Her moment of impulsiveness, and its far-reaching impact, was featured on multiple online news platforms, including People.com, in an article titled “Woman Loses Prized NASA Internship Over Vulgar Tweet” (Merrett, 2018).
The post came to the organisation's attention via an automated alert system. The tweet, although only live for a few hours, reached important audiences and could be found by NASA staff because the user had included the organisation's hashtag in her post.
The infamous tweet (Merrett, 2018) went viral after an anonymous user commented on its language. It was her use of the hashtag “NASA” that allowed the post to be identified with the organisation and circulated online. Facial recognition software and content matching then make it easy for related images posted online to be grouped and displayed together.
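The discovery mechanism here is simple to reproduce: a brand-monitoring tool only needs to scan the public stream for an organisation's hashtag or name. The sketch below uses an invented watch list and invented posts.

```python
# Flag any public post containing a watched hashtag or keyword: a minimal
# version of the automated alerting described above. Data is hypothetical.
import re

WATCHED_TERMS = {"#nasa", "nasa"}  # hypothetical watch list

def flag_posts(posts: list[str]) -> list[str]:
    flagged = []
    for post in posts:
        # Tokenise on words, keeping any leading '#', and lower-case them.
        tokens = {t.lower() for t in re.findall(r"#?\w+", post)}
        if tokens & WATCHED_TERMS:
            flagged.append(post)  # would be routed to an alert queue
    return flagged

stream = [
    "Great hike today!",
    "So excited to start my internship #NASA",
]
print(flag_posts(stream))  # only the hashtagged post is surfaced
```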
Threats to privacy are often nuanced, and there is a subtle balance between the responsibility individual users bear for the information they post and the keywords they attach to it, and the automated systems that amplify those inputs. These inputs are processed by artificial intelligence and automation that aggregate data at scale. Machine-based learning does not consider human emotions or frailties; it is driven by specific data inputs and predetermined algorithms. As Crawford (2021) puts it, “AI is neither artificial nor intelligent.”
Frank Pasquale (2015), author of the book The Black Box Society, states: “The success of individuals, businesses and their products depends heavily on the synthesis of data and perceptions into reputation.”
The tools you see on social media that appear benign, or even helpful to user autonomy and privacy, can become a double-edged sword. Hashtags and keywords attached to posts let users influence the algorithm and draw attention to their content. That same visibility can also produce a form of self-censorship that limits freedom of expression, driven by fears over privacy and of appearing out of step with current social norms. Lawrence Cappello examines the power of digital surveillance to become invasive and detrimental to individual privacy in his book “None of Your Damn Business” (University of Chicago Press, 2019).

Lawrence Cappello is a professor at the University of Alabama.
Defining Public Interests
Public interest can refer to protecting the needs and privacy of digital users as a group or community. The public may be a collection of users on a platform-specific basis, or grouped geographically by country.
Misplaced trust – shunning social media won’t protect your online privacy
Pasquale states that “few of us appreciate the extent of ambient surveillance, and fewer still have access to its results – the all-important profiles that control so many aspects of our lives” (Pasquale, 2015).
The use of data collection and artificial intelligence is a widespread issue that affects interactions across all industries. Current research suggests the public is largely unaware of the extent of the surveillance, and of the data collected about them, that is used to influence their everyday lives.
Did you know… Data can be collected in the course of everyday online transactions, such as booking your car in for a service, and a profile of your location and social status can be built from the searches you make for entertainment venues and restaurants in your local area.
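A sketch of how such a profile might be assembled is shown below: each routine event contributes an inferred attribute to a per-user dossier. The events and inference rules are invented; real data brokers combine far more sources.

```python
# Build a simple per-user profile from everyday activity events.
# Events and the inference rules are hypothetical.
from collections import defaultdict

events = [
    {"user": "u1", "type": "car_service_booking", "suburb": "Newtown"},
    {"user": "u1", "type": "restaurant_search", "price_tier": "$$$"},
    {"user": "u1", "type": "venue_search", "suburb": "Newtown"},
]

def build_profile(events: list[dict]) -> dict:
    profile = defaultdict(set)
    for e in events:
        if "suburb" in e:
            profile["likely_locations"].add(e["suburb"])  # infer home area
        if e.get("price_tier") == "$$$":
            profile["inferred_status"].add("high_spender")  # infer wealth
    return dict(profile)

print(build_profile(events))
# -> {'likely_locations': {'Newtown'}, 'inferred_status': {'high_spender'}}
```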
It appears that current high levels of trust towards data collectors may be naive, and that further interventions are needed to ensure fair outcomes for users.
Important safeguards that can be eroded under a digital monopoly include consumer choice, sufficient levels of competition, and policies designed to protect individual interests.
Another important consideration is freedom from improper influence: ensuring that the data collected reflects the human interests of a group and not merely a commercial agenda used to sell products. This becomes especially important when examining the influence of big data on the distribution of news and political information. Freedom from improper influence can be defined as avoiding unnecessary restrictions by governments while safeguarding societal interests through positive obligations (Nooren et al., 2018).
This information is aggregated and synthesised by algorithms that curate and prioritise what to show consumers, ultimately controlling the ranking of information displayed on large search engines and social media platforms. The authors raise the issue of safeguarding the public and the need to protect individual interests, yet machine-driven insights and search functions continue to infiltrate daily life.
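What ‘curate and prioritise’ means in practice can be as simple as a weighted score. In the sketch below, items are ranked by a blend of predicted engagement and advertising revenue; the items and weights are invented, and the key point is that the weights are set by the platform, not by the audience.

```python
# Rank feed items by a platform-chosen blend of engagement and ad revenue.
# Items and weights are invented for illustration.
items = [
    {"title": "Local council report", "engagement": 0.2, "ad_revenue": 0.1},
    {"title": "Celebrity scandal", "engagement": 0.9, "ad_revenue": 0.8},
    {"title": "Health advisory", "engagement": 0.4, "ad_revenue": 0.2},
]

W_ENGAGEMENT, W_REVENUE = 0.6, 0.4  # set by the platform, unseen by users

def rank(items: list[dict]) -> list[dict]:
    return sorted(
        items,
        key=lambda i: W_ENGAGEMENT * i["engagement"] + W_REVENUE * i["ad_revenue"],
        reverse=True,
    )

for item in rank(items):
    print(item["title"])
# Celebrity scandal, then Health advisory, then Local council report
```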
Criticism of Algorithms and Automation
For those critical of current algorithms and their ability to deliver fair and equal access to information and freedom of speech, the status quo is unacceptable. In 2018, researchers investigated racial discrimination on the popular search engine Google and how its systems were being used to reinforce negative stereotypes and misinformation.
Specifically, Noble (2018) argued that searches relating to women of colour returned negative portrayals on Google, with images and related search terms surfacing ‘sassy’ attitudes, black women's body types and other crude imagery. Her research compared these results with search terms such as ‘white girls’, which displayed images of demure women living aspirational lifestyles and showed none of the negative bias of her previous searches. In this way, she argues, search engines become algorithms of oppression that can promote misinformation and unfairly shape perceptions of minority groups.
Current research suggests public perception leans towards a positive view of technology and innovation. Yet the average citizen's views on the safety of technology and digital innovation can themselves be shaped by automated algorithms and covert data collection, which serve preferential content that mirrors their existing interests and surrounds them with posts from others of similar views and economic standing, creating a filter bubble (Digital Advertising Services Inquiry, 2021). Snapchat founder and CEO Evan Spiegel asserts that younger audiences such as millennials must take responsibility for the content they post and the impact of their digital interactions, reinforcing the view that millennials, having grown up with digital technology, are somehow immune to its negative side effects and potential for privacy breaches (Crawford, 2021).
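The filter-bubble dynamic can be reduced to a short feedback loop: clicks raise a topic's weight, and the next feed is drawn only from the highest-weighted topics. The sketch below uses invented topics; real ranking systems are vastly more complex, but the loop has the same shape.

```python
# A toy filter-bubble feedback loop: engagement boosts a topic's weight,
# and the feed only ever shows the top-weighted topics. Data is invented.
from collections import Counter

preferences = Counter({"politics_left": 1, "politics_right": 1, "sport": 1})

def next_feed(prefs: Counter, n: int = 2) -> list[str]:
    # Only the n topics the user already engages with most are shown.
    return [topic for topic, _ in prefs.most_common(n)]

for _ in range(3):
    feed = next_feed(preferences)
    clicked = feed[0]          # the user clicks the top-ranked item...
    preferences[clicked] += 1  # ...which feeds back into the ranking
    print(feed, dict(preferences))
# 'sport' never reappears, and one topic's weight keeps climbing.
```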
Younger audiences such as millennials ranked user-controlled features for moderating the privacy of their information as helpful, whereas other groups were more divided, with opinions shaped by political views, cultural background and economic status. Across all groups, however, citizens demand that legal safeguards and regulations be put in place to protect their rights to privacy.
Researchers are clear that to maintain safeguards for public safety and privacy, we need to face the reality of the negative consequences and commercial agendas driving the digital revolution. The right to privacy must be prioritised, with greater transparency for consumers about how their data is collected and used.
No right to privacy means no legal recourse
Current legal frameworks provide limited regulation of the use and sale of private information, with tech giants dictating the terms of service users must accept to use their platforms. ‘Data and algorithm-based’ ranking of information does not require an understanding of what is important to the audience or their personal aspirations; it favours the interests of the service provider and their revenue models. There is currently no ethical framework governing the selection and mix of algorithms used by AI software. The implications for the broader economy make the issue of privacy and data collection even more complex.
This marks the dawn of a new exchange, one in which privacy comes at a premium.
References
Andrejevic, M. (2019). Automated Media (1st Edition). Routledge. https://doi.org/10.4324/9780429242595
Australian Competition and Consumer Commission. (2019, June). Digital Platforms Inquiry (978 1 920702 05 2). Commonwealth of Australia. https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf
Bowman, O. (2015, June 15). Snapchat CEO’s Graduation Speech. Mic.Com. Retrieved March 9, 2022, from https://www.mic.com/life/popular-home-improvements-people-are-making-for-under-40-on-amazon
Crawford, K. (2021). The Atlas of AI: Power, politics, and the planetary costs of artificial intelligence. Yale University Press.
Flew, T. (2018). Platforms on trial. Intermedia, 46(2), 18–23.
Huntsdale, J. (2020, May 16). Former McDonald’s worker wins ongoing battle over stock photo appearing on negative news stories. ABC News. Retrieved March 9, 2022, from https://www.abc.net.au/news/2020-05-16/former-mcdonalds-worker-in-ongoing-battle-over-stock-photo/12224330
Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157
Merrett, R. (2018, August 23). Woman Loses Prized NASA Internship Over Vulgar Tweet — But She Might Get a Better Position. People.Com. Retrieved April 2, 2022, from https://people.com/human-interest/woman-loses-nasa-internship-over-tweet/
N.C., K. (2019, December 13). Surveillance is a fact of life, so make privacy a human right. The Economist. Retrieved March 29, 2022, from https://www.economist.com/open-future/2019/12/13/surveillance-is-a-fact-of-life-so-make-privacy-a-human-right
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
Nooren, P., van Gorp, N., & Ó Fathaigh, R. (2018). Should we regulate digital platforms? A new framework for evaluating policy options. Policy & Internet, 10(3), 264–301. https://doi.org/10.1002/poi3.177
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press. https://www.jstor.org/stable/j.ctt13x0hch
Suzor, N. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press. https://doi.org/10.1017/9781108666428
Woods, L., & Perrin, W. (2021). Obliging platforms to accept a duty of care. In M. Moore & D. Tambini (Eds.), Regulating big tech: Policy responses to digital dominance. Oxford University Press. https://doi.org/10.1093/oso/9780197616093
