Why It Matters:
Each year, millions of images and videos of child sexual abuse material (CSAM) circulate online, and reports have increased by 15,000 percent over the last 15 years.
In 2022, the National Center for Missing & Exploited Children (NCMEC) received 31 million reports of suspected child sexual abuse material. NCMEC noted that prepubescent children are at the greatest risk of being depicted in CSAM. Artificial intelligence is now being used to produce CSAM, magnify existing sextortion schemes, and target potential victims at previously unseen rates.
Apple is the world’s most valuable company and a major influencer in the ICT space, with over 1.65 billion devices in active use. Its consumer electronics, software, operating systems, and platforms for music, film, and internet portals are accessed by hundreds of millions of young people every day.
Apple does not proactively attempt to detect CSAM stored in its iCloud services despite widely available PhotoDNA detection technology used by other major tech firms, including Facebook, Google, Adobe, Reddit, Discord, and Verizon. Nor does Apple attempt to detect when its products and services are used to live-stream child sexual abuse. Former Apple Executive Eric Friedman stated that due to the company’s privacy protections, Apple is the “greatest platform for distributing child porn.”
Apple has developed “communication safety” tools to warn users about the dangers of sexual exploitation. Apple does not disclose data on the effectiveness of these tools in preventing the exploitation of children, claiming that doing so could raise privacy concerns. However, this information is financially material and would shed light on risks to investors.
The Tech Coalition, on whose board Apple sits, emphasizes the importance of transparency in addressing CSAM. ICT peers, including Meta, Amazon/Twitch, AT&T, and Verizon, have reported results from human rights and child rights impact assessments to understand and address risks to children across their business units. However, Apple discloses little information on how it assesses the risk of its products facilitating child sexual exploitation, leaving investors in the dark.
RECORD DATE
The record date is the date by which investors must hold shares in a company in order to participate in the company’s AGM. We will update this page once the record date and the date of the 2025 AGM are announced.
Voting at the AGM
If you hold Apple stock before the record date for the next AGM, you will be eligible to vote at that AGM and will receive an email from us closer to the deadline.
What is shareholder activism?
Shareholder activism is the practice of shareholders using their influence as owners of a company to effect change within the organisation.
What is Tulipshare?
Tulipshare is a sustainable investment fund and shareholder advocacy group on a mission to help investors push for stronger environmental and social commitments, using corporate governance to create positive impact and to ensure that the companies we invest in are responsibly managed by accountable leadership.
How does Tulipshare improve sustainability through investing?
Tulipshare addresses issues pertaining to climate change, human rights, racial and gender equity, political spending, and operational transparency within some of America’s biggest publicly traded companies - issues that, if left unaddressed, could expose a company and its investors to significant legal, reputational, and financial risks.