
Apple shaped how entire generations think about technology. For many of us, its products symbolized creativity and progress. The company taught us to “Think Different” and to believe technology could make life better. But today, Apple’s actions tell a different story — when ethics collide with revenue, Apple folds.
In 2014, Tim Cook came out as gay and framed it as a moral stand. Apple wrapped itself in Pride flags, marketed inclusion, and sold us the idea that it stood for something bigger than profit. The company embraced its status as the first Fortune 500 company with an openly gay CEO, and Cook wrote in Bloomberg Businessweek that he was "proud to be gay."
Fast forward to today: at Beijing's request, Apple quietly removed two of China's largest gay dating apps, Blued and Finka. No statement. No defense of queer communities. Just silent compliance.
This isn’t an isolated decision. It’s a pattern.
When ethics collide with revenue
Apple’s support of marginalized communities seems to collapse under pressure. Take child sexual abuse material (CSAM). In 2021, Apple acknowledged that verified images and videos of children being sexually abused were stored on iCloud. Knowing this was a problem, the company developed a privacy-protecting detection system, vetted by independent experts, to stop it. Apple proudly announced the plan in August 2021, then paused the rollout just 30 days later.
Apple commissioned its own cryptography experts to confirm the system safeguarded privacy. Independent reviewers such as David Forsyth and Benny Pinkas agreed: no innocent user data would be exposed. Yet Apple abandoned the plan after backlash over privacy concerns, retreating to arguments it had previously dismantled.
Apple's pivot to services like iCloud has made subscriptions a core revenue driver, generating nearly $100 billion annually with gross margins around 75 percent. Despite this profitability, Apple has still not implemented a meaningful solution to stop the spread of known CSAM, leaving iCloud as one of the few major cloud platforms that does not proactively detect it. This failure has sparked lawsuits from thousands of survivors, who argue that Apple's decision enables predators to pay for storage of abuse imagery, effectively monetizing their trauma. By contrast, companies like Google deploy industry-standard safeguards, combining hash-matching against NCMEC databases with AI to detect and report CSAM at scale. Apple's refusal to implement similar measures underscores a gap: while profiting from cloud services, it has not ensured those services are free from exploitation.
This isn’t just complacency. It’s negligence.
Ethics shouldn’t be optional
It’s easy to do the right thing when it sells. Pride campaigns drive revenue when the White House is lit in rainbow colors and consumers reward ethical branding. But standing up for queer communities in China, when the government demands you stand on the side of oppression? That’s harder. Tackling child abuse on your own platform? That’s riskier. Apple will remove LGBTQ+ apps to appease Beijing without a fight, but won’t take decisive action against child predators.
Apple doesn’t "Think Different" anymore. It thinks profit. And until we demand better, it will keep choosing power over people.
What needs to change
Apple has the resources and expertise to lead on both fronts — protecting vulnerable communities and safeguarding children online. It could implement proven, privacy-conscious CSAM detection tools developed by experts at Thorn, NCMEC, and Johns Hopkins’ MOORE Center. It could take a public stand against censorship that erases LGBTQ+ lives. Instead, it has chosen silence and inaction.
Regulators, investors, and consumers must hold Apple accountable. Tech companies should not be allowed to monetize harm while hiding behind branding campaigns. Ethics cannot be optional in the digital age.
This article reflects the opinions of the writers.
Lennon Torres is a Public Voices Fellow on Prevention of Child Sexual Abuse with The OpEd Project. She is an LGBTQ+ advocate who grew up in the public eye, gaining national recognition as a young dancer on television shows. With a deep passion for storytelling, advocacy, and politics, Lennon now works to center the lived experiences of herself and others as she builds her professional career in online child safety at Heat Initiative. The opinions reflected in this piece are those of Lennon Torres as an individual and not of the entities she is part of. Lennon’s Substack: https://substack.com/@lennontorres
Sarah Gardner is Founder and CEO of the Heat Initiative. With more than 13 years of technical and policy expertise in online child safety, she is an internationally recognized voice in advocating for the rights of children and survivors of child sexual abuse. Heat Initiative is an organization of technology experts, parents, survivors and advocates who believe strongly that tech companies like Apple and Meta need to remove CSAM from their platforms and implement policies that will keep children safe online.