Tim Cook on Privacy

Good afternoon, and John, thank you so much for that generous introduction and for hosting us today. It's a privilege to join you and to learn from this knowledgeable panel on this fitting occasion of Data Privacy Day.

A little more than two years ago, joined by my good friend, the much-missed Giovanni Buttarelli, and data protection regulators from around the world, I spoke in Brussels about the emergence of a data industrial complex, one of the most pressing issues of our time. The debate over data collection and its consequences is a microcosm of a broader discussion about technology, society, and humanity.

The path we're on today is not an easy one. We know that the debate over data exploitation is often framed as a trade-off between convenience and privacy, but I believe that this framing is fundamentally flawed. Technology does not need vast troves of personal data stitched together across dozens of websites and apps to succeed. Advertising existed and thrived for decades without it.

We're here today because the path of least resistance is rarely the path of wisdom. If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform. We should not look away from the bigger picture. At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement, the longer the better, and all with the goal of collecting as much data as possible.

Too many are still asking how much they can get away with, when they need to be asking what the consequences are. What are the consequences of prioritizing conspiracy theories and violent incitement simply because of their high rates of engagement? What are the consequences of not just tolerating but rewarding content that undermines public trust and lifesaving vaccinations?

What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more? It is long past time to stop pretending that this approach doesn't come with a cost: of polarization, of lost trust, and, yes, of violence. A social dilemma cannot be allowed to become a social catastrophe.

The past year, and certainly recent events, have brought home the risk of this for all of us as a society and as individuals as much as anything else. Long hours spent cooped up at home, the challenge of keeping kids learning when schools are closed, the worry and uncertainty about what the future would hold, all of these things threw into sharp relief how technology can help and how it can be used to harm.

Will the future belong to the innovations that make our lives better, more fulfilled, and more human, or will it belong to those tools that pry our attention away from everything else, compounding our fears and aggregating extremism to serve ever-more invasive targeted ads? At Apple, we made our choice a long time ago. We believe that ethical technology is technology that works for you.

It's technology that helps you sleep, not keeps you up, that tells you when you've had enough, that gives you space to create or draw or write or learn, not refresh just one more time. It's technology that can fade into the background when you're on a hike or going for a swim but is there to warn you when your heart rate spikes or help you when you've had a nasty fall.

And all of this always puts privacy and security first, because no one needs to trade away the rights of their users to deliver a great product. Call us naïve, but we still believe that technology made by people, for people, and with people's wellbeing in mind is too valuable a tool to abandon. We still believe that the best measure of technology is the lives it improves.

We are not perfect. We will make mistakes. That's what makes us human. But our commitment to you, now and always, is that we will keep faith with the values that have inspired our products from the very beginning because what we share with the world is nothing without the trust of our users. To all of you who have joined us today, please keep pushing us all forward, keep setting high standards that put privacy first, and take new and necessary steps to reform what is broken.

We've made progress together and we must make more because the time is always right to be bold and brave in service of a world where technology serves people and not the other way around.

"WEBVTTKind: captionsLanguage: enGood afternoon.And John, thank you somuch for that generousintroduction andfor hosting us today.It's a privilege to joinyou and to learn from thisknowledgeable panel onthis fitting occasionof Data Privacy Day.A little more than twoyears ago, joined by mygood friend, themuch-missed GiovanniBittarelli and dataprotection regulators fromaround the world, I spokein Brussels about theemergence of a dataindustrial complex.At that gathering, weasked ourselves what kindof world do wewant to live in?Two years later, weshould now take a hardlook at how we'veanswered that question.The fact is that aninterconnected ecosystemof companies and databrokers, purveyors of fakenews and peddlers ofdivision, of trackers andhucksters just looking tomake a quick buck is morepresent in our livesthan it has ever been.It has never been soclear how it degrades ourfundamental right toprivacy first and oursocial fabricby consequence.As I've said before, ifwe accept as normal andunavoidable thateverything in our livescan be aggregated andsold, then we lose somuch more than data.We lose thefreedom to be human.And yet this is a hopefulnew season, a time ofthoughtfulness and reform,and the most concreteprogress of all isthanks to many of you.Proving cynics anddoomsayers wrong, the GDPRhas provided an importantfoundation for privacyrights around the worldand its implementation andenforcement must continue.But we can't stop there.We must do more.We're already seeinghopeful steps forwardworldwide, includinga successful ballotinitiative strengtheningconsumer protections righthere in California.Together, we must senda universal humanisticresponse to those whoclaim a right to users'private information aboutwhat should not and willnot be tolerated.As I said in Brussels twoyears ago, it is certainlytime not only for acomprehensive privacy lawhere in the United Statesbut also for worldwidelaws and new internationalagreements that enshrinethe principles of dataminimization, user knowledge,user access, and datasecurity across the globe.At Apple, spurred on bythe leadership of many ofyou in the privacycommunity, these have beentwo years ofunceasing action.We have worked to notonly deepen our own coreprivacy principles but tocreate ripples of positivechange across theindustry as a whole.We've spoken out timeand again for strongencryption withoutbackdoors, recognizingthat security is thefoundation of privacy.We've set new industrystandards for dataminimization, usercontrol, and on-deviceprocessing for everythingfrom location data to yourcontacts and photos.At the same time thatwe've led the way infeatures that keep youhealthy and well, we'vemade sure thattechnologies like a bloodoxygen sensor and an ECGcome with peace of mindthat your healthdata stays yours.And last, but not least,we are deploying powerfulnew requirements toadvance user privacythroughout the AppStore ecosystem.The first is a simple butrevolutionary idea that wecall the PrivacyNutrition Label.Every app, including ourown, must share their datacollection and privacypractices, informationthat the App Storepresents in a way everyuser canunderstand and act on.The second is called AppTracking Transparency.At its foundation, ATT isabout returning control tousers about givingthem a say over howtheir data is handled.Users have asked for thisfeature for a long time.We have worked closelywith developers to givethem the time andresources to implement it.We are passionate aboutit because we think it hasgreat potential tomake things better foreverybody 
because ATTresponds to a very real issue.Earlier today, we releaseda new paper called A Dayin the Life of Your Data.It tells the story of howapps that we use every daycontain an averageof six trackers.This code often exists tosurveil and identify usersacross apps, watching andrecording their behavior.In this case, whatthe user sees is notalways what they get.Right now, users may notknow whether the apps theyuse to pass the time,to check in with theirfriends, or to find aplace to eat may, in fact,be passing on informationabout the photos they'vetaken, the people in theircontact list, or locationdata that reflects wherethey eat, sleep, or pray.As the paper shows,it seems no piece ofinformation is tooprivate or personal to besurveilled, monetized,and aggregated into a360-degreeview of your life.The end result of all ofthis is that you are nolonger the customer.You are the product.When ATT is in fulleffect, users will havea say over thiskind of tracking.Some may well think thatsharing this degree ofinformation is worth itfor more targeted ads.Many others, Isuspect, will not.Just as most appreciatedit when we built a similarfunctionality intoSafari limiting webtrackers several years ago.We see developing thesekinds of privacy-centricfeatures andinnovations as a coreresponsibility of our work.We always have.We always will.The fact is that thedebate over ATT is amicrocosm of a debatewe've been having for along time, one where ourpoint of view is very clear.Technology does not needvast troves of personaldata stitched togetheracross dozens of websitesand apps inorder to succeed.Advertising existed andthrived for decades without it.We are here todaybecause the path of leastresistance is rarelythe path of wisdom.If a business is built onmisleading users on dataexploitation, on choicesthat are no choices atall, then it doesnot deserve our praise.It deserves reform.We should not look awayfrom the bigger picture.At a moment of rampantdisinformation andconspiracy theories juicedby algorithms, we can nolonger turn a blind eyeto a theory of technologythat says allengagement is good engagement.The longer, the better.And all with a goalof collecting as muchdata as possible.Too many are still askingthe question how much canwe get away with whenthey need to be asking whatare the consequences.What are the consequencesof prioritizing conspiracytheories and violentincitement simply becauseof the high ratesof engagement?What are the consequencesof not just tolerating butrewarding content thatundermines public trustand lifesavingvaccinations?What are the consequencesof seeing thousands ofusers join extremistgroups and then perpetuatingan algorithm thatrecommends even more?It is long past time tostop pretending that thisapproach doesn't come witha cost, a polarization, oflost trust, and,yes, of violence.A social dilemma cannotbe allowed to becomea social catastrophe.I think the past year,and certainly recent events,have brought home the riskof this for all of us as asociety and as individualsas much as anything else.Long hours spent cooped upat home, the challenge ofkeeping kids learning whenschools are closed, theworry and uncertaintyabout what the futurewould hold, all of thesethings threw into sharprelief how technologycan help and how it canbe used to harm.Will the future belong tothe innovations that makeour lives better, morefulfilled, and more human,or will it belong tothose tools that pries ourattention to theexclusion of everything else,compounding our fears andaggregating extremism toserve 
ever-moreinvasively targeted ads overall other ambitions?At Apple, we made ourchoice a long time ago.We believe that ethicaltechnology is technologythat works for you.It's technology that helpsyou sleep, not keeps youup, that tells you whenyou've had enough, thatgives you space to createor draw or write or learn,not refreshjust one more time.It's technology that canfade into the backgroundwhen you're on a hike orgoing for a swim but isthere to warn you whenyour heart rate spikes orhelp you when you'vehad a nasty fall.And that all of thisalways puts privacy andsecurity first because noone needs to trade awaythe rights of their usersto deliver a great product.Call us nïve.But we still believe thattechnology made by people,for people, and withpeople's wellbeing inmind is toovaluable a tool to abandon.We still believe that thebest measure of technologyis the lives it improves.We are not perfect.We will make mistakes.That's whatmakes us human.But our commitment to you,now and always, is that wewill keep faith with thevalues that have inspiredour products from the verybeginning because what weshare with the world isnothing without the trustour users have in it.To all of you who havejoined us today, pleasekeep pushing us allforward, keep setting highstandards that put privacyfirst, and take new andnecessary steps toreform what is broken.We've made progresstogether and we must makemore because the time isalways right to be boldand brave in service of aworld where, as GiovanniBittarelli put it,technology serves peopleand not theother way around.Thank you very much.\n"