
Google has acted as judge, jury and executioner in the wake of Europe's right to be forgotten ruling. But what does society lose when a private corporation rules public information?


Mario Costeja spent five years fighting to have 18 words delisted from Google search results on his name


When the Spaniard googled himself in 2009, two prominent results appeared: home-foreclosure notices from 1998, when he was in temporary financial trouble. The notices had been published in the Spanish newspaper La Vanguardia and recently digitised. But their original purpose - attracting buyers to auction - had lapsed a decade earlier, as had the debt. Costeja Gonzalez asked the newspaper to remove them. When that was unsuccessful, he challenged Google, and the case was eventually elevated to the European Court of Justice, Europe's highest court.


Forgetting and remembering are complex, messy, human processes. Our minds reconstruct, layer, contextualise and sediment. The worldwide web is different. As Google founders Sergey Brin and Larry Page described in their original Stanford research paper, the web is "a vast collection of completely uncontrolled heterogeneous documents".


And search engines take that corpus and give it perpetual, decontextualised freshness. Vast catalogues of human sentiments and stories get served up at the mercurial whims of black box algorithms - algorithms that Brin and Page initially described as "inherently biased towards the advertisers and away from the needs of the consumers", in a way that is "difficult even for experts to evaluate" and therefore "particularly insidious".


The crude, timeless nature of digital memory - and the unquestioned power of private, commercially motivated companies that control it - was a challenge that 59-year-old Costeja Gonzalez decided to tackle directly.


Philosophical divide

In May 2014, the ECJ found against Google. It recognised that when we enter someone's name as a search query, scattered moments of their life are presented mechanistically, with a significance distorted by lack of context, building a detailed but selective profile. So what are the rights of the individuals to whom those profiles relate? And what are the rights of those seeking information?


The question produces an interesting philosophical divide. One position is that once online, information should stay online (except when unlawful under defamation, copyright, or criminal law). This is generally the starting point of most US internet companies, free speech organisations and the media; a typical view for those raised on US First Amendment thought.


On the other hand, there are all manner of reasons to remove data, other than being compelled by law. One might want to remove information for emotional reasons, ethical reasons, or "just because", when there is no countervailing interest. Some recent removal requests approved by Google included patient medical histories and intimate private group conversations that ended up online.


They include prominent reminders that an individual was the victim of rape, assault or other criminal acts; that they were once an incidental witness to tragedy; that those close to them - a partner or child - were murdered. The original sources are often many years or decades old. They are static, unrepresentative reminders of lives past, lacking the dynamic of reality.


In the real world, information sediments over time, affording people the capacity to move on, remembering but not being burdened by their past. Offline, we communicate in different ways with different "publics" and purposes.


And this was the view of the court in Luxembourg, which drew its justification from EU data protection law. The court ruled that personal data should be removed from search results on a person's name when outdated, inaccurate, inadequate, irrelevant, or devoid of purpose, and when there is no public interest.


'Right to be forgotten'

The public interest demands that relevant information remains accessible, which is why pertinent information about elected politicians, public officials, professionals and criminals - and even just bad reviews - all rightly remains accessible, and why Google rejects such requests. Importantly, nothing in the ruling suggested that source material should be deleted: it was solely about the prominence of information in search engine results.


The phrase "right to be forgotten" was mentioned only briefly in the judgement but was immediately seized upon by the media, Google and regulators. Though now replaced by the more accurate "right to delist", the impact of the label "right to be forgotten" was to force the debate into binaries: forgetting vs remembering, privacy vs freedom of expression, censorship vs truth or history. These are false dichotomies, insufficiently nuanced to cope with the reality of our lives and the complexities of human existence


The point of having rights against search engines is not to manipulate memory or eliminate information, but to make it less prominent, where justified, and to combat the side-effects of this uniquely modern phenomenon that information is instantly, globally and perpetually accessible.


Since when has the internet become "truth", or "memory"? And since when has "history" been reduced to Google's commercially prioritised list of an imperfect collection of digital traces? Such elisions ignore the nuance of forgiveness and understanding, in conjunction with memory itself, in building truth and justice. They undervalue privacy and autonomy, at the price of near-total transparency, in building community and security.


The all-or-nothing framings imposed on this case constrain, influence and shape the narrative of a much broader war: the struggle for our digital identities. We have reached a critical moment. Control over our personal data has been all but lost online: lost to corporations, to governments; lost to each other. How can we, as individuals, be empowered by the huge benefits of digital connectivity and global information flows, yet still retain some personal control over the way our identities are represented and traded online? Costeja Gonzalez's case is a small but critical battle on that broader terrain.


Nine months after the European ruling, it is clear that Google's implementation has been fast and idiosyncratic, and has allowed the company to shape interpretation to its own ends, as well as to gain an advantage over competitors and regulators forced into reactive mode. It has avoided a broader and much deeper reflection on digital public space, information sedimentation, and the exploration of collaborative solutions between public and private actors - such as a joint request service across different search engines, with processes for getting confidential advice from publishers and public officials.


Veneer of authenticity

A little more than two weeks after the ruling, Google launched an online form for citizens to identify search result links about themselves that are "irrelevant, outdated or otherwise objectionable" - only a partial reading of the governing law, which also includes "incorrect, inadequate or misleading". It started to remove links one month later. At the time of writing, the company had received 218,427 requests, comprising a total of 789,496 links. It has reached a decision on 83% of the links and actually removed 264,450 of them, or 34%. Yet all this has been done without disclosing its internal processes, removal criteria or how it is prioritising cases.
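Those percentages bear a quick check. Assuming - Google does not say - that the 34% is measured against all submitted links rather than only those already decided, the arithmetic works out as in this rough Python sketch:

```python
# Rough sanity check on the delisting figures quoted above.
# Assumption (not stated in the article): the 34% removal rate is
# measured against all 789,496 submitted links, not only those on
# which Google has already reached a decision.

total_requests = 218_427
total_links = 789_496
decided_fraction = 0.83
links_removed = 264_450

links_decided = total_links * decided_fraction

print(f"Requests received:                  {total_requests:,}")
print(f"Links with a decision (approx.):    {links_decided:,.0f}")
print(f"Removed, as share of all links:     {links_removed / total_links:.1%}")
print(f"Removed, as share of decided links: {links_removed / links_decided:.1%}")
```

On that reading, 264,450 removals out of 789,496 links is about 33.5%, which rounds to the quoted 34%; measured only against the roughly 655,000 links already decided, the removal rate would be closer to 40%.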


Google established an impressive "advisory council" of formidable experts, insulating its processes with a veneer of authenticity and respectability - even though the council is excluded from any actual knowledge of what Google is doing internally, because the company has revealed so few details of the cases it is processing. Created to compete with democratically legitimate expert regulatory bodies, the council concluded its work with an "independent" report released in February 2015, setting out recommendations based on seven recent consultations across Europe.


When the ECJ announced its ruling, Google criticised it as "disappointing", "striking the wrong balance" and "going too far". Yet despite highlighting the "many open questions" of the ruling, Google chose not to wait for guidance from the regulators, which eventually emerged in late November (and which Google has since ignored). It has taken every opportunity to passively promote its role as a "truth" engine while avoiding discussion of the deficiencies of search: algorithmic bias, incomplete coverage, murky reputation management practices and heavy cultural bias.


Most controversially, Google's interpretation is that successful requests are removed only under European Google domains such as google.fr, google.uk and google.de. In contrast, requests alleging copyright infringement - which outnumber privacy requests by 1,000:1 - are implemented under US law on all its domains worldwide, including the largest, google.com.


Tactical decision

Google's decision to remove links only on its European domains was "unacceptable" and created a trivial workaround that undermines the ruling, according to the November report by regulators from 28 European countries and the European Commission. But, backed by its advisory council, Google's decision was tactical, shifting discussion away from the core issue of establishing and protecting digital rights, and instead encouraging conflict between apparent European and US viewpoints.


A line repeatedly used by executive chairman Eric Schmidt and chief legal counsel David Drummond is that Google has always seen itself as a "card index" for the web - an oddly archaic analogy that implies objectivity, memory and public record. Yet Google can and does curate its search results, for example removing material it judges to be promoting terrorism or child abuse.


Google refuses to countenance the possibility of an algorithmic flaw, yet the question remains why Costeja's old debts - and the private, incorrect or outdated information of others who have made requests to Google since - were featured so prominently in search results.


While Google previously removed personal information in limited cases of clear and imminent harm, such as identity theft or financial fraud, this case represents the first generally accessible speed-bump on what has been an open road for Google to aggregate and proliferate publicly accessible personal information.


The right to delist forces us to look at the privatised reality of digital life, and to take responsibility for what we see. Internet companies have been successful in making us believe that the internet is "public space", when, in reality, it is just an agglomeration of privately owned services. Not public parks, or the Greek agora where politics was built, but a long run of amusement parks. The notion of public space is fundamental to democratic, community-oriented rule.


Asymmetries of power

So, if we concede that the internet is public space, that the web is the public record, then Google, on its logic, is the custodian and indexer of our personal records. We must be careful to distinguish the offerings of a handful of internet services from the real public record guaranteed by law, from archives, and even from human memory itself - which will all continue to be available when the amusement park closes.


Citizens have unwittingly come to rely comprehensively on privately owned, culturally biased, black-box services in navigating the digital ecosystem. Google has benefited vastly from this custom, creating enormous asymmetries of power when compared to the creators, subjects and consumers of digital content.


The selective information flowing from Google's sophisticated PR machine has seen little pushback from publishers. After the web form for requests was introduced, the press was told of tens of thousands of requests, and that many of them concerned criminals, paedophiles, dodgy doctors and politicians.


The first removal was made public not by Google, with a clear breakdown of what was being delisted and why, but by prominent journalists reacting to alarming emails sent to their webmasters headed "Notice of removal from Google search".


By selectively covering the most sensational removal requests, the media created the false impression that most of those seeking delisting are "bad people". The concern that they might be allowed to cover their tracks is understandable, but the broader implication is dangerously misleading.


Regulators need to take a more active and central role in these kinds of legal and ethical debates, but have struggled to keep pace with technology. Most of Europe's 31 national data protection authorities are cumbersome, under-resourced bureaucracies issuing occasional, random fines and reacting when a court occasionally clarifies the law. Europe's data protection laws need to be deconstructed, simplified and rebuilt into a more workable form. Nevertheless, their aspirations are critically important.


The same regulators should be encouraging a more nuanced and transparent discussion with Google and other search engines, confronting complex issues such as how the ruling sits alongside laws on processing sensitive data, and the role of the media and of sources. Yet regulators have been far too inactive, tolerating misrepresentation of removal requests by the press and failing to insist on greater transparency, both of which have undermined the protection of individuals' data.


Nuance, empathy and respect

Publishers, too, have a case to answer. Was La Vanguardia entirely in the right to republish its entire archive, or was it careless? Balancing transparency and the protection of the individual, publishers should consider tailored responses: removing the article at source or pseudonymising the subject; removing data from the search engine; geo-filtering; a right to reply; or updated contextual information.


We need more sophisticated technical processes to improve how personal data is handled, flagging data as sensitive so that search engines and data processors apply data protection principles in a more intelligent way - with the nuance, empathy and respect individuals command in real life.
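As a purely illustrative sketch of what such flagging could mean in practice (the names, fields and URLs below are hypothetical, not drawn from any existing system), sensitivity metadata could travel with indexed records, and a name search could be made to honour it:

```python
from __future__ import annotations
from dataclasses import dataclass

# Hypothetical illustration of "flagging data as sensitive" at index time,
# so that a search over a person's name can treat flagged records differently.

@dataclass
class IndexedRecord:
    url: str
    subject_name: str
    sensitive: bool = False       # e.g. victim status, health data, lapsed debts
    delisted_for_name: bool = False

def name_search(query_name: str, index: list[IndexedRecord]) -> list[IndexedRecord]:
    """Return records matching a person's name, respecting delisting flags."""
    return [
        r for r in index
        if r.subject_name == query_name and not (r.sensitive and r.delisted_for_name)
    ]

index = [
    IndexedRecord("https://example.org/auction-1998", "M. Costeja",
                  sensitive=True, delisted_for_name=True),
    IndexedRecord("https://example.org/interview-2014", "M. Costeja"),
]

# The 1998 notice stays online at its source but no longer surfaces on a name search.
print([r.url for r in name_search("M. Costeja", index)])
```

The point of the sketch is the principle the article argues for: the source material is untouched, and only its prominence against a person's name changes.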


As information security expert Dan Geer has characterised it, the right to delist is "the only check on the tidal wave of observability that a ubiquitous sensor fabric is birthing now, observability that changes the very quality of what 'in public' means".


This struggle for freedom, autonomy and control is unfolding within a digital ecosystem defined by surveillance. We are already a long way along the path of a parasitic system, offering 'free' services in return for the exploitation of personal data. But this should not mean that we throw up our hands in despair, abandoning responsibility as the circulation of personal data slips beyond all control.


Blunt, binary logic might work for machines, but it doesn't work for humans. Our right, and our basic human need, to disclose, seek, find, transform, and distribute information must be reconciled with our equal right and need to be left alone. We have a right to decide to withhold, to remain silent, to resist. This is what is at stake here: our own rightful sovereignty over our life stories, our personal narratives, our communications and even our very memories themselves.





