Tailored Search: Is Google Incorporating Our Biases In Its Results and Hiding Them From Us?
-- By PatrickOConnor - 23 Oct 2012
Introduction
It has become clear that the central bargain struck by internet users in the era of “free” (as in gratis) online services involves the surrender of personal information. In exchange for whatever service Facebook or Google provides, the user allows the company to manipulate the information provided in the course of using that service and to market it to third parties. In analyzing this bargain, more attention must be paid to how that information is used to shape the service itself.
For at least six years, the internet has served as my principal source of information. In order to manage the massive amount of information available, I rely on filters of various sorts. As a result, the information I access is determined in some part by the mechanisms these filters employ to weigh the qualities deemed relevant to a result's usefulness.
When I enter a search query into Google’s search engine, I expect to receive a set of results that has not been subjected to government censorship or private editorial approval. That is, I expect to receive results that are generated impartially by a disinterested mechanism. The question I am interested in is: what is lost or gained when this mechanism takes into consideration information about the user and the user’s habits, in addition to information about the webpages it is filtering?
What’s the Problem?
There are some obvious situations in which allowing a search algorithm to take into consideration information about the user dramatically increases the usefulness of the service. For example, when I search for information about restaurants that deliver, I am not interested in restaurants located in other cities.
The inclusion of user information becomes more problematic when the range of search subjects is expanded to information that would qualify as “News.” One can imagine two distinct biases that might be injected into search results in order to provide “better results”: demography-influenced filters (tailoring) and user-history-influenced filters (adaptive search). These techniques provide results that may be more satisfying to users, at the expense of objectivity.
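The distinction is easier to see in a toy sketch. Nothing below describes Google's actual ranking system, which is not public; the functions, feature names, and weights are all invented for illustration. The point is only that each personalization term pulls the ordering away from a single shared ranking.

<verbatim>
# Toy sketch of tailored vs. adaptive ranking (Python). All names and
# weights are hypothetical; no real engine's internals are public.

def base_relevance(doc, query):
    """Query-document match alone: identical for every user."""
    text = doc["text"].lower()
    return sum(text.count(word) for word in query.lower().split())

def demographic_affinity(doc, user):
    """Tailoring: how popular the document is with the user's demographic group."""
    return doc["popularity_by_group"].get(user["group"], 0.0)

def history_affinity(doc, user):
    """Adaptive search: how often the user has clicked this document's source."""
    return user["click_counts"].get(doc["source"], 0)

def personalized_score(doc, query, user, w_demo=0.5, w_hist=0.5):
    # With w_demo = w_hist = 0, every user sees the same ordering.
    # Raising either weight bends the ranking toward the user's group
    # (tailoring) or toward her past behavior (adaptive search).
    return (base_relevance(doc, query)
            + w_demo * demographic_affinity(doc, user)
            + w_hist * history_affinity(doc, user))
</verbatim>

On this sketch the bias lives entirely in the two weights, and neither is visible in the result list itself, which is exactly the transparency problem discussed below.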
The Importance of Objectivity
It is possible to assess the objectivity of a given search engine on two different scales. The first, which might be called internal objectivity, deals with the extent to which the mechanisms used to assess the relevance of information reflect traditional forms of bias. The second, which might be called cross-user objectivity, is concerned with the extent to which a search engine returns different results to different users. In both cases, the user is unable to assess the extent to which results are biased and the extent to which those biases shift over time as she interacts with the service. If the user is unaware of these problems, she is left with the mistaken assumption that she is accessing the most relevant available information. If she is aware, she faces a sort of epistemological conundrum: how does one parse the results for bias without understanding which biases are in play or how those biases interact?
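Cross-user objectivity, unlike internal objectivity, at least admits a rough outside test: issue the same query as two different users and compare the top results. A minimal sketch, assuming the two ordered result lists have somehow been obtained (the example URLs are made up):

<verbatim>
# Jaccard overlap of the top-k result URLs two users receive for the
# same query: 1.0 means identical sets (order ignored), 0.0 disjoint.

def top_k_overlap(results_a, results_b, k=10):
    a, b = set(results_a[:k]), set(results_b[:k])
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

user_a = ["nytimes.com/story", "bbc.com/story", "blog.example/post"]
user_b = ["foxnews.com/story", "bbc.com/story", "blog.example/other"]
print(top_k_overlap(user_a, user_b, k=3))  # 0.2: heavily personalized
</verbatim>

No comparable test exists for internal objectivity: if every user receives the same biased ranking, the overlap is perfect and the measurement reveals nothing.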
Detecting and accounting for bias is an essential skill for those who seek to inform themselves about the news on the internet. The fact that everyone with a Twitter or Blogspot account can broadcast facts and opinions at little or no cost has led to an explosion of commentary. Navigating this mass of information requires that users step back and anticipate biases before digesting information. Tailored and adaptive search results further complicate this task by introducing what amount to the user's own biases.
Betraying the Promise of Free Access to Information
Universal access to knowledge is one of the promises of the networking technology underlying the world wide web. We no longer rely on other human beings to transmit knowledge, nor do we rely on expensive printed materials. Freed from these hindrances, each person is, in theory, free to enlighten himself or herself. However, network technologies cannot entirely alleviate the need for some form of mediation between the user and the mass of knowledge.
We should seek forms of mediation that filter for reliability and direct users to useful information. Intermediary services must be transparent; that is, they must be upfront about the assumptions underlying the mechanisms used to order results. Armed with this knowledge, users may then adjust their own use. Adaptive and tailored search results ensure that the biases underlying a search engine are ever shifting, individualized, and less visible to the user.
Some estimate of the magnitude of the phenomenon is part of any reasonable evaluation, but we have no indication of the breadth of the effect. We can be sure, however, of securing an unbiased search in this sense if we are not presenting a cookie linked to a browsing history, are not logged in using any other Google services, and are browsing from the IP address of a multi-user mainframe or machine cluster. This is comparatively simple to achieve, and anyone actually trying to protect her privacy would be doing these things anyway, for other reasons. If people broadly adopted personal privacy technology, this effect would be automatically turned off for all such people, and might no longer be valuable to do at all.

Shouldn't that at least be pointed out in discussing the situation?

-- EbenMoglen - 06 Jan 2013
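The comment's suggestion can be turned into a crude experiment: fetch the same query once from a "clean" session presenting no cookies and once from a personalized one, and compare the lists. In the sketch below, requests is a real Python library, but fetch_results is a hypothetical helper; reliably extracting result links from a live results page is brittle and may violate the engine's terms of service, so it is left out.

<verbatim>
# Crude estimate of the magnitude of personalization: compare a
# cookie-less query against a personalized one. fetch_results() is a
# hypothetical helper -- parsing live results-page markup is fragile
# and engine-specific, so it is only stubbed here.

import requests

def clean_session():
    """A session presenting no cookies and a generic User-Agent."""
    s = requests.Session()  # a fresh Session carries no cookies
    s.headers["User-Agent"] = "Mozilla/5.0 (generic)"
    return s

def fetch_results(session, query):
    """Hypothetical: return the ordered result URLs for `query`."""
    raise NotImplementedError("depends on the particular engine's markup")

def personalization_gap(query, personalized_session, k=10):
    baseline = fetch_results(clean_session(), query)[:k]
    tailored = fetch_results(personalized_session, query)[:k]
    return 1.0 - len(set(baseline) & set(tailored)) / k  # 0.0 = identical
</verbatim>

Even this understates the comment's conditions: the "clean" request still reveals an IP address, which is why browsing from a shared, multi-user address matters as well.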