MislavMataijaFirstPaper 5 - 06 May 2009 - Main.MislavMataija
|
|
META TOPICPARENT | name="FirstPaper" |
Assessing a Regulatory Approach to Data Protection: the EU Directive | | Doing too much and doing too little | |
< < | Some of its broadly applicable obligations may seem tough. But even if in some ideal world they were really effective in regulating business practices, they would also be able of imposing unnecessarily obstacles to harmless conduct and chilling free expression. Thus, in Lindqvist, the Directive was applied to a church volunteer who posted a list of people working for her parish on a website, along with phone numbers and some "mildly humorous" information on their jobs and hobbies. Under that standard, having a list of students and their e-mails available on a public wiki would definitely be suspect. | > > | Some of its broadly applicable obligations may seem tough. But even if in some ideal world they were really effective in regulating business practices, they would also be capable of imposing unnecessary obstacles to harmless conduct and chilling free expression. Thus, in Lindqvist, the Directive was applied to a church volunteer who posted a list of people working for her parish on a website, along with phone numbers and some "mildly humorous" information on their jobs and hobbies. Under that standard, having a list of students and their e-mails available on a public wiki would definitely be suspect. | | Elevating privacy to the level of a fundamental right does not help either, because many other interests merit that lofty status in the case law of the European Court of Justice. A case in point is Telefonica, where a recording industry group demanded the names and addresses of a Spanish ISP's clients. The case was governed, said the Court, by three fundamental rights: right to property, right to an effective remedy, and the right to respect for private life. Which of these will win? The judgment does not really say - all we are left with is a hodge-podge of fundamental rights, from which national courts are supposed to derive a solution through "fair balancing". One can only imagine what the outcomes are in the 27 Member States. |
|
MislavMataijaFirstPaper 4 - 06 May 2009 - Main.MislavMataija
|
|
META TOPICPARENT | name="FirstPaper" |
| |
< < | A market for privacy? | > > | Assessing a Regulatory Approach to Data Protection: the EU Directive | | | |
< < | -- By MislavMataija - 06 Mar 2009 | > > | -- By MislavMataija - 01 May 2009 | | | |
< < | Can the collection of surfing habits, preferences, typed words and other datastreams left online, be compensated for on a market basis? Some people have suggested models in which that might work. Consumers, or perhaps "infomediaries" negotiating in their favor, would come to terms with online service providers wishing to acquire data about them. Sure, you lose some of your privacy, but you get something of value in return – money, goods, services, access. | | | |
< < | The implication of having a workable model of this sort is that there is little need for government regulation. Instead of supposedly inefficient prohibitions and regulations on the use of personal data, the market could not only decide what happens to the data, but also police violations of the consumer-provider contract. | > > | Can the collection of personal data be regulated by the market? Some people have suggested models in which consumers could negotiate with companies interested in their data, to everyone's benefit; others argue for industry self-regulation through common privacy policies. The implication of having a workable model of this sort is that there is little need for government regulation. Instead of supposedly inefficient regulations, the market could decide what happens to the data, as well as police violations. | | | |
< < | There are, of course, assumptions in these theories. First, privacy interests have to be stated in terms of economic value. Second, both parties have equal access to information and there are no significant transaction costs standing in the way. Third, the parties will abide by the terms of the bargain. Fourth, the buyer does not pass the data on – this would clearly be a distortion of the system in which every privacy concession is compensated by the buyer. | > > | The opposite perspective is broad-ranging regulation. The EU Data Protection Directive, which limits the collection and use of data, at least provides a minimum standard and some sort of predictability for the person whose data is used, unlike the US system of piecemeal or "self"-regulation. But how well does the Directive do its job? The verdict is not good. | | | |
< < | Incentives and the status quo | > > | First of all, its power to protect individual rights is compromised by the market-building approach that lies at its core. As a harmonizing measure, it is meant to achieve a common market in data. While this may not necessarily be a bad thing, it means that privacy protection is not its center of gravity and is always tainted by the desire to facilitate cross-border trade and services. Evidence of this is that, unlike even the US situation, Member States are not allowed to provide for higher levels of data protection within the scope of the Directive. | | | |
< < | Unfortunately, none of these assumptions is realistic. More importantly, what is the incentive for sites that live on marketing their users' information to negotiate away their most valued asset? Everyday experience tells us, instead, that they make even the simplest steps such as unsubscribing from mailing lists, terminating your account as difficult as possible – not to mention keeping their privacy policies convoluted, lengthy and ever-changing. Since the ease of access to personal data is a central part of their business model, they are not likely to make concessions. The bargain, if one can speak of such a thing, is likely to be built into the system. For the customer it will probably remain a "take it or leave it" scenario. Since the risks are dispersed and require foresight, and the gains are immediate and clear, most people will take the offer. If there was any additional negotiation to be had at any point, someone would probably have come up with it by now. | > > | Doing too much and doing too little | | | |
< < | BT's rollout of Phorm, a "service" which tracks users' surfing habits and serves them with targetted ads, is a case in point. No one seemed to have tried negotiating any sort of compensation for users. If the rollout happens, it will simply be a part of the broadband package offered by BT (perhaps it could be worked around or turned off, if the user is so inclined and sufficiently tech-savvy). To add insult to injury, it will probably be billed as a benefit – instead of just random advertising, you get context-specific product information tailored to your own interests. Everyone wins. | > > | Some of its broadly applicable obligations may seem tough. But even if in some ideal world they were really effective in regulating business practices, they would also be able of imposing unnecessarily obstacles to harmless conduct and chilling free expression. Thus, in Lindqvist, the Directive was applied to a church volunteer who posted a list of people working for her parish on a website, along with phone numbers and some "mildly humorous" information on their jobs and hobbies. Under that standard, having a list of students and their e-mails available on a public wiki would definitely be suspect. | | | |
< < | Who gets the data? | > > | Elevating privacy to the level of a fundamental right does not help either, because many other interests merit that lofty status in the case law of the European Court of Justice. A case in point is Telefonica, where a recording industry group demanded the names and addresses of a Spanish ISP's clients. The case was governed, said the Court, by three fundamental rights: right to property, right to an effective remedy, and the right to respect for private life. Which of these will win? The judgment does not really say - all we are left with is a hodge-podge of fundamental rights, from which national courts are supposed to derive a solution through "fair balancing". One can only imagine what the outcomes are in the 27 Member States. | | | |
< < | The problem, of course, is not only what the original provider – or contractor – uses the data for. It is also whether, and how much, he shares it with others. What Facebook knows is perhaps not as much a reason to worry as what the providers of all the third-party Facebook applications know, and where your data goes from there. The weakest link controls the spread, and even if you trust the honesty of Facebook there is simply no way of guaranteeing your data is protected once it is divulged to a third party. | > > | How effective is it? | | | |
< < | In addition to social networking apps such as Facebook, the amount of data we now hold online for different reasons is another cause of concern. The files I have on my own computer easily come within 4th amendment protection, but what about all the Google Docs spreadsheets and other materials stored online, somewhere? Can I guarantee that all that data remains private? Once again, even if I know who I can negotiate with, and even if I trust their particular promises, all kinds of third parties can be involved. A company I am contracting with might tomorrow be contracting to store my data with someone else. The cloud is perhaps a fitting description for the place your data ends up going to. | > > | Finally, all of these doubts to one side, the Directive is not effective in practice. The more specific problems are: people don't know about it, companies don't follow it and national regulators are not really functional. | | | |
< < | Legal solutions | > > | In an EU-wide survey, 48% of companies said that they received less than 10 requests for access to personal data in the previous year, and 25% said they have received none. A third of surveyed individuals had never heard of their most important rights under the Directive - access, correction and erasure of data. While this may be explained by general apathy regarding privacy matters, it certainly does not serve as a glowing recommendation for the Directive. | | | |
< < | If the problem is really impossible to negotiate away, what are the alternatives? More robust regulation could be a step in the right direction. EU-like regulation of data protection, which limits the collection and use of data, at least provides a minimum standard and some sort of predictability for the person whose data is used, unlike the US system of piecemeal or "self"-regulation. It is still, however, questionable to what extent data protection rules are implemented and enforced, and how much they actually constrain anyone. In the US context, making constitutional guarantees less dependent on personal space, and therefore extending the right to privacy in your data even if it is located on a remote server, could also be a step in the right direction. | > > | As for the companies subject to legislation, the consensus so far seems to be that compliance is not a top priority because of low detection risk and weak enforcement. The Directive also tries to encourage the adoption and clearance of industry-wide privacy policies, but only one has been adopted so far. | | | |
< < | Legal guarantess, in other words, are helpful. Still, they do not get you very far. The first problem is enforcement. But a bigger problem is the fact that users do in fact "negotiate" their privacy away – even if they do so with no real consideration, in a context of asymmetric information and high transaction costs. To stop that from happening, something will have to change in the way people value their privacy. But the forces working in the other direction seem overwhelming. The business models of most of the popular social networking sites are centered around collecting personal data for marketing purposes, and every willing participant has a measurable market value from their perspective. For users with a high level of privacy awareness, that value will always be low. Consequently, anyone who cares enough to want to negotiate is probably not worth negotiating with anyway. As a benefit to users, privacy negotiation sounds like a thoroughly unrealistic idea. | > > | Data protection agencies | | | |
< < |
Word count: 982 | > > | One way to improve the situation might be by raising awareness and scrutinizing individual companies more aggressively. Explaining in more accessible language, perhaps by way of real-life examples, what "data processing incompatible with the purpose for which the data were initially collected" means, would be a good start. This should be the job of the national data protection agencies.
The problem is, however, that less than a third of Europeans are aware of the existence of those agencies. Looking at some of their "guides for the citizens", maybe that is for the better. As a random example, the Irish guide is 20 pages of trite text more or less repeating the Directive, with some pep talk sprinkled around ("Who is a data subject? We are all data subjects!"). A survey from 2009 shows that the agencies are largely reactive, do not engage with other agencies or NGOs, and have no mechanisms in place to measure the effectiveness of their "promotional activities". Elsewhere they have been described as "characterised by excessive legalism and procedures" (Y. Poullet).
Finally, national legislation is not close to being fully harmonized, even now. Poullet's report indicates that some national rules diverge even over the most fundamental concepts, such as what "personal data" means, how data processing can be justified and the scope of the right of access to personal data. This has actually led to companies pleading for an EU-level regulation leaving no implementation powers to the Member States.
All of this is still not an argument against regulation, and definitely not an argument for "self-regulation". But if the world's most ambitious piece of data protection legislation has achieved so little to protect privacy, there is a problem. Progress might come if European and national regulators start to cooperate more effectively in developing a "privacy culture". Perhaps they can influence policymakers and regulators in other fields to make privacy a central concern; perhaps some of that will lead to better industry practices in specific sectors. So far, however, none of that seems to be happening.
Word count: 992
[Old version removed! Mislav] | | Mislav, if you have come across any great papers discussing how well the EU Data Protection Directive has (or has not worked), I would appreciate it if you could pass those on to me as they would be of great interest to me. Thanks. -- KateVershov - 09 Mar 2009 |
|
MislavMataijaFirstPaper 3 - 17 Apr 2009 - Main.EbenMoglen
|
|
< < |
META TOPICPARENT | name="FirstPaper%25" |
| > > |
META TOPICPARENT | name="FirstPaper" |
| | A market for privacy?
-- By MislavMataija - 06 Mar 2009 | | Mislav, if you have come across any great papers discussing how well the EU Data Protection Directive has (or has not worked), I would appreciate it if you could pass those on to me as they would be of great interest to me. Thanks. -- KateVershov - 09 Mar 2009 | |
< < | # * Set ALLOWTOPICVIEW = TWikiAdminGroup, MislavMataija | > > |
- I don't think I understand either the technical or economic arguments of the essay. It's hard to have a functioning market in valueless data. The "behavioral marketing from clickstream" scam is just stupid, as I've mentioned elsewhere, so we don't have to wonder why there isn't a market in the dreck Phorm can capture. But because I don't know how to evaluate the arguments that seem to me to depend on unestablished technical or economic propositions, I don't really know how to take the "We should talk about what turns out to be a thoroughly unrealistic idea" structure of this paper. Is it a deliberate send-up? A satire whose point I miss? It can't be serious but it doesn't seem to be fooling.
|
MislavMataijaFirstPaper 2 - 09 Mar 2009 - Main.KateVershov
|
|
META TOPICPARENT | name="FirstPaper%25" |
A market for privacy? | | Legal guarantess, in other words, are helpful. Still, they do not get you very far. The first problem is enforcement. But a bigger problem is the fact that users do in fact "negotiate" their privacy away – even if they do so with no real consideration, in a context of asymmetric information and high transaction costs. To stop that from happening, something will have to change in the way people value their privacy. But the forces working in the other direction seem overwhelming. The business models of most of the popular social networking sites are centered around collecting personal data for marketing purposes, and every willing participant has a measurable market value from their perspective. For users with a high level of privacy awareness, that value will always be low. Consequently, anyone who cares enough to want to negotiate is probably not worth negotiating with anyway. As a benefit to users, privacy negotiation sounds like a thoroughly unrealistic idea. | |
> > |
| | Word count: 982 | |
< < | | > > | Mislav, if you have come across any great papers discussing how well the EU Data Protection Directive has (or has not worked), I would appreciate it if you could pass those on to me as they would be of great interest to me. Thanks. -- KateVershov - 09 Mar 2009 | |
# * Set ALLOWTOPICVIEW = TWikiAdminGroup, MislavMataija |
|
MislavMataijaFirstPaper 1 - 06 Mar 2009 - Main.MislavMataija
|
|
> > |
META TOPICPARENT | name="FirstPaper%25" |
A market for privacy?
-- By MislavMataija - 06 Mar 2009
Can the collection of surfing habits, preferences, typed words and other datastreams left online be compensated for on a market basis? Some people have suggested models in which that might work. Consumers, or perhaps "infomediaries" negotiating in their favor, would come to terms with online service providers wishing to acquire data about them. Sure, you lose some of your privacy, but you get something of value in return – money, goods, services, access.
The implication of having a workable model of this sort is that there is little need for government regulation. Instead of supposedly inefficient prohibitions and regulations on the use of personal data, the market could not only decide what happens to the data, but also police violations of the consumer-provider contract.
There are, of course, assumptions in these theories. First, privacy interests have to be stated in terms of economic value. Second, both parties have equal access to information and there are no significant transaction costs standing in the way. Third, the parties will abide by the terms of the bargain. Fourth, the buyer does not pass the data on – this would clearly be a distortion of the system in which every privacy concession is compensated by the buyer.
Incentives and the status quo
Unfortunately, none of these assumptions is realistic. More importantly, what is the incentive for sites that live on marketing their users' information to negotiate away their most valued asset? Everyday experience tells us, instead, that they make even the simplest steps, such as unsubscribing from mailing lists or terminating your account, as difficult as possible – not to mention keeping their privacy policies convoluted, lengthy and ever-changing. Since the ease of access to personal data is a central part of their business model, they are not likely to make concessions. The bargain, if one can speak of such a thing, is likely to be built into the system. For the customer it will probably remain a "take it or leave it" scenario. Since the risks are dispersed and require foresight, and the gains are immediate and clear, most people will take the offer. If there were any additional negotiation to be had at any point, someone would probably have come up with it by now.
BT's rollout of Phorm, a "service" which tracks users' surfing habits and serves them with targeted ads, is a case in point. No one seems to have tried negotiating any sort of compensation for users. If the rollout happens, it will simply be a part of the broadband package offered by BT (perhaps it could be worked around or turned off, if the user is so inclined and sufficiently tech-savvy). To add insult to injury, it will probably be billed as a benefit – instead of just random advertising, you get context-specific product information tailored to your own interests. Everyone wins.
Who gets the data?
The problem, of course, is not only what the original provider – or contractor – uses the data for. It is also whether, and how much, he shares it with others. What Facebook knows is perhaps not as much a reason to worry as what the providers of all the third-party Facebook applications know, and where your data goes from there. The weakest link controls the spread, and even if you trust the honesty of Facebook there is simply no way of guaranteeing your data is protected once it is divulged to a third party.
In addition to social networking apps such as Facebook, the amount of data we now hold online for different reasons is another cause of concern. The files I have on my own computer easily come within 4th amendment protection, but what about all the Google Docs spreadsheets and other materials stored online, somewhere? Can I guarantee that all that data remains private? Once again, even if I know who I can negotiate with, and even if I trust their particular promises, all kinds of third parties can be involved. A company I am contracting with might tomorrow be contracting to store my data with someone else. The cloud is perhaps a fitting description for the place your data ends up going to.
Legal solutions
If the problem is really impossible to negotiate away, what are the alternatives? More robust regulation could be a step in the right direction. EU-like regulation of data protection, which limits the collection and use of data, at least provides a minimum standard and some sort of predictability for the person whose data is used, unlike the US system of piecemeal or "self"-regulation. It is still, however, questionable to what extent data protection rules are implemented and enforced, and how much they actually constrain anyone. In the US context, making constitutional guarantees less dependent on personal space, and therefore extending the right to privacy in your data even if it is located on a remote server, could also be a step in the right direction.
Legal guarantees, in other words, are helpful. Still, they do not get you very far. The first problem is enforcement. But a bigger problem is the fact that users do in fact "negotiate" their privacy away – even if they do so with no real consideration, in a context of asymmetric information and high transaction costs. To stop that from happening, something will have to change in the way people value their privacy. But the forces working in the other direction seem overwhelming. The business models of most of the popular social networking sites are centered around collecting personal data for marketing purposes, and every willing participant has a measurable market value from their perspective. For users with a high level of privacy awareness, that value will always be low. Consequently, anyone who cares enough to want to negotiate is probably not worth negotiating with anyway. As a benefit to users, privacy negotiation sounds like a thoroughly unrealistic idea.
Word count: 982
# * Set ALLOWTOPICVIEW = TWikiAdminGroup, MislavMataija |
|