NicolaiNuberFirstPaper 5 - 19 May 2018 - Main.NicolaiNuber
|
|
META TOPICPARENT | name="FirstPaper" |
| |
| |
< < | Agreed that users are likely more concerned about private surveillance. I was trying to point to a slightly different aspect, notably the convergence of private and government surveillance. Governments use data and tech from private companies. Generally, the strict separation of the two spheres (government vs. private surveillance) has become blurred and private companies’ surveillance poses new challenges to democracies. On example that comes to mind is the Cambridge Analytica scandal and the corresponding data gathering of private companies targeted at influencing public opinion and elections. | > > | Agreed that users are likely more concerned about private surveillance. I was trying to point to a slightly different aspect, notably the convergence of private and government surveillance. Governments use data and tech from private companies. Generally, the strict separation of the two spheres (government vs. private surveillance) has become blurred and private companies’ surveillance poses new challenges to democracies. One example that comes to mind is the Cambridge Analytica scandal and the corresponding data gathering of private companies targeted at influencing public opinion and elections. | | |
|
NicolaiNuberFirstPaper 4 - 18 May 2018 - Main.NicolaiNuber
|
|
META TOPICPARENT | name="FirstPaper" |
| |
| |
> > |
I was not trying to imply that government will not attempt to prevent cyber-attacks; quite the contrary is likely true. What I was trying to point to is that citizens’ trust in their government’s ability to protect them from outside attacks has decreased. By way of example, after the revelation that Chancellor Merkel’s phone was tapped by the NSA for three years, German citizens’ trust in their government declined. While government-on-government spying is nothing new, what has changed – and what Snowden revealed – was the unprecedented scale of the spying activities, which are mainly made possible by technological developments.
| | With the knowledge gained thanks to Snowden, we can change our behavior accordingly - at least theoretically. By we, I mean each and every one of us. I say theoretically because society is easily distracted and technological knowledge and abilities are often lacking. Be that as it may, at least we now know about the occurrence and the scale of government surveillance. What society generally doesn’t appreciate, though, is the role the big data economy plays as a facilitator (voluntarily or not) of privacy attacks inflicted upon us.
| |
| |
> > |
Agreed that users are likely more concerned about private surveillance. I was trying to point to a slightly different aspect, notably the convergence of private and government surveillance. Governments use data and tech from private companies. Generally, the strict separation of the two spheres (government vs. private surveillance) has become blurred and private companies’ surveillance poses new challenges to democracies. On example that comes to mind is the Cambridge Analytica scandal and the corresponding data gathering of private companies targeted at influencing public opinion and elections.
| | And these private companies are typically not subject to constitutional notions of privacy but rather possess wide discretion to use our data as set forth in their terms of use.
| | The Net and the Private-Public Convergence
| |
< < | The internet is as a system without clear country borders and where private and public sectors converge. This means that we live in a world where big data companies might become (voluntary or involuntary through e.g. decryption backdoors) henchmen of governments. If mandated for the wrong reasons, the government’s availment of big data companies isn’t much different morally than the 18th century-style issuance of general warrants. What is very different in the digital age though is the scale, speed and simplicity with which a disagreeable individual can be traced and manipulated. Even more severe are the potential chilling effects the technological possibilities in the digital age imply to a society as a whole (see Xinjiang): The big data surveillance is not merely about the data collection of individuals but about the study of contextual and collective human behavior. Other reasons for concern are the sometimes opaque motives of big data companies themselves, let alone the difficulty for consumers to know with certainty the implications of all the data they give away for free. These technological possibilities and the shamelessness with which certain political proposals treat privacy issues (such as the Feinstein-Burr decryption bill) should have all of us concerned. The technological possibilities spur politician’s Benthamian-kind utilitarian hopes and, if misappropriated, could end-up in (maybe still science-fiction-like seeming) realities as described in Yuval Noah Harari’s Homo Deus or Frederick B. Skinner’s Walden Two. | > > | The internet is a system without clear country borders and one where private and public sectors converge. This means that we live in a world where big data companies might become (voluntarily, or involuntarily through e.g. decryption backdoors) henchmen of governments. If mandated for the wrong reasons, the government’s availment of big data companies isn’t much different morally from the 18th century-style issuance of general warrants. What is very different in the digital age, though, is the scale, speed and simplicity with which a disagreeable individual can be traced and manipulated. Even more severe are the potential chilling effects that the technological possibilities of the digital age imply for society as a whole (see Xinjiang): big data surveillance is not merely about the collection of data on individuals but about the study of contextual and collective human behavior. Other reasons for concern are the sometimes opaque motives of big data companies themselves, let alone the difficulty for consumers to know with certainty the implications of all the data they give away for free. | |
Burrhus Frederic—known universally as BF—Skinner, actually. It would have been better to check. | | What are the names being dropped for? I don't associate Dianne Feinstein with Jeremy Bentham much. Is this the Panopticon Bentham? Not really a model for thinking about the national security state, nor an explanation of why the senior Democrat and Republican on the Senate Intelligence Committee would take the political positions these senators are predictably taking (again). They're not visionaries imagining some changed form of human society. They're political supporters of the intelligence community whose influence depends on relationship.
| |
> > |
The names were dropped to give general examples of (possible) unwanted realities that could result from a combination of political initiatives that attack privacy and today’s technological possibilities. It wasn’t intended to link specific politicians to particular ideologies. Since this wasn't clear, I deleted the names. In my view, such unwanted realities would be ones shaped by Benthamian utilitarian thinking or by Harari’s Homo Deus, both of which presuppose a fully transparent citizen.
| | | | Regulating Private Companies
| |
< < | In regards to regulation, the principles set forth in the EU’s General Data Protection Regulation (GDPR) seem to me to be a step in the right direction. The new sanction system that will enter into force – a fine of the higher of either up to 4 % of a company’s worldwide annual turnover or up to 20 million EUR – should finally cause companies to take data protection seriously. The privacy by design principle encourages encryption of private data. Further, the extra-territorial reach of GDPR might foster the development of converging rules and provide an equal level of privacy protection, but that is far from certain. Especially the current US administration doesn’t seem to be too bothered by the fact that the future of the US – EU and US – Swiss, respectively, Privacy Shield framework’s future is unclear, given the concerns that the EU Article 29 Working Party has expressed regarding the self-certification procedure of US companies (https://www.hldataprotection.com/2017/12/articles/international-eu-privacy/article-29-working-party-sets-deadline-to-address-privacy-shield-concerns/). Furthermore, the European Commission recently announced that it will make it a priority in future trade and investment agreements to counter rules from other countries – including Russia, China and India – that require companies to store data on local servers (https://www.forbes.com/sites/davidschrieberg1/2018/02/11/e-u-hoping-new-data-protection-under-gdpr-will-have-global-impact/#57258c042dc1). However, it seems unlikely that non-EU countries will follow these demands them due to fundamentally divergent views and interests when it comes to privacy. Also, unfortunately private sector companies are likely to cave into demands on local data storage and sharing, as recently observed with the demands made by China and followed by apple (https://www.cnet.com/news/apple-moving-icloud-encryption-keys-to-china-for-china-based-users/). While these EU policies may be well intended they have the further major shortcoming in that they focus on a bilateral transactional relationship. The governance of the bilateral relationship is necessary but far from sufficient since the big issue relates to behavior collection on a large scale that affects society as a whole. On this point, GDPR remains silent. | > > | With regard to regulation, the principles set forth in the EU’s General Data Protection Regulation (GDPR) may, on paper, seem like a step in the right direction. Some positive developments can be mentioned: The new sanction system that will enter into force – a fine of up to the higher of 4 % of a company’s worldwide annual turnover or 20 million EUR (a short numerical sketch of this cap follows at the end of this section) – will finally cause companies to take data protection seriously. The privacy by design principle encourages encryption of private data. Further, the extra-territorial reach of GDPR might foster the development of converging rules and provide an equal level of privacy protection. In this regard, the European Commission’s efforts to counter local data storage requirements in future trade agreements are worth mentioning (https://www.forbes.com/sites/davidschrieberg1/2018/02/11/e-u-hoping-new-data-protection-under-gdpr-will-have-global-impact/#57258c042dc1).
Even if these policies may be well intended, they are not without shortcomings. On the political side, it seems unlikely that non-EU countries will follow these demands due to fundamentally divergent views and interests when it comes to privacy. Dictated by real-world economic constraints and shareholder value considerations, private sector companies are unfortunately likely to cave in to demands on local data storage and sharing, as recently observed with the demands made by China and followed by Apple (https://www.cnet.com/news/apple-moving-icloud-encryption-keys-to-china-for-china-based-users/). On the infrastructural side, the major shortcoming is that GDPR focuses on a bilateral transactional relationship. The governance of the bilateral relationship is necessary but far from sufficient, since the big issue relates to behavior collection on a large scale that affects society as a whole. On this point, GDPR remains silent. Speculating as to why this may be the case, two presumptions come to mind: Either expert technological know-how and/or enforcement manpower is lacking or, perhaps the more likely explanation, the EU fears falling further behind in the “digitization” race, which it feels might happen if it were to actually intervene in the technological infrastructure and in code. | |
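As referred to above, the cap on GDPR administrative fines is a simple formula. Here is a minimal sketch in Python, purely for illustration – the function name and the turnover figures are hypothetical, while the rule itself is Art. 83(5) GDPR: the higher of EUR 20 million or 4 % of total worldwide annual turnover of the preceding financial year.

def maximum_gdpr_fine_eur(worldwide_annual_turnover_eur):
    # Art. 83(5) GDPR: up to EUR 20 million or, in the case of an undertaking,
    # up to 4 % of total worldwide annual turnover of the preceding financial
    # year, whichever is higher.
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

print(maximum_gdpr_fine_eur(2_000_000_000))  # EUR 2 bn turnover -> cap of EUR 80 million
print(maximum_gdpr_fine_eur(10_000_000))     # EUR 10 m turnover -> cap of EUR 20 million

The point of the 4 % prong is that, for large companies, the ceiling scales with turnover rather than remaining fixed at EUR 20 million.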
Education and Community Building
| |
< < | Another essential pillar, if privacy is to be taken seriously in the digital age, is the education of individuals and the establishment of networks devoted to ascertaining privacy. With data breaches and negative news of (centralized) big data companies on the rise, one would expect that privacy becomes a topic that individuals start to care more about. A promising example of this can be seen in blockchain technology and the communities built around it. In its set-up, these networks inherently cut-out the middle-man and are user-controlled. However, without additional layers of protection, blockchain technology itself is not safe from big data analytics (http://peerproduction.net/issues/issue-9-alternative-internets/peer-reviewed-papers/the-interplay-between-decentralization-and-privacy-the-case-of-blockchain-technologies/). Blockchain-based projects such as “Chainiac” (developed at the Swiss Federal Institute of Technology in Lausanne (EPFL)), which tries to make it impossible for governments to force software companies to deliver software updates with secret backdoors in them, spur hope (https://www.fanaticalfuturist.com/2017/08/blockchain-experts-are-putting-a-stop-to-governments-putting-backdoors-in-software/). Ultimately, it seems to be a constant fight between privacy proponents and their adversaries, the outcome of which will heavily influence how society will develop. Individuals' technological literacy will play a significant role in the outcome. | > > | Another essential pillar, if privacy is to be taken seriously in the digital age, is the education of individuals and the establishment of networks devoted to safeguarding privacy. With data breaches and negative news about (centralized) big data companies on the rise, one would expect privacy to become a topic that individuals start to care more about. Still, raising awareness is one of the greatest challenges. Our ability to spell out, in simple and tangible terms, the dangers posed by a society driven and controllable through big data becomes crucial. It is a challenging task because privacy is an abstract, intangible concept, and because conveying the dangers requires at least a basic technological understanding. Even for Columbia Law students the topic is challenging, as was exemplified by a graduation speech in which graduating students thanked the Law School for providing a “safe learning environment” – of course they meant a physically safe environment. However, thanks to the rise of scandals and tangible examples – e.g. Cambridge Analytica or Xinjiang – there is momentum. We must make sure that in the noise people don’t lose sight of what we stand to risk. Strong counterforces are at play, such as the “convenience” aspect, the fear of missing out on something – particularly among teenagers – and of course the lobbying by big tech.
Education, I believe, should start with tricks we learned in this course: knowing how to browse anonymously and how to write encrypted emails (a minimal sketch of the latter follows below). Further, we have to figure out whether there is any actually smart way of using the smartphone. In the grand scheme of things, educating children about privacy is crucial. We must make sure that children understand the risks involved in their daily use of technology and equip them with tools that allow them to take better control of their own data. This will require a severe realignment of current trends in education. The task requires a wider societal discourse, which has been largely lacking so far. | |
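To make “writing encrypted emails” concrete, here is a minimal sketch – invoking GnuPG from Python, and assuming GnuPG is installed and the recipient’s public key has already been imported into the local keyring; the file names and the address alice@example.com are hypothetical.

import subprocess

# Encrypt draft.txt so that only the holder of the recipient's private key can
# read it; the ASCII-armored result in draft.txt.asc can be pasted into any
# email client, so the plaintext never leaves the sender's machine unencrypted.
subprocess.run(
    ["gpg", "--encrypt", "--armor",
     "--recipient", "alice@example.com",  # hypothetical recipient address
     "--output", "draft.txt.asc",
     "draft.txt"],
    check=True,
)

The particular tool matters less than the principle: encryption happens locally, before the message ever reaches a mail server; mail clients with OpenPGP integration achieve the same effect without the command line.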
|
NicolaiNuberFirstPaper 3 - 10 May 2018 - Main.EbenMoglen
|
|
META TOPICPARENT | name="FirstPaper" |
| |
< < | It is strongly recommended that you include your outline in the body of your essay by using the outline as section titles. The headings below are there to remind you how section and subsection titles are formatted. | | Private-Public Partnership | | Introduction
| |
< < | As a citizen of a country that is based on a constitutional democracy, which acknowledges an inalienable right to privacy, one would expect that respective government to protect its citizens from cyber-attacks and illegal surveillance. However, the latest after Snowden’s revelations we know that reality looks different. Since governments themselves may initiate and/or become targets of cyber-attacks, individuals can’t solely rely on governments to ensure their constitutionally mandated protective function. With the knowledge gained thanks to Snowden, we can change our behavior accordingly - at least theoretically. By we, I mean each and every one of us. I say theoretically because society is distracted easily and technological knowledge and abilities are often lacking. Be that as it may, at least we now know about the occurrence and the scale of government surveillance. What society generally doesn’t appreciate though, is the role the big data economy plays as a facilitator (voluntarily or not) of privacy attacks inflicted upon us. And these private companies are typically not subject to constitutional notions of privacy but rather possess wide discretion to use our data as set forth in their terms of use. | > > | As a citizen of a constitutional democracy that acknowledges an inalienable right to privacy, one would expect the respective government to protect its citizens from cyber-attacks and illegal surveillance. However, since Snowden’s revelations at the latest, we know that reality looks different. Since governments themselves may initiate and/or become targets of cyber-attacks, individuals can’t rely solely on governments to fulfill their constitutionally mandated protective function.
Why does that lead to the conclusion that government will not attempt to prevent cyber attack? Governments are more involved in protecting digital "infrastructure" from foreign attack during what should be peacetime than they are in protecting most societal infrastructure. They may be less reliable than citizens perceive in limiting the efforts of their own listeners, and less effective than they and their citizens would like against the most powerful outsiders, but I'm not aware of any evidence that governments with money to spend are less motivated in these matters than they used to be. What's the evidence?
With the knowledge gained thanks to Snowden, we can change our behavior accordingly - at least theoretically. By we, I mean each and every one of us. I say theoretically because society is easily distracted and technological knowledge and abilities are often lacking. Be that as it may, at least we now know about the occurrence and the scale of government surveillance. What society generally doesn’t appreciate, though, is the role the big data economy plays as a facilitator (voluntarily or not) of privacy attacks inflicted upon us.
Evidence? Most of the public opinion research I see suggests that educated users of the Net in many countries think private surveillance is a bigger problem than government listening. What can you point to that leads to the opposite conclusion?
And these private companies are typically not subject to constitutional notions of privacy but rather possess wide discretion to use our data as set forth in their terms of use. | | | | The Net and the Private-Public Convergence
The internet is as a system without clear country borders and where private and public sectors converge. This means that we live in a world where big data companies might become (voluntary or involuntary through e.g. decryption backdoors) henchmen of governments. If mandated for the wrong reasons, the government’s availment of big data companies isn’t much different morally than the 18th century-style issuance of general warrants. What is very different in the digital age though is the scale, speed and simplicity with which a disagreeable individual can be traced and manipulated. Even more severe are the potential chilling effects the technological possibilities in the digital age imply to a society as a whole (see Xinjiang): The big data surveillance is not merely about the data collection of individuals but about the study of contextual and collective human behavior. Other reasons for concern are the sometimes opaque motives of big data companies themselves, let alone the difficulty for consumers to know with certainty the implications of all the data they give away for free. These technological possibilities and the shamelessness with which certain political proposals treat privacy issues (such as the Feinstein-Burr decryption bill) should have all of us concerned. The technological possibilities spur politician’s Benthamian-kind utilitarian hopes and, if misappropriated, could end-up in (maybe still science-fiction-like seeming) realities as described in Yuval Noah Harari’s Homo Deus or Frederick B. Skinner’s Walden Two. | |
> > |
Burrhus Frederic—known universally as BF—Skinner, actually. It would have been better to check.
What are the names being dropped for? I don't associate Dianne Feinstein with Jeremy Bentham much. Is this the Panopticon Bentham? Not really a model for thinking about the national security state, nor an explanation of why the senior Democrat and Republican on the Senate Intelligence Committee would take the political positions these senators are predictably taking (again). They're not visionaries imagining some changed form of human society. They're political supporters of the intelligence community whose influence depends on relationship.
| | | | | |
> > |
The technology description in the last section is not accurate: it's got the usual hype-ful misunderstanding of blockchain storage. If we're going to recommend teaching people more about privacy, we should start nearer to the bottom, so that we don't get caught hyping what we don't fully understand. The best improvement here would be to explain how people learn about privacy technology, so we can think about how to teach them, rather than pointing at some shiny things that may not turn out to be exactly as we imagine them.
On the regulatory side, assuming that law is what it says in the books is not a safe assumption. GDPR is what data protection authorities do, so it would be useful to explain the model of their behavior that leads to any specific prediction of the real meaning of the system.
| |
|
NicolaiNuberFirstPaper 2 - 01 May 2018 - Main.NicolaiNuber
|
|
META TOPICPARENT | name="FirstPaper" |
| | Regulating Private Companies
| |
< < | In regards to regulation, the principles set forth in the EU’s General Data Protection Regulation (GDPR) seem to me to be a step in the right direction. The new sanction system that will enter into force – a fine of the higher of either up to 4 % of a company’s worldwide annual turnover or up to 20 million EUR – should finally cause companies to take data protection seriously. The privacy by design principle encourages encryption of private data. Further, the extra-territorial reach of GDPR might foster the development of converging rules and provide an equal level of privacy protection, but that is far from certain. Especially the current US administration doesn’t seem to be too bothered by the fact that the future of the US – EU and US – Swiss, respectively, Privacy Shield framework’s future is unclear, given the concerns that the EU Article 29 Working Party has expressed regarding the self-certification procedure of US companies (https://www.hldataprotection.com/2017/12/articles/international-eu-privacy/article-29-working-party-sets-deadline-to-address-privacy-shield-concerns/). Furthermore, the European Commission recently announced that it will make it a priority in future trade and investment agreements to counter rules from other countries – including Russia, China and India – that require companies to store data on local servers (https://www.forbes.com/sites/davidschrieberg1/2018/02/11/e-u-hoping-new-data-protection-under-gdpr-will-have-global-impact/#57258c042dc1). While these EU policies may be well intended it seems unlikely that non-EU countries will adopt them since many non-EU countries have fundamentally different views and interests when it comes to privacy. Also, unfortunately private sector companies are likely to cave into demands on local data storage and sharing, as recently observed with the demands made by China and followed by apple (https://www.cnet.com/news/apple-moving-icloud-encryption-keys-to-china-for-china-based-users/). Albeit the observance and enforcement of GDPR in cross-border situations is questionable, the imposition of respective duties on private companies as stipulated by GDPR is a welcoming and important development. | > > | In regards to regulation, the principles set forth in the EU’s General Data Protection Regulation (GDPR) seem to me to be a step in the right direction. The new sanction system that will enter into force – a fine of the higher of either up to 4 % of a company’s worldwide annual turnover or up to 20 million EUR – should finally cause companies to take data protection seriously. The privacy by design principle encourages encryption of private data. Further, the extra-territorial reach of GDPR might foster the development of converging rules and provide an equal level of privacy protection, but that is far from certain. Especially the current US administration doesn’t seem to be too bothered by the fact that the future of the US – EU and US – Swiss, respectively, Privacy Shield framework’s future is unclear, given the concerns that the EU Article 29 Working Party has expressed regarding the self-certification procedure of US companies (https://www.hldataprotection.com/2017/12/articles/international-eu-privacy/article-29-working-party-sets-deadline-to-address-privacy-shield-concerns/). 
Furthermore, the European Commission recently announced that it will make it a priority in future trade and investment agreements to counter rules from other countries – including Russia, China and India – that require companies to store data on local servers (https://www.forbes.com/sites/davidschrieberg1/2018/02/11/e-u-hoping-new-data-protection-under-gdpr-will-have-global-impact/#57258c042dc1). However, it seems unlikely that non-EU countries will follow these demands them due to fundamentally divergent views and interests when it comes to privacy. Also, unfortunately private sector companies are likely to cave into demands on local data storage and sharing, as recently observed with the demands made by China and followed by apple (https://www.cnet.com/news/apple-moving-icloud-encryption-keys-to-china-for-china-based-users/). While these EU policies may be well intended they have the further major shortcoming in that they focus on a bilateral transactional relationship. The governance of the bilateral relationship is necessary but far from sufficient since the big issue relates to behavior collection on a large scale that affects society as a whole. On this point, GDPR remains silent. | |
|
|
NicolaiNuberFirstPaper 1 - 20 Mar 2018 - Main.NicolaiNuber
|
|
> > |
META TOPICPARENT | name="FirstPaper" |
Private-Public Partnership
-- By NicolaiNuber - 20 Mar 2018
Introduction
As a citizen of a country that is based on a constitutional democracy, which acknowledges an inalienable right to privacy, one would expect that respective government to protect its citizens from cyber-attacks and illegal surveillance. However, the latest after Snowden’s revelations we know that reality looks different. Since governments themselves may initiate and/or become targets of cyber-attacks, individuals can’t solely rely on governments to ensure their constitutionally mandated protective function. With the knowledge gained thanks to Snowden, we can change our behavior accordingly - at least theoretically. By we, I mean each and every one of us. I say theoretically because society is distracted easily and technological knowledge and abilities are often lacking. Be that as it may, at least we now know about the occurrence and the scale of government surveillance. What society generally doesn’t appreciate though, is the role the big data economy plays as a facilitator (voluntarily or not) of privacy attacks inflicted upon us. And these private companies are typically not subject to constitutional notions of privacy but rather possess wide discretion to use our data as set forth in their terms of use.
The Net and the Private-Public Convergence
The internet is as a system without clear country borders and where private and public sectors converge. This means that we live in a world where big data companies might become (voluntary or involuntary through e.g. decryption backdoors) henchmen of governments. If mandated for the wrong reasons, the government’s availment of big data companies isn’t much different morally than the 18th century-style issuance of general warrants. What is very different in the digital age though is the scale, speed and simplicity with which a disagreeable individual can be traced and manipulated. Even more severe are the potential chilling effects the technological possibilities in the digital age imply to a society as a whole (see Xinjiang): The big data surveillance is not merely about the data collection of individuals but about the study of contextual and collective human behavior. Other reasons for concern are the sometimes opaque motives of big data companies themselves, let alone the difficulty for consumers to know with certainty the implications of all the data they give away for free. These technological possibilities and the shamelessness with which certain political proposals treat privacy issues (such as the Feinstein-Burr decryption bill) should have all of us concerned. The technological possibilities spur politician’s Benthamian-kind utilitarian hopes and, if misappropriated, could end-up in (maybe still science-fiction-like seeming) realities as described in Yuval Noah Harari’s Homo Deus or Frederick B. Skinner’s Walden Two.
What to do?
Given (private) big data companies’ role, I believe we should respond twofold: Firstly, regulation has to be amended to address the changed reality, particularly with respect to big data companies. Secondly, individuals have to be educated in order to have the desire to protect themselves.
Regulating Private Companies
In regards to regulation, the principles set forth in the EU’s General Data Protection Regulation (GDPR) seem to me to be a step in the right direction. The new sanction system that will enter into force – a fine of the higher of either up to 4 % of a company’s worldwide annual turnover or up to 20 million EUR – should finally cause companies to take data protection seriously. The privacy by design principle encourages encryption of private data. Further, the extra-territorial reach of GDPR might foster the development of converging rules and provide an equal level of privacy protection, but that is far from certain. Especially the current US administration doesn’t seem to be too bothered by the fact that the future of the US – EU and US – Swiss, respectively, Privacy Shield framework’s future is unclear, given the concerns that the EU Article 29 Working Party has expressed regarding the self-certification procedure of US companies ( https://www.hldataprotection.com/2017/12/articles/international-eu-privacy/article-29-working-party-sets-deadline-to-address-privacy-shield-concerns/). Furthermore, the European Commission recently announced that it will make it a priority in future trade and investment agreements to counter rules from other countries – including Russia, China and India – that require companies to store data on local servers ( https://www.forbes.com/sites/davidschrieberg1/2018/02/11/e-u-hoping-new-data-protection-under-gdpr-will-have-global-impact/#57258c042dc1). While these EU policies may be well intended it seems unlikely that non-EU countries will adopt them since many non-EU countries have fundamentally different views and interests when it comes to privacy. Also, unfortunately private sector companies are likely to cave into demands on local data storage and sharing, as recently observed with the demands made by China and followed by apple ( https://www.cnet.com/news/apple-moving-icloud-encryption-keys-to-china-for-china-based-users/). Albeit the observance and enforcement of GDPR in cross-border situations is questionable, the imposition of respective duties on private companies as stipulated by GDPR is a welcoming and important development.
Education and Community Building
Another essential pillar, if privacy is to be taken seriously in the digital age, is the education of individuals and the establishment of networks devoted to ascertaining privacy. With data breaches and negative news of (centralized) big data companies on the rise, one would expect that privacy becomes a topic that individuals start to care more about. A promising example of this can be seen in blockchain technology and the communities built around it. In its set-up, these networks inherently cut-out the middle-man and are user-controlled. However, without additional layers of protection, blockchain technology itself is not safe from big data analytics ( http://peerproduction.net/issues/issue-9-alternative-internets/peer-reviewed-papers/the-interplay-between-decentralization-and-privacy-the-case-of-blockchain-technologies/). Blockchain-based projects such as “Chainiac” (developed at the Swiss Federal Institute of Technology in Lausanne (EPFL)), which tries to make it impossible for governments to force software companies to deliver software updates with secret backdoors in them, spur hope ( https://www.fanaticalfuturist.com/2017/08/blockchain-experts-are-putting-a-stop-to-governments-putting-backdoors-in-software/). Ultimately, it seems to be a constant fight between privacy proponents and their adversaries, the outcome of which will heavily influence how society will develop. Individuals' technological literacy will play a significant role in the outcome.
|
|
|
|
|