SebastianValdezOrandayFirstPaper 3 - 25 Apr 2024 - Main.SebastianValdezOranday
The EU DIGITAL SERVICES ACT: PROTECTING INTERESTS OR EXTENDING PATERNALISM?

-- By SebastianValdezOranday - 01 Mar 2024
BACKGROUND:

In the 2020s, the European Union passed the Digital Services Act (“DSA”) and the Digital Markets Act (“DMA”), twin acts adopted in 2022 to revamp the EU’s old approach to digital policing. Like the General Data Protection Regulation, the new acts apply to both EU operators and non-EU companies that interact with the EU or its citizens.

WHAT THE DMA AND DSA ACTUALLY DO:
The DMA is intended to promote competition by leveling the playing field: it designates one team a “gatekeeper” and forces the gatekeeper to advantage third parties on its platform. The DSA applies to any online intermediary, including social media platforms and digital storefronts. See the following for how the Commission distinguishes providers based on their size and influence: Digital Services Act. The largest providers must analyze societal risks, like those related to illegal content, freedom of expression, media freedom, pluralism, discrimination, and election misinformation.
While the EU highlights its goals of setting bright-line rules to lessen confusion around content moderation and to increase competitiveness among businesses, most interesting is the focus on an obviously paternalistic goal of protecting “society at large” by ensuring companies are accountable to democratic control.
IMMEDIATE IMPACT:
Two days after the DSA fully took effect, the EU Commission announced an investigation and possible proceedings against TikTok for violations of the Act. The investigation targets violations related to the promotion of behavioral addiction through system design and the failure to assess and counter risks to users’ mental health and to radicalization.
Meta, formerly Facebook, welcomed the transparency required by the DSA and lauded the harmonization of compliance requirements among large providers. Still, compliance with the DSA costs large providers money, and the larger the provider, the more it will pay. Meta and TikTok are among the companies challenging the fees they were assessed to fund the EU regulators charged with ensuring compliance.
WHY THE FUSS?
This question is easier asked than answered. Broadly speaking, the EU’s actions can be boiled down to a response to one of two developments resulting from the proliferation of social media and internet communication.
First, and more cynically, the likelier of the two: the EU, like all great powers before it, hopes to establish control over, and rein in, the hold of foreign agents on the vast majority of its citizenry. American-based platforms and providers like Facebook, Instagram, WhatsApp, and Google are a ubiquitous part of the EU citizen’s life, and with that presence inevitably comes the belief that one’s citizens are subject to foreign control. It is no shock that much of the Act is geared toward the household American apps and not, say, Telegram or WeChat.
In fairness, the EU Commission does not stand alone in its crusade to control foreign media operators. One need look no further than the U.S. government’s animosity toward TikTok for a domestic example. Under the guise of regulating content access and protecting the next generation of Americans, Congress and the Executive seem most focused on effecting the sale of TikTok to a US-based company. While not conclusive, such fixation shows that, for the U.S., it is not so much about abating misinformation and disinformation as it is about making sure the suspect information is not coming from a foreign provider. While the EU’s acts do not take the outwardly drastic stance the U.S. takes toward TikTok, the same sentiment is there: content can be unsafe, so long as it is unsafe under the home country’s watchful eye.
The other explanation is that the EU, like the U.S., is embracing its paternalistic role in internet and content regulation. Unfortunately for all parties involved, this explanation for the DSA and legislation like it requires us to make a couple of costly assumptions about the state of the world and about the people who use the internet.
First, to justify this paternalistic role in content regulation, which seems to ramp up during election years, the governing body must believe that its citizens are incapable of thinking critically or skeptically about the misinformation they might encounter on the web. Second, upon identifying these purported weaknesses, the body must believe that it is best positioned to regulate and control such content so that its people’s best interests are served.
If the average citizen really endangers themselves through internet consumption to the extent that a government truly needs to step in and provide oversight, these measures attack the symptoms and not the cause. If governments are sincere in their assessments that these moves are about protecting citizens rather than grabbing power over foreign markets, they should shine the spotlight on the total lack of media literacy, particularly with respect to the rise of AI-generated content on platforms like X and TikTok, and even in films.
AMERICAN OUTCOMES:
While the DSA’s requirements are limited to the EU and its citizens, the effects fall largely on American businesses, and it is possible that shifting attitudes in American society will bring the goals of the DSA to American shores, namely efforts to combat disinformation and free-speech risks in online content.
These issues, or at least their identification, are increasingly salient in America at a time when the Senate questions tech leaders over concerns about platform privacy, free speech, and election misinformation. This being an election year, I suspect these conversations will only be amplified. If American providers already have the systems and assessment mechanisms called for by the DSA in place and get on board, a cultural shift toward accepting more government control over the operations of online providers is not out of the realm of possibility.
Unfortunately for all involved, the root of the issue will remain unsolved, and perhaps even worsen, with such a level of control exerted by governments. If history has taught us anything, it is that the suppression of ideas only causes people to seek them out more zealously.