BRUSSELS— Europe is raising the bar on technology regulation once again, creating new guidelines for social media and other digital platforms that are anticipated to have global ramifications.
Legislators in the European Union have agreed on the main points of a new law aimed at forcing tech companies to take more responsibility for the content their users post online, part of a sweeping EU package establishing new rules for digital competition and services.
The Digital Services Act would impose a number of rules on online platforms, including standards for removing unlawful material, a prohibition on advertising targeted at minors, and additional vetting procedures for third-party sellers. Very large platforms, defined as those with more than 45 million users in the EU, would be required to perform risk assessments and give authorities access to the algorithms that decide what material users see.
Although the text of the agreement was not immediately available, a statement from the European Parliament summarized some of the key points reached by negotiators.
The new EU rules may affect how social media businesses, such as Meta Platforms’ Facebook, respond to reports of harmful content. (Photo: Carlos Barria/Reuters)
In a statement, Christel Schaldemose, a member of the European Parliament from Denmark who has been the body’s lead negotiator for the DSA, said the law “would establish new worldwide norms.” “At long last, we’ve ensured that what is illegal offline is also illegal online.”
The EU isn’t alone in working on new rules for internet businesses. In the United States, the Biden administration has backed antitrust measures aimed at limiting the market dominance of big tech companies. A proposal in the United Kingdom seeks to compel businesses to handle harmful content, such as material about eating disorders or self-harm.
The proposed new EU rules could have an impact on a variety of business practices, from how online marketplaces like Amazon.com Inc. interact with third-party sellers to how social media companies like Meta Platforms Inc.’s Facebook or ByteDance Ltd.’s TikTok respond to user complaints about harmful posts or a decision to lock a user’s account.
The Digital Services Act, or DSA, is the second half of a large EU package aimed at establishing new regulations for competition and online content, and it may have far-reaching global implications for businesses and consumers. Its counterpart, the Digital Markets Act, was approved by parliament in March and would impose new competition rules on the world’s largest digital companies, backed by heavy penalties.
The DSA’s political accord clears the way for the bill to go ahead. The EU Parliament and representatives from EU nations must still give their final approval, although that procedure is unlikely to result in any substantial changes.
Some analysts believe the DSA might serve as a model for other nations. In the past, Europe has been a regulatory trailblazer, such as with its groundbreaking privacy law, the General Data Protection Regulation, or GDPR, and more recently with the Digital Markets Act.
According to Joris van Hoboken, a law professor at Vrije Universiteit Brussel, “Europe has acquired a bit of a hunger” for being the first to implement new legislation. “The hope is that this will become a global standard.”
Negotiators started talks on Friday morning in a meeting that was widely expected to produce a deal. Hot topics ahead of the discussions included how to limit advertising that targets minors or draws on sensitive data, and how to handle so-called dark patterns, in which a user is manipulated into doing or agreeing to something they didn’t intend.
Legislators were also expected to consider a measure to levy a fee on big platforms to fund enforcement of the new rules.
Experts say the law isn’t intended to dictate what lawful content internet platforms can and can’t host. Instead, it seeks to establish procedural norms for dealing with unlawful material and to ensure that businesses enforce their own terms and conditions uniformly and fairly. It also aims to require platforms to disclose more about how they make decisions, such as removing material or locking a user’s account, and to provide a way for users to contest such decisions.
During a crisis, such as a public security risk or a health emergency like the Covid-19 outbreak, one clause introduced late in the legislative process would empower authorities to force very large platforms “to restrict any urgent threats” on their services. According to the European Parliament’s statement, the provision’s requirements would be limited to three months.
According to an earlier draft of the law, very large companies that fail to comply with the new standards could face penalties of up to 6% of their annual global revenue.
The proposal was originally submitted in December 2020 and has since moved quickly through the EU’s sometimes-difficult legislative process. Disclosures from The Wall Street Journal’s Facebook Files investigation, which indicated that Facebook, now known as Meta, was aware that its platforms had flaws that caused harm to certain users, added momentum to the push, according to government and industry leaders.
The Computer and Communications Industry Association, a trade organization that represents digital businesses, has indicated support for the legislation’s intentions but has questioned some of the provisions, such as providing remedy for any user whose material is demoted by a platform.
The DSA, as planned, includes duties that would apply to a variety of businesses that aren’t considered online platforms, such as internet service providers and web hosting services. In comparison to those that apply to online platforms, those firms’ duties would be far more constrained.
Hundreds of thousands of businesses might be affected by the new regulation, according to Daphne Keller, an internet law professor at Stanford University. Many smaller businesses, she believes, are likely unaware of the new requirements.
Companies that perform any type of content moderation “will most likely spend the next six months recruiting and developing new procedures,” she added. “This is going to be a difficult task.”
—Daniel Michaels and Sam Schechner contributed to this article.
Kim Mackrael can be reached at [email protected]
Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.