
Facebook and 16 other major platforms have four months to comply with the new EU rules.

On Tuesday, the European Commission designated 17 very large online platforms, including Facebook and Twitter, as well as two search engines, Bing and Google, which will take on new content-moderation and user-protection responsibilities within four months.

These obligations follow the entry into force of the European Union (EU) Digital Services Act last November, under which “the Commission adopted its first designation decisions on Tuesday,” identifying 17 very large online platforms with at least 45 million monthly active users that will have to comply with the new rules: AliExpress, Amazon, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and Zalando.

In addition, two very large search engines were designated: Bing and Google Search.

Following this designation, based on user numbers as of February last year, “companies will now have four months to comply with a full set of new obligations under the Digital Services Act,” which aims to “empower and protect users online,” including minors, by requiring designated services to assess and mitigate their systemic risks and to provide robust content-moderation tools, the Commission said in a statement.

First and foremost, the rules give users more power: they must receive clear information and be able to easily report illegal content, which platforms will be required to address, while the platforms will also be responsible for labelling advertisements and disclosing who is behind them.

Regarding the protection of minors, platforms will have to redesign their systems to guarantee a high level of privacy and security, and they will no longer be able to serve ads aimed at children.

Another responsibility now falling on these very large platforms is combating disinformation: taking measures against the spread of false news, mitigating the risks associated with the dissemination of illegal content online and the negative consequences for freedom of expression and information, and providing a mechanism that allows users to flag this type of content.

To monitor compliance with all these new responsibilities, external and independent audits are planned, and in addition, platforms will need to provide researchers with greater access to data and publish transparency reports.

“Within four months of notification of designation decisions, designated platforms and search engines must adapt their compliance systems, resources and processes, establish an independent compliance system, and conduct and submit to the Commission their first annual risk assessment,” concludes Brussels.

In November last year, the new Digital Services Act, created to protect the fundamental rights of users online, entered into force — an unprecedented law for the digital space that makes platforms liable for illegal and harmful content.

The new law applies to tech “giants” with 45 million or more users in the EU, about 10% of the bloc’s population, as well as to new AI services such as ChatGPT, which, despite not being considered platforms themselves, cooperate with these services.

Author: Portuguese
Source: CM Jornal
