Proposed bill aims to limit the audacity of opacity on the Internet | Jackson Walker
Online platforms may soon face restrictions on how they use personal data to personalize content and information flows for users. Bipartisan legislation recently proposed in the US House of Representatives aims to fundamentally change how content and information are delivered online by making it illegal to tailor a user's experience without that user's knowledge. If enacted, the proposal would prevent online platforms from targeting search results to a specific user based on that user's personal data unless the user expressly consents to receiving personalized results.
Under the Filter Bubble Transparency Act, H.R. 5921, an online content provider may not use an "opaque algorithm" — defined as a "ranking system that determines the order or manner that information is furnished to a user . . . based, in whole or in part, on user-specific data" — to determine what information users see and how they see it.
To fall outside the restriction, the House bill requires that a user give express consent or knowingly provide the personal data that shapes the user's experience. Otherwise, the provider must offer users an alternative, "input-transparent" version of its platform that "does not use the user-specific data of a user to determine the order or manner that information is furnished to such user." The bill goes so far as to require providers using an opaque algorithm to give their users the ability to disable it "by selecting a prominently placed icon," which instantly switches the user to the input-transparent algorithm.
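In software terms, the bill effectively asks platforms to maintain two ranking paths behind a user-facing toggle. The sketch below is purely illustrative — the class and function names are hypothetical and are not drawn from the bill's text — but it shows the basic distinction between a ranking that consumes user-specific data and one that does not.

```python
# Illustrative sketch only: separating an "opaque" (personalized) ranking
# path from an "input-transparent" one behind a user-controlled toggle.
# All names are hypothetical, not taken from H.R. 5921.

from dataclasses import dataclass, field

@dataclass
class Item:
    title: str
    base_score: float  # relevance computed without any user-specific data

@dataclass
class User:
    interests: set = field(default_factory=set)
    opaque_algorithm_enabled: bool = False  # toggled by the "prominently placed icon"

def personalized_score(item: Item, user: User) -> float:
    # Opaque path: uses user-specific data (here, stated interests) to boost items.
    boost = 1.0 if any(tag in item.title.lower() for tag in user.interests) else 0.0
    return item.base_score + boost

def rank(items: list, user: User) -> list:
    if user.opaque_algorithm_enabled:
        # Opaque algorithm: ordering depends on user-specific data.
        return sorted(items, key=lambda i: personalized_score(i, user), reverse=True)
    # Input-transparent algorithm: ordering uses no user-specific data at all.
    return sorted(items, key=lambda i: i.base_score, reverse=True)
```

Flipping `opaque_algorithm_enabled` changes which path runs; the input-transparent branch never reads the user's data, which is the property the bill appears to demand.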
The proposed law applies not only to websites but to any "covered internet platform," including mobile apps, social media sites, video services, and search engines.
In other words, the proposed legislation would require developers to build two entirely distinct experiences or face penalties from the Federal Trade Commission. The exact scope of those penalties remains to be seen, but a violation would be treated as an unfair or deceptive act or practice under the Federal Trade Commission Act.
Supporters see the House bill as a necessary step to stem the misuse of user data to manipulate what we consume on the internet, which has been a hot topic in recent congressional hearings. They fear that unwitting users exposed to manipulated information flows will not know how that information reaches them, making their user experience artificial and inauthentic. With this in mind, the bill aims to shift control over the flow of information from providers to users and to curb the harmful or addictive nature of certain sites. Lofty goals indeed. A bipartisan Senate version of the bill, S. 2024, was also introduced under the same name.
So what should a provider do in response to this proposal? Nothing yet. If and when the bill becomes law, developers will have one year to update their platform designs and associated user agreements.
Until then, the audacity of opacity remains limited only by the guardrails currently in place.