Digital euro proposal set for debate as EU advances AI restrictions legislation
Separate proposals could impose restrictions in each area.
Separate reports indicate that lawmakers in the European Union (EU) are advancing potential rules around a “digital euro” and artificial intelligence (AI).
ECB could limit digital euro for financial stability
Bloomberg reported on June 14 that it had obtained a copy of a draft proposal from the European Commission. The proposal could require the European Central Bank (ECB) to set limits on the use of a digital euro in the interest of financial stability.
However, those limits could not block transactions that pose no risk to financial stability, and they would need to apply across the entire eurozone.
Bloomberg reported that the draft text also indicates acceptance of a digital euro would be mandatory and that the asset would be considered legal tender. However, certain parties could decline to accept it in specific situations.
Because the digital euro and other central bank digital currencies (CBDCs) allow for extensive government control, they are highly controversial in the crypto community.
Bloomberg reported that the draft text is subject to change before its June 28 presentation, and other recent reports suggest that EU policymakers are divided on whether a digital euro is needed at all.
EU moves toward AI restrictions
In an unrelated announcement on June 14, the European Parliament said that its members had advanced the Artificial Intelligence Act to the next stage.
The measure passed with 499 votes in favor, 28 against, and 93 abstentions. The vote does not enact the legislation into law; rather, it adopts the Parliament's negotiating position.
If the Artificial Intelligence Act eventually comes into force, it would ban certain applications of AI, specifically discriminatory uses such as predictive policing and some types of biometric systems.
It would also designate as "high risk" AI systems that can affect health, safety, or the environment, or that can influence elections or social media recommendation systems.
Finally, the Act would require foundation model providers to assess product risk and register before launching on the EU market. Providers of generative AI models would also have to publish summaries of the copyrighted data used in training and take measures to prevent the generation of illegal content.