Dr. Thorsten Pötzsch, Executive Director of Securities Supervision and Asset Management at BaFin, Germany’s integrated financial regulatory authority, has called for “speed, clear rules and a minimum of bureaucracy” in legislation, citing the EU’s landmark MiCA regulation as a prime example of well-made legislation that could benefit from being less comprehensive.

“If Europe wants to exploit the potential of the capital markets, it must not stand in its own way. Regulation should not be designed to regulate every little detail. It has to address the big problems,” said Pötzsch in a speech at the BVI Fund Operations Conference.

By way of example, he pointed to MiCA, which he praised as “a binding legal framework in Europe for the regulation of crypto assets and service providers. This gives us security for providers and customers.”

However, Pötzsch noted that MiCA arrived accompanied by an additional 57 guidelines and legal rules covering various specific topics and minutiae.

“Despite all the understanding for quickly adapting regulations in fast-moving markets, I ask myself: Is this amount of rules really effective? Do we need a set of rules that determines what standard forms and sample texts for crypto asset white papers should look like? Couldn’t it really be simpler, more pragmatic?” asked Pötzsch.

He urged that Europe must be an attractive and safe location for capital market players, saying, “We have to curb complexity.” Pötzsch suggested that the ultimate aim of regulation should be to “advance the European capital markets.”

Yet he was keen to place some of the onus on companies, urging them to take responsibility for ensuring that the European capital markets function well in an increasingly digitalized environment and emphasizing the need to get cyber risks under control.

“Due to increasing digitalization, the attack surface for cyber attacks is growing. The threat from powerful white-collar criminals is increasing. And geopolitical tensions increase the risk of politically motivated attacks,” warned Pötzsch. “Cyber attacks are dangerous for companies. But they are also extremely dangerous for the financial system as a whole and for financial stability.”

DORA protections

In terms of specific cybersecurity risks, Pötzsch pointed to the outsourcing of IT services as an area where companies should be vigilant.

“More and more companies in the financial sector are outsourcing IT services. That makes economic sense too, because costs fall and quality ideally increases, but it carries risks,” said Pötzsch.

“In some areas, a few IT service providers serve many companies in the financial sector. If disruptions occur with these multi-mandate service providers, then in the worst case the entire financial sector has a problem. That’s why targeted risk management is extremely important.”

The EU has not been complacent on this issue. Pötzsch pointed to the Digital Operational Resilience Act, or DORA, which was introduced as part of the same package as MiCA and should enable regulators to better manage such cybersecurity risks, including through a European monitoring framework for ICT third-party service providers.

First published in September 2020 as part of the EU’s Digital Finance Package (DFP), DORA aims to improve the ‘digital resiliency’ of the financial system by addressing, in particular, cybersecurity vulnerabilities, shortcomings in reporting and testing, and a lack of oversight of third-party providers.

It introduces a common set of standards to manage digital risks across the financial sector and ensures the necessary measures are in place to protect against cyberattacks and other sources of disruption.

The Act entered into force on January 16, 2023, and will apply as of January 17, 2025, at which point the majority of financial institutions in the EU, as well as their supply chains, will fall within its purview. This includes digital asset companies, such as wallet providers, that will be regulated under MiCA.

Another cybersecurity-related topic of concern for Pötzsch was the correct handling of data, particularly in relation to artificial intelligence (AI).

The EU’s AI Act

“As a financial regulator, we see a fundamental problem when it comes to AI: With various AI models, the decisions are only comprehensible to a limited extent, and that is not acceptable. We expect companies to use new technologies responsibly and ensure careful governance,” said Pötzsch.

He noted that in his capacity as a supervisor of the capital markets, he sees three specific areas of concern related to the hot-button technology: AI being abused as an even more effective tool for cyberattacks; criminals using AI to profit from market manipulation; and investor protection issues related to the possibilities that generative AI opens up when it comes to disinformation.

To mitigate such risks, Pötzsch offered the assembled audience some advice:

“You need to regularly check how powerful your AI tools are. People must continue to control and intervene in decision-making processes. The importance of human judgment must not be compromised by excessive reliance on AI. Companies must avoid hidden discrimination, misapplication, lack of traceability or copyright infringement.”

On top of this advice, he pointed to the EU’s recently passed AI Act, which introduces a regulatory framework categorizing AI applications based on their perceived risk levels.

Lower-risk applications, such as spam filters and content recommendation systems, are subject to minimal regulations; these applications are required to disclose their use of AI to ensure transparency. High-risk AI systems, especially those deployed in sensitive sectors like healthcare, education, and public services, face stringent regulatory requirements; these systems must include detailed documentation of their processes and have a mandatory human oversight component to their operations. Unacceptable risk AI systems are considered a threat to people and will be banned; this includes social scoring systems, predictive policing, and emotion recognition systems in schools and workplaces.

Additionally, the Act restricts the use of AI for biometric identification by police in public spaces, except in cases involving serious crimes such as terrorism or kidnapping.

The AI Act is scheduled to begin implementation in 2025, following final approval from EU member states, which should be a formality as the Act has already been endorsed by the bloc’s legislative body.

“The European AI regulation will certainly play an important role in this field,” said Pötzsch. “It will set a framework for the trustworthy use of AI across sectors.”

Summing up his speech, Pötzsch concluded that Europe needs practical regulation that sets appropriate guardrails, “regulation that addresses the big problems, avoids unnecessary bureaucracy and enables progress… We also need companies that manage their risks from cyber attacks or IT outsourcing well and that handle data and AI [responsibly].”

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: The Future Internet—Uniting Blockchain, AI & IPv6
