Tech firms vs European Union AI Act

GitHub, other tech firms petition EU over AI Act’s strict open-source rules

Code-hosting platform GitHub has submitted a policy paper to the European Union pushing back against its stringent proposed rules on open-source artificial intelligence (AI).

GitHub teamed up with Hugging Face, Creative Commons, EleutherAI, LAION, and Open Future to pen the paper to EU regulators ahead of the bloc’s proposed AI regulation. In the letter, the coalition argues that strict guardrails on open-source AI risk stifling innovation.

Under the EU’s proposed rules, upstream open-source AI projects would be regulated as if they were commercial products or deployed systems.

“This would be incompatible with open source development practices and counter to the needs of individual developers and non-profit research organizations,” read the paper. “The Act risks chilling open source AI development and thus could undermine its goals of promoting responsible innovation in line with European values.”

The paper describes open-source projects as the cornerstone of AI development, noting that “it is essential that policymakers support the blossoming open-source AI ecosystem.” The signatories argued that open-source AI plays a key role in advancing transparency, including through practices like model documentation.

To prevent the incoming Act from stifling open-source development, the paper offers EU lawmakers five recommendations. The first calls for a clear definition of AI components, while the second asks that open-source development be exempted from the requirements of the incoming AI Act.

“Clarify that collaborative development of open-source AI components and making them available in public repositories does not subject developers to the requirements of the AI Act,” read the paper.

Other suggestions include shaping the rules to support the open-source AI ecosystem, allowing limited testing in real-world conditions, and setting proportional requirements for foundation models.

The European Parliament passed its version of the AI Act in June, but as member states smooth out the details, there are concerns that the final rules may prove harsh for industry players. The proposal includes a ban on deploying AI in predictive policing and emotion recognition systems.

Stiff rules may lead to an exodus of EU talent

In a separate letter penned to the EU, a coalition of tech executives expressed concerns over the wording of the proposed AI Act. The executives argued that the rules could trigger sky-high compliance costs and unnecessary liability risks for AI firms.

Some key players in the space have hinted that stiff rules could force them to set up operations in friendlier jurisdictions. OpenAI CEO Sam Altman’s tour of Southeast Asia has been read as a sign that the pioneering firm could begin searching for new frontiers amid pushback from EU regulators.

The firm’s ChatGPT has been criticized for its potential to spread fake news, undermine privacy rights, and upend the role of smart contract auditors in Web3.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: AI, Blockchain, and the secret to winning in technology

