02-22-2025

ChatGPT can write smart contracts, but developers must avoid using it to audit their code, a team of researchers has found.

The researchers from blockchain security firm Salus Security assessed the ability of GPT-4, OpenAI’s most powerful large language model (LLM), to audit smart contracts. They concluded that while its precision rate in detecting vulnerabilities is high, its recall rate is dangerously low for a smart contract auditing tool.

In their recently published study, the researchers selected eight sets of smart contracts that had been injected with 60 vulnerabilities spanning 18 types. Their goal was to assess whether OpenAI’s LLM could mimic a professional auditor by parsing the code and unearthing the vulnerabilities.

The researchers found that GPT-4 detected seven types of vulnerabilities with a high precision of above 80%. However, its recall rate was strikingly low across the data sets, with the lowest being around 11%, suggesting “that GPT-4 may miss some vulnerabilities during detection.”

Recall is the percentage of actual vulnerabilities in the sample that a model correctly flags (the true positive rate); a low recall means many real bugs go undetected. Precision is the percentage of the model’s flagged findings that are actual vulnerabilities rather than false positives; a low precision means the results are padded with junk findings.
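The trade-off can be made concrete with a small sketch. The counts below are hypothetical, chosen only to illustrate how a tool can score high on precision while scoring low on recall; they are not the study's actual figures.

```python
# Precision vs. recall with illustrative (made-up) audit numbers.
# Suppose a contract set contains 60 injected vulnerabilities and the
# model reports 10 findings, of which 8 are real.

true_positives = 8     # real vulnerabilities the model flagged
false_positives = 2    # spurious findings the model flagged
false_negatives = 52   # real vulnerabilities the model missed (60 - 8)

precision = true_positives / (true_positives + false_positives)
recall = true_positives / (true_positives + false_negatives)

print(f"precision = {precision:.0%}")  # 80% -- most findings are real
print(f"recall    = {recall:.1%}")     # 13.3% -- most bugs go undetected
```

In this sketch the auditor can trust almost every individual finding, yet the vast majority of the injected bugs are never surfaced, which is exactly the failure mode the researchers warn about.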

The researchers said the results indicate that GPT-4’s vulnerability detection capabilities fall below par and that it should be used only as an auxiliary tool in smart contract auditing.

“In summary, GPT-4 can be a useful tool in assisting with smart contract auditing, especially in code parsing and providing vulnerability hints. However, given its limitations in vulnerability detection, it cannot fully replace professional auditing tools and experienced auditors at this time,” the researchers concluded.

The latest study corroborates other findings dismissing claims that artificial intelligence (AI) will replace developers. While specialized AI-powered tools are becoming increasingly adept at writing code and detecting vulnerabilities, they still can’t be relied on alone, especially in the digital asset world, where attackers pounce on the slightest vulnerability.

For artificial intelligence (AI) to work lawfully and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing data immutability. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Does AI know what it’s doing?
