
ChatGPT can write smart contracts, but developers must avoid using it to audit their code, a team of researchers has found.

The researchers from blockchain security firm Salus Security assessed the ability of GPT-4, OpenAI’s most powerful large language model (LLM), to audit smart contracts. They concluded that while its precision rate in detecting vulnerabilities is high, its recall rate is dangerously low for a smart contract auditing tool.

In their recently published study, the two researchers selected eight sets of smart contracts that had been injected with 18 types of vulnerabilities, 60 in total. Their goal was to assess whether OpenAI’s LLM could mimic a professional auditor and if it could parse the code and unearth the vulnerabilities.

The researchers found that GPT-4 detected seven types of vulnerabilities with a high precision of above 80%. However, its recall rate was strikingly low across the data sets, with the lowest being around 11%, suggesting “that GPT-4 may miss some vulnerabilities during detection.”

Recall is the share of actual vulnerabilities that a model correctly flags, calculated as true positives divided by the sum of true positives and false negatives; it is also known as the true positive rate. Precision is the share of the model's flagged findings that are genuine, calculated as true positives divided by the sum of true and false positives; low precision means the results are padded with false alarms.
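To make the distinction concrete, here is a minimal sketch of how the two metrics are computed for a vulnerability detector. The counts are hypothetical and chosen only to illustrate the high-precision, low-recall pattern the study describes; they are not figures from the Salus Security paper.

```python
# Hypothetical counts for a smart contract vulnerability detector
# (illustrative only, not taken from the Salus Security study).
true_positives = 9    # real vulnerabilities the model flagged
false_positives = 2   # clean code the model wrongly flagged
false_negatives = 51  # real vulnerabilities the model missed

# Precision: of everything the model flagged, how much was a real vulnerability?
precision = true_positives / (true_positives + false_positives)

# Recall: of all real vulnerabilities, how many did the model actually catch?
recall = true_positives / (true_positives + false_negatives)

print(f"Precision: {precision:.0%}")  # ~82% - most flags are genuine
print(f"Recall: {recall:.0%}")        # ~15% - but most vulnerabilities slip through
```

Under these assumed counts, nearly every issue the model raises is real, yet the bulk of the injected vulnerabilities go undetected, which is exactly why a high precision figure alone says little about an auditing tool's reliability.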

The researchers pointed out that the results indicate GPT-4’s vulnerability detection capabilities fall short, and that the model should only be used as an auxiliary tool in smart contract auditing.

“In summary, GPT-4 can be a useful tool in assisting with smart contract auditing, especially in code parsing and providing vulnerability hints. However, given its limitations in vulnerability detection, it cannot fully replace professional auditing tools and experienced auditors at this time,” the researchers concluded.

The latest study corroborates other findings that have dismissed claims that artificial intelligence (AI) will replace developers. While specialized AI-powered tools are getting steadily better at writing code and detecting vulnerabilities, they still can’t be relied on by themselves, especially in the digital asset world, where the slightest vulnerability is pounced on.

For AI to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Does AI know what it’s doing?
