Google’s new AI training technique is 13 times faster

Google (NASDAQ: GOOGL) says it has developed a new AI training technique that is 13 times faster than existing approaches at a time when the rising energy consumption of the AI industry is raising global concerns.

In a recent research paper, Google’s DeepMind AI research lab detailed its new approach, which is based on multimodal contrastive learning with joint example selection (JEST). DeepMind researchers say this approach is not only 13 times faster but also ten times more computationally efficient.

Existing AI training methods select individual data points independently. JEST departs from this approach by selecting related data jointly in batches, which it then uses to train the AI models.

“In computer vision, hard negatives (i.e. clusters of points which lie close to one another but contain different labels) have been found to provide a more effective learning signal than trivially solvable ones,” DeepMind researchers stated.

Google based its new approach on this notion to accelerate learning beyond the existing model of independent examples.

“When scoring batches according to their learnability… JEST improves further, matching the performance of state-of-the-art models with up to 13× fewer training iterations,” the paper read.

With JEST, Google first trained a smaller AI model that it then used to grade the quality of training data, group the data into batches, and rank those batches by quality. The small model determines which batches would best train an AI model, and its selections are then used to train a much larger model.
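The selection step described above can be sketched in simplified form. This is a toy illustration, not DeepMind's implementation: the "learnability" heuristic here (learner loss minus reference-model loss, following the paper's general idea) and all variable names are assumptions, and the loss values are made up for demonstration.

```python
import numpy as np

def learnability(learner_loss, reference_loss):
    """Toy JEST-style learnability score: highest when the model being
    trained still finds the data hard (high learner loss) but a small
    reference model trained on curated data finds it easy (low loss)."""
    return learner_loss - reference_loss

# Hypothetical per-example losses for 3 candidate batches of 2 examples each
learner   = np.array([[2.0, 1.8], [0.6, 0.5], [1.5, 1.4]])  # model in training
reference = np.array([[0.4, 0.3], [0.5, 0.4], [1.4, 1.3]])  # small reference model

# Score whole batches jointly rather than individual examples
scores = learnability(learner, reference).mean(axis=1)
best = int(np.argmax(scores))
print(f"most learnable batch: {best}")  # batch 0: hard for learner, easy for reference
```

Batch 0 wins because its examples are still unsolved by the learner yet easy for the reference model; batch 2 is hard for both, suggesting noisy or low-quality data that selection should skip.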

JEST relies on an ability “to steer the data selection process towards the distribution of smaller, well-curated datasets.”

AI training depends heavily on the quality of the data used, and JEST amplifies that dependence. Essentially, JEST is Google’s method of having the AI model itself filter the data and select the highest-quality subset to train on.

JEST’s arrival is timely. In recent months, global focus has shifted towards the rising energy costs from the AI industry, with some experts claiming it’s unsustainable.

Rene Haas, the CEO of $190 billion chipmaker Arm Holdings (NASDAQ: ARM), noted in April that by 2030, “AI data centres could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less.”

“That’s hardly very sustainable, to be honest with you,” said Haas, whose company’s value has tripled since last October.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership—allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch BSV DevCon 2024 highlights: Building real-world solutions for real-world problems

New to blockchain? Check out CoinGeek’s Blockchain for Beginners section, the ultimate resource guide to learn more about blockchain technology.