Neural Networks: When good enough is better than best


A Queen's research team has developed a new way to train AI systems so they focus on the bigger picture instead of specific, optimized data.

February 13, 2026



A Queen's University research team led by Greg van Anders (Physics, Engineering Physics and Astronomy), centre, along with Irina Babayan, a master's student in physics, left, and recent PhD recipient Hazhir Aliahmadi, has developed a new way to leverage data to train neural networks without reducing generalizability. Their finding was recently published in Nature Communications.

Training a neural network can be a bit like preparing a student for an exam. If the student memorizes every practice question and answer, they might score well on a familiar test, but struggle when the questions change. Real learning means understanding bigger ideas. A Queen's research team says many artificial intelligence (AI) systems have fallen into the memorization trap, and they have developed a new training approach designed to help neural networks learn more effectively.

What is a neural network?

Neural networks are powerful AI systems inspired by how the brain works. They can handle complex tasks like making decisions, analyzing images, and understanding text. You already see them in things like movie recommendations, voice assistants, photo tagging, and automatic translation. Most neural networks are trained to do extremely well on a specific set of examples. If a model becomes too closely matched to that training data (a problem called overfitting), it can fail in new situations. That can cause real-world problems, such as unreliable threat detection or wrong medical diagnoses.

To counter this, the Queen's team introduced an alternative approach, recently published in Nature Communications, called 'sufficient training'.

"We realized that the problem was optimization," says co-lead author Irina Babayan, a master's student in physics. "The leading approaches all start with some version of optimization but then use one trick or another to disrupt the optimizer. We thought: Why not start from something else?"

Instead of trying to train a single, optimized model that makes as few errors as possible, they've opted for training that generates 'good enough' parameter sets, allowing for a variety of non-optimal solutions that can generalize better to new data.
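To make the contrast concrete, here is a toy sketch in Python. It is our own illustration, not the team's published method: instead of searching for the one slope that minimizes error on noisy data, it keeps every slope whose training error falls below a tolerance, yielding a whole family of 'good enough' fits rather than a single optimum.

```python
import random

random.seed(0)

# Toy data: y = 2*x plus noise, so the loss-minimizing slope on this
# particular sample is pulled slightly away from the true value of 2.
data = [(x, 2.0 * x + random.gauss(0, 0.5)) for x in range(1, 9)]

def loss(slope):
    """Mean squared error of the one-parameter model y = slope * x."""
    return sum((y - slope * x) ** 2 for x, y in data) / len(data)

# "Sufficient training" in caricature: accept every candidate slope whose
# training loss is merely below a tolerance, instead of keeping only the
# single loss-minimizing slope.
grid = [s / 100 for s in range(100, 301)]          # slopes 1.00 .. 3.00
tolerance = 2.0 * min(loss(s) for s in grid)       # "good enough" cutoff
good_enough = [s for s in grid if loss(s) <= tolerance]

print(f"accepted {len(good_enough)} slopes, "
      f"spanning {min(good_enough):.2f}..{max(good_enough):.2f}")
```

Every accepted slope fits the training data nearly as well as the optimum, but together they hedge against the noise that any single best fit would have memorized.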

"It sounds like a paradox," adds co-lead author Hazhir Aliahmadi, a recent physics PhD recipient. "But sufficiently trained networks outperform optimally trained ones."

From memorization to learning

The approach hinges on the distinction the team drew between memorization and learning.

"When you're learning something new, you might start by memorizing facts. But the goal of learning isn't to memorize isolated facts. The goal is to get a broader understanding," says Dr. Aliahmadi. "We wanted to develop a training approach that moves toward neural network representations of 'learning' at this higher level."

In practice, the team found their neural networks were better trained to capture the 'gist' of an underlying dataset, rather than reproducing measurement errors. That improved generalizability, which they expect will be useful for making decisions or predictions in areas such as health care, engineering, and finance.

"There's a lot of work in the social sciences showing that diverse teams reach better outcomes," says senior author Greg van Anders, an associate professor in the Department of Physics, Engineering Physics, and Astronomy. "This work shows that this intuition about people also holds for neural networks. We find that a diverse collection of neural networks substantially outperforms an individual network, or a non-diverse collection of networks."

By allowing for a range of solutions rather than forcing a single optimal one, sufficient training naturally creates that diversity, taking advantage of the properties of 'emergence', where a collective or whole is more than the sum of its parts.
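A toy numerical sketch of that ensemble effect (again our illustration, not the paper's experiments): each 'network' below is just a noisy estimate of the same target. A diverse collection errs in different directions, so its averaged prediction largely cancels the errors; a non-diverse collection that shares one bias gains nothing from averaging.

```python
import random

random.seed(1)

target = 2.0  # the quantity every hypothetical "network" tries to predict

# Diverse collection: each member errs in a different random direction,
# a stand-in for independently, sufficiently trained networks.
diverse = [target + random.gauss(0, 0.4) for _ in range(100)]

# Non-diverse collection: every member shares the same systematic bias.
uniform = [target + 0.3 for _ in range(100)]

def ensemble_error(members):
    """Absolute error of the collection's averaged prediction."""
    return abs(sum(members) / len(members) - target)

mean_individual_error = sum(abs(m - target) for m in diverse) / len(diverse)

print(f"typical individual error: {mean_individual_error:.3f}")
print(f"diverse ensemble error:   {ensemble_error(diverse):.3f}")
print(f"uniform ensemble error:   {ensemble_error(uniform):.3f}")
```

The diverse ensemble's averaged prediction beats its typical member, while the uniform ensemble is exactly as wrong as any one of its clones, a small-scale picture of why a varied collection can be more than the sum of its parts.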

"Ultimately, we wanted to develop a tool that was useful for solving real-world problems," says Babayan.
That focus has shaped the areas where the team thinks the approach will have the biggest impact.

"People who need to diagnose rare diseases, detect fraud, or value financial instruments don't have access to limitless data because data are prohibitively difficult or expensive to collect, or there are privacy limitations," says Dr. van Anders. "Sufficient training is tailor-made for settings like that."
 
