
Geoffrey Hinton: from failure to Nobel Prize winner

  • STEMonics
  • Aug 13, 2025
  • 3 min read

In 1950, Alan Turing published his paper "Computing Machinery and Intelligence," outlining the concept of artificial intelligence (AI). By providing the basis for the development of such technologies, he would later come to be known as the “father of AI”. Recently, AI has developed into a pivotal, almost crucial, force in society, so much so that we often cannot recall life without it. It has quickly become a source of great potential, but also of great concern. But has AI stayed the same for all this time? People often discuss the most recent developments in AI but rarely the key milestone that allowed it to function as it does today: the Boltzmann machine.


The Boltzmann machine, named after Austrian physicist Ludwig Boltzmann, was created in 1983–1985 by Geoffrey Hinton, working with Terrence Sejnowski, using tools from statistical physics[1]. It was inspired by the structure of the brain: it consists of interconnected neurons that make random decisions according to a probability distribution. The machine provided a foundation for understanding how networks could be trained to process and recognise patterns in data. This allowed people to create images, learn patterns of data, and solve problems, all using AI. A variant called the restricted Boltzmann machine learns a probability distribution over its inputs, enabling it to predict, for example, which video you want to watch next on social media. The Boltzmann machine was key in giving AI its power and enabling it to reach the importance it has today.
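To make the idea of neurons "making random decisions according to a probability distribution" concrete, here is a minimal sketch of one step of a toy restricted Boltzmann machine. The network sizes, weights, and biases are made-up numbers purely for illustration, not taken from any real model:

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy restricted Boltzmann machine: 3 visible units, 2 hidden units.
# W[i][j] connects visible unit i to hidden unit j (illustrative values).
W = [[0.5, -0.3],
     [0.1,  0.8],
     [-0.6, 0.2]]
b_hid = [0.0, 0.1]  # hidden-unit biases

def sample_hidden(visible):
    """Each hidden unit switches on stochastically, with probability
    given by the sigmoid of its total input -- the random, probabilistic
    decision-making the article describes."""
    hidden = []
    for j in range(len(b_hid)):
        activation = b_hid[j] + sum(visible[i] * W[i][j]
                                    for i in range(len(visible)))
        p_on = sigmoid(activation)
        hidden.append(1 if random.random() < p_on else 0)
    return hidden

h = sample_hidden([1, 0, 1])  # a list of 0/1 hidden states
```

Training such a machine means adjusting the weights so the probabilities it assigns match the patterns in the data, which is what lets it "recognise data patterns".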


However, success did not come as quickly as some expected. Hinton’s machine was hardly noticed at first; it was dismissed and considered irrelevant until relatively recently. Computer scientists then built on his ideas to develop AI using “backpropagation”[2], which enables neural networks[3] to learn and makes AI significantly more efficient.


But Hinton is not a physicist. How did the man once presented as having “failed at physics and dropped out of psychology” win a physics Nobel Prize? The truth is he used the physics of the Hopfield network as the basis for the Boltzmann machine. A Hopfield network is a recurrent neural network made up of a single layer of ‘n’ fully interconnected neurons. This network is the reason John Hopfield was also credited and shared the 2024 Nobel Prize in Physics with Hinton.
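The single-layer, fully connected network described above can be sketched in a few lines. The "physics" is the energy idea: each update pushes the network downhill in energy until it settles into a stored pattern. The pattern and network size here are made up for illustration:

```python
# Toy Hopfield network: one layer of n fully connected +/-1 neurons.
n = 4
stored = [1, -1, 1, -1]  # an illustrative pattern to memorise

# Hebbian storage rule: W[i][j] = x_i * x_j, with no self-connections.
W = [[0 if i == j else stored[i] * stored[j] for j in range(n)]
     for i in range(n)]

def recall(state, sweeps=5):
    """Repeatedly set each neuron to the sign of its total input;
    each flip lowers the network's energy, so the state slides
    toward a stored pattern."""
    state = list(state)
    for _ in range(sweeps):
        for i in range(n):
            field = sum(W[i][j] * state[j] for j in range(n))
            state[i] = 1 if field >= 0 else -1
    return state

# Starting from a corrupted copy (second neuron flipped), the
# network settles back to the stored pattern.
result = recall([1, 1, 1, -1])
```

This energy-minimising behaviour, borrowed from the statistical physics of magnetic systems, is what the Boltzmann machine made probabilistic.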

Geoffrey Hinton received much praise for his Boltzmann machine and, after winning the Turing Award in 2019 for his contributions to AI, admitted that he was “flabbergasted” to receive yet another prize. However, he did not hide his apprehensions about the future of AI after this latest success, expressing fears about the dangers of machines that could outsmart humans. He even resigned from Google in 2023 to speak more freely about the issue of an AI takeover: he said, "my guess is in between five and 20 years from now there’s a probability of half that we’ll have to confront the problem of AI trying to take over". The ability of AI to store, process and generate enormous amounts of data, surpassing the human brain, is the reason many are apprehensive about the future. While this does instil an element of fear in some minds, it also reminds us not to rely fully on technology and to retain independence in every aspect of our lives.


Once considered to have “failed at physics”, Hinton has proved his worth and become known as the “godfather of AI”, stepping an inch closer to the genius of Alan Turing. His accomplishments have paved the way for the creation of current AI systems, and Hinton has been engraved in history after receiving the most prestigious award in his field.


By Sofia Aiello

 

Bibliography:


[1]  Statistical physics is the study of the special laws that govern the behaviour and properties of macroscopic bodies or bodies formed by a very large number of individual particles.

[2] Backpropagation is a method of training neural networks to perform tasks more accurately by collecting data from previous iterations and fine-tuning their performance based on past errors.

[3] A computer system modelled on the human brain and nervous system.

