
For Turing Award winner, everything is computation and some problems are unsolvable

April 10, 2024 | Hi-network.com
[Photo: Avi Wigderson at a conference. Credit: Andrea Kane, Institute for Advanced Study, Princeton, NJ, USA]

"It is one of the most fundamental intellectual questions ever asked," says Avi Wigderson, recipient of the 2024 ACM Turing Award, regarding the question of P = NP. "If everyone is wrong, and P equals NP, it has amazing consequences."


The Association for Computing Machinery (ACM) on Wednesday announced that it has awarded this year's A.M. Turing prize, often referred to as the Nobel Prize of computing, to computer scientist and mathematician Avi Wigderson of the Institute for Advanced Study in Princeton, New Jersey. 

Wigderson was recognized for fundamental contributions that advanced what is known as the Theory of Computation, with particular distinction in "reshaping our understanding of the role of randomness in computation," and "decades of intellectual leadership in theoretical computer science," said the ACM.

Also: Algorithms soon will run your life - and ruin it, if trained incorrectly

Named for British mathematician and computer scientist Alan M. Turing, the award comes with a $1 million prize, with financial support provided by Google.

Wigderson, the Herbert H. Maass Professor at the Institute, has demonstrated over decades a profound understanding of computation as something more than what machines do: it is a phenomenon found throughout the universe.

"Computation is a really fundamental notion," said Wigderson in an interview with . "It's not just algorithms in computers:everythingcomputes." 

"When you see a seashell on the beach, with some remarkable pattern, that was computed by extremely simple steps: It grew from one grain, and, locally, these grains computed neighboring grains, and were connected -- this really simple process, you evolve it, and you get some amazing patterns."

These simple rules of local, incremental change -- "which we know very well," said Wigderson -- "can create very complex phenomena." Computational examples abound. Computation is how the fertilized egg gets to be a human being, or how human cells divide throughout one's lifetime. "Also, gossip," added Wigderson. "When you tell something that was told to you, or on Facebook -- the way information evolves, these are computational questions that can be framed in complete computational models."
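One way to see how far such simple rules go is a one-dimensional cellular automaton, where each cell repeatedly updates itself from its own state and its two neighbors'. The short Python sketch below uses Rule 30, whose output is often compared to the pigment patterns on cone seashells; the rule number, width, and number of generations are arbitrary illustrative choices, not anything from Wigderson's work.

```python
# A one-dimensional cellular automaton (Rule 30). Each cell's next value is a
# fixed function of itself and its two neighbors -- exactly the kind of local,
# incremental rule described above.

def step(cells, rule=30):
    n = len(cells)
    new = []
    for i in range(n):
        # Read the 3-cell neighborhood as a number 0..7 (wrapping at the edges).
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # The rule number's bits encode the next state for each neighborhood.
        new.append((rule >> neighborhood) & 1)
    return new

width, generations = 63, 32
row = [0] * width
row[width // 2] = 1  # start from a single "grain"
for _ in range(generations):
    print("".join("#" if c else " " for c in row))
    row = step(row)
```

Running it prints a triangle of "#" characters that grows from a single cell into an intricate, seemingly irregular pattern.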

Awarding the Turing prize to Wigderson is especially apt because his field, the Theory of Computation, was begun by Turing in 1936 with his landmark paper, "On Computable Numbers, with an Application to the Entscheidungsproblem."

Turing laid the groundwork for the extremely broad definition of computation as an evolution of an environment via local, incremental changes, where the "environment" can be human beings, galaxies, shells on the beach, information, or any number of phenomena. 

Also: Jack Dongarra, who made supercomputers usable, awarded 2021 ACM Turing prize

Viewed through the lens of Turing, as advanced over decades by Wigderson and colleagues, every natural phenomenon is computation. "How termites build these castles, or the bees, who somehow know where to go and look for the honey -- they actually evolve to create, even though they have very small brain cells," explained Wigderson.

The award recognizes Wigderson specifically for his contribution to understanding the role randomness plays in computation. Conventional computers, from mainframes to iPhones, are "deterministic," meaning they follow a predictable pattern of steps. But many of the most intriguing examples of computation in nature involve elements of randomness, from the seashell to the flocking of starlings known as murmuration.

Wigderson and collaborators used the exploration of randomness to advance the understanding of what problems can and cannot be computed "efficiently." There are numerous problems in the world -- in science, math, and engineering -- for which no efficient algorithm is known, meaning one that runs in "polynomial time" rather than exponential time.

Also: Alan Turing's enduring legacy: 10 ideas beyond Enigma

For example, there is no known efficient algorithm for factoring large integers into their prime factors. But it's not enough to search for such an algorithm; computer scientists and mathematicians want to prove, to know for certain, one way or another, whether there could exist a solution for such "hard" problems.
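To see why this matters, consider the most naive factoring method, trial division, sketched below in Python; the specific numbers are arbitrary, and the point is only how the cost scales with the length of the input.

```python
def trial_division(n):
    """Factor n by dividing out every candidate up to sqrt(n).

    For a b-bit integer the loop can run on the order of 2**(b/2) times:
    cheap relative to the *value* of n, but exponential in the *size* of
    the input (its number of bits), which is what complexity theory counts.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(2**31 - 1))  # a 31-bit prime: about 46,000 iterations
# A 2048-bit number of the kind used in cryptography would need on the order
# of 2**1024 iterations -- hopelessly infeasible for this approach.
```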

The three primary papers cited by the ACM form a sequence, a progression, Wigderson said.

"The question was how powerful is randomness in algorithms," said Wigderson. Starting in the eighties, computer scientists had found many ways to use randomness as a route to efficient algorithms, "And so, it looked like randomness is very powerful."

"These works aimed to show that randomness is not so powerful," he said. Instead, Wigderson and colleagues developed pseudorandom number generators that could solve some hard problems in an efficient, deterministic way. 

"In all these papers, you take a hard problem and you create a pseudorandom generator that can deterministically generate bits that look random, and you can replace random inputs in a probabilistic algorithm [and fashion] a deterministic one."

Also: Famous computer scientists

The weeding out of randomness, replacing it with a deterministic approach, culminated in the final paper with the most sophisticated pseudorandom generator. The lesson, noted Wigderson: "We could conclude that any polynomial-time probabilistic algorithm can be made a polynomial-time deterministic one." 

A surprising side-effect of those discoveries is that if you can remove randomness from algorithms, you can prove that certain problems are computationally hard -- a striking back-door way to settle the question of whether a problem is or is not a hard one.

"Somehow, these two concepts [randomness and hardness], which are very different, are intimately connected," said Wigderson. "They're almost the same thing. Removing randomness from probabilistic algorithms, and proving hardness of computational problems are, sort-of, dual problems."

For those wishing to delve more deeply into the work on randomness, the papers are listed below. Alternatively, you can simply read the chapter on randomness in Wigderson's large volume, "Math and Computation," which is available as a free download. If you skim that book, you'll be head and shoulders above most people at your next cocktail party.

  • "Hardness vs. Randomness" (Co-authored with Noam Nisan)
    Among other findings, this paper introduced a new type of pseudorandom generator, and proved that efficient deterministic simulation of randomized algorithms is possible under much weaker assumptions than previously known. (A toy sketch of this construction's structure follows the list.)
  • "BPP Has Subexponential Time Simulations Unless EXPTIME has Publishable Proofs" (Co-authored with L

