AI and Human Intelligence

Where, in our experience, do language-based digital code, computer-like programming, machines, and other high [information] structures come from? They have only one known source: intelligence

03/24/22

John Stonestreet

Kasey Leander

In a recent interview, Mark Zuckerberg was asked to complete a CAPTCHA test to prove once and for all that he’s not a robot. In case you were wondering, he passed.

It’s all a bit funny, given (as John Mulaney observed a couple of years ago) just how much time we spend these days proving to robots that we’re not robots. Only human brains, apparently, can recognize every square with a stop sign or a boat.

It’s also a bit ironic, given that the most exciting trend in computing technology is building computers using design principles based on the human brain.

“Today’s most successful artificial intelligence algorithms [are] loosely based on the intricate webs of real neural networks,” writes Allison Whitten in Quanta Magazine. She’s referring to something called “deep learning,” an advanced form of artificial intelligence that can process huge amounts of data while correcting its own mistakes or even anticipating future problems.

From mapping traffic patterns to predicting storm fronts to understanding the stock market, the possibilities of deep learning are endless. For years, though, there’s been a major holdup: the cost of keeping it running. Deep learning relies on so-called “simulated neural networks”: layer upon layer of artificial neurons all crunching away at the same problem.
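
To make the idea of “layers” concrete, here’s a minimal sketch in Python. It is our own toy illustration under simple assumptions, not code from Whitten’s article or from any real AI system: a tiny “deep” network is just a stack of weight tables, each one transforming the output of the layer before it.

    import numpy as np

    rng = np.random.default_rng(0)

    def relu(x):
        # Simple nonlinearity applied after each layer's weighted sum.
        return np.maximum(0.0, x)

    class TinyDeepNet:
        """A toy 'deep' network: a stack of layers, each a table of adjustable weights."""

        def __init__(self, layer_sizes):
            # One randomly initialized weight matrix and bias vector per layer.
            self.weights = [rng.normal(0.0, 0.1, (m, n))
                            for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
            self.biases = [np.zeros(n) for n in layer_sizes[1:]]

        def forward(self, x):
            # Data flows through the layers in order; each layer transforms the
            # previous layer's output, loosely like neurons passing signals onward.
            # (Training, which adjusts the weights to correct mistakes, is omitted.)
            for w, b in zip(self.weights[:-1], self.biases[:-1]):
                x = relu(x @ w + b)
            return x @ self.weights[-1] + self.biases[-1]

    # Example: a three-layer network mapping four input numbers to one output.
    net = TinyDeepNet([4, 16, 8, 1])
    print(net.forward(np.array([0.2, -1.0, 0.5, 0.3])))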

“Unlike our highly efficient brains,” Whitten explains, “running these algorithms on computers guzzles shocking amounts of energy: The biggest models consume nearly as much power as five cars over their lifetimes.”  

And now, after years of puzzling out how to make a system capable of running advanced AI, researchers are finding a breakthrough source of inspiration: the human brain.  

The secret lies in how the brain processes information. Whereas digital communication is binary, using 1s and 0s, the brain’s communication is analog, working with continuous signals. Likewise, while digital tech relies on one central processing unit, the brain arranges millions of computing units right next to memory units, in the form of synapses and neurons.
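
To get a feel for that binary-versus-analog contrast, here’s another small Python sketch of our own (a toy illustration, not a model of any real neuromorphic chip): the same signal kept as a stream of continuous values versus flattened into 1s and 0s.

    import numpy as np

    # The same "signal," handled two ways.
    time = np.linspace(0.0, 1.0, 10)
    analog_signal = np.sin(2 * np.pi * time)          # continuous values, like an analog stream
    digital_signal = (analog_signal > 0).astype(int)  # the same signal collapsed into 1s and 0s

    print("analog: ", np.round(analog_signal, 2))
    print("digital:", digital_signal)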

The science is definitely chewy for us non-engineering types, but the results are incredible. Digital technology has transformed the modern world, opening vistas that previous generations would have thought impossible. Yet to fully master artificial intelligence, our brightest scientists are forced to direct their attention back to God’s original design specs.  

This is part of a bigger phenomenon in engineering called “biomimetics,” a word coined from the Greek bios, which means life, and mīmēsis, which means imitation. Neurocomputing is just one of the more dramatic examples of what some have called a biomimetics “gold rush”: a race to understand and apply systems and design features of the natural world that, in many cases, are far beyond even our most advanced engineering.

This is not to say God somehow always prefers “natural” systems to “artificial” ones. Humans are designed to create and innovate in ways that have advanced us past our Garden of Eden beginnings.

At the same time, there’s no escaping the basic truth that natural systems are nothing short of stunning. People didn’t design themselves, yet we possess self-repairing, massively complex biological systems. We experience life through five senses, our brains seamlessly integrating terabytes of data at all times. And to do it, we require only assorted organic materials we find around us. The brain can sustain itself on a cheeseburger.  

In 2013, a collaboration between Japanese and German scientists created one of the most realistic brain simulations ever attempted. They used what was, at that time, the world’s fourth-largest computer, containing over 700,000 processor cores and an eye-popping 1.4 million gigabytes of RAM. The machine worked at top speed, crunching numbers for over 40 minutes. In the end, it produced just one second of simulated brain activity.

Technology has advanced since then, but it still raises a fundamental question: How do we explain the complexity of the natural world?  

Darwinian naturalists are forced to punt to random genetic mutations (or, in the words of Bertrand Russell, “accidental collocations of atoms”) and a limitless supply of time and chance.

But they still can’t escape the haunting questions. How did the universe itself come into existence? How do the laws of physics perfectly align for our survival here? How can we account for the incredible amount of complex and specified information in biological systems, including the human brain? How can we explain abiogenesis, or the origin of the first life from non-life?  

Christians don’t have to agree on every scientific detail to point to what is obvious. There’s a designer behind all of this design. In other words, as Casey Luskin with Evolution News summarizes: “Where, in our experience, do language-based digital code, computer-like programming, machines, and other high [information] structures come from? They have only one known source: intelligence.” 


Resources:

The Limits of Artificial Intelligence
John Stonestreet & Dr. Timothy D. Padgett | BreakPoint | May 18, 2021

The Age of Artificial Intelligence
Shane Morris and Jason Thacker | BreakPoint Podcast | June 28, 2020
