Brain-Like AI: No Internet Needed


ORLANDO, Fla. (Ivanhoe Newswire) – Our computers, devices, smartwatches, video monitoring systems – we rely on connectivity to the internet and don’t think twice about it. Now, scientists are developing artificial intelligence technology that can work even in remote areas: brain-like AI.

Self-driving cars, drone helicopters, medical monitoring equipment – it’s all cutting-edge technology that requires connection to the cloud. Now, researchers at the University of Central Florida are developing devices that won’t rely on internet connection.

“What we are trying to do is make small devices, which will mimic the neurons and synapses of the brain,” explains Tania Roy, PhD, a researcher at the University of Central Florida.


Right now, artificial intelligence learning requires a connection to a remote server to perform computationally heavy calculations. The scientists are making the AI circuits microscopically small.

Roy emphasizes, “Each device that we have is 1/100th the size of a human hair.”

The AI can fit on a small microchip – less than an inch wide – eliminating the need for internet connection, meaning life-saving devices could work in remote areas. For example, helping emergency responders find missing hikers.

“We would send a drone which has a camera eye, and it can just go and locate those people and rescue them,” Roy says.

The scientists say with no need for internet connection, the AI would also work in space, where no AI technology has gone before.

The same UCF team is expanding on their work with artificial brain devices, and they are developing artificial intelligence that mimics the retina in the human eye, meaning someday, AI could instantly recognize the images in front of it. The researchers say this technology is about five years away from commercial use.

Contributors to this news report include: Cyndy McGrath, Producer; Roque Correa, Videographer & Editor.

To receive a free weekly e-mail on medical breakthroughs from Ivanhoe, sign up at:





REPORT:       MB #5098

BACKGROUND: With developments in technology, people all over the world are starting to work with complex artificial intelligence, and it is becoming an ever-larger part of our lives. Google searches alone are estimated to account for around two hundred terawatt-hours of energy per year. Applications such as voice assistants like Siri currently depend on deep-learning algorithms running on remote computers; moving that computation onto low-power local devices could reduce the energy we consume on everyday tasks. Recent analysis also shows that the increasing demand for computing power vastly outpaces the improvements made from scaling.

(Source: )

MICROCHIP MIMICS THE HUMAN BRAIN?: Artificial intelligence (AI) helps your phone recognize your voice—think Alexa and Siri—but these programs need large amounts of energy. Scientists are working on the next generation of AI that may be up to 1,000 times more energy efficient by manufacturing new computer chips that work much like the human brain. These neuromorphic chips can run AI algorithms using just a fraction of the energy consumed by ordinary chips. The chips process information like a network of neurons in the brain; each neuron receives inputs from others in the network and fires if the total input exceeds a threshold. The new chips are designed to have the hardware equivalent of neurons linked together in a network. Right now, very few of these neuromorphic chips are commercially available, but research continues at several universities in the United States and other labs worldwide.
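The fire-above-a-threshold behavior described above can be sketched in a few lines of code. This is a minimal illustration of the general threshold-neuron idea, not a model of UCF's actual hardware; the function name, weights, and threshold value are all illustrative assumptions.

```python
# A minimal sketch of a threshold neuron: it "fires" when the weighted
# sum of its inputs exceeds a set threshold (illustrative values only).

def neuron_fires(inputs, weights, threshold):
    """Return 1 (fire) if the weighted sum of inputs exceeds the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# Example: three upstream neurons feeding one downstream neuron.
inputs = [1, 0, 1]          # spikes arriving from upstream neurons
weights = [0.6, 0.9, 0.5]   # synaptic strengths
print(neuron_fires(inputs, weights, threshold=1.0))  # total = 1.1 > 1.0, prints 1
```

In neuromorphic hardware, this comparison happens in the physical behavior of the device itself rather than in software, which is where the energy savings come from.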


AI MIMICS THE HUMAN EYE?: Scientists at the University of Central Florida are building on prior research developing tiny brain-like computer chips. They’ve now developed a device for artificial intelligence that mimics the retina of the eye. The development could lead to advanced AI that can instantly recognize what it sees and could have applications in self-driving cars and robotics. Researchers say the device outperforms the eye in the number of wavelengths it can see, from ultraviolet to visible light and on to the infrared spectrum. Current intelligent imaging technology, like what’s used in self-driving vehicles, requires separate sensing, memorization, and data-processing components; the UCF-developed device integrates all three operations into one.



Robert Wells                           Zenaida Kotala

(352) 213-54                           (407) 446-6567              

If this story or any other Ivanhoe story has impacted your life or prompted you or someone you know to seek or change treatments, please let us know by contacting Marjorie Bekaert Thomas at

Doctor Q and A

Read the entire Doctor Q&A for Tania Roy, PhD, researcher
