Edge chips could render some networks useless

[51CTO.com Quick Translation] Some scientists say that on-device hardware processing could make devices independent of the Internet. The goal is to improve machine efficiency, save power, and make devices more resilient.

"Device such as drones rely on constant Wi-Fi signals," wrote an article about the researchers at Binghamton University in New York. "If the Wi-Fi goes down, the drone will crash."

But if you make a device independent of any network connection, it becomes more resilient, the researchers say. And the more processing you can do on the machine itself, the more energy you save, because you no longer spend power on communications.

“You can put 5G and 6G everywhere, assuming you have a reliable internet connection all the time, or you can solve the hardware processing problem, which is what we’re doing,” said Louis Piper, associate professor of physics and chair of materials science and engineering at Binghamton University.

Researchers at Binghamton University and Georgia Tech are developing neuromorphic circuits, computer chips that mimic the brain, so that all of a device's processing can happen at the chip level. That would eliminate network overhead and remove the need to communicate over a network at all.

“The idea is, we want this chip to do everything within the chip itself, rather than having to send messages back and forth to some big server,” Piper said.

This saves power, and the machine becomes capable enough to handle its environment on its own, with no need to query a bank of machines somewhere else. It is not only faster and more reliable, but also more energy-efficient: in effect, an extreme form of edge computing.

Neuron-like circuits work like neurons in the brain

Artificial neuron-like device circuits are being developed that attempt to mimic actual biological neurons in the brain. Neurons are electrically responsive nerve cells and fibers in the brain that process information. They send signals that cause muscles to contract, communicate with the spinal cord and nerves, and importantly, use very little energy.
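The behavior such circuits aim to reproduce can be illustrated in software. The following is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, a standard textbook model and not the specific device the researchers built; all parameter values (time constant, threshold, reset) are illustrative assumptions:

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    Returns the membrane-potential trace and the time steps at which
    the neuron fired. All parameters are illustrative defaults.
    """
    v = v_rest
    potentials, spikes = [], []
    for t, i_in in enumerate(input_current):
        # The potential leaks back toward rest while integrating input.
        v += dt / tau * (-(v - v_rest) + i_in)
        if v >= v_thresh:      # Threshold crossed: emit a spike...
            spikes.append(t)
            v = v_reset        # ...and reset, as a biological neuron does.
        potentials.append(v)
    return potentials, spikes

# A constant drive above threshold eventually makes the neuron fire,
# after which it resets and charges up again.
potentials, spikes = simulate_lif([1.5] * 100)
```

Like its biological counterpart, the model neuron is silent most of the time and emits only brief spikes, which is where the energy savings of neuromorphic hardware come from.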

Neuromorphic devices, electronic circuits that mimic the brain, were first theorized in 1962 and demonstrated in 2013 using a material called niobium dioxide (NbO2), the researchers wrote in their paper published in Nature. The problem with such circuits, however, is that they require high voltages and a complex fabrication step called electroforming to produce the only working switching elements developed so far.

“Like Frankenstein’s monster, you basically run a lot of current through the material and all of a sudden it becomes an active element,” Piper said. “That’s not very reliable in terms of manufacturing process steps.”

As a result, scalability is an issue.

But the scientists say they have now made a major breakthrough in this area: the team reports that its devices can perform the switching function without the electroforming step.

“You could make neuron-like devices out of this because you don’t need electroforming, it’s more reliable, and it could create an industry,” Piper said.

The team's results are expected to lead to cheaper, more energy-efficient, higher-density neuromorphic circuits, and to bring more energy-efficient, adaptable computing sooner.

Original title: Edge-chips could render some networks useless, author: Patrick Nelson
