    How the natural world is inspiring the robot eyes of the future

By Lucky | May 29, 2025 | 12 min read

    The miniature curved compound eye, called CurvACE, was inspired by the eyes of insects. Credit: Alain Herzog, EPFL

    Electrical engineer Young Min Song remembers when his colleague at the Gwangju Institute of Science and Technology in South Korea asked him why the eyes of the numerous stray cats around the institute had vertical, slit-shaped pupils. Song builds visual components for robots inspired by nature, but had not, until that moment, considered the vision of the felines on his doorstep.

    The conversation inspired Song to design a prototype robotic eye with a cat-like, 3D-printed adjustable vertical pupil[1]. In daylight, when the pupil is slit-shaped rather than circular, the prototype eye detected objects against a busy background more accurately. Song also added a patterned metal reflector beneath the image sensor, modelled on the tapetum lucidum, the reflective layer of cells that sits behind the light-sensing retina in cats’ eyes. Just as the tapetum lucidum makes it easier for tigers to see their prey at dusk by feeding the retina with more light, the reflector improved the robotic eye’s ability to see in dim conditions.

    Song showed that the feline-inspired robotic eye provided high-quality visual data that required less intensive processing, such as contrast enhancement, to track objects than did the same system with a conventional circular aperture. With a vertical pupil, “tracking becomes way more stable”, he says, and energy consumption is reduced. “It really shows how the visual system doesn’t have to work as hard.”
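    The optical intuition here can be sketched numerically: the shape of an aperture fixes the point-spread function (PSF) with which a scene is blurred, and under the Fraunhofer approximation the far-field PSF is just the squared magnitude of the Fourier transform of the aperture mask. The following is a minimal illustration, not Song’s actual model, assuming only NumPy; the masks could equally be swapped for a ‘W’ shape or any other pupil.

```python
import numpy as np

def psf(aperture):
    """Fraunhofer approximation: the far-field point-spread
    function is |FFT(aperture)|^2, normalized to unit energy."""
    field = np.fft.fftshift(np.fft.fft2(aperture))
    p = np.abs(field) ** 2
    return p / p.sum()

N = 256
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]

circular = (x**2 + y**2 < 40**2).astype(float)             # round pupil
slit = ((np.abs(x) < 6) & (np.abs(y) < 40)).astype(float)  # vertical slit

psf_circular, psf_slit = psf(circular), psf(slit)
# The slit's PSF is anisotropic: blur differs with direction, so
# sharpness varies with edge orientation. That directional behaviour
# is one optical ingredient in the depth-of-field properties of
# slit pupils.
```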


    It’s a design that Song thinks would benefit a small surveillance robot. “Good hardware”, he says, “is kind of like a hidden treasure — it saves you a tonne of energy on computation.”

    As robots venture beyond the controlled environments of factory floors, where they have existed for decades, it will become imperative for them to rapidly perform an array of perceptual tasks with a limited energy supply. An autonomous vehicle, for instance, must be able to perceive the world in real time under various lighting and weather conditions to navigate in unpredictable traffic. Yulia Sandamirskaya, a neuromorphic computing specialist at the Zurich University of Applied Sciences in Switzerland, thinks that better vision systems are key to the success of safe, clever machines. “For robots to collaborate with humans, or even do anything around humans, they need smarter vision,” she says.

    A typical camera-based vision system in a robot uses a lens, or multiple lenses, to focus incoming light onto a sensor — typically a flat semiconductor device known as a CMOS sensor. These sensors produce a vast amount of raw data, which must then be processed to produce images.
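    As a rough illustration of what that processing involves, here is a toy ‘develop’ step for a raw RGGB Bayer mosaic. All parameter values are invented for the example; a real image signal processor would add denoising, lens corrections, sharpening and tone mapping on top.

```python
import numpy as np

def develop(raw, black=64, wb=(2.0, 1.0, 1.5), gamma=2.2):
    """Toy pipeline for an RGGB Bayer mosaic: subtract the black
    level, demosaic each 2x2 tile naively, apply white balance,
    then gamma-encode. All parameters here are illustrative."""
    raw = np.clip(raw.astype(float) - black, 0.0, None)
    r = raw[0::2, 0::2]                          # red sites
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2  # two green sites
    b = raw[1::2, 1::2]                          # blue sites
    rgb = np.stack([r * wb[0], g * wb[1], b * wb[2]], axis=-1)
    rgb /= rgb.max() + 1e-9                      # normalize to [0, 1]
    return rgb ** (1.0 / gamma)
```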

    One way to achieve smarter vision would be to improve this processing, by creating algorithms that can extract more information from incoming visual data more quickly. But Song and other scientists say that innovative hardware — that is, the optical components and the sensors of robotic visual systems — should also be part of the equation.

    Young Min Song, an electrical engineer at the Gwangju Institute of Science and Technology in South Korea, designs components for robotic eyes. Credit: Young Min Song

    Dario Floreano, a roboticist at the Swiss Federal Institute of Technology in Lausanne (EPFL), thinks that the mainstream approach to machine perception is to consider the hardware of the sensory device and the signal processing separately. But this is a “reductionist perspective of vision”, he says, “because the two components depend on each other”. A better approach, Floreano argues, is to mimic a biological model, in which the image-gathering organ — the eye — co-evolves with the neural circuitry that processes these signals. Song agrees. “Imagine an eagle with human eyes. No matter how complex the processing is, do you think it could see as well as a real eagle from far away? Probably not — its brain would have to work like crazy,” he says.

    Rather than rely on clever computation to make the best of conventional camera hardware, some engineers are instead experimenting with optical components such as new apertures, and animal-inspired sensors, that together can gather high-quality visual data that requires less intensive processing. The hope is that this hardware, which can be tailored to certain conditions and tasks, will not only increase the speed with which machines can perceive their environment, but also reduce energy consumption and weight — crucial considerations for lightweight, battery-powered robots.

    Animal inspiration

    Song’s feline-like eye is not his only design that has been inspired by the natural world. He and his colleagues were similarly influenced by the eyes of the cuttlefish, the pupils of which take on a ‘W’ shape under bright light. In an unevenly lit ocean environment, a W-shaped pupil reduces the amount of light entering the eye from the bright surface of the water.

    The researchers designed a robotic visual system that similarly uses a W-shaped aperture to produce evenly lit images[2]. Song thinks that this W-shaped pupil could be integrated into the vision system of self-driving cars to improve the vehicles’ ability to handle bright light and reduce the need for further processing to cope with sun glare.

    The pupils of cuttlefish take on a W-shape in bright light. Credit: Sergio Hanquet/Nature Picture Library/Science Photo Library

    Despite the unusual pupil shape, the cuttlefish eye otherwise operates in broadly the same way as a conventional camera, and indeed our own eyes, using a single lens to focus light onto the retina. But these are not the only types of eye to exist in nature: roboticists are also drawing inspiration from the honeycomb-like compound eyes of insects and crustaceans.

    Compound eyes are made up of an array of many small visual units. Their design is an example of morphological computation, a principle found in nature in which sensing, control or computation is outsourced to the shape and form of the body, says Helmut Hauser, a roboticist at the University of Bristol, UK.

    The design of a compound eye holds certain advantages over conventional cameras. To sense movement, for example, flying insects register optic flow — that is, the pattern of motion across the retina. Conventional camera technology can gather this sort of information, Floreano says, but it requires either multiple cameras or wide-angle lenses to cover a large field of view, and the processing needed to make sense of the data is significant. It must, for instance, compensate for distortion and aberrations induced by wide-angle optics. In a compound eye, this problem is solved by the many ommatidia — the individual photoreceptive units that make up the eye. “These ommatidia directly provide optic flow signals that do not require further computational adjustments,” Floreano says. Each ommatidium follows the same optical design and photoreceptor layout, and they point in different directions, providing a field of view of up to 360 degrees.
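    For contrast, the conventional camera route to optic flow means estimating it in software from successive frames. A hypothetical loop using OpenCV’s dense Farneback estimator gives a feel for the per-frame computation that an ommatidium array provides directly:

```python
import cv2

# A conventional way to obtain optic flow: compute it from pairs of
# frames with a dense estimator (Farneback's algorithm here). This is
# the per-frame processing that an array of ommatidia sidesteps.
cap = cv2.VideoCapture(0)  # webcam; a video file path also works
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # args: prev, next, flow, pyr_scale, levels, winsize,
    #       iterations, poly_n, poly_sigma, flags
    flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # flow[y, x] holds that pixel's (dx, dy) motion between frames
    prev = gray

cap.release()
```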

    Engineers have, for years, tried to bring this small, computationally light vision system to robots, taking inspiration from a variety of animals, including the European mantis (Mantis religiosa)[3] and the fiddler crab (Uca arcuata)[4]. In 2013, at the University of Illinois Urbana-Champaign, Song, together with John Rogers, a materials scientist now based at Northwestern University in Evanston, Illinois, and other colleagues, was among the first to use bendable optics to create a compound-eye-inspired hemispherical camera with a 160° field of view and an almost infinite depth of field[5]. The same year, Floreano led an international team that developed a miniature curved compound eye called CurvACE[6]. Each artificial ommatidium that makes up the eye consists of a highly transparent polymer microlens moulded onto glass, which focuses light precisely onto a curved photodetector layer. The eye weighs just 1.75 grams and consists of 630 ommatidia, about as many as are contained in the eye of a Drosophila fruit fly.

    With a field of view of 180° in the horizontal plane, and high temporal resolution that provides high motion sensitivity, CurvACE is particularly suited to drones that must deftly navigate complex environments and avoid obstacles in real time. Indeed, in 2022, CurvACE was used in a prototype of a flapping-wing robot to estimate optic flow at an energy-efficient 40 × 15 pixels[7].

    Neuromorphic cameras — which respond to changes in brightness and are inspired by the retina — can be used in drones. The image on the right illustrates the visual input the drone receives from the camera: red indicates pixels that are getting darker and green indicates those that are getting brighter. Credit: TU Delft, TU Delft/Studio Oostrum

    The intricate structure and small size of compound eyes pose scaling challenges, says Rogers, such as difficulty creating high volumes of dense arrays of photodetectors in curved layouts. “Hardware is ‘hard’, especially for cost-effective, manufactured systems,” he says.

    Brain-like sensing

    CurvACE’s innovations go beyond the shape and wide field of view: the design of the image sensors also diverges from convention. CurvACE focuses light onto an array of neuromorphic photodetectors, so named because their design is inspired by the neural architecture of the fly’s eye. The output of the photodetectors signals the intensity of light perceived, which the eye is programmed to process as optic flow. Each ommatidium is equipped with a neuromorphic adaptation circuit, which enables the photodetectors to accurately respond to a much broader range of light intensities than can the sensors of conventional cameras, and at a much higher frequency. This allows the system to function well in a weakly lit room or under a bright sky, just like an insect’s eye.
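    One loose way to picture what such an adaptation circuit does: subtract a slowly updating average of the log intensity, so each detector reports recent contrast rather than absolute brightness, and the same circuit works in dim rooms and under bright skies. A toy sketch, in which the smoothing constant alpha is an arbitrary choice rather than anything taken from CurvACE:

```python
import numpy as np

def adapted_response(log_intensity, state, alpha=0.05):
    """Toy adaptation stage: `state` tracks a slow running average
    of log intensity; the output is the deviation from it, i.e.
    recent contrast, which stays bounded across huge changes in
    absolute light level.
    Initialize with state = np.zeros_like(log_intensity)."""
    state += alpha * (log_intensity - state)
    return log_intensity - state, state
```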

    In the decade since CurvACE was created, neuromorphic sensors have gradually emerged as commercial alternatives to CMOS sensors. For example, in 2015, start-up company iniVation in Zurich, Switzerland, was established to commercialize a neuromorphic sensor developed by a team led by electrical engineer Tobias Delbrück at the University of Zurich. The firm’s ‘dynamic vision sensor’ is inspired by the workings of the human retina. Instead of producing a single image multiple times a second, as conventional image sensors do, Delbrück’s neuromorphic sensor reacts to changes in luminance. In other words, each pixel of the sensor array generates a signal only when there is a change in the intensity of light that strikes it. This means that these ‘event’ cameras consume much less power than typical systems.
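    The pixel behaviour is simple to state: emit an event whenever the log intensity drifts more than a contrast threshold away from its level at the previous event, and stay silent otherwise. A minimal simulation of this idealized model, with an invented threshold rather than any real sensor’s, could look like this:

```python
import numpy as np

def events_from_frames(frames, threshold=0.2):
    """Idealized event-camera model: for each pixel, emit +1 (ON)
    or -1 (OFF) whenever log intensity moves more than `threshold`
    from its value at that pixel's last event. `frames` is a
    sequence of 2-D arrays of non-negative intensities."""
    ref = np.log(frames[0] + 1e-3)
    events = []  # (frame index, x, y, polarity)
    for t, frame in enumerate(frames[1:], start=1):
        log_i = np.log(frame + 1e-3)
        diff = log_i - ref
        pol = np.zeros_like(diff, dtype=np.int8)
        pol[diff > threshold] = 1     # got brighter: ON event
        pol[diff < -threshold] = -1   # got darker:  OFF event
        fired = pol != 0
        ref[fired] = log_i[fired]     # reset reference where fired
        for yy, xx in zip(*np.nonzero(fired)):
            events.append((t, xx, yy, int(pol[yy, xx])))
    return events
```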

    Fruit flies (Drosophila melanogaster) have honeycomb-like compound eyes that are made up of an array of individual photoreceptive units called ommatidia (inset).Credit: Eye of Science/Science Photo Library. Adapted from Anne Weston, Francis Crick Institute/Science Photo Library

    In conventional cameras with CMOS sensors, “the image is produced and output to the computer whether anything changes or not”, Sandamirskaya explains. This is “very wasteful and slow, [and] requires a lot of processing”. Neuromorphic cameras that react only to events are less computationally expensive, and because there is no lag between the sensing and the processing they are not affected by motion blur. Sandamirskaya thinks that ‘seeing’ in this way, instead of performing intensive processing on multiple images per second, holds promise in achieving the fast, intelligent vision of the future. “I believe in this neuromorphic idea,” she says. “Evolution took its time to come up with a pretty good solution that solves many of the problems that we want to solve for robots.”

    Bodily improvement

    The big challenge with using neuromorphic cameras in robotic vision is aligning the sensing with the computing. Conventional vision algorithms, which take in image sequences, are not designed to work with event data. This means that the output of neuromorphic sensors might need to be converted into a form that the software can understand, defeating the purpose of using an unconventional sensor in the first place. “You will have to make frames out of events, and that will again slow you down,” says Sandamirskaya. “You lose all the advantages.” For this reason, she thinks that it is important to “couple this camera with the computer that also fits”. But these cameras have been around for only a few decades, and much work remains to develop an ecosystem around them. “Neuromorphic hardware in general is still quite experimental, and not always easy to work with due to interfacing limitations,” says Jesse Hagenaars, an artificial intelligence (AI) and robotics researcher at Delft University of Technology in the Netherlands.
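    The workaround she describes, binning events back into images so that frame-based software can consume them, is easy to write, which is partly why it is tempting despite the latency it reintroduces. A sketch, assuming events shaped like the (t, x, y, polarity) tuples of the simulator above:

```python
import numpy as np

def events_to_frame(events, shape):
    """Accumulate (t, x, y, polarity) events into one signed image:
    positive where pixels brightened, negative where they darkened
    (compare the green/red rendering in the drone figure above).
    Batching like this restores frame-style latency, the cost
    discussed in the text."""
    img = np.zeros(shape, dtype=np.int32)
    for _, x, y, pol in events:
        img[y, x] += pol
    return img
```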

    Some engineers and computer scientists are working together to co-design hardware and software for future nature-inspired vision systems. Song, for example, is wrapping up a project based on compound eyes with computer scientist Frédo Durand at the Massachusetts Institute of Technology in Cambridge. They are trying to achieve high-resolution, ultra-wide-angle imaging — something that conventional fly-eye or compound-eye camera research hasn’t yet been able to fully accomplish. Song’s role in the project, titled Artificial Compound Eyes with Artificial Intelligence (ACE-AI), is building the hardware, while Durand and his team have been integrating AI techniques to supplement and enhance the information obtained from the hardware.

    Collaborations of this nature are indications that roboticists are moving away from the common view that a simpler body is always better. There are signs of this change in the allocation of research funding as well: the UK government’s Advanced Research and Invention Agency (ARIA) — modelled on the US Defense Advanced Research Projects Agency (DARPA) — has, for example, committed £57 million (US$75.7 million) to a project called Smarter Robot Bodies, which aims to create innovative sensing and manipulation hardware. At one point, the solution to many challenges in robotics might have been thought to lie in better processing, but now “there’s an awareness that we have to also build the body in a clever way”, says Hauser. “Bodies that help us to actually use artificial intelligence more efficiently.”
