THE LAB Prophesee, Paris. Christoph Posch, co-founder and chief
technology officer.
OBJECTIVE Develop biologically inspired sensors and algorithms to make
machine vision faster and more efficient.
DEVELOPMENT The Onboard vision system, a combination of a sensor and
processing algorithms modeled after the human eye and brain.
HUMAN-MIMICKING MACHINE VISION
sensors combine an accelerometer, gyroscope, and magnetometer
to collect three-axis data on the suit-wearer's movement. An
infrared-illuminated eye-tracking device uses two cameras to
follow pupil motion.
It took some engineering to get the system to work. Eye
trackers typically use infrared light to follow pupil motion
because it works with both dark and light eyes. Such trackers are
fine indoors, but outdoors the sun's infrared rays overwhelm them.
To let in visible light but keep out IR wavelengths, Matthis
settled on a welding screen, a full-face green plastic visor that
shields the eye-tracking sensor without restricting a subject’s
field of view.
Calibrating 2-D eye-tracking information in a 3-D experiment
was a challenge that took Matthis into uncharted territory. To do
it, Matthis leveraged a human reflex, called the vestibulo-ocular
reflex. This works like Newton's third law: if a person moves
their head while focusing on a given object, their eyes will
move in the opposite direction, compensating to keep the same
object in view.
By having a volunteer focus on a fixed point while moving his
or her head, Matthis could map from 2-D eye movements to the
3-D environment using eye-tracking and head-motion data.
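The geometry behind this calibration can be sketched in a few lines. The code below is an illustrative simplification, not Matthis's actual pipeline: it assumes head and eye orientations are each reduced to yaw and pitch angles, and the function names are invented. It shows why the vestibulo-ocular reflex is useful for calibration: while a subject fixates a point, the eye-in-head rotation mirrors the head rotation, so the composed world-frame gaze direction stays constant.

```python
import numpy as np

def rotation_matrix(yaw, pitch):
    """Rotation from yaw (about the vertical axis) and pitch
    (about the horizontal axis), both in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    return ry @ rx

def gaze_in_world(head_yaw, head_pitch, eye_yaw, eye_pitch):
    """World-frame gaze direction: the head rotation applied to the
    eye-in-head direction reported by the eye tracker."""
    eye_dir = rotation_matrix(eye_yaw, eye_pitch) @ np.array([0.0, 0.0, 1.0])
    return rotation_matrix(head_yaw, head_pitch) @ eye_dir

# Vestibulo-ocular reflex during fixation: the eyes counter-rotate
# against the head, so the world gaze direction is unchanged.
g_still = gaze_in_world(0.0, 0.0, 0.0, 0.0)
g_turned = gaze_in_world(0.2, 0.0, -0.2, 0.0)  # head right, eyes left
assert np.allclose(g_still, g_turned)
```

Recording eye and head angles while the gaze target is known and fixed gives exactly the paired data needed to fit the mapping from 2-D tracker output to 3-D gaze directions.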
Among his findings: Humans look two strides ahead on medium
terrain and look at the ground more than 90 percent of the
time on rugged paths. In both cases, they consistently look 1.5
seconds ahead of their current position.
Next, Matthis plans to study how visual deficits affect motion.
He hopes to work with new computer algorithms to elicit more
granular vision data, watching exactly what cues subjects use to
decide where to step next. ME
Humans use an important trick to process complex images in
milliseconds: prioritizing dynamic data over static information.
Prophesee, a French startup, channels that approach in an
imaging sensor and processing algorithms that sample and
analyze only what changes in a scene.
“These are really like high-speed eyes for machines,” said
Christoph Posch, co-founder and chief technology officer of
Prophesee.
The problem with conventional vision systems, Posch
explained, is that they sample each pixel equally. This imposes a
large computing burden and limits processing speed. While their
algorithms are plugging away, these systems often miss data
between video frames.
Neuromorphic sensing, a term coined by neural computing
pioneer Carver Mead, promises to address this problem. It seeks
to mimic the brain by focusing only on the parts of the image that
change.
Receiving and processing information on an event-by-event basis
creates faster, more adaptive vision systems that use less
computing power, Posch said.
At 10,000 frames per second, Prophesee's Onboard event-based
vision chip captures motion like a high-speed camera. It consists
of a silicon CMOS image sensor and circuits that send signals
only when they detect changes in the light hitting a pixel. By
filtering out static information from each frame, it produces
lean data that specialized algorithms analyze while using only
milliwatts of power.
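The change-detection principle can be sketched in software. This is only an illustration of the idea, not Prophesee's design: a real event pixel compares log intensity in analog circuitry and fires asynchronously, whereas this sketch diffs two digital frames, and the function name is invented. The point it demonstrates is that static pixels produce no output at all, so the data stream scales with scene activity rather than resolution times frame rate.

```python
import numpy as np

def events_from_frames(prev, curr, t, threshold=0.2):
    """Emit (x, y, t, polarity) events wherever log intensity changed
    by more than `threshold` between two frames. Unchanged pixels
    contribute nothing to the output."""
    diff = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(x, y, t, 1 if diff[y, x] > 0 else -1) for y, x in zip(ys, xs)]

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200                      # one pixel brightens; rest is static
events = events_from_frames(prev, curr, t=0.001)
print(events)                         # only the changed pixel appears
```

A conventional sensor would ship all 16 pixel values here; the event representation carries one tuple.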
Prophesee also developed brain-inspired algorithms to
process this asynchronous data, enabling fast tracking and object
recognition for multiple shapes.
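One way such asynchronous processing can work, sketched very loosely (this is not Prophesee's algorithm, and the function is invented for illustration): instead of reprocessing whole frames, a tracker nudges its position estimate with every incoming event, so its update rate follows the event stream itself.

```python
def track(events, alpha=0.1):
    """Event-by-event tracker: each (x, y, t, polarity) event pulls an
    exponential-moving-average estimate of the object's position
    toward the event location. No frames are ever assembled."""
    cx, cy = float(events[0][0]), float(events[0][1])
    for x, y, _, _ in events[1:]:
        cx += alpha * (x - cx)   # small step toward the new event
        cy += alpha * (y - cy)
    return cx, cy

# An object drifting rightward along y = 5 generates a sparse stream
# of events; the estimate follows it without any frame-rate limit.
stream = [(i, 5, i * 1e-4, 1) for i in range(20)]
cx, cy = track(stream)
```

The estimate lags the newest event by a factor set by `alpha`, the usual smoothing/responsiveness trade-off in incremental filters.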
The challenge, Posch said, has been to cram more complex
processing circuitry into each of the sensor’s pixels. Current
chips accommodate a 15 μm pixel pitch, but Posch says better
resolution is in the works. As their pixels shrink, Posch and his
team envision their systems improving monitoring in factories,
surveillance, and driverless vehicles.
Posch started out developing event-based particle
detectors for CERN, the European Organization for Nuclear
Research. They were instrumental in identifying the Higgs boson
in 2012.
As Posch shifted to vision-focused work, he took on projects
with both machine and medical applications. He is also scientific
advisor to a company called Pixium Vision that makes medical
devices to restore partial vision in certain cases of blindness. The
company currently has two bionic vision systems in clinical trials.
He also thinks it might be possible to apply event-based
principles to LIDAR, a radar-like laser sensor used in some
autonomous cars, or other types of sensors. Low-power, low-bandwidth sensing, he said, should make it easy to work with
many of these devices at once. ME