Metaspectral snaps up $4.7M seed round to bring real-time hyperspectral data to recycling

For all its promise, the current state of solid-waste recycling leaves a lot to be desired. Plastic waste is among the worst: It represents about 12% of solid waste in the U.S., but only about 4.5% of what gets recycled.

There are plenty of reasons why we’re terrible at recycling plastics. For one, not all are easily recyclable. The plastics industry hasn’t been doing us any favors, either.

But one that could be more easily solved is the problem of sorting. There isn’t just one type of plastic; there are many, and sorting them into a few different categories can go a long way toward making recycling more profitable and feasible.

Metaspectral thinks it’s found a way to make the sorting problem easier, and it’s raised a $4.7 million seed round to launch and expand its data analysis platform, TechCrunch has exclusively learned. Investors in the round include SOMA Capital, Acequia Capital, the Canadian government and angel investors including Jude Gomila and Alan Rutledge.

They’re not the first to try to crack plastics sorting, but they’re taking a different tack than many, using hyperspectral imagery to decipher which plastic is which in real time.

Hyperspectral cameras slice light into hundreds of different bands, far more than the usual red, green and blue captured by typical visible-light cameras. The concept was first hatched at NASA’s Jet Propulsion Laboratory in the 1970s as a way to study the Earth’s surface in more detail. Remote sensing experts knew they could get a clearer picture with higher spatial resolution, and they also felt they could get complementary information with greater spectral resolution.
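
To make that difference concrete, here’s a minimal sketch in Python of how a hyperspectral “data cube” compares to an ordinary RGB frame. The resolution, band count and bit depth are illustrative assumptions, not the specs of any particular sensor Metaspectral uses:

```python
import numpy as np

# An ordinary visible-light frame: height x width x 3 broad channels (R, G, B).
rgb_frame = np.zeros((512, 640, 3), dtype=np.uint8)

# A hyperspectral "data cube": the same spatial grid, but hundreds of narrow
# spectral bands per pixel. 224 bands at 16 bits is an illustrative assumption.
hyperspectral_cube = np.zeros((512, 640, 224), dtype=np.uint16)

# Every pixel now carries a full spectrum rather than three numbers, which is
# what lets software tell apart materials that look identical in RGB.
pixel_spectrum = hyperspectral_cube[256, 320, :]  # 224 reflectance samples

print(f"RGB frame: {rgb_frame.nbytes / 1e6:.1f} MB")                     # ~1 MB
print(f"Hyperspectral frame: {hyperspectral_cube.nbytes / 1e6:.1f} MB")  # ~147 MB
```

The same scene is roughly two orders of magnitude more data per frame, which is why processing it quickly is the hard part.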

A range of fields use hyperspectral imaging, including Earth observation and national security, but recycling shows particular promise. To the human eye, many plastics look the same despite having different compositions. But hyperspectral images can reveal differences that would otherwise be too subtle to pick up on.

Metaspectral has developed an image analysis platform that can process hyperspectral data in “real time,” co-founder and CEO Francis Doumet told TechCrunch. Other plastics analysis systems “can replace the human, but they can’t go superhuman,” distinguishing between high and low grades of the same plastic or picking out soiled materials like motor oil containers. “But we can do that.”

Two detergent bottles look nearly identical to the naked eye, but hyperspectral imagery reveals that one is made of PET while the other is made of HDPE. The cap is made of a third plastic, polypropylene. Image Credits: Metaspectral
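
How might software turn those per-pixel spectra into a sorting decision like the PET-versus-HDPE call in the image above? A classical, generic approach (not a description of Metaspectral’s proprietary models) is to compare each pixel’s spectrum against reference spectra for known materials, for example with the spectral angle mapper. The reference spectra below are random placeholders purely to keep the sketch runnable:

```python
import numpy as np

def spectral_angle(spectrum: np.ndarray, reference: np.ndarray) -> float:
    """Angle between two spectra; smaller means a closer material match."""
    cos = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference)
    )
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_pixel(spectrum: np.ndarray, references: dict) -> str:
    """Label a pixel with the reference material its spectrum most resembles."""
    return min(references, key=lambda name: spectral_angle(spectrum, references[name]))

# Placeholder reference spectra for the plastics in the image above
# (real spectral libraries contain measured reflectance curves, not random values).
rng = np.random.default_rng(0)
references = {
    "PET": rng.random(224),
    "HDPE": rng.random(224),
    "PP": rng.random(224),
}

pixel = references["HDPE"] + rng.normal(0, 0.01, 224)  # a noisy HDPE-like pixel
print(classify_pixel(pixel, references))  # expected: "HDPE"
```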

The platform is based on algorithms developed by co-founder and CTO Migel Tissera.

“Migel and I met about four years ago,” Doumet said. “Migel was just freshly minted from his Ph.D. program and had this really cool idea of how to better compress data. A typical compression algorithm just uses the same formula to compress everything. So it does a mediocre job at compressing everything, but it’s good enough because it handles such a wide variety of data. Migel’s idea was, why can’t we tailor the compression algorithm to the input, so that we get better compression performance depending on what type of imagery it is?”
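
The quote describes the general idea of input-aware compression. Here is a toy sketch of that idea, not Metaspectral’s actual algorithm: inspect the data first, then pick a strategy suited to it, in this case delta-encoding along the spectral axis when neighboring bands are highly correlated before handing off to a generic entropy coder:

```python
import zlib
import numpy as np

def compress_tailored(cube: np.ndarray) -> bytes:
    """Toy illustration of input-aware compression (not Metaspectral's algorithm)."""
    # Measure how similar adjacent spectral bands are.
    diffs = np.diff(cube.astype(np.int32), axis=-1)
    smoothness = 1.0 - np.abs(diffs).mean() / (np.abs(cube).mean() + 1e-9)

    if smoothness > 0.9:
        # Spectrally smooth data: store the first band plus band-to-band
        # differences, which a generic coder compresses much better.
        payload = np.concatenate([cube[..., :1].astype(np.int32), diffs], axis=-1)
    else:
        # Noisy or uncorrelated data: fall back to the raw samples.
        payload = cube.astype(np.int32)

    return zlib.compress(payload.tobytes())
```

The point of the design is that the same framework can route different kinds of imagery down different paths rather than treating everything identically.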

The pair founded Metaspectral, and their first customer was an agricultural company that was inspecting crops using imagery that contained infrared data, a frequent component of hyperspectral imagery. “That’s kind of what opened up our eyes to that market,” Doumet said.

Because it’s so data rich, hyperspectral imagery can be challenging to process, especially if you want results quickly. Tissera was able to adapt his algorithms to hyperspectral data, and he and Doumet were off to the races.

Despite its initial experience in the agricultural space, Metaspectral quickly focused on two markets: national security and environmental monitoring. “Recycling is probably like 90% of our focus right now, as a company,” Doumet said.

Because their software can process hyperspectral images extremely quickly, recycling sorting facilities have been keen to try it out. “The recycling conveyor belt moves at one meter a second, and the robot is only four meters from the vision system. So we have to make the decision within two to three seconds for the robot to move and pick it up,” Tissera told TechCrunch.

The system can use a number of off-the-shelf hyperspectral cameras. Images from those cameras are fed to on-premises servers that include custom field-programmable gate arrays (FPGAs), which can process hyperspectral data at 37 gigabits per second. Managers can also view the imagery live as it scrolls past. Currently, Metaspectral is working with “the largest recycler in Canada,” which also has a sizable presence in the U.S., Doumet said.
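
For a rough sense of why that throughput matters, here is some back-of-the-envelope arithmetic in Python. The camera parameters are illustrative assumptions rather than Metaspectral’s published specs; the belt speed and distances come from Tissera’s description above:

```python
# Camera parameters below are illustrative assumptions, not published specs.
bands = 224
width, height = 640, 512
bit_depth = 16            # bits per sample
frames_per_second = 30

raw_rate_gbps = bands * width * height * bit_depth * frames_per_second / 1e9
print(f"Raw sensor data rate: ~{raw_rate_gbps:.1f} Gbit/s")  # ~35.2 Gbit/s

# Timing budget from the article: the belt moves at 1 m/s and the picking
# robot sits 4 m downstream of the vision system, so an item reaches the
# robot about 4 seconds after it is imaged.
belt_speed_m_per_s = 1.0
distance_to_robot_m = 4.0
decision_deadline_s = 3.0

time_to_robot_s = distance_to_robot_m / belt_speed_m_per_s
print(f"Item reaches robot in {time_to_robot_s:.0f} s; a decision within "
      f"{decision_deadline_s:.0f} s leaves ~{time_to_robot_s - decision_deadline_s:.0f} s to pick")
```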

Doumet and his team are piloting Metaspectral’s technology at one of the recycler’s Vancouver facilities. If the initial trial works as planned, they’ll install nearly 20 cameras to turn the facility into a “showcase plant,” Doumet said. “And then if that goes well, then we’re going to be deploying into the States as well.”

Though recycling might be the beachhead market, Doumet and Tissera see plenty of applications for their hyperspectral image analysis software, including defense, wildfire monitoring and quality control in industries like processed foods and pharmaceuticals.

By making it easier to process hyperspectral data, “we’re opening up this entire new dimension that was previously invisible to industry,” Doumet said.