One of the tell-tale signs you’re living in the future happens when scientists figure out how to make stuff invisible. Thanks to big-brained researchers at MIT, that future has apparently arrived – although it might not be exactly what you pictured.
Recent PhD graduate YiChang Shih is the lead author of an algorithm that can remove pesky reflections from photographs. What makes this all the more remarkable is that the algorithm doesn't need special hardware or multiple exposures; it works from a single, ordinary snapshot. With this news, it's only a matter of time before frustrated amateur photographers the world over can take flawless photographs through reflective glass without their own image spoiling the end result.
The algorithm builds on a technique developed by researchers Daniel Zoran and Yair Weiss. The Zoran/Weiss method breaks a digital image down into tiny blocks of pixels and uses learned statistics of natural images to judge which blocks likely belong to the subject being photographed and which belong to the unwanted reflection that should be eliminated.
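To make the "tiny pixel blocks" idea concrete, here's a minimal sketch of slicing an image into small patches, the basic unit these methods reason about. The function name, patch size, and non-overlapping layout are illustrative assumptions for this sketch, not the actual Zoran/Weiss implementation (which works with overlapping patches and a learned statistical model).

```python
import numpy as np

def extract_patches(image, size=8):
    """Slice a grayscale image into non-overlapping size x size patches.

    Toy illustration only: patch-based methods analyze an image as a
    collection of small pixel blocks rather than as a whole. Ragged
    edges that don't fill a full block are simply dropped here.
    """
    h, w = image.shape
    h_trim, w_trim = h - h % size, w - w % size  # trim to a multiple of size
    trimmed = image[:h_trim, :w_trim]
    # Reshape so each patch becomes its own 2-D block, then flatten the grid.
    blocks = trimmed.reshape(h_trim // size, size, w_trim // size, size)
    return blocks.swapaxes(1, 2).reshape(-1, size, size)

image = np.arange(64 * 64, dtype=float).reshape(64, 64)
patches = extract_patches(image, size=8)
print(patches.shape)  # (64, 8, 8): an 8x8 grid of 8x8 patches
```

In the real method, each patch like these would be scored against a model of what natural image content looks like, which is how the algorithm tells scene from reflection.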
The science is not quite "there" yet with respect to making just any photo reflection disappear. Shih and company's algorithm only works on reflections created by fairly thick glass, the kind you might find in a double-paned window, for example. That's because thick glass has two reflective surfaces, so it typically produces a slightly shifted double reflection of the same image, and that telltale doubling is what makes the reflection easier for the algorithm to spot and remove.
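A quick way to see why the doubling helps is to write down a toy forward model of the photo: the camera records the scene behind the glass, plus the reflection, plus a shifted and dimmer copy of that same reflection (the "ghost" from the glass's second surface). The shift and attenuation values below are illustrative assumptions, not measured optics, and this is only a sketch of the image-formation idea, not the MIT algorithm itself.

```python
import numpy as np

def add_ghosted_reflection(scene, reflection, shift=3, attenuation=0.5):
    """Toy model of photographing through thick, double-surfaced glass.

    The captured image is the transmitted scene plus the reflection plus
    a horizontally shifted, attenuated copy of that reflection: the
    double image that thick glass creates. Shift and attenuation are
    made-up illustrative values.
    """
    ghost = attenuation * np.roll(reflection, shift, axis=1)
    return scene + reflection + ghost

# A single bright spot in the reflection shows up twice in the capture.
scene = np.zeros((5, 5))
reflection = np.zeros((5, 5))
reflection[2, 1] = 1.0
captured = add_ghosted_reflection(scene, reflection, shift=2, attenuation=0.5)
```

The algorithm's job is the inverse problem: given only `captured`, exploit the fact that reflection content appears twice at a fixed offset (while the real scene appears once) to separate the two layers.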
Naturally, this has already started conversations about how the technique might be used in robotics, allowing machines to differentiate their own reflection from what they're actually spying on through, say, your living room window. Yes, creepy.