
Google Has Figured Out How To Make Your Phone Photos Better Before You Take Them

Aug 7, 2017

TAKING INSTAGRAM-WORTHY PHOTOS is one thing; editing them is another. Most of us just upload a pic, tap a filter, tweak the saturation, and post. If you want to make a photo look good without the instant gratification of the Reyes filter, enlist a professional. Or a really smart algorithm.

Researchers from MIT and Google recently showed off a machine learning algorithm capable of automatically retouching photos just like a professional photographer. Snap a photo and the neural network identifies exactly how to make it look better—increase contrast a smidge, tone down brightness, whatever—and applies the changes in about 20 milliseconds.

“That’s 50 times a second,” says Michael Gharbi, an MIT doctoral student and lead author of the paper. Gharbi’s algorithm transforms photos so fast you can see the edited version in the viewfinder before you snap the picture.

Gharbi started working with researchers from Google last year to explore how neural networks might learn to mimic specific photographic styles. It follows similar research completed by German researchers in 2015, who built a neural network that could imitate the styles of painters like Van Gogh and Picasso. The idea, Gharbi says, is to make it easier to produce professional-grade images without opening an editing app.

Think of the algorithm as an automatic filter, but with more nuance. Most filters apply editing techniques to the entire image, regardless of whether it needs it. Gharbi’s algorithm can pinpoint specific features within an image and apply the appropriate improvements. “Usually every pixel gets the same transformation,” he says. “It becomes more interesting when you have images that need to be retouched in specific areas.” The algorithm might learn, for example, to automatically brighten a face in a selfie with a sunny background. You could train the network to increase the saturation of water, or bump up the green in trees when it recognizes a landscape photo.
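The distinction is easy to picture in code. The toy NumPy sketch below is not the researchers' actual network—just an illustration under assumed names—contrasting a conventional filter, which transforms every pixel identically, with a content-aware retouch that adjusts only the pixels inside a mask (standing in for a region a learned model might flag, such as a face).

```python
import numpy as np

def global_filter(image, gain=1.2):
    # Conventional filter: every pixel gets the same transformation.
    return np.clip(image * gain, 0.0, 1.0)

def local_retouch(image, mask, gain=1.3):
    # Content-aware adjustment: only pixels inside the mask
    # (e.g. a detected face region) are brightened.
    out = image.copy()
    out[mask] = np.clip(out[mask] * gain, 0.0, 1.0)
    return out

# Example: brighten only one quadrant of a dummy image, standing in
# for a region a trained model might identify as needing retouching.
img = np.random.rand(64, 64, 3).astype(np.float32)
region = np.zeros(img.shape[:2], dtype=bool)
region[:32, :32] = True
result = local_retouch(img, region)
```

In the real system the per-region adjustments are learned from pairs of raw and professionally retouched photos rather than hard-coded, which is what lets the same model brighten a face in one shot and saturate water or foliage in another.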

READ MORE: https://www.wired.com/story/googles-new-algorithm-perfects-photos-before-you-even-take-them/