This Software Can Virtually Age a Baby 80 Years

The future of missing person photos is being led by a new algorithm and Google image search.

Image: University of Washington

Life might be imitating art, or research labs might be imitating apps from four years ago, but University of Washington computer scientists are working to raise the bar when it comes to aging children in photographs.

Gif by Dan Stuckey. Image: University of Washington

The first thing that came to mind while reading about their work was Lindsay Bluth's photo-enhancing service, “Mommy What Will I Look Like,” which was supposed to give new parents a glimpse of what their infants would look like five decades later, but had the unfortunate habit of depicting those babies after time had really done a number on them.

The University of Washington team's explanation quickly saps the fun out of the project by pointing out that they're not trying to disrupt In20years.com, but rather to work faster and more accurately than the forensic artists who aid police searching for missing children. Currently, the most commonly used software requires that the subject be at least six years old in order to work. The University of Washington's software can visualize what a one-year-old will look like when he or she turns 80.

Gif by Dan Stuckey. Image: University of Washington

It's pretty interesting how they worked out an algorithm capable of handling photographs taken under any conditions: the software can work with any image, not just one taken straight-on with a neutral expression. It began, as so many things do, with a Google Image search:

"To analyze aging effects we created a large dataset of people at different ages, using Google image search queries like 'Age 25,' '1st grade portrait,' and so forth. We additionally drew from science competitions, soccer teams, beauty contests, and other websites that included age/grade information. The total number of photos in the dataset is 40K and each cluster includes, on average, 1,500 photos of different people in the same age range. This database captures people 'in the wild' and spans a large range of ages."
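The core idea in that passage, scraping roughly 40,000 photos and grouping them into age-range clusters averaging about 1,500 photos each, can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual pipeline; the records, filenames, and bin size are invented for the example.

```python
from collections import defaultdict

# Hypothetical (filename, age) records, as might come back from
# search queries like "Age 25" or "1st grade portrait".
photos = [
    ("a.jpg", 1), ("b.jpg", 2), ("c.jpg", 25),
    ("d.jpg", 27), ("e.jpg", 80),
]

def bucket_by_age(records, bin_size=5):
    """Group photos into coarse age-range clusters, keyed by (low, high) age."""
    clusters = defaultdict(list)
    for name, age in records:
        lo = (age // bin_size) * bin_size
        clusters[(lo, lo + bin_size - 1)].append(name)
    return dict(clusters)

buckets = bucket_by_age(photos)
# One-year-olds and two-year-olds land in the (0, 4) cluster,
# the mid-twenties photos in (25, 29), and so on.
```

In practice each cluster would feed into per-age-range appearance statistics, which is what lets the software extrapolate a face across decades.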

Then, to fine-tune, they took to Amazon's Mechanical Turk crowdsourcing marketplace to have people compare the lab's results against actual photographs of people and say which they felt was more realistic.
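That evaluation step, showing raters a synthesized rendering next to a real photo and asking which looks more realistic, boils down to tallying pairwise votes. Here's a minimal sketch of how such votes might be aggregated; the vote data, labels, and function are invented for illustration and are not the study's actual protocol.

```python
from collections import Counter

# Hypothetical crowdsourced votes: for each subject, a rater picked
# which rendering ("synthesized" or "actual") looked more realistic.
votes = [
    ("subj1", "synthesized"), ("subj1", "actual"), ("subj1", "synthesized"),
    ("subj2", "actual"), ("subj2", "actual"),
]

def preference_rate(all_votes, option="synthesized"):
    """Fraction of votes per subject that favored `option`."""
    totals, wins = Counter(), Counter()
    for subject, choice in all_votes:
        totals[subject] += 1
        if choice == option:
            wins[subject] += 1
    return {s: wins[s] / totals[s] for s in totals}

rates = preference_rate(votes)
```

A synthesized image fooling raters about half the time would suggest it is roughly indistinguishable from a real photograph, which is the bar this kind of study aims for.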

Gif by Dan Stuckey. Image: University of Washington

Obviously there's still work to be done; hair colors and styles need to be considered, for instance. But as far as I can tell the results look pretty good, in that they resemble how an elderly person might actually look, as opposed to apps that only give photos the unconvincing Winona Ryder-in-Edward-Scissorhands-old-age-makeup veneer. Oddly, if I had a critique, it'd be that at this point the visualizations look a little too much like the subject, as compared to actual photos, seen below.

Software is on the left, actual photo is on the right. Image: University of Washington

Obviously, though, that's nitpicking.