FILTERVILLE
When it first launched, the popular smartphone app Snapchat had a less-than-desirable reputation.
People assumed a service that let a demographic of mostly young people send and receive photos that disappeared after a few seconds was begging for trouble. However, the app seemingly shed that status when it introduced its popular Snapchat lenses (or filters, as they're colloquially known), which turn users' faces into tongue-wagging puppies, give them glasses or top their heads with whimsical crowns of flowers.
Vox recently took to YouTube to explain how the technology works, and it's fascinating.
Snapchat's path to lenses began in 2015, when it acquired the Ukrainian startup Looksery for a cool $150 million. While no one at the company spoke to Vox, Looksery's publicly available patents allowed the Vox team to dissect what makes the filters run.
The first step is detection: the program scans the image and looks for contrast, calculating the differences between light and dark areas of the face. The algorithm does best when the face is forward-facing and still, so it can scan for facial features using contact points from an average face, built from manually marked data on hundreds of faces.
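To make the contrast-scanning idea concrete, here is a minimal sketch using OpenCV's Haar-cascade face detector, a classic contrast-based approach in the same spirit as the step Vox describes. It is an illustration, not Snapchat's or Looksery's actual pipeline, and the "selfie.jpg" filename is a made-up placeholder.

```python
# Sketch: contrast-based face detection with OpenCV's bundled Haar cascade.
# This stands in for the proprietary detector described in the article.
import cv2

# The cascade was trained on many example faces and looks for
# characteristic light/dark contrast patterns in a grayscale image.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

def detect_faces(image_path):
    """Return bounding boxes (x, y, w, h) for frontal faces in an image."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # contrast is computed on grayscale
    # scaleFactor and minNeighbors trade detection speed against false positives.
    return detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

boxes = detect_faces("selfie.jpg")  # hypothetical input file
print(f"Found {len(boxes)} face(s): {list(boxes)}")
```

As the article notes, detectors like this work best on a forward-facing, still face, which is why the lenses ask you to look straight at the camera.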
Once those contact points are established, the model adjusts to a custom fit based on areas of contrast and facial features, creating a 3-D mask that moves with the person, rotating and scaling along with the face so it stays aligned with the video. From there it can distort, morph, trigger animations and even face swap, the last of which enthralled the internet for a good month.
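As a rough illustration of that fitting step, the NumPy sketch below aligns a made-up "average face" landmark template to a set of detected points with a best-fit rotation, scale and translation; this is the general idea of warping a template onto a specific face, with invented coordinates rather than anything from Snapchat's model.

```python
# Sketch: fit an "average face" landmark template to detected points
# using a least-squares similarity transform (rotation + scale + translation).
import numpy as np

def fit_similarity(template, observed):
    """Best-fit scale, rotation and translation mapping template points onto observed points."""
    t_mean, o_mean = template.mean(axis=0), observed.mean(axis=0)
    T, O = template - t_mean, observed - o_mean
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, S, Vt = np.linalg.svd(T.T @ O)
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = (U @ Vt).T
    scale = S.sum() / (T ** 2).sum()
    translation = o_mean - scale * t_mean @ R.T
    return scale, R, translation

# Toy 2-D landmarks in image coordinates (y grows downward):
# eyes, nose tip, mouth corners. Purely illustrative values.
average_face = np.array([[-1.0, -1.0], [1.0, -1.0], [0.0, 0.0], [-0.8, 1.0], [0.8, 1.0]])
detected     = np.array([[120, 80], [160, 82], [141, 101], [124, 122], [158, 120]], float)

s, R, t = fit_similarity(average_face, detected)
fitted = s * average_face @ R.T + t   # template warped onto the detected face
print(np.round(fitted, 1))
```

Once the template is locked onto the face like this frame after frame, graphics such as puppy ears or flower crowns can simply ride along with the fitted mask.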
According to Vox, the components of facial detection software aren't new; the innovation is running them on live video on a handheld smart device, processing that hasn't been realistic until recently.
What do you think of facial detection software? Is it all fun and games, or does it worry you? Take our poll.