Three new mobile apps from Google let users create on-the-fly photo and video effects with the help of advanced hardware and computer vision algorithms. Google describes its Storyboard, Selfissimo!, and Scrubbies apps as the first in a series of "appsperiments" to help guide future imaging innovations.
Available for Android devices only, Storyboard automatically transforms mobile videos into stylized comic layouts. Meanwhile, Selfissimo! lets both Android and iOS users capture multiple black-and-white selfies whenever they strike poses during photo sessions. The iOS-based Scrubbies allows users to create and save video loops by "scratching" video playback much in the same way a DJ scratches a vinyl record.
All three Google apps are just the latest indicators of how advanced technologies are making it easier for ordinary users to manipulate photo and video content in a variety of ways. While Google says its new apps are designed for fun, other technologies are further blurring the line between real and fake content.
Just yesterday, for example, Motherboard published a feature story describing how a pseudonymous Redditor is using open source artificial intelligence technologies to create fake pornographic videos using the faces of famous female actors and entertainers.
'Radically New Creative Applications'
Google's new appsperiments provide a taste of the imaging capabilities that are becoming possible with next-generation mobile cameras, according to a post published yesterday on the Google Research blog.
"Each of the world's approximately two billion smartphone owners is carrying a camera capable of capturing photos and video of a tonal richness and quality unimaginable even five years ago," Google interaction researcher Alex Kauffmann who wrote in the blog post. "Until recently, those cameras behaved mostly as optical sensors, capturing light and operating on the resulting image's pixels. The next generation of cameras, however, will have the capability to blend hardware and computer vision algorithms that also operate as well on an image's semantic content, enabling radically new creative mobile photo and video applications."
Kauffmann said the new appsperiments build on Google's success with a previous iOS- and Android-based app called Motion Stills. First released in 2016, Motion Stills lets users quickly and easily create short looping GIFs and videos with a single tap on their smartphone cameras.
Based on the positive reception for Motion Stills, Google is now experimenting with new photo and video apps to "help guide some of the technology we develop next," Kauffmann said.
Threat of 'Adversarial Deep Learning'
Many technology companies are working to develop similar photo and video apps, as well as new AI-based technologies that make it ever easier for ordinary users to manipulate content in creative or persuasive ways.
Last year, for instance, Facebook unveiled an app built on its Caffe2Go deep-learning system that lets iOS and Android device users transform videos with artwork-inspired special effects in real time. The company said the technology was part of its wider plan to advance applications for AI, connectivity and virtual reality in the future.
Such technologies have already reached the point where they enable ordinary users to create convincing fake photos and videos. This is raising new questions about how people will be able to separate fact from fiction in the future.
In its story, Motherboard described how a Redditor who goes by the handle "deepfakes" has created believable-looking fake porn videos using machine learning to paste female celebrities' faces onto porn actresses' bodies.
On Twitter, AI expert Alex J. Champandard, who was interviewed for the article, described such applications as "adversarial deep learning."
"I've been wrestling with the implications of this for a few days, since I was asked to comment on the story," Champandard tweeted yesterday. "This needs to be a very loud and public debate, so *everyone* understands that all videos should be suspected of being fake."