Over the past few years, Snapchat’s growing collection of Lenses has been among the best examples of smartphone-powered augmented reality, letting users effortlessly add effects to their photos. Now parent company Snap is enabling creators to use self-provided machine learning models in Lenses, hoping the initiative will inspire partnerships between ML developers and creatives.
Today’s key change is an update to Lens Studio, the free desktop development app used to create most of Snapchat’s AR filters. A new feature called SnapML will let developers import machine learning models to power Lenses, expanding the range of real-world objects and body parts Snapchat can instantly identify. As an example of the technology, Lens Studio will