That was effectively the only conversation missing from Facebook’s F8 conference — an event that even made time for a progress update on a moonshot project that might one day allow us to hear through our skin. Perhaps that’s the luxury of a company whose advertising revenue is up 57 percent year-over-year.
But don’t mistake unseen for unimportant. Behind machine intelligence “at scale” and cute Camera Effects lies a complete dependence on ad revenue. And barring a black swan event, Facebook’s expensive DARPA complex is only going to grow more dominant.
One in four engineers at Facebook actively uses its internal artificial intelligence platform. In that sense, the same technology that allows you to take a selfie with a virtual flower-crown at Coachella also allows the company to identify objects within public photos and videos on your News Feed.
As it becomes more difficult to cram ads onto Facebook without sacrificing the user experience, targeting, engagement and connectivity become the next natural priorities. This explains why Facebook’s ten-year roadmap highlights exactly that — connectivity to get more users on the platform and AI to boost engagement and targeting. In other words, machine intelligence is the future of monetization for Facebook.
The shrinking News Feed problem
In a way, Facebook’s platform is shrinking. Just a few years ago, almost every post a user made was text. But today, an agglomeration of pictures, videos, text and live content defines the platform. Without advances in computer vision and machine learning more broadly, Facebook would be handicapped by its own new features.
“Content is getting more complex,” Mark Rabkin, Facebook’s VP of engineering for ads, told me. But this complexity brings with it serious upside potential. It’s still very early days for understanding the human context that enables ad deliveries that lead to conversions, even though Facebook is the world’s second-largest digital advertiser behind Google and the meeting room where I met Rabkin was adorned with oversized champagne bottles.
Today, the company already knows a lot about you. Troves of metadata, the groups you join and the pages and posts you like, among many other things, have helped Facebook target ads as well as any company in the world.
Looking at the first three ads in my News Feed right now, I see WordPress (the platform I am currently typing on), DigitalGlobe (a company I have covered) and TechCrunch (not sure why I got this one tbh). But what’s missing from all of this is context, or as Rabkin calls it, “the why.” While all three of those ads are loosely relevant, I’m not really the target market for DigitalGlobe’s services, I don’t have time to keep a personal blog and if I don’t already know what TechCrunch is advertising to me then I should be fired.
“Just understanding time is huge,” added Rabkin. “We want to understand whether you’re interested in a certain thing generally or always. Certain things people do cyclically or weekly or at a specific time and it’s helpful to know how this ebbs and flows.”
Facebook’s core is its internal social graph. Mining new forms of content for information that can be added to this graph, and made available across the company, opens up new possibilities for understanding user behavior.
It’s like Warby Parker for Facebook
Computer vision gives computers reading glasses to interpret information from streams of images. The key to uncovering new patterns of user behavior is being able to effectively extract signal from sounds, language and images in harmony. It can help differentiate long-standing interests from mood-driven impulses. And particularly for the latter, being able to perform these computations in real time is critical.
Manohar Paluri and Merlyn Deng, leaders on Facebook’s applied computer vision team, have been working on this exact challenge. Their work isn’t for ads specifically, but it could soon find its way into the hands of other leaders at Facebook like Mark Rabkin and Andrew Bosworth.
Both Rabkin and Deng relayed the same underlying thesis to me about helping people discover content. To the Facebook ads team, discovery is critical because it not only puts the illusion of power and control in the hands of users but also assists with a task users organically want help with anyway. Of course, Facebook gets to collect a lot more information about users along the way, but conversions are generally a positive thing for everyone involved.
In practice, this means that you shouldn’t expect Facebook to start the next phase of its augmented advertising journey by doing anything abrupt. It seems the company would rather you not notice its AI efforts and merely benefit passively from more holistic back-end mappings of user preferences. This strategy differs a bit from that of companies like Pinterest, which are building dedicated discovery tools.
Instead, Facebook will go the way of Snap (yet again) and work to index its content more thoroughly. Snapchat Stories Search uses computer vision and other signals to index stories for search in near real time — although, to be fair, the idea itself is a completely obvious next step for any social network driving revenue growth with ads.
Flipping the switch
Once you’re able to index videos, pictures and live content in the same way as text, that “why” question Rabkin hinted at starts to become answerable. Facebook isn’t new to qualitative research. It notoriously worked side by side with researchers from Cornell and the University of California–San Francisco to explore the spread of emotional contagion through social networks — think feed manipulation to see what happens when someone sees depressing posts all day.
But from an ads perspective, Facebook might be able to answer questions like why more women than men finish a video ad while more men actually engage with it. Facebook would only grow more powerful, in control of valuable brand insights and best practices.
Facebook has executed many times over in applying machine learning to deliver engaging content. Its understanding of faces helped power features like Memories that drive shares. And its automatic video captioning feature increased video watch times by 12 percent.
The value that can be extracted from the core Facebook platform may be asymptotic, as many on Wall Street believe, but the company’s emphasis on machine intelligence pushes that asymptote considerably higher.