The hottest new thing in recommendations: The "Why?"

By Erin Griffith, written on August 9, 2013

From The News Desk

Call it a filter bubble, call it personalization overload, call it the Amazon effect. Recommendation engines, powered by algorithms, have a serious impact on the way we navigate the Web.

The first generation of the Web brought the world to our fingertips, but it turns out the world is a big place. This generation is about distilling what we really want, even if we don't know we want it.

We need recommendation tools to sort through all the noise online, so we have Pulse, Bloglovin, or Flipboard to recommend news articles to read. We have shopping sites like Wanelo, Svpply, Pickie, and even Pinterest to curate shoppable items. For music, we have Songza and Soundwave to recommend playlists and new artists. Foursquare recommends the best bars nearby, Sosh and Stash recommend the best weekend activities, and Jukely, Songkick, Thrillcall, and BandsInTown curate concert listings for you.

Even our junk mail from J Crew makes recommendations based on what the company thinks is our favorite color.

The few times an algorithm serendipitously makes our lives easier, it's great. For example, Amazon, the king of the recommendation, has suggested several perfect gifts for my boyfriend's quirky brother based on our past purchases for him. (The latest perfect gift being this ridiculous robe.)

The problem is that most recommendation engines don't work as well as Amazon's. With the rest, we constantly have to wonder whether they're even answering a question we're asking.

How often are we shown a pair of shoes to buy, or a news article to read, or an activity to do, and think, "No! Why did (service in question) think I'd like this? ShoeDazzle/Foursquare/SongKick/Pickie doesn't know my taste at all." The trust in that service, no matter how hard-won, is immediately lost.

My favorite example of bad algorithms is Gmail's early ads. At first it felt invasive -- Google's robots are reading my emails! -- but then it just seemed bizarre. I'd send an "I miss you!" email to an old friend and get ads offering to "find help for my depression." I'd mention a crazy party and get ads for rehab centers. There was a loose connection, but it was just wrong enough to be comical, when it wasn't totally off-putting.

Big data and algorithms can only take you so far -- the web needs a human touch to make effective recommendations. That's why curation was such a hot buzzword last year.

Businesses based on taste, human curation, and aesthetics, from Fab and Sulia to Birchbox and Songza, took off because they have the one thing algorithms don't: a point of view. I'm more likely to buy some expensive thing Birchbox sends me because I trust its editors' recommendations. I'm more likely to go to Fab first for home goods I don't need, because I know the items on its site have been curated with a certain aesthetic in mind.

Spotify is well aware of this lesson. The company's most persistent product-related knock is its lack of curation. Users open the program to stare down the barrel of a search bar, having no idea what they want to listen to. The auditory version of performance anxiety.

At that point they might jump over to Pandora's lean-back radio experience. Pandora's whole business model is based on algorithmic recommendations. That has its own limitations, but the company smartly recognized that users want to understand their recommendations. So Pandora's radio stations (on the Web, at least) have an overlay that explains the types of songs they'll be playing and why:

From here on out we'll be exploring other songs and artists that have musical qualities similar to Led Zeppelin. This track, "Come Together" by The Beatles, has similar blues influences, great lyrics, repetitive melodic phrasing, extensive vamping and minor key tonality.
With this, Pandora has invested in making its algorithm as transparent as possible. It goes a long way in building trust with users. Now whenever I get that creeping thought, "No! Why would Pandora think I'd like this?" Pandora has an answer. Because of extensive vamping, obviously!
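The matching behind an explanation like that can be pictured as songs annotated with musical attributes, where the attributes two songs share become the "why." Here's a toy sketch of that idea; the songs, attributes, and matching rule are invented for illustration and are not Pandora's actual system:

```python
# Toy sketch of attribute-based song matching: each song carries a set of
# hand-annotated musical attributes, and a recommendation is explained by
# the attributes the seed song and the pick have in common.
# All data here is illustrative, not real Pandora annotations.

SONGS = {
    "Led Zeppelin - Whole Lotta Love": {
        "blues influences", "repetitive melodic phrasing",
        "extensive vamping", "minor key tonality",
    },
    "The Beatles - Come Together": {
        "blues influences", "great lyrics", "repetitive melodic phrasing",
        "extensive vamping", "minor key tonality",
    },
    "ABBA - Dancing Queen": {
        "major key tonality", "disco beat", "great lyrics",
    },
}

def recommend(seed, catalog):
    """Return the song sharing the most attributes with `seed`, plus those attributes."""
    seed_attrs = catalog[seed]
    best, shared = None, set()
    for song, attrs in catalog.items():
        if song == seed:
            continue
        overlap = seed_attrs & attrs  # attributes the two songs share
        if len(overlap) > len(shared):
            best, shared = song, overlap
    return best, shared

song, why = recommend("Led Zeppelin - Whole Lotta Love", SONGS)
print(f"Try {song}: it has {', '.join(sorted(why))}.")
```

The point is that the explanation falls out of the match itself: the shared attributes that scored the recommendation are the same ones shown to the listener.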

Notably, a human had to write that -- it's not a very scalable or "lean" approach to recommendations. That's the obvious problem with any human curation -- it is expensive and limited in its scale. Amazon would not be Amazon if it had relied on humans for its recommendations and merchandising. No one can parse that much inventory.

That's why "why" is a good middle ground. The algorithm pulls the recommendation, and the explanation humanizes it.

In December, Spotify took a big step in that direction, too. The company unveiled an extensive new discovery tab, which recommends new and old songs to listen to. In the months since, it has become my main method of discovering new music. I realized that it's because of the context.


You listened to J. Roddy Walston and the Business. Here's a song you might like. (By a similar, lesser known artist.)

People who listen to Father John Misty are also listening to The Love Language.

It's been a while since you listened to Desaparecidos. Play now?

And so on. With every recommendation, I go in knowing why Spotify thinks I'd be into it, and I have valuable context. That not only makes me more likely to enjoy the recommendation, it also makes me more forgiving when I don't. My mental reaction to a miss is usually, "Bad tip, but I understand why you'd think I'd like that. Let's try again!"
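Explanations like the ones above lend themselves to a simple pattern: the recommender emits a machine-readable reason code, and a template turns it into a human sentence. Here's a minimal sketch of that pattern; the reason codes and template wording are invented for illustration, not Spotify's actual format:

```python
# Minimal sketch of the "algorithm picks, template explains" pattern:
# a recommender emits (reason, seed, item) tuples, and the UI renders
# each one through a human-written template.
# Reason codes and wording are invented for illustration.

TEMPLATES = {
    "similar_artist": "You listened to {seed}. Here's a song you might like, by a similar artist.",
    "also_listening": "People who listen to {seed} are also listening to {item}.",
    "lapsed":         "It's been a while since you listened to {item}. Play now?",
}

def explain(reason, seed=None, item=None):
    """Turn a machine-generated reason code into a human-readable sentence."""
    return TEMPLATES[reason].format(seed=seed, item=item)

print(explain("also_listening", seed="Father John Misty", item="The Love Language"))
print(explain("lapsed", item="Desaparecidos"))
```

The templates are the scalable compromise the article is pointing at: a human writes each sentence once, and the algorithm fills it in millions of times.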

Foursquare's Explore feature has embraced the "why" as well. Each of the top results has a reason for its inclusion. Four of your friends have been to this restaurant. This place is on three lists. This bar is popular on weeknights. This neighborhood is known for its Indian food. Your friend Mary P. left a tip at this restaurant.

This week Facebook also embraced the "why," revealing to users the science behind its News Feed algorithm. The company explained that each time a user visits the site, Facebook chooses from roughly 1,500 candidate stories to populate the News Feed. The site curates those stories based on how often you interact with the poster on Facebook, as well as what kind of content it is (photos are prioritized, for example).
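The ranking logic as described boils down to two signals: affinity with the poster and a boost by content type. A rough sketch of that idea might look like this; the weights, field names, and scoring formula are invented for illustration, not Facebook's actual algorithm:

```python
# Rough sketch of feed ranking as the article describes it: score each
# candidate story by how often the viewer interacts with its poster,
# weighted by content type (photos prioritized), then keep the top few.
# The weights and field names are invented for illustration.

TYPE_WEIGHT = {"photo": 2.0, "link": 1.0, "status": 0.8}

def rank_feed(stories, interaction_counts, top_n=10):
    """Order candidate stories by a simple affinity-times-content-type score."""
    def score(story):
        affinity = interaction_counts.get(story["poster"], 0)
        return affinity * TYPE_WEIGHT.get(story["type"], 1.0)
    return sorted(stories, key=score, reverse=True)[:top_n]

stories = [
    {"poster": "mary", "type": "status"},
    {"poster": "mary", "type": "photo"},
    {"poster": "acme_page", "type": "link"},
]
interactions = {"mary": 12, "acme_page": 1}  # how often the viewer engages with each poster
print(rank_feed(stories, interactions, top_n=2))
```

Even this toy version shows why the real thing resists a simple explanation: the final ordering depends on interacting signals, not a single rule a user can see.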

It's not the first time Facebook has tried to explain its algorithm to users and small businesses with Pages. As Quartz pointed out, Facebook has very explicitly been spelling this stuff out for months, with Help pages, a sister site for agencies, and press events. The problem is, it's not simple -- no matter how often Facebook describes it, the choosing still happens behind closed doors, and we'll never know exactly why each item in our News Feed was chosen.

Meanwhile, Spotify, Pandora, Foursquare and others are building up our trust by just being straight with us on each and every recommendation: "Here is why we showed you this." No room for confusion.

Sure, there is a slight feeling of magic when a secretive algorithm works. It's perfect! How did they know? By explaining that magic each and every time -- here's how we knew -- the magic is obviously lost. Knowing how the recommendation sausage is made demystifies the app.

It's like when I finally learned how to make cocktails -- once I understood how simple it was, the excitement I had previously attached to ordering, say, an Old Fashioned, was gone. I suddenly found them less interesting. It's only, like, three ingredients. 

Understanding how Foursquare powers its killer restaurant recommendations, or how Spotify finds its smart song recommendations, does make that process feel slightly less "magical," if you're the type of person who uses that word to describe apps.

But it's worth giving up a little magic to earn a little trust. If you know why an algorithm recommended something to you, you're more forgiving when you get a dud. And -- bonus! -- people will stop asking you how the hell your algorithm works. If any tech company needs more trust, it's Facebook. This week was another step in that direction. Time to make it even more transparent.

Illustration by Hallie Bateman for PandoDaily