Innovation in protective bubbles

 •  Filed under digital markets, social media and journalism

Facebook encourages people to collect their "memories" in one place. A centralized repository can certainly be useful down the road, given that many people now use Facebook as something like a diary, blog, noticeboard, or photo book.

(Let's put data ownership issues aside and assume Facebook will not charge you to access your memories in the future. The data is valuable to the company as it is, but it could clearly be monetized even more aggressively...)

Facebook's "On This Day" feature brings up old photos ("memories") with an "X years ago" note. The stated goal of the product is to create "a new memories experience that packages your recent memories in a delightful way for you to enjoy and share".

People could always browse their old posts, of course, and Facebook justifies the innovation by saying that the "idea of a memory product on Facebook emerged to fill a need by helping people re-experience their posts without having to manually sift through their Timeline."

"Packaging memories"

Facebook explicitly says that it does not want you to see certain parts of your past. The feature is designed, we learn from the research team, to filter out things like "failed ventures":

There are certain types of posts that people would prefer not to see surfaced again as memories. For example, people might post about violence, accidents or failed ventures, which can be rather unpleasant, particularly to re-experience as memories years later. We’ve been working on ways to better identify these types of memories and filter them out, so that people see more enjoyable content instead. However, even with the considerable progress we’ve made, we know there will always be the possibility that positive memories that have become negative – or contamination sequences – slip through. For example, a post about a first day at a new job that has lots of positive reactions and comments may have become negative over time if the person has left that job. Thus, we learned we also needed to give people more control over the memories they see.
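
To make the mechanism concrete, here is a minimal sketch of what that kind of filtering might look like. Everything in it is my own illustration, not Facebook's system: the Post structure, the topic list, the sentiment scores, and the thresholds are all invented for the sake of the example.

```python
from dataclasses import dataclass

# Hypothetical illustration only: the Post structure, topic list, sentiment scores,
# and thresholds are invented for this post, not taken from Facebook.

UNPLEASANT_TOPICS = {"violence", "accident", "failed venture"}

@dataclass
class Post:
    text: str
    topics: set                # topics detected at posting time
    sentiment_then: float      # how positive the post looked when written (-1..1)
    sentiment_now: float       # how positive it looks today, given later signals
    user_hidden: bool = False  # the user asked never to resurface this post

def should_surface(post: Post) -> bool:
    """Decide whether a post may be shown again as a 'memory'."""
    if post.user_hidden:
        return False
    # Drop posts about inherently unpleasant topics.
    if post.topics & UNPLEASANT_TOPICS:
        return False
    # Drop "contamination sequences": positive at the time, negative in hindsight
    # (e.g. a cheerful first-day-at-work post after the person has left that job).
    if post.sentiment_then > 0.3 and post.sentiment_now < 0.0:
        return False
    return True

if __name__ == "__main__":
    first_day = Post("First day at the new job!", {"work"},
                     sentiment_then=0.9, sentiment_now=-0.4)
    print(should_surface(first_day))  # False: a positive memory that turned sour
```

The interesting design choice, and the one the quote hints at, is the second sentiment score: the system does not just judge the post as it was written, it re-judges it against what it knows about your life today.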

To be fair, Facebook's teams are not deciding on a whim what the algorithms should show. They apparently asked users what they would like to see.

Here's my concern: we may not "like" to see reminders of our past mistakes, but we also know painful lessons can be quite useful. Wouldn't we be better off learning from our mistakes?

Also, Facebook says that "there needs to be options to help people adjust On This Day so they can remove contamination sequences and other memories they might prefer not to see." It sounds like people from our past whom we would rather not think about are now basically infectious agents that must be erased...

The hashtag writes itself... #sanitizingthepast