Monday, March 19, 2012

Fortuitous Discovery


A friend of mine was searching through his contact list for a colleague who happened to share his first name (Seth). As he searched for this contact, he noticed how many of his contacts shared that name, which is only moderately common in the US. His curiosity got the better of him, and he looked through his entire contact list to count the ones named Seth.

I bring this up today because I am concerned that we are getting too good at finding exactly the information we need, quickly and precisely. We now have powerful search engines, business analytics, and filters. Companies have comprehensive profiles of us and sophisticated algorithms that figure out exactly where we want to go next, or what would be a perfect match for what we are looking for. How could this possibly be bad?

Well, as in the example I described above, sometimes we find something even better through fortuitous, random, unexplainable happenstance.  If he had found the contact immediately, he wouldn't have gone on his "Seth" quest.  If we always get right to where we are going, when will we take the road less traveled, the unfamiliar fork in the road, the thorny thicket, etc?  This will make us very efficient but not nearly as creative.  It is the random factor that gives humans the advantage over computers.  It is what gives us humor and poetry.  It is what makes us innovative and artistic. 

Take, for example, the announcement this week that Britannica is discontinuing the print edition of its eponymous encyclopedia. I am not sure how many of you remember flipping through encyclopedias to find a topic we were assigned in 4th grade social studies for a report. While we were on our way to "the chief agricultural products of sub-Saharan Africa," we had to flip through half of the A section to find Africa, some of the S section to find Saharan, and maybe back to A to find agriculture. Maybe we looked through E to find exports. Along the way, the animal lover might have been attracted by the pages on elephants and antelopes. The scientist might have been distracted by etymology and astronomy. The athlete stopped at Ebbets Field and the Astrodome. Who knows?

My point is simply that this wouldn't happen when the target is immediately available. I wonder what is lost . . .

Friday, March 09, 2012

Veiled Viral Marketing

Now that I follow some really well-informed companies, consultants, and trendwatchers, it is rare that a new business strategy hits me over the head with surprise. I may not be able to take advantage of the new business ideas bouncing around the innovation and entrepreneurship domains, but I usually hear about them pretty early.
 
OK, that said, here is a new one that is just now being experimented with. I think it has potential, but it definitely needs some work.

The best names I have heard for this strategy so far are "veiled viral marketing" and "anonymous friend recommendations." I am sure all of you have received typical commercial recommendations. Based on all the tracking that companies like Facebook and Google do, they can show you ads for something you are more likely to be interested in than something totally random. Sometimes they are on target and sometimes they are way, way off. But on average, targeting works better than most of the other options out there. "People like you have rated this movie 5 stars." "Your friends have given this book a thumbs up." You have also received the more direct social recommendations: "John Franken just bought a new pair of Nikes." "Sally Stansfield shared the NYT article 'Romney wins Super Tuesday.'"

The idea behind this new strategy is to come in somewhere in the middle. Let's say you want to make a recommendation to a friend but don't want them to know the recommendation comes from you. What if Facebook tells you, "A friend who prefers to remain anonymous personally recommends the movie 'The Lorax' to you"?

I am pushing the envelope with a movie because there are few reasons someone would want to remain anonymous there. But one application is for sensitive information. What if you received a private message on Facebook that said, "One of your friends, who would prefer to remain anonymous, would like you to visit the site 'How to talk to a gay friend'"? And of course there would be a link to the site. Perhaps a friend isn't ready to come out of the closet yet and wants to make sure you can handle it before telling you. The same service would work for serious diseases as well: "One of your friends, who would prefer to remain anonymous, would like you to visit the site 'How to talk to a friend with cancer.'"
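For concreteness, here is a minimal sketch of how a platform might deliver one of these veiled recommendations. The class and function names are hypothetical, not any real Facebook API; the key design point is simply that the sender's identity is kept in the platform's private records and stripped from the message the recipient sees.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    sender_id: str     # known only to the platform, never shown to the recipient
    recipient_id: str
    title: str
    url: str

def render_for_recipient(rec: Recommendation) -> dict:
    """Build the message the recipient sees; the sender is deliberately omitted."""
    return {
        "to": rec.recipient_id,
        "text": ('One of your friends, who would prefer to remain anonymous, '
                 f'would like you to visit the site "{rec.title}".'),
        "link": rec.url,
        # sender_id stays in the platform's private records (e.g. for handling
        # abuse reports) and never appears in the rendered message.
    }

if __name__ == "__main__":
    rec = Recommendation(
        sender_id="user_123",
        recipient_id="user_456",
        title="How to talk to a friend with cancer",
        url="https://example.com/how-to-talk",
    )
    print(render_for_recipient(rec))
```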

The other example I have seen I am less sure of, but maybe it could work if it were designed to be a little friendlier. What if you received a private message on Facebook that said, "One of your friends, who prefers to remain anonymous, thinks you would look great in a J Crew navy blue rugby shirt"? And of course it has the embedded link to the J Crew site.

The one that I saw was for a support bra, but that one is over the line if you ask me. Hey, you can’t fault them for thinking outside the box. How many times have you really wanted to make that kind of recommendation but didn’t have the chutzpah?  But I am not sure the receiver would appreciate it very much.

Fast Follower or Infringement??

(Note - I have to give credit to Business Week for finding this, but you have to be a subscriber to see the article online, so I can't link to it.)

Here is a business model that is right on the gray line between intellectual property infringement and fast follower. There is a German company called Bamarang that is a pure fast follower. A few months after a company called Fab.com launched in New York City (it is a flash-sale site for vintage accessories like vinyl records, Bauhaus posters, etc.), Bamarang came out with an identical business in Germany. And as a web company, it is just as accessible to customers all over the world. What makes their "cloning" a little over the top is that the color scheme of the site, the layout, the typeface, and even the photos on the home page look exactly the same as Fab's. There was once a lawsuit between Apple and Microsoft over the similar look and feel of their graphical user interfaces. But apparently Bamarang knows how to skirt these laws by registering only in Germany.

Bamarang's founders have been at this for a while: they cloned eBay back in 1999. Since then, they have cloned eHarmony, Groupon, Facebook, Zappos, Airbnb, Pinterest, and a hundred other web-based companies. They don't need to spend a penny on R&D; they just copy and paste. They create the company, build up some sales, and then sell it off to VCs for hundreds of millions of euros. For example, their Groupon clone was the top deal-of-the-day site in half of Europe. Because they know how to get around international copyright and trademark laws (they don't register in the US, the home country of the companies they are cloning), there is little the US companies can do. Groupon threw in the towel by buying the clone for shares currently worth about $1 billion. Not bad for a simple copy-and-paste job.

One last note – would you believe that the founders are brothers whose parents are corporate lawyers?

Innovation in Social Recommendation Systems

Do you know anyone who has so much expertise in a particular domain that you would trust their recommendations without feeling any need to do research of your own? In fashion? In music? Phones/tablets? Cars?

Here is the business model (the article, in the March 2012 edition of Entrepreneur magazine, will be available online March 17). The trusted experts create an online portfolio of products that they continually curate, depending on how fast the domain changes. If they are music experts, they might keep adding to their "music collection." If they are smartphone experts, they might delete the old models as they add the new ones. For cars, perhaps they have specific recommendations for different kinds of friends (families, singles, price ranges, etc.). The idea is that they create this portfolio on a special web site designed for this purpose (StyleOwner.com), and each item is linked to the company that sells it. They can link through Facebook, Twitter, or LinkedIn so that their connections can see their "clothes closet" or "driveway." They don't actually have to own the product, just find it worthy of their personal recommendation. If any of their friends purchases the item, the friend-expert gets a cut, StyleOwner gets a cut, and the store gets the rest.
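To make the money flow concrete, here is a quick back-of-the-envelope sketch of that three-way split. The article does not state the actual percentages, so the 10%/10% cuts below are placeholders for illustration only.

```python
def split_sale(price: float, expert_cut: float = 0.10, platform_cut: float = 0.10) -> dict:
    """Divide one sale among the friend-expert, StyleOwner, and the store.

    The cut percentages are assumed for illustration; the real terms
    are not given in the article.
    """
    expert_share = round(price * expert_cut, 2)
    platform_share = round(price * platform_cut, 2)
    store_share = round(price - expert_share - platform_share, 2)
    return {
        "friend_expert": expert_share,
        "styleowner": platform_share,
        "store": store_share,
    }

if __name__ == "__main__":
    # A friend buys an $80 rugby shirt straight from the expert's portfolio.
    print(split_sale(80.00))
    # -> {'friend_expert': 8.0, 'styleowner': 8.0, 'store': 64.0}
```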

This business model relies on the friend-to-friend relationship. It takes trust, because if a friend sees the recommendation, does independent research, and then goes directly to the store, StyleOwner gets zilch and the friend-expert just gets a thank you. The trust has to be so strong that you are willing to buy the item right then and there, purely on the strength of the recommendation. Are there friends you trust that much? For wine? Movies? Restaurant reservations? The possibilities are endless, but only for product categories where trust is all you need. Having a central location to put them all is helpful, but no guarantee of success.

Friday, March 02, 2012

Ethical Consequences of Augmented Cognition

A very hot topic in the Human Factors discipline today is augmented cognition. There are so many ways we can enhance sensation, perception, memory, attention, decision making, psychomotor coordination, and more. The military was the original customer for this kind of technology and is still the biggest. But now the mobile web is enabling an incredible assortment of "augcog."

I recently read an article in the Atlantic on military uses of augcog that got me thinking. The article was about the ethics of augmenting soldiers. Right now, it is against the Geneva Conventions to keep military prisoners awake for extended periods. But if we can engineer soldiers not to need sleep (using the genes that let dolphins sleep one half of their brain at a time while the other half makes sure they surface for air), does it then become ethical to deprive them of it? If we engineer them not to feel fear (by manipulating genes in the amygdala), would waterboarding become ethical, since its only real consequence is making them think they are going to drown? What if we engineer them not to feel pain by dulling the pain receptors in the brain? Are other kinds of torture now ethical?

And then there are the ethics of widening the digital divide. What if high-end smartphones become so powerful that they give their owners all kinds of benefits not available to others? Those owners could use these advantages to widen an income gap that is already too big.

Does our discipline have a duty to put an equal amount of time and resources into designing significant benefits into low-end technology to mitigate the widening of the digital divide? We can't guarantee equal outcomes, but should we at least try for equal opportunities? Technology can be a powerful source of differentiation, but it can also be a powerful equalizer. Is there an ethical duty that if we do the former, we also do the latter?