Thursday, March 12, 2009
Monday, February 23, 2009
On a related topic, check out this clip of two Brazilian street artists performing a call-and-response style rap called Repente. I first saw this performance on film at the Smithsonian in 2001 and the cadence stuck in my head. A few years later I was excited to hear this song by Punjabi MC go mainstream (with the help of Jay-Z). It may just be me, but the beat is clearly the same and the rap cadences are very similar. Fun to think about Brazilian street music and Indian hip-hop converging.
Wednesday, February 18, 2009
Monday, January 19, 2009
At lunch the other day Phil Wickham, CEO of the Kauffman Fellows Program, shared a bit of insight with me that has stuck in my head. Our conversation turned toward the economy, as most conversations do these days, and how VC firms and entrepreneurs are responding to the uncertainty and deceleration. As I talked about managing risk and developing alternative strategies, Phil came out and said that Silicon Valley culture is not compatible with the strategy of hedging risk.
To use the tried and true surfing analogy, entrepreneurs are like surfers. They are out there to catch a powerful wave (market trend) and ride it to rapid growth without wiping out - and the bigger the wave, the more powerful the ride and the more dangerous the wipeout. So what happens to a surfer who is not 100% focused on catching the wave, and is instead thinking about what happens if they fall, or about the other surfers in their line? They usually wipe out.
And it is the same with launching a start-up. Unless a team is 100% focused on catching that wave, they will likely fail.
So how does this apply when our industry is in the doldrums and people don’t see any big waves on the horizon? The strategy is the same. We may not be able to see the waves now, but they are out there, and when they do start rolling in teams need to be 100% focused, or they will wipe out trying. Getting a team to stay focused in an environment where sales are below plan, cash balances are shrinking, and friends are getting laid off is easier said than done. So what can you do? Quantify the risks, put together a new plan, and let the team know what to expect.
Getting back to the idea of hedging and distraction, the goal is not just to reduce the uncertainty, but to actually reduce the number of perceived choices. Too many choices create distraction, which creates unhappiness, which creates failure. So a manager who can reduce the energy spent contemplating choices will have a happier team that is focused on catching the wave and prepared for success. This lecture by Dan Gilbert, a psychology professor at Harvard, summarizes the point better than I can.
Over the last few months CEOs at my portfolio companies have welcomed the opportunity provided by the economic slowdown to re-evaluate and re-forecast. Heads were up, assumptions were questioned. But now it is time to put heads back down and prepare to ride that big wave when it rolls in!
Friday, July 18, 2008
The only "Boone" I have to pick with his presentation is the point he makes at the end about wind replacing natural gas in our electricity infrastructure. Wind cannot be used to replace natural gas turbines. Natural gas generators are so-called "peaker plants," meaning they are high cost relative to coal/hydro/nuclear and are only used during periods of peak demand. Wind electricity is similar to natural gas electricity in cost, but it CANNOT be spun up during periods of peak demand. Wind blows when it wants to blow. We cannot ask the wind to start blowing on a summer afternoon when the temperature peaks or when a power line goes out and the grid needs to be balanced.
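The dispatch argument above can be made concrete with a toy merit-order model. This is a hypothetical sketch - the capacities, demand figures, and the `dispatch` function itself are invented for illustration, not real grid data:

```python
# Toy merit-order dispatch: baseload runs first, wind contributes only
# whatever happens to be blowing, and gas peakers fill the remaining gap.
def dispatch(demand_mw, baseload_mw, wind_available_mw, peaker_capacity_mw):
    supplied = min(demand_mw, baseload_mw)
    # Wind cannot be "spun up" - we take only what nature provides.
    supplied += min(demand_mw - supplied, wind_available_mw)
    # Gas peakers are dispatchable: they ramp on command to close the gap.
    peakers_used = min(demand_mw - supplied, peaker_capacity_mw)
    supplied += peakers_used
    return supplied, peakers_used

# A hot, calm summer afternoon: demand spikes while the wind is quiet.
supplied, peakers = dispatch(demand_mw=1200, baseload_mw=800,
                             wind_available_mw=50, peaker_capacity_mw=400)
# It is the peakers (350 MW here), not the wind farm, that balance the grid.
```

Swap in a windy day and the peakers do less work - but only the peakers can be *asked* to do more.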
Yes, more wind. Yes, more natural gas for transportation. And thanks to T Boone Pickens for making this a compelling marketing message, but these are complex problems and we will all have to work together and strike compromises to make it happen. I hope we can do it!
Wednesday, July 09, 2008
Tuesday, October 23, 2007
Deep thinking about the business models that are truly consistent with a collaborative, attention-based society. The insight is grounded in sociological and historical observations. This is the thinking that completes the ideas discussed in "The Long Tail" and "The Tipping Point".
Monday, April 16, 2007
Well, turns out DoubleClick has a very important asset and it took the private equity guys to take advantage of the opportunity. So what does it mean?
I leave you with three important implications:
1) Competition - in the near term other ad serving platforms will benefit from the acquisition
- I believe 24/7, aQuantive, and RightMedia will be the subject of acquisition interest as MSN, AOL, and Yahoo react
- Publishers will consider moving away from Doubleclick
2) Google Strategy
- Google now has access to the largest publishers on the web including deeper connections to MySpace and AOL
- Google can leverage DoubleClick’s display ad serving technology to make display advertising accessible to small advertisers and publishers, similar to how they leveraged paid search to drive AdSense penetration among small publishers.
3) Valuation - DoubleClick’s estimated 2006 revenues were $150 million to $300 million. At $3.1B, this is a trailing twelve month revenue multiple of 10-20x.
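The arithmetic behind that range is straightforward:

```python
# Trailing revenue multiple = price / revenue, using the figures above.
price = 3.1e9                              # $3.1B acquisition price
revenue_low, revenue_high = 150e6, 300e6   # estimated 2006 revenue range

multiple_high = price / revenue_low    # low revenue estimate -> ~20.7x
multiple_low = price / revenue_high    # high revenue estimate -> ~10.3x
print(f"{multiple_low:.1f}x to {multiple_high:.1f}x")
```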
Wednesday, February 07, 2007
Given the flurry of new energy and funding going into companies attempting to improve web search, I thought I should spend some time trying to make sense of it all. From what I’ve seen most companies follow one of two approaches to improving search: 1) human edited solutions and 2) next generation algorithms. (Caveat, I’m focusing on companies improving the quality of search results for web search rather than the spate of companies improving UI or going after vertical search.)
THE FIRST GROUP includes companies such as Jimmy Wales’ Wikia, TextDigger, and Eureekster. They are often referred to as Social Search companies. The general idea is to let users determine which results are most relevant for a search query - sometimes through explicit voting, sometimes through more implicit behaviors such as tracking user clicks. If this sounds familiar, you're right. Yahoo started out with an army of editors. So did Ask. Ever heard of DMOZ? It's still alive. The difference today is that these editors have been open-sourced - de-centralized volunteers. Seeing what Wikipedia has accomplished, it is obvious that this strategy can lead to very powerful results. I have high hopes for this approach. Humans will always beat machines at the judgment natural language processing requires. Language, after all, is a human construct best interpreted by the humans who created the code in the first place. The brick wall that the first round of human edited search companies hit was cost. On today's web, if a company can effectively address issues of SPAM, there is a real shot at building a great search engine which adapts as quickly as our discourse adapts without bearing the burden of armies of editors.
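The implicit-voting flavor of this idea fits in a few lines. Everything here is hypothetical - the click log, the URLs, and the `rerank` function are invented to illustrate the mechanism, not any company's actual system:

```python
from collections import Counter

# Hypothetical log of which results past users clicked for each query.
click_log = {
    "jaguar": ["wikipedia.org/Jaguar", "jaguar.com", "wikipedia.org/Jaguar"],
}

def rerank(query, candidate_urls):
    """Float the most-clicked results to the top; ties keep engine order."""
    votes = Counter(click_log.get(query, []))
    return sorted(candidate_urls, key=lambda url: -votes[url])

results = rerank("jaguar",
                 ["jaguar.com", "wikipedia.org/Jaguar", "cars.example"])
# -> ["wikipedia.org/Jaguar", "jaguar.com", "cars.example"]
```

The hard part, as noted above, is keeping that log clean - a clickbot can stuff the ballot, which is exactly the SPAM problem these companies must solve.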
THE SECOND GROUP includes new companies such as Powerset and Hakia, and even Microsoft implies they have natural language processing with their Ms Dewey search engine, even if somewhat tongue in cheek. These companies all use some form of language processing to improve search, whether through advanced parsing of the search query, clustering of results, or relevance ranking of results. Sound familiar? AltaVista had, and still has, a great categorization tool, but it didn't help them avoid the spiral into irrelevance. Ask(Jeeves) touted the ability to input English language queries. Heard of MeaningMaster/Cognition? They're still around, but have yet to hit their stride despite years of effort. Danny Sullivan has a great history of natural language search processing here. Of course the main challenge with natural language processing is the algorithms. Processing is getting cheaper and AI is getting better every year. Now may be the time for a quantum leap in algorithmic processing, but I haven't seen any evidence yet. Remember, a great search engine should use a large number of matching technologies. Imagine what Google search would be like if they exclusively used PageRank and ignored word counts, taxonomies, and meta data. The point is natural language processing (NLP) is just one technique, and any search company that relies on NLP alone will really struggle to deliver the most relevant results. Two other issues I have with NLP: 1) the improvements to be had pertain to only a subset of searches. There are without doubt examples where NLP is advantaged (see Bambi Francisco's starry-eyed review of Powerset here), but I did a few tests of my own, and even a search like "What is Powerset doing?", which contains plenty of language ambiguity, yielded virtually the same results on Google as it did on Hakia. (The social search approach, btw, is valuable to a much wider set of search queries in my opinion.)
My second issue is probably more personal...it takes me longer to enter a natural language search query than a keyword query, e.g. "Who are the presidents of the
So which approach is better? It's likely both natural language processing and social search will be integrated into every search engine within 5 years. Until then, users will always gravitate toward solutions that provide obviously better search results within the first few queries. Given social search's applicability to a larger set of queries, and the proven existence of large and highly active volunteer communities, I would put my money on social search to drive the next successful up-start search engine.
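As an aside on the "large number of matching technologies" point above: a blended ranker is, at its simplest, a weighted combination of signals. The signal names, scores, and weights below are invented purely for illustration:

```python
def blended_score(signals, weights):
    """Weighted sum of per-document relevance signals."""
    return sum(weights[name] * value for name, value in signals.items())

# One document scored against a query on four hypothetical signals.
doc_signals = {"link_rank": 0.9, "word_match": 0.7,
               "taxonomy": 0.4, "nlp_parse": 0.2}
weights = {"link_rank": 0.4, "word_match": 0.3,
           "taxonomy": 0.2, "nlp_parse": 0.1}
score = blended_score(doc_signals, weights)
# Relying on nlp_parse alone would discard most of the available evidence.
```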
Footnote: According to Jim Armstrong, who really should have started a blog by now, there are bigger search problems to fry: search UI and information discovery, and integrating data from multiple sources (desktop, enterprise server, and web) into a single search interface. I’ll leave it to Jim to create a blog entry to address this issue.