Roads to R&D success – part 1

Large corporations have a serious problem.  We have talked about this before (see Is M&A the only way to grow, R&D Effectiveness, and Technology innovation).

It’s been brewing for years, some say decades. Successful companies generate lots of cash, but investing in current lines of business seldom propels corporations into new markets.

So what can they do?

  • Buy startups – yes, acquisitions can move corporations into new markets and bring in new technology and perhaps even a functioning product.  However, acquirers often end up paying for unproven technology, organizations that don’t mesh with their own, and ROI projections that never materialize.
  • Invest internally – yes, they can certainly start new projects, give them resources and let them run their course.  However, most internal project teams are burdened with higher overhead, demands that everything function perfectly, and loftier justification hurdles than any startup faces.

Another approach, trumpeted by Cisco and others in recent years, is the spin-out/spin-in, which is probably a little of both.  Here a company provides funding, developers and even IP to an entity spun out of the company.  The spin-out is dedicated to producing a product for a designated new market and then, if goals are met, can be spun back into the company at a high but fair price.

The most recent example is Cisco’s spin-in Insieme, which is going after SDN and OpenFlow networking, but Cisco’s prior success with Andiamo and its FC SAN technology is another.  GE, Intel and others have also tried this approach, with somewhat less success.

Corporate R&D today

Most companies have engineering departments with a tried and true project management/development team approach that has stage gates, generates requirements, architects systems, designs components and finally develops products.  It’s a staid, steady project cycle, which is nevertheless fraught with traps, risks and detours.  These sorts of projects seem only able to enhance current product lines and move products forward to compete in their current markets.

But these projects never seem transformative.  They don’t take a company from 25% to 75% market share or triple corporate revenues in a decade.  They typically fight a rear-guard action against a flotilla of competitors all going after the same market, at worst trying not to lose market share and at best gaining modest share where possible.

How corporations succeed at internal R&D

But a few models have generated outsized internal R&D success in the past.  These generally fall into a couple of typical patterns, and we discuss two below.

One depends on visionary leadership and the other on visionary organizations.  For example, let’s look at IBM, AT&T’s Bell Labs and Apple.

IBM R&D in the past and today

First, examine IBM, whose CEO, Thomas J. Watson Jr., bet the company on System 360 from 1959 to 1964.  That endeavor cost them ~$5B at the time but eventually catapulted them from one of many computer companies to an almost mainframe monopoly that lasted two decades.  They created an innovative microcoded CISC architecture that spanned a family of system models and standardized I/O with common peripherals.  From that point on, IBM was able to dominate corporate data processing until the mid-1980s.  IBM has arguably lost and found its way a couple of times since then.

IBM took another approach to innovation in 1945, when IBM Research was founded.  Today IBM Research is a well-funded, independent research lab that generates significant IP in supercomputing, artificial intelligence and semiconductor technology.

Nonetheless, in the decades since 1945, IBM Research has struggled for corporate relevance, occasionally coming out with significant IT technology like relational databases, thin-film recording heads and RISC architectures.  But arguably such advances were often put to better use outside IBM.  Recently this seems to have changed, and we now see significant technology from IBM Research moving IBM into new markets.

AT&T and Bell Labs

Bell Labs is probably the most prolific research organization the world has seen.  It invented statistical process control, the transistor and information theory, along with probably another dozen or so Nobel Prize-winning ideas.  Early on, most of its technology made it into the Bell System, but later on it lost its way.

Its parent company, AT&T, had a monopoly on long-distance phone service, switching equipment and other key technologies in the US phone system for much of the twentieth century.  During most of that time Bell Labs was well funded and charged with advancing Bell System technology.

Nonetheless, despite Bell Labs’ obvious technological success, in the end it mostly served to preserve and enhance the phone system rather than disrupt it.  Some of this was due to Justice Department decrees limiting AT&T’s endeavors.  But in any case, like IBM Research’s, much of Bell Labs’ technology was taken up by others and transformed many markets.

Apple yesterday and today

Then there’s Apple.  They have almost single-handedly created three separate markets (the personal computer, the personal music player and the tablet computer) while radically transforming the smart phone market as well.  In each case there were sometimes significant precursors to the technology, but Apple was the one to catalyze, popularize and capitalize on it.

The Apple II was arguably the first personal computer, but the Macintosh redefined the paradigm.  The Mac wasn’t the great success it could have been, mostly due to management changes that moved Jobs out of Apple.  But its potential forced major competitors to change their products substantially.

When Jobs returned, he re-invigorated the Mac.  After that, he went about re-inventing the music player, the smart phone and tablet computing.

Could Apple have done all this without Jobs?  I doubt it.  Could a startup have taken any of these on?  Perhaps, but I think it unlikely.

The iPod depended on music industry contracts, back office and desktop software, and deep technological acumen.  None of these were exclusive to Apple or to big corporations.  Nevertheless, Jobs saw the way forward first, put in the effort to make it happen, and Apple reaped the substantial rewards that ensued.

~~~~

In part 2 of Roads to R&D success we propose some options for turning corporate R&D into the serious profit generator it can become.  Stay tuned.

To be continued …

Image: Replica of first transistor from Wikipedia

 

Analog neural simulation or digital neuromorphic computing vs. AI

DSC_9051 by Greg Gorman (cc) (from Flickr)

At last week’s IBM Smarter Computing Forum we had a session on Watson, IBM’s artificial intelligence machine that won Jeopardy last year, and another session on IBM-sponsored research helping to create the SyNAPSE digital neuromorphic computing chip.

Putting “Watson to work”

Apparently, IBM is taking Watson’s smarts and applying them to healthcare and other information-intensive verticals (intelligence, financial services, etc.).  At the conference IBM had Manoj Saxena, senior director of Watson Solutions, and Dr. Herbert Chase, a senior professor of clinical medicine at Columbia School of Medicine, come up and talk about Watson in healthcare.

Mr. Saxena contended, and Dr. Chase concurred, that Watson can play an important part in helping healthcare apply current knowledge.  Watson’s core capability is its ability to ingest and make sense of information and then apply that knowledge, in this case using medical research knowledge to help diagnose patient problems.

Dr. Chase had been struck at a young age by one patient who had what appeared to be an incurable and unusual disease.  He was an intern at the time and was given the task of diagnosing her illness.  Eventually he was able to provide a proper diagnosis, but it irked him that it took so long and so many doctors to get there.

So, as a test of Watson’s capabilities, Dr. Chase input this patient’s symptoms into Watson, and it was able to provide a list of potential diagnoses.  Sure enough, Watson did list the medical problem the patient actually had all those years ago.

At the time, I mentioned to another analyst that Watson seemed to represent the end game of artificial intelligence, almost a final culmination and accumulation of 60 years of AI research, creating a comprehensive service offering for a number of verticals.

That’s all great, but it’s time to move on.

SyNAPSE is born

In the next session IBM had Dr. Dharmendra Modha come up and talk about their latest SyNAPSE chip, a new neuromorphic digital silicon chip that mimics the brain to model neurological processes.

We are quite a ways away from productization of the SyNAPSE chip.  Dr. Modha showed us a real-time demonstration of the SyNAPSE chip in action (connected to his laptop), interpreting a handwritten numeral into its numerical representation.  I would say it’s a bit early yet to talk about putting “SyNAPSE to work”.

Digital vs. analog redux

I have written before about the SyNAPSE neuromorphic chip and a competing technology, the direct analog simulation of neural processes (see IBM introduces SyNAPSE chip and MIT builds analog synapse chip).  In the MIT brain chip post I discussed the differences between the two approaches, focusing on the digital vs. analog divide.

It seems that IBM Research is betting on digital neuromorphic computing.  At the Forum last week I had a discussion with a senior exec in IBM’s STG group, who said that the history of electronic computing over the last half century or so has been mostly about the migration from analog to digital technologies.

Yes, but that doesn’t mean that digital is better, just easier to produce.

On that topic, I asked Dr. Modha what he thought of MIT’s analog brain chip.  He said:

  • MIT’s brain chip was built on a 180nm fabrication process whereas his is on 45nm, or over 3X finer.  Perhaps the fact that IBM has some of the best fabs in the world has something to do with this.
  • The digital SyNAPSE chip can potentially operate at 5.67GHz and will be absolutely faster than any analog brain simulation.  Yes, but each analog simulated neuron is one element of a parallel processing complex, and with a thousand or a million of them running at once, even 1000X or a million X slower, the analog approach should be able to keep up (see the rough throughput sketch after this list).
  • The digital SyNAPSE chip was carefully designed to be complementary to current digital technology.  As I look at IT today, we are surrounded by analog devices that interface very well with the digital computing environment, so I don’t think this will be a problem when we are ready to use it.
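To make that parallelism argument concrete, here is a minimal back-of-the-envelope sketch in Python.  Only the 5.67GHz clock figure comes from the talk; the million-neuron count and the million-X slower analog update rate are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope comparison: one fast digital engine vs. many slow
# analog neurons updating in parallel.  All figures except the 5.67GHz
# digital clock are illustrative assumptions, not measured values.

DIGITAL_CLOCK_HZ = 5.67e9           # SyNAPSE's potential clock rate (from the talk)
ANALOG_NEURON_HZ = 5.67e9 / 1e6     # assume each analog neuron updates ~1,000,000X slower
ANALOG_NEURON_COUNT = 1_000_000     # assume a million analog neurons on one chip

digital_updates_per_sec = DIGITAL_CLOCK_HZ  # roughly one neuron update per cycle
analog_updates_per_sec = ANALOG_NEURON_HZ * ANALOG_NEURON_COUNT  # all neurons update at once

print(f"digital: {digital_updates_per_sec:.2e} updates/sec")
print(f"analog:  {analog_updates_per_sec:.2e} updates/sec")
# Under these assumptions the aggregate rates come out equal: a million neurons
# running a million times slower deliver the same number of updates per second.
```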

Analog still surrounds us and defines the real world.  Someday the computing industry will get off its digital hobby horse and see the truth in that statement.

~~~~

In any case, if it takes another 60 years to productize one of these technologies, then the Singularity is farther away than I thought; somewhere around 2071 should about do it.

Comments?