Our long romance with Apple technology

 

Lisa 2/5 by MattsMacintosh (cc) (from Flickr)

We all heard last night of the passing of Steve Jobs. But rather than going over his life, I would like to discuss here some of the Apple products I have used over my life and how they affected our family.

 

I don’t know why, but I never got an Apple II. In fact, the first time I saw one in use was in the early ’80s. But it certainly looked nifty.

But I was struck with love at first sight when I saw the Lisa, a progenitor of the Mac. I was at a computer conference in the area which had a number of products on display, but when I saw the Lisa I couldn’t see anything else. It had a 3.5″ floppy drive encased in hard plastic, hardly what one thought of as a floppy anymore. But the really striking aspect was its screen: a bit-mapped display with a white background that sported great black-and-white graphics.

At the time, I was using IBM 3270 terminals, which had green lettering on a dark screen, and the only graphics were ones made with rows and columns of asterisks. To see graphics pop to life on the Lisa, with different font options and what-you-see-is-what-you-get editing, was just extraordinary at the time. The only downside was its $10K price. Sadly, we didn’t buy one of these either.

Mac worship

Then the 1984 commercial aired as a Super Bowl spot, the one where Apple was going to free the computing world from the oppression of Big Brother with the introduction of the first Macintosh computer.

We got our hands on one soon after, and my wife used it for her small accounting business and just loved it. Over time, as she took on partners, their office migrated to business applications that were better suited to PCs, but she stayed on the Mac long after it was sub-optimal, just because it was easy to use.

 

Apple Fat Mac by Accretion Disc (cc) (From Flickr)

Ultimately, she moved to a PC, taking her Fat Mac home to be used there instead. Over the next decade or so we updated the Mac to a color screen and a desktop configuration, but didn’t really do much with it other than home stuff.

 

Then the iMacs came out. We latched onto the half-basketball one, which had a screen protruding out of it. We used this for some video and photo editing and just loved it. Video upload and editing took forever, but there was nothing else out there that could even come close.

 

Our 1st iMac

I ended up using this machine for the first few years after I left corporate America, but also bought a Mac laptop, encased in aluminum, for my business trips. Both of these ran PowerPC microprocessors and eventually ran an early generation of Mac OS X.

 

A couple of years later we moved on to the all-in-one, Intel-based desktop iMacs, and over time updated to bigger screens, faster processors and more storage. We are still on iMac desktops for home and office use today.

iPhone infatuation

In 2008 we moved from a dumb cell phone to a smart iPhone 3G. We had wanted to wait until a world phone came out that supported GSM.

But this was another paradigm shift for me. When working in the corporate world I had a BlackBerry and could use it for contacts, email, and calendar, but seldom did anything else on it. And in fact, at the time I used a PalmPilot for a number of business applications, games, and other computing needs.

When the iPhone 3G came out, both the PalmPilot and the dumb cell phone were retired and we went completely Apple for all our cell phone needs. Today, I probably scan email, tweet, and use a number of other applications on my iPhone almost as often as I do on the iMac. Over time we moved one or the other of us to the 3GS and the 4, and now the children are starting to get hand-me-down iPhones and love them just as much.

iPad devotion

Then in May of 2010, we bought an iPad. This was a corporate purchase, but everyone used it. I tried to use it to replace my laptop a number of times (see my posts To iPad or Not to iPad parts 1, 2, 3 & 4) and ultimately concluded it wouldn’t work for me. We then went out and got a MacBook Air, and now the iPad is mainly used to check email, do some light editing, and handle gaming, media and other light computing activities.

The fact is, sitting on our living room couch checking email and Twitter and taking notes has made using all these tools that much easier. When we saw the iPad 2 we liked what we saw, but it took so long for it to become available in stores that we lost all gadget lust and are now waiting to see what the next generation looks like when it comes out.

—-

All in all, almost 30 years with Apple products, both in the home and at work, have made me a lifelong advocate.

I never worked for Apple, but I have heard that most of these products were driven single-mindedly by Steve Jobs. If that was the case, I would have to say that Steve Jobs was a singular technical visionary who understood what was then possible and took the steps needed to make it happen. In doing that, he changed computing forever, and for that I salute him.

Steve Jobs RIP

Big data and eMedicine combine to improve healthcare

fix_me by ~! (cc) (from Flickr)

We have talked before about ePathology and data growth, but Technology Review recently reported that researchers at Stanford University have used Electronic Medical Records (EMR) from multiple medical institutions to identify a new harmful drug interaction. Apparently, they found that when patients take Paxil (an antidepressant) and Pravachol (a cholesterol reducer) together, the drugs interact to raise blood sugar to levels similar to those seen in diabetics.

Data analytics to the rescue

The researchers started out looking for new drug interactions which could result in conditions seen by diabetics. Their initial study showed a strong signal that taking both Paxil and Pravachol could be a problem.

Their study used FDA Adverse Event Report (AER) data that hospitals and medical care institutions record. Originally, the researchers at Stanford’s Biomedical Informatics group used AERs available at the Stanford University School of Medicine, but found that although they had a clear signal that there could be a problem, they didn’t have sufficient data to statistically prove the combined drug interaction.

They then went out to Harvard Medical School and Vanderbilt University and asked to access their AERs to add to their data. With the combined data, the researchers were able to clearly see and statistically prove the adverse interaction between the two drugs.

But how did they analyze the data?

I could find no information about what tools the biomedical informatics researchers used to analyze the set of AERs they amassed, but it wouldn’t surprise me to find out that Hadoop played a part in this activity. It would seem a natural fit to use Hadoop and MapReduce to aggregate the AERs into a semi-structured data set and then reduce that data set to extract the AERs that matched their interaction profile.
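For what it’s worth, here is a minimal sketch of what such a map/reduce pass might look like in Python. The record fields (patient_id, drugs, glucose_mg_dl) and the 126 mg/dl threshold are my own assumptions for illustration; this is not the Stanford group’s actual pipeline, just the shape of the computation:

```python
# Purely illustrative MapReduce-style pass over AER records.
# Field names (patient_id, drugs, glucose_mg_dl) are hypothetical,
# not the real FDA AER schema or the Stanford group's pipeline.
from collections import defaultdict

def map_aer(record):
    """Emit (patient_id, record) only for reports mentioning both drugs."""
    drugs = {d.lower() for d in record.get("drugs", [])}
    if {"paxil", "pravachol"} <= drugs:
        yield record["patient_id"], record

def reduce_patient(patient_id, records):
    """Aggregate one patient's matching reports and flag elevated blood sugar."""
    max_glucose = max(r.get("glucose_mg_dl", 0) for r in records)
    return {
        "patient_id": patient_id,
        "reports": len(records),
        "hyperglycemic": max_glucose >= 126,  # illustrative fasting-glucose threshold
    }

def run(aers):
    grouped = defaultdict(list)
    for rec in aers:                      # "map" phase
        for key, value in map_aer(rec):
            grouped[key].append(value)
    return [reduce_patient(k, v) for k, v in grouped.items()]  # "reduce" phase

if __name__ == "__main__":
    sample = [
        {"patient_id": 1, "drugs": ["Paxil", "Pravachol"], "glucose_mg_dl": 140},
        {"patient_id": 2, "drugs": ["Paxil"], "glucose_mg_dl": 95},
    ]
    print(run(sample))
```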

Then again, it’s entirely possible that they used a standard database analytics tool to do the work. After all, we are only talking about 100K to 200K records or so.
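At that scale, the same extraction could be done with a single SQL query. Here is a minimal sketch using SQLite, with a made-up table layout (an aer table with patient_id, drug and glucose_mg_dl columns), just to show how small the problem really is:

```python
# Equivalent filter done with a plain relational query -- at 100-200K records
# this fits easily in SQLite. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE aer (patient_id INTEGER, drug TEXT, glucose_mg_dl REAL);
    INSERT INTO aer VALUES (1, 'Paxil', 140), (1, 'Pravachol', 140),
                           (2, 'Paxil', 95);
""")
rows = conn.execute("""
    SELECT patient_id, MAX(glucose_mg_dl) AS max_glucose
    FROM aer
    GROUP BY patient_id
    HAVING SUM(drug = 'Paxil') > 0 AND SUM(drug = 'Pravachol') > 0
""").fetchall()
print(rows)  # patients on both drugs, with their highest recorded glucose
```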

Nonetheless, the Technology Review article stated that some large hospitals and medical institutions using EMR are starting to have database analysts (maybe data scientists) on staff to mine their record data and electronic information to help improve healthcare.

Although EMR was originally envisioned as a way to keep better track of individual patients, when a single patient’s data is combined with that of thousands of other patients, one creates something entirely different, something that can be mined to extract information. Such a data repository can be used to ask questions about healthcare that were inconceivable before.

—-

Digitized medical imagery (X-rays, MRIs, & CAT scans), ePathology and now EMR are together giving rise to a new form of electronic medicine, or eMedicine. With everything being digitized, securely accessible and amenable to big data analytics, medical care as we know it is about to undergo a paradigm shift.

Big data and eMedicine together are about to change healthcare for the better.

IBM research introduces SyNAPSE chip

IBM, with the help of Columbia, Cornell, the University of Wisconsin (Madison) and the University of California, has created the first generation of neuromorphic chips (press release and video), which mimic the human brain’s computational architecture in silicon. The chip is a result of Project SyNAPSE (standing for Systems of Neuromorphic Adaptive Plastic Scalable Electronics).

Hardware emulating wetware

Apparently the chip supports two cores, one with ~65K “learning” synapses and the other with ~256K “programmable” synapses. I’m not really sure from reading the press release, but it seems each core contains 256 neuronal computational elements.

Wikimedia commons (481px-Chemical_synapse_schema_cropped)

In contrast, the human brain contains between 100 trillion and 500 trillion synapses (Wikipedia) and has ~85 billion neurons (Wikipedia). Typical human neurons have thousands of synapses.

IBM’s goal is to have a trillion-neuron processing engine with 100 trillion synapses occupy a 2-liter volume (about the size of the brain) while consuming less than one kilowatt of power (roughly 50X the brain’s power consumption).
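To put the prototype and that goal in perspective, here is a quick back-of-the-envelope comparison using the rough figures above (the ~20 watt brain power draw is the commonly cited estimate, and all of these numbers are approximations):

```python
# Back-of-the-envelope comparison of the SyNAPSE prototype, IBM's stated goal,
# and the human brain, using the rough figures cited in the post.
chip_neurons, chip_synapses = 2 * 256, 65_000 + 256_000       # prototype (2 cores)
goal_neurons, goal_synapses, goal_watts = 1e12, 1e14, 1_000   # IBM's stated goal
brain_neurons, brain_synapses, brain_watts = 85e9, 100e12, 20 # approximate

print(f"goal vs brain, neurons:  {goal_neurons / brain_neurons:.0f}x")
print(f"goal vs brain, synapses: {goal_synapses / brain_synapses:.0f}x")
print(f"goal vs brain, power:    {goal_watts / brain_watts:.0f}x")
print(f"prototype is ~{brain_synapses / chip_synapses:.1e} times short of the brain's synapse count")
```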

I want one.

IBM is calling such a system built out of neuromorphic chips a cognitive computing system.

What to do with the system

The IBM research team has demonstrated some typical AI applications with the chip, such as simple navigation, machine vision, pattern recognition, associative memory and classification.

Given my history with von Neumann computing, it’s kind of hard for me to envision how synapses represent “programming” in the brain. Nonetheless, Wikipedia defines a synapse as a connection between any two neurons, which can take one of two forms: electrical or chemical. A chemical synapse (Wikipedia) can have different levels of strength, plasticity, and receptivity. Sounds like this might be where the programmability lies.

Just what the “learning” synapses do, how they relate to the programmable synapses, and how they do it is another question entirely.
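That said, here is a toy illustration of how an adjustable synapse could embody “learning”. This is just my own sketch of a simple Hebbian rule (connections between neurons that fire together get stronger), not IBM’s actual mechanism:

```python
import random

# Toy illustration (not IBM's mechanism): a synapse whose adjustable "weight"
# is strengthened whenever its pre- and post-synaptic neurons fire together --
# a simple Hebbian learning rule.
class Synapse:
    def __init__(self, weight=0.1, learning_rate=0.05):
        self.weight = weight
        self.learning_rate = learning_rate

    def learn(self, pre_spike: bool, post_spike: bool):
        """Hebbian update: cells that fire together wire together."""
        if pre_spike and post_spike:
            self.weight = min(1.0, self.weight + self.learning_rate)

# After repeated co-activation the synapse's influence grows -- the "learning".
s = Synapse()
for _ in range(20):
    pre = random.random() < 0.8            # pre-synaptic neuron fires often
    post = pre and random.random() < 0.9   # and usually drives the post-synaptic one
    s.learn(pre, post)
print(f"learned weight: {s.weight:.2f}")
```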

Stay tuned, a new, non-von Neumann computing architecture was born today. Two questions to ponder:

  1. I wonder if they will still call it artificial intelligence?
  2. Are we any closer to the Singularity now?

—-

Comments?

 

M-Disc provides a 1000 year archivable DVD

M-Disc (c) 2011 Millenniata (from their website)

I heard about this last week but saw another notice today. Millenniata has made what they believe to be a DVD with a 1,000-year archive life, which they call the M-Disc.

I have written before about the lack of long-term archives for digital data, mostly focused on disappearing formats, but this device, if it works, has the potential to solve the other problem (discussed here), namely that no storage media around today can last that long.

The new single-layer DVD (4.7GB max) has a chemically stable, inorganic recording layer, a heat-resistant matrix of materials that can retain data while surviving temperatures of up to 500°C (932°F).

Unlike normal DVDs, which record data using organic dyes, M-Disc data is recorded on this stone-like layer embedded inside the DVD. By doing so, Millenniata has created the modern-day equivalent of etching information in stone.

According to the vendor, M-Disc archive-ability was independently validated by the US DOD at their China Lake facilities. While the DOD didn’t say the M-Disc DVD has a 1,000-year life, they did say that under their testing the M-Disc was the only DVD that did not lose data. The DOD tested DVDs from Mitsubishi, Verbatim, Delkin, MAM-A and Taiyo Yuden (JVC) in addition to the M-Disc.

The other problem with long-term archives involves data formats and the availability of programs that can read such formats from long ago. Although Millenniata has no solution for this, something like a format repository with XML descriptions might provide the way forward to a solution.
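To make the idea a bit more concrete, a format repository entry might look something like the sketch below. The descriptor schema is something I’m making up for illustration (built with Python’s standard xml.etree.ElementTree), not an existing standard, and the spec link is a placeholder:

```python
# A made-up example of what a "format repository" entry might look like.
# The descriptor schema is invented for illustration, not an existing standard.
import xml.etree.ElementTree as ET

fmt = ET.Element("format", id="tiff-6.0")
ET.SubElement(fmt, "name").text = "Tagged Image File Format"
ET.SubElement(fmt, "version").text = "6.0"
ET.SubElement(fmt, "magic-bytes").text = "49 49 2A 00"   # little-endian TIFF header
ET.SubElement(fmt, "specification").text = "https://example.org/specs/tiff-6.0.pdf"  # placeholder link
ET.SubElement(fmt, "reader-hint").text = "baseline TIFF readers; libtiff"

print(ET.tostring(fmt, encoding="unicode"))
```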

Given the nature of its recording surface, special-purpose DVD writers, with lasers about 5X the intensity of normal DVD writers, are needed to record an M-Disc. But once recorded, any DVD reader can read the data off the disc.

Pricing for the media was suggested to be roughly equivalent, per disc, to archive-quality DVDs. Pricing for the special DVD writers was not disclosed.

They did indicate they were working on a similar product for Blu-ray discs, which would take the single-layer capacity up to 25GB.

—-

Comments?

Technology innovation

Newton & iPad by mac_ivan (cc) (from Flickr)

A recent post by Mark Lewis on innovation in large companies (see Episode 105: Innovation – a process problem?) brought to mind some ideas that have been intriguing me for quite a while now. While Mark’s post is only the start of his discussion on the management of innovation, I think the problem goes far beyond what he has outlined there.

Outside of Apple and a few select others, there don’t appear to be many large corporate organizations that continually succeed at technology innovation. On the other hand, there are a number of large organizations which spend millions, if not billions, of dollars on R&D with, at best, a mediocre return on such investments.

Why do startups innovate so well while corporations do so poorly?

  • Most startup cost is sweat equity and not money, at least until business success is more assured.  Well-run companies have a gate review process which provides more resources as new ideas mature over time, but the cost of “fully burdened” resources applied to any project is much higher, and more monetary, right from the start.  As such, corporate innovation costs, for the exact same product/project, are higher at every stage in the process, hurting ROI.
  • Most successful startups engage with customers very early in the development of a product. Alpha testing is the life blood of technical startups. Find a customer that has a (hopefully hard) problem you want to solve and take small, incremental steps to solve it, giving the customer everything you have, the moment you have it, so they can determine if it helped and where to go next.  If their problem is shared by enough other customers, you have a business.  Large companies cannot readily perform alpha tests or, in some cases, even beta tests in real customer environments.  Falling down and taking the many missteps that alpha testing would require might have significant brand repercussions.  So large companies end up funding test labs to do this activity.  Naturally, such testing increases the real and virtual costs of corporate innovation projects versus a startup with alpha testing.  Also, any “simulated testing” may be far removed from real customer experience, often leading corporate projects down unproductive development paths, increasing development time and costs.
  • Many startups fail, hopefully before monetary investment has been significant. Large corporate innovation activities also fail often but typically much later in the development process and only after encountering higher real and virtual monetary costs.  Thus, the motivation for continuing innovation in major corporations typically diminishes after every failure, as does the ROI on R&D in general.  On the other hand, startup failures, as they generally cost little actual money, typically induce participants to re-examine customer concerns to better target future innovations.  Such failures often lead to an even higher motivation in startup personnel to successfully innovate.

There are probably many other problems with innovation in large corporate organizations, but these seem the most significant to me.  Solutions to such issues within large corporations are not difficult to imagine, but the cultural changes that may be needed to go along with such solutions may represent the truly hard problem to solve.

Comments?

 

R&D effectiveness

A recent Gizmodo blog post compared a decade of R&D at Sony, Microsoft and Apple.  There were some interesting charts, but mostly they showed that R&D as a percent of revenue fluctuates from year to year and that R&D spend has been rising for all three companies (although at different rates).

R&D Effectiveness, (C) 2010 Silverton Consulting, All Rights Reserved

Overall, on a percentage-of-revenue basis, Microsoft wins, spending ~15% of revenue on R&D over the past decade; Apple loses, spending only ~4%; and Sony is right in the middle, spending ~7%.  Yet judging by the impact on corporate revenue, R&D spending had a significantly different effect at each company than pure percentage-of-revenue R&D spending would indicate.

How can one measure R&D effectiveness?

  • Number of patents – this is often used as an indicator, but it’s unclear how it correlates to business success.  Patents can be licensed, but only if they prove important to other companies. However, patent counts can be gauged early on, during R&D activities, rather than much later when a product reaches the market.
  • Number of projects – by projects we mean an idea from research taken into development.  Such projects may or may not make it out to market.  At one level this can be a leading indicator of “research” effectiveness, as it means an idea was deemed at least of commercial interest.  A problem with this is that not all projects get released to the market or become commercially viable.
  • Number of products – by products, we mean something sold to customers.  At least such a measure reflects that the total R&D effort was deemed worthy enough to take to market.  How successful such a product is still to be determined.
  • Revenue of products – product revenue seems easy enough but can often be hard to allocate properly.  Looking at the iPhone, do we count just handset revenues or include application and cell service revenues?  But assuming one can properly allocate revenue sources to R&D efforts, one can come up with a revenue-from-R&D-spending ratio.  The main problem with such ratios is that all the other non-R&D factors confound them, e.g., marketing, manufacturing, competition, etc.
  • Profitability of products – product profitability is even messier than revenue when it comes to confounding factors.  But ultimately, the profitability of R&D efforts may be the best factor to use, as any R&D that’s truly effective should generate the most profit.

There are probably other R&D effectiveness factors that could be considered but these will suffice for now.
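As a rough sketch of how the last two revenue-oriented measures might be computed, assuming you had annual revenue and R&D spend figures for a company (the numbers below are placeholders, not the actual Gizmodo data):

```python
# Sketch of two simple R&D effectiveness ratios. The figures below are
# placeholders for illustration, not actual company data.
years = [2008, 2009, 2010]
revenue = {2008: 100.0, 2009: 120.0, 2010: 150.0}   # $B, hypothetical
rd_spend = {2008: 8.0, 2009: 9.0, 2010: 10.0}       # $B, hypothetical

for y in years:
    pct = 100 * rd_spend[y] / revenue[y]
    print(f"{y}: R&D is {pct:.1f}% of revenue")

# Revenue growth per R&D dollar spent over the period (a crude proxy,
# since marketing, manufacturing, competition, etc. also drive revenue).
growth = revenue[years[-1]] - revenue[years[0]]
total_rd = sum(rd_spend[y] for y in years[:-1])      # spend leading up to the growth
print(f"revenue growth per R&D dollar: {growth / total_rd:.2f}")
```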

How did they do?

Returning to the Gizmodo discussion, their post didn’t include any patent counts, project counts (only visible internally), product counts, or profitability measures, but they did show revenue for each company.  From a purely revenue-impact standpoint one would have to say that Apple’s R&D was a clear winner, with Microsoft a clear second.  Apple started from considerably smaller revenue than Sony or Microsoft, but its ~$14B of revenue in 2005 was only small in comparison to such giants.  We all know the success of the iPhone and iPod, but they also stumbled with Apple TV.

Why did they do so well?

What then makes Apple do so well?  We have talked before about an elusive quality we called visionary leadership.  Certainly Bill Gates is as technically astute as Steve Jobs, and there can be no denying that their respective marketing machines are evenly matched.  But both Microsoft and Apple were certainly led by more technical individuals than Sony over the last decade.  Both Microsoft and Apple have had significant revenue increases over the past ten years that parallel one another, while Sony, in comparison, has remained relatively flat.

I would say the Microsoft and Apple results show that “visionary leadership” has a certain technical component that can’t be denied.  Moreover, I think that if one looked at Sony under Akio Morita, HP under Bill Hewlett and Dave Packard, or many other large companies today, one could conclude that technical excellence is a significant component of visionary leadership.  All these companies’ highest revenue growth came under leadership which had significant technical knowledge.  There’s more to visionary leadership than technicality alone, but it seems at least foundational.

I still owe a post on just what constitutes visionary leadership, but I seem to be surrounding it rather than attacking it directly.

Is M and A the only way to grow?

Photograph of Women Working at a Bell System Telephone Switchboard by US National Archives (cc) (from flickr)

Oracle buys Sun, EMC buys Data Domain, Cisco buys Tandberg; it seems like every month another major billion-dollar acquisition occurs.  Part of this is because of the recent economic troubles, which now value many companies at their lowest levels in years, making it cheaper to acquire good (and/or failing) companies.  But one has to wonder: is this the only way to grow?

I don’t think so.

Corporate growth can be internally driven, or organic, just as well as it can come from acquisition.  But it’s definitely harder to do internally.  Why?

  • Companies are focused on current revenue producing products – Revolutionary products rarely make it into development in today’s corporations because they take resources away from other (revenue producing) products.
  • Companies are focused on their current customer base – Products that serve other customers rarely make it out into the market from today’s corporations, because such markets are foreign to the companies’ current marketing channels.
  • Company personnel understand current customer problems – To be successful, any new product must address its customers’ pain points and offer some sort of unique, differentiated solution to those issues, and because this takes understanding other customers’ problems, it seldom happens.
  • New products can sometimes threaten old product revenue streams – It’s a rare new product that doesn’t take market share away from some old way of doing business.  As companies focus on a particular market, any new product development will no doubt focus on those customers as well.  Thus, many new internally developed products will often displace (or eat away at) current product revenue.  Early on, it’s hard to see how any such product can be justified with respect to current corporate revenue.
  • New products often take efforts above and beyond current product activities – To develop, market and sell revolutionary products takes enormous, “all-out” efforts to get off the ground.  Most corporations are unable to sustain this level of effort for long, as their startup phase was long ago and long forgotten.

We now know how hard it can be, but how does Apple do it?  The iPod and iPhone were revolutionary products (at least from Apple’s perspective), and yet they both undeniably became great successes and helped to redefine industries in the process.  And no one can argue that they haven’t helped Apple grow significantly.  So how can this be done?

  • It takes strong visionary leadership in the company at the highest level – Such management can make the tough decisions to take resources away from current, revenue-producing products and devote time and effort to new ones.
  • It takes marketing genius – Going after new markets, even if they are adjacent, requires in-depth understanding of new market dynamics and total engagement to be successful.
  • It takes development genius – Developing entirely new products, even if based on current technology, takes development expertise above and beyond evolutionary product enhancement.
  • It takes hard work and a dedicated team – Getting new products off the ground takes a level of effort above and beyond current ongoing product activities.
  • It takes a willingness to fail – Most new internally developed products and/or startups fail.  This fact can be hard to live with and makes justifying future products even harder.

In general, all these items are easier to find in startups than in an ongoing corporation today.  This is why most companies today find it easier and more successful to grow through acquisitions rather than through organic or internal development.

However, it’s not the only way.  AT&T did it for almost a century in the telecom industry, but they owned a monopoly.  IBM and HP did it occasionally over the past 60 years or so, but they had strong visionary leadership for much of that time and stumbled miserably when such leadership was lacking.  Apple has done it over the past couple of decades or so, but this is mainly due to Steve Jobs.  There are others of course, but I would venture to say all had strong leadership at the helm.

But these are the exceptions.  Strong visionary leaders usually don’t make it to the top of today’s corporations.  Why that’s the case needs to be the subject of a future post…