Monday, March 22, 2010

Second Sourcing, Application Interfaces, and a 16-Bit Static RAM

While returning my Mac Plus to the attic, I decided to bring out some electronics relics I have from an even earlier era. The photo shows an undiced 1.25" wafer of integrated circuits and two packaged chips from around 1969. (The acorn hat is for scale.) There are about 200 transistors on each chip. I think it's a 16-bit static RAM chip- with a magnifying glass, I can see and count the 16 cells. For context, when the 128K Mac came out, it shipped with 64 Kb DRAM chips- about 4,000 times denser. Today 4 Gb chips are in production, denser than my relic by a factor of roughly 250 million, and the silicon wafers are 30 cm in diameter.

My father was one of the founders of Solid State Scientific, Inc. (SSSI), a company that made CMOS integrated circuits. SSSI, located in Montgomeryville, PA, started out as a second-source supplier for RCA's line of low-power CMOS logic chips. In the electronics industry, it has been common practice for component manufacturers to license their circuit designs or specifications to other manufacturers so that customers would be assured of an adequate supply. The second-source company could compete on price or performance. For example, engineers could design systems around the 4060, a 14-bit ripple counter with an internal oscillator, knowing that they could buy a replacement chip from either RCA or SSSI. If RCA's fab was fully booked, SSSI could fill the gap. There was no vendor lock-in.

Second-source relationships could be tricky- AMD and Intel famously ended up litigating AMD's second-source status for the 8086 series of microprocessors. Logic family chips were commodities, and profit margins were thin. The second-source gambit was a bet that a company could make more money by driving prices down and volume up. Companies like SSSI were always chasing higher profit margins in new applications such as custom circuits for digital watches. The high-volume parts would pay for the fabs, and the proprietary circuits would earn the profits- or at least that was the idea. Vendor lock-in, while it might discourage adoption and reduce volume, is good for profitability.

As chips became more and more complicated, the chip manufacturing industry realigned. Today, apart from giants like Intel, most chips are manufactured by foundry companies that don't do chip design at all. Chip design companies try to maintain high margins with exclusive intellectual property; the foundry companies aggregate volume and drive down costs by manufacturing chips for many different design companies.

I've been thinking about the way the advance of technology moves application interfaces. In the days of the CMOS logic chips, the application interface was a spec sheet and a logic diagram. That was everything a circuit designer needed to include the component in a design. Today that interface has migrated onto the chip and into software; chip foundries provide software models for components ranging from transistors to processor blocks for designers to include in their products.
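
To make "software model" concrete, here is a minimal sketch, in Python and purely for illustration, of a behavioral model of a 14-bit ripple counter like the 4060 mentioned earlier. The class and method names are invented; a real foundry model would capture timing, loading, and power as well, but the idea is the same: the component is delivered as executable code rather than a paper spec.

    class RippleCounter14:
        """Toy behavioral model of a 14-bit ripple counter (hypothetical
        names, loosely inspired by the 4060 but without its oscillator)."""

        BITS = 14

        def __init__(self):
            self.count = 0

        def clock(self):
            # Advance one clock edge; the count wraps at 2**14.
            self.count = (self.count + 1) % (1 << self.BITS)

        def reset(self):
            self.count = 0

        def q(self, n):
            # Read output bit Qn (0-indexed).
            return (self.count >> n) & 1

    # Exercised the way a simple test bench would exercise it:
    ctr = RippleCounter14()
    for _ in range(5):
        ctr.clock()
    print(ctr.count, ctr.q(0), ctr.q(2))   # 5 1 1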

When software engineers talk about application interfaces, they're usually thinking about function calls and data structures that one block of software can use to interact with other blocks of software. These interfaces, once published and relied on, tend to be much more stable over time than the code hidden behind them. To some extent, software application interfaces can hide hardware implementations as easily as they can hide code. One result of this is that new chips may come with software interfaces that persist through different versions of the chip. In something of a paradox, the software interface is fixed while the hardware interface moves around.
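
Here is a minimal sketch of that paradox, with every name invented for illustration: application code depends only on a fixed read_celsius() interface, while the hardware-facing code underneath changes from one chip revision to the next.

    class TemperatureSensor:
        """Hypothetical fixed software interface; applications
        depend only on read_celsius()."""

        def read_celsius(self):
            raise NotImplementedError

    class SensorRevA(TemperatureSensor):
        """First chip revision: a raw ADC count, scaled in software."""

        def read_celsius(self):
            raw = self._read_register(0x40)   # stand-in for a bus read
            return raw * 0.25                 # 0.25 degrees C per count

        def _read_register(self, addr):
            return 100   # stub; real code would talk to the hardware

    class SensorRevB(TemperatureSensor):
        """Later revision: different register map and encoding,
        but the software interface above is unchanged."""

        def read_celsius(self):
            raw = self._read_register(0x12)
            return (raw - 400) / 16.0

        def _read_register(self, addr):
            return 800   # stub

    def log_temperature(sensor):
        # Application code is identical for either hardware revision.
        print("%.2f C" % sensor.read_celsius())

    log_temperature(SensorRevA())   # 25.00 C
    log_temperature(SensorRevB())   # 25.00 C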

Software has become more and more a part of our daily work, and interfaces have become important to non-engineers. File formats are a good example of application interfaces that matter to all of us. The files I produced on my Mac Plus 25 years ago are still with me and usable; because of that, you can still read the Ph.D. dissertation I wrote with it. OpenOffice serves as a second source for Word, and I can use either program with some assurance that I will continue to be able to do so into the future.

There's some backstory there. The "interchange format" for the original Word was RTF. RTF is a reasonably good format, informed by Donald Knuth's TeX, but it was always a second-class citizen compared to the native "DOC" format. Microsoft published a spec, but they didn't follow it very closely, and they changed it with every new release of Word. One result was that it was difficult to use Word as part of a larger publishing system (which I tried to do back in my days as an e-Journal developer). The last thing Microsoft wanted was for competition to Word to develop before it grew to dominate the marketplace.
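
For a sense of what the format looks like, here is a sketch that writes a minimal RTF file from Python. The markup is a bare-bones illustration of RTF's structure (braces delimit groups, control words begin with backslashes), not a full Word-compatible document.

    # A bare-bones RTF document: one group, a font table, and some text.
    minimal_rtf = (
        r"{\rtf1\ansi\deff0"                  # RTF version 1, ANSI, default font 0
        r"{\fonttbl{\f0 Times New Roman;}}"   # the font table
        r"\f0\fs24 "                          # font 0 at 12 pt (\fs is in half-points)
        r"Hello, world.}"                     # body text; closing brace ends the document
    )

    with open("hello.rtf", "w") as f:
        f.write(minimal_rtf)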

Cloud-based software (software as a service) depends in an interesting way on application interfaces. Consider Google Docs. You can send it a ".DOC" file created in Microsoft Word, do something with it, then export it. In a sense, Word is a "second source" for Google Docs, and consumers can use Docs without fear of lock-in. Docs adds its own web API so that developers can use it as a component of a larger web-based system. This is the "platform" strategy.
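
As a sketch of what coding to such a platform looks like, here is a hypothetical export call. The host, path, query parameter, and token below are all invented for illustration; this shows the shape of a platform API call, not Google's actual one.

    import urllib.request

    # Hypothetical values, for illustration only.
    DOC_ID = "example-doc-id"
    TOKEN = "example-oauth-token"
    url = "https://docs.example.com/api/documents/%s/export?format=doc" % DOC_ID

    req = urllib.request.Request(url, headers={"Authorization": "Bearer " + TOKEN})
    with urllib.request.urlopen(req) as resp:
        data = resp.read()                  # the exported .DOC bytes

    with open("exported.doc", "wb") as f:
        f.write(data)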

These new interfaces offer users a lock-in trade-off. While the customer gains the freedom to use a website's functionality with services from other companies, control of the interface leaves those other companies at the mercy of the company controlling the API. Developers coding to the interface are in the same situation as a second-source chip supplier- always exposed to competition- while the platform provider's lock-in grows stronger with every new component that plugs into it.

We now see a very interesting competition in platform strategies emerging. Apple's iPad/iPhone/iPod Touch software platform tries to lock in consumers by opening an attractive set of APIs for app development. It goes further, though, by attempting to control a marketplace (the App Store) and imposing restrictive terms on app vendors. Google's Android platform tries to do the same thing in a much more open environment. Apple seems to have learned an important lesson, though. The biggest difficulty facing a company trying to plug into a platform is profitability, and the iPhone software marketplace appears to offer viable business models for developers. It remains to be seen whether that condition will last, but it's clear that technology shifts are pushing services (such as phone service) that used to be stand-alone products into larger, more complex ecosystems.

2 comments:

  1. A bit of a quibble about the beginning of the above:

    Chip companies second sourced because their customers were afraid they'd lose their ability to get a chip at a critical time: fab lines can be zapped by all sorts of things (especially back in those days), like a change in the water supply. Or read Cringely's book on how a shipping clerk single-handedly almost killed Intel.

    Intel broke this pattern by setting up fab lines in three different geographic locations ... and by being (aside from AMD, with as noted much legal fun) the only source for "got to have" parts like IBM PC compatible CPUs.

  2. hga- While I agree with your point, it wasn't so simple. When I worked at Intel in 1981-1982, my process, "P462", was implemented only at Fab 1A in Santa Clara, and our parts were not "second sourced" by anyone. Having fabs in New Mexico and Oregon that made the 80286 processor might have been a good marketing attribute, but internally, it was a capacity issue. If the water had gone bad in NM, customers wouldn't have gotten their orders because the OR fab would have been maxed out.

