Many of my colleagues know that over the last several years I have become a big Apple-holic. When IBM SWG Tech Sales sponsored the first “Buy a MacBook, get a subsidy” program, I bought my first MacBook. I have since ejected all Microsoft Windows-based hardware from my household, and we now have a couple of MacBooks, a couple of iPads, iPhones, an Apple TV and a few other miscellaneous items. I’m pretty impressed by Apple’s hardware and by how they integrate the hardware and software to create a system that performs well and provides high value to the user.
This should sound familiar, as it’s the same proposition we have with System z. Over the last 8-10 years, IBM has made great strides toward improving System z’s usability, performance, and cost of ownership. One key way we have done that is by introducing the “specialty processor”. For years now, System z has offloaded many different functions from the main central processing “engines”, improving system performance and reliability. For example, the System Assist Processor (SAP) is an engine in System z that is dedicated to I/O processing. It offloads I/O functions and frees the general-purpose processors to execute user programs. We also introduced other specialty processors such as the zAAP (z Application Assist Processor), the zIIP (z Integrated Information Processor), and a few others such as the on-chip Crypto Coprocessor.
But what does this have to do with the new iPhone?
When I was reading the various press pieces on the iPhone 5s announcement, I ran across this article from Computerworld. It pointed out a new function that I had completely missed in the announcement - the M7 “motion coprocessor”. Now “coprocessing” is not exactly new in the small-systems world - PCs have used discrete graphics processors for a long time, which, like IBM’s System z specialty engines, offload functionality to free up resources on the main processor. On the iPhone and iPad, Apple uses ARM architecture chips - most recently the A6 on the iPhone 5 and the new 5c, and now the A7 on the 5s - for “general processing” of applications. The latest iPads use an A6X processor that offloads graphics to a four-core graphics processor. And now the M7 chip will offload functions such as accelerometer, gyroscope and compass processing.
So why is this a big deal?
Mobile computing is often thought of as the applications and functions we carry around on our phones and tablet computers. But it’s much more than that. It’s wearable devices - Google Glass is but one example - and the monitors that track your heartbeat, your temperature and other functions while you exercise. It’s medical devices and equipment that can measure your vital signs. It’s on-board computers in cars. It’s tiny computing devices in shipping containers or cartons. It’s just about any kind of computing device that moves or transmits data. We often refer to it as “The Internet of Things.” (Parenthetically, I’ll note that IBM actually envisioned all of this many years ago; most recently we referred to it as “Pervasive Computing.”) The kinds of information that the M7 processor can handle are but one category of “pervasive” technology/data that will increase the demands of processing data and continue to drive the “Big Data” strategy of companies around the world.
So now Apple has discovered what IBM System z did - using specialty engines helps your computing platform deliver higher performance and better value by moving specialized computing functions to offload processors, so the main computing cores can work on the important task of processing the user’s applications. I think that’s pretty cool.