Differentiate or die, says ARM microcontroller firm

It is interesting to see how microprocessor architectures have gone from being a significant differentiator to being a “not important” factor.

As one of the inventors of the AVR, I have found the transition to ARM a long journey. How long will this trend last, and what comes next?

Since the late 1970s, microcontrollers have rapidly been replacing discrete logic while supporting more features such as advanced control, complex calculations, high-speed communication and intuitive user interfaces, even in small, low-cost systems.

Early on, most microcontroller code was written in assembly language, and this “hardware-near” programming became the signature of embedded designers. No one could (or should) touch anyone else’s code.

In the early 1990s, engineers wanted more features in their microcontroller systems and started to look at the C language, even for 8-bit microcontrollers. The biggest issue at the time was that microcontroller architectures did not support the language well: compiled code came out larger and less optimised than hand-written assembly.

This was when we designed the AVR. The AVR architecture was designed to support the C language, making hardware-near programming in C possible without losing efficiency in any way.

With a true Risc architecture comprising single-cycle instructions, a large register file, and separate buses for program memory, data memory and I/O, it was simply superior to any other architecture in the 8-bit world, and probably still is, 20 years later.

The performance of the architecture was important, but other factors such as built-in high-performance analogue peripherals, self-programmable flash and EEPROM memory were also attractive to engineers.

The low-cost development tools were also welcomed by the engineering community. Cutting the entry cost from a few thousand dollars to almost nothing was essential to the wide use of microcontrollers.

At about the same time, ARM started to find its way into embedded applications. ARM took a different approach with its processors.

First of all, they used a 32-bit Risc core with a von Neumann memory architecture, sharing a single bus between program and data rather than using the separate buses of a Harvard design, which is less efficient for small control tasks. ARM devices also use an extensive amount of data storage, even for small control applications, because of the 32-bit wide bus structure.

A 32-bit data location stores four bytes, far wider than most of the values used in control applications. To overcome some of the code-size issues, the Thumb instruction set was designed.

This was a subset of the full instruction set, re-encoded in 16 bits, and it resulted in less code and data usage in applications where 32-bit number crunching was not needed. It opened engineers’ eyes to ARM and how the core could serve 8/16-bit applications as well as full 32-bit applications.

Secondly, ARM had a different business model. By licensing the core to manufacturing houses and product companies, ARM made its processor loved by purchasing departments around the world fighting their “true second source” battle. However, we all know that the same core from various vendors does not really mean you can swap a processor overnight.

ARM also did a few other important things to achieve this success. Firstly, it was successful in getting designed in as the main core in a few GSM phones, Nokia’s among them. It was hard for microprocessor companies to break into these huge-volume applications while trying to sell both a processor core and a new supply chain. ARM’s licensing model allowed the phone makers to select a new processor core and stay with their existing, well-proven supply chain.

Secondly, these high-volume customers were willing to bear the cost of developing the debuggers, real-time operating systems (RTOS) and other support tools that were needed for the ARM cores used in their Asics.

Several high-end tool vendors started to support ARM, and after a few years ARM developed low-cost tools for its processors as well. This is when ARM started to be adopted by the large engineering community around the world.

Today, almost 20 years later, ARM processors are dominant in embedded 32-bit microcontroller designs, and ARM is eating its way into 8/16-bit applications with its various cores.

Another very important contributor to the migration from 8-bit to 32-bit has been manufacturing process development over the years.

The gate count of the processor core is insignificant in today’s systems-on-chip because huge memories, peripherals and functional blocks sit on the same die. On a Cortex-M3 device with 1Mbyte of flash, the core accounts for only a few percent of the silicon cost.

Another interesting observation is that ARM processors, perhaps because of their early success in cell phones, are getting into tablets and the new age of handheld computers. This is probably one of the most fascinating parts of the ARM story. Who would have guessed that a small UK-based processor-core company would challenge Intel on its true home field?

So I take my hat off – ARM has done a tremendous job by establishing a fantastic range of cores, a powerful ecosystem and a great marketing machine.

Now, when so many companies are using the same technology, the term “differentiate or die” really becomes valid.

Only the companies with creative engineers who dare to think differently can achieve this. A risk-tolerant culture with curious people is the way to go.

Not all companies act that way.

Alf-Egil Bogen, CMO of Energy Micro and co-inventor of the AVR microcontroller, assesses the impact the ARM processor architecture has made on 32-bit microcontroller design.

www.energymicro.com

