Talking chip design at the Electronics Weekly conference

Richard Wilson
PC or not PC, that’s the information appliance
by Trung Nguyen, a digital video specialist at National Semiconductor
A common element in all information appliances is the microprocessor. However, unlike the PC market, where megahertz is considered everything, raw clock speed is not in itself important in the information appliance market.
Consumers measure performance by yardsticks other than raw processing power, such as the quality of the video, the graphics, user friendliness in terms of the on-screen display, the electronic programme guide and the remote control, as well as the quality of the audio. The whole processing unit, that is the power of the CPU combined with the architecture of the system, is what becomes important.
The processor becomes more like the power supply, taken for granted to be powerful enough to do its job without the user having to worry about whether it has enough horsepower to perform its function effectively. The processing engine becomes hidden inside the appliance.
The success of the Internet has been driven by the low-cost PC, and the PC’s success as a home entertainment and gaming platform is due mainly to its superb graphics and picture quality compared with the TV, not to mention the vast amount of software available for it.
Despite the adverse publicity surrounding the PC platform in terms of cost and its short obsolescence period, it is still an open platform with many hardware suppliers. Software vendors can write software knowing that it will work across the majority of machines, assured of a vast market.
The PC does have its drawbacks. We all know that it is not always user-friendly or idiot-proof, requiring more technical know-how to operate than a traditional TV or video.
Although technically it is as advanced as any consumer product on the market, the PC’s downfall is this shortcoming in ease of use, which means that, as it stands, it will never make it into the living room.
A convergent platform will combine the best features of a PC, such as the graphics, the amount of software available and low-cost hard disk or DVD storage, with the best features of a TV, such as ease of use.
Such a convergent platform will be the information appliance. An example of an information appliance of the future is something we call the WebPAD conceptual demonstration unit.
This is a wireless Internet browser with its own 800 x 600 x 16 pixel flat panel display and touch-sensitive stylus. It can be used for Web surfing in the comfort of an armchair.
Connection to the Internet is made via a radio basestation, which communicates with the handheld unit using DECT, the European digital cordless phone standard. Connection to the network at the basestation may be via normal modem, ISDN, Ethernet, xDSL or cable modem.
A certain number of web pages can be stored using the on-board Flash memory. Audio is supported with built-in speakers and microphones, and two USB ports are provided for an optional keyboard or mouse.
Today’s digital TV set-top boxes, for example, are characterised, and some would say severely limited, by the CPU and software being used. The hardware is designed for a particular service provider and conditional access system, and it will not work with a different service provider’s services or broadcasts.
Standardisation work such as that currently being undertaken by the DVB (Digital Video Broadcasting) project on a common DVB-Java API will mean that, in future, all set-top box hardware platforms will be able to receive transmissions from various service providers or broadcasters. This will be so regardless of the hardware used, provided a minimum level of performance is achieved.
Software can be downloaded in the form of Java applets, which will run on any CPU platform, further diminishing the importance of which type of CPU is used. For semiconductor suppliers, the challenge will be to provide a processing platform with the right level of integration, including video, graphics and audio processing, at the most cost-effective combination of price, performance and, not to forget, flexibility.
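As a rough illustration of that portability argument, the sketch below (not from the article) shows how a receiver might fetch and run service-provider code at runtime using a plain Java class loader. The URL and class name are hypothetical placeholders, and a real set-top box would go through the DVB-Java API rather than a bare URLClassLoader; the point is simply that the same compiled bytecode runs unchanged whatever CPU sits underneath.

    import java.net.URL;
    import java.net.URLClassLoader;

    // Minimal sketch: downloaded Java bytecode runs on any CPU that hosts a JVM.
    public class DownloadedServiceLoader {
        public static void main(String[] args) throws Exception {
            // Hypothetical location of an applet supplied by the service provider.
            URL serviceCode = new URL("http://broadcaster.example/epg-applet.jar");

            // Load the bytecode at runtime; the same jar works on MIPS, ARM or x86
            // set-top boxes because the virtual machine hides the CPU from the code.
            try (URLClassLoader loader = new URLClassLoader(new URL[] { serviceCode })) {
                Class<?> applet = loader.loadClass("example.EpgApplet"); // hypothetical class
                Runnable service = (Runnable) applet.getDeclaredConstructor().newInstance();
                service.run(); // hand control to the downloaded application
            }
        }
    }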
Forward error correction improved by French style
by Anthony Simon from Comatlas, a subsidiary of VLSI Technology
Six years ago, two researchers at France Telecom’s research centre in Brittany invented a novel mathematical coding scheme which some communications designers now believe will revolutionise forward error correction techniques.
Their paper appeared at a time when the industry, as well as most mathematicians and theorists in the field of information theory, had concluded that there was little more to be done in the area of channel coding and forward error correction.

In fact, the information theorists had turned their attention to subjects like data compression and other means of getting more bandwidth efficiency through systems. They assumed that the existing forward error correction techniques were the best that could be done. The new scheme, dubbed the turbo code, sparked new interest in this area of research.
Turbo codes are a type of forward error correction, the unsung hero of the information age. Binary information can be corrupted in transmission.
Electromagnetic disturbances and noise such as lightning, reflections, echoes and even sunspots can degrade or even knock out the bits that make up the digital message.
Forward error correction works by sending extra bits of information attached to the original information. These extra bits are then used at the receiver side to detect and/or correct errors and regenerate the information in the message. The most popular forward error correction techniques are Reed-Solomon codes and convolutional codes, the latter decoded with the Viterbi algorithm.
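To make the principle concrete, here is a minimal sketch of one of the simplest such schemes, a Hamming(7,4) code, in which four data bits travel with three parity bits and the receiver can locate and flip any single corrupted bit. It illustrates the idea only; the Reed-Solomon and convolutional codes used in real broadcast systems are far more powerful.

    // Hamming(7,4): four data bits plus three parity bits; the receiver
    // recomputes the parity checks and the resulting syndrome points at
    // the position of a single corrupted bit, which is then flipped.
    public class Hamming74 {

        // Encode data bits d[1..4] into codeword bits c[1..7]
        // (index 0 is unused so positions match the usual 1-based layout).
        static int[] encode(int[] d) {
            int[] c = new int[8];
            c[3] = d[1]; c[5] = d[2]; c[6] = d[3]; c[7] = d[4]; // data bits
            c[1] = c[3] ^ c[5] ^ c[7];                          // parity over positions 1,3,5,7
            c[2] = c[3] ^ c[6] ^ c[7];                          // parity over positions 2,3,6,7
            c[4] = c[5] ^ c[6] ^ c[7];                          // parity over positions 4,5,6,7
            return c;
        }

        // Recompute the checks at the receiver and correct a single error.
        static int[] decode(int[] r) {
            int s1 = r[1] ^ r[3] ^ r[5] ^ r[7];
            int s2 = r[2] ^ r[3] ^ r[6] ^ r[7];
            int s3 = r[4] ^ r[5] ^ r[6] ^ r[7];
            int errorPos = s1 + 2 * s2 + 4 * s3;                // 0 means no error detected
            if (errorPos != 0) r[errorPos] ^= 1;                // flip the corrupted bit
            return new int[] { 0, r[3], r[5], r[6], r[7] };     // recovered data bits d[1..4]
        }

        public static void main(String[] args) {
            int[] data = { 0, 1, 0, 1, 1 };                     // d1..d4 = 1,0,1,1 (index 0 unused)
            int[] sent = encode(data);
            sent[6] ^= 1;                                       // simulate noise flipping one bit
            int[] recovered = decode(sent);
            System.out.println("recovered d1..d4 = "
                    + recovered[1] + recovered[2] + recovered[3] + recovered[4]); // prints 1011
        }
    }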
Forward error correction development started with Claude Shannon. Not only was he the first to recognise the applicability of Boolean algebra to digital systems, but he also proposed a mathematical definition of information in his 1948 paper, A Mathematical Theory of Communication.
Accordingly, the theoretical limit of how much information can be transmitted through a given channel is called the Shannon limit. In my view, turbo code is the first code that can achieve near-Shannon-limit decoding without impractical complexity.
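For readers who want the number behind the name: for the common additive white Gaussian noise channel model the limit is C = B log2(1 + S/N). The small routine below, which is purely illustrative and not taken from the article, evaluates it for an assumed 6 MHz channel at 30 dB signal-to-noise ratio.

    // Illustrative Shannon-limit calculation for an AWGN channel:
    // capacity C = B * log2(1 + S/N), in bits per second.
    public class ShannonLimit {

        static double capacity(double bandwidthHz, double snrLinear) {
            return bandwidthHz * (Math.log(1.0 + snrLinear) / Math.log(2.0));
        }

        public static void main(String[] args) {
            double bandwidthHz = 6e6;                        // assumed 6 MHz broadcast channel
            double snrDb = 30.0;                             // assumed 30 dB signal-to-noise ratio
            double snrLinear = Math.pow(10.0, snrDb / 10.0);
            double bitsPerSecond = capacity(bandwidthHz, snrLinear);
            // Roughly 59.8 Mbit/s: no code can do better on this channel, and turbo
            // codes are notable for getting within a fraction of a decibel of the bound.
            System.out.printf("Shannon limit: %.1f Mbit/s%n", bitsPerSecond / 1e6);
        }
    }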
One advantage of the turbo code is that performance can always be improved by cascading decoders on the receiver side (called iterative decoding) without making adjustments on the encoder side. This cannot be done on classical Viterbi or Reed-Solomon systems.
In practice, turbo codes should allow you to transmit more TV signals through a single broadcast channel. They should also improve the capabilities of AM radio.
Significantly, similar error correction codes are now being adopted as part of the design of the next generation mobile communications standard in Europe, known as UMTS.
The efficiency of the coding technique is particularly important for wireless applications. In mobile phone design, increased error correction performance cannot come at the cost of higher power requirements. The turbo code gives you higher performance and longer battery life at the same time.

