Making software follow in hardware's footsteps

Date: Saturday, March 31, 2007

The fast-changing world of technology presents developers with a perennial problem: that of tying their software engineering skills to the overall hardware roadmap. Gone are the days when computer science was considered diametrically opposed to electrical engineering. The road ahead calls for a marriage of convenience between hardware and software. That alone can give individual techies, and even organizations, a decisive head start.

Let me illustrate this with an example from what is happening around us right now. Suddenly, the world seems to have woken up to multi-core processors, and hardware companies are on an accelerated path to develop them, lest they fall behind the competition.

According to market researchers at IDC, multi-core technology may be one of the most significant industry developments of the past 40 years. Given time, it will likely translate into a paradigm shift for hardware, ushering in a buzzword already doing the rounds in the industry: parallelism.

What this means, in effect, for software developers is that they must understand the intricacies of parallel programming. Earlier, they wrote code that executed serially, given that there was only one core; parallelism requires writing software so that it can run various tasks in parallel across two (and, later, multiple) cores.
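To make the contrast concrete, here is a minimal, illustrative sketch in modern C++ (using the standard std::thread facility; the two workload functions are hypothetical stand-ins, not any particular product's API). The same two independent tasks are run first one after the other, and then on two threads that the operating system can schedule on separate cores.

    #include <iostream>
    #include <thread>

    // Two independent, hypothetical workloads that could occupy separate cores.
    double sum_of_squares(long long n) {
        double total = 0.0;
        for (long long i = 1; i <= n; ++i) total += static_cast<double>(i) * i;
        return total;
    }

    double sum_of_cubes(long long n) {
        double total = 0.0;
        for (long long i = 1; i <= n; ++i) total += static_cast<double>(i) * i * i;
        return total;
    }

    int main() {
        const long long n = 10000000;

        // Serial execution: the second task cannot start until the first finishes.
        double squares = sum_of_squares(n);
        double cubes   = sum_of_cubes(n);

        // Parallel execution: each task gets its own thread, so the two can
        // proceed simultaneously on a dual-core machine.
        double p_squares = 0.0, p_cubes = 0.0;
        std::thread t1([&] { p_squares = sum_of_squares(n); });
        std::thread t2([&] { p_cubes   = sum_of_cubes(n);   });
        t1.join();
        t2.join();

        std::cout << "serial:   " << squares   << ' ' << cubes   << '\n'
                  << "parallel: " << p_squares << ' ' << p_cubes << '\n';
    }

The parallel version can roughly halve the wall-clock time only because the two tasks are genuinely independent of each other; that independence is exactly what developers must learn to find in their own applications.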

To do this, developers need to identify the natural ‘modes of parallelism’ in an application, determine which problems lend themselves to parallel decomposition, and work out how functions might be refactored so that different sections can run simultaneously. Following that, they will need to build a framework structured around that understanding, test it, and optimize it.
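As one illustration of that kind of refactoring (a hypothetical sketch, not any vendor's toolkit), the code below decomposes a serial summation over an array into independent chunks, hands each chunk to its own thread, and combines the partial results afterwards. Each thread touches a disjoint slice of the data, so no locking is needed.

    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Serial version: a single pass over the whole array.
    double sum_serial(const std::vector<double>& data) {
        return std::accumulate(data.begin(), data.end(), 0.0);
    }

    // Parallel version: the array is decomposed into roughly equal chunks,
    // each summed by its own thread; the partial sums are combined at the end.
    double sum_parallel(const std::vector<double>& data, unsigned num_threads) {
        std::vector<double> partial(num_threads, 0.0);
        std::vector<std::thread> workers;
        const std::size_t chunk = data.size() / num_threads;

        for (unsigned t = 0; t < num_threads; ++t) {
            const std::size_t begin = t * chunk;
            const std::size_t end   = (t + 1 == num_threads) ? data.size() : begin + chunk;
            // Each worker writes only to its own slot in 'partial' and reads a
            // disjoint slice of 'data', so no synchronization is required.
            workers.emplace_back([&, t, begin, end] {
                partial[t] = std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
            });
        }
        for (auto& w : workers) w.join();
        return std::accumulate(partial.begin(), partial.end(), 0.0);
    }

    int main() {
        std::vector<double> data(1000000, 1.0);
        std::cout << "serial:   " << sum_serial(data)      << '\n';
        std::cout << "parallel: " << sum_parallel(data, 4) << '\n';
    }

Testing such a decomposition means verifying that the parallel result matches the serial one; optimizing it means tuning the chunk size and thread count to the number of cores actually available.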

For their part, organizations must adjust their software-writing processes to accommodate the development of parallel software. One roadblock organizations face in situations like this is not knowing where to start. If a precedent is needed, companies can look back to the time when software development migrated to 64-bit architectures from the erstwhile 32-bit ones.

As such, companies must identify tools that will enable them to factor changes into the process without affecting its efficacy. This, of course, should be backed by a willingness to change the process in the first place.

It is also essential that these shifts in hardware, and consequently in software development, be reflected in the curricula of technology schools, the logic being that when youngsters graduate from these schools, they should be ready for the industry.

This calls, on the one hand, for a certain assertiveness from students, who should press their colleges to provide training in software development that is in consonance with the latest shifts in hardware. On the other hand, leading hardware companies will have to make courses and tools available to make the transition easy and seamless.

All the while, one must keep in mind that competency in writing software for the older hardware order (for example, single-core architectures) must not be neglected, since at any given time legacy systems form a major chunk of the existing infrastructure.


The author is Director of Sales and Business Development, Developer Products Division, Intel. He can be reached at phil.de.la.zerda@intel.com.