Sunday, January 08, 2006
Thoughts on Sun, HP and the state of the computer hardware industry
Industry guru Sameer Tyagi of Sun commented in his blog on an attack by HP on the Sun Niagara chip. While the conversation was interesting, I believe a bigger discussion needs to occur...

HP states the following:
- Sun's Niagara chip is made up of individual cores that have much slower single thread performance when compared to the higher performing cores of the Intel® Xeon™, Itanium®, AMD Opteron® or even current UltraSPARC processors
The real problem that should be discussed is not chip benchmarking but customers' perceptions of chip performance. Those of us technologists who pay attention to the marketplace have realized that the megahertz game is, for the most part, coming to an end. There will never be 1 Billion GHz Pentium CPUs. Pretty much every hardware vendor is instead moving toward putting multiple CPU cores on a die. In other words, all hardware vendors are effectively saying that scaling applications through multithreading is the future, rather than squeezing ever more performance out of a single thread.
The disconnect is whether vendors expect the world to learn this fact via osmosis, or whether they will step up and help us write better multithreaded applications.
While hardware vendors may "partner" with software vendors to make commodity J2EE application servers and other products in this family work, most hardware vendors still have their heads in the sand and haven't realized that the vast majority of software written on the planet is not commercial off-the-shelf but custom code written within large enterprises. If you don't make an effort to teach large enterprises how to write multithreaded applications that take advantage of your hardware, then all customers may end up disappointed.
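To make that concrete, here is a minimal sketch of the kind of pattern vendors could be teaching. It is my own illustration in plain Java 5 (java.util.concurrent), not anything published by Sun or HP: size a worker pool to the hardware instead of hard-coding a thread count, so the same code scales from a 2-way box to a 32-thread Niagara.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class WorkerPoolSketch {
    public static void main(String[] args) {
        // On a chip like Niagara, availableProcessors() typically
        // reports hardware threads rather than sockets, so the pool
        // grows with the machine and nothing needs re-architecting.
        int threads = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(threads);

        for (int i = 0; i < 100; i++) {
            final int requestId = i; // stand-in for real incoming work
            pool.execute(new Runnable() {
                public void run() {
                    System.out.println("Handled request " + requestId
                            + " on " + Thread.currentThread().getName());
                }
            });
        }
        pool.shutdown(); // let queued work drain, accept no more
    }
}
```

Nothing here is exotic; the idiom ships with Java 5, which is exactly the kind of guidance a hardware vendor could be publishing for its enterprise customers.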

Likewise, HP offers another interesting quote:
- To fully exploit Sun's Niagara systems, developers may have to change how applications are architected.[6] Sun has stated that Niagara changes the minimum application scalability demands from 1-4 threads to 32 concurrent threads
Many CIOs who practice Management by Magazine will of course run in fear when they hear such a phrase, but I believe reality is something different. If large enterprises are adopting J2EE and/or .NET (this seems to be the direction in the vast majority of IT shops), then the problem may be as simple as changing configuration values.
Let's say you are running BEA WebLogic on Niagara and you want to increase the number of threads of execution. You wouldn't re-architect anything. Instead you would simply go to the console, change the thread count on the execute queue, and restart. That's it.
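For illustration, in WebLogic 8.1 that setting lives in the domain's config.xml as an ExecuteQueue element. The excerpt below is a hypothetical sketch, not a tested configuration; the server name and thread count are made up for the example:

```xml
<!-- Hypothetical excerpt from a WebLogic 8.1 domain's config.xml.
     Server name and values are illustrative only. -->
<Server Name="myserver" ListenPort="7001">
  <!-- Raise the default execute queue's thread count to match
       Niagara's 32 hardware threads, then restart the server. -->
  <ExecuteQueue Name="weblogic.kernel.Default" ThreadCount="32"/>
</Server>
```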

Maybe instead of folks at Sun and HP throwing shots at each other, they could start focusing on their mutual customers by putting their heads together and creating a new form of "reference architecture" that shows best practices for creating multithreaded applications, so that we customers not only take advantage of their hardware but actually have a strong desire to buy more of it?

The one thing I would say to all vendors in this space is that customers do pay attention to the conversation and may start asking their own questions. For example, if you look at the product offering from Azul Systems, you may note that they have managed to put 24 cores on a chip. We may ask ourselves the following questions:
- If they can do 24 cores, why can't you? Does it have anything to do with protecting a product line? What should I think about a vendor that is holding back technology innovation from us, its customers?
- Instead of talking to me about virtualization, maybe you could tell me how your chip can become "aware" of the languages I write enterprise applications in and provide optimization not achievable with general-purpose CPUs?
- Are you participating in the open source community so that I can also save money on software acquisition and easily download code that takes advantage of your hardware offering? Surely you don't think I will run only your open source stack and never want to include components from other open source projects? Maybe you can tell me how you contribute to open source projects that you don't control?
