Archive for the ‘Software’ Category

Got parallel programming skills?

Thursday, August 2nd, 2007

We all know the glamour of having the fastest HPC machine, or the most nodes, or the fattest pipes. But what ends up lost in the hoopla of all the hardware hype is the fact that someone has to write the code for this stuff to be even marginally useful for handling enormous computations. Herein lies one of the problems with high-performance scientific computing: not enough skilled programmers. Simply put, software development isn't keeping pace with hardware development. This has been a problem for some time and still is. Writing code and programming applications (from middleware to debuggers) that enable a large, data-intensive computational problem to be broken into parts that are solved individually and then reassembled into a single solution is non-trivial. Though a little dated now, the February 2005 CTWatch Quarterly article "The NRC Report on the Future of Supercomputing," by Susan Graham and Marc Snir of UC Berkeley and the University of Illinois at Urbana-Champaign, respectively, touches on this still-relevant problem. Gregory Wilson, a CS professor, gets a little more specific in "Where's the Real Bottleneck in Scientific Computing?" from American Scientist. A more recent discussion of the lag in software development can be found in Doug Post's keynote talk "The Opportunities and Challenges for Computational Science and Engineering" from the inauguration of the new Virtual Institute - High Productivity Supercomputing (VI-HPS).
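To make the "broken into parts, solved individually, and reassembled" idea concrete, here is a minimal MPI sketch in C (a toy sum of my own, not taken from any of the articles above): the root process scatters an array, every process works on its own chunk, and a reduction reassembles the partial answers into one result.

/* Toy illustration of problem decomposition with MPI: scatter, compute locally, reduce. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    const int chunk = 1000;               /* elements per process (illustrative) */
    const int n = chunk * nprocs;         /* total problem size */

    double *data = NULL;
    if (rank == 0) {                      /* the root sets up the whole problem */
        data = malloc(n * sizeof(double));
        for (int i = 0; i < n; i++)
            data[i] = 1.0;
    }

    /* Break the problem into parts: each process receives its own chunk. */
    double *mine = malloc(chunk * sizeof(double));
    MPI_Scatter(data, chunk, MPI_DOUBLE, mine, chunk, MPI_DOUBLE, 0, MPI_COMM_WORLD);

    /* Solve the local piece independently. */
    double local_sum = 0.0;
    for (int i = 0; i < chunk; i++)
        local_sum += mine[i];

    /* Reassemble the partial results into a single solution on the root. */
    double total = 0.0;
    MPI_Reduce(&local_sum, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("total = %.0f (expected %d)\n", total, n);

    free(mine);
    free(data);
    MPI_Finalize();
    return 0;
}

Build it with mpicc and launch it with, say, mpiexec -n 8; this skeleton is the easy part. The hard parts the articles above worry about are the data dependencies, load balancing, and debugging that real scientific codes pile on top of it.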

IT in the future

Monday, January 9th, 2006

Whether or not you’re a fan of Ray Kurzweil, he has made some interesting predictions for the future of IT in this article from Computerworld. As a self-proclaimed futurist, Kurzweil is no stranger to making bold predictions. In this article he states that

[In the late 2040s], one cubic inch of nanotube circuitry will be 100 million times more powerful than the human brain.

In a 1999 article titled “When Machines Think” published by Maclean’s, Kurzweil said

By 2019, a $1,000 computer will match the processing power of the human brain–about 20-million-billion calculations per second.

You do the math. Accurate or not, these predictions are pretty telling about the evolution of computing power.
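Taking the two figures at face value, the back-of-the-envelope arithmetic goes like this (a quick illustrative computation, nothing more; the numbers come straight from the quotes above):

#include <stdio.h>

int main(void)
{
    /* "about 20-million-billion calculations per second" for the human brain */
    double brain_calcs_per_sec = 20e15;
    /* "100 million times more powerful than the human brain" */
    double nanotube_factor = 100e6;

    /* Implied capability of one cubic inch of nanotube circuitry in the late 2040s. */
    printf("%.0e calculations per second\n", brain_calcs_per_sec * nanotube_factor);
    return 0;
}

That works out to roughly 2 x 10^24 calculations per second in a single cubic inch, which is why the 2019 prediction looks almost modest by comparison.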

A meeting of the minds?

Monday, December 19th, 2005

Software research collaboration between universities and private business may soon become easier, depending on the results of the new intellectual property model recently announced by 11 academic and commercial partners. The New York Times also has a piece about this new partnership, which includes HP, IBM, Stanford and Georgia Tech among others.

eBay integration of Skype

Wednesday, September 28th, 2005

Here’s one for the VoIP crowd and eBay fans. It was reported earlier this month that eBay purchased the Internet communications company Skype for $2.6 billion in cash and stock. How eBay intends to integrate Skype’s technology into its operations isn’t entirely clear, beyond the recognition that eBay’s sole reliance on e-mail as the primary form of communication between parties has become rather archaic and slow. VoIP has potential and is getting more and more press lately, but it seems that the marriage of eBay with Skype’s technology, which includes peer-to-peer instant messaging and file transfer capabilities, may have some problems.

Microsoft tackling cluster computing

Sunday, September 18th, 2005

Microsoft has entered the cluster computing market with hopes of grabbing a share of the market currently led by Linux. The proliferation of clusters for heavy-duty computing continues across many business segments, from industry to government, as evidenced by the latest edition of the Top500, which shows that cluster systems make up 60% of the list.

According to this article from Grid Computing Planet, Microsoft’s initial software entry works on clusters of up to 128 machines, and Microsoft intends to integrate heterogeneous applications on the cluster better than Linux does, as well as to offer better support. An overview of their cluster solution can be found here.

The new software will include open source MPI middleware. That’s right - open source. More info about Microsoft’s decision to incorporate this into the Compute Cluster Solution can be found in this eWeek piece.

Need to save a little energy?

Tuesday, September 13th, 2005

Not everyone has a supercomputer lying around, especially an idle one, but if you have a high-performance system and are looking to save some energy without losing much performance, Los Alamos National Laboratory might have the solution. Using EnergyFit 1.0, LANL is claiming a potential 10-25% reduction in system energy consumption. According to this piece from LinuxHPC.org, EnergyFit is

a transparent software layer based on a novel algorithm that reduces the power and energy consumption of high-performance computing systems.

Developed by Chung-Hsing Hsu and Wu-chun Feng at LANL, this software is another approach to addressing the enormous energy consumption and heat dissipation of multi-processor architectures. Reducing the amount of energy used, and the consequent heat produced, by big machines leads, among other things, to an increase in the MTBF (mean time between failures) of processors and a boost in overall system reliability.
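Neither piece spells out how EnergyFit works internally, but the broad family of techniques it belongs to, dynamic voltage and frequency scaling (DVFS), is easy to illustrate. Below is a small, hypothetical sketch in C of a user-level loop that slows the CPU down when it is mostly idle or stalled and speeds it back up when it is busy. The cpufreq paths, the two frequencies, and the 50% utilization threshold are assumptions for illustration only, not anything taken from EnergyFit; writing to scaling_setspeed requires Linux’s "userspace" cpufreq governor and root privileges.

/* Hypothetical DVFS sketch: sample CPU utilization from /proc/stat once per
 * second and pick a clock frequency accordingly. Not EnergyFit's algorithm. */
#include <stdio.h>
#include <unistd.h>

/* Aggregate busy and total jiffies from the first ("cpu") line of /proc/stat. */
static int read_cpu_times(unsigned long long *busy, unsigned long long *total)
{
    unsigned long long user, nice, sys, idle, iowait, irq, softirq;
    FILE *f = fopen("/proc/stat", "r");
    if (!f)
        return -1;
    if (fscanf(f, "cpu %llu %llu %llu %llu %llu %llu %llu",
               &user, &nice, &sys, &idle, &iowait, &irq, &softirq) != 7) {
        fclose(f);
        return -1;
    }
    fclose(f);
    *busy = user + nice + sys + irq + softirq;
    *total = *busy + idle + iowait;
    return 0;
}

/* Request a new frequency (in kHz) for cpu0 through the cpufreq interface. */
static void set_cpu_khz(long khz)
{
    FILE *f = fopen("/sys/devices/system/cpu/cpu0/cpufreq/scaling_setspeed", "w");
    if (f) {
        fprintf(f, "%ld\n", khz);
        fclose(f);
    }
}

int main(void)
{
    const long high_khz = 2400000, low_khz = 1200000;  /* illustrative frequencies */
    unsigned long long busy0, total0, busy1, total1;

    if (read_cpu_times(&busy0, &total0) != 0)
        return 1;

    for (;;) {
        sleep(1);
        if (read_cpu_times(&busy1, &total1) != 0)
            break;

        double util = (double)(busy1 - busy0) / (double)(total1 - total0 + 1);

        /* Mostly waiting: run slow and save power. Busy: run at full speed. */
        set_cpu_khz(util < 0.5 ? low_khz : high_khz);

        busy0 = busy1;
        total0 = total1;
    }
    return 0;
}

Presumably the value a tool like EnergyFit adds is in deciding when and how far to scale down without giving up much performance, which is where the "novel algorithm" in the quote above comes in.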

More information about power consumption and savings on large systems can be found in this Computerworld article.
