Are you willing to sacrifice some privacy for personal convenience or benefit? As digital technologies continue to evolve, creating, storing, moving, and sharing digital data and information more efficiently and more securely have become the foci of innovation. The dilemma is that, as data and information become easier to manipulate, they inherently also become easier to access, both by you and by others. Invariably, this is as much a social problem as a technological one, so some argue that such innovation merely exacerbates existing social ills. Such is the case with RFID (radio frequency identification), used for tracking everything from products to pets to humans, and its potential socio-economic implications for medical care, national defense and security, and even religion. Want your medical information or other personal information with you all the time? This CNN article from yesterday provides good context on the issue.
With a nod to the newly formed Virtual Institute - High Productivity Supercomputing (VI-HPS), we thought it might be useful to provide a link to the Keynote talk given at the institute’s inauguration a couple of weeks ago. Doug Post, the chief scientist for the DoD High Performance Computing Modernization Program (HPCMP), gave a very informative talk about current and future challenges and opportunities in computational science and engineering. Loaded with statistics on the challenges and bottlenecks of code development and deployment, the talk centers mostly on HPCMP work, but much of it applies to HPC efforts across the US national research agenda.
Ok, well, maybe it’s not the masses, but the upgraded Cray XT4 at ORNL (aptly named Jaguar) is available for use and is plenty fast, ranking as the most powerful supercomputer in the United States that is open for scientific computing. Coming in at number two on the most recent Top500 list of supercomputers after sitting at number 13 a year ago, such a move, according to Dr. Jack Dongarra, one of the list’s maintainers, “…reflects the fact that more and more researchers are turning to high-performance computing as a method to address pressing scientific questions” (a nod to the Knoxville News Sentinel for the quote). As part of ORNL’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, Jaguar is just one of several supercomputers that will serve organizations such as Corning, Boeing, DreamWorks Animation, and Procter & Gamble, as well as academic institutions such as Auburn University and the University of Michigan.
With the announcement of a $24 million award to create the Community Cyberinfrastructure for Advanced Marine Microbial Ecology Research and Analysis (CAMERA), the Gordon and Betty Moore Foundation is establishing a new computational infrastructure to explore marine microbial genome sequencing. The effort establishes the UCSD - Venter Institute partnership, which is charged with CAMERA development. CAMERA will leverage the already established TeraGrid HPC facility and link the two partners via the OptIPuter model of high-performance computational collaboration.
The OptIPuter was described briefly back in the Introduction of the May 2005 issue of the CTWatch Quarterly. As a first real test case for this collaboratory model, it will be interesting to see how the effort progresses. We’ll check in with Larry Smarr, the PI on both CAMERA and OptIPuter, later in the year. More info on the OptIPuter project can be found in this article from R&D Magazine.
Standards are good. But too many are bad. Such is the case with open standards for the Grid. In this article from Grid Computing Planet, the proliferation of standards is cited as one reason for the slower adoption of Grids, or at least the slower migration from academia to the business enterprise. With the spread of web services, grid management tools are becoming more important; the article also touches on the lack of consensus around Globus as the way to go in Grid middleware.
The Australian Research Council (ARC), an Australian equivalent of the NSF, recently awarded more than $3.5 million over the next couple of years for grid computing technologies aimed at increasing medical research collaboration. One key beneficiary of the grant, Dr. Andrew Lonie of the University of Melbourne, will be using his share of the funds to work on the international Physiome Project, the successor to the Human Genome effort, whose goal is to
describe the human organism quantitatively, so that one can understand its physiology and pathophysiology, and to use this understanding to improve human health.
As part of this new, ongoing effort, Dr. Lonie’s research centers on modeling and simulation of the human kidney via the Kidney Simulation Project.
Continued funding for grid technologies and the maturation of high-speed networking will boost opportunities for international research collaboration and engagement. The result will be the ability to link the world’s foremost authorities in medical science to massive amounts of data, which will ultimately lead to quicker solutions to, and better treatment for, both local and global health issues.