India to build world’s fastest supercomputer

Sep 17, 2012


India wants to get ahead in the technological revolution. And just how will it manage this? By building a new supercomputer that aims to be 61 times faster than the IBM Sequoia, currently the world’s fastest.

According to reports, Telecom and IT Minister Kapil Sibal has written to Prime Minister Manmohan Singh sharing the roadmap to develop “petaflop and exaflop range of supercomputers” at an estimated cost of Rs 4,700 crore over 5 years.

“In his (Sibal’s) letter, he has said that C-DAC has developed a proposal with a roadmap to develop a petaflop and exaflop range of supercomputers in the country with an outlay of Rs 4,700 crore,” a government official said.

India’s attempt at making the world’s cheapest tablet, the Aakash, might not have been so successful, thanks to infighting among the manufacturers and government agencies, but the government is clearly not disheartened and has moved on to bigger and more powerful projects.

India wants to win the supercomputer race. Getty Images

So what will India have to beat as far as current supercomputers go? The world’s fastest supercomputer is the IBM Sequoia, which has a peak speed of 16.32 petaflops. The computer is based in Livermore, USA and consumes nearly 7,890 kW of electricity. According to the Top500 list, the Sequoia is also one of the most energy-efficient systems in the world.

But does India have a supercomputer in the current top ten? No. India’s highest-ranked supercomputer on the 2012 Top500 list is the one at the CSIR Centre for Mathematical Modelling and Computer Simulation in Bangalore, which sits at number 58.

As far as rivals go, China has two supercomputers in the top ten for 2012. Tianhe-1A at the National Supercomputing Center in Tianjin was the world’s fastest supercomputer in 2010. The other Chinese system on the 2012 list is Nebulae at the National Supercomputing Centre in Shenzhen, which is at number 10.

Floating-point operations per second (flops) measure the rate at which a computer performs numerical calculations. An exaflop (10^18 flops) is 1,000 times a petaflop (10^15 flops), and the Indian government claims that its five-year project will be enough to build a range of supercomputers with processing speeds in the petaflop and exaflop range.
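As a back-of-the-envelope illustration (not a figure from the article), here is how those two speed classes compare on the same fixed workload:

```python
# Illustrative arithmetic only: how long machines at each speed class
# would take to complete 10^18 floating-point operations.
PETAFLOPS = 1e15  # operations per second at one petaflop
EXAFLOPS = 1e18   # operations per second at one exaflop

total_ops = 1e18  # an example workload of a quintillion operations

seconds_at_petaflop = total_ops / PETAFLOPS  # 1,000 seconds (~17 minutes)
seconds_at_exaflop = total_ops / EXAFLOPS    # 1 second

print(f"At 1 petaflop/s: {seconds_at_petaflop:.0f} s")
print(f"At 1 exaflop/s:  {seconds_at_exaflop:.0f} s")
```

The thousand-fold gap between the two prefixes is why "petaflop and exaflop range" spans such an enormous ambition.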

Hopefully this one won’t be another failed IT project and India will finally get a supercomputer in the top ten.

Berkeley Laser Fires Pulses Hundreds of Times More Powerful Than All the World’s Electric Plants Combined






BELLA laser. Credit: Roy Kaltschmidt, Lawrence Berkeley National Laboratory

Blink and you’ll miss it. Don’t blink, and you’ll still miss it.

Imagine a device capable of delivering more power than all of the world’s electric plants. But this is not a prop for the next James Bond movie. A new laser at Lawrence Berkeley National Laboratory was put through its paces July 20, delivering pulses with a petawatt of power once per second. A petawatt is 10^15 watts, or 1,000,000,000,000,000 watts, about 400 times as much as the combined instantaneous output of all the world’s electric plants.

How is that even possible? Well, the pulses at the Berkeley Lab Laser Accelerator (BELLA) are both exceedingly powerful and exceedingly short. Each petawatt burst lasts just 40 femtoseconds, or 0.00000000000004 second. Since it fires just one brief pulse per second, the laser’s average power is only about 40 watts—the same as an incandescent bulb in a reading lamp.
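The power-versus-energy arithmetic in the paragraph above checks out, and is simple enough to verify directly (a sketch using the figures quoted in the text):

```python
# Back-of-the-envelope check of the BELLA power figures quoted in the text.
peak_power_w = 1e15        # 1 petawatt of peak power per pulse
pulse_duration_s = 40e-15  # each pulse lasts 40 femtoseconds
rep_rate_hz = 1            # one pulse fired per second

# Energy = power x time: a petawatt sustained for 40 fs carries 40 joules.
energy_per_pulse_j = peak_power_w * pulse_duration_s

# Average power = energy per pulse x pulses per second.
average_power_w = energy_per_pulse_j * rep_rate_hz

print(f"Energy per pulse: {energy_per_pulse_j:.0f} J")  # ~40 J
print(f"Average power:    {average_power_w:.0f} W")     # ~40 W
```

So each pulse carries only about as much energy as a light bulb uses in a second; the astonishing figure is how briefly that energy is delivered.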

BELLA’s laser is not the first to pack so much power—a laser at Lawrence Livermore National Laboratory, just an hour’s drive inland from Berkeley, reached 1.25 petawatts in the 1990s. And the University of Texas at Austin has its own high-power laser, which hit the 1.1-petawatt mark in 2008. But the Berkeley laser is the first to deliver petawatt pulses with such frequency, the lab says. At full power, for comparison, the Texas Petawatt Laser can fire one shot per hour.

Laser wakefield acceleration
Simulated image of laser accelerating electrons in a plasma. Credit: Cameron Geddes, LOASIS Program, at the National Energy Research Scientific Computing Center, NERSC

The Department of Energy plans to use the powerful laser to drive a very compact particle accelerator via a process called laser wakefield acceleration, boosting electrons to high energies for use in colliders or for imaging or medical applications. Electron beams are already in use to produce bright pulses of x-rays for high-speed imaging. An intense laser pulse can ionize the atoms in a gas, separating electrons from protons to produce a plasma. And laser-carved waves in the plasma [blue in image at right] sweep up electrons [green], accelerating them outward at nearly the speed of light.

BELLA director Wim Leemans says that the project’s first experiments will seek to accelerate beams of electrons to energies of 10 billion electron-volts (or 10 GeV) by firing the laser through a plasma-based apparatus about one meter long. The laser apparatus itself is quite a bit larger, filling a good-size room [see top photo]. For comparison, the recently repurposed Stanford Linear Accelerator Center produced electron beams of 50 GeV from an accelerator 3.2 kilometers in length.
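The striking part of that comparison is the accelerating gradient, the energy gained per meter of accelerator. A rough calculation from the figures in the text (not from the article itself) makes the contrast explicit:

```python
# Rough gradient comparison from the numbers quoted in the text.
# Gradient = beam energy / accelerator length.
bella_energy_gev = 10.0    # BELLA's target beam energy
bella_length_m = 1.0       # plasma stage about one meter long

slac_energy_gev = 50.0     # SLAC's electron beam energy
slac_length_m = 3200.0     # SLAC's 3.2-kilometer linac

bella_gradient = bella_energy_gev / bella_length_m  # 10 GeV per meter
slac_gradient = slac_energy_gev / slac_length_m     # ~0.0156 GeV per meter

print(f"BELLA target gradient: {bella_gradient:.2f} GeV/m")
print(f"SLAC gradient:         {slac_gradient:.4f} GeV/m")
print(f"Ratio: {bella_gradient / slac_gradient:.0f}x")  # 640x
```

By these figures, laser wakefield acceleration would pack roughly 640 times more energy gain into each meter, which is why a tabletop-scale plasma stage can aspire to energies that once required kilometers of hardware.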

About the Author: John Matson is an associate editor at Scientific American focusing on space, physics and mathematics. Follow on Twitter @jmtsn.

Researchers Consider Graphene as a Cure for Desalination Woes




GOING WITH THE FLOW: Hydrogenated (a) and hydroxylated (b) graphene pores, and (c) side view of the computational system described in this research. Image: Courtesy of the Massachusetts Institute of Technology (M.I.T.)


The earth harbors about 1.4 billion cubic kilometers of water. Unfortunately, the vast majority of that water comes from the sea and is not potable unless treated by expensive, energy-hungry desalination plants. Those problems stem largely from inefficiency in the way salt ions are separated from water molecules, and the solution, says a team of materials scientists from the Massachusetts Institute of Technology, lies in fundamentally revising that process.

The predominant desalination method today, reverse osmosis (RO), relies on polymer-based membranes to remove salt and requires great pressure to push water through a semipermeable film. The more pressure applied, the higher the cost. The M.I.T. researchers, led by Jeffrey Grossman and David Cohen-Tanugi, propose that films made of graphene could filter out salt without inhibiting the water flow as much. Graphene, a superstrong sheet of carbon that is only one atom thick, has mostly been seen as a material for improving electronics and optical communications.

Reverse osmosis requires less energy than other desalination approaches—such as thermal distillation—but graphene membranes containing nanoscale pores that are more permeable than the polymers currently used would further cut energy requirements, the researchers reported online last month in Nano Letters.

The idea is to discriminate between water molecules and salt ions based on size. "Reverse osmosis uses size exclusion, except it excludes everything," says Grossman, an associate professor of power engineering.

A graphene membrane would provide well-defined channels that allow water molecules to flow through at lower pressures while blocking salt ions, Grossman says.

Using software simulations, the M.I.T. researchers experimented with different pore sizes to desalinate seawater with a salt concentration of 72 grams per liter, about twice the salinity normally found in the ocean. They found that, theoretically at least, pores 0.7 to 0.9 nanometer in diameter were most effective at passing water molecules while blocking sodium ions. "That's the sweet spot," Grossman says. "If it's bigger, salt's going to flow through. If it's smaller, nothing flows through."

Grossman and his team are trying to determine whether chemical reactions might be used to tweak desalination performance. The researchers programmed their digital graphene pores to be coated with either hydrophobic (water-repelling) or hydrophilic (water-loving) atoms. The former slowed the flow but cut down on the salt ions passing through, while the latter allowed faster flow but blocked fewer salt ions. The type of coating may ultimately depend on conditions at a given facility. Still, the scientists report, simulations indicate that graphene nanopores could reject salt ions with a water permeability two to three orders of magnitude higher than that of RO membranes.

Of course, working with graphene in reality is more challenging than filtering pixelated salt from digital water molecules on a computer. For starters, although chemical etching and ion beams can be used to create holes in graphene, it is difficult to produce holes of a specific size in an even configuration, Grossman acknowledges. Nor does graphene eliminate the quandary of how much leftover brine can be safely returned to the ocean without hurting underwater habitats. Toxicity could also be an important issue, he says, "although there are no real answers right now in terms of [graphene's] potential impact on [the safety of] drinking water."

Grossman does not know when graphene-based desalination might be ready for commercial use. He and his team, though, continue to run simulations and have begun testing actual membranes in the lab to study flow rates and salinity.

Demand for potable water is expected to escalate worldwide in the coming years. Grossman says the key to meeting that need is not necessarily tweaking existing technology. "We looked around at who's working on desalination in the scientific community, and it's mostly mechanical engineers working at the systems level," he says. "Little is being done on the system design side using basic science and working from the bottom up."