From big bang to big data
How's this for a challenge? Create a computer based on technology that does not yet exist, then use it to analyze and store more data than flows across the entire Internet each day. And your goal is simply to answer the biggest scientific questions that have puzzled humankind for hundreds of years.
According to the Big Bang theory, the Universe was born 13.75 billion years ago. Not only did this soon thereafter give birth to the stars and planets, but on a much smaller scale it gave birth to particles, including protons and electrons -- the essential building blocks of life.
While this scientific theory is widely accepted and well tested, little is known about what happened in the first 0.00000000001 seconds after the Big Bang, or between 400,000 and 800 million years after it occurred. Astronomers believe that if we could better understand these unique periods of time, we could unlock some of the mysteries of the Universe, including how galaxies formed, how they evolve and the ultimate science fiction question: "Are we alone?"
IBM researchers, working with the Netherlands Institute for Radio Astronomy, are investigating whether big data and emerging exascale computing technologies can be used to find some of the answers by analyzing data gathered from radio signals generated at the birth of the Universe.
Humankind has been trying to peek into the depths of space for hundreds of years, starting with the first working telescopes, created by three Dutch spectacle-makers around 1608. Fast forward to the early 1990s, and the Netherlands is once again at the center of pushing the limits of space exploration.
In 1994 an international group conceived what would become a huge undertaking called the Square Kilometre Array (SKA). The SKA, to be funded by 20 countries at a cost of roughly EUR 1.5 billion, is a radio telescope designed to read faint signals left over from the Big Bang. It will comprise thousands of dishes and antenna arrays spanning 3,000 miles (5,000 km), roughly the distance between New York and Los Angeles.
All of these faint radio signals add up to an enormous amount of data that needs to be shared with astronomers around the world. In fact, the SKA will generate enough raw data to fill 15 million 64 GB iPods every day.
"This is big data analytics to the extreme," said IBM Research scientist Ton Engbersen. "If you take the current global daily Internet traffic and double it *, you are in the range of the data set that the SKA will be collecting every day."
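As a rough sanity check, the two comparisons above can be worked out with a few lines of arithmetic. This is only a sketch using the figures quoted in this article; the monthly Internet-traffic number comes from the footnote, and a 30-day month is assumed:

```python
# Back-of-the-envelope check of the SKA daily data volume,
# using only figures quoted in the article.

ipods = 15_000_000                 # 64 GB iPods filled per day
ipod_capacity_gb = 64

ska_daily_pb = ipods * ipod_capacity_gb / 1_000_000    # GB -> PB
print(f"SKA raw data: ~{ska_daily_pb:,.0f} PB per day")  # ~960 PB, about 1 exabyte

internet_monthly_pb = 20_634       # global Internet traffic, 2011 (see footnote)
internet_daily_pb = internet_monthly_pb / 30           # assume a 30-day month
print(f"Daily Internet traffic, doubled: ~{2 * internet_daily_pb:,.0f} PB")  # ~1,376 PB
```

The two figures land in the same ballpark, which is the point of Engbersen's comparison.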
But if the SKA were completed today, the computer systems needed to analyze these massive volumes of data simply would not exist. They would have to operate at exascale: 1 quintillion (1 million trillion) floating point operations per second, a thousand times faster than today's most powerful supercomputers.
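The scale behind the term "exascale" can be made concrete with simple arithmetic. In this sketch, a petascale baseline stands in for the article's "today's most powerful supercomputers":

```python
# What "exascale" means in plain numbers.
exa_flops = 10**18        # 1 quintillion operations per second
peta_flops = 10**15       # petascale: the rough ceiling of 2012-era supercomputers

assert exa_flops == 1_000_000 * 10**12   # "1 million trillion"
print(exa_flops // peta_flops)           # -> 1000: a thousand times faster
```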
To solve this unprecedented challenge, IBM scientists in the Netherlands and Switzerland and ASTRON, the Netherlands Institute for Radio Astronomy, have launched an initial five-year collaboration called DOME, named for the protective cover on telescopes and the famous Swiss mountain.
DOME will investigate emerging exascale technologies, including the data transport and storage processes and the streaming analytics that will be required to read, store and analyze all the raw data collected each day.
Only by basing the overall design on architectures that are beyond the current state-of-the-art will it be possible to handle the vast amounts of data produced by the millions of antenna systems of the SKA.
Specifically, scientists at ASTRON and IBM will investigate advanced accelerators and 3D stacked chips for more energy-efficient computing. They will also research novel optical interconnect technologies and nanophotonics to optimize large data transfers, as well as high-performance storage systems based on next-generation tape systems and novel phase-change memory technologies.
When the SKA is completed in 2024, the Big Bang may have finally met its match.
* Based on global Internet traffic of 20,634 petabytes per month (2011).