Cosmology@Home
Cosmology@Home is a BOINC distributed computing project, run at the Departments of Astronomy and Physics at the University of Illinois at Urbana-Champaign.
Goals
The goal of Cosmology@Home is to compare theoretical models of the Universe to the data measured to date and to search for the model that best matches it. Other goals may include:
- Results from Cosmology@Home can help design future cosmological observations and experiments.
- Results from Cosmology@Home can help prepare for the analysis of future data sets, e.g. from the Planck spacecraft.
Science
The goal of Cosmology@Home is to search for the model that best describes our Universe and to find the range of models that agree with the available astronomical and particle physics data. The models generated by Cosmology@Home can be compared to measurements of the Universe's expansion speed from the Hubble Space Telescope, as well as to fluctuations in the cosmic microwave background radiation as measured by the Wilkinson Microwave Anisotropy Probe (WMAP).
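In practice, this kind of comparison comes down to a goodness-of-fit calculation between each model's predicted observables and the measured values. The sketch below illustrates the idea with a simple chi-square statistic in Python; the observable names, numbers, and the two example models are hypothetical placeholders, not the project's actual data or analysis pipeline.

```python
# Illustrative chi-square comparison of model predictions to measurements.
# The observable names and numbers below are placeholders, not real data.

measurements = {
    "H0": (70.0, 2.0),            # Hubble constant estimate and 1-sigma error (km/s/Mpc)
    "cmb_peak_ell": (220.0, 5.0)  # position of the first CMB acoustic peak and error
}

def chi_square(predicted: dict, measured: dict) -> float:
    """Sum of squared, error-weighted residuals over all observables."""
    total = 0.0
    for name, (value, sigma) in measured.items():
        total += ((predicted[name] - value) / sigma) ** 2
    return total

# Two hypothetical model Universes with their predicted observables.
model_a = {"H0": 71.2, "cmb_peak_ell": 221.0}
model_b = {"H0": 63.5, "cmb_peak_ell": 240.0}

# The model with the smaller chi-square agrees better with the data.
best = min((model_a, model_b), key=lambda m: chi_square(m, measurements))
print(best)
```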
Method
Cosmology@Home uses machine learning in an innovative way to effectively parallelize a large computational task, one that involves many inherently sequential calculations, over an extremely large number of distributed computers. For any given class of theoretically possible models of the Universe, Cosmology@Home generates tens of thousands of example Universes and packages the cosmological parameters describing these Universes as work units. Each work unit represents a single Universe. When a participating computer requests a work unit, it simulates that Universe from the Big Bang until today. At the time of writing, each such simulation takes an average of 4 hours on a current CPU. The result of the simulation is a list of observable properties of that Universe.
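As a rough illustration of the work-unit generation step, the Python sketch below draws random cosmological parameter sets from a class of models and bundles each one as a work unit. The parameter names, ranges, and the WorkUnit structure are assumptions made for illustration and do not reflect the project's actual BOINC server code.

```python
import random
from dataclasses import dataclass

# Hypothetical parameter ranges defining one class of models (illustrative values only).
PARAMETER_RANGES = {
    "omega_matter": (0.1, 0.5),    # matter density
    "omega_lambda": (0.5, 0.9),    # dark-energy density
    "hubble":       (50.0, 90.0),  # Hubble constant in km/s/Mpc
    "n_s":          (0.9, 1.1),    # scalar spectral index
}

@dataclass
class WorkUnit:
    """One example Universe: a single set of cosmological parameters to simulate."""
    unit_id: int
    parameters: dict

def generate_work_units(n_universes: int, seed: int = 0) -> list:
    """Draw n_universes random parameter sets from the allowed ranges."""
    rng = random.Random(seed)
    units = []
    for i in range(n_universes):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAMETER_RANGES.items()}
        units.append(WorkUnit(unit_id=i, parameters=params))
    return units

# Tens of thousands of example Universes, each becoming one multi-hour client simulation.
work_units = generate_work_units(20_000)
print(work_units[0])
```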
This result is then sent back and archived on the Cosmology@Home server. When a sufficient number of example Universes have been simulated, a machine learning algorithm called Pico, developed by the project scientists of Cosmology@Home for this purpose, learns from these example calculations how to reproduce the simulation results for any Universe similar to the examples. The difference is that Pico takes a few milliseconds per calculation rather than several hours. Training Pico on 20,000 examples takes about 30 minutes. Once Pico is trained, it can run a full comparison of the class of models (which involves hundreds of thousands of model calculations) against the observational data in a few hours on a standard CPU.
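To make Pico's role concrete, the sketch below trains a simple polynomial regression emulator on (parameters, observables) pairs and then predicts observables for new parameters almost instantly. This is only a schematic stand-in, assuming Pico behaves like a regression-based emulator; the synthetic data, quadratic features, and least-squares fit are illustrative and are not the actual Pico algorithm.

```python
import numpy as np

# Training data: each row of X is one example Universe's parameters (as returned by
# work units); each row of Y is the corresponding simulated observables.
# Both are synthetic placeholders standing in for the archived simulation results.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20_000, 4))               # 20,000 examples, 4 parameters
Y = np.column_stack([X.sum(axis=1), X[:, 0] * X[:, 1]])   # 2 synthetic "observables"

def design_matrix(X: np.ndarray) -> np.ndarray:
    """Quadratic polynomial features: 1, x_i, and all products x_i * x_j."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

# "Training": one least-squares fit per observable over all archived examples.
coeffs, *_ = np.linalg.lstsq(design_matrix(X), Y, rcond=None)

def emulate(params: np.ndarray) -> np.ndarray:
    """Predict observables for new parameters in fractions of a second instead of hours."""
    return design_matrix(np.atleast_2d(params)) @ coeffs

print(emulate(np.array([0.3, 0.7, 0.5, 0.96])))
```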
Milestones
- 2007-06-30 Project launches for closed alpha testing (invitation only).
- 2007-08-23 Project opens registration for public alpha testing.
- 2007-11-05 Project enters beta testing stage.