Computational science research has great breadth at the University of Zurich: there are researchers in fundamental computer science, finance, linguistics, ecology, bioinformatics, neuroinformatics, statistics, applied mathematics, medicine, chemistry, particle physics and astrophysics.

The Institute for Computational Science was founded at the beginning of 2014. Its mission is to pull together these disparate areas of the University. Currently, the Institute is the University's home for theoretical and computational astrophysics: nearly all of the founding faculty (Lake, Mayer, Moore and Teyssier) were in this research area, and the Institute will broaden from this core. Already, there is one SNF Professor (Diemand) working on molecular dynamics. Prof. Abgrall of Mathematics is the first professor affiliated with the Institute. In the summer of 2014, we will welcome a new SNF Professor in computational cosmology (Yoo).

Astrophysics has a rich history in computing. Newton declared it an impossible task for humans:

To consider simultaneously all these causes of motion and to define these motions by exact laws allowing of convenient calculation exceeds, unless I am mistaken, the forces of the entire human intellect.

During the Second World War, Prof. Erik Holmberg (Lund University) was unable to travel south to use astronomical observatories, so he built a special-purpose computer to study the interactions of galaxies.

Like the force of gravity, the intensity of light decays as the inverse of the distance squared. So, Holmberg used light bulbs to simulate gravitating masses.

He placed 74 light bulbs on lined graph paper, with 37 in each of two “disk” galaxies. Each light bulb was taken out in turn and replaced with a photocell to measure “the force” at that position. Velocities and positions were then updated using these forces. The calculation was “parallel with a serial bottleneck”: all the bulbs shine at once, but each force measurement had to be made one bulb at a time. Holmberg discovered some critical effects that weren’t rediscovered until digital computers were able to run comparable simulations in the 1970s.
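Holmberg’s procedure maps directly onto a modern direct-summation N-body integration. The sketch below is an illustrative reconstruction, not his actual setup: it uses arbitrary code units (G = 1, equal unit masses), a simple explicit Euler update, and a small softening length to tame close encounters, with two 37-particle “disks” standing in for his two sets of bulbs.

```python
import numpy as np

def gravity_step(pos, vel, mass, dt, eps=0.05):
    """One explicit Euler step of direct-summation gravity.

    Like Holmberg's photocell measurement, the acceleration on each
    particle is the summed inverse-square pull of all the others
    (G = 1 in code units; eps softens close encounters).
    """
    dx = pos[None, :, :] - pos[:, None, :]      # displacement from i to j
    r2 = (dx ** 2).sum(axis=-1) + eps ** 2      # softened squared distance
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)               # no self-force
    acc = (dx * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel

# Two 37-"bulb" disks, loosely mimicking Holmberg's arrangement
# (positions and extent here are invented for illustration).
rng = np.random.default_rng(0)
n = 37
disk = lambda cx: np.c_[rng.normal(cx, 0.5, n), rng.normal(0.0, 0.5, n)]
pos = np.vstack([disk(-2.0), disk(2.0)])
vel = np.zeros_like(pos)
mass = np.ones(2 * n)

for _ in range(100):
    pos, vel = gravity_step(pos, vel, mass, dt=0.01)
```

Because the pairwise forces are antisymmetric, total momentum is conserved to round-off, which is a useful sanity check on any such integrator; a production code would also replace the Euler update with a symplectic scheme such as leapfrog.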

Astrophysics has also long been at the forefront of “Big Data”. In the late nineteenth century, George Ellery Hale defined the “astrophysical observatory” as one having a darkroom: no longer would visual description suffice. This led to vast basement “plate vaults” of astrophysical data. A digital revolution began around 1980, when the first CCD cameras were built for astronomy. Early CCDs were 100x more sensitive than film, but had small fields of view, as they were a mere 128x128 or 256x256 pixels. To cover wider areas, plate-scanning machines were built to provide digital data over large fractions of the sky. This led to massive surveys and archives such as the Sloan Digital Sky Survey and the Hubble Space Telescope archive. The next-generation LSST will generate 30 TB of data each night and resurvey the entire sky every 3 days.