[ECAR Summer 2008] Rudolf Dimper – Cyberinfrastructure at the European Synchrotron Radiation Facility and Its Impact on Science


Rudolf is head of computing services at ESRF.

Observations today are done with sophisticated instruments – including synchrotrons. A synchrotron is a super microscope for examining condensed matter; operates from the UV to the hard X-ray spectrum. Synchrotron light is used because it has remarkable properties: brilliance (1,000 billion times brighter than a hospital X-ray tube).

The European light source is a 6 GeV source in Grenoble. Concentrates 10,000 researchers and engineers. Cooperation between 18 countries. Annual budget of ~80 m€.

Some applications – studying the structure of spider silk; medical applications, including angiography that gives better results than conventional hospital techniques; geophysics, studying samples which undergo extreme changes; chemistry, how catalytic processes function; semiconductors; paleontology, high-resolution microtomography of fossils.

52 Beamlines, 6222 user visits in 2007. 15,308 eight hour shifts scheduled for experiments in 2007. > 1500 peer reviewed publications/year.

Over the last 10 years the data volume has increased by a factor of ~300. In 2007: 300 TB, ~1×10^8 files.
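A quick back-of-the-envelope check (my own arithmetic, not from the talk) of what a 300× increase over 10 years means as an annual growth rate:

```python
# A 300x increase over 10 years implies this compound annual growth rate.
growth_factor = 300 ** (1 / 10)          # per-year multiplier, ~1.77
annual_pct = (growth_factor - 1) * 100   # ~77% data growth per year

print(f"~{annual_pct:.0f}% data growth per year")
```

In other words, the data volume has been nearly doubling every year.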

Storage policy is to keep only 6 months of data, even for internal users. This is under heavy discussion.

The French network infrastructure has not increased bandwidth in three years. That's a problem.

Network based on Extreme Networks switches, storage on NAS systems, StorageTek tape for offline. Commodity clusters in the data center, ~400 CPUs (totally insufficient).

ESRF Upgrade Programme – 290 m€ programme.

Single most productive facility producing protein structures for the Protein Databank.

New methods: nanobeams & raster scans. Looking to increase resolution by two orders of magnitude. Petabytes of data – how to carry away the data. Easy to imagine a PB per day in 10 years, contrasted to 15 PB per year for the LHC.
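A rough sketch (my arithmetic, not from the talk) of the sustained bandwidth that carrying away 1 PB per day would require:

```python
# Sustained network bandwidth needed to move 1 PB (decimal) per day off-site.
PB = 1e15                # bytes
seconds_per_day = 86_400

bytes_per_s = PB / seconds_per_day   # ~11.6 GB/s sustained
gbps = bytes_per_s * 8 / 1e9         # ~93 Gbps sustained

print(f"~{gbps:.0f} Gbps")
```

That lands almost exactly at the 100 Gbps the ESRF says it is looking for – and leaves no headroom for bursts or retransmission.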

Two fundamental problems:

Latency – diminish the time to measure, store, and analyze data.

Add functionality – new ways to measure, store, and analyze data. “Have to get our hands dirty with grid tools”

Looking desperately for 100 Gbps network.

I/O bottlenecks in research clusters are a big issue.

ESFRI position paper on digital repositories – lots of storage and access policies.
