[CSG Spring 2008] Cyberinfrastructure Workshop – CI at UC San Diego


Cyberinfrastructure at UC San Diego

Elazar Harel

Unique assets at UCSD: Calit2; SDSC; Scripps Institution of Oceanography.

They have a sustainable funding model for the network, which allows them to invest in cyberinfrastructure without begging or borrowing from other sources.

Implemented ubiquitous Shibboleth and OpenID presence.
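
For flavor, here is a minimal sketch of what "ubiquitous Shibboleth" looks like from an application's point of view: the Shibboleth service provider (e.g. mod_shib under Apache) authenticates the user before the request reaches the app, so the app only reads attributes and never handles credentials. The attribute names below are common defaults, not anything specific to UCSD's deployment.

```python
def app(environ, start_response):
    # The Shibboleth SP runs in front of the application; by the time a
    # request arrives here the user is already authenticated, and the SP
    # has injected attributes into the request environment. The exact
    # attribute names vary by deployment -- these are common defaults.
    user = environ.get("REMOTE_USER", "")         # often the eppn
    affiliation = environ.get("affiliation", "")  # e.g. staff@ucsd.edu
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [f"Hello {user} ({affiliation})\n".encode()]
```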

Formed a CI design team – joint workgroup.

New CI Network designed to provide 10 gig or multiples directly to labs. First pilot is in genomics. Rapid deployment of ad-hoc connections. Bottleneck-free 10 gig channels. Working to have reasonable security controls and be as green as possible.

Just bought two Sun Blackboxes – being installed tomorrow. Will be used by labs.

Chaitan Baru – SDSC

Some VO projects: BIRN (www.birn.net), the NIH Biomedical Informatics Research Network, shares neuroscience imaging data; NEES (www.nees.org), the Network for Earthquake Engineering Simulation; GEON (www.geongrid.org), the Geosciences Network; TEAM (www.teamnetwork.org), field ecology data; GLEON (www.gleon.org), global lakes; TDAR (www.tdar.org), the digital archaeological record; MOCA (moca.anthropgeny.org), comparative anthropogeny.

Cyberinfrastructure at the speed of research – research moves very fast, and researchers think Google is the best tool they’ve ever used. In some cases, “do what it takes” to keep up: take shortcuts; leverage infrastructure from other CI projects and off-the-shelf products. That is difficult because it can be stressful for developers who take pride in creating their own, and engineers may think the PI is changing course too many times. In other cases, “don’t get too far ahead” of the users – sometimes we build too much technology, and the user community sees no apparent benefit in the infrastructure being developed.

The sociology of the research community influences how you think about data.

Portal-based science environments. Support for resource sharing and collaboration. Lots of commonalities, including identity and access issues. Many of them use the same technologies (e.g. GEON and others). Ways of accessing data and instruments. Lots of interest from scientists in doing server-side processing of data rather than just sharing whole data sets over FTP – e.g. LiDAR on the GEON portal; the OpenTopography model is an attempt to generalize that (a sketch of the idea follows below). The EarthScope data portal is another example – it includes SDSC, IRIS, UNAVCO (Boulder), and ICDP (Potsdam).
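
To make the server-side-processing idea concrete, here is a minimal sketch of what a portal request might look like from a script. The endpoint, parameter names, and response format are all hypothetical – real portals such as GEON/OpenTopography define their own APIs – but the point is that the client sends a bounding box and processing parameters and gets back a small derived product (a gridded DEM) instead of FTPing the full point cloud.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical portal endpoint -- not a real OpenTopography/GEON URL.
PORTAL_URL = "https://portal.example.org/lidar/process"

def request_dem(min_lon, min_lat, max_lon, max_lat, resolution_m=10.0):
    """Ask the portal to grid the LiDAR points inside a bounding box
    into a DEM server-side, and return the URL of the small result."""
    params = {
        "minx": min_lon, "miny": min_lat,
        "maxx": max_lon, "maxy": max_lat,
        "gridres": resolution_m,   # output cell size in meters
        "product": "dem",          # derived product, not raw points
        "format": "geotiff",
    }
    url = PORTAL_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        job = json.load(resp)      # assume the portal answers in JSON
    return job["result_url"]       # link to the derived DEM only

# Usage: fetch a ~10 m DEM for a small area instead of the whole survey.
# print(request_dem(-117.25, 32.85, -117.20, 32.90))
```

The data that moves over the network is then the size of the derived DEM, not the raw survey.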

Cyberdashboards – live status of information as it’s being collected. Notification of events is also desirable.

Cyberdashboard for emergency response – collecting all 911 calls in California. Data mining of spatiotemporal data. Analysis of calls during the San Diego wildfires of October 2007. Wildfire evacuations – visualization of data from the Red Cross DisasterSafe database. (A toy version of the event-detection idea is sketched below.)
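
As a toy illustration of the kind of spatiotemporal mining such a dashboard might do (the record format, grid size, and threshold are all invented for the example), the sketch below bins geocoded 911 calls into grid cells and time windows and flags any cell whose call volume spikes – the sort of condition that could drive a dashboard notification.

```python
from collections import Counter
from datetime import datetime

# Invented record format: (timestamp, latitude, longitude).
CELL_DEG = 0.05        # ~5 km grid cells at San Diego's latitude
WINDOW_MIN = 15        # time window (minutes) for counting calls
SPIKE_THRESHOLD = 50   # calls per cell per window that trigger an alert

def bin_key(ts: datetime, lat: float, lon: float):
    """Map a call to a (time window, grid cell) bucket."""
    window = ts.replace(minute=ts.minute - ts.minute % WINDOW_MIN,
                        second=0, microsecond=0)
    cell = (round(lat / CELL_DEG), round(lon / CELL_DEG))
    return (window, cell)

def find_spikes(calls):
    """Count calls per (window, cell) and return the buckets that
    exceed the threshold -- candidates for a dashboard notification."""
    counts = Counter(bin_key(ts, lat, lon) for ts, lat, lon in calls)
    return [(key, n) for key, n in counts.items() if n >= SPIKE_THRESHOLD]

# Usage: feed in a stream of calls and notify on each returned bucket.
# for (window, cell), n in find_spikes(call_stream):
#     print(f"{n} calls near cell {cell} starting {window}")
```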

Cyberinfrastructure for Visualization

On-demand access to data – short lead times from request to readiness to rendering and display.

On-demand access to computing – online modeling, analysis and visualization tools

Online collaboration environments – software architecture, facility architecture.

SDSC/Calit2 synthesis center – conceived as a collaboration space to do science together. Brings together high-performance computing, large-scale data storage, in-person collaboration, and consultation. Has big HD screens, a stereoscopic screen, videoconferencing, etc. Used for workshops, classes, meetings, and site visits. Needs tech staff to run it, and research staff to help with visualization, integration, and data mining. So far it has been on project-based funding; lately there’s been a recharge fee.

Calit2 stereo wall (C-Wall) – dual HD resolution (1920 x 2048 pixels) with JVC HD2k projectors.

Calit2 digital cinema theater – 200 seats, 8.2 sound, Sony SRX-R110 projector, SGI Prism with 21 TB of storage, 10 GigE to the computers.

The StarCAVE – 30 JVC HD2k (1920 x 1080) projectors.

225-megapixel HIPerSpace tiled display.

In response to a question from Terry Gray, Chaitan notes that the pendulum is swinging a bit in that PIs still want to own their own clusters, but they no longer want to run them – they want them housed and administered in data centers. Elazar notes that they’re trying to make the hardware immaterial – a few years from now they may all be in the cloud, but the service component to help researchers get what they need will remain on campus.
