CNI Fall 2015 Day 1

I’m at the fall meeting for the Coalition for Networked Information. For those who don’t know, CNI is a joint initiative of Educause and the Association of Research Libraries and was founded in 1990 to promote the use of digital information technology to advance scholarship and education. I was involved in the early days of CNI and I’m happy to have recently been appointed as a representative of Educause on the CNI Steering Committee.

Cliff Lynch is CNI’s Executive Director, and one of the highlights of the member meetings is his plenary address, where he generally surveys the landscape of digital information and pulls together interesting, intriguing, and sometimes troubling themes that he thinks are worth watching and working on.

In today’s plenary Cliff talked about the evolving landscape of federal mandates for public access to federally funded research results. It is only in 2016 that we will see actual implementation of the plans the various federal agencies put forward in response to the directive the Office of Science and Technology Policy issued in 2013. Cliff noted that the implementations across the multiple federal funding agencies are not coordinated, and that some of them are not in sync with existing practices at institutions, so there will be a lot of confusion.

Cliff also had some very interesting observations on the current set of issues surrounding security and privacy. He cited the recent IETF work on pervasive surveillance threat models, noting that if you can watch enough aggregate traffic patterns going to and from network locations you can infer a lot, even if you can’t see into the contents of encrypted traffic. And with the possible emergence of quantum computing that may be able to break current encryption technologies, security and privacy become much more difficult. Looking at the recent string of data breaches at Sony, the Office of Personnel Management, and several universities, you have to start asking whether we are capable of keeping things secure over time.

He then moved on to privacy issues, noting that all sorts of data are being collected on people’s activities in ways that can be creepy – e-texts that tattle on you, e-companions for children or the elderly that broadcast information. CNI held a workshop on this topic in the spring, and the general consensus was that people should be able to have a reasonable expectation of privacy in their online activities, and they should be informed about uses of their data. It’s generally clear that we’re doing a horrible job at this. NISO just issued work distilling some principles. On our campuses people have different impressions of what’s happening in authorization handoffs between institutions and publishers – it’s confused enough that CNI will be fostering some work to gather facts about this.

The greatest area of innovation right now that Cliff sees is where technology gets combined with other things (the internet of things) – like drones, autonomous vehicles, machine learning, robotics, etc. But there isn’t a lot of direct technical IT innovation happening, and what we’re seeing is a degree of planned obsolescence where we’re forced to spend lots of time and effort to upgrade software or hardware in ways that don’t get us any increased functionality or productivity. If that continues to be the case we’ll need to figure out how to “slow down the hamster wheel.”

Finally Cliff closed by talking about the complexity of preservation in a world where information is presented in ways increasingly tailored to the individual. How do we document the evolution of experiences that are mediated by changing algorithms? And this is not just a preservation problem but an accountability issue, given the pervasive use of personalized algorithms in important functions like credit ratings.

[CSG Winter 08] Minimizing Use, Misuse, and Risk of Inadvertent Disclosure of SSN and Other Sensitive Data at Institutions of Higher Education

The last morning of CSG kicks off with a policy discussion on minimizing use of SSN and other sensitive data.

Steve Shuster, Cornell

Started data security policy work two years ago. Cornell has had a long-standing data stewardship program on campus, aligned to Vice President offices. There were gaps – VPs don’t think about security as rules change, and policies and practices haven’t always been consistent. Started a Data Incident Response Team (DIRT) – VP of IT, Policy Office, Audit, Counsel, etc. – that determines the need to notify, how much analysis is enough, and so on. They were taking about one incident per month involving sensitive data to that group.

Stepped back to think about data exposure – three categories: public, restricted, confidential. “Restricted” is the default, which allows the stewards to just worry about the extremes. Defined specific security requirements for each of the three classifications. There’s an IT security council – the lead security person from each of the units – that meets monthly. Established a strong exception process – the first thing you hear when talking about requirements is why people can’t conform. Have a mechanism to update requirements continuously.
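
As a rough sketch, a three-tier scheme like this might be modeled as below – the names and example requirements are my own illustrative assumptions, not Cornell’s actual definitions:

```java
// Illustrative three-tier data classification, with "restricted" as the
// default so stewards only have to rule on the extremes.
public enum DataClassification {
    PUBLIC("no special handling required"),
    RESTRICTED("campus-internal handling; the default classification"),
    CONFIDENTIAL("encryption, access logging, incident reporting required");

    private final String securityRequirements; // example requirements only

    DataClassification(String securityRequirements) {
        this.securityRequirements = securityRequirements;
    }

    public String securityRequirements() { return securityRequirements; }

    /** Anything a steward hasn't explicitly ruled on falls to RESTRICTED. */
    public static DataClassification fromStewardRuling(String ruling) {
        if (ruling == null) return RESTRICTED;
        switch (ruling.toLowerCase()) {
            case "public":       return PUBLIC;
            case "confidential": return CONFIDENTIAL;
            default:             return RESTRICTED;
        }
    }
}
```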

The policy has highlighted some things: they’re missing some data stewards, and some pieces of data cut across the data stewards, e.g. SSN. Looking at having a PII Officer who would be responsible for that kind of data. Finding the data is hard. Created the Cornell Spider application, which can crawl a computer to look for confidential data. 50-60% of computers on campus have some confidential data on them.
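
I haven’t seen Spider’s internals, but a minimal sketch of that kind of scan could look like the following – the class name, the simple SSN regexp, and the skip-on-error behavior are all assumptions for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.regex.Pattern;

// Walks a directory tree and flags files containing SSN-shaped strings.
public class ConfidentialDataCrawler {
    // Simple SSN-shaped pattern; a real scanner would add validity checks
    // (e.g., ruling out impossible area numbers) to cut false positives.
    private static final Pattern SSN = Pattern.compile("\\b\\d{3}-\\d{2}-\\d{4}\\b");

    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        try (var paths = Files.walk(root)) {
            paths.filter(Files::isRegularFile).forEach(p -> {
                try {
                    if (SSN.matcher(Files.readString(p)).find()) {
                        System.out.println("possible SSN in: " + p);
                    }
                } catch (IOException e) {
                    // Skip unreadable or non-text (binary) files.
                }
            });
        }
    }
}
```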

Randy Marchany, Virginia Tech

Their needs: stay out of the press; stay out of the courts; preserve data integrity; respect the privacy of students and employees.

privacyrights.org has a good chronology of data breaches.

Steps for managing sensitive data:

#1 – Do what you can do when you can do it.

Pre-2003:

Building blocks – a one-page acceptable use policy; data classification

Tools – SSL

Education – awareness seminars

Compliance – HR disciplinary action

#2 – Create a framework for doing it – an IT Security Task Force, with lots of committees across the entire scope of the central IT division.

#4 (what happened to 3?) – Don’t think you’re done.

Tools – built their own (including use of Cornell’s Spider), encryption

Education – awareness sessions, faculty institute

Compliance – IT security reviews of departments, Audit

A complete solution is not needed to get something done.

Everyone has a role.

Pulling all the pieces together, and making sure it works, is the challenge.

Cam Beasley, Texas

Formed compliance group with admin units in 2002.

Had a significant SSN incident in 2003, so got really serious.

In 2006 they had another incident – it turned out that they hadn’t involved very many academic representatives in their work.

Since then they have implemented formal policies on how systems are to be managed and how apps are to be developed. The two major incidents both involved insecure apps.

Have developed data stewardship program.

By 2005-2006 they had shut off administrative flows of sensitive info, but still had problems out in the units. Developed a point ’n’ click sensitive number finder – built in Java, using bit-mask pattern matching (faster than regexps). Applied it on clients and also to open shares over SMB or NFS. Also worked with Sourcefire (their IDS vendor) to build the algorithm in as a preprocessor (it also works with Snort).
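
For the curious, here’s a minimal sketch of what bit-mask matching of SSN-shaped strings (ddd-dd-dddd) can look like – my reconstruction of the general technique, not UT Austin’s actual code. A 256-entry table maps each byte to character-class bits, and a candidate matches when every position’s class bits intersect the pattern’s mask, which avoids regexp-engine overhead:

```java
// Illustrative bit-mask scanner for SSN-shaped byte sequences (ddd-dd-dddd).
// Not the actual UT Austin tool; names and structure are invented.
public final class SsnMaskScanner {
    private static final int DIGIT = 1;      // character class: '0'-'9'
    private static final int DASH  = 1 << 1; // character class: '-'

    // 256-entry lookup: maps each byte value to its class bits.
    private static final int[] CLASS = new int[256];
    static {
        for (char c = '0'; c <= '9'; c++) CLASS[c] = DIGIT;
        CLASS['-'] = DASH;
    }

    // The pattern ddd-dd-dddd as per-position class masks.
    private static final int[] PATTERN = {
        DIGIT, DIGIT, DIGIT, DASH, DIGIT, DIGIT, DASH, DIGIT, DIGIT, DIGIT, DIGIT
    };

    /** Returns the offset of the first SSN-shaped substring, or -1. */
    public static int find(byte[] buf) {
        outer:
        for (int i = 0; i + PATTERN.length <= buf.length; i++) {
            for (int j = 0; j < PATTERN.length; j++) {
                // Match if the byte's class bits intersect this position's mask.
                if ((CLASS[buf[i + j] & 0xFF] & PATTERN[j]) == 0) continue outer;
            }
            return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(find("id=123-45-6789 ok".getBytes())); // prints 3
    }
}
```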

Developed another tool (issora?) – a federated risk assessment tool. Applied the data classification to it and were able to classify data on almost 48k machines. Now they have faculty members who speak the same language (e.g., know what category 1 data means).

Klara Jelinkova, Duke

A lot of what they’re trying to do is about balance – they divided the problem: Duke Medicine security handles HIPAA data and policy, while the University handles FERPA and DMCA. That’s been very effective. As the two groups move closer together (joint ID management, networking, etc.), there’s more need for a higher-level policy group, which they’re exploring. As a technologist she’s been skeptical of policy and whether it works.

Longstanding policy – a unique ID should be substituted for SSN. Tallman Trask (executive VP) sent a letter to all the deans – storage of SSN requires his approval. Had a breach in a departmental web server – found out it had an app for brochure requests that asked for SSN to do later correlation. Who has the responsibility when policies aren’t followed? Is it the CIO?

Lots of discussion – one question that came up: what’s the sensitivity of passport numbers? It wasn’t in any of the policies.