CSG Spring 2015: Internet of Things

Opportunities: Better services; find new cost efficiencies (e.g. trash cans that let people know when they need emptying); Improved sustainability; Safety

Challenges: Network; Security; Privacy; Support

Networks: The Apple Watch doesn’t support WPA2 and is not a good 802.1X supplicant. Inexpensive data-acquisition devices – a $5 WiFi module. They’ll all connect to our network address space, which may be a driver to get serious about IPv6. BYOD is the stepping stone to the Internet of Things. Devices talking to each other – “Your refrigerator is talking to my car!” Do we need directly addressable IP addresses, or can we keep NATing forever? UCSD is starting to roll out carrier-grade NAT on its wireless, with addresses “sticky” for a few hours. Will we need an “eduThing” like we have eduPerson? Many of the things won’t use WiFi because the power consumption is too high, and there are other emerging (conflicting) protocols.

Students will be doing data acquisition and will want to do data analysis – we should be providing tools for managing and analyzing data.

90% of the world’s data has been created in the past two years. The concept of digital exhaust: you have to analyze data as it flows, looking for trends and patterns, rather than saving it all.
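To make the “analyze as it flows” idea concrete, here is a minimal sketch (Python, all numbers illustrative) of streaming analysis: keep only a sliding window of per-interval counts and flag unusual spikes, rather than saving the raw events.

```python
from collections import deque
from statistics import mean, stdev

class StreamTrendDetector:
    """Analyze an event stream as it flows, keeping only windowed counts."""

    def __init__(self, window=60):
        self.counts = deque(maxlen=window)  # per-interval counts, not raw events

    def observe(self, interval_count):
        """Record one interval's count; return True if it spikes above baseline."""
        alert = False
        if len(self.counts) >= 10:  # need some history before judging
            mu, sigma = mean(self.counts), stdev(self.counts)
            alert = sigma > 0 and interval_count > mu + 3 * sigma
        self.counts.append(interval_count)
        return alert

detector = StreamTrendDetector()
for n in [12, 14, 11, 13, 12, 15, 13, 12, 14, 13, 90]:
    if detector.observe(n):
        print("spike detected:", n)  # fires on 90
```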

What can we do with this data? Could see, for instance, when everyone is fleeing a building. If we’re collecting sensor data and correlating it to other data, do we need to involve the IRB?

Who is the data custodian of the trash can data? How do we think about data governance for this kind of data? It’s not about the source of the data, but the attributes. There are regulatory and compliance concerns. Merging of data changes the concerns.

CSG Spring 2015: The Future IT Organization / Talent Management

What kind of people do we need?

  • Hire for speed (the best people you can find, then figure out where to fit them in).
  • More business and entrepreneurial skills – services moving more towards products.
  • Technical curiosity.
  • More willing to engage with the business partners.
  • Wisconsin had a hard time finding Peoplesoft developers so they set up a Peoplesoft Academy and selected a dozen people to train and then hired six of them.
  • Google used to talk about people who are like “stem cells” – can adapt to different environments.
  • Look for resiliency.

The next generation: willing to work hard, but less patient with delayed gratification – they want life balance from the start. Very social, and not tied to their employer: they watched their parents go through the Great Recession, so they don’t trust employers. They learn fast and think they should be highly empowered from the start. This is fueling a crowd-sourcing model, with lots of shifting around.

What do we have to change in how we recruit and employ?

  • Want to work from home, later in the day.
  • Want opportunities to explore – spend some time on things that might not be relevant yet.
  • Create an environment where those types of employees can be successful.
  • University HR paradigms might not work for new IT employees.
  • Gartner’s work on bimodal IT is worth following – we have business applications that have to be solid and reliable, and new activities that can be innovative and constantly changing. There are employees of all ages that prefer to work in both modes and we need to find ways to accommodate them.
  • Millennials like to get groups together and fix things – they have low tolerance for things that are broken and take a long time to fix.

How do we recruit the best IT staff for the future?

  • Compete with mission and brand.
  • Can build and tear down stuff in the cloud without all the brittle process around it.
  • Looking for systems thinkers.
  • Instituting a formal internship program for undergrads and grads – good pipeline. If you hire a student when they graduate, even if they stay for just a couple of years you get great work.
  • Higher Ed IT is failure-averse. Best practice now is to fail early and fail cheap – put together an internship program and if it doesn’t work, shut it down. Penn State has a course on intelligent fast failure.
  • Cultural fit is important.

Do we have a talent management process?

  • Small things like recognition from the CIO with small spot gift cards can help.
  • Acknowledgement of colleagues from peers.
  • Be visible in the technical communities showing the quality of work we do, making it attractive.
  • In some kinds of jobs people are frustrated by the lack of career advancement opportunities.

How do we transition current employees to newer modes of work?

  • Employees are looking for more agile leadership.
  • Look at identifying individuals and giving them opportunities to move elsewhere on a temporary assignment to get a specific job done. It needs to be a project with urgency to make it concrete. They then bring a different perspective back to their line organizations.
  • It’s like a college basketball team – the good ones come in and out quickly so we just need to keep the pipeline flowing.

How do we encourage diversity?

  • Georgetown has around 400 regular participants in a Women Who Code effort, mostly not from computer science.
  • Plug students in at more strategic levels – not just answering the help desk phone.
  • Campuses can connect with community coding groups.
  • Google got its idea for flex time and research from academia – we are the source of innovation and we need to reclaim that.
  • Very little of the new cool stuff from our research and education programs filters into our organizations. How do we short-circuit that?

CSG Spring 2015 – Technology Business Management Discussion

Cost of Services at UVa – Virginia Evans

The homegrown beginnings of TBM at UVa: today approximately 35% of the central IT budget is charged back to units, whether as fee-for-service or a head-count fee. Next year it will be fully allocated out under the new budget model.

Cost of Services Initiative: Central IT only; early discussions with Indiana & Cornell; Modeled with WTC on Indiana’s activity-based costing in 2011; continuing simplification.

181 cost categories, 50 lines of business like “Hosting Services”, “Network Connectivity”, etc. Coordinated with the service catalog.

Methodology: all costs (people time, which is 72% of cost, plus other-than-personal services) are allocated to services as direct and indirect costs (shared costs and overhead).
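As a rough illustration of the allocation arithmetic (figures hypothetical, not UVa’s): direct costs are assigned per service, then shared costs and overhead are spread in proportion to direct cost.

```python
# Hypothetical figures. Direct costs per line of business, plus a pool
# of shared costs/overhead allocated proportionally to direct cost.
direct = {
    "Hosting Services": 400_000,
    "Network Connectivity": 250_000,
    "Help Desk": 150_000,
}
overhead = 200_000  # shared costs: space, power, administration

total_direct = sum(direct.values())
fully_loaded = {svc: cost + overhead * cost / total_direct
                for svc, cost in direct.items()}

for svc, cost in sorted(fully_loaded.items()):
    print(f"{svc}: ${cost:,.0f}")
# Hosting Services carries half the direct cost, so it absorbs
# half the overhead: $400,000 + $100,000 = $500,000.
```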

Collect staff time through managers – once a year now.

Benefits so far: awareness of all service costs and dependencies; streamlined rate setting; helpful for sourcing decisions (Help Desk, cloud IaaS); streamlined responses to surveys like Core Data and the Hackett consulting engagement; enabled greater transparency with constituents (schools, advisory groups, etc.).

Limitations/Challenges: takes a lot of effort to maintain; accuracy of time reporting; capital costs hit the budget in the year they’re expended, not over the asset’s life; one-time project costs are hard to allocate; benchmarking/comparability – partial service costing makes it difficult (e.g. ERP costs are not just in central IT, so how do you pull it all together?), as does how institutions budget (fringe benefits, power, space, etc.); managing costs of services that cross org boundaries – how do you get managers to think about lowering costs, and what’s good vs. bad? Using the data to manage is the next step.

Possible future uses: benchmarking (which services? a common tool?).

Jim Phelps – Washington

There’s a Technology Business Management Council that includes lots of big companies. They have a framework for what you call services, which Apptio can track so you can benchmark. You can plug Apptio into a CMDB and service catalog, or add data manually, and Apptio has an integration with ServiceNow. One of the values of Apptio is that you are forced into using their definition of service towers.

There’s discussion about the value of understanding costs vs. the value of benchmarking. Several people have tried to benchmark, but it’s hard, and may even be impossible.

The data and process architecture side of this is not trivial.

Panel discussion

Does this force simplification of the service catalog? Need to keep the service catalog in sync with the costs of services. Services roll up into portfolios.

CSG Spring 2015 – The Data Driven University, part 2

Tom Lewis, Washington

Who are the traditional players? Institutional Research; the Office of Educational Assessment; the Data Warehouse team (who do good work, but saw their client as Finance).

Modern players & practices – sources of change: from above (President, Provost, VPs, AVPs, Chancellors); from the middle (deans, chairs, heads of admin units – especially those focused on undergrads); from below (staff doing the work, faculty); from the outside (BI and analytics vendors).

Becoming Modern –

Course Demand Dashboards – Notify.UW. Enterprising students were screen-scraping the registration system to notify about openings in courses, charging other students for the service. So UW built Notify.UW, which can notify you via email or SMS when an opening occurs in a class; almost 25k subscribers. What else can be done with the data? Understanding course demand: Notify.UW knows what classes students want; the student system knows about course offerings and utilization of capacity. Mashed them up to see where demand exceeded capacity.
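A hypothetical sketch of the core loop behind a service like Notify.UW – the endpoint, field names, and notifier below are invented for illustration, not UW’s actual API:

```python
import time

import requests  # third-party: pip install requests

STATUS_URL = "https://example.edu/api/course-status"  # hypothetical endpoint

def poll_and_notify(course_ids, notify, interval=60):
    """Poll course capacity; call notify(course_id) when a seat opens up."""
    was_full = {c: True for c in course_ids}
    while True:
        for course in course_ids:
            data = requests.get(STATUS_URL, params={"course": course}).json()
            has_seat = data["enrolled"] < data["capacity"]  # hypothetical fields
            if has_seat and was_full[course]:
                notify(course)  # e.g. fan out to email/SMS subscribers
            was_full[course] = not has_seat
        time.sleep(interval)

# poll_and_notify(["CSE 142"], notify=lambda c: print("seat open in", c))
```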

The Cool stuff: Central IT BAs and engineers pulled in a like-minded colleague from the DW to do innovation work with data. The Provost, deans, and chairs got excited; built out dashboards using Tableau.

The Great Civitas Pilot – Why student success analytics? People don’t understand much about their students, when to do interventions, or longitudinal views of program efficacy and impact. Tried Civitas – it takes data from the student system, the LMS, and the data warehouse. Illume: analyze key institutional metrics, starting with persistence; view historical results and predictions of the future. Inspire for Advisors.

The Cool stuff: Admin heads looked to IT to help solve problem because of success of course dashboard. Faculty, teaching and program support staff are eager to get started.

Show Me the Data!

Assessment folks didn’t understand the value of giving access to data that hasn’t been analyzed. IT team interviewed people for data needs, then involved assessment people in building dashboards with Tableau to realize those needs.

Data Warehouse folks have gotten the religion – look at the UW Data & Analytics page.

Central IT is the instigator and change agent, but needs BAs with deep data analysis skills.

We all need to be hiring data scientists with deep curiosity – we can’t keep having technical folks whose answer is that it takes too long to go through the data. We should partner with existing data science centers on campus. If we’re really going to be data-driven universities, IT will be at the center – we touch all parts of the institution, we have the tools, and we know the most about how the data interacts.

Mark Chiang – UC Berkeley

Used to have to go to separate offices to get data, mash up into spreadsheets, do pivot tables, for every request.

Data Warehouse: Cal Answers – Students (applicants, curriculum, demographics, financials); Alumni; Finance; Research; HR; Facilities.

Built out high-level dashboards for deans and chairs to answer questions about curricula: enrollments, offerings, instructor data, etc. Facilitates discussions between deans, faculty, and administrators. The effort was driven by the CFO, and it makes the job much easier. It has brought substantial additional investment.

Can build out prototypes in a couple of weeks on top of live data to prove concepts before building the real enterprise work.

Discussion

Will the data warehouse look significantly different in a few years? We don’t do a good job of understanding how data security needs change as data ages. There’s a place to incorporate new types of data, like sentiment analysis on social media. Instructure is working on making Canvas data available via AWS Redshift. Much of the new thinking and activity about data is not coming from the traditional BI/DW teams, but those folks are more willing to partner now than they used to be.
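On the Redshift point: since Redshift speaks the PostgreSQL wire protocol, a standard Postgres driver can query the data once it lands there. A sketch – the host is a placeholder, and the table and column names are invented for illustration, not the actual Canvas Data schema:

```python
import psycopg2  # Redshift accepts PostgreSQL-protocol connections

conn = psycopg2.connect(
    host="canvas-cluster.example.redshift.amazonaws.com",  # placeholder
    port=5439, dbname="canvas", user="analyst", password="...")
cur = conn.cursor()

# Table and column names are invented -- not the actual Canvas Data schema.
cur.execute("""
    SELECT course_id, COUNT(*) AS submissions
    FROM submissions
    WHERE submitted_at > current_date - 30
    GROUP BY course_id
    ORDER BY submissions DESC
    LIMIT 10
""")
for course_id, n in cur.fetchall():
    print(course_id, n)
```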

CSG Spring 2015 – The Data-Driven University – part 1

Kelly Doney – Changing the Conversation at Georgetown

Getting lots of questions around data not collected in traditional ERP – how many times did you visit your advisor? What volunteer opportunities did you do? Who was your favorite professor?

Advancement needs to follow alumni every step of the way.

Provost asking questions – process efficiency, quality of instruction, but also outcomes: what happens to graduates in the first five years and beyond, and relating those data back to experiences on campus.

Vice Provost for education sponsoring an effort – wants to measure cultural impact of Georgetown on students: learning to learn, well-being, empathy, etc. Creating embedded cultural practices to track that.

Using Enterprise BI + CRM for data analysis

Trying to break down silos of data ownership. Workday enabled some of this, as shadow-system owners realized they weren’t getting feeds from the new system. Went live with the Finance and Student data warehouse this year.

Been partnering with Advancement to bring enterprise CRM to campus; need to think about other sources too. Just finished the first part of a playbook project with Deloitte and Salesforce to create a playbook for higher ed institutions that want to look at CRM at an enterprise level. Talked to 20 different offices and identified 150 use cases for CRM. Have a high-level Salesforce object model. Going to take on a pilot; it needs to be refined by the community.

Phase 1 – Advancement and Requirements. Phase 2: Advancement and CRM Core. Future phases: CRM and larger engagement.

Salesforce licensing model is cost prohibitive for higher education – they’ve agreed to come to the table to discuss this.

User community always asks for lots of control and flexibility in reporting, but doesn’t make time to learn tools.

Debbie Fulton – VA Tech – Role of BI tool at VT

It’s not how you get there… unless you can’t get there. The perfect BI tool is not the goal and will not create a data-driven university. But if you have no viable tool, your goals may be unattainable.

VT’s journey – any tool will do (almost), but they needed to figure out what mattered to VT. They had used Brio since the early 2000s; it had a lot of limitations: licensing, required desktop installation, browser problems, etc. They had a lot of standardized reports that required developers to create. Put out an RFP.

It was important that sponsors realized that getting a tool did not create the data-driven university. Brought in EAB to make recommendations on creating the data-driven university, which added credibility.

Goals: replace soon-to-be-obsolete technology; leverage the data warehouse (didn’t want to rebuild); position VT for the future (unstructured data, mobile access, diversity of data sources); address issues with the current environment (inconsistent distribution and management of information; lengthy and inconsistent report development cycle; lack of modern presentation and analytical functionality; inadequate licensing of legacy tools and product obsolescence).

RFP Requirements: Pixel Perfect Enterprise Reporting (not just SQR reports); Ad hoc reporting; analytics, visualization, and predictive modeling; scheduling and distribution; dashboards; mobile implementation; common data model (virtual data model, supporting a common data model regardless of reporting tool used).

Two vendors supported the data model concept: Attivio (search-based) and Denodo (which actually builds a data model). Both add a layer of complexity that would’ve lengthened the timeline, and both were expensive. MicroStrategy added the ability to build a model that other tools could look at; that layer isn’t as robust as the dedicated tools, but was good enough.

Purchased MicroStrategy.

Benefits realized and next steps: a site license for MicroStrategy covering administrative and academic usage; a tool with full functionality to support BI; an opportunity to jumpstart the BI dialogue – the questions have changed beyond complaining about the lack of good tools; BI sponsorship and a steering committee; data governance – beyond data stewards; BI leadership and evangelism.

Questions for consideration in achieving a data-driven university: How do we progress with all the aspects of a BI implementation (data governance, evangelism, analytics, etc.) that need to come together? Where does IT fit? Could we learn from the evolution of learning systems for how we might create data analytics services, partnerships, and direction between IT and the university?

Business Intelligence Pain Points – Todd Hill, Notre Dame

Finding and acquiring BI talent – we can’t pay what industry does. Some places use staff who were graduate assistants; some use offshore resources, but that presents challenges. One excellent BI person is worth three mediocre ones – invest wisely. Build your own BI skills internally; develop a BI competency center.

Tools – Notre Dame historically used Business Objects, but is now moving towards the Microsoft stack + Tableau. Found that over half of what they built didn’t get used, so they needed to change the model. Build personal BI, then team BI, then enterprise BI – find what works in a less costly way before moving up the maturity level; you can’t go right from zero to enterprise. One-month personal BI solutions: 1–2 customers, non-refreshing data. Then add data governance and build for the team, and after that build in security at an enterprise level.

Assessment Framework: How well do your customers know what they want? How clean is the data? How clearly defined are the data elements? How well understood are data access and security? How technically savvy are your customers?

Create a data steward position; involve constituencies; show a RACI matrix; publish data definitions in a BI portal. Notre Dame has a data governance seal of approval for data that’s been defined through the process.

Addressing Organizational Silos – co-locate when possible to promote teaming; have cross-departmental user stories; use sponsors to clear organizational silos. Deans are asking for dashboards that cross those silos – e.g. research, finance, HR.

Sometimes you can take advantage of new ERP implementations to change the model of (for example) data access.

Addressing BI Project Demand – Agile methodologies can help. Partner with app development teams; partner with tech savvy customers; build BI competency center.

CSG Spring 2015 – Research Computing Directions, part 1

The afternoon workshop is on research computing directions. Strategic drivers are big data (e.g. gene sequencing); collaborations; mobile compute; monetization.

Issues: software-defined everything enables you to do things cheaper; cloud/web-scale IT drives pricing down; mobile devices = sensor nets + ubiquitous connectivity; GPUs/massive parallelism. Containerized and virtualized workloads and commodity computing allow moving analysis tools to the data. Interconnect science DMZs. Federations, security & distributed research tools.

Case Studies – where are we today?

Beth Ann Bergsmark, Georgetown: Ten years ago Georgetown did very little to shape central IT to align with researchers, and conversations always started as security issues. Researchers started realizing that the complexity of what they were building needed IT support. Central IT has adopted research support – grew the organization to build support across its fabric, and built a group for supporting regulatory compliance. Most research has moved into the on-premise data centers. Putting staff on grants – staff from traditional operations areas. It’s a fantastic career path, plus it makes them more competitive for grants. Understanding how to create partnerships. Regulatory compliance control complexity is continuing to grow, but research management software is also maturing; thinking about integration of those apps. Research computing is driving future planning – networking, storage (including curation), compute – and driving the need for hybrid cloud architecture. Researchers will go where the opportunities and data are. Watching the open data center initiative closely – AWS hosting public federal data. Portability becomes key: PIs and researchers move, which is hard on premise; in the cloud it should be easier. Need to build for portability. New funding models for responding to the life cycle of research.

Charles Antonelli – Michigan: Has been doing research support in the units since 1977. No central support for research computing except a 1968-era time-sharing service. In 2008 there was a flurry of clusters built on campus in various units that had HPC needs, one of which was Engineering’s. In 2009–10 the first central large cluster was born, and it has been growing since. Flux cluster: 18k cores with ~2,500 accounts – the primary vehicle for on-campus support of large-scale research computing. It does not yet support sensitive data because it speaks NFS v3; that will be fixed with a new research file system. The cluster is around 70% busy most of the time. Central IT does not provide much consulting help for users – there is an HPC consulting service currently staffed at 20% of one person. Have been looking at the cloud; it’s hard to understand how to use licensed software there. Have been using Globus Connect for a long time, and are hooking up group management to the Globus endpoints.

Charley Kneifel – Duke: Duke Shared Cluster Resource – monolithic cluster, Sun Grid Engine scheduler, solid scientific support staff, problematic financial model, breakaway/splintered clusters spun up by faculty. New Provost and Vice Provost for Research, active faculty dissatisfaction, new director of research computing in IT. Now: Duke Compute Cluster, SLURM job scheduler, reinvigorated financial model – cover 80% of need on campus with no annual fees for housing nodes, moving capex to opex. Faculty who’ve built their own clusters are now interested in collaborating. Going towards: a flexible compute cluster with multiple OSes, virtualized and secure; additional computing servers/services – specialized services such as GPU clusters or large-memory machines; flexible storage – long-term, scratch, high-performance SSD; flexible networking – 10G minimum, 40G+ interswitch connections, 20G+ storage connections, SDN services. Challenges: history; the wall between the health system and the university. How to get there? Allocations/vouchers from the middle; early engagement with researchers; matching grants; SDN services; cooperation with colleges/departments; support for protected-network researchers; outreach/training – Docker days, meetings with faculty. Requires DevOps – automation, workflow support, Hadoop on demand, a GUI for researchers to link things together. Carrots such as subsidized storage, GPUs, large-memory servers; cut-and-pasteable documents suitable for grant submissions; flexibility; removal of old hardware.
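For readers who haven’t used it, SLURM consumes batch scripts whose #SBATCH directives declare the resources a job needs. A minimal sketch of generating and queuing one from Python – the partition, resources, and command are site-specific placeholders, not Duke’s actual configuration:

```python
import subprocess
import tempfile

# Minimal SLURM batch script; partition, resources, and command are
# illustrative placeholders.
job_script = """#!/bin/bash
#SBATCH --job-name=demo
#SBATCH --partition=common
#SBATCH --nodes=1
#SBATCH --ntasks=8
#SBATCH --mem=16G
#SBATCH --time=02:00:00
srun ./my_analysis --input data.csv
"""

with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write(job_script)
    script_path = f.name

subprocess.run(["sbatch", script_path], check=True)  # queue the job
```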

Tom Lewis & Chance Reschke, Washington: Conversations with research leaders in 2007–08. 50+ central IT staff involved; 127 researchers interviewed, selected according to the number and dollar amount of current grants relative to others, and awards and recognitions. Goal: learn about future directions of research and the roles of technology. IT & data management expertise, data management infrastructure, computing power, communication & collaboration tools, data analysis & collection assistance – that equals cyberinfrastructure. By 2005 data centers were overwhelmed and data science discussions began; by 2007 the VP of Research convened forums to discuss solutions; by 2010 the first set of services rolled out. Why? Competitiveness & CO2 – faculty recruitment & retention, a data center space crisis, the climate action plan, a scaling problem. Fill the gap: speed of science/thought – faculty wanted access to large-scale, responsive, supportable computing as they exceeded the capacity of departments. A set want to run at huge scale – prep for petascale. Need big data pipelines to instruments. Data privacy for “cloudy” workloads. Who’s doing it? UW-IT does most of it through service delivery, mostly through cost recovery; the Libraries work on data curation; the eScience Institute works on big data research. The first investment was to build scale for the large researchers who were asking the Provost. Built credibility, and now getting new users. Faculty pay for the blade, which is kept for 4 years. Just added half an FTE data scientist for consulting.

UCSD – UCSD has 25% of all research funding across UC. Most research computing is at the San Diego Supercomputer Center. Two ways to access: XSEDE (90% of activity), where users get programming support and assistance and there are champions around campus to help; and the Triton Shared Computing Cluster – recharge HPC, where you can buy cycles or buy into the condo. 70% of overall funding comes from the research side; the rest comes from the campus or the UC system. Integrated Digital Infrastructure is a new initiative started by Larry Smarr: SDSC, the Qualcomm Institute, PRISM@UCSD, the Library, Academic Computing, Calit2. A research data library for long-term data curation is part of that initiative.

CSG Spring 2015 – Security 3.0: Physical meets cyber (IDS meets GIS)

Randy Marchany – VA Tech

All security is local; empower the local departmental IT staff; Business Process trumps the Security Process if there’s a conflict; learn the business process before imposing security requirements; restrictive security practices cause worse problems overall.

Three main business processes at universities: academic, administrative, research.

Continuous Monitoring: Keeping someone from getting inside has failed miserably. Firewalls are not effective protection devices – they are effective detection devices. Change the strategy – assume they are in, and go hunt for compromised hosts; monitor outbound traffic; prevent their command-and-control communication. Inbound monitoring catches server-side attacks; outbound monitoring catches client-side attacks.
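One concrete form of outbound monitoring is beacon detection: command-and-control implants often call home at fixed intervals, so suspiciously regular gaps between connections from one host to one destination are worth hunting. A sketch, with the flow-record format assumed for illustration:

```python
from collections import defaultdict
from statistics import mean, pstdev

def find_beacons(flows, min_count=6, max_jitter=0.1):
    """flows: time-ordered (timestamp, src_ip, dst_ip) outbound records.
    Flag (src, dst) pairs whose inter-connection gaps are suspiciously
    regular -- a common signature of C2 beaconing."""
    times = defaultdict(list)
    for ts, src, dst in flows:
        times[(src, dst)].append(ts)
    suspects = []
    for pair, stamps in times.items():
        if len(stamps) < min_count:
            continue
        gaps = [b - a for a, b in zip(stamps, stamps[1:])]
        avg = mean(gaps)
        if avg > 0 and pstdev(gaps) / avg < max_jitter:
            suspects.append(pair)
    return suspects

# A host phoning home every 300 seconds gets flagged:
flows = [(t * 300, "10.1.2.3", "203.0.113.9") for t in range(10)]
print(find_beacons(flows))  # [('10.1.2.3', '203.0.113.9')]
```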

Developed a map tool that displays an estimate of the number of people occupying general-use classrooms and dining facilities, hour by hour throughout the week. They have a game-day app that tracks concentrations of people within the football stadium. Married GIS with IDS sensors so they can see where machines are that are being attacked or compromised.
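The GIS/IDS marriage is essentially a join: resolve an alert’s source IP to a building via the network inventory, then to coordinates via GIS. A toy sketch with invented lookup data:

```python
# Invented lookup data: subnet -> building (network inventory),
# building -> coordinates (GIS layer).
subnet_to_building = {"10.10.1": "Torgersen Hall", "10.10.2": "Burruss Hall"}
building_coords = {"Torgersen Hall": (37.2297, -80.4201),
                   "Burruss Hall": (37.2290, -80.4235)}

def locate_alert(ip):
    """Map an IDS alert's source IP to a building and map coordinates."""
    subnet = ".".join(ip.split(".")[:3])  # assumes /24 building subnets
    building = subnet_to_building.get(subnet)
    return building, building_coords.get(building)

print(locate_alert("10.10.1.57"))  # ('Torgersen Hall', (37.2297, -80.4201))
```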

Challenges: Funding, Training, Process, Technology.

CSG Spring 2015: Security 3.0 – SCADA Tales of Horror!

After the break there’s discussion about risk management, and business impact analysis. How do we know what’s worth protecting for what kind of investment? There’s a point made that we’ve focused on protecting devices, not data. Should be focusing on what is your data and where does it reside? Can we partner with audit teams to look at these risks?

When Worlds Collide: Security, the physical world & IoT – Bill Allison, UC Berkeley

Consider what’s coming: media, environmental monitoring, infrastructure management, manufacturing, energy management, medical and healthcare systems, building and home automation, transportation, large scale deployments (e.g. smart cities).

More is not just more – it’s different.

Security at Berkeley began around WWI with three guys with flashlights, guns, and sticks. 1986 – the Cuckoo’s Egg online intrusion; then the 1988 Morris Worm.

SCADA – Supervisory Control and Data Acquisition – generations: first generation, monolithic; second generation, distributed; third generation, networked; fourth generation, Internet of Things.

Most institutions have SCADA systems, but they’re not controlled by IT.

SCADA and BAS – Jim Jokl, UVa

SCADA focus: industrial process automation, utilities, gas pipelines. BAS – Building Automation Systems. Same technologies, by and large.

Monitor and control: heating and cooling systems, including our data centers, hospitals and clinics, animal studies areas, biosafety rooms, HAZMAT areas; power systems, distribution, generators and transfer switches; often fire and security systems, sometimes door locks. How secure are these systems?

These systems are not small: at UVa, 200 buildings; 90,500 physical BAS points used for monitoring and/or control; 15,000 BAS controllers.

BAS network technology: a common protocol, BACnet, supports services beyond HVAC. Security was not a focus originally; standards now exist, but deployment of BACnet security is limited. Multiple transports are supported: RS-485, ARCNET, Ethernet, BACnet over IP.
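Part of the security concern is how easy discovery is: BACnet/IP runs over UDP port 47808, and an unauthenticated Who-Is broadcast will prompt every controller on the segment to announce itself with I-Am. A sketch – the 12-byte Who-Is encoding follows the standard broadcast form; only run this on a network you are responsible for:

```python
import socket

# BACnet/IP Who-Is: BVLC Original-Broadcast-NPDU header + NPDU + the
# unconfirmed Who-Is APDU. No authentication is required -- which is
# exactly the security point.
WHO_IS = bytes.fromhex("810b000c0120ffff00ff1008")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.settimeout(3.0)
sock.sendto(WHO_IS, ("255.255.255.255", 47808))  # 47808 = 0xBAC0

try:
    while True:
        data, addr = sock.recvfrom(1024)
        print(f"I-Am from {addr[0]}: {data.hex()}")  # each controller replies
except socket.timeout:
    pass
```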

SCADA – what is different? SCADA networks perform critical functions: temperature, pressure, valves, generators, chillers, etc. Constructed with old technology on a very long refresh cycle (15–30 years). Intrinsic security is generally lacking; systems are expensive ($1M for a moderate building); limited CPU power in devices makes crypto or mutual authentication hard. Firmware update facilities work well, allowing anything to be pushed to a device.

Typically campuses hire controls vendors whose knowledge of networking comes from dedicated dial-ups. You see that in things like video surveillance systems too.

Decentralized SCADA – Much SCADA gear is outside the control of facilities: freezers; lab equipment; door systems; classroom controls; cameras. Many of us have no knowledge of what types of control and data acquisition equipment departments place on the network.

Protection strategy: IT security (firewalls, etc.); physical security; monitoring; cryptography – work towards being able to treat the SCADA network as an untrusted network.

We’re back in the ’90s again, sort of: important equipment that can’t protect itself; protocols that are open, widely deployed, and insecure; a large installed base of old equipment on a slow refresh cycle. But our ability to add external protection is much better; active monitoring is generally in place; system owners generally understand the problem and want to fix it; and the main control software runs on modern platforms.

CSG Spring 2015 – Security 3.0: The CISO’s Empty Cooking Pot, Part 1

Stefan Wahe, Madison: The CISO’s Empty Cooking Pot

Goals: describe the baseline of the Cyber Security Strategic Plan; learn how to gain participation in achieving the plan; identify how you may help cyber security on your campus.

Background: If a strategy’s posted on a website, does it make a sound? UW Madison’s 2011 IT Security Strategy.

People forgot about the strategy – no reporting, no accountability. Positive outcomes: consolidated two competing groups; elevated security to report through the CIO’s office. A new CISO brought a risk-based methodology and created a 100-day plan that included drafting a cyber-security plan. Hired a Chief Data Officer, who brings governance groups together to talk about data.

The baseline strategy will: have a commonly agreed-to purpose; be understood by the community; establish a governance model; assign accountability; have a communications plan; and be flexible and adaptable to change.

Cyber Security Baseline: identifies current and emerging threats to support the strategy; identifies the responsibilities of the CISO and the IT security org; identifies and empowers governance groups to participate in and evolve the strategy; identifies goals, assigning accountability and timeframes; aligns with the campus and IT strategies.

Strategic Elements: complete a data governance and information classification plan; establish a risk management framework to reduce cybersecurity risk; build a community of experts; consolidate security operations; improve cyber threat intelligence analysis, dissemination, and remediation; optimize services, establish metrics, promote compliance. Each element has SMART goals.

Enabling Objectives: the tactical things that need to be done – establish restricted data environments, centralize data collection, etc.

Governance: Identify governance groups to empower community to meet goals.

CSG Spring 2015 – Security 3.0: Card Access Security using Grouper

Charley Kneifel – Duke

Legacy environment: prior to 12/14, used Blackboard Uptim to manage door access; independently managed buildings; no consistent rules for building access across campus; you would have to ask each building owner for access; deactivation of cards was not as quick as desired.

Implementation of Blackboard Transact: started in early 2014, with planning beginning in 2013. Covers both financial systems and access control; access control services are separated from financial transactions as part of the implementation; lots of cleanup/preparation prior to transition.

Management of access plans: it looks a lot like managing groups – based on who you are, your role, what classes you take.

Community plans: groups created for faculty, staff, students; needed to agree on standard business hours.

Individual building plans: access for students can be based on major or classes. Challenges: the financial system is based on who pays you, not where you work. Local coordinators are trained to use tools to manage membership; the goal is to get to role-based access.

Future steps: wrapping Grouper seems to be a common trend (see the sketch below) – toolkits for instruction; research toolkits for research group services management; access control membership; share? Additional technology enablement: contactless, but with legs? Apple’s plans? Roles, roles, roles.
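A hypothetical sketch of what “wrapping Grouper” looks like for door access – a provisioning job pulls an access plan’s membership from Grouper’s REST web services and pushes it to the card system. The host, version path, group name, and response handling are illustrative; check your Grouper WS documentation for the exact URL format:

```python
import requests  # third-party: pip install requests

# Illustrative host, version path, and group name.
GROUPER_WS = "https://grouper.example.edu/grouper-ws/servicesRest/json/v2_2_000"
GROUP = "org:access:buildings:library:business-hours"

def access_plan_members(group):
    """Fetch members of the Grouper group backing a door-access plan."""
    resp = requests.get(f"{GROUPER_WS}/groups/{group}/members",
                        auth=("ws-user", "ws-password"))
    resp.raise_for_status()
    result = resp.json()["WsGetMembersLiteResult"]  # shape may vary by version
    return [m["id"] for m in result.get("wsSubjects", [])]

# Push the membership to the card system's access plan.
for subject_id in access_plan_members(GROUP):
    print("grant access:", subject_id)
```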

Built access for students to get the data about their card activity.

Lack of good space management data hinders the usefulness of roles.