I.     Announcements

Box.com:  February is Box.com awareness month.  OIT will be sending out communications about the enhancements and changes coming to Box.com on February 1st, which include a new user interface that allows users to edit files directly in the new web application.  The new interface will also give users the ability to set their homepage to Favorites.  OIT will be holding Lunch and Learn sessions and will be posting enhancement documentation and videos on the website (https://box.duke.edu/).

Duke Research Computing Symposium:  The second annual Research Computing Symposium, held over three days beginning January 20th, capped off its events with a series of scientific presentations.  The poster session was particularly successful this year, with 35 posters submitted, including 6 entered in the Scholars@Duke visualization challenge, which focused on datasets describing the research activity and output of Duke researchers. 

New Associate Vice Provost for Digital Education and Innovation:  Matthew Rascoff has been named the new Associate Vice Provost for Digital Education and Innovation.  He will succeed Lynne O’Brien, who retired late last year.  Matthew Rascoff is local to the Raleigh-Durham area and has national and international experience, including work with Google and JSTOR (a digital library of academic journals, books, and primary sources).  He begins work on February 8th.  

II.    Agenda Items

 

4:05- 4:15 – Gigabit Internet at Duke, Bob Johnson (5 minute presentation, 5 minute discussion)

What it is:   Gigabit broadband Internet service is almost 100 times faster than the average Internet connection speed available in America today.  The Duke University campus has had multi-gigabit Internet service capability for about a decade, but the ability to take that technology off-campus gives that same power to students, staff, faculty and our neighbors at home.

Why it’s relevant: Gigabit connections open the door to leveraging technology in ways that were not possible before.  Bob will provide a brief update on the Duke gigabit Internet environment.

https://today.duke.edu/2015/07/gigabit

http://www.dukechronicle.com/article/2015/08/faster-internet-coming-to-durham-triangle-region

https://ncngn.org/relatedlinks/

Regional Fiber Ring:  Duke has partnered with MCNC, a technology company located in RTP that provides broadband connectivity to universities and community colleges across NC, as well as with a vendor, to build out a regional fiber ring covering Wake, Orange and Durham Counties.  In addition to providing broadband service to Duke and Duke Regional Hospital, the new fiber ring would provide ~74 sites, including the NC1 Data Center and Duke Clinics, with direct linkages to the Duke network, including redundant paths to improve the resilience of those connections. 

The benefits of Duke’s partnership with MCNC extend beyond the shared build-out of the fiber ring.  The partnership will also allow the Duke Marine Lab to connect to MCNC’s expansive statewide network, allowing the Marine Lab to increase to a 10 gigabit connection.  The Lemur Center, which has had connectivity issues in the past, will also benefit from the partnership with increased bandwidth. 

Another benefit of the partnership with MCNC is the extended relationships it has across the country with Internet2 and with international providers such as CERNET in China.  This allows Duke to remain flexible in its gigabit growth as communities begin moving to higher speeds.  Duke currently has 30 gigabits of network connectivity: 10 gigabits for the University, 10 for the Health System and 10 for the Science DMZ – a portion of the network designed for high-performance scientific applications.

Project Status:  The project is on track.  We are currently in the most time-consuming part of the project, which is permitting.  The first permits will allow us to pull fiber from campus west towards Chapel Hill (inclusive of University Tower) and east towards our main peering point with research and engineering networks and Duke Regional Hospital in Raleigh.  Conduit is the most expensive part of the project.  We originally planned to purchase the use of conduit from a vendor, but those plans changed, allowing Duke to purchase the conduit.  This gives us the option of selling it to others interested in using it.

Questions and Discussion

Question:  Why do we want to own the conduit? 

Answer:  Fiber is pulled through conduit, which protects the fiber.  Eighty-five percent of the cost is in the conduit.  Owning the conduit will give us future capacity and flexibility.

Question:  Who will be laying the conduit? 

Answer:  Duke has contracted with a vendor to build out the fiber in the northern half of the fiber ring.  MCNC will be building out the southern half.

 

4:15- 4:35 – Gigabit Internet in the Triangle, Dennis Newman (10 minute presentation, 10 minute discussion)

What it is:  Duke has been a significant force behind the effort to bring gigabit broadband to our region, including two years of coordination with the North Carolina Next Generation Network (NCNGN).  We worked closely with N.C. State, UNC, Wake Forest, Durham and surrounding communities to attract interest from several companies that offered to launch gigabit broadband service.

Why it’s relevant:  Affordable, ultra-fast broadband connections can allow Duke faculty, staff and students engaged in data-intensive projects to work more seamlessly between the classroom, office, lab and home.  Increased availability of these networks can also transform the way Duke faculty, staff and students engage with one another as well as with the community and world around them.  Dennis will provide an update on the progress of gigabit broadband service in the Triangle region.

Fiber-Based Internet:  Dennis Newman, former CIO of Winston-Salem and current Program Director of NCNGN, a regional initiative focused on stimulating the deployment of next-generation broadband networks in NC, provided an overview of the expansion of gigabit connectivity in our region.  Dennis was an instrumental member of WinstonNet, a project that deployed a fiber network throughout the city of Winston-Salem and allowed the city to save money and expand fiber throughout its municipal operations and universities.

Residential and Commercial Gigabit History:

    • In 2012 there were no privately owned, commercially available fiber services for residents in North Carolina. 
    • In 2014 NCNGN signed an agreement with AT&T to bring gigabit broadband to the Triangle region. 
    • In 2015 Google announced that it would bring Google Fiber to the Triangle and Charlotte. 
    • In 2017 gigabit efforts were underway throughout the state: 
      • AT&T GigaPower service is now available in cities across North Carolina including Charlotte, Greensboro, Raleigh-Durham and Winston-Salem. 
      • Google has a smaller presence in NC.  They are reevaluating their fiber plans.  They plan to complete installations they’ve started, but others are on hold.  

Accelerating Gigabit Adoption:  There are several efforts underway to accelerate the adoption of gigabit fiber. 

    • NCNGN – Promotional video explaining how and why we are using gigabit fiber (https://warpwire.duke.edu/w/G24BAA/?t=1.3100001)
    • Smart City Initiatives – sustainability initiative
    • NCNGN and AT&T Enablement – collaborative efforts to demonstrate capabilities
    • US Ignite Smart Gigabit Communities – a national network of communities that have committed to leveraging next-generation applications and Internet technologies to keep pace with the world’s rapidly changing technology and economy

There will be a reverse pitch event taking place on February 28th in Raleigh, Greensboro and Charlotte (https://www.us-ignite.org/blog/2017/2/reverse-pitch-events/).  Leaders are asked to present challenges that require the high bandwidth and low latency of gigabit fiber infrastructure.  US Ignite and other partners will award two teams $19k each to develop applications that rely on high-bandwidth/low-latency connections and that can be scaled nationally.

Questions and Discussion

Question:  Has there been any resistance on the part of the state regarding non-commercial, city-based development? 

Answer:  NCNGN has been careful to work within current legislation.  Municipalities can encourage investors to develop these networks but they cannot own the networks.

 

4:35- 4:45 – Gigabit Internet in the Home, Shilen Patel (5 minute presentation, 5 minute discussion)

What it is:  Gigabit Internet provides speeds of 1,000 Mbps, also referred to as 1 Gbps.  This is the speed of the connection provided by Internet Service Providers to residences, resulting in an Internet connection that is up to 100 times faster than today’s average broadband service.
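To make the scale concrete, below is a minimal back-of-the-envelope sketch in Python comparing ideal transfer times at 1 Gbps against an assumed 10 Mbps connection; the 5 GB file size and the 10 Mbps "average" speed are illustrative assumptions, not figures from the presentation.

    # Rough comparison of ideal transfer times (ignores protocol overhead).
    # The 10 Mbps "average" speed and the 5 GB file size are assumptions for illustration.
    GIGABIT_MBPS = 1000   # 1 Gbps fiber service
    AVERAGE_MBPS = 10     # assumed typical broadband connection

    def transfer_seconds(file_size_gb, speed_mbps):
        """Seconds to move file_size_gb gigabytes at speed_mbps megabits per second."""
        return (file_size_gb * 8 * 1000) / speed_mbps   # GB -> megabits, then divide by rate

    for label, speed in [("1 Gbps fiber", GIGABIT_MBPS), ("10 Mbps broadband", AVERAGE_MBPS)]:
        print(f"5 GB file over {label}: {transfer_seconds(5, speed):.0f} seconds")
    # Prints roughly 40 seconds at 1 Gbps versus about 4,000 seconds (~67 minutes) at 10 Mbps.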

Why it’s relevant:   Gigabit Internet has been part of initiatives to improve access for the communities it serves.  Shilen will provide a brief overview of his experience with the installation and use of this service in his home.

An End User’s Perspective:  Shilen Patel, a member of Identity Management in OIT, presented his experience as a Google Fiber end user.  Google began laying fiber in his neighborhood in Morrisville (Wake County) in the spring/summer of 2015.  Underground boxes were buried at every other house.  They were in and out quickly, installing the fiber across the 700+ houses in his neighborhood.  There was minimal property damage, and Google was responsive when problems did arise.

It took just over two weeks from the time Shilen signed up for gigabit service on September 13th for service to be operational in his house.  The total appointment time for the installation was 4 hours, most of which was spent running a new Cat5 line to his 2nd-story office.  The installers also set up 2.4 and 5 GHz SSIDs for his wireless router, noting that 5 GHz gives better bandwidth but lower range.  The network settings are managed over the web and can be easily changed.

Speed Test Results:  The speed test results in the table below were collected on a Tuesday at 10:00 pm using Google’s testing tool (http://speedtest.googlefiber.net/).  Shilen also ran a comparison test using Time Warner Cable’s testing tool, which returned similar results.

 

Device / Location               Ping      Download    Upload
MacBook Pro – same room         5.9 ms    652 Mbps    352 Mbps
MacBook Pro – different room    6.3 ms    461 Mbps    304 Mbps
Galaxy S6 – same room           8.9 ms    306 Mbps    211 Mbps
Galaxy S6 – different room      9.0 ms    275 Mbps    166 Mbps
Ethernet                        3.4 ms    941 Mbps    948 Mbps

Overall, Shilen felt that the installation went well and the service performed as advertised.
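For members who want to run a similar measurement from their own machine, below is a minimal sketch using the third-party Python package speedtest-cli; this is an assumption (the package must be installed separately), and it measures against Speedtest.net servers rather than the Google Fiber tool referenced above, so results will differ.

    # Minimal sketch: measure ping, download and upload with the third-party
    # "speedtest-cli" package (pip install speedtest-cli).  This tests against
    # Speedtest.net servers, not the Google Fiber tool used for the table above.
    import speedtest

    st = speedtest.Speedtest()
    st.get_best_server()              # choose the lowest-latency test server
    download_bps = st.download()      # measured in bits per second
    upload_bps = st.upload()

    print(f"Ping:     {st.results.ping:.1f} ms")
    print(f"Download: {download_bps / 1e6:.0f} Mbps")
    print(f"Upload:   {upload_bps / 1e6:.0f} Mbps")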

 

Questions and Discussion

Question:  I have found replication to the cloud for backups frustrating at 25 Mbps.  Have you tested replicating to the cloud?

Answer:  Not yet. 

Comment:  Nick Tripp had a similar experience with his Google Fiber installation in his Morrisville home.  His installation took 1.5 hours since the installation point was already Cat5e-ready.  Having the same upload and download speeds has been one of the best features. 

Question:  Do you have any experience with static IPs, Dynamic DNS or port forwarding using Google Fiber service?

Answer:  You have to use the Google Fiber Network Box, which is managed from the cloud rather than locally.  The box will support 30 to 35 Wi-Fi devices.  It supports static IPs and dynamic DNS.  If you are familiar with Google’s setup and have networking knowledge, you can use other routers, but it is not an easy swap.

Comment:  An ITAC member shared his experience with AT&T Fiber.  The fiber wasn’t run through the conduit until he signed up for service.  The installers ran into a problem getting the fiber through a curved section of the conduit, so they had to temporarily run the cable across the street, and his service went out when that cable was cut before the permanent underground installation was complete.  The speed test results for the 300 Mbps service using a Cat6 ethernet cable were 370 Mbps download and 370 Mbps upload.  Wireless results were less impressive, at 30 Mbps download and 30 Mbps upload.  There was some discussion around what may be contributing to these low wireless numbers, since he should be seeing around 300 Mbps.

Question:  Did AT&T change its stance on mining data packets? 

Answer:  AT&T has eliminated its controversial traffic scanning program and its two-tier pricing, which gave customers the choice to opt into the traffic scanning program in order to receive the lowest available rate.  Unless you are using a VPN or other means, Google is mining traffic data.

Comment:  If you have AT&T’s 1 Gbps service, then you have unlimited data.  If you have the 300 Mbps or 100 Mbps service, then the cap is 1 TB per month.

 

4:45 – 5:00 – Shared Campus/Medical Research Network, Charley Kneifel, Billy Willis (10 minute presentation, 5 minute discussion)

What it is:  Duke Health Technology Solutions (DHTS) has launched a project to build a network in support of research use of Duke clinical data (PACE – Protected Analytics Computing Environment).  The project has reached a point where OIT, in conjunction with DHTS, will deploy a proof-of-concept Virtual Machine (VM) into this environment.  The VM will run on OIT hardware to allow campus researchers to access data and storage provided by DHTS as part of PACE, keeping Protected Health Information (PHI) within the bounds of the DHTS network.

Why it’s relevant:  Basic and clinical research are the engines that drive advances and innovation in medical care, health promotion and policy, and improved outcomes.  PHI is at the core of the data being used to support research.  This collaboration between OIT and DHTS allows Duke researchers to access this data while maintaining its security.   Charley and Billy will provide an overview of the project.

 

What is PACE:  The Health System has been building out a network in support of research use of Duke clinical data.  This includes sensitive PHI (Protected Health Information) data that resides outside of clinical applications, along with the tools used to extract the data.  The network that will house the data is called PACE – Protected Analytics Computing Environment. 

The network that supports PACE is a logical network separated by firewalls.  Users will access the PACE gateway using RDP (Remote Desktop Protocol).  The environment will support both Linux and Windows operating systems and will have the option to enforce multi-factor authentication.  Storage will be provided by the Health System on their Isilon VMs.

OIT will provide research applications such as MATLAB and Mathematica.  OIT will be connecting a Linux VM with a GPU (Graphics Processing Unit) using VRF (Virtual Routing and Forwarding) within the next week as a proof of concept. That VM will only be connected to the PACE network and not the campus network. 

In order for data to be released, it will have to go through a trusted broker service that the Health System will provide.  The Health System will still have research data in production applications that will continue to be audited.  These data will be outside of the PACE environment. 

There are a few adjustments that still need to be made, but we should have pilot users within a week or two.  The goal is to move all users off of DEDUCE (the Duke Enterprise Data Unified Content Explorer, which gives access to medical records in an open format) and onto PACE by June 2017. 

Questions and Discussion

Question:  Is there a national effort to replicate this effort so that data can be shared across institutions?

Answer:  Some data sharing is already taking place, such as the Carolinas Collaborative, which shares data across institutions in North and South Carolina.  Tightly regulated clinical payor data is also shared. 

The extent of sharing is limited, though, due to legal barriers and the fear of making costly mistakes.  The matching of patients across institutions is also an inhibitor.  Additionally, large medical centers’ proprietary data can be worth a lot of money in the form of grants. 

Question:  Can the Library use this space so that it can passively store research data until it is needed?

Answer:  We need a cohesive response to archiving across Duke as it pertains to research data.  Once we have that established, we can determine where those data are stored. 

 

5:00 – 5:15 – CSG Update, John Board, Charley Kneifel, Mark McCahill (10 minute presentation, 5 minute discussion)

What it is:  The Common Solutions Group works by inviting a small set of research universities to participate regularly in meetings and project work. These universities are the CSG members; they are characterized by strategic technical vision, strong leadership, and the ability and willingness to adopt common solutions on their campuses.

Why it’s relevant:  CSG meetings bring together leading technical and senior administrative staff from its members, and they are organized to encourage detailed, interactive discussions of strategic technical and policy issues affecting research-university IT.  We would like to share our experiences from the recent January 2017 meeting.

Winter 2017 CSG Meeting:  CSG (Common Solutions Group) met January 18th-20th in Tempe, AZ.  CSG meetings occur three times per year and are intended to bring individuals from top research universities together to find solutions to common IT problems in the higher education sector.  The following is a summary of the workshops that John Board, Mark McCahill, Evan Levine and Charley Kneifel attended.

    1. Cloud ERP Workshop:  Most of our peers are not pressing to take ERP (Enterprise Resource Planning) software such as SAP and PeopleSoft to the cloud.  When they are, they are doing so not to save money but to free up data center space for higher value purposes, or because they have very old legacy (e.g., COBOL) systems that need to be retired.  Duke, along with many other CSG peers, is cautious about moving ERP to the cloud.  We do have services in the cloud, like email and our learning management system, but we don’t have any of our core enterprise resource tools in the cloud and don’t have plans to do so in a significant way in the near future. 

The following points were made by CSG peers regarding the transition to and management of cloud ERP:

      1. Unanticipated testing is required.
      2. There was resistance to change in business units that had highly customized workflows and disparate applications.  There was concern about losing customizations.
      3. The first attempt to move to the cloud failed because processes hadn’t changed. 
      4. A ‘Project Management as a Service’ office was set up to keep projects on track. 
      5. The transition to the cloud is easier if there are integrated systems on campus.

Questions and Discussion

Comment:  Even though we haven’t had a push to move our enterprise systems to the cloud, we have been pushing to get campus adoption of common systems and common processes which would make transitioning easier as things change.

Question:  Is there a cloud-based version of SAP?

Answer:  Yes.  However, we aren’t planning to move SAP to the cloud.  We do have plans to move our enterprise core component from SAP R/3 to SAP S/4HANA in April/May 2017.  Some of the S/4 services are in the cloud.

    2. New Models for Supporting the Academic Enterprise Workshop:  The following points/observations were made:
      1. Duke is attempting to bridge the skill gap between what is being taught in the computer science curriculum and what is needed in the IT organization.  Some institutions are engaging IT professionals as guest lecturers within their educational programs.  Guest lecturers are rare at Duke.  How can Duke do this better? 
      2. UC Berkeley uses learning data, consisting of the clicks and interactions users have within the learning management system, to better understand student outcomes and to help predict success.  The data are stored in their Learning Record Store.  If Duke has future interest in big data, UC Berkeley would be a good contact; they have developed policies around the use of the data.  Duke hasn’t shown abundant interest in mining big data because our students have high success and graduation rates.  However, we could use big data to improve educational offerings or to improve the learning management system.
      3. CU Boulder requires students to take a mini-course as an introduction to the learning management system and campus.  Students can’t register for classes until they take the mini-course. 

Comment:  Big data tools exist in open source form for Sakai, and we would be very interested in partnering with other campus groups to make this a reality.  It is a big undertaking.  We have a student analytics group; maybe they should be invited to an ITAC meeting. 

    3. Recommendations/Guidelines for Updating IT Skill Portfolio Workshop:  The following points/questions/observations were made by CSG peers:
      1. IT is a rapidly changing environment.  How do we manage the growth of skills and get employees to change?  What do we do with those who don’t want to change?
      2. There are two work climates: a learning climate and a performance climate.  In a performance climate, we end up hiring more people who are risk averse. 
    4. Accessibility Issues:  One school reported working with the Department of Justice to make their 10,000+ hours of MOOC videos (publicly available “course or learning” content) compliant with the Americans with Disabilities Act accessibility standards.  They are trying to find an appropriate resolution that allows them to serve the vision- and hearing-impaired community and continue to provide free online content given limited resources.

The next CSG Meeting is scheduled for May 17th - 19th.  Proposed topics include:

    • Automating Campus Network Configuration, Provisioning, and Monitoring
    • Challenges of Shifting IT to be a Trusted Business Transformation Partner
    • Product Management, Service Ownership, etc.