ITAC Meeting Minutes
November 8, 2012, 4:00 – 5:30 pm
Announcements
SDN – Software Defined Networking – Jeff Chase, Computer Science
Big Data - Robert Calderbank, Dean of Natural Sciences, Arts & Sciences; Electrical Engineering & Mathematics
Innovation & Entrepreneurship - Robert Calderbank
Announcements
Minutes from the 10/25/12 meeting were approved.
Tracy reminded everyone about the Tech Expo in January and encouraged submissions for presentations.
SDN – Software Defined Networking – Jeff Chase, Computer Science
Tracy introduced Jeff by saying that he has been working with IT for many years, helping us learn and think about things like shared computing clusters and access to dynamic resources. These are areas of his research expertise. For the last couple of years he has been closely involved with GENI (Global Environment for Network Innovations), an effort aimed at researching better networking practices.
Jeff started by saying that Software Defined Networking (SDN) has been promoted to universities by NSF as a significant outcome of NSF research. Jeff showed a cartoon version of an SDN network. SDN networks are more dynamic and programmable than conventional networks. Stanford has developed an open standard for SDN called OpenFlow; it is essentially an open standard for controlling network switches, and OpenFlow-capable switches are now coming on the market.
For Duke, this means that we have grant-funded opportunities to figure out what we can do with SDN. We are looking at building multi-campus, multi-domain test-beds and linking multiple test-beds together. We think this will be a vehicle for deploying the next generation of networking services. There are two projects at Duke: one is a small grant aimed at developing some SDN-enabled networking capability; the other is a larger project in which we will buy switches and distribute them across the University to meet a couple of network-enhancement goals. We plan to go slowly, develop incrementally, and do pilots. We will be able to leverage the highly reliable and technically advanced Cisco IP network that we have here at Duke.
Jeff first showed how SDN can improve big data transfers by creating more direct linkages between departments, bypassing the central IP interchange services. This can be done by adding links around the edges of the IP network core. These would be controlled linkages that we can turn on and off as needed.
A second opportunity is utilizing advanced national research network fabrics from groups like NLR and Internet2. Both offer a dynamic circuit service that allows high-speed, dynamic, point-to-point Ethernet connections. This means that dedicated, high-speed connections can be created for specific needs to points around the country. It also means that we can control how these connections come into our system. A current strategy is to set up a file server outside our network for this purpose. Again, we want to be able to manage when we are connected to help keep our network safe.
The third use case is "clouds." Over time, the DSCR will evolve to a model where we can allocate resources from the cluster and configure them to order for a particular use. Right now you can run jobs, but the allocation is one-size-fits-all, which doesn't match everyone's needs. This is like VCL on a larger scale. We'd like to be able to extend the networks of on- and off-campus groups by going through the cloud. SDN may play a part in how we connect virtualized services in the cloud to home network resources like file servers. We are close to getting there in a project with RENCI and next-generation GENI test-beds. We are in the process of deploying small OpenStack cloud clusters across fourteen campuses over national fabric backbones. We have most of these in place, except for the OpenFlow piece that actually connects them.
The common underlying capability needed for all three of these is to strategically place OpenFlow-enabled switches around the edges of the Duke network. OpenFlow switches will allow us to create rules that select certain subsets of packets and divert that traffic directly to the desired point. The acronym for this approach is "ONRAMPS": OpenFlow-enabled Network Resource Access that is Manageable, Programmatic, and Safe.
Jeff went into some detail about how the current network functions and how OpenFlow might improve it. The current network requires a lot of manual administration and coordination, which is prone to mistakes that affect the stability of the network. The software-defined strategy is to have the commands to the switches issued by software: the policies live in the software, so the network hardware stays largely unchanged. We can do some of this today; remote network management is possible, but very expensive and proprietary. The OpenFlow strategy is to promote an open standard.
The risks of introducing this open standard are the possibility of "breaking" the network, or of opening the door for malicious outsiders to write software that changes how our network works.
Stanford is promoting the advantages of this commodity networking infrastructure. This is good, important work, but we want to approach it cautiously. It's still very early; there is hype, and there are technology and market barriers.
Jeff shared some technical details about how this works. Basically, we install pattern-action rules on the switch; when a set of packets matches a pattern, a well-defined set of actions is applied. The rules look only at the packet header, which contains addressing information, much like postal mail or email. The software would slightly rewrite the header information to accomplish the diversion.
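To make the pattern-action idea concrete, the following is a minimal, self-contained sketch of an OpenFlow-style match-action flow table. It is illustrative only: the header field names, the VLAN rewrite, and the port numbers are assumptions made for the example, not Duke's configuration or the OpenFlow protocol itself.

def matches(rule_match, header):
    # A rule matches when every field it specifies equals the packet's value;
    # fields the rule does not mention act as wildcards.
    return all(header.get(field) == value for field, value in rule_match.items())

def apply_rules(flow_table, header):
    for rule in flow_table:  # rules are checked in priority order
        if matches(rule["match"], header):
            new_header = dict(header)
            new_header.update(rule.get("set_fields", {}))  # optional header rewrite
            return new_header, rule["out_port"]
    return header, "NORMAL"  # no match: fall back to ordinary IP forwarding

# Hypothetical rule: divert a bulk data transfer between two lab subnets onto a
# dedicated edge link (port 5), bypassing the campus IP core.
flow_table = [{
    "match": {"ip_src_subnet": "10.1.0.0/16", "ip_dst_subnet": "10.2.0.0/16", "tcp_dst": 5001},
    "set_fields": {"vlan": 42},
    "out_port": 5,
}]

packet = {"ip_src_subnet": "10.1.0.0/16", "ip_dst_subnet": "10.2.0.0/16", "tcp_dst": 5001}
print(apply_rules(flow_table, packet))  # -> header tagged with VLAN 42, sent out port 5

The same mechanism, applied at switches placed at the edges of the network, is what would let a controlled bypass link be turned on and off for the big data transfer case described above.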
There are big players involved. Cisco has taken an "embrace and extend" approach, and there is great potential in collaborating with them. NEC is also interested. The products and tools are not fully mature, the usage model is evolving, and there is some risk: standards are incomplete and unproven, and vendors are selling tool-kits and trying to achieve vendor lock-in.
There was discussion about the ownership of OpenFlow and how legal issues might impact development in the SDN arena.
Big Data - Robert Calderbank, Dean of Natural Sciences, Arts & Sciences; Electrical Engineering & Mathematics
Robert started by describing the committee that was formed by Peter Lange earlier in the year. Robert and Keith Whitfield were asked to form a committee to think about how data of enormous scale might transform research and teaching, as well as what sorts of things we should and should not do. The focus was primarily on analytics, as opposed to computing infrastructure. The committee was organized around themes, such as digital humanities. For each theme, someone was charged with putting together three ten-minute presentations followed by 45 minutes of unstructured discussion. There were about ten themes. Robert captured each into a three-page summary that was approved by the participants. Findings and recommendations were put together and submitted at the end of the spring semester. Robert would be happy to share this report with the ITAC Committee. Altogether, about 50 faculty presented.
Robert shared the development strategy that has resulted from the effort. He explained that in the past, scholarship was a world of disciplinary silos. Access to information was a privilege that required going to where the information resided. Instruction was also centralized.
We are headed to a world where barriers between disciplines are low. There is an opportunity to make connections across disciplines, particularly between computational sciences, social sciences and humanities. There are educational opportunities to enable leaders to come together so that we can do research and teaching in new ways.
We used to be a manufacturing economy. We might aspire to get back to doing more of that, but we are largely a services economy. This doesn't mean that technology doesn't matter: a services economy is still technology-based, but it's a different kind of technology. The recipe for success in the services business is to reach a bigger scale than your competitors and drive costs down, which is done by leveraging technology to reduce salary costs.
Big Data gives us all sorts of opportunities to be actively engaged with the world. There are on-line opportunities that we are looking into. Understanding data that is available gives us enormous opportunities for research and teaching.
Robert showed some examples:
The first was the engagement between mathematics and art history. This project involved the "reconstruction" of a fresco in Padua that was destroyed by Allied bombing during World War II. Fragments gathered by the local inhabitants were pieced together using old black-and-white photographs of the fresco along with computer science algorithms. The algorithms rotated the pieces and placed them on the black-and-white picture, so you can now see the fragments superimposed on the photograph and get a sense of what the fresco was. This is a beautiful example of bringing computational science and cultural heritage together. Robert also explained how "virtual" reconstruction is valuable because it does not preclude future reconstruction using new technologies that may develop. He also cited an example in which a Vermeer painting was validated using modern forensic technology.
The second example involved children's health and human development. It paired electrical engineering and the medical school in observing children for signs of autism. Clinicians have a protocol for scoring interactions using video recordings and make diagnoses based on these scores. There is a long wait for a diagnosis. The idea here is to take video of interacting children that could be automatically annotated, so that diagnosis could be made on the basis of a larger set of videos, including home videos. We don't know how to cure autism, but we do know that early intervention is better than later intervention. We have new faculty who have new ideas about how we can get out into local schools to develop solutions.
Innovation & Entrepreneurship - Robert Calderbank
We are also thinking a little bit about online course platforms like Coursera. Our view is that we don't know which of these companies will succeed; we think the company that figures out how to build learning communities will do best.
We have many pre-med students at Duke, and organic chemistry is a rite of passage for them. We want to find ways to make students successful at learning organic chemistry so that we don't lose valuable students in the process. Student success is correlated with math SAT scores, so a new course was developed around using mathematics to solve chemistry problems. They are analyzing whether they can tell a difference with students who start chemistry this new way. We think there are lots of opportunities to do experiments on how we build communities, such as study groups and special courses.
It is hard to tell where innovation ends and entrepreneurship begins; Robert thinks it's more of a continuum, and getting where we want to be requires working on both. What we're trying to do is provide resources so that different schools can come together to create new curricula and new opportunities. There is a sense that this doesn't apply to Arts and Sciences, and one purpose of making these investments is to change that perception. This is not about apps on iPhones. It's about finding ways for students who are passionate about their area of study to continue it in life after graduation through active engagement with the world.
Fuqua and Global Health just won an enormous grant from USAID for social entrepreneurship. We want to expand people’s view of the reasons for entrepreneurship. It’s not just about getting rich. We won’t be unhappy if some people get rich in the process, but that’s not all of what it’s about. We can try to change the campus culture to be more innovative and entrepreneurial. Robert noted that Tracy has a wonderful scheme for involving students in Duke processes and operations.
Robert showed a slide with a variety of entrepreneurial efforts:
• Student company developing carbonated, low calorie tea
• Neuro-marketing to analyze neural responses when viewers see something in an ad
• Alumni involvement with internships
• Faculty use of the Old Chemistry Building for innovation projects
For the future, we need to develop some new faculty.
We think that Big Data will touch and change all departments. We have to think about how we infuse new thinking. We need active engagement, experts in residence, faculty fellows, community-focused teaching, and lifelong learning.
Q&A followed. There was discussion regarding:
• When Big Data might become too big if we aren’t proactive
• The difference between disciplines and how we train faculty for this century
• How Duke is positioned relative to peers
The meeting adjourned at 5:32 pm.