ITAC Meeting Minutes
February 18, 2010, 4:00-5:30
RENCI Center

  • Announcements and meeting minute review
  • eLearning Roadmap and Update (Samantha Earp, Ed Gomes, Julian Lombardi, Lynne O’Brien)
  • CSG Update (Klara Jelinkova)
  • GPU Cluster in SCSC (John Pormann)


Announcements and Meeting Minutes

Terry Oas called the meeting to order at 4:04pm.  Minutes from the previous meeting were approved, and it was announced that this would be Klara Jelinkova’s last ITAC meeting before her move to the University of Chicago, where she will be serving as Chief Information Technology Officer.


eLearning Roadmap and Update (Samantha Earp, Ed Gomes, Lynne O’Brien)

Ed Gomes began with an overview of the eLearning Roadmap Committee, which is charged with assessing e-learning needs of the campus community and identifying the tools, support, and infrastructure that should be centrally provided at Duke.  The committee, which Ed chairs, is researching student learning systems across the university and medical center.  More information about this initiative can be found at http://elearning.duke.edu.

The committee has identified four primary components to this research:

  • User profiles – researching which e-learning tools are popular at Duke, as well as how faculty and students interact with these tools.
  • LMS review – reviewing LMS reports from eight institutions to identify trends and evaluation methods from different institutional efforts.
  • Guiding principles – establishing a set of guiding principles that can be applied to any set of e-learning tools.
  • Functional requirements – providing a preliminary list of functional requirements for a variety of e-learning services and solutions.


There is a subgroup tasked with deliverables pertaining to each of these components, and reports from the fall semester are now available on the website.  On behalf of the committee, Ed invited feedback on these reports to help the group better evaluate and tune its direction.

Ed explained that the group is trying to evaluate the three LMS contenders (those most often used by peer universities): Sakai, Moodle, and Blackboard.  The committee is working to bring in instructional technologists who know the ins and outs of each LMS in order to conduct day-long reviews of each tool.  These reviews will be driven by the core functionality outlined within the functional requirements; students and faculty will be encouraged to attend these reviews and provide feedback on how well these tools perform against the set criteria. Additionally, meetings will be coordinated between these experts and Duke’s IT professionals to discuss logistics of implementing each tool and migrating content.

The committee will be meeting on February 24th to begin going over feedback and discussing plans for the spring semester.

John Board asked if the committee has any plans for running a real Duke course in a different LMS.  A discussion ensued regarding the feasibility of this idea.  To date, the committee has elected to focus on providing test courses developed to demonstrate the full breadth of available functions within each tool, allowing testers to experiment with the tools outside the context of a live course.  Terry Oas pointed out that brief tests of a tool are not the same as actually using it, and asked whether a test course could be implemented in a different LMS by next semester if a volunteer offered to use it for their course.  It was agreed that this could be accomplished by partnering with another Shibboleth university willing to give us access to one of the other LMSes.

The discussion then turned to comparative evaluation of the LMS contenders.  Molly Tamarkin voiced concern over how a university-wide shift from Blackboard to another LMS would be received by the community if the advantage of one over the other were minor or debatable, and pointed out that the cost of supporting one tool versus another should be evaluated along with functionality, if it is not already.  Ed clarified that the guiding principles do include support considerations, and reiterated that these documents are available on the committee’s website and are open for review and discussion should anyone wish to contribute to the guiding principles.

Terry mentioned that one lesson learned in similar projects over recent years has been that more useful feedback comes out of letting someone sit down and try to figure out a tool for themselves rather than providing guided walkthroughs.  By giving a student or faculty member a list of tasks, Tracy Futhey noted, you can often get more realistic feedback regarding the intuitiveness of a tool and where improvements should be made.  DSG representative Ben Getson suggested majors unions as a potential resource for targeting non-technical students to join these focus groups, in order to best gauge usability across many levels of technical expertise.

Lynne O’Brien discussed pilot programs at other universities, and Terry suggested that findings of peer universities would likely be very similar to what we would find with a similar pilot at Duke.  Lynne agreed that there is much to be learned from the experiences of these pilot programs.

John Board asked what the committee would do if, after all this investigation, the LMSes are found to be more or less interchangeable.  Ed responded that function is not the only concern, and that the LMSes will need to be evaluated against the committee’s guiding principles as well as logistics of support, integration, et cetera.   

Having established that reported satisfaction with Blackboard is relatively high, a discussion followed on whether a change is necessary at all.  Samantha explained that the next version of Blackboard is quite distinct from the currently implemented version, so this is a good opportunity to consider alternatives before a major change becomes necessary.  Lynne added that we are also not currently using Blackboard to its fullest potential, so a fully loaded version of Blackboard should be compared to the alternatives.

Ben Getson pointed out that as a second-semester junior, he is currently taking his first course at Duke that utilizes Blackboard for course materials.  Ben suggested that although reported satisfaction with the tool is high, dissatisfaction might be communicated by non-use.  Terry agreed and commented that he’d like to see more of the known statistics (such as number of classes using Blackboard) included in the LMS evaluation reports.

Another concern among faculty is a need for some to host documents and course materials in more than one place.  Faculty seeking tenure, for example, may host course materials on a publicly accessible site in order to keep them available to the public for review.  Noting that this was a common complaint, Ed agreed that more granular privacy controls on documents would be desirable.

Molly Tamarkin asked the committee if they think the trend is moving away from a single system, possibly making this evaluation less critical to future university activities.  Ed responded that this was certainly a point for consideration, and if findings of this research support that, they may propose a “best of breed” approach to university-supported tools rather than expecting any LMS to do it all.

Terry noted that Blackboard is currently the only way to distribute mid-semester grades electronically, and the group discussed the inconvenience of having to use many different tools to distribute course materials.

John Board asked how many of the universities identified as using Blackboard 9 are already live.  Lynne explained that most are not at this moment, but many expect to go live over the summer.  John voiced interest in hearing from some of the universities that already have field experience with Blackboard 9.

The session concluded with a discussion regarding comparative costs of support, implementation, and training between the LMS contenders.  Ed identified a subgroup tasked with providing analysis of these expenses, to be discussed at a later meeting.



CSG Update (Klara Jelinkova)

Klara provided an overview of presentations given at a recent meeting of the Common Solutions Group (CSG), a set of higher education institutions and IT consortia working together to create a common infrastructure and toolset to be shared by participating universities.

The first presentation was by Ken Klingenstein of Internet 2, who talked about collaboration platforms.  Ken’s idea of a collaboration platform involves a suite of integrated tools that work together behind a shared identity management system.  Federated identity reduces authentication barriers to collaboration and can provide multiple levels of assurance to meet a variety of application requirements.  Sharing content across a common IDMS is a fairly resolved problem, Klara explained, but there are still challenges with integration of content and metadata. It is critical that these platforms are presented as straightforward and consistent, but this is difficult due to the differences in vocabulary, campus culture, and role-based permissions between the research universities.  Although we have the technology to share protected content across institutions, these challenges will necessitate quite a bit of discussion in order to implement a truly usable collaboration platform.

Klara then discussed some research presented by Chad Kainz on humanities collaboration.  Chad’s research suggests that principles of social networking are not necessarily appropriate for scholarly networking, as social networking involves connecting an individual to a known group, while scholarly networking should aim to connect individuals across disciplines or locations who might otherwise never come in contact with each other.  Chad refers to this as the “pub problem” (i.e., you need to be in the right pub to make the right connection) and notes that this kind of networking currently occurs at meetings and workshops, but in an increasingly interdisciplinary world, we are failing to address this problem adequately.  Chad’s research is focused on exploring how technology could be applied to put researchers working on similar problems in touch with each other across institutions and disciplines.

The next presentation was by RL “Bob” Morgan on the basics of identity assurance, which, as Klara explained, is not a one-size-fits-all kind of problem.  For some applications it is not critical to verify identity, provided the credentials supplied are consistent; for others it may be very important.  High-assurance practices are very intrusive to users and are better not introduced unless deemed necessary.  Bob identified four levels of application security and discussed possible strategies for balancing the inconvenience of high-assurance practices against the need for them.

The last presentation was by Klara and CMU’s Joel Smith, who discussed CMS usage and whether we can expect these systems to continue to be used in the future.  Klara and Joel sent a detailed survey about LMSes to the CSG schools.  They found that only 70% of surveyed institutions had implemented a central LMS that was commonly used.  Blackboard is the predominantly used LMS, but 50% of institutions are evaluating alternatives, citing concerns about business continuity much more than dissatisfaction with the tool.

Looking at utilization (or campus penetration) by product, Klara and Joel found that the tool itself did not drive use nearly as much as the university-specific implementation.  Princeton, for example, has a very successful Blackboard implementation, while other Blackboard schools do not see nearly as much usage of the application. It appears that the support on top of the tool is the make-or-break factor, rather than the LMS implemented.

Klara mentioned that between Google Docs, Ning, Facebook, and the like, students are commonly breaking away independently and using other tools outside of the central LMS, although no single application seemed to be taking over.  According to Klara, the jury is still out on the future of the CMS/LMS, but what we can take away from this research is that it would be beneficial to focus on administrative support in whatever tool(s) we choose to implement.


GPU Cluster in SCSC (John Pormann)

John began by thanking Tom Milledge for his work putting together the GPU cluster in the Scalable Computing Support Center (SCSC).  He then introduced graphics processing units (GPUs) as a possible alternative to CPU processing.

While commodity CPUs have typically doubled in capacity every 18 months, GPUs have been doubling roughly every 12 months over the last decade or two.  A modern GPU can have 3 billion transistors compared to a modern CPU’s 1 billion, with more of those resources dedicated to computation rather than to cache memory and the other functions emphasized in CPU design.  For this reason, GPUs are attracting a lot of attention in the world of high-performance computing.

Citing a good deal of marketing hype and misinformation about GPU usage, John clarified a few points about GPU processing.  You can still use your desktop while running GPU computations, although sharing resources will cause it to run more slowly.  As with a CPU, multiple users can use the GPU at once (although there is no priority mechanism within the GPU).  Display drivers are fully supported as well; John mentioned that most people already have a CUDA-capable driver installed on their machines.

GPUs can do general-purpose computing, including random read/write, IEEE-compliant floating-point calculations, integer calculations, and usually double precision, although some care should be taken to get the right GPU if double-precision support is desired.  Computations are bit-accurate with respect to the IEEE spec, although results may not match the CPU bit for bit.
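
To make the last point concrete, the short sketch below (illustrative only, not part of John’s presentation) uses the standard CUDA runtime API to check whether a machine has a CUDA-capable GPU and driver and whether it can do double precision; on NVIDIA hardware, double precision requires compute capability 1.3 or higher, which the cluster’s GTX 275 and Tesla C1060 cards both meet.

    // Minimal sketch: enumerate CUDA devices and report compute capability.
    // Compute capability 1.3+ is required for double-precision arithmetic.
    #include <cuda_runtime.h>
    #include <stdio.h>

    int main(void) {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess || count == 0) {
            printf("No CUDA-capable device/driver found: %s\n", cudaGetErrorString(err));
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            printf("Device %d: %s, compute capability %d.%d (%s double precision)\n",
                   i, prop.name, prop.major, prop.minor,
                   (prop.major > 1 || (prop.major == 1 && prop.minor >= 3)) ? "supports" : "no");
        }
        return 0;
    }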

John then discussed some emerging programming models for use with the GPU.  Among them are NVIDIA CUDA, ATI Stream technology, and OpenCL (from the group behind OpenGL).  All three platforms are essentially C programming in terms of memory allocation and the like.  Matlab plugins are also available for GPU programming.
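
As a rough illustration of that C-style model (a hedged sketch, not code shown at the meeting), a CUDA program allocates GPU memory, copies data over, launches a kernel across many threads, and copies the results back:

    // Scale a vector on the GPU: allocate, copy, launch, copy back.
    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    __global__ void scale(float *x, float a, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
        if (i < n) x[i] *= a;
    }

    int main(void) {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);

        float *h = (float *)malloc(bytes);               // host (CPU) buffer
        for (int i = 0; i < n; ++i) h[i] = 1.0f;

        float *d;
        cudaMalloc(&d, bytes);                           // device (GPU) buffer
        cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);

        scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);     // 4096 blocks of 256 threads

        cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);
        printf("h[0] = %f\n", h[0]);                     // expect 2.000000

        cudaFree(d);
        free(h);
        return 0;
    }

The ATI Stream and OpenCL equivalents follow the same allocate/copy/launch pattern, just with different API calls.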

The Blue Devil Grid GPU cluster features a login machine, 11 machines with a GTX 275 (a consumer-grade GPU), 4 machines with a Tesla C1060 (a high-end GPU), and one machine with two Tesla C1060s.

John then explained good and poor matches for GPU processing.  Good candidates include molecular dynamics (which sees a 50x-100x improvement), physically based simulations (10x-40x), signal processing (40x-70x), and Monte Carlo simulations (50x-80x).  Poor candidates at the present time include gene matching and database-heavy applications like search and sort, where the cost of transporting data to and from the GPU negates the processing gains.
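
The Monte Carlo case illustrates why the good matches work: each thread runs thousands of independent trials, and only a small tally has to be copied back, so transport cost is negligible relative to the computation.  A hedged CUDA sketch (not SCSC code; the block/thread counts are arbitrary) estimating pi:

    // Monte Carlo estimate of pi: heavy independent computation, tiny data transfer.
    #include <cuda_runtime.h>
    #include <curand_kernel.h>
    #include <stdio.h>

    __global__ void pi_trials(unsigned int *hits, int trials, unsigned long long seed) {
        int tid = blockIdx.x * blockDim.x + threadIdx.x;
        curandState rng;
        curand_init(seed, tid, 0, &rng);                 // independent RNG stream per thread
        unsigned int local = 0;
        for (int t = 0; t < trials; ++t) {
            float x = curand_uniform(&rng);
            float y = curand_uniform(&rng);
            if (x * x + y * y <= 1.0f) ++local;          // point fell inside the quarter circle
        }
        atomicAdd(hits, local);                          // only one integer comes back
    }

    int main(void) {
        const int blocks = 256, threads = 256, trials = 4096;
        unsigned int *d_hits, h_hits = 0;
        cudaMalloc(&d_hits, sizeof(unsigned int));
        cudaMemcpy(d_hits, &h_hits, sizeof(h_hits), cudaMemcpyHostToDevice);

        pi_trials<<<blocks, threads>>>(d_hits, trials, 1234ULL);

        cudaMemcpy(&h_hits, d_hits, sizeof(h_hits), cudaMemcpyDeviceToHost);
        double total = (double)blocks * threads * trials;
        printf("pi ~= %f\n", 4.0 * h_hits / total);
        cudaFree(d_hits);
        return 0;
    }

By contrast, a search or sort must move its entire dataset across the bus for relatively little arithmetic per byte, which is why those workloads remain poor fits for now.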

DSG representative Ben Getson asked if the SCSC had considered chaining PlayStation 3s together as some have to create a powerful machine for distributed computing.  John responded that it was considered and the price advantage is certainly compelling, but that the approach is sufficiently burdensome programmatically to have swayed the SCSC in a different direction.

John finished the session with a demonstration of some GPU development in Matlab, explaining that although he was working on a Linux platform, the Matlab plugin for GPU development is available for other operating systems (including Windows) as well.  He encouraged those interested in learning more to visit http://www.nvidia.com/cuda for a showcase of possibilities in GPU development.