4:00 - 4:05 p.m. - Announcements (5 minutes)

David MacAlpine – Approved Minutes from March 10th

JoAnne Van Tuyl – Expresses appreciation on behalf of herself and Charlie Becker (Econ) for help with technology support for Ukrainian colleagues associated with the Kyiv School of Economics. This followed an informational meeting last week with some of the people from that institution, at the end of which Charlie asked whether there was anything we could do to help. The Ukrainian colleagues expressed a need for help getting data safely moved to the cloud; their school has been fine so far, but they were madly digitizing all kinds of archives and journals and saving them on their hard drives, and they weren't confident about their cloud storage options or long-term storage. So I went to Tracy, and Tracy tracked down what they needed, and you have helped them out. So I just want to shout out to Tracy. She did good, and I told her that she and Zelensky are my current heroes.

David MacAlpine - Well, thank you very much; that's fantastic to hear, and thank you, Tracy. Do you want to describe what we are going to be doing with our next topic?

Tracy Futhey - Thanks, Dave. This will be the first of 6 or 7 areas we are discussing in ITAC. I came to Duke 20 years ago, and at the time we didn't have much in the way of centralized support for research, so we started to work on the research side. For much of the 20 years between then and now, what we did to support research was termed research computing, because, as things emerged in those earliest days of supporting research with technology, it was mostly about computation. It was also about networks, and of course both remain important today.

But if we fast forward to today, a lot of the research and academic scholarship that occurs around Duke is not explicitly research computing: it's about data, and about digital uses and digital representations of scholarship, in ways that are still dependent on computation at some level but don't necessarily revolve around it. It's also about a lot of collaboration, locally here at Duke and across units, but also nationally and internationally, as JoAnne's nice reference a few moments ago illustrated.

It became pretty clear that, as we're getting a new Vice President for Research and Innovation and continuing to think about how we evolve and support our campus research needs, coupled with going through a fair number of mergers, consolidations, and changes to our actual IT support function, it was probably a good time for us to do the first broader look at what information technology support is required for research in its different manifestations throughout campus.

So this semester we've spun up 6 or 7 different groups to each engage a handful or more of faculty and talk about what research support looks like from their disciplinary perspective, and what is required to support research in an area like the natural sciences, engineering, or aspects of the visual arts and digital humanities. Today's the first of these conversations, and we will aim for each of these discussions with ITAC to be conversations at the end of the day, not simply presentations.

My desired outcome is to have a better sense of the range of activities at Duke: which ones rise to the highest priority of things we need to do centrally, what we would consider a kind of "entitlement service" that everybody has access to, versus things that may be more nuanced and specific to a department, a unit, or a particular kind of research or scholarship. This process involves a bit of fact finding, and so we've launched these groups. I know there are going to be way more needs than we can meet centrally right out of the gate, and more needs than we have the resources to address right away. But I hope this will give us a bit of a roadmap, so that over the next year, or years, or longer, we can evolve and continue to provide the kind of support that our faculty need.

What are the things that are common enough that we should do once in a comprehensive way, as we've tried to do with the Duke shared compute cluster and other services?

We're going to hear today from Greg Wray, who is representing other colleagues in the natural sciences; this will be the first of several discussions. There will probably be cases where we hear of way more need than we can meet today, and it will feel a little audacious in terms of how we would ever get there. But it should start to give us an ability to compare, prioritize, and decide what we should carve off first, and how we do that, whether in OIT, in units at the school level, or with the Vice Provost for Research or other resources.

 

4:05 - 5:05 p.m. - Natural Sciences Research and IT Support, Gregory Wray, Ph.D. (40 minute presentation, 20 minute discussion)

What it is: Professor of Biology and Director of the Duke Center for Genomic and Computational Biology, Gregory A. Wray, will join us to present on his research and academic efforts, discuss the role IT currently plays in support of his work, and identify and review some of the areas for growth and additional opportunity between the natural sciences and IT support.

Why it’s relevant: In an effort to learn about the overall character of research and research IT support throughout the University, as well as to explore commonalities between the needs of individual domains and Duke as a whole, ITAC will be hosting a series of presentations/discussions over the course of the summer semester with key researchers and their colleagues. These discussions will aim to distinguish the most prevalent services for which IT should aim to provide institutional-level support from those that are essential for certain research but not pervasively used, and so may be better supported at the school/institute/department/lab level. Ultimately, OIT is seeking to open better lines of dialogue with the major research efforts at Duke, to learn how to better support our researchers, overcome any gaps in the current system, and collaborate to identify new ways to help elevate Duke's research community as a whole.

 

MINUTES

David MacAlpine - Thank you very much. Well, Greg, welcome, and I’m excited to hear about the natural sciences and the Center for Genomic and Computational Biology.

Gregory Wray - Thanks, Dave. I’ll just say that a group of us met in the middle of February with representatives from physics, chemistry, mathematics, biology, and evolutionary anthropology. Several of you were actually also at that meeting. I feel a little awkward representing that group, because I don't know a lot about their research, so I’m hoping that Ed and Richard and Evan and Logan can help me.

I’ll just start with a little bit of first-person perspective. I work in the area of genomics, and my lab faces challenges moving big files around with a certain amount of agility. We might get up one day and decide we need to move thirty 10 GB files over, stage them, and do some stuff with them, and tomorrow we don't need them there anymore; we've got a different set of things we want to move over instead. This poses some challenges.
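
The staging pattern described here can be sketched roughly as follows; this is an illustrative example only, assuming a hypothetical remote data host and scratch directory (the host name and paths are placeholders, not actual Duke systems), with rsync standing in for whatever transfer tool a lab actually uses.

    # Illustrative sketch of the stage / compute / clean-up loop described above.
    # The remote host and scratch paths are hypothetical placeholders.
    import subprocess
    from pathlib import Path

    REMOTE = "user@data-host.example.edu:/archive/run_2022_04/"   # hypothetical source
    SCRATCH = Path("/scratch/mylab/run_2022_04")                  # hypothetical fast staging area

    def stage() -> None:
        """Pull today's batch of large files onto fast scratch storage."""
        SCRATCH.mkdir(parents=True, exist_ok=True)
        subprocess.run(["rsync", "-av", "--progress", REMOTE, str(SCRATCH)], check=True)

    def clean_up() -> None:
        """Remove the staged copies once the analysis is done (assumes a flat set of files)."""
        for f in SCRATCH.glob("*"):
            f.unlink()
        SCRATCH.rmdir()

    if __name__ == "__main__":
        stage()
        # ... run the I/O-heavy analysis against SCRATCH here ...
        clean_up()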

So one issue is just very large files that require not only transfer but high I/O capacity during the actual compute process. The second issue may be true of a lot of people: we like to just try stuff out, so having a quick loop of installing something, running it, getting rid of it, installing something else, and trying again is central to the way we like to work, and in some cases that's a little bit of a problem outside of HARDAC, where I’ve done a lot of my computing. HARDAC has played a major role as centralized infrastructure, and I hope we are going to get to a point where we don't need a lot of these specialized side installations, because I think they come with a lot of problems in terms of management and sustainability.

HARDAC was set up in the era of the IGSP, the Institute for Genome Sciences and Policy. I am a major consumer of its resources, as is Dave MacAlpine. It was really built to be a super-fast I/O installation, not necessarily a super number-crunching one, so it was built around fairly generic Unix boxes, but with InfiniBand and GPFS (General Parallel File System) so that we could move files on and off CPUs and nodes as quickly as possible.

It's been hugely enabling for the genomics research community, and not only School of Medicine users but also people from evolutionary anthropology, biomedical engineering, and other domains, including Arts and Sciences, use it. But the truth of the matter is that a lot of the usage does come from School of Medicine users, so it spans the domains of OIT and DHTS a little bit.

HARDAC has been really great because we have a lot of control: we can install software whenever we feel like it, we can move files around, and it is performant in the way that we need it to be. So we're actually really happy right now, except that the School of Medicine is about to shut it down, and that's my big pain point.

David MacAlpine - I can echo all those sentiments from the genomics side. Being able to collaborate on HARDAC with people from biology or computer science is something that is very difficult on some of the health-side networks.

Greg Wray - So what I thought I would do next is move on to some of the more general natural sciences research computing pros and cons.

 

PROS:

OIT rocks compared to DHTS. I mean, this is not a national secret or anything like that.

I feel very lucky that this ITAC group even exists. There is no counterpart to this group in the School of Medicine, and there is no strategic vision for computing in the School of Medicine. It's not even clear who owns which parts, who is the decider about things, or who thinks about budgets. It's a hot mess. So I want to lead by saying I’m really glad there actually is a group of people who think about this stuff on the campus side. I don't take that for granted, because I live a little bit in both worlds.

 

CONCERNS:

Colleagues in physics and chemistry report insufficient raw compute power. These colleagues can’t do what they consider to be fairly routine operations for their field; they actually have to work with collaborators at other institutions who have more raw compute power. And the way they represented it, these are pretty routine things that they need to do quite commonly, and we're not talking about calling up a friend at MIT or Harvard or something like that; they call up a friend at East Louisiana State University, and they've got a better compute installation than we do. So we heard a little bit of that.

Another pain point is people who have to move data between special installations, such as going from a DHTS environment to OIT. This actually affects even people who only work on the campus side, because many of the core facilities are in the DHTS network and deliver their data to a file share in the DHTS network. The person on the campus side then has to move it over to wherever they're going to store it, stage it, and work with it. So that file transfer process can be a bit fraught, because you need to work across that firewall.

Mark Palmieri – In BME we run studies in the hospital, and the need to balance security between DHTS and OIT creates difficulty for the fluidity of data flow: we have to set up VMs to log in and move data because you can't get the two networks to talk to each other, and you're creating virtual hot points for multiple users getting permissions in different groups. How much bandwidth and effort gets consumed just because we're copying files internally to bridge different security requirements?

John Board – As far as a simple lack of bulk compute power, is it the sense that the basic entitlement level any Duke faculty member should be able to access, which was committed to years ago by the Provost, is too low? Is it the sense that we should revisit what our minimum entitlement level is, or is it nuanced somewhat differently?

Greg Wray – I think you would probably hear a lot of different answers if you posed this to faculty. For the genomics research community, we huddled after we heard that HARDAC might be going away and collectively agreed we’d be happy to pay for this.

It's really that we just want the capability, so I think there is a willingness to put actual resources on the table. People just want access to infrastructure and the ability to really utilize it freely.

Steffen Bass - In physics our biggest users are heavy consumers of computational resources, and they have needs that far outstrip what OIT could conceivably offer, so we have to go off-site to major facilities, which involves writing grant applications to get our CPU hours. Computing is getting more and more commoditized, and it is more and more woven into the things we do in the research world. Generally speaking, it behooves an institution like Duke to provide a base level of capabilities in terms of computing and storage that can satisfy these general needs, setting aside the millions of hours per year that really high-end labs need.

Open Science Grid is incredibly useful and facilitates faculty research by making sure that we are matched up with the resources our needs require, especially when faculty don’t know a priori what they need.

Tracy Futhey - Thanks for sharing that, Steffen. I would just point out that historically, when we thought about research computing, we thought about computation, and as Greg's earlier remarks point out, computation is important, as is having enough of it, or access to enough of it from third-party locations.

But your point is as important or more so, because just having the computation isn't good enough if people don't know where it is or how to access it. The importance of additional people resources has become a lot clearer even during the first couple of these conversations: the need for people with different kinds of specialized expertise who could be resources either locally or more globally.

Robert Wolpert – Several years ago the Provost committed to covering 80% of the computing needs of the faculty, recognizing that the 20% of most computationally intensive users would need to utilize grants or other approaches to meet their needs. Given that those needs may have changed over time, we should reassess whether the level of “entitlement” computing being offered by OIT still meets that 80% threshold.

Tracy Futhey - I'd agree with that, Robert. The 80% then was about however many cores, and if 80% of people now need more computation, we should address that. But we also need to understand how they want to accomplish that 80% of their computation, onsite or in the cloud, to decide what combination of resources to provide, from data, analytics, and tools to the computation itself. We haven't redrawn that line.

Richard Biever - I'll ask this, and Greg, please jump in and correct me, but one of the things I think we heard when we were talking to the natural sciences team, including you, was around the generalized compute needs that fit within the 80%. In some cases there are more specialized needs, which you highlighted, that don't actually fit within the 80%.

Greg Wray - Yeah, I think that's right. I don't want to give the impression that I'm an advocate for having specialized infrastructure hiding in special places around campus. I actually would really rather utilize a central resource. And Charley Kneifel has been very helpful to the genomics community.

I did want to follow up on something that Steffen said. People are important, and some degree of domain knowledge is also really important -- what a physical chemist needs, what particle physicists need, or what genomicists need, they're not really the same thing. The experience that some of these people have gained in their collaboration with researchers is a really important part of that human capital.

Robert Wolpert – There used to be what they called a faculty oversight group, particularly for research computing, that met quarterly or so, and somehow it faded away a couple of years ago. Is there a role for such a body again?

John Board - Yes, and I think part of the current process is to redefine that, moving beyond a group of the usual HPC suspects to a group that can bring a somewhat broader vision of what research computing support means for Duke going forward.

Robert Wolpert - And, indeed, it may not be just boxes that we need to buy, but people.

John Board - It's fair to say that compared to some of our peers we are under-resourced in that kind of domain computational expertise. The funding models of our peers are all over the map, from fully allocated to heavily charged out, but in pure body count we have fewer of those people than many of our peers do right now.

Tracy Futhey - So yes, Robert, I think this process will lead us to a better understanding of what would constitute the appropriate support group. We'll learn through the process whether we've got the right amount and type of computation and networking for traditional research, but increasingly we're understanding that what we are providing, even if we supercharged it, represents a necessary but not sufficient aspect of supporting the breadth and kinds of research and scholarship going on. The domain coverage in the other areas is what gets us to sufficiency.

Robert Wolpert - I think the funding models are critical here. The reason there was no support for research computing when you came 20 years ago is that the scientists kind of held central computing off, because we viewed central computing as people who were coming in to take away the resources we had gotten through grants and whatnot, and they did. They would tax grants to fund central facilities that were of no use to us, and that reduced our productivity. So be careful not to fall into those same ruts.

Brandon Le - BioData Catalyst is a computing platform run out of the NHLBI, the National Heart, Lung, and Blood Institute, which recently launched it. I use this platform because it has helped accelerate my research at Duke on whole genome sequencing data, which can literally take up hundreds of gigabytes per patient cohort. It allows me to scale up these analyses and store literal terabytes of data at once without having to worry about processing limitations or storage capacities that may be restricted on Duke's side.

Greg Wray - Yeah, I think it raises an interesting question that has been ongoing on the School of Medicine side, which is: when do you move off-prem completely and into the cloud? You can’t do it abruptly. The SOM plan moving forward is for everything to be hosted in Azure. The idea is that HARDAC will get shut down and bucket storage will be provided, and pretty much everything is pay-as-you-go after that. And there’s almost nothing in the way of people support.
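
As a rough illustration of what bucket-style storage means for a lab's day-to-day workflow, a minimal sketch of uploading one result file to Azure Blob Storage with the azure-storage-blob Python SDK follows. The container name, connection string variable, and file path are hypothetical placeholders, not details of the SOM plan.

    # Sketch of pushing one result file into Azure Blob ("bucket") storage.
    # Connection string, container name, and file path are hypothetical placeholders.
    import os
    from azure.storage.blob import BlobServiceClient

    def upload_result(local_path: str, container: str = "lab-results") -> None:
        # Credentials would come from whatever storage account the school provisions.
        conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
        service = BlobServiceClient.from_connection_string(conn_str)
        container_client = service.get_container_client(container)
        with open(local_path, "rb") as data:
            # Each upload (and later download) is billed under the pay-as-you-go model.
            container_client.upload_blob(name=os.path.basename(local_path), data=data, overwrite=True)

    if __name__ == "__main__":
        upload_result("/scratch/mylab/aligned_reads.bam")  # hypothetical file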

David MacAlpine - I'll just echo that. We've been trying to be onboarded onto Azure for a while, but the stumbling block is that all of our data is hosted on an OIT VM, and there is no ingress or egress path into the Azure secure storage blobs. So a lot of the basic science researchers now decide there are just too many obstacles to doing it in the cloud.

Greg Wray - Yeah, some SOM refugees are going to be coming over to OIT’s DCC, because when HARDAC gets shut down, most of those users are not planning to go to Azure. There are about 20 people, or 20 labs I should say.

Richard Biever - Yeah. And speaking of data, there were a couple of other points that arose with the group. Regarding data management, policies, and security, it feels like in some cases we have a very binary decision: either 1) all research data is sensitive, or 2) it's all public, and there's not enough gray area recognized in between for how to deal with that.

Colin Rundel - There's a bit of a pain point for data management within statistics, particularly on shared servers. It's not necessarily private data, so it doesn't have to go onto the PRDN, but that middle ground is a pain point because there's data that is restricted and cannot go on a shared server. In that case the current solution is just to invest in private resources in the department that somebody puts in a locked room, and then it's underutilized. It would be much nicer if that kind of thing were in a central pool and we could spin up a VM off of it.

John Board – Were any of our computational chemist friends at this meeting? It was not that many years ago that they were the bane of every CPU on campus; any cycle that was not locked down was going to be absorbed by them. Are they kind of self-contained now, or did they have any presence at this meeting, or do we have any sense of life in their world these days?

Greg Wray - They were one of the groups that complained about not having enough raw compute power here to do even what they consider to be fairly routine tasks. I think chemistry has its own Linux cluster set up, but that's not suitable for frontline work.

Mark Palmieri - Yeah, one other comment: with growing resources, the computational efficiency of our groups is never really discussed. I do think there is a balancing point between looking at how many CPU hours, terabytes of storage, and quotas of RAM are actually needed versus just being as gluttonous as possible, throwing it up there, and calling that your need. I'm from the generation where, to do the simulations for my grad work, a six-month run time made me happy.

At some point scaling computational resources and power becomes very costly, and so putting calibrated people and boots on the ground could put our finite resources to way more efficient use at times.

John Board - Within the natural sciences group I haven't heard the magic words machine learning, and the infinite amount of GPU power that tends to involve. I know we will hear about that from some of the other groups.

Greg Wray – Researchers' machine learning needs did come up in the working group meeting, recognizing that Duke has a very active machine learning community here on campus. These needs are expected to become a bigger and bigger part of what researchers do.

But I think the question is: is that something we want to build as infrastructure on premises, or is that the kind of thing that is better served by going off-site and just paying as we go, rather than housing it within OIT?

Robert Wolpert - It's a moving target and you really don't want to buy neural net machines unless you're gonna get your payback today because by tomorrow you'll be looking for something else.

David MacAlpine - And I’ll just add that this week alone I've been in two thesis committee meetings in the genomic and biological space where the students are making heavy use of GPUs and machine learning. One was in computer science; the other was in the university program in genetics and genomics. They were both pushing the limits of what's available to them. And if you try to go off-site and spin up even toy cases in TensorFlow or resources in Google's cloud, you can't even get GPUs now.
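
As a small illustration of the GPU dependency being described, a typical TensorFlow toy run first checks which accelerators are visible and silently falls back to CPU when none are allocated. The sketch below is generic and assumes no particular Duke or cloud configuration.

    # Generic sketch: check whether a GPU is visible before running even a toy TensorFlow job.
    import tensorflow as tf

    gpus = tf.config.list_physical_devices("GPU")
    if gpus:
        print(f"Found {len(gpus)} GPU(s); the toy model will train on an accelerator.")
    else:
        print("No GPU allocated; falling back to (much slower) CPU training.")

    # Tiny toy model so the script runs either way.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
    model.compile(optimizer="adam", loss="mse")
    x = tf.random.normal((256, 4))
    y = tf.random.normal((256, 1))
    model.fit(x, y, epochs=1, verbose=0)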

John Board - The idea of a core facility for training and executing neural networks on the campus side, shared across many different application areas, is interesting. It might be a way we could make it really transparent for people in many disciplines to use this capacity, but as always the funding model would be important to get right.

David MacAlpine – We could perhaps partner with Apple, Google, or others in RTP that are bringing in a big machine learning component.

Richard Biever - So I think we covered most everything here in the list. Greg and Katie, anything else that we're missing?

Evan Levine – A takeaway for me is that training and education in using these resources came up in more than one of these sessions. Duke broadly has a lot of introductory, basic-level training, but not so much specialized training. And that starts to become domain-specific: how do I do more complex things with it?

Greg Wray - So one model we were looking into when we were thinking about this was essentially a percent-effort model: not a freely available resource, but one where a department, a graduate program, or an individual researcher commits 10% or 30% of a salary, and we expect that person to put that much time into that project. If people really want that and are willing to pay for it, that's how at least some of that expertise could potentially be funded.

Tracy Futhey - Charley and his team, plus others, have started to provide technical resources, project management resources, or other support that can be charged against grants. We've got a couple of successful cases where we've had people full time or nearly full time over years paid off a grant, so if anybody has those kinds of needs we can help.

David MacAlpine - Any more questions or comments before we move on?

Greg Wray - Thank you for this opportunity. I can't tell you how much I appreciate the fact that you're trying to do this, because it's a really difficult situation in the School of Medicine, and the conversations there are basically non-existent. So I really do appreciate this, and thanks for hearing me out.

 

5:05 - 5:30 p.m. - Introduction to Duke AiiCE (Alliance for Identity-Inclusive Computing Education), Shaundra (Shani) Daily, Ph.D. (15 minute presentation, 10 minute discussion)

What it is: The Alliance for Identity-Inclusive Computing Education (AiiCE) aims to increase the entry, retention, and course/degree completion rates of high-school and undergraduate students from groups that are historically underrepresented in computing through evidence-based, identity-inclusive interventions. AiiCE’s collective impact approach to broadening participation convenes national leaders in K-16 CS education to transform high-school and postsecondary CS education using innovative strategies that target the people (secondary/ postsecondary educators, teaching assistants, and students), policies [state (K-12) and institutional (postsecondary) policies, as well as postsecondary accreditation criteria], and practices (classroom/department cultures) that directly impact student entry, retention, and course/degree completion. Backbone Director Shani Daily will join us to introduce the Alliance and their current and future initiatives.

Why it’s relevant: As computing becomes more ubiquitous, it is imperative that technology creators from a diverse range of identities are in development and leadership positions to ensure that harmful technologies are avoided. This requires creating academic cultures in computing that emphasize the importance of identity, its societal impacts, and the impacts of technology on people from non-dominant identities.

 

MINUTES

Dr. Shani Daily: Dr. Nicki Washington sends her regards; she wasn't able to be here today due to other commitments. AiiCE is the Alliance for Identity-Inclusive Computing Education. We're an NSF INCLUDES alliance that was funded last July.

So I’m guessing that with this group you know that computing has a diversity, equity, and inclusion problem.

I don't think graduation rates for women have improved in the last 7 to 10 years.

The same goes for other minoritized groups: Black, Native Hawaiian, Indigenous populations, etc. And if you start to look at the intersections between different minoritized identities, the statistics just get worse. Most of the statistics you see out there don't even include disabled and queer communities, since data is not often collected for those groups, and typically, because of the way the government collects information, Asian is presented as a monolithic group, but we know there are groups within the Asian community that are also underrepresented in computing.

So again, I didn't want to delve into all the statistics, just because this has gotten so much press over the last five years; I’m hoping we can start from "Okay, we have a challenge!"

So let me be super clear before I move forward on the things that I’m going to list on this slide: I'm not taking a dig at what we already do. I also serve as faculty director for the Duke Technology Scholars program, with a fabulous team, Amy Arnold and Kelly Perry.

We have a 35% graduation rate for women in computing at Duke, which exceeds the national average of 18%. But I’ll explain more about why the things that we already do aren't enough if we're really going to push the needle.

Let's think about diversity, and let's think about opening up pathways. The kinds of things that you see in the field are:

  • Scholarships
  • Mentoring programs - providing students with people they can talk to about navigating their challenges
  • Affinity groups - specialized groups within departments or organizations
  • Special panels and workshops that you can go to and learn what you should be doing as a group, often especially when something happens
  • Action forums and programs, making sure we have role models, and campaigns

 

So again, I’ve done all of these things, I've been a part of all of these things, and I've benefited from all of these things, but these things alone are not going to push the needle. When we put students into institutions and we're not paying attention to what we're doing in those institutions, specifically the challenges with people, with policies, and with practices, then we're only fighting half the battle. What happens is we give students all these tools, we stick them into institutions that have not changed the way they operate, and if those students then manage to persist, it's not without serious hardships and sometimes even trauma... if they can even make it through.

So when you only have certain students who make it through or students who are changed in negative ways as part of making it through:

  1. They go into industry or into the academy, and they're not bringing diverse perspectives there.
  2. So, we end up with harmful technologies.
  3. We end up with toxic cultures
  4. We end up with things that perpetuate the cycle.

 

There are also obstacles that you might see as a student, as faculty, or in industry.

  • Lack of AP or computing access
  • Computing course access
  • Inaccessible materials - important for everyone, but especially if you're talking about disabilities
  • Being stereotyped by educators, staff, and peers
  • Campus policing
  • Being asked to shoulder all of the DEI work
  • Not having culturally relevant curriculum - curriculum that connects with who you are and what you do
  • Requiring self-funded laptops - again, that kind of policy
  • Entrance exams, which have been shown to not be very informative, and on which minoritized groups often perform less well
  • Inequitable teaching assistant selection
    • If there's no process in place for selecting TAs, only certain students end up being TAs, and they become who is represented as the "good students"
    • If we don't have an equitable process in place for TA selection, that can be problematic
  • A lack of diversity amongst educators - It's been shown over and over again that role models do matter; that's why we've done this work in the past. If you don't have those people in front of you, including TAs, that can really impact the student experience
  • Expectation that everyone learns at the same pace
    • There's a cool initiative at Berkeley, I think they're calling it "A's for everybody," built around mastery learning
    • I know Drew Hilton is a huge fan of mastery learning and of figuring out how to make that happen, and I think that's a really important thing we should pay attention to

 

Obstacles for Faculty:

  • Limited perspectives on what counts as research.
  • H-index measures.
  • Macroaggressions - you may have heard of microaggressions; macroaggressions are basically systemic
  • Angry stereotypes,
  • Being asked to shoulder all the DEI work
  • Inequitable pay
  • Biased student evaluations

 

Obstacles for Industry Employees and Founders:

  • Companies only hiring from the same set of top-tier universities
  • Bias in resume review and interview processes
  • Pay inequality
  • Being passed over for advancement
  • Microaggressions again, and I would add harassment
  • Little access to startup funds.

 

So hopefully that paints a portrait of where we're coming from. It's really important to think about students, but it's also important to decenter our students and start to think about the institutions and how their people, policies, and practices are impacting them.

AiiCE’s vision is to increase the entry, retention, and course/degree completion of high-school and undergraduate students from groups that are historically underrepresented in computing. We utilize evidence-based and identity-inclusive interventions throughout, and we are using a collective impact approach, which means we have a lot of different partners.

 

We are working under a shared vision in order to broaden participation in computing, and we're focusing on K-16, specifically on the people, policies, and practices that are impacting our ability to broaden participation.

The way collective impact works is that we have a bunch of partners and we have a Backbone. Tracy and Evan, whose voices I think I heard on the call today, are a part of that Backbone, as is Jen Vizas, and the Backbone team is responsible for facilitating activities to build the infrastructure and supporting elements for this shared vision, as well as shared measures of impact. We also have Constellations, working groups organized around Training, Policy, Research, and Curricula and Pedagogy.

The Training Constellation pays attention to how we are making sure that our faculty, our TAs, and our staff have the training they need to interact with a broad spectrum of identities.

The Policy Constellation addresses the things in place that are negatively impacting groups. For example, we're having conversations with ABET (an accrediting body for university programs in computing and STEM areas) toward informing new policies that include diversity, equity, and inclusion.

The Curricula and Pedagogy group focuses on what is being done in the classroom. How are we talking about identity? How are we talking about the negative impacts of computing on marginalized identities? And how are we making sure that the kinds of examples and projects we're using really connect with a broad spectrum of identities?

And then, with Research, of course, we want to make sure that all of the things we're doing are research-informed.

 

Our Team:

  • Dr. Shaundra (Shani) Daily
  • Dr. Nicki Washington
  • Dr. Alison Scott
  • Valerie Barr
  • Joanna Goode

 

Our Partners

  • Duke University
  • Mount Holyoke College
  • Kapor Center
  • Georgia Tech
  • Constellations Center for Equity in Computing
  • CSTA
  • University of Oregon
  • Northeastern University – Center for Inclusive Computing
  • CSAB – Leading Computing Education
  • DO-IT
  • Duke CCT
  • Reboot Representation

You can find us on social media as Identity in CS - @IdentityinCS

 

David MacAlpine - Thank you very much. What kind of programming are you doing in the K through 12 or 16 space to begin to address these issues?

Shani Daily - Our biggest partners in the K-12 space are CSTA (the Computer Science Teachers Association) and Joanna Goode from the School of Education at the University of Oregon. We are not necessarily doing programming for students, but we are doing programming for educators. CSTA is developing, in coordination with AiiCE, an intro-to-identity toolkit where educators can learn more about identities, and they're working on expanding their ability to develop curricula they can use in their classrooms. We also have Duke Web Services working with us on our new website, and we'll have resources up that teachers can grab and use in their classrooms.

Paul Jaskot - Are you engaged with the curriculum discussion at Duke that is rethinking the whole curriculum?

Shani Daily - Not personally, but Owen Astrachan from the AiiCE team is a part of that conversation.

Mark Palmieri - Is there any crosstalk between these initiatives and ones like the QuadEx plans focused on the community experience at Duke?

Shani Daily – It’s important to know that AiiCE is not Duke-focused, but I am here, and Nicki is here, and I'm also a part of the faculty fellows for QuadEx, so it's a part of everything that I speak about when I'm in those conversations. There are some fantastic people in Student Affairs who are very aware of identity and of making sure that we're being inclusive; inclusion is really at the core of QuadEx.

Mark Palmieri - Do you think there are opportunities for outreach and ways that students can help to impact K-12 programs? Duke students might be able to actually get into the community, get the word out, and be boots on the ground; that way it becomes a very different kind of Duke impact in Durham.

Shani Daily - We're not doing a lot of student programming, but I am personally a huge advocate for doing more in the K-12 space. For example, I'm the educational workforce director for Duke's NSF-funded Athena institute, and in that space I think Pratt in general has a lot to offer, and a lot to learn from the community. I would love to see us being more strategic and coordinated with our efforts at the K-12 level.

Ken Rogerson – I’m wondering about possible connections with student groups on campus, like Science & Society, the summer +Programs, HackDuke, Ethical Tech, or the cyber club. I know a number of students in these groups who would love to participate in your work, or potentially volunteer or help; I'm just wondering about connections with students on campus and student groups.

Shani Daily - Yes, we've spent our first year doing a lot of infrastructure building. In the fall we have a Bass Connections project where we bring on students. For this grant specifically, I would be very interested in what it means for student groups to make sure that they're creating inclusive spaces.

Ken Rogerson - And if I may follow up, I love the idea of them having conversations with you about how to make their groups more inclusive; if that were a possibility, I think others would appreciate it. Can you share with this group when the Bass Connections project gets going, so we can connect, for example, our Sanford Master of Public Policy students?

David MacAlpine - Any more comments or questions? Thank you very much for that presentation. I really appreciate it.

Shaundra Daily - Yeah, thank you for having me; it's good to see everyone.