This fall, roughly 60 faculty members will take part in a new artificial intelligence initiative from the Office of the Vice President for Research. Called Propel AI, the effort aims to empower faculty to incorporate artificial intelligence into their work — regardless of their academic area or previous experience with AI.
Through a series of workshops during the 2024-25 academic year, faculty will gain insights into the rapidly evolving landscape of AI research and work with each other to foster collaborative partnerships across disciplines.
We spoke with Bryant Walker Smith, associate professor of law and an expert on emerging technologies, about the new initiative. Smith is one of three faculty members tapped by Vice President for Research Julius Fridriksson to guide Propel AI as it launches this semester.
My understanding is that you’re part of a group that’s been formed by the vice president for research to increase the adoption of AI tools on campus. Is that kind of the goal here?
Bryant Walker Smith: What we are trying to do is foster a community within the university of people who are really interested in AI in whatever way that speaks to them. And so that includes some people who have no real experience with AI and some people who have been using some of these machine learning tools for a decade.
I think that’s really important, because if a university goes out and just tries to compete with every other university for the established AI superstar in each field, we’re never going to get all of them. I think the better approach is to realize how much is already happening, how we can support what’s happening, and, even more than that, how many people have yet to pick the low-hanging fruit applicable to their work.
That’s what I’m really interested in — somebody who has an amazing idea, and it’s just never occurred to them that they could use some AI tools to do the work 10 times as fast, or analyze 10 times as much data or reach people 10 times more effectively.
So, one of our challenges is, how do we take this diverse group of participants and make the internal connections in a way that is helpful to them, and also try to bring in other expertise, whether that’s from our own faculty, our own AI Institute or from outside the university?
You had an application process for faculty. Did you see a lot of interest?
I was surprised by the level of interest, and I was even more impressed with the application statements. There were so many times when I added a note like, ‘We need to get them involved as a teacher here,’ ‘we need to partner with them,’ ‘they need to present’ or ‘I had no idea this was happening.’
For example, I have for years been obsessed with the idea of animal communication and the ways that machine learning tools might provide us insight or give us the impression that we’re able to communicate with animals. And I saw in the applications that one of my colleagues is doing that. I was like, ‘Oh my goodness, I need to meet them.’ That was really exciting — seeing how much cool stuff is happening at our university, and how much thought is already going into the potential for AI tools.
What are some other examples that crossed your desk?
One person wrote, ‘I don’t have any experience with AI, but I work with large data sets and complex engineering and social science topics.’ And immediately that just shouted, ‘Oh my goodness, you have access to data. And if we give you this little boost — an introduction to these tools — you could potentially do so much more with those data.’ And then there was someone who works in film history and the history of technology. So, again, they have access to incredible data — and this person actually has some experience with machine learning. So, they’ve been thinking about, ‘How can I play with these vast data sets that previously would have literally taken somebody watching every movie, every film?’ Now there’s the possibility to do in hours what previously would have taken years.
Did you get the kind of breadth that you were hoping for — from arts and humanities, as opposed to the fields that we might expect to see more applications from?
I have not analyzed the proportion of our applicants and how they correspond to their numbers at the university as a whole. But, for example, we have a fair number of people who are doing medical or public health research, and I think that was to be expected. We also have people in English, in criminology, in law, in journalism, and in social work, African American studies, environmental science, business. Also nursing, exercise science and several from education.
I think the other thing that we’ve emphasized is people from a range of points in their career. We have some instructors, some assistant professors, some associate professors, some full professors. That’s another kind of diversity we really want.
Along with the excitement about adopting AI and its potential, there are also some people who have fear or hesitation around it. Are limits and potential misuses going to be sort of baked into the workshops that you’re doing?
Certainly, one of our goals is to provide a very holistic assessment of AI, including the various and varied limits, misconceptions, benefits, opportunities and risks — the ways that it can be used and misused in society and in research and in teaching.
This is the story of progress or policy or research or anything where you replace an old set of problems with a new set of problems. And you really hope that the new set of problems, in aggregate, is smaller than the old set. I think we’re going to see that here as well.
It sounds like a very complicated but worthwhile endeavor.
Our colleagues are incredible, and that is both with regard to what they are thinking about generally, and the way that many of them have already integrated AI into their work. This is an impressive group, and they are bringing a lot of expectations to this initiative. I really want to give them an experience that delivers what they’re looking for, whether that’s information or knowledge or skills or community or connections.