AI by Rice, for Rice

How Raza Dawood is transforming university operations with purpose-built AI models.

Fall 2025
By Katharine Shilcutt

As Rice’s associate vice president of transformational technology and innovation, Raza Dawood leads efforts within the Division of Operations, Finance and Support to bring advanced artificial intelligence solutions into university operations. With a unique blend of humanities education and technical expertise, he’s developed enterprise AI models now being piloted across campus.

Photo of Raza Dawood
Raza Dawood. Photo by Jeff Fitlow

What was your journey from English major to data scientist building AI models at Rice?
When I was at NYU getting my master’s in English, I worked full time as a payroll clerk to pay for my education. While working there and going to school at night, I began to notice that a lot of my tasks could be automated. That’s what first got me interested in programming — I just wanted to make my own work easier. That newfound passion for programming helped me with my assigned readings, which is how I got into natural language processing. Eventually, I changed my thesis to a program that analyzed texts based on word counts.

How has your humanities background informed your work in data science?
Considering that the main way people interact with large language models is via written or spoken language, humanities degrees become incredibly practical. Expressing yourself in ways that an algorithm can parse is as critical as doing the same with another person. But beyond that, I’ve always felt the humanities gave science a reason for being. Why do we care? Why do we want to cure cancer? The humanities provide that ethical and moral framework, so I believe the humanities are coming back in a big way.

Can you tell us more about the models you’re deploying and how you see them being used at Rice?
We’re deploying three tools for the campus: NotebookLM Enterprise, Donor AI and Grants AI [see sidebar]. It’s important to remember that these tools are accelerators. They don’t replace people but expand what’s possible in the time we spend working, researching and learning.

What does it mean to create these models at a university alongside researchers known for their AI work?
It’s been amazing. I sit on several AI committees alongside brilliant faculty who focus on responsible AI. They wrestle with moral dilemmas; I try to turn those lessons into practical tools. The goal is to remove artificial friction. Done well, the technology fades into the background, and you get a lot more done than you could have without those tools.

What are you looking forward to in this coming academic year?
The team has a lot of prototypes and projects that we’re excited to launch for the campus. We’ve been looking really deeply at how we can provide tools that create equitable access to AI across campus, and we’ve also been working on some new and exciting AI agents to take our work to the next level.

Inside the Models

NotebookLM Enterprise: A focused AI assistant that only answers questions using information you provide — PDFs, Word documents, spreadsheets, even videos or audio. Ask it anything about a strategic plan or internal document collection, and it will pull answers solely from those sources, with citations. 

Grants AI: Designed to streamline the grant-seeking process, this model quickly matches faculty to relevant funding opportunities. What once took weeks now takes hours. Faculty can scan dozens of viable grants, not just a few, and spend more time writing compelling proposals.

Donor AI: This tool helps align potential donors with campus initiatives. By analyzing donor profiles and institutional priorities, it surfaces connections that might otherwise be missed — speeding up cultivation strategies and deepening relationships in meaningful, data-driven ways.