The work done in Desmond Upton Patton’s SAFELab at the University of Pennsylvania focuses on AI, computational social sciences and ethics, so when ChatGPT came out, he and his team quickly started playing around with it as a research tool.
Upton Patton, a professor at Penn’s School of Social Policy and Practice (SP2) and director of SAFELab, published a paper earlier this summer about the potential opportunities and challenges of generative AI in the social sciences, specifically social work. His co-authors are fellow researchers at SP2 and Penn’s Annenberg School for Communication, Aviv Y. Landau and Siva Mathiyazhagan.
Upton Patton wanted to create a conversation around the tool in academia, he told Technical.ly, before it got lost in a sea of negative use cases.
“What we’re trying to do is to anticipate, before mass use, some of the challenges of the application for societal use, but also encourage positive use and uses that can really help us answer questions that are hard to answer in social science,” he said.
Upton Patton likes that the tool can identify gaps in his thinking, he said. For example, he may ask ChatGPT to recommend scholars of color or interdisciplinary research he could include in a paper or syllabus. He has also asked it to act as an editor for his writing.
ChatGPT is not replacing his thinking process or his knowledge, but supplementing it and helping him make his work more thorough and efficient, he said, describing the tech as something like a “virtual research assistant.”
Users can also make the tech work for them by automating or expediting the more tedious tasks of their jobs. As Plymouth Meeting-based consultant and designer Beth Brodovsky said during her talk “AI: The Ultimate Sidekick” at this month’s MILLSUMMIT in Wilmington: “I’m not a programmer, but now I can be.”
Upton Patton thinks ChatGPT can be used to ask bigger and better questions about social sciences. The tool can help come up with new questions and pull data and context that hadn’t been considered before.
“How does ChatGPT affect our future research? How might ChatGPT reimagine ethics in social science spaces?” he said. “And I think we need to be considering new and deeper, more refined questions of how these tools affect the kind of work you want to do in communities.”
Upton Patton said he encourages his team at SAFELab to use ChatGPT in their workflow as well. In the research paper, he noted that ChatGPT can be particularly helpful with qualitative research by identifying trends and suggesting other sources.
“With my team, it’s really important for us to identify authors of color, to identify research that comes from interdisciplinary spaces, from other disciplines,” he said. “[We] want to make sure that we’re using it ethically and as a resource and a tool. It’s not intended for us to be using it to write or to produce ideas.”
In terms of challenges with ChatGPT, Upton Patton said the main thing to note is that it uses unverifiable sources. This means it’s more important than ever to check sources and use proper citations for information. To get ChatGPT to give valuable responses, the user still needs to deeply understand the topic so they can provide clear prompts and judge the quality of its responses.
(Another important consideration: who’s developing the tech. “I think it has to be a stewardship — and it requires all of us, because otherwise there’d be too much bias in the system, if it’s only being stewarded by a very select set of people,” Carolyn Yap, Google’s director of AI Practice, said during PACT’s AI-focused Phorum conference in May. “The more perspectives, the more communities, the more cultures are brought into this conversation, the better the guardrails, the systems, the safeguards, the policies, and even the responsibility matrixes can be.”)
For social work in particular, Upton Patton feels the field tends to be slow to adopt new technologies, and there isn’t yet a strong template for how to use generative AI in the social sciences. The introduction of ChatGPT, however, is an opportunity to create that blueprint for engaging with new technology.
“We see in the hard sciences, we see in computer science, how these tools revolutionize how work is done and how people collaborate and what types of problems people are able to tackle,” Upton Patton said. “We really need these types of insights in social work and social science because we’re dealing with the everyday human experience.”
Sarah Huffman is a 2022-2024 corps member for Report for America, an initiative of The Groundtruth Project that pairs young journalists with local newsrooms. This position is supported by the Lenfest Institute for Journalism.