I, like you, have no doubt asked ChatGPT, OpenAI's AI-powered text generator, to respond to some questions and been dazzled by the results.
I asked ChatGPT to draft syllabi for two of my upcoming courses to see how its recommendations differ from my own course outlines. I found the results thought-provoking. The application suggested topics as well as books and articles that I hadn't thought of. It really will make my courses (one on the history of childhood, the other on museums, past, present, and future) better.
I also asked the program to predict what a college education would look like a decade from now.
Here’s what the application wrote without delay:
“It’s difficult to predict exactly what a college education will look like in 10 years, as it depends on many factors such as technological advances and changes in the job market. However, it’s likely that there will be a continued trend toward online and distance learning, as well as an increasing focus on practical, hands-on learning and real world experience. There may also be a greater emphasis on interdisciplinary studies and on providing students with the skills they need to succeed in a rapidly changing job market.”
Not bad, methinks.
I also asked it how AI will transform higher education. Here’s its response:
“AI has the potential to transform higher education by making it more accessible and personalized. For example, AI-powered learning platforms can provide personalized learning paths for each student, based on their strengths and weaknesses. This could allow students to learn at their own pace and in a way that is tailored to their individual needs. Additionally, AI could be used to automate grading and other administrative tasks, freeing up instructors to focus on providing support and guidance to their students.”
Again, not bad.
Working with ChatGPT is a bit like consulting the Oracle of Delphi. It can provide useful advice, but its suggestions need to be taken with care and caution. One of my sons asked for ten suggestions about how to address missed sales projections. Once again, much of the advice made sense: Offer a discount to customers who make a purchase in the next week; run a social media campaign; reach out to past customers; offer a bundle of products and services in collaboration with other businesses; host a flash sale or a contest; create a referral or loyalty program. Impressive.
You can even ask ChatGPT to offer explanations tailored to different age levels.
What we see, even at this early stage of development, is what many of us imagined Google would become: a magic box that can synthesize vast amounts of information and produce useful outputs.
What if JSTOR and other research repositories were to lift their gates and allow AI to work with the full breadth of the scholarship they contain?
I have come away from ChatGPT convinced that it would indeed make sense to consult the platform before tackling a topic. It would prompt me to consider ideas that I hadn’t previously considered.
One of my favorite bloggers, Matthew Yglesias, wrote briefly about the likely impact of AI-powered text generation in his weekly mailbag. He quite correctly asks whether educators should “think of these tools as a source of cheating and redesign their assignments so that students aren’t using them, or should they think of them as writing tools (comparable to a spellcheck program) that everyone should use and that just set the bar higher in terms of what kinds of outputs we expect to see.” He, like me, favors the latter approach.
I think of automated text generators the same way I think of Wikipedia. If all we ask of students can be achieved by copying and pasting a Wikipedia entry or asking ChatGPT to respond to an essay prompt, we aren’t doing our job. We need to ask ourselves:
1. What are Wikipedia’s or ChatGPT’s limitations?
In both instances, the text may contain inaccuracies, it may be written at an inappropriate level (for example, for a general or a technical audience), it may reflect bias, it may be insufficiently documented, and it may not be responsive to the big issues or debates surrounding the topic.
2. How, then, can an assignment address these shortcomings?
Design assignments so that students must lay bare their research, thinking, and writing process. Consider requiring an annotated bibliography. Ask students to situate a topic in a critical or historiographical context and evaluate contrasting points of view. Make sure their essays exhibit higher-order thinking skills: analyzing and evaluating sources, making an evidence-based argument, and arriving at broader generalizations.
3. What can we learn from ChatGPT about the research and writing process?
Why not introduce a topic by discussing the relevant Wikipedia article or a ChatGPT-generated text? Ask your students to critically consider the pieces’ strengths and weaknesses, biases, and style. Remind your students that Wikipedia and ChatGPT provide text that sounds highly authoritative but that rarely considers ambiguities, counterarguments, or contrasting points of view.
Push your students to go beyond what Wikipedia or ChatGPT can currently do. The key, in my opinion, is to complexify the discussion – to add nuance and context, explore ambiguities, contradictions, and implications, and generate more sophisticated hypotheses, arguments, and interpretations.
In his mailbag, Matthew Yglesias also responds to another question well worth asking: Is there anything productive that a good faith conservative state legislature could do to promote less of an ideological bias in state universities?
Yglesias responds by arguing that there are three center-right narratives in wide circulation:
1. That “we need more practical education that is aimed at useful job skills and delivers economic benefits to individuals and society.”
2. That “we need more ‘old-fashioned’ education that challenges preconceptions, wrestles with difficult ideas, and engages the canon.”
3. That “we need to accept that education is largely just pointless status-seeking and consumption, and we should reduce the number of resources our society dedicates to this and the power and prestige of top universities.”
That latter argument, made most explicitly by the George Mason economist Bryan Caplan in his 2018 book The Case Against Education: Why the Education System Is a Waste of Time and Money, holds that a college education’s primary function is not to build students’ skills but to signal other qualities that employers seek: employees who are intelligent, patient, conscientious, conformist, well-behaved, meticulous, and docile, who have a proven ability to function in environments that are dull and boring, and who can multitask.
The claim that education successfully prepares students for the future, Caplan argues, is in most cases dated, irrelevant, and out of touch with both student interests and the jobs they are likely to get. Caplan himself favors more vocational education and a college education that focuses more systematically on essential skills: reading, writing, and mathematics. If we teach foreign languages, we must ensure that students achieve fluency.
The three narratives that Yglesias identifies are, of course, in direct conflict with one another. One narrative calls for an education that is more practical and applied and job-aligned; another for a higher education that is more rigorous and intellectually demanding; a third, for recognizing that the pablum that envelops college – that it is a “special, deeply political, almost sacred civic activity” (in Derek Bok’s words) – is bogus, obfuscating higher education’s role in buttressing class hierarchies.
It’s not just moderate conservatives who have adopted one of those points of view. My impression is that many faculty members, irrespective of their politics, do worry about the lack of ideological diversity on campus and about self-censorship in the face of “cancel culture.” Many, too, fear that academic standards are declining, that expectations about the amount of reading and writing students should undertake are diminishing, and that the overall level of students’ analytic, interpretive, writing, and critical thinking skills is too low. Many recognize that students aren’t receiving the personal attention or level of feedback that they need to achieve the skills and knowledge expected of a college graduate.
I am privileged to teach at an R1 with a very substantial endowment. But when I lecture to 800 students without breakout sessions, how can I possibly provide my students with the learning experience they deserve? Even in my supposedly small 40-student discussion classes, how can I conceivably offer the individualized constructive feedback that my students require? And how can my students feasibly juggle five courses, 20 or more hours of work, and a host of extracurricular activities and still do justice to their schoolwork?
These strike me as the pressing questions that the faculty needs to ask and address. I may not be able to provide a Columbia-like experience, where my largest class consisted of 17 students. But how, given budgetary constraints, might it be possible to do better?
Here’s my answer.
If communication skills are as important as we claim, we need to offer many more classes that are intensive in writing and oral presentation and that are distributed across the curriculum. Ditto for quantitative and digital skills and for comparative, cross-cultural perspectives.
Undergraduates, I am convinced, would benefit from learning experiences that take very different forms. Supplement our standard lecture and discussion courses with other kinds of experiences: communities of inquiry and solver communities; maker spaces, innovation labs, and entrepreneurship centers; courses built around gamified, role-playing experiences; studio classes where students workshop papers or presentations; research-, field-, and community-based courses; service learning and clinical experiences; and virtual and in-person internships.
Place many more undergraduates in thematic learning communities with a faculty mentor. These might be organized around a future career (e.g., the arts, business, healthcare, information technology, public policy, or the natural and social sciences); an area of special interest (such as film, music, or sports); or a societal problem (for instance, climate change, equity, or sustainability).
For those students who want something like a Great Books experience, let’s make that an option, too — though I do hope it will not be exclusively Western-centric, and will include works that the legal analyst who goes by the pseudonym Unemployed Northeastern recommends: “the Four Classic Novels of China, the Ramayana, Mahabharata, and various Sutras and other Indian texts, The Tale of Genji or Tale of the Heike from Japan, the Shahnameh from Persia, at least selections from the Koran (assigning the entire book in one undergrad course is no more realistic than assigning the entire King James), the Popol Vuh – the only remaining major work from pre-Columbian Mesoamerica, Rumi, The Conference of the Birds, some transcribed African epics of the oral tradition (Gassire’s Lute, the Sunjata, etc.).”
Let’s help all our students bloom.
AI currently exists in the same space occupied by other technologies with exciting but as yet unrealized potential, like fusion power, genomics, quantum computing, xenotransplantation, and, of course, the metaverse (and unlike technologies whose impact will not materialize in the foreseeable future, like brain implants, hoverboards, human cloning, and light sabers).
Without becoming all rah-rah, what might we conceivably see in the next few years? AI-powered tutors and tutorials; AI assistants (to create bibliographies or even initial paper drafts); more sophisticated early warning systems to spot students who are off track academically or in danger of failing; tools to analyze students’ writing and formulas; and various kinds of task automation.
But let’s not think of AI simply as an automation tool or as an assistant. We might, instead, think of it as a collaborator — as a resource that we can use in research, writing, and thinking. I feel today a bit as I did in 1993, when the Internet browser was introduced.
As a historian, I should be cautious and beware of frenetic enthusiasm. We know all too well that highly touted technologies, like blockchain, frequently fail to live up to the hype. So let me echo Lincoln Steffens’s words after visiting the Soviet Union in 1919, fully aware that the phrase is fraught with irony: “I have seen the future, and it works.”
Steven Mintz is professor of history at the University of Texas at Austin.