Today's "artificial intelligence" technologies are still computing engines, and they are limited to the data we feed them. They don't "think" in the same way that humans think — instead, they average the sum total of information that's available for processing. Although they can imitate artistic and musical styles while also providing specific information sifted from large bodies of text, their outputs are still only averages of prior human innovations. Thus, the current models are fundamentally constrained by the limits of human knowledge.
More limiting still, not all human knowledge is written down. Take oral histories, for example, such as how Native peoples passed down the art of making waterproof clothing from animal membranes (Serban). Or consider the mystery of Roman concrete, which researchers have spent decades trying to reconstruct (Chandler). How many personal "life hacks" have you figured out for yourself? How much do you depend on "common sense"? If that advice for living hasn't been written down for the AI to scrape off a webpage, then the AI can't share it.
But not all human knowledge is equally valid, and the machines can't differentiate between what is accurate and what is merely popular. We argue, we make assumptions, and we draw conclusions based on incomplete data about the universe we inhabit. When a large language model like ChatGPT averages these massive collections of human-derived knowledge, it also averages in our own limits. If the majority of people believe something that's incorrect, then the AI will give that belief more weight than something that's actually true. History is filled with widespread assumptions that were later proven wrong — such as the conviction that humans were never meant to fly, or the belief that the Earth was flat.
But humans don't need to agree on what's accurate — we are free to pursue our personal beliefs while ignoring those propositions we don't accept. Sure, this means that some people make terrible decisions based on ridiculous assumptions — but it also means that some individuals will find more accurate knowledge to share. Our society is free to choose the information that we find the most useful — but we often get it wrong. The large language models are fundamentally limited because it's not the computers out there living, breathing, and bleeding through our mistakes. The machines aren't conducting this experiment called life — we are.
That said, machines have a major advantage in perceiving and sorting our collected knowledge. As individual beings, we humans are each limited to two eyes, two ears, and a finite number of years on this Earth. No one human being could possibly read or perceive any significant portion of the internet — but an AI platform can. A large language model can sort and evaluate and rank vast sums of information. Now, that evaluation may be flawed because the machine is averaging rather than experimenting, but the sheer breadth of information available to a single AI machine is simply staggering.
AI has revolutionized the possibilities for research. Once upon a time, index card catalogs in libraries helped researchers find specific books related to topics of interest — later, the mass growth of the internet, along with search engines like Google, Yahoo!, and Bing, transformed the ability to track down almost any conceivable source that anyone might need. And now with AI? The search engine doesn't just point you to reading material — it reads hundreds of sources, condenses them, and provides a summary to answer any question you might pose. Sure, that information may include massive mistakes and hallucinations (Maliugina), but it's still drawing from far more information than any single human being could evaluate in a reasonable period of time.
Despite the limits and flaws of AI, it is a disruptive technology — both for good and ill. It offers incredible efficiencies in certain areas, and it will continue overturning many of our traditional social structures. In some situations, AI can augment the capabilities of ordinary people — in other situations, it will take away jobs while reducing service. I believe this will accelerate the enshittification trend described by Cory Doctorow, where digital platforms reduce quality in order to serve purely financial goals.
AI is replacing teams of workers who once collaborated to review information and make decisions. AI queries deliver the same types of information in a fraction of the time at a tiny fraction of the labor cost, leading to an organizational "flattening" as corporations replace managerial teams with AI (Nolan).
AI is also able to conduct the large-scale social engagements of massive organizations, both for corporations and criminal enterprises. AI agents are rapidly replacing customer service personnel (Floyd), and deepfake AI likenesses are being used to scam people (van der Linde).
AI isn't cheap — ethically, economically, or environmentally:
Large language models can be used as plagiarism machines. Why is Anthropic paying authors $1.5 billion to settle a lawsuit? Because its chatbot training materials came from pirated books.
AI is expensive — and the price is not being shouldered equally. Industry experts aren't sure who will ultimately pay for the costs of AI platforms (Grant Gross), and residents are already paying higher electricity bills as AI drives up utility prices (Paige Gross). Meanwhile, chip manufacturer NVIDIA is funding the very same AI platforms buying their chips, leaving a single company able to control both supply and demand (Khan). This will inevitably drive up processor prices — and every consumer who owns a computer is affected by those price hikes.
AI requires massive amounts of energy and raw materials. Some regions of the U.S. have seen electricity rates skyrocket due to AI's increasing electricity demands on existing generators (Kearney). AI data centers also use vast quantities of freshwater for cooling (Yañez-Barnuevo). And these data centers require everything from conductive metals and precious minerals (James and Meyers) to steel and concrete (Engle).