AI
De heathen back dey pon de wall
It’s essential to think about AI not as a new technology, but as the latest generation of compute technology, more or less beginning with the invention of the transistor 75 years ago. We don’t well understand the impact this technology has had on society to this point, much less can we crystal ball this next generation. Understanding the past provides some insight into the future.
AI is a super-powering of existing technology. Intel's first microprocessor in 1971 had 2,300 transistors. The much-ballyhooed Nvidia AI chip has 80 billion. In a half-century, that's an incredible increase in raw compute power. 80 billion is an inconceivable number to the human brain; in popular American vulgar math it's “a lot,” and combining all the numbers being computed at any given time, it’s a real lot.
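For a rough sense of scale, a back-of-envelope calculation using only the two figures cited above (a minimal sketch; the variable names are just labels for those figures):

```python
# Rough ratio of the two transistor counts cited above.
intel_1971_microprocessor = 2_300
nvidia_ai_chip = 80_000_000_000

ratio = nvidia_ai_chip // intel_1971_microprocessor
print(f"roughly a {ratio:,}x increase in transistors on a single chip")
# -> roughly a 34,782,608x increase in transistors on a single chip
```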
From a software perspective, this compute power allows the manipulation of massive amounts of data. Over the last 20 years, data and its storage have grown exponentially. Businesses, governments, and individuals have placed increasing amounts of data into centralized databases marketed as “the cloud.” The increase in compute power allows greater brute-force searches across these massive databases. This power, combined with more sophisticated algorithms, creates ordered results, which are then put through algorithms refined by the initial results to give even better-defined results, then rinse and repeat. This, roughly, is the process behind large language models. This simple generalization is in no way meant to denigrate the process, which from a technological perspective is powerful, complex, and sophisticated. You might be tempted to say it's not so different from how we biological organisms learn, how we define intelligence. Like us, AI digs repetition.
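To make the “rinse and repeat” concrete, here is a deliberately tiny sketch of that loop: a single-parameter model repeatedly passes over some made-up numbers and nudges itself toward the pattern hidden in them. Real large language models use neural networks with billions of parameters trained by variants of this idea (gradient descent), but the repetition is the point; the data below is invented purely for illustration.

```python
# A toy of the "rinse and repeat" loop: pass over the data, measure the error,
# nudge the parameter, repeat until the pattern in the data is captured.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]  # (input, observed output) pairs
weight = 0.0          # the single "parameter" being refined
learning_rate = 0.01  # how big each nudge is

for step in range(1000):                # rinse and repeat
    gradient = 0.0
    for x, y in data:                   # a brute pass over all the data
        error = weight * x - y          # how wrong the current guess is
        gradient += 2 * error * x       # the direction that reduces the error
    weight -= learning_rate * gradient / len(data)  # refine the parameter slightly

print(f"learned weight ~ {weight:.2f}")  # settles near 2, the pattern in the data
```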
The final essential component in thinking about AI as already established technology is the internet, not only as a data source but as an enabler of greater simultaneous compute and extensively networked communication, again, a lot of data. The net is a networked compute collective, corporate (from the Latin corporātus, “to form into a body”).
The FT has an interview with a professor, a senior fellow at the Stanford Institute for Human-Centered AI (yeah really, that's its name). He crystal balls from a very limited view. He's an economist, his thinking completely constrained by his past, and not in a good way. It is another piece bereft of any explanation of the technology but full of speculation on what may come. It is speculation filtered through a very narrow lens fractured by established interests, a lens that holds the dominant and in many ways exclusive measure of value not only for compute technology, but for the last two centuries of industrial technology. It is a perspective exemplified by the article spending the most effort mulling whether compute technology has added to productivity, a narrow, entirely economic measure of value.
Here's the interview nut, the professor states,
“The Digital Economy Lab, which I direct, is premised on the idea that we need to rapidly invest more in understanding the economics, organisational, cultural, political and ethical side of AI.”
First, the name “Digital Economy Lab” assures the prevailing measures of value will be anachronistic, archaic, and sponsored. Case in point, he says we need to “invest” in understanding AI's “organizational, cultural, political, and ethical” impacts. Phew, this is hopelessly naïve: not only have we done none of this to date for compute technology, we've done little for the whole Industrial Era. To date, the subjective values of economics and those demanded by the technologies themselves have overwhelmingly been the dominant value measures, productivity being an excellent example. We don't even know how to incorporate others. Best part of the interview: you're supposed to believe a program at a university largely funded by the Valley will incorporate any measures but its own, much less even contemplate them. Maybe we should check in with Stanford’s Bankman-Frieds on how to instill noneconomic, nontech values into tech development?
This is funny,
“I encourage technologists to think hard about how they can use the technology not just to imitate or mimic humans, but to augment and complement people.”
Technologists think about their technology. They represent the technology, both as individuals and as corporations. They advocate for the technology and for how it will be monetized, nothing else. Some might say that's a big problem.
Then the professor states,
“The other aspect is minimising the harms, including on humans. We are capable of changing our environment faster than we evolve to adapt to it.”
Evolve is a problematic word here. With natural selection, the greater environment shapes the species. With technological development, the species reshapes the environment. “Adapting” in this regard means the species, us, reshaping the environment with technology, then in turn adapting to the new environment that technology has shaped; call it artificial selection. Industrialism completely altered the environment from which Homo sapiens evolved. Our ability as a species to adapt to our industrially created environment remains uncertain.
Finally, the professor states,
“There are two powerful trends going on in opposite directions. One is driving towards more concentration. Scaling laws mean that bigger systems tend to be more powerful. If they have more computing power, more data and more parameters, they perform better. And this is the reason that companies like OpenAI, Google and Anthropic are spending billions of dollars to build gigantic systems. It is hard for smaller companies to keep up.”
“On the other side, open source and very small systems have become able to get close to the frontier models. Often they have local data that the bigger models do not have access to, and that can be more important than just raw power. In that second path, of having local data and rapid iteration where they could be trained in days or even hours instead of months, the economic impact can be larger.”
“I’m uncertain which of the two trends will ultimately dominate. Part of it will depend on our policy, antitrust, choices by executives and rules about data ownership.”
“So there’s a real premium on smart governance, managers and policymakers really paying attention to this technology.”
First, “smart governance, managers, and policymakers” are not going to come from established organizational structures, whether the industrial corporation, agrarian government institutions, or feudal universities. Second, there is no trend in two directions: both industrialism and compute technologies have resulted in the centralization of political power and the concentration of wealth, both becoming more extreme with our adaptation to compute technology.
That “small systems” can effectively apply AI technology seems no surprise, though it indicates no trend at all toward distributed wealth or politics. It is in small systems that AI should be most immediately impactful. For example, if you're looking at the books of a small firm with very narrow accounting rules and limited amounts of data, running them through an AI system should more easily identify established patterns, which can then be repeated, that is, automated, by the machine (a toy sketch follows below). It doesn't in any way imply that control of these small systems will be distributed; they can easily all be aggregated and centrally controlled through present technologies. The present internet is structured with a completely undemocratic technological architecture; as currently organized, it will not foster greater democracy. AI seeks greater central control through the net.
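As a toy illustration of that small-firm case (a minimal sketch with invented categories and amounts, not a claim about any particular accounting product): learn the firm's own routine from its past entries, post automatically whatever fits the routine, and flag whatever doesn't.

```python
from statistics import mean, stdev

# Prior months' entries per category, the firm's "established pattern"
# (all figures here are hypothetical).
history = {
    "rent":      [1500.00, 1500.00, 1500.00, 1500.00],
    "utilities": [180.25, 175.10, 182.40, 177.90],
    "supplies":  [60.00, 58.40, 61.75, 59.90],
}

# This month's entries waiting to be posted.
new_entries = [("rent", 1500.00), ("utilities", 920.00), ("supplies", 60.20)]

for category, amount in new_entries:
    past = history[category]
    typical, spread = mean(past), stdev(past)
    # Entries far outside the firm's own routine get flagged for a human;
    # the rest are posted automatically, the "automation" described above.
    if spread and abs(amount - typical) > 3 * spread:
        print(f"review: {category} {amount:.2f} (typical ~ {typical:.2f})")
    else:
        print(f"auto-post: {category} {amount:.2f}")
```

Nothing in this requires the books to stay under local control; the same routine can run just as easily in someone else's data center, which is the point above about aggregation.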
The energy essential for the great AI transformation is a subject not broached, despite the Valley's call for more nukes. I've even read someone say that by increasing the energy needs of the industry by a factor of 2, 3, maybe 4, AI will then tell us how to use less energy. In this regard, if AI's opinion is worth anything more than its creators', it will say: unplug me.