Artificial Intelligence (I)
When I was a sophomore in high school, we went on a trip to the University of Illinois, Champaign-Urbana. U of I was a computer school with a top-of-the-line mainframe. You'd sit in a room with a couple rows of ten or so terminals networked to the computer. We all typed “Hi” or something. A few years later, my uncle, who was a mainframe computer programmer, brought over one of the first small computers. I think it was a Commodore. I remember he wrote a quick little program. I was more interested in the little spaceship game that came programmed into it. A few years later, when I graduated college, it was the beginning of the Personal Computer (PC) era. Upon graduation, my school was offering a deal on an Apple machine, which I very much wanted, but the Jesuits had bled me dry. I think it was around $1500, which was a lot of money then, still is.
Over the next ten years, I started learning and working with PCs. They were invaluable in election campaigns: word processors for writing, spreadsheets for number crunching, and databases for voter files – the handiest of tools. When I left campaign work in 1992, the internet was just beginning its expansion into the larger public.
I became more aware of the internet from a Rolling Stone interview with Mitch Kapor, who had founded Lotus and created Lotus 1-2-3, a spreadsheet program that was the first “killer app” of the new PC era. Within four years, Lotus was a tremendous success, making Mitch a very wealthy guy. By 1992, he had become a big-time evangelist for the internet. I immediately became intrigued by the possibilities of the Net as a new communications and information medium. I began thinking about what it might mean for politics, which was the completely wrong thing to think. Pretty much everyone else was thinking only about the money to be made.
Mitch was one of the few public advocates who understood the Net would have profound political implications, and that we should at least be thinking a little about them. I got in touch with him and developed a long relationship. Initially, he taught me a lot about computers and the industry's history. A decade in, I'd like to say we worked together for three years, but that wouldn't quite be right. He did, however, pay me for three years, so I guess that comes under some definition of work. I was unable to teach him anything about politics; he may have been my worst student ever. Over the years I learned this wasn't just a failing of Mitch's, but an industry affliction. It could well be argued it's a national ailment.
It's important to understand, in general and even better specifically, how computers are designed, function, and are networked: the architecture. In fact, in one of the great, though distressingly few, political insights of the early Net era, Kapor stated, “Architecture is politics.” The design of the machines and the way they're networked together have political implications. For example, you won't get democracy out of a centrally controlled network.
In one of the great political bait and switches of all time, the internet was promoted as a decentralized, distributed network, which it wasn't. The internet was entirely dependent, and remains so, on centralized servers for access and organization. This fact was largely ignored by most of its initial promoters, who instead focused on the routing of the Net's data traffic, which could follow a number of different routes from where it was generated to its eventual destination. Nevertheless, routing packets of data across a variety of paths while depending on massive centralized servers doesn't make a distributed, decentralized architecture. The network is controlled by those centralized servers.
Working with PCs and gradually learning more about the industry, I came to understand the role Microsoft played in its development. In the early 80s, IBM, at the time the world's biggest computer company, contracted with Microsoft to provide the operating system for the new IBM PC. As I worked more extensively with desktop applications and programmers, I learned of Microsoft's machinations to gradually take control of the desktop using their control of the operating system. Any application written for a PC needed access to the operating system's specifications, a position Microsoft ruthlessly took advantage of. By the mid-90s, despite having developed none of word processing, databases, or spreadsheets, Microsoft dominated them all. With control of the desktop, they were now looking to dominate the burgeoning internet.
One of the biggest lost stories of the computer industry, besides the little-discussed essential role of the military (see Yasha Levine's excellent Surveillance Valley), is the 1998 Microsoft antitrust case. By 1996, the notoriously libertarian computer industry, though never one to turn down a military buck, began to fret that just as Microsoft had used its control of the operating system to dominate the desktop, it would now use its monopolist position to control the Net. In the infamous words of Steve Ballmer, Microsoft Chairman Bill Gates's number two man, Microsoft sought to get a “vig,” a gangster's term for a piece of every bet, on every transaction occurring across the internet. In 1998, with a case more or less clandestinely developed by the industry, the Department of Justice launched an antitrust suit that two years later found Microsoft guilty of monopolistic actions. The findings quashed Microsoft's plans for internet dominance. Unfortunately, the decision left the company intact and allowed Chairman Gates to keep his mountain of illicit profits, which now, combined with fifty sheltered years focused on one industry, allow him to give exceedingly bad advice on numerous topics and, even more detrimentally, influence the development of many others (see my review of Alexander Zaitchik's essential Owning the Sun).
Today, decades later, Mr. Gates announces the computer industry has reached the end of its long grail quest: “The Age of AI Has Begun.” This immediately begets a number of questions, first and foremost: what is Artificial Intelligence? That in turn would require answering the question of what exactly intelligence is, a question first asked thousands of years ago and still very much without an easy answer.
Certainly it's easy to think this is just another PR campaign like the most insidious “Cloud,” which is simply the storing of your data on their centralized servers, not so much fluffy as felonious. Microsoft, one of the biggest Cloud blowers, states theologically on its website, “The cloud is not a physical entity,” metaphysical marketing bullshit worse than promoting the ancient superstition of the planets as gods. Considering Microsoft's now integral role in launching Artificial Intelligence, such marketing nonsense should concern everyone. It doesn't matter what it's called; it's pretty clear Artificial Intelligence is a next-generation information technology, a very powerful technology in need of much greater attention and thought than marketing slogans.
In “The Age of AI Has Begun,” Chairman Gates announces the launching of a new revolution, though unlike his power-zealous forebear, permanent revolutionary Mao Zedong, Mr. Gates is continually met with hosannas of praise from moneyed elites. It's both telling and amusing that Gates claims the first revolution he was introduced to was the graphical user interface, “the forerunner of every 'modern' operating system.” It demonstrates not only the Chairman's lack of imagination, but also his role as that revolution's Bonaparte, Windows as empire. At the technological forefront, Microsoft's story was never a tale of heroic innovation but, as with their first IBM PC operating system, MS-DOS, one of simple transactions of filthy lucre.
Which gets us to the other ferociously stereotypical Valley character of this Age of AI – Sam Altman. A recent Wall Street Journal piece dubs Mr. Altman an “AI Crusader.”