The late, great anthropologist David Graeber left a small error of a few sentences in his massive opus Debt: The First 5,000 Years: a completely wrong description of the origins of Apple Computer. Ever since, conservative and even liberal economists like Brad DeLong have used it as a gotcha point to discredit Graeber's entire thesis, which is that debt is at least as ancient as barter, if not more so, and that classical (and neoclassical, etc.) economists have wrongly dismissed the macroeconomic consequences of personal debt. It seems that economists would rather not talk about that.
I thoroughly detest the rhetorical approach of scoring gotcha points on insignificant matters while ignoring larger ones. Graeber claimed, and I have no reason to disbelieve him, that his larger points were not so dismissed by other experts.
Meanwhile, there is no larger pile of BS than mainstream economics in all its flavors. I do find the Post-Keynesian tendency and a few others worthwhile, though it isn't really a single theory.
This is somewhat different from saying Graeber was always correct. As a communist, I basically don't trust anarchism, and therefore I believe anarchists must be getting something wrong in their core thinking--though I haven't been able to prove this to my anarchist friends. He also had, as you can tell from the thread above, a lot of personality.
But this has nothing to do with the origins of Apple Computer. The key historical fact there, I believe, was the construction of blue boxes to steal free long-distance service over the AT&T system, which led Steve Jobs to the belief that electronics could be fun and profitable at the same time. But Apple Computer didn't really invent much; it was more a polishing act. I remember examining computers around 1980 and concluding that Apple had nothing over anyone else but hype and colored advertisements.

The real invention of modern computing occurred at Bell Labs after it withdrew from the industry-wide Multics project in 1969. Working on their own time, members of the technical staff implemented the core ideas in the Unix operating system, which has lived on through Mac OS X and Linux and inspired all the others. Meanwhile, scientists at Xerox PARC crystallized the essence of graphical user interface ideas that had themselves been pioneered by earlier researchers--ideas then infamously stolen by Jobs himself without royalties or credit (reminiscent of those blue boxes). And of course there was the invention of TCP/IP under DARPA, which survived endless attempts by AT&T and other telecoms to destroy it with proprietary schemes. And semiconductors, integrated circuits, and magnetic storage.

Pretty much everything since has been slick marketing, spyware, and misuse of government monopolies. The IBM PC in particular was a travesty, as is its spawn, Windows. Bill Gates should be understood not as a computer geek but as the son of a lawyer. If we ever want to make real progress again, it had best be for people and not profits.
For what it's worth, a friend of mine tried to get me interested in building blue boxes in 1973. I declined, and I also declined to go to UC Berkeley that year after being admitted (I chose to go somewhere else). I wonder whether, had I gone to Berkeley, I would have ended up on the BSD project. BSD considerably improved on the speed of Unix--which had been abysmal--was a household word among computer geeks for decades, and minted a billionaire or two after its alumni started Sun Microsystems. Sun was the first to pull all of modern computing together ("the Network is the Computer"), and it was always a technological pioneer, but it could never compete on price and was sold for parts in 2010.

I started programming in 1972, and by 1973 I was programming on DEC. DEC computers--mainly used in academia--ran various clunky OS's not totally unlike Unix. Unix itself had originally been written on and run on DEC computers because they were relatively cheap, open, and ubiquitous in universities. These machines were vastly more powerful and flexible than the IBM PC of 1981, which was a catch-up effort by IBM pulling in off-the-shelf parts and an outsourced OS, because IBM's in-house personal computer project had bogged down and IBM didn't want to be too late to the game.

At the time there were already dozens of unique personal computers, nobody I knew paid any attention to Apple, and Digital Research was the leading maker of personal computer OS's. Gary Kildall of Digital Research--thinking he already owned the world--refused to meet with IBM, so IBM found another guy willing to sign its unwieldy OS contract. The only reason the IBM PC succeeded was the name "IBM." Meanwhile, DEC had gotten too big for its britches: it had begun trying to compete with IBM mainframes with its VAX line of computers, which were nice but could never be made as fast as IBM mainframes, or even close. DEC infamously wasn't interested in "personal computers" but rather in "departmental computers."
The company I worked for in the 1980s programmed computer-aided design systems on Data General, VAX, and Apollo machines. We looked at the original IBM PC and concluded it was not good enough--not even close. A tiny competitor jumped right in and eventually ate our lunch as the PC and its clones got way faster. The real history of computing is far more complicated than can be described in a few sentences, and we tend to view the past through the lens of the present. Back then, where we are now didn't seem to be where we were going, and I'm not sure it was all for the best. At one time computer programming was a genteel club and we cared about the goodness of everything; that's when the seminal progress was being made. Now it's a sweatshop, modern OS's and programming languages make Frankenstein look cute, and I'm glad to be retired. I can easily understand how someone outside this subculture would have trouble writing something accurate and meaningful down. I often find my intended words mangled on my own screen and can barely imagine what it must be like to get a huge book published.