One of my friends just purchased a new computer: i7-930 2.80 GHz (8 MB cache, LGA 1366), 1 TB hard drive, 6 GB DDR3-1333 RAM, liquid cooling, Gigabyte X58A-UD3R 3-way CrossFire/SLI motherboard, 1000 W Corsair PSU, and 2x Nvidia GTX 470s (1.28 GB each). He promptly put his stat line up as a status message for all to envy, which prompted a lot of conversation about overclocking, prices, watercooling, and blah blah blah. This guy's EE degree aside, he will never use anywhere near this computer's true potential. It's like buying a Lamborghini to pick up the groceries: impressive, I guess, but impractical for the job at hand. I will say up front that I was the only nay-sayer in this rogues' gallery of computational wang-measure.
Overclocking is the process of pushing a processor past its rated clock speed while (hopefully) taking additional steps to manage the corresponding increases in power draw, heat output, and the bus speeds needed to move the sped-up data between various parts of the computer. It can be a useful way to get a more powerful machine when you're on a budget, it lets you take a stepwise approach to upgrades as more funds become available, and it can in general be a great problem-solving, how-does-a-computer-actually-work experience, provided you have a moderate-to-high tolerance for frustration and lots of time to read and return parts. Especially since there's no guarantee that whatever company you were going to buy from hasn't discontinued a key component, changed chip manufacturers, or shipped some bizarre manufacturing defect.
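To make those "corresponding increases" concrete: on an X58 platform like my friend's, nearly every clock in the machine is derived from one base clock (BCLK), so raising it to overclock the CPU drags the memory and QPI speeds up along with it. Here's a minimal back-of-the-envelope sketch of the arithmetic; the ratios are typical illustrative values for an i7-930, not tuning advice:

```python
# Rough overclocking arithmetic for an X58/i7-930 setup.
# Multiplier and ratio values are illustrative approximations.

def clocks(bclk_mhz, cpu_mult=21, mem_ratio=10, qpi_ratio=36):
    """Everything on this platform derives from the base clock (BCLK),
    so raising it speeds up the CPU, memory, and QPI links together."""
    return {
        "cpu_ghz": bclk_mhz * cpu_mult / 1000,   # core clock
        "mem_mts": bclk_mhz * mem_ratio,         # DDR3 transfer rate (MT/s)
        "qpi_gts": bclk_mhz * qpi_ratio / 1000,  # QPI link rate (GT/s)
    }

stock = clocks(133)        # ~2.8 GHz core, ~DDR3-1333, ~4.8 GT/s QPI
overclocked = clocks(160)  # ~3.4 GHz core, but memory is now ~1600 MT/s,
                           # past its rated speed unless you lower mem_ratio

# Dynamic power scales roughly with frequency x voltage^2, which is why
# the extra heat piles up fast once you raise voltage to stay stable.
def relative_power(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

print(stock)
print(overclocked)
print(relative_power(3.36, 2.8, 1.30, 1.20))  # ~1.4x the stock heat
```

This is the whole trade in two functions: a free-looking 20% clock bump quietly pushes the memory and QPI out of spec too, and the voltage you add to keep it stable is what the liquid cooling (and the 1000 W PSU) is really paying for.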
Maybe there's a certain degree of envy on my part. My single-core desktop has a quasimodoed power supply (until I work out 10 minutes with a borrowed Dremel; the hand drill and hacksaw didn't cut it), and my Toshiba R15 Tablet suffered tragic hinge collapse after 2 years of hard use. Between that and a lucky break with a scavenged Inspiron 8200, I've got one functional wall-bound laptop doubling as my TV.

For the average computer user, though, a netbook can handle everything they need: no heavy number crunching, no image/video processing, and so on. Computers have been getting smaller, faster, and lighter without any clear purpose or direction. Saving a discussion of quantum tunneling, the limits of modern lithography, and the "technological singularity" for another post, human limitations are becoming the limiting factor for technology: sensory ("retina grade" display resolution, lossless codecs), interactive (novel input methods; Wii and multi-touch are gimmicks at their current level), temporal (how many hours can be spent consuming), creative (procedurally generated code and the "Simpsons did it!" phenomenon), and so on. Is this why there's not much quality hard science fiction these days, or do I just have a hard time finding it? With technology in general, improvement is iterative and short-sighted (smaller, cheaper, faster, more); long-term consequences are considered and then downplayed (energy, RoHS, scrambled attention spans, cognitive development), while long-term goals are largely ignored (where are we going as individuals and as a society, and why are we developing this?). I would say the greatest single human limitation on computers is one of expectation.
I've had this post sitting here for 3 days while I tried to figure out how to end it. Just put it up "incomplete" and go from there.