I’ve never gotten sucked into the hoary Mac vs. PC debate. As far as I’m concerned, they both suck. They both keep us perpetually off balance, technologically speaking, and both leave a trail of obsolete peripherals in their wake.
This morning my wife got on the laptop I’d just loaded with Windows 7 and reported (I’m paraphrasing here): “This *&$^% printer doesn’t work.” I checked it out and was able to confirm her findings. Microsoft’s own support site tells me that my little printer, about two years old, is not compatible with their latest and greatest OS. No apology, no hints on how to make it work. Basically, if I want to print anything from Windows 7, I’m going to have to take that 2-year-old printer to the curb and get a new one.
Just for fun, I checked on Apple’s site, to see if a Mac running Snow Leopard might have better luck. Maybe it was time to switch. But nope. My printer’s dead to Apple, too. But they’d be happy to sell me a new one that would work.
Home computers are wondrous machines, able to Hoover up hours of vitality and convert it seamlessly into useless butt time. You can play amazing games, watch streaming HD video, make JibJab mashups and organize millions of crappy photos and videos into convenient libraries you will never use. But try to print a single black-and-white document after an OS upgrade, and things can get difficult.
I get it, OK? It’s cutting-edge technology. The idea is that we upgrade everything on the same cycle and send our perfectly good stuff to the landfill with every incremental advance. But I’ve been doing that too long. I’ve owned computers since 1984 (the first was an Apple IIe) and I shudder to think of all the functioning hardware I’ve disposed of since then: printers and modems and headphones and monitors and mice and scanners. I love tech as much as the next guy — maybe more — but those landfills can only hold so much.