Intel Thunderbolt: brilliant innovation or worthless grasp at the past?
Cables are a pain in the ass. Almost any device you own can hook up to your big screen — if you're willing to jump through hoops.
I’m personally on a quest to find the cheapest way to hook my Frankenputer media-monster computer, my cameras, my iPod, and my other computers to the bigass flat panel and kickass 7.1 speakers, but there are tons of problems to iron out.
Widgets and dongles are the best and easiest path where you can make them work, but it’s looking more and more like I need either an AV switcher or a full-bore AV receiver to make this all work easily.
I am reluctant to shell out that kind of money, since the whole “home theater” scam is geared toward people who have $20k to drop on their showplace installation rather than the average guy who just wants all his stuff to work on the good monitor. I’m almost there, but it’s getting rougher along the home stretch; it’s almost like I have to whip out my breakout box and start re-pinning my own custom cables.
But perhaps it won’t be this way forever. Over the past few years, Intel has been developing a new cable standard that’s designed to replace all of the different cables that now tangle up our lives. On Thursday, the company officially launched the technology, called Thunderbolt. It’s debuting on Apple’s new line of MacBooks, but Intel is pushing for it to become the dominant cable standard for the PC and electronics industry. It has a good chance of doing so: Thunderbolt works for video, audio, and data, meaning that you can use the same cable to connect your computer to your monitor, your external hard drive, your router, your TV, or any other Thunderbolt-enabled device. If Thunderbolt succeeds, then, it will replace USB, HDMI, Ethernet, DVI, VGA, RCA, and pretty much every other impenetrably named connector. You’ll have one cable to connect everything—won’t that be grand?