Cloud Computing will change the way we do databases
Thanks in part to Larry Ellison’s hard work and rapacious libido, databases are to be found everywhere. They lie at the bottom of most web applications and in nearly every bit of business software. If your web site uses dynamic content, you need a database. If you run SAP or any ERP or CRM application, you need a database. We’re all using databases all the time, whether we actually have one installed on our personal computers or not.
But that’s about to change.
We’re entering the age of cloud computing, remember? And clouds, it turns out, don’t like databases, at least not as they have traditionally been used.
This fact came out in my EmTech panel and all the experts onstage with me nodded sagely as my mind reeled. No database?
No database.
Parallel computing used to mean scientific computing, where hundreds or thousands of processors were thrown at technical problems in order to solve them faster than Moore’s Law might otherwise have allowed. The rest of us were relying on rising clock rates for our performance fix, but scientists (scientists with money, anyway) couldn’t wait, so they came up with the idea of using multiple CPUs on problems divided into tasks, solved in parallel, then glued back together into a final result. Parallel computing wasn’t easy, and sometimes that was the whole point: doing it simply because it was so difficult. Which is probably why parallel computing remained a small industry until quite recently.
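To see what divide-compute-glue actually looks like, here is a minimal sketch in Python. The problem itself (summing square roots) is a hypothetical stand-in; only the shape of the computation, splitting work across CPUs and merging the partial results, matters:

    import math
    from multiprocessing import Pool

    # Hypothetical worker: sums square roots over a half-open range.
    # The arithmetic is a stand-in for any divisible technical problem.
    def partial_sum(bounds):
        lo, hi = bounds
        return sum(math.sqrt(i) for i in range(lo, hi))

    if __name__ == "__main__":
        n, workers = 10_000_000, 4
        step = n // workers
        # Divide the problem into independent chunks...
        chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
                  for i in range(workers)]
        with Pool(workers) as pool:
            # ...solve each chunk on its own CPU in parallel...
            partials = pool.map(partial_sum, chunks)
        # ...then glue the partial results back into a final answer.
        print(sum(partials))

Each worker runs completely independently here, which is exactly why problems with lots of cross-talk between tasks resisted this treatment.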
What changed was that Moore’s Law itself put an end to the clock rate war: chips were simply getting too hot. While faster chips delivered mostly linear performance gains, and each new process generation cut the cost and power of individual transistors, the heat generated inside each microprocessor was climbing roughly as the cube of the clock frequency. Back in 2004 Intel released a chart showing that any clock speed over 5 GHz was likely to melt silicon and that, on its then-current course, Moore’s Law would by 2010 produce internal power densities similar to those on the surface of the Sun!
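For the curious, that cubic figure falls out of the standard dynamic-power relation for CMOS chips. A rough sketch of the argument, under the usual assumption that supply voltage had to rise roughly in proportion to clock frequency to keep transistors switching reliably:

    P_{\text{dyn}} \approx C V^{2} f, \qquad V \propto f \;\Longrightarrow\; P_{\text{dyn}} \propto f^{3}

Here C is the switched capacitance, V the supply voltage, and f the clock frequency. Double the clock and, on that assumption, you dissipate roughly eight times the power in the same sliver of silicon.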