I got a kick out of reading the Slashdot posting
Replacing Traditional Storage, Databases With In-Memory Analytics.
One of my personal quirks is that the relational/SQL model has never made
much sense to me. It's both cumbersome and slow. Give me a big
bucket of RAM and a log file any day. It's always hugely faster and
more flexible. If the database is too big for RAM, spread it across several machines.
There's an odd sort of political correctness about SQL. I've frequently
run into people with high-performance transaction systems. When asked
how they achieved that performance, "big HashMap" comes up often,
and often with a hint of embarrassment. Some people seem to think
that it's just a hack that they're forced into to achieve performance.
But there's a murky distinction in my mind between "hack" and
"elegant technique". I tend to think of the log as the Truth, and RAM
as a cache that just happens to be big enough to contain everything.
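That idea can be sketched in a few lines of Java. This is a minimal illustration, not any particular production system: the append-only log file is the Truth, the HashMap is a cache of it that happens to hold everything, and startup is just a replay of the log. The class and file names are my own invention.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.HashMap;
import java.util.Map;

public class LogBackedMap {
    private final Map<String, String> cache = new HashMap<>();
    private final Path logFile;

    public LogBackedMap(Path logFile) throws IOException {
        this.logFile = logFile;
        // Startup: replay the log to rebuild the in-memory state.
        // Later entries for the same key simply overwrite earlier ones.
        if (Files.exists(logFile)) {
            for (String line : Files.readAllLines(logFile)) {
                int tab = line.indexOf('\t');
                if (tab >= 0) {
                    cache.put(line.substring(0, tab), line.substring(tab + 1));
                }
            }
        }
    }

    public void put(String key, String value) throws IOException {
        // Append to the log first (the Truth), then update the cache.
        Files.writeString(logFile, key + "\t" + value + "\n",
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        cache.put(key, value);
    }

    public String get(String key) {
        return cache.get(key); // reads never touch the disk
    }

    public static void main(String[] args) throws IOException {
        Path log = Files.createTempFile("kv", ".log");
        LogBackedMap m = new LogBackedMap(log);
        m.put("answer", "42");
        m.put("answer", "43"); // the newer entry wins on replay

        // Simulate a restart: a fresh instance rebuilds itself from the log.
        LogBackedMap reborn = new LogBackedMap(log);
        System.out.println(reborn.get("answer")); // prints 43
    }
}
```

Everything a real system layers on top — snapshots so replay doesn't take forever, replication, sharding — is a refinement of this same shape.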
There's a huge bag of tricks to trade off reliability, scale, distribution
and startup time. Pick a point in that multidimensional space, and there's
almost always a set of tricks to get you there.