Posted by: Clinek
Managing an Oracle shop, Oracle database administration
I received some interesting comments about my post Is the Oracle DBMS a legacy technology? from Mich Talebzadeh, a London-based independent Sybase consultant. Let me know what you think . . . .
There is a convincing argument these days for classifying Oracle’s approach to handling both OLTP- and OLAP-type loads as fundamentally flawed.
The argument is not so much that relational database management systems like Oracle “should be considered legacy technology” because they are more than a quarter of a century old, but more to do with the philosophy and design approach they were created for. Twenty-five years ago, and even ten years ago, databases were far smaller, on the order of a few gigabytes in size. They served a smaller community of users who added and read data in roughly equal measure. This was the era of transaction-based databases, in which a user (a trader, for example) was interested only in his or her own portfolio. Adding a few trades and reading them back with some aggregates was fine and well within the capability of the old optimiser, which relied purely on nested-loop joins and the like. At that time, most RDBMS engines, Sybase and others included, did not even bother implementing sort-merge joins, never mind hash joins.
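To make the contrast concrete, here is a minimal sketch (in Python, with made-up trade and portfolio data; this illustrates the general algorithms, not Oracle’s or Sybase’s actual internals) of a nested-loop join next to the hash join that early optimisers lacked:

```python
# Illustrative only: (portfolio_id, symbol, qty) and (portfolio_id, owner).
trades = [(1, "IBM", 100), (2, "MSFT", 50), (1, "IBM", 200)]
portfolios = [(1, "Alice"), (2, "Bob"), (3, "Carol")]

def nested_loop_join(left, right):
    """O(n*m): for each left row, scan every right row. Fine for small tables."""
    return [(l, r) for l in left for r in right if l[0] == r[0]]

def hash_join(left, right):
    """O(n+m): build a hash table on the join key, then probe it once per left row."""
    index = {}
    for r in right:
        index.setdefault(r[0], []).append(r)
    return [(l, r) for l in left for r in index.get(l[0], [])]

# Both strategies produce the same matches; only the cost model differs.
assert sorted(nested_loop_join(trades, portfolios)) == sorted(hash_join(trades, portfolios))
```

On a few gigabytes with small working sets, the nested loop is perfectly adequate; it is the millions-of-row scans of today that make the hash-based strategies matter.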
Things have moved on since then. Today’s systems and users deal far more with non-transactional (read) activity than with transactional (write/update) activity. An average trader today is interested not only in his or her own portfolio but in other portfolios and in analytics, which requires sifting through millions of records.
If I were to design a new database engine today, I would gear it towards OLAP-type rather than OLTP-type load. I would probably make the design *column-based* as opposed to row-based. A column-oriented system is a database management system that stores its content by column rather than by row. This has advantages for workloads that mostly read data, such as data warehouses, and it adds the convenience of data compression, as Sybase IQ does. My argument would be that in today’s read-hungry world, an average user reads far more than he or she writes.
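A toy sketch may help show why the columnar layout favours reads and compresses well. The trade records and the run-length encoder below are purely illustrative (this is not how Sybase IQ is implemented, just the general idea):

```python
# Illustrative trade records: (symbol, side, qty).
rows = [
    ("IBM",  "BUY",  100),
    ("IBM",  "BUY",  200),
    ("IBM",  "SELL", 150),
    ("MSFT", "BUY",   50),
]

# Row store: each record kept together -- ideal for "fetch trade #2".
row_store = rows

# Column store: each attribute kept contiguously -- ideal for scans and
# aggregates over one column across millions of records.
col_store = {
    "symbol": [r[0] for r in rows],
    "side":   [r[1] for r in rows],
    "qty":    [r[2] for r in rows],
}

def run_length_encode(column):
    """Repeated values in a clustered column compress to (value, count) pairs."""
    out = []
    for v in column:
        if out and out[-1][0] == v:
            out[-1] = (v, out[-1][1] + 1)
        else:
            out.append((v, 1))
    return out

print(run_length_encode(col_store["symbol"]))  # [('IBM', 3), ('MSFT', 1)]
print(sum(col_store["qty"]))                   # aggregate touches one column only
```

The aggregate never touches the symbol or side data at all, whereas a row store would drag every full record through memory; and the repeated symbol values shrink to almost nothing once encoded.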
Going back to the approach of Oracle, it is becoming increasingly clear that *one size/one philosophy* does not fit all. When it comes to OLTP applications, we know well that Oracle is slower compared to Sybase (or Microsoft SQL Server, for that matter). On the other hand, its transactional and concurrency approach is better suited to large volumes of data. However, what it cannot do (although it tries very hard) is to adapt the relational model, on which Oracle (much like Sybase and Microsoft SQL Server) is based, to handle what a column-based system inherently does much better!
Does that mean that Oracle, and for that matter Sybase and Microsoft SQL Server, are legacy systems? I don’t think so. All it means is that there is a limit: you can take the relational model only so far. The RDBMSs are perfectly OK for certain things. The column-based systems, however, are good for heavy read activity with low transactional (write) volume.
So what is the solution in a business world that is growing increasingly distributed and has to deal with the increased use of heterogeneous systems and proprietary databases across different levels of business? I genuinely think that we need to deploy tools fit for the purpose, and that relying on a one-engine philosophy (say, RDBMS) is no longer viable. The solution is not to bolt a Ferrari and a bus together to create an all-purpose vehicle, but to use a Ferrari and a bus where each is appropriate. Today’s technology readily allows us to pair a good RDBMS like Sybase ASE for transactional activity with a column-based system like Sybase IQ. For that matter, you can use Oracle as the RDBMS and bolt it to Sybase IQ, if that’s what you have to do.
Mich tells me that he drives a Ferrari.