Next-generation database technologies
There is a need for a third generation of database technologies,
as we are forced to embrace a world of large-memory models, clustered servers,
and highly compressed column-wise storage. By Nivedan Prakash
While database management systems (DBMS) technology has matured, there remains
potential for innovation in integrating structured and unstructured data, virtualizing
access to data, and simplifying data management through greater automation.
DBMS technology and middleware will also evolve to support the information fabric
by virtualizing access to heterogeneous data. These trends offer an evolutionary
path to a future world of information management in which all forms of information
will be easier to access, integrate, and control, and all at a lower cost,
thanks to increased automation.
Many organizations will move to upgrade or expand existing legacy networks and
infrastructure; hence the database market will see lots of activity and increased
competition in an already mature space.
According to IDC reports, most data warehouses will be stored in columnar
fashion rather than in rows; reporting and data collection problems will be solved
by databases that have no formal schema at all; large-scale database servers
will achieve horizontal scalability through clustering; and most OLTP databases
will either reside entirely in memory or be augmented by an in-memory database.
These new systems will encourage companies to set aside disk-based partitioning
schemes, buffer management and indexing strategies, and embrace a world of large-memory
models, many processors with many cores, clustered servers and highly compressed
column-wise storage.
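The columnar layout the IDC forecast describes can be illustrated with a small sketch. The data below is made up for the example and is not tied to any particular DBMS; the point is that storing each field contiguously groups similar values together, which favors compression and lets an aggregate touch only the column it needs.

```python
import zlib

# Row-oriented layout: each record's fields are stored together.
rows = [("2024-01-%02d" % (i % 28 + 1), "IN", i % 5) for i in range(10_000)]

# Column-oriented layout: each field is stored contiguously.
dates, regions, codes = (list(col) for col in zip(*rows))

def stored_size(payload):
    """Rough proxy for on-disk size: compressed repr of the data."""
    return len(zlib.compress(repr(payload).encode()))

row_bytes = stored_size(rows)
col_bytes = sum(stored_size(col) for col in (dates, regions, codes))
print(row_bytes, col_bytes)  # runs of similar values usually compress better

# An aggregate over one column reads only that column's data:
avg_code = sum(codes) / len(codes)
```

Real columnar engines go much further (run-length and dictionary encoding, vectorized scans), but the locality argument is the same.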
Springboard Research reports suggest that databases are critical for data-intensive
environments such as banking, financial services and insurance (BFSI), telecom, retail
and PSUs. Sanchit Vir Gogia, Associate Research Manager - Software, Springboard
Research, said that India as a market for DBMS is at an inflection point. While
large enterprises are clearly dedicating a portion of their IT budget to better
manage data, SMBs are also waking up to these benefits. Interestingly, investment
in DBMS by SMBs is largely driven by pent-up demand for enterprise
applications such as ERP and CRM.
"We are foreseeing a rapid growth in data volumes as organizations try
to get more granular control over what is happening within. The number of events
that organizations will track in real time will definitely rise in the near
future. Hence, the modernization of application infrastructure and growing data
warehousing needs are the key factors driving the growth of the database
market," opined Sundar Ram, VP - Technology Sales Consulting, Oracle Asia
Emerging database technologies
Of late, the industry has seen the emergence of business intelligence
and data warehousing as a major factor influencing business decisions. Businesses
have moved past the applications phase to one in which useful, accurate
and timely information drawn from application data is used to make better business
decisions. Databases have also begun to align with this shift in customers' needs.
They continue to be the heart of transaction processing; but there is a definite
shift in focus towards providing databases that support superior performance
as far as reporting, analytics and data mining are concerned. The trend towards
the use of databases for mining useful, accurate information has led to the
creation of a new category of databases.
"Sybase IQ uses a patented columnar architecture that
provides superior performance in querying and reporting. It works with any transactional
database at the back end and with any reporting software on the front,"
said Sudesh Prabhu, Director - Presales and Services, Sybase India.
Moreover, the adoption of specialty servers is growing and customers are moving
away from increasingly constrained row-based databases for analytics and data
warehousing. There is greater demand for advanced analytics to uncover business
opportunities and risk, thereby driving demand for forecasting, predictive modeling
and data mining. The market is also seeing the integration of all data types
into BI, with traditional and streaming data joined by non-relational/unstructured
data such as XML, geospatial data and media.
Some of the other latest databases that are being deployed by Indian organizations
are Oracle 11g (includes high availability solution), SQL Server 2008 (also
includes high availability solution), and DB2/UDB V9.0.
Meanwhile, many of the latest database technologies help solve not only
exotic data warehouse and other intensive number-crunching problems
but also real-world data management problems.
Sophisticated analytics to outsmart the competition is emerging as a must-have
business practice in many industries. Vast amounts of current and historical
data must be run against intricate analytical models to accurately predict future
outcomes. These analytics systems, however, are where the data explosion has had the most
impact. Implementing more efficient analytics software can therefore not only
address the data explosion and its byproduct, rampant energy use, but also
dramatically increase the speed, scalability, and flexibility of business intelligence.
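Running historical data against an analytical model to project the future can be reduced to a toy example. The sketch below fits a least-squares trend line to five quarters of hypothetical sales figures (the numbers are invented for illustration) and projects the next quarter; real predictive models are of course far more intricate.

```python
# Hypothetical quarterly sales history (made-up figures).
history = [100, 104, 108, 112, 116]

# Ordinary least-squares fit of y = slope * x + intercept.
n = len(history)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(history) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

# Project the trend one period forward.
forecast = slope * n + intercept
print(round(forecast, 1))  # → 120.0
```

The input here is perfectly linear (each quarter grows by 4), so the fitted trend extends it exactly; noisy data would yield a best-fit approximation instead.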
Points to remember
When looking at database platforms, and making choices around selecting a database,
it is vital to look at the functional requirements within the database that
the customer requires and match that to what the database delivers right from
the start without additional customization by a professional services team.
If the choice of a database also imposes a concomitant requirement for specialized
consulting skills around the technology, the customer is forced into
a situation where the engineering effort for the application will only increase.
Bhaskaran Gurumoorthy, Senior Manager at CSC pointed out that it is important
to understand the functional requirement and get acquainted with the environment
and business criticality. Various factors should be considered before making
a final decision on the databases including response time, the ability to handle
huge data stores and data security.
He added that implementing or choosing a database without understanding the
functional requirement would escalate the initial cost of acquisition and make
it more difficult to modify the architecture and environment at a later date.
It would also result in a poorly responsive system/database, with the customer
losing the faith of its users.
However, the situation might well worsen, as data volumes
are increasing at an exponential rate. Today, even small and medium businesses
are transacting at a higher rate than expected, and it is not uncommon to
find enterprise users asking for over 100 business transactions to be completed
every second. At this rate, typical operational and analytical
databases are crossing the terabyte mark as standard, and 10
TB databases may soon be the norm.
Unless database technology is able to demonstrate its ease in managing and manipulating
data on these scales, and show its ability to perform with a high rate of growth,
the customer will have no choice but to keep investing in more computing capacity
for the same application.
Other significant aspects
It is fair to say that most OLTP databases will either be augmented
by an in-memory database or reside entirely in memory, and that most large-scale
database servers will achieve horizontal scalability through clustering.
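One common route to horizontal scalability through clustering is to hash each record's key to a node, so that every machine holds roughly an equal share of the data. The sketch below is illustrative only, with hypothetical node names, and is not any vendor's implementation; production clusters typically add replication and consistent hashing to survive node changes.

```python
import hashlib

# Hypothetical cluster members.
NODES = ["node-a", "node-b", "node-c"]

def node_for(key: str) -> str:
    """Route a record to a cluster node by hashing its primary key."""
    digest = hashlib.sha1(key.encode()).hexdigest()
    return NODES[int(digest, 16) % len(NODES)]

# Each node owns roughly 1/N of the key space, so adding nodes adds capacity.
placement = {k: node_for(k) for k in ("cust:1001", "cust:1002", "cust:1003")}
```

Because the routing is deterministic, any client can locate a record without a central directory; the trade-off is that naive modulo hashing reshuffles most keys when the node count changes.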
An in-memory database stores records in the system's main memory, resulting
in performance that is an order of magnitude faster than that of traditional,
file system-based database management systems. Such a database's streamlined
design can also greatly reduce its code and CPU footprint.
Technological advances now allow even terabytes of data to be stored
and managed in memory, serving as a front-end cache for an even larger backend
database stored on a hard drive (or several hard drives, as the case may be).
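The front-end-cache pattern described above can be sketched in a few lines. This is a minimal, illustrative model, not a real in-memory DBMS: a plain dictionary stands in for the memory tier and a `shelve` file for the disk-backed store, with writes going through to disk and reads served from memory when possible.

```python
import os
import shelve
import tempfile

class CachedStore:
    """Toy write-through cache: a dict in front of a disk-backed shelve."""

    def __init__(self, path):
        self._disk = shelve.open(path)  # disk-backed backend database
        self._mem = {}                  # in-memory front end

    def put(self, key, value):
        self._mem[key] = value          # future reads hit main memory
        self._disk[key] = value         # write through to disk

    def get(self, key):
        if key in self._mem:            # fast path: main memory
            return self._mem[key]
        value = self._disk[key]         # slow path: disk, then warm the cache
        self._mem[key] = value
        return value

    def close(self):
        self._disk.close()

store = CachedStore(os.path.join(tempfile.mkdtemp(), "demo-db"))
store.put("account:42", {"balance": 1800})
```

A real product adds eviction, recovery logging, and concurrency control, but the division of labor is the same: memory for speed, disk for capacity and durability.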
Sanjay Mehta, CEO of MAIA Intelligence, added, "In recent years, main memory
databases have attracted the interest of larger database vendors. Main memory
databases are faster than disk-optimized databases, as the internal optimization
algorithms are simpler and execute fewer CPU instructions. Accessing data in
memory provides faster and more predictable performance than disk. In applications
where response time is critical, such as telecommunications network equipment
that operates emergency systems, main memory databases are often used."
In order to ensure a smooth transition to the next generation of DBMS, vendors
should consider problems their products solve today that might be more effectively
addressed by these third-generation technologies. They also need to enhance
or evolve their DBMS products to incorporate the technology or functionality
needed to address the demands of the database workloads they have targeted.
Besides, vendors should also determine whether there are other workloads, and
therefore other opportunities, that they do not address today, but that they
could address if they develop or acquire third-generation DBMS technology.
Moreover, to satisfy the needs of users outside of business applications, database
technologies must be expanded to offer services in two other dimensions, namely
object management and knowledge management. Object management entails efficiently
storing and manipulating non-traditional data types such as bitmaps, icons,
text, and polygons. Object management problems abound in CAD and many other
application domains.
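Even without a dedicated object store, a non-traditional data type such as a polygon can be managed in an ordinary relational table by serializing it. The sketch below is one simple approach, using SQLite and JSON purely for illustration; specialized object and spatial databases store and index such types natively and far more efficiently.

```python
import json
import sqlite3

# An in-memory relational table holding a serialized geometric object.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shapes (id INTEGER PRIMARY KEY, polygon TEXT)")

# A triangle, stored as a JSON list of (x, y) vertex pairs.
polygon = [(0, 0), (4, 0), (4, 3)]
conn.execute("INSERT INTO shapes (polygon) VALUES (?)", (json.dumps(polygon),))

# Reading it back reconstructs the object from its serialized form.
row = conn.execute("SELECT polygon FROM shapes").fetchone()
vertices = [tuple(point) for point in json.loads(row[0])]
```

The limitation this sketch exposes is exactly the object-management problem: the database sees only opaque text, so it cannot index, query, or manipulate the polygon's geometry itself.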