In the past few years our company has moved from collecting Excel files to OLTP database storage. With a focus on data quality, process integration, and reporting tools, the path has now been laid.
Traditional optimization methods applied to OLTP databases (indexing, partitioning, etc.) will become insufficient for improving reporting time as the amount of data grows. Furthermore, non-IT users find it difficult to extract data from OLTP systems, and extracting data through data visualization tools is only possible if they have a good command of the OLTP relations/tables/rules.
On the other hand, de-normalizing data can create inconsistencies in reporting and data collection. Developing an OLAP system also requires a large amount of time spent creating schemas, rules, and systems, with the only gain being better reporting and more users involved.
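To illustrate the kind of inconsistency de-normalization invites, here is a minimal, hypothetical sketch (the table and customer names are invented for illustration): when a value such as a customer name is copied into every reporting row, a partial update leaves the copies disagreeing.

```python
# Normalized (OLTP-style) storage: the customer name lives in one place.
customers = {1: {"name": "Acme"}}
orders = [{"order_id": 100, "customer_id": 1},
          {"order_id": 101, "customer_id": 1}]

# De-normalized reporting table: the name is copied into every row.
report_rows = [{"order_id": o["order_id"],
                "customer": customers[o["customer_id"]]["name"]}
               for o in orders]

# The customer is renamed, but only one of the copies is updated...
customers[1]["name"] = "Acme Corp"
report_rows[0]["customer"] = "Acme Corp"  # second row forgotten

# ...so reports now disagree about who placed order 101.
names = {r["customer"] for r in report_rows}
print(names)  # {'Acme Corp', 'Acme'} - two names for the same customer
```

In a normalized schema the rename touches a single row; the de-normalized copy requires every duplicate to be updated in step, which is exactly the congruence risk mentioned above.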
Is there any best practice for handling this change?
How can I decide the best direction for my company?
Do you have any suggestions for books, cases, or discussions that could help me manage this change?