Does Big Data sweep away data warehouses, or do classic data warehouses still have a future?
Data warehouses, and the data-exploitation solutions built on them, flourished 10 years ago. A flood of analytical requests and a proliferation of ad-hoc analyst databases eventually led, explicitly or implicitly, to data warehouse projects. Developing a data warehouse and keeping it up to date is very costly. Should we abandon our data warehouses because of the growing penetration of Big Data solutions? Has our investment been in vain?
Workers in the "data industry" have lately been arguing a lot about when Big Data tools are worth using and when we should rely on our classic data warehouse. The answer can differ from one organization to the next, depending on the refresh rate and complexity of the data warehouse and on the availability of a production Big Data environment. Decision-makers often choose the fast, safe, beaten path over more innovative solutions; still, we also see hybrid solutions, where unstructured data is processed by Big Data tools and then loaded into the classic warehouse in a structured model. Experts generally agree, however, that the tested, actively used data inside the structured data warehouse, built at high cost, should be utilized as much as possible. Examining the following aspects can help in shaping the solution:
- Is the necessary data available in our classic data warehouse?
- How structured is the available source data?
- How large is the data volume?
- Who will use the data, how, and how skilled are they?
- How frequently does the data need to be refreshed?
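The hybrid approach mentioned above, where unstructured data is first given structure and then loaded into the warehouse's model, can be sketched in miniature. In this illustrative example the log format, the `fact_requests` table, and the use of SQLite as a stand-in for the warehouse are all assumptions; in a real setup a Big Data tool (e.g. Spark) would do the parsing at scale:

```python
import re
import sqlite3

# Hypothetical semi-structured input: raw web-server log lines.
RAW_LOGS = [
    "2024-05-01 10:15:02 GET /products 200",
    "2024-05-01 10:15:05 POST /cart 201",
    "not a valid log line",
    "2024-05-01 10:16:11 GET /products 404",
]

LOG_PATTERN = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) "
    r"(?P<method>\w+) (?P<path>\S+) (?P<status>\d{3})"
)

def parse(lines):
    """Turn unstructured lines into structured rows, dropping the rest."""
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m:
            yield (m["date"], m["time"], m["method"], m["path"], int(m["status"]))

# An in-memory SQLite table stands in for the classic data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fact_requests "
    "(day TEXT, time TEXT, method TEXT, path TEXT, status INTEGER)"
)
conn.executemany("INSERT INTO fact_requests VALUES (?, ?, ?, ?, ?)", parse(RAW_LOGS))

# Once loaded, the warehouse side answers analytical queries as usual.
rows = conn.execute(
    "SELECT path, COUNT(*) FROM fact_requests GROUP BY path ORDER BY path"
).fetchall()
print(rows)  # [('/cart', 1), ('/products', 2)]
```

The point of the pattern is the division of labour: the scalable tooling absorbs the messy input, while the warehouse keeps serving its tested, structured data to analysts.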
Our possibilities for data processing and analysis keep growing with the spread of Big Data methods and tools. At the same time, the market needs ever more analysts who are experts in data interpretation and analytical methodology.