Data warehousing has come a long way. The evolution of technology – especially in storage and processing – drives the technical solutions to evolve as well. It is time to revisit the traditional techniques of data warehousing, starting with data modeling.

Most data warehouse implementations, in my experience, focused on quantitative aspects such as managing large volumes of data. An interviewer once almost mocked me when I said I had not worked with terabytes of data. What we missed were the qualitative aspects of data, until they became an imperative. A large portion of that concern was then spun off as Master Data Management. A colleague once told me the client needed just an EDW, and that we should not build an MDM solution. At one point in time the EDW fulfilled most of today's MDM requirements. In essence, the qualitative aspects, including the "single version of truth", have a large say in the success or failure of a data warehouse implementation.

De-normalization and degeneration slowly cause issues in data management. It could be duplication of data, increased storage, or challenges in real-time performance. Then there are non-relational data warehouses: implementations on flat files, semi-structured data, and search-based stores.
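The duplication problem can be shown in a minimal sketch. All names and data here are hypothetical: a de-normalized order table repeats a customer attribute on every row, so a partial update leaves the same customer with two conflicting values.

```python
# Hypothetical de-normalized data: the customer's city is copied
# onto every order row instead of living in one customer record.
denormalized_orders = [
    {"order_id": 1, "customer": "Acme", "customer_city": "Austin", "amount": 100},
    {"order_id": 2, "customer": "Acme", "customer_city": "Austin", "amount": 250},
]

# Updating the city now requires touching every duplicated row;
# updating only one row creates two "versions of truth".
denormalized_orders[0]["customer_city"] = "Dallas"  # partial update

cities = {row["customer_city"] for row in denormalized_orders
          if row["customer"] == "Acme"}
print(sorted(cities))  # → ['Austin', 'Dallas']: inconsistent data
```

In a normalized design the city would be stored once, so the anomaly cannot occur; the de-normalized form trades that safety for faster reads.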

Data warehousing is on the verge of evolving into another generation. Traditional data warehouses suffered from being tethered to business intelligence systems. Business intelligence is now cutting itself loose from data warehouses. Soon data warehouses will be designed from a data perspective, not from an analytics or reporting perspective.