Data Architecture Optimization describes an approach that applies Apache Hadoop and related community-driven open source technologies to build data architectures capable of delivering new and improved business outcomes while driving significant cost out of the IT budget.
The rapid growth in data volumes from a wide range of new sources offers disruptive opportunity to those who can put that data to use. IT organizations and data architects are shifting their mindset: they now look to capture all data, keep it longer, and prepare to use it in new ways as business conditions evolve. These changes put dramatic pressure on traditional data architectures, which were built to support structured data with modest growth.
Download this whitepaper to learn how Hortonworks Data Platform (HDP), built on Apache Hadoop, makes it possible to capture both structured and emerging types of data, retain it longer, and apply traditional and new analytic engines to drive business value, all in an economically feasible fashion. In particular, organizations are breathing new life into enterprise data warehouse (EDW)-centric data architectures by integrating HDP to take advantage of its capabilities and economics.