Hybrid Solutions for Data Virtualization and Enterprise Data Warehouses

Eight best practices – Part 2

The intersection of data virtualization and enterprise data warehouses represents corporate best practice for combining the rich data assets of the enterprise data warehouse with the myriad sources of data now available outside it.

In Part Two of this two-part series, I focus on improving data warehouse efficiency, with four best practices in which data virtualization, used alongside data warehouses, saves time and money.

Part One examined ways that data virtualization improves data warehouse effectiveness.

5. Data Warehouse Hub and Virtual Data Mart Spoke
In a common scenario in today's enterprises, a central data warehouse hub is surrounded by satellite data marts, like spokes around a hub. These marts typically contain a subset of the warehouse data and serve a subset of the users. They are often created because analytic tools require data in a different form than the warehouse provides, or as a way to work around warehouse controls. Regardless of its origin, each additional mart adds cost and potentially compromises data quality.

IT teams use data virtualization to create virtual data marts that reduce or eliminate physical data marts. This approach abstracts warehouse data to meet the requirements of specific consuming tools and user queries, while preserving the quality and controls of the data warehouse.
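
To make the virtual data mart concrete, here is a minimal sketch in Python, using sqlite3 as a stand-in for the warehouse and a SQL view as the virtualization layer. All table, view and column names (securities, positions, sector_exposure) are hypothetical illustrations, not any vendor's actual schema or API.

    # Virtual data mart sketch: the mart is a view definition, not a copy.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- Warehouse tables (the physical hub)
        CREATE TABLE securities (ticker TEXT PRIMARY KEY, sector TEXT);
        CREATE TABLE positions  (ticker TEXT, qty INTEGER, price REAL);
        INSERT INTO securities VALUES ('ACME', 'Industrials'), ('GLOBO', 'Tech');
        INSERT INTO positions  VALUES ('ACME', 100, 25.0), ('GLOBO', 50, 80.0);

        -- Virtual data mart: a reusable view shaped for a specific analysis
        -- tool, defined over the warehouse instead of copied out of it.
        CREATE VIEW sector_exposure AS
        SELECT s.sector, SUM(p.qty * p.price) AS market_value
        FROM positions p JOIN securities s ON p.ticker = s.ticker
        GROUP BY s.sector;
    """)

    # Analysts query the view; no satellite mart is ever materialized.
    for row in conn.execute("SELECT * FROM sector_exposure"):
        print(row)

The point is that the mart is a definition rather than a copy: analysts get a tool-friendly shape while the data stays in, and inherits the controls of, the warehouse.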

In the integration pattern shown in Figure 1, the data virtualization middleware hosts virtual data marts that logically abstract and serve specific analytical reporting requirements.

Industry Examples
A mutual fund company uses data virtualization to enable its 150-plus financial analysts to build portfolio analysis models in MATLAB and similar analysis tools, leveraging a wide range of equity data from a ten-terabyte financial research data warehouse. Prior to introducing data virtualization, the financial analysts frequently created satellite data marts for every new project. Now the IT team offers virtual data marts with a set of robust, reusable views that directly access the financial data warehouse on demand. Analysts spend more time on analysis and less on accessing data, thereby improving their portfolio returns. The IT team has also eliminated extra, unneeded marts and their maintenance and operating costs.

An energy and oil company uses data virtualization to provide oil well platform data from a central Netezza data warehouse to engineers, maintenance managers and business analysts. This data is optimally formatted for a wide range of specialized analysis tools including Business Objects, Excel, Tibco Spotfire, Matrikon ProcessNet and Microsoft Reporting. The IT team quickly builds virtual views and services, thus rapidly responding to new, ad hoc queries. Analysts now leverage the warehouse, with its virtual data marts, as the single source of truth rather than replicating the data in local, "rogue" data marts.

6. ETL Pre-Processing
To Extract, Transform and Load (ETL) tools, virtual views and data services appear as just another data source feeding their batch processes. This best practice also integrates data source types that ETL tools cannot easily access, and it reuses existing views and services, saving time and cost. Further, these abstractions mean ETL developers do not need to understand the structure of, or interact directly with, the actual data sources, significantly simplifying their work and reducing time to solution.
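
As a rough illustration of the pattern, the Python sketch below fakes two hard-to-reach sources in memory, standing in for a packaged application and a web service, and federates them behind a single function; the ETL step then consumes that function like any other source. All names are hypothetical.

    # ETL pre-processing sketch: the virtualization layer accesses,
    # abstracts and federates two sources, then looks like one flat table.
    from typing import Iterator

    # Hypothetical stand-ins for sources an ETL tool cannot easily reach.
    app_orders = [{"order_id": 1, "cust": "A-17", "amount": 120.0}]
    svc_customers = {"A-17": {"region": "EMEA"}}

    def virtual_order_view() -> Iterator[dict]:
        """Join application data with web-service data into flat rows."""
        for order in app_orders:
            enriched = dict(order)
            enriched["region"] = svc_customers[order["cust"]]["region"]
            yield enriched

    def etl_batch_load(rows: Iterator[dict]) -> None:
        """The ETL batch consumes the virtual view like any other source."""
        for row in rows:
            print("loading into warehouse:", row)

    etl_batch_load(virtual_order_view())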

In the integration pattern shown in Figure 2, the data virtualization middleware complements ETL by providing access, abstraction and federation of packaged applications and web services data sources.

Industry Example
An energy company wanted to include SAP financial data in its enterprise data warehouse along with other sources and content. However, its ETL tools alone were unable to decode the complex SAP R/3 FICO data model. The IT team used data virtualization to access the SAP R/3 FICO data, abstract it into a form more appropriate for the warehouse, and stage it virtually for the ETL tools. With more complete and timely financial data in the data warehouse, the company can now better manage its financial performance.

7. Data Warehouse Prototyping
Building data warehouses from scratch is time-consuming and complex, requiring significant design, development and deployment efforts. One of the biggest issues early in a warehouse's lifecycle is frequently changing schemas. This change process requires modification of both the ETL scripts and physical data and typically becomes a bottleneck that slows new warehouse deployments. Nor does this problem go away later in the lifecycle; it merely lessens as the pace of change slows.

In the integration pattern shown in Figures 3 and 4, the data virtualization middleware serves as the prototype development environment for a new data warehouse. In this prototype stage, a virtual data warehouse is built, saving the time required to build a physical warehouse. This virtual warehouse includes a full schema that is easy to iterate as well as a complete functional testing environment.
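
The sketch below illustrates why iteration is cheap in a virtual prototype, again using sqlite3 as a stand-in: a schema change becomes a view redefinition rather than an ETL rewrite plus a physical reload. Table and view names (orders, sales_fact) are hypothetical.

    # Warehouse prototyping sketch: the schema is virtual, so changing it
    # is a one-statement redefinition.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (id INTEGER, region TEXT, amount REAL, placed TEXT);
        INSERT INTO orders VALUES (1, 'West', 10.0, '2011-01-03');
    """)

    # Prototype warehouse schema, version 1: a virtual fact table.
    conn.execute("""CREATE VIEW sales_fact AS
                    SELECT region, amount FROM orders""")

    # The business asks for a date column: iterate the schema in place,
    # with no ETL change and no physical data migration.
    conn.execute("DROP VIEW sales_fact")
    conn.execute("""CREATE VIEW sales_fact AS
                    SELECT region, amount, placed AS order_date FROM orders""")

    print(conn.execute("SELECT * FROM sales_fact").fetchall())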

Once the actual warehouse is deployed, the views and data services built during the prototype stage retain their value and prove useful for prototyping and testing subsequent warehouse schema changes that arise as business needs or underlying data sources change.

Industry Example
By using data virtualization during the prototyping stage, a government agency has accelerated its ETL and warehouse development process fourfold. The gain has held up during subsequent translations of working views into ETL scripts and physical warehouse schemas.

8. Data Warehouse Migration
Enterprises migrate their data warehouses for a number of reasons. One is reducing total cost of ownership. Another is mergers and acquisitions, where duplicate data warehouses need to be rationalized. A third is standardization: when an enterprise or government agency runs different warehouses on disparate technology platforms, it may migrate them to a standard platform.

Regardless of the reason, in every case the reporting and analysis supported by the migrating data warehouse must continue to run seamlessly.

Data virtualization removes reporting risk by inserting a virtual reporting layer between the warehouse and the reporting systems. Decoupling these systems enables reporting to continue before, during and after the migration. The integration patterns shown in Figure 5, Figure 6 and Figure 7 depict the original state prior to migration, the modification of reports via the virtual reporting layer, and the final stage where the new warehouse and supporting ETL are brought online and the old data warehouse is retired. The virtual layer enables a controlled migration of the reporting views: each existing view can be duplicated, modified to point at the new warehouse, and tested before the actual cutover, insulating reporting users from undue risk. Further, the virtual reporting layer is easily extended to add more sources or support new reporting solutions.
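
As a minimal sketch of that controlled cutover, the Python below uses closures to stand in for reporting views and dictionaries to stand in for the two warehouses; all names are hypothetical.

    # Migration sketch: each reporting view binds to a pluggable source, so
    # it can be duplicated, repointed at the new warehouse and tested first.
    old_wh = {"revenue": [("2010", 1.0)]}                 # legacy warehouse
    new_wh = {"revenue": [("2010", 1.0), ("2011", 1.4)]}  # target appliance

    def make_revenue_view(source):
        """Reports call the view; which warehouse backs it is invisible."""
        return lambda: source["revenue"]

    report_view = make_revenue_view(old_wh)      # original state (Figure 5)

    candidate = make_revenue_view(new_wh)        # duplicated, repointed view (Figure 6)
    assert candidate()[:1] == report_view()[:1]  # test against old results before cutover

    report_view = candidate                      # cutover; old warehouse retired (Figure 7)
    print(report_view())                         # reporting runs uninterrupted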

Industry Example
To reduce total cost of ownership when migrating to a data warehouse appliance, a large technology company used data virtualization to decouple its reporting from the data warehouse. The company achieved significant data warehousing cost reductions and migrated its reporting without interruption.

Conclusion
The articles in this two-part series have examined the eight best practices for using hybrid solutions of data virtualization and enterprise data warehouses to deliver the most comprehensive information to decision-makers. The intersection of the virtual and the physical data warehouse is aiding forward-thinking enterprises that must deal with the proliferation of data sources, including many web-based and cloud computing sources outside traditional enterprise data warehouses. Enterprises and government agencies that can learn from and adapt these best practices to their own enterprise information architectures will be best prepared to handle the continuous deluge of data found in most enterprise information systems today.

About the Author

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.


