Hybrid Solutions for Data Virtualization and Enterprise Data Warehouses

Eight best practices – Part 2

The intersection of data virtualization and enterprise data warehouses represents a set of corporate best practices for combining the rich data assets in the enterprise data warehouse with the myriad data sources now available outside it.

In Part Two of this two-part series, I focus on improving data warehouse efficiency, presenting four best practices in which data virtualization, used alongside data warehouses, saves time and money.

Part One examined ways that data virtualization improves data warehouse effectiveness.

5. Data Warehouse Hub and Virtual Data Mart Spoke
In a common scenario in today's enterprises, a central data warehouse hub is surrounded by satellite data marts, like spokes radiating from a hub. These marts typically contain a subset of the warehouse data and serve a subset of the users. They are often created because analytic tools require data in a different form than the warehouse provides, or as a workaround for warehouse controls. Whatever its origin, each additional mart adds cost and potentially compromises data quality.

IT teams use data virtualization to create virtual data marts that reduce or eliminate physical data marts. This approach abstracts warehouse data to meet the requirements of specific consuming tools and user queries, while preserving the quality and controls of the data warehouse.

In the integration pattern shown in Figure 1, the data virtualization middleware hosts virtual data marts that logically abstract and serve specific analytical reporting requirements.
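To make the pattern concrete, here is a minimal sketch in Python using the standard sqlite3 module as a stand-in for the virtualization middleware; the table, view and column names are hypothetical, not drawn from the examples below. The "mart" is simply a view that reshapes warehouse data for one consuming tool, so no rows are copied.

    import sqlite3

    # Toy stand-in: in a real deployment the view below would live in the
    # data virtualization middleware, not in the warehouse itself.
    # All names here are hypothetical.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE warehouse_equity_prices (
        ticker      TEXT,
        trade_date  TEXT,
        close_price REAL,
        sector      TEXT
    );

    -- A "virtual data mart": a reusable view that reshapes warehouse data
    -- for one consuming tool, with no physical copy of the rows.
    CREATE VIEW mart_tech_monthly AS
    SELECT ticker,
           substr(trade_date, 1, 7) AS month,
           AVG(close_price)         AS avg_close
    FROM warehouse_equity_prices
    WHERE sector = 'Technology'
    GROUP BY ticker, month;
    """)

    conn.execute("INSERT INTO warehouse_equity_prices "
                 "VALUES ('ACME', '2011-03-15', 42.0, 'Technology')")
    for row in conn.execute("SELECT * FROM mart_tech_monthly"):
        print(row)  # ('ACME', '2011-03', 42.0)

Because the mart is only a definition, retiring it or adding another one costs nothing on the storage side, which is precisely the economic argument behind this practice.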

Industry Examples
A mutual fund company uses data virtualization to enable its 150-plus financial analysts to build portfolio analysis models in MATLAB and similar analysis tools, drawing on a wide range of equity data from a ten-terabyte financial research data warehouse. Before data virtualization was introduced, the analysts created new satellite data marts for nearly every project. Now the IT team offers virtual data marts: a set of robust, reusable views that access the financial data warehouse directly, on demand. Analysts spend more time on analysis and less on accessing data, thereby improving their portfolio returns. The IT team has also eliminated the extra, unneeded marts and their maintenance and operating costs.

An energy and oil company uses data virtualization to provide oil well platform data from a central Netezza data warehouse to engineers, maintenance managers and business analysts. The data is optimally formatted for a wide range of specialized analysis tools, including Business Objects, Excel, Tibco Spotfire, Matrikon ProcessNet and Microsoft Reporting. The IT team quickly builds virtual views and services, responding rapidly to new, ad hoc queries. Analysts now leverage the warehouse, with its virtual data marts, as the single source of truth rather than replicating the data in local, "rogue" data marts.

6. ETL Pre-Processing
Extract, Transform and Load (ETL) tools can consume virtual views and data services as inputs to their batch processes; to the ETL tool, the virtualization layer appears as just another data source. This best practice integrates source types that ETL tools cannot easily access, and it reuses existing views and services, saving time and cost. Further, these abstractions free ETL developers from having to understand the structure of, or interact directly with, the actual data sources, significantly simplifying their work and reducing time to solution.

In the integration pattern shown in Figure 2, the data virtualization middleware complements ETL by providing access, abstraction and federation of packaged applications and web services data sources.
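A rough sketch of the hand-off, assuming hypothetical pyodbc DSNs, view and table names: the ETL job queries the virtualization server exactly as it would any other relational source, and the federated view hides the packaged-application source behind plain SQL.

    import pyodbc

    # Hypothetical DSNs: the virtualization server is addressed like any
    # other ODBC database, which is the point of this pattern.
    src = pyodbc.connect("DSN=DataVirtualizationServer")
    dst = pyodbc.connect("DSN=EnterpriseWarehouse")

    extract = src.cursor()
    extract.execute(
        "SELECT account_id, fiscal_period, amount FROM vw_sap_fico_gl")

    load = dst.cursor()
    while True:
        batch = extract.fetchmany(5000)  # stream in batches, not all at once
        if not batch:
            break
        load.executemany(
            "INSERT INTO stg_gl_postings (account_id, fiscal_period, amount) "
            "VALUES (?, ?, ?)",
            batch,
        )
    dst.commit()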

Industry Example
An energy company wanted to include SAP financial data in its enterprise data warehouse along with other sources and content. However, its ETL tools alone were unable to decode the SAP R/3 FICO complex data model. The IT team used data virtualization to access the SAP R/3 FICO data, abstract it into a form more appropriate for the warehouse, and stage it virtually for the ETL tools. With more complete and timely financial data in the data warehouse, the company can now better manage its financial performance.

7. Data Warehouse Prototyping
Building data warehouses from scratch is time-consuming and complex, requiring significant design, development and deployment efforts. One of the biggest issues early in a warehouse's lifecycle is frequently changing schemas. This change process requires modification of both the ETL scripts and physical data and typically becomes a bottleneck that slows new warehouse deployments. Nor does this problem go away later in the lifecycle; it merely lessens as the pace of change slows.

In the integration pattern shown in Figures 3 and 4, the data virtualization middleware serves as the prototype development environment for a new data warehouse. In this prototype stage, a virtual data warehouse is built, saving the time required to build a physical warehouse. This virtual warehouse includes a full schema that is easy to iterate as well as a complete functional testing environment.

Once the actual warehouse is deployed, the views and data services built during the prototype stage retain their value and prove useful for prototyping and testing subsequent warehouse schema changes that arise as business needs or underlying data sources change.
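The following toy sketch, again with sqlite3 standing in for the virtualization layer and all names hypothetical, shows why the prototype iterates so quickly: a schema change is a view redefinition rather than an ETL rewrite and physical reload.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE src_orders (order_id INTEGER, region TEXT, amount REAL)")

    def redefine_fact_view(conn, select_sql):
        """Drop and recreate the prototype fact 'table' as a view."""
        conn.execute("DROP VIEW IF EXISTS fact_sales")
        conn.execute("CREATE VIEW fact_sales AS " + select_sql)

    # Iteration 1 of the prototype schema.
    redefine_fact_view(conn, "SELECT order_id, amount FROM src_orders")

    # Business users ask for region: iteration 2 is a one-line change,
    # with no physical data to reload and no ETL script to modify.
    redefine_fact_view(conn, "SELECT order_id, region, amount FROM src_orders")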

Industry Example
By using data virtualization during the prototyping stage, a government agency has sped up its ETL and warehouse development process fourfold. The gain has held as working views are subsequently translated into ETL scripts and physical warehouse schemas.

8. Data Warehouse Migration
Enterprises migrate their data warehouses for a number of reasons. One is reducing total cost of ownership. Another is mergers and acquisitions, where duplicate data warehouses must be rationalized. A third is standardization: when an enterprise or government agency runs warehouses on disparate technology platforms, it may migrate them to a single standard platform.

Regardless of the reason, in every case the reporting and analysis supported by the migrating data warehouse must continue to run seamlessly.

Data virtualization removes reporting risk by inserting a virtual reporting layer between the warehouse and the reporting systems. Decoupling these systems enables the reporting to continue before, during and after the migration. The integration patterns shown in Figure 5, Figure 6, and Figure 7 depict the original state prior to migration, the modification to reports via the virtualization reporting layer, and the stage where the new warehouse and supporting ETL are brought online and the old data warehouse is retired. The virtual layer insulates reporting users by enabling a controlled migration of the reporting views. Each existing view can be duplicated, modified to point at the new warehouse, and tested before the actual cutover, thereby insulating the reporting users from undue risk. Further, the virtual reporting layer is easily extensible for adding more sources or supporting new reporting solutions.
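As a hedged illustration of that controlled cutover (the DSNs, view name and validation query are all hypothetical), each migrated view can be checked by running the same query against both warehouses and comparing results before any users are repointed:

    import pyodbc

    # Hypothetical connections to the old and new platforms.
    old = pyodbc.connect("DSN=LegacyWarehouse")
    new = pyodbc.connect("DSN=WarehouseAppliance")

    VALIDATION_SQL = "SELECT COUNT(*), SUM(amount) FROM vw_revenue_by_region"

    def snapshot(conn):
        # One coarse check; a real migration would compare more measures.
        cur = conn.cursor()
        cur.execute(VALIDATION_SQL)
        return tuple(cur.fetchone())

    if snapshot(old) == snapshot(new):
        print("Results match: safe to repoint the view at the appliance")
    else:
        print("Mismatch: keep reports on the legacy warehouse and investigate")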

Industry Example
To reduce total cost of ownership, a large technology company used data virtualization to decouple its reporting from the data warehouse while migrating to a data warehouse appliance. The company achieved significant data warehousing cost reductions, and reporting migrated successfully without interruption.

Conclusion
The articles in this two-part series have examined the eight best practices for using hybrid solutions of data virtualization and enterprise data warehouses to deliver the most comprehensive information to decision-makers. The intersection of the virtual and the physical data warehouse is aiding forward-thinking enterprises that must deal with the proliferation of data sources, including many web-based and cloud computing sources outside traditional enterprise data warehouses. Enterprises and government agencies that can learn from and adapt these best practices to their own enterprise information architectures will be best prepared to handle the continuous deluge of data found in most enterprise information systems today.

More Stories By Robert Eve

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.
