
Hybrid Solutions for Data Virtualization and Enterprise Data Warehouses

Eight best practices – Part 2

Combining data virtualization with enterprise data warehouses has become a corporate best practice for delivering the rich data assets in the warehouse alongside the myriad sources of data that now live outside it.

In Part Two of this two-part series, I focus on improving data warehouse efficiency, covering four best practices where data virtualization, used alongside data warehouses, saves time and money.

Part One examined ways that data virtualization improves data warehouse effectiveness.

5. Data Warehouse Hub and Virtual Data Mart Spoke
In a common scenario in today's enterprises, satellite data marts surround a central data warehouse hub like spokes around a hub. These marts typically contain a subset of the warehouse data and serve a subset of its users. They are often created because analytic tools require data in a different form than the warehouse provides, or to work around warehouse controls. Regardless of its origin, each additional mart adds cost and potentially compromises data quality.

IT teams use data virtualization to create virtual data marts to eliminate and/or reduce physical data marts. This approach abstracts warehouse data to meet specific consuming tool and user query requirements, while preserving the quality and controls of the data warehouse.

In the integration pattern shown in Figure 1, the data virtualization middleware hosts virtual data marts that logically abstract and serve specific analytical reporting requirements.
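The virtual data mart pattern can be illustrated with a minimal sketch. Here, SQLite stands in for both the warehouse and the virtualization layer, and the table, view, and column names are hypothetical; production deployments would use data virtualization middleware over the actual warehouse.

```python
import sqlite3

# An in-memory database stands in for the central warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE fact_holdings (
        fund_id INTEGER, ticker TEXT, shares REAL, market_value REAL
    );
    INSERT INTO fact_holdings VALUES
        (1, 'AAPL', 100, 19000.0),
        (1, 'MSFT', 50, 21000.0),
        (2, 'AAPL', 10, 1900.0);

    -- The "virtual data mart": a view that abstracts the warehouse table
    -- into the shape one analyst team needs, with no data copied.
    CREATE VIEW vm_fund_exposure AS
        SELECT fund_id, ticker, SUM(market_value) AS exposure
        FROM fact_holdings
        GROUP BY fund_id, ticker;
""")

# Analysts query the view exactly as they would a physical mart.
rows = conn.execute(
    "SELECT ticker, exposure FROM vm_fund_exposure WHERE fund_id = 1 ORDER BY ticker"
).fetchall()
print(rows)  # [('AAPL', 19000.0), ('MSFT', 21000.0)]
```

Because the view is defined over the warehouse itself, it inherits the warehouse's quality and access controls, while each consuming tool gets the shape of data it expects.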

Industry Examples
A mutual fund company uses data virtualization to enable its 150-plus financial analysts to build portfolio analysis models leveraging a wide range of equity financial data from a ten-terabyte financial research data warehouse with MATLAB and similar analysis tools. Prior to introducing data virtualization, the financial analysts frequently created satellite data marts for every new project. Now the IT team offers virtual data marts with a set of robust, reusable views that directly access the financial data warehouse on demand. Analysts now spend more time on analysis and less on accessing data, thereby improving their portfolio returns. The IT team has also eliminated extra, unneeded marts and their maintenance/operating costs.

An energy and oil company uses data virtualization to provide oil well platform data from a central Netezza data warehouse to engineers, maintenance managers and business analysts. This data is optimally formatted for a wide range of specialized analysis tools including Business Objects, Excel, Tibco Spotfire, Matrikon ProcessNet and Microsoft Reporting. The IT team quickly builds virtual views and services, thus rapidly responding to new, ad hoc queries. Analysts now leverage the warehouse with its virtual data marts as the single source of truth rather than replicating the data in local, "rogue" data marts.

6. ETL Pre-Processing
To Extract, Transform and Load (ETL) tools, virtual views and data services appear as just another data source feeding their batch processes. This best practice also integrates data source types that ETL tools cannot easily access, and reuses existing views and services, saving time and cost. Further, these abstractions free ETL developers from having to understand the structure of, or interact directly with, the actual data sources, significantly simplifying their work and reducing time to solution.

In the integration pattern shown in Figure 2, the data virtualization middleware complements ETL by providing access, abstraction and federation of packaged applications and web services data sources.
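The staging idea can be sketched as follows. SQLite again stands in for both the hard-to-decode source and the virtualization layer; the SAP-style column names are illustrative only, not an actual FICO schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- A source the ETL tool cannot easily decode (hypothetical SAP-like codes).
    CREATE TABLE src_fico (belnr TEXT, hkont TEXT, dmbtr REAL);
    INSERT INTO src_fico VALUES ('0001', '400000', 250.0), ('0002', '400000', 100.0);

    -- The virtualization layer abstracts cryptic fields into business terms;
    -- to the ETL batch this view looks like just another relational source.
    CREATE VIEW stg_financials AS
        SELECT belnr AS document_no, hkont AS gl_account, dmbtr AS amount
        FROM src_fico;

    CREATE TABLE dw_financials (document_no TEXT, gl_account TEXT, amount REAL);
""")

# The "ETL" step: extract from the staged view and load into the warehouse.
conn.execute("INSERT INTO dw_financials SELECT * FROM stg_financials")
total = conn.execute("SELECT SUM(amount) FROM dw_financials").fetchone()[0]
print(total)  # 350.0
```

The ETL developer writes a plain SELECT against the staged view and never touches the source's internal encoding.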

Industry Example
An energy company wanted to include SAP financial data in its enterprise data warehouse along with other sources and content. However, its ETL tools alone were unable to decode the SAP R/3 FICO complex data model. The IT team used data virtualization to access the SAP R/3 FICO data, abstract it into a form more appropriate for the warehouse, and stage it virtually for the ETL tools. With more complete and timely financial data in the data warehouse, the company can now better manage its financial performance.

7. Data Warehouse Prototyping
Building data warehouses from scratch is time-consuming and complex, requiring significant design, development and deployment efforts. One of the biggest issues early in a warehouse's lifecycle is frequently changing schemas. This change process requires modification of both the ETL scripts and physical data and typically becomes a bottleneck that slows new warehouse deployments. Nor does this problem go away later in the lifecycle; it merely lessens as the pace of change slows.

In the integration pattern shown in Figures 3 and 4, the data virtualization middleware serves as the prototype development environment for a new data warehouse. In this prototype stage, a virtual data warehouse is built, saving the time required to build a physical warehouse. This virtual warehouse includes a full schema that is easy to iterate as well as a complete functional testing environment.

Once the actual warehouse is deployed, the views and data services built during the prototype stage retain their value and prove useful for prototyping and testing subsequent warehouse schema changes that arise as business needs or underlying data sources change.
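The ease of iterating a virtual schema can be shown with a small sketch, again using SQLite views as a stand-in for the virtualization layer; the source and prototype names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO src_orders VALUES (1, 'WEST', 10.0), (2, 'EAST', 20.0);

    -- Prototype warehouse schema, iteration 1: a simple virtual fact table.
    CREATE VIEW proto_fact_sales AS
        SELECT order_id, amount FROM src_orders;
""")

# Business users ask for region: iterate the schema by redefining the view.
# No ETL scripts or physical tables need to change at this stage.
conn.executescript("""
    DROP VIEW proto_fact_sales;
    CREATE VIEW proto_fact_sales AS
        SELECT order_id, region, amount FROM src_orders;
""")

cols = [d[0] for d in conn.execute("SELECT * FROM proto_fact_sales").description]
print(cols)  # ['order_id', 'region', 'amount']
```

A schema change that would require rewriting ETL scripts and migrating physical tables reduces, at the prototype stage, to redefining a view.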

Industry Example
By using data virtualization during the prototyping stage, a government agency accelerated its ETL and warehouse development process fourfold. The gains carried through the subsequent translation of working views into ETL scripts and physical warehouse schemas.

8. Data Warehouse Migration
Enterprises migrate their data warehouses for a number of reasons. One is reducing total cost of ownership. Another is mergers and acquisitions, where duplicate data warehouses need to be rationalized. A third is standardization: when an enterprise or government agency runs warehouses on disparate technology platforms, it may migrate them to a standard platform.

Regardless of the reason, in every case the reporting and analysis supported by the migrating data warehouse must continue to run seamlessly.

Data virtualization removes reporting risk by inserting a virtual reporting layer between the warehouse and the reporting systems. Decoupling these systems enables reporting to continue before, during and after the migration. The integration patterns shown in Figure 5, Figure 6, and Figure 7 depict the original state prior to migration, the modification of reports to use the virtual reporting layer, and the stage where the new warehouse and supporting ETL are brought online and the old data warehouse is retired.

The virtual layer insulates reporting users by enabling a controlled migration of the reporting views. Each existing view can be duplicated, modified to point at the new warehouse, and tested before the actual cutover, shielding reporting users from undue risk. Further, the virtual reporting layer is easily extended to add more sources or support new reporting solutions.
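The cutover mechanism can be sketched in a few lines. SQLite stands in for the virtualization layer, and the old/new warehouse tables and their differing column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE old_dw_sales (region TEXT, amount REAL);
    INSERT INTO old_dw_sales VALUES ('WEST', 100.0);

    -- The new warehouse (e.g. an appliance) with a slightly different layout.
    CREATE TABLE new_dw_sales (sales_region TEXT, sales_amount REAL);
    INSERT INTO new_dw_sales VALUES ('WEST', 100.0);

    -- Virtual reporting layer: reports query this view, never the warehouse.
    CREATE VIEW rpt_sales AS SELECT region, amount FROM old_dw_sales;
""")

def report_total():
    # Reporting code never changes: it only knows the view.
    return conn.execute("SELECT SUM(amount) FROM rpt_sales").fetchone()[0]

before = report_total()

# Cutover: repoint the view at the new warehouse, mapping its column names.
conn.executescript("""
    DROP VIEW rpt_sales;
    CREATE VIEW rpt_sales AS
        SELECT sales_region AS region, sales_amount AS amount FROM new_dw_sales;
""")

after = report_total()
print(before, after)  # 100.0 100.0
```

The report sees identical results before and after the cutover, which is exactly the seamless continuity the migration pattern requires.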

Industry Example
To reduce total cost of ownership when migrating to a data warehouse appliance, a large technology company used data virtualization to decouple its reporting from the data warehouse. The company achieved significant data warehousing cost reductions and migrated its reporting without interruption.

Conclusion
The articles in this two-part series have examined the eight best practices for using hybrid solutions of data virtualization and enterprise data warehouses to deliver the most comprehensive information to decision-makers. The intersection of the virtual and the physical data warehouse is aiding forward-thinking enterprises that must deal with the proliferation of data sources, including many web-based and cloud computing sources outside traditional enterprise data warehouses. Enterprises and government agencies that can learn from and adapt these best practices to their own enterprise information architectures will be best prepared to handle the continuous deluge of data found in most enterprise information systems today.

More Stories By Robert Eve

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.

