Hybrid Solutions for Data Virtualization and Enterprise Data Warehouses

Eight best practices – Part 2

The intersection of data virtualization and enterprise data warehouses represents a set of corporate best practices for combining the rich data assets available in the enterprise data warehouse with the myriad sources of data now available outside it.

In Part Two of this two-part series, I focus on improving data warehouse efficiency, presenting four best practices where data virtualization, used alongside data warehouses, saves time and money.

Part One examined ways that data virtualization improves data warehouse effectiveness.

5. Data Warehouse Hub and Virtual Data Mart Spoke
In a common scenario in today's enterprises, a central data warehouse hub is surrounded by satellite data marts, like spokes around a hub. These marts typically contain a subset of the warehouse data and serve a subset of the users. They are often created because analytic tools require data in a different form than the warehouse provides; in other cases, they are created to work around warehouse controls. Regardless of their origin, each additional mart adds cost and potentially compromises data quality.

IT teams use data virtualization to create virtual data marts that reduce or eliminate physical ones. This approach abstracts warehouse data to meet the requirements of specific consuming tools and user queries, while preserving the quality and controls of the data warehouse.

In the integration pattern shown in Figure 1, the data virtualization middleware hosts virtual data marts that logically abstract and serve specific analytical reporting requirements.
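To make the pattern concrete, here is a minimal sketch in Python, using an in-memory SQLite database to stand in for the data virtualization layer; the table, view, and column names are invented for illustration. A virtual data mart is essentially a view that reshapes warehouse data into the form a specific consuming tool expects, without creating a physical copy.

import sqlite3

# SQLite stands in for the virtualization layer; in a real deployment the
# view would live in the virtualization middleware and query the warehouse
# on demand. All names below are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Warehouse fact table
    CREATE TABLE fact_positions (
        portfolio_id INTEGER,
        ticker       TEXT,
        trade_date   TEXT,
        quantity     REAL,
        close_price  REAL
    );

    -- Virtual data mart: a view that reshapes warehouse data into the
    -- form a specific analysis tool expects, with no physical copy
    CREATE VIEW vmart_portfolio_value AS
    SELECT portfolio_id,
           trade_date,
           SUM(quantity * close_price) AS market_value
    FROM fact_positions
    GROUP BY portfolio_id, trade_date;
""")

conn.execute("INSERT INTO fact_positions VALUES (1, 'ACME', '2011-06-30', 100, 42.5)")
print(list(conn.execute("SELECT * FROM vmart_portfolio_value")))
# [(1, '2011-06-30', 4250.0)]

Because the mart is only a view, retiring it or reshaping it for a different tool costs a single statement rather than a new physical data store and its ongoing maintenance.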

Industry Examples
A mutual fund company uses data virtualization to enable its 150-plus financial analysts to build portfolio analysis models leveraging a wide range of equity financial data from a ten-terabyte financial research data warehouse with MATLAB and similar analysis tools. Prior to introducing data virtualization, the financial analysts created a new satellite data mart for nearly every project. Now the IT team offers virtual data marts with a set of robust, reusable views that directly access the financial data warehouse on demand. Analysts now spend more time on analysis and less on accessing data, thereby improving their portfolio returns. The IT team has also eliminated extra, unneeded marts and their maintenance and operating costs.

An energy and oil company uses data virtualization to provide oil well platform data from a central Netezza data warehouse to engineers, maintenance managers and business analysts. This data is optimally formatted for a wide range of specialized analysis tools including Business Objects, Excel, Tibco Spotfire, Matrikon ProcessNet and Microsoft Reporting. The IT team quickly builds virtual views and services, thus rapidly responding to new, ad hoc queries. Analysts now leverage the warehouse with its virtual data marts as the single source of truth rather than replicating the data in local, "rogue" data marts.

6. ETL Pre-Processing
To Extract, Transform and Load (ETL) tools, virtual views and data services appear as just another data source, so they can feed ETL batch processes directly. This best practice also integrates data source types that ETL tools cannot easily access, and it reuses existing views and services, saving time and cost. Further, these abstractions free ETL developers from having to understand the structure of, or interact directly with, the actual data sources, significantly simplifying their work and reducing time to solution.

In the integration pattern shown in Figure 2, the data virtualization middleware complements ETL by providing access, abstraction and federation of packaged applications and web services data sources.
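As a rough sketch of this pattern, the Python below again uses SQLite in place of the virtualization layer, with invented table and view names. The point is that the ETL batch reads one pre-shaped, federated view rather than touching the underlying packaged-application and web-service sources directly.

import sqlite3

# SQLite stands in for the virtualization layer; the two tables stand in
# for a packaged-application source and a web-service source. All names
# are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE app_fin_docs (doc_id INTEGER, company TEXT, amount REAL);
    CREATE TABLE ws_fx_rates (company TEXT, usd_rate REAL);

    -- Federated, pre-shaped view that the ETL batch reads as a plain source
    CREATE VIEW v_etl_financials AS
    SELECT d.doc_id, d.company, d.amount * r.usd_rate AS amount_usd
    FROM app_fin_docs d
    JOIN ws_fx_rates r ON r.company = d.company;

    INSERT INTO app_fin_docs VALUES (1, 'EU01', 1000.0);
    INSERT INTO ws_fx_rates VALUES ('EU01', 1.3);
""")

# The 'extract' step of the ETL batch: one query against the virtual view
staged = list(conn.execute("SELECT * FROM v_etl_financials"))
print(staged)  # [(1, 'EU01', 1300.0)]

The ETL developer never sees the source schemas; access, abstraction and federation all happen inside the view.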

Industry Example
An energy company wanted to include SAP financial data in its enterprise data warehouse along with other sources and content. However, its ETL tools alone were unable to decode the complex SAP R/3 FICO data model. The IT team used data virtualization to access the SAP R/3 FICO data, abstract it into a form more appropriate for the warehouse, and stage it virtually for the ETL tools. With more complete and timely financial data in the data warehouse, the company can now better manage its financial performance.

7. Data Warehouse Prototyping
Building data warehouses from scratch is time-consuming and complex, requiring significant design, development and deployment efforts. One of the biggest issues early in a warehouse's lifecycle is frequently changing schemas. This change process requires modification of both the ETL scripts and physical data and typically becomes a bottleneck that slows new warehouse deployments. Nor does this problem go away later in the lifecycle; it merely lessens as the pace of change slows.

In the integration pattern shown in Figures 3 and 4, the data virtualization middleware serves as the prototype development environment for a new data warehouse. In this prototype stage, a virtual data warehouse is built, saving the time required to build a physical warehouse. This virtual warehouse includes a full schema that is easy to iterate as well as a complete functional testing environment.

Once the actual warehouse is deployed, the views and data services built during the prototype stage retain their value and prove useful for prototyping and testing subsequent warehouse schema changes that arise as business needs or underlying data sources change.
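The sketch below shows why iteration is cheap in the virtual prototype, again using SQLite in place of the virtualization layer with invented names: revising the prototype schema is a one-statement view change, with no ETL rewrite and no physical reload.

import sqlite3

# SQLite stands in for the virtualization layer; names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (order_id INTEGER, region TEXT, amount REAL)")
conn.execute("INSERT INTO src_orders VALUES (1, 'EMEA', 200.0)")

# Iteration 1 of the prototype warehouse schema: a plain virtual fact view.
conn.execute("""CREATE VIEW vdw_fact_orders AS
                SELECT order_id, region, amount FROM src_orders""")

# Business feedback arrives: iteration 2 adds a derived column. Revising
# the schema is a one-statement view change; no ETL rewrite, no reload.
conn.execute("DROP VIEW vdw_fact_orders")
conn.execute("""CREATE VIEW vdw_fact_orders AS
                SELECT order_id, region, amount,
                       amount * 0.1 AS est_margin  -- invented business rule
                FROM src_orders""")

print(list(conn.execute("SELECT * FROM vdw_fact_orders")))
# [(1, 'EMEA', 200.0, 20.0)]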

Industry Example
By using data virtualization during the prototyping stage, a government agency sped up its ETL and warehouse development process fourfold. The gains held through the subsequent translation of working views into ETL scripts and physical warehouse schemas.

8. Data Warehouse Migration
Enterprises migrate their data warehouses for a number of reasons. One is reducing costs and total cost of ownership. Another is mergers and acquisitions, where duplicate data warehouses must be rationalized. A third is standardization: when an enterprise or government agency wants to rationalize warehouses built on disparate technology platforms, it may migrate to a standard platform.

Regardless of the reason, in every case the reporting and analysis supported by the migrating data warehouse must continue to run seamlessly.

Data virtualization removes reporting risk by inserting a virtual reporting layer between the warehouse and the reporting systems. Decoupling these systems enables reporting to continue before, during and after the migration. The integration patterns shown in Figure 5, Figure 6, and Figure 7 depict the original state prior to migration, the modification of reports via the virtual reporting layer, and the stage where the new warehouse and supporting ETL are brought online and the old data warehouse is retired. The virtual layer insulates reporting users by enabling a controlled migration of the reporting views: each existing view can be duplicated, modified to point at the new warehouse, and tested before the actual cutover. Further, the virtual reporting layer is easily extended to add more sources or support new reporting solutions.
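Here is a minimal sketch of the cutover mechanics, with one SQLite database standing in for the virtual reporting layer and both warehouses (all names invented). The duplicate view is tested against the original, then the reporting view is repointed while the reports' SQL stays untouched.

import sqlite3

# One SQLite database stands in for the virtual reporting layer plus the
# old and new warehouses; all names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE old_dw_sales (day TEXT, revenue REAL);
    CREATE TABLE new_dw_sales (day TEXT, revenue REAL);
    INSERT INTO old_dw_sales VALUES ('2011-06-30', 500.0);
    INSERT INTO new_dw_sales VALUES ('2011-06-30', 500.0);

    -- Reports query v_sales; before migration it points at the old warehouse
    CREATE VIEW v_sales AS SELECT day, revenue FROM old_dw_sales;

    -- Duplicate view pointed at the new warehouse for pre-cutover testing
    CREATE VIEW v_sales_new AS SELECT day, revenue FROM new_dw_sales;
""")

# Pre-cutover test: both views must agree
assert list(conn.execute("SELECT * FROM v_sales")) == \
       list(conn.execute("SELECT * FROM v_sales_new"))

# Cutover: repoint the reporting view; the reports' SQL is untouched
conn.executescript("""
    DROP VIEW v_sales;
    CREATE VIEW v_sales AS SELECT day, revenue FROM new_dw_sales;
""")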

Industry Example
To reduce total cost of ownership when migrating to a data warehouse appliance, a large technology company used data virtualization to decouple its reporting from the data warehouse. The company achieved significant data warehousing cost reductions, and reporting migrated successfully without interruption.

Conclusion
The articles in this two-part series have examined eight best practices for using hybrid solutions of data virtualization and enterprise data warehouses to deliver the most comprehensive information to decision-makers. The intersection of the virtual and the physical data warehouse is aiding forward-thinking enterprises that must deal with the proliferation of data sources, including many web-based and cloud computing sources outside traditional enterprise data warehouses. Enterprises and government agencies that learn from these best practices and adapt them to their own information architectures will be best prepared to handle the continuous deluge of data found in most enterprise information systems today.

More Stories By Robert Eve

Robert "Bob" Eve is vice president of marketing at Composite Software. Prior to joining Composite, he held executive-level marketing and business development roles at several other enterprise software companies. At Informatica and Mercury Interactive, he helped penetrate new segments in his role as the vice president of Market Development. Bob ran Marketing and Alliances at Kintana (acquired by Mercury Interactive in 2003) where he defined the IT Governance category. As vice president of Alliances at PeopleSoft, Bob was responsible for more than 300 partners and 100 staff members. Bob has an MS in management from MIT and a BS in business administration with honors from University of California, Berkeley. He is a frequent contributor to publications including SYS-CON's SOA World Magazine and Virtualization Journal.
