Data warehouse
In computing, a data warehouse (DW) is a database used for reporting and analysis. The data stored in the warehouse is uploaded from the operational systems. The data may pass through an operational data store for additional operations before it is used in the DW for reporting.

A data warehouse maintains its functions in three layers: staging, integration, and access. The staging layer stores raw data extracted from the source systems for use by developers. The integration layer integrates the data and provides a level of abstraction from users. The access layer gets data out to the users.
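To make the three layers concrete, the following is a minimal sketch in Python. The data, field names, and functions are illustrative assumptions for this example only, not part of any standard warehouse tooling; a real warehouse would implement these steps with ETL software and SQL rather than in-memory structures.

# A minimal, illustrative sketch of the three warehouse layers.
# All table contents and names here are hypothetical.

raw_orders = [  # staging layer: raw data copied as-is from a source system
    {"order_id": 1, "cust": "ACME ", "amount": "100.50", "ts": "2011-03-01"},
    {"order_id": 2, "cust": "acme",  "amount": "75.00",  "ts": "2011-03-02"},
]

def integrate(rows):
    """Integration layer: clean and standardize the staged rows."""
    return [
        {
            "order_id": r["order_id"],
            "customer": r["cust"].strip().upper(),  # consistent codes and descriptions
            "amount": float(r["amount"]),           # consistent types
            "order_date": r["ts"],
        }
        for r in rows
    ]

def access_layer(rows):
    """Access layer: expose a reporting-friendly view to end users."""
    total_by_customer = {}
    for r in rows:
        total_by_customer[r["customer"]] = total_by_customer.get(r["customer"], 0.0) + r["amount"]
    return total_by_customer

print(access_layer(integrate(raw_orders)))  # {'ACME': 175.5}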

Data warehouses can be subdivided into data marts. Data marts store subsets of data from a warehouse.

This definition of the data warehouse focuses on data storage. The main source of the data is cleaned, transformed, catalogued and made available for use by managers and other business professionals for data mining, online analytical processing (OLAP), market research and decision support (Marakas & O'Brien 2009). However, the means to retrieve and analyze data, to extract, transform and load data, and to manage the data dictionary are also considered essential components of a data warehousing system. Many references to data warehousing use this broader context. Thus, an expanded definition for data warehousing includes business intelligence tools, tools to extract, transform and load data into the repository, and tools to manage and retrieve metadata.

Benefits of a data warehouse

A data warehouse maintains a copy of information from the source transaction systems. This architectural complexity provides the opportunity to:
  • Maintain data history, even if the source transaction systems do not.
  • Integrate data from multiple source systems, enabling a central view across the enterprise. This benefit is always valuable, but particularly so when the organization has grown by merger.
  • Improve data quality, by providing consistent codes and descriptions, and by flagging or even fixing bad data.
  • Present the organization's information consistently.
  • Provide a single common data model for all data of interest regardless of the data's source.
  • Restructure the data so that it makes sense to the business users.
  • Restructure the data so that it delivers excellent query performance, even for complex analytic queries, without impacting the operational systems.
  • Add value to operational business applications, notably customer relationship management (CRM) systems.

History

The concept of data warehousing dates back to the late 1980s, when IBM researchers Barry Devlin and Paul Murphy developed the "business data warehouse". In essence, the data warehousing concept was intended to provide an architectural model for the flow of data from operational systems to decision support environments. The concept attempted to address the various problems associated with this flow, mainly the high costs associated with it. In the absence of a data warehousing architecture, an enormous amount of redundancy was required to support multiple decision support environments. In larger corporations it was typical for multiple decision support environments to operate independently. Though each environment served different users, they often required much of the same stored data. The process of gathering, cleaning and integrating data from various sources, usually from long-standing operational systems (usually referred to as legacy systems), was typically in part replicated for each environment. Moreover, the operational systems were frequently reexamined as new decision support requirements emerged. Often new requirements necessitated gathering, cleaning and integrating new data from "data marts" that were tailored for ready access by users.

Key developments in early years of data warehousing were:
  • 1960s — General Mills and Dartmouth College, in a joint research project, develop the terms dimensions and facts.
  • 1970s — ACNielsen and IRI provide dimensional data marts for retail sales.
  • 1970s — Bill Inmon begins to define and discuss the term Data Warehouse.
  • 1975 — Sperry Univac introduces MAPPER (MAintain, Prepare, and Produce Executive Reports), a database management and reporting system that includes the world's first 4GL. It was the first platform specifically designed for building information centers (a forerunner of contemporary enterprise data warehousing platforms).
  • 1983 — Teradata introduces a database management system specifically designed for decision support.
  • 1983 — At Sperry Corporation, Martyn Richard Jones defines the Sperry Information Center approach, which, whilst not a true DW in the Inmon sense, did contain many of the characteristics of DW structures and processes as defined previously by Inmon, and later by Devlin. It was first used at the TSB England & Wales.
  • 1984 — Metaphor Computer Systems, founded by David Liddle and Don Massaro, releases Data Interpretation System (DIS). DIS was a hardware/software package and GUI for business users to create a database management and analytic system.
  • 1988 — Barry Devlin and Paul Murphy publish the article An architecture for a business and information system in IBM Systems Journal where they introduce the term "business data warehouse".
  • 1990 — Red Brick Systems, founded by Ralph Kimball, introduces Red Brick Warehouse, a database management system specifically for data warehousing.
  • 1991 — Prism Solutions, founded by Bill Inmon, introduces Prism Warehouse Manager, software for developing a data warehouse.
  • 1992 — Bill Inmon publishes the book Building the Data Warehouse.
  • 1995 — The Data Warehousing Institute, a for-profit organization that promotes data warehousing, is founded.
  • 1996 — Ralph Kimball publishes the book The Data Warehouse Toolkit.
  • 2000 — Daniel Linstedt releases the Data Vault, enabling real-time, auditable data warehouses.

Normalized versus dimensional approach for storage of data

There are two leading approaches to storing data in a data warehouse — the dimensional approach and the normalized approach.
Supporters of the dimensional approach, referred to as "Kimballites", follow Ralph Kimball's view that the data warehouse should be modeled using a dimensional model/star schema. Supporters of the normalized approach, also called the 3NF model and referred to as "Inmonites", follow Bill Inmon's view that the data warehouse should be modeled using an E-R model/normalized model.

In a dimensional approach, transaction data are partitioned into either "facts", which are generally numeric transaction data, or "dimensions", which are the reference information that gives context to the facts. For example, a sales transaction can be broken up into facts such as the number of products ordered and the price paid for the products, and into dimensions such as order date, customer name, product number, order ship-to and bill-to locations, and salesperson responsible for receiving the order.
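This example can be sketched as a tiny star schema. The following Python snippet is illustrative only; the table and column names are assumptions made for this example, not taken from any particular product.

# A minimal, illustrative star-schema sketch (all names are hypothetical).
dim_date     = {1: {"date": "2011-03-01", "month": "2011-03"}}
dim_customer = {10: {"name": "ACME Corp", "city": "Boston"}}
dim_product  = {100: {"product": "Widget", "category": "Hardware"}}

# The fact table holds numeric measures plus foreign keys into the dimensions.
fact_sales = [
    {"date_key": 1, "customer_key": 10, "product_key": 100, "quantity": 3, "amount": 100.50},
]

# A typical analytic query: revenue by product category and month.
revenue = {}
for row in fact_sales:
    key = (dim_product[row["product_key"]]["category"], dim_date[row["date_key"]]["month"])
    revenue[key] = revenue.get(key, 0.0) + row["amount"]

print(revenue)  # {('Hardware', '2011-03'): 100.5}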

A key advantage of a dimensional approach is that the data warehouse is easier for users to understand and to use. Also, retrieval of data from the data warehouse tends to be fast. Dimensional structures are easy for business users to understand, because the structure is divided into measurements/facts and context/dimensions. Facts are related to the organization's business processes and operational systems, whereas the dimensions surrounding them contain context about the measurement (Kimball, Ralph 2008).

The main disadvantages of the dimensional approach are:
  1. In order to maintain the integrity of facts and dimensions, loading the data warehouse with data from different operational systems is complicated, and
  2. It is difficult to modify the data warehouse structure if the organization adopting the dimensional approach changes the way in which it does business.


In the normalized approach, the data in the data warehouse are stored following, to a degree, database normalization rules. Tables are grouped together by subject areas that reflect general data categories (e.g., data on customers, products, finance, etc.). The normalized structure divides data into entities, which creates several tables in a relational database. When applied in large enterprises the result is dozens of tables that are linked together by a web of joins. Furthermore, each of the created entities is converted into a separate physical table when the database is implemented (Kimball, Ralph 2008).
The main advantage of this approach is that it is straightforward to add information into the database. A disadvantage of this approach is that, because of the number of tables involved, it can be difficult for users to:
  1. join data from different sources into meaningful information, and
  2. access the information without a precise understanding of the sources of data and of the data structure of the data warehouse.
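For contrast, the following sketch stores the same illustrative sales data in a (partially) normalized form; note how answering the same question now requires a chain of joins across several tables. All table and column names are hypothetical.

# A minimal sketch of the same sales data in (partially) normalized form.
customers   = {10: {"name": "ACME Corp", "city_id": 7}}
cities      = {7: {"city": "Boston", "state": "MA"}}
orders      = {1000: {"customer_id": 10, "order_date": "2011-03-01"}}
order_lines = [{"order_id": 1000, "product_id": 100, "quantity": 3, "amount": 100.50}]
products    = {100: {"product": "Widget", "category_id": 4}}
categories  = {4: {"category": "Hardware"}}

# Answering "revenue by category and city" now requires a chain of joins.
revenue = {}
for line in order_lines:
    order = orders[line["order_id"]]
    category = categories[products[line["product_id"]]["category_id"]]["category"]
    city = cities[customers[order["customer_id"]]["city_id"]]["city"]
    revenue[(category, city)] = revenue.get((category, city), 0.0) + line["amount"]

print(revenue)  # {('Hardware', 'Boston'): 100.5}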


Both normalized and dimensional models can be represented in entity-relationship diagrams, as both contain joined relational tables. The difference between the two models is the degree of normalization.

These approaches are not mutually exclusive, and there are other approaches. Dimensional approaches can involve normalizing data to a degree (Kimball, Ralph 2008).

Bottom-up design

Ralph Kimball, a well-known author on data warehousing, is a proponent of an approach to data warehouse design which he describes as bottom-up.

In the bottom-up approach, data marts are first created to provide reporting and analytical capabilities for specific business processes. It is important to note that in the Kimball methodology, the bottom-up process is the result of an initial business-oriented top-down analysis of the relevant business processes to be modelled.

Data marts contain, primarily, dimensions and facts. Facts can contain atomic data and, if necessary, summarized data.
A single data mart often models a specific business area such as "Sales" or "Production." These data marts can eventually be integrated to create a comprehensive data warehouse. The integration of data marts is managed through the implementation of what Kimball calls "a data warehouse bus architecture".
The data warehouse bus architecture is primarily an implementation of "the bus", a collection of conformed dimensions and conformed facts, which are dimensions that are shared (in a specific way) between facts in two or more data marts.

The integration of the data marts in the data warehouse is centered on the conformed dimensions (residing in "the bus") that define the possible integration "points" between data marts. The actual integration of two or more data marts is then done by a process known as "Drill across". A drill-across works by grouping (summarizing) the data along the keys of the (shared) conformed dimensions of each fact participating in the "drill across" followed by a join on the keys of these grouped (summarized) facts.
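The following is a minimal sketch of a drill-across between two hypothetical fact tables that share a conformed product dimension: each fact is first summarized along the conformed key, and the summaries are then joined. All names are illustrative assumptions.

# A minimal drill-across sketch over two hypothetical fact tables that share
# a conformed "product" dimension (all names are illustrative).
fact_sales = [
    {"product_key": 100, "units_sold": 3},
    {"product_key": 100, "units_sold": 2},
    {"product_key": 200, "units_sold": 7},
]
fact_production = [
    {"product_key": 100, "units_built": 10},
    {"product_key": 200, "units_built": 5},
]

def summarize(fact, measure):
    """Group (summarize) a fact table along the conformed dimension key."""
    totals = {}
    for row in fact:
        totals[row["product_key"]] = totals.get(row["product_key"], 0) + row[measure]
    return totals

sold = summarize(fact_sales, "units_sold")
built = summarize(fact_production, "units_built")

# Join the two summaries on the keys of the conformed dimension.
drill_across = {
    key: {"units_sold": sold.get(key, 0), "units_built": built.get(key, 0)}
    for key in set(sold) | set(built)
}
print(drill_across)
# e.g. {100: {'units_sold': 5, 'units_built': 10}, 200: {'units_sold': 7, 'units_built': 5}}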

Maintaining tight management over the data warehouse bus architecture is fundamental to maintaining the integrity of the data warehouse. The most important management task is making sure dimensions among data marts are consistent. In Kimball's words, this means that the dimensions "conform".

Some consider it an advantage of the Kimball method that the data warehouse ends up being "segmented" into a number of logically self-contained and consistent data marts (up to and including the bus), rather than a single large and often complex centralized model.
Business value can be returned as quickly as the first data marts can be created, and the method lends itself well to an exploratory and iterative approach to building data warehouses. For example, the data warehousing effort might start in the "Sales" department by building a Sales data mart. Upon completion of the Sales data mart, the business might then decide to expand the warehousing activities into, say, the "Production" department, resulting in a Production data mart.
The requirement for the Sales data mart and the Production data mart to be integrable is that they share the same bus: that is, the data warehousing team has made the effort to identify and implement the conformed dimensions in the bus, and the individual data marts link to that information from the bus. Note that this does not require full awareness of all requirements from the onset of the data warehousing effort; no master plan is required upfront. The Sales data mart is good as it is (assuming that the bus is complete), and the Production data mart can be constructed virtually independently of the Sales data mart (but not independently of the bus).

If integration via the bus is achieved, the data warehouse, through its two data marts, will not only be able to deliver the specific information that the individual data marts are designed to deliver (in this example, either "Sales" or "Production" information), but can also deliver integrated Sales-Production information, which often is of critical business value. Such integration can be achieved in a flexible and iterative fashion.

Top-down design

Bill Inmon, one of the first authors on the subject of data warehousing, has defined a data warehouse as a centralized repository for the entire enterprise. Inmon is one of the leading proponents of the top-down approach to data warehouse design, in which the data warehouse is designed using a normalized enterprise data model. "Atomic" data, that is, data at the lowest level of detail, are stored in the data warehouse. Dimensional data marts containing data needed for specific business processes or specific departments are created from the data warehouse. In the Inmon vision, the data warehouse is at the center of the "Corporate Information Factory" (CIF), which provides a logical framework for delivering business intelligence (BI) and business management capabilities.

Inmon states that the data warehouse is:
Subject-oriented : The data in the data warehouse is organized so that all the data elements relating to the same real-world event or object are linked together.
Non-volatile : Data in the data warehouse are never over-written or deleted — once committed, the data are static, read-only, and retained for future reporting.
Integrated : The data warehouse contains data from most or all of an organization's operational systems and these data are made consistent.
Time-variant : While an operational system stores only the current value, the data in the data warehouse are associated with particular points or periods in time, so that changes can be tracked and reported over time.

The top-down design methodology generates highly consistent dimensional views of data across data marts since all data marts are loaded from the centralized repository. Top-down design has also proven to be robust against business changes. Generating new dimensional data marts against the data stored in the data warehouse is a relatively simple task. The main disadvantage to the top-down methodology is that it represents a very large project with a very broad scope. The up-front cost for implementing a data warehouse using the top-down methodology is significant, and the duration of time from the start of project to the point that end users experience initial benefits can be substantial. In addition, the top-down methodology can be inflexible and unresponsive to changing departmental needs during the implementation phases.

Hybrid design

Data warehouse (DW) solutions often resemble a hub and spoke architecture. Legacy systems feeding the DW/BI solution often include customer relationship management (CRM) and enterprise resource planning (ERP) solutions, generating large amounts of data. To consolidate these various data models, and facilitate the extract, transform, load (ETL) process, DW solutions often make use of an operational data store (ODS). The information from the ODS is then parsed into the actual DW. To reduce data redundancy, larger systems often store the data in a normalized way. Data marts for specific reports can then be built on top of the DW solution.

It is important to note that the DW database in a hybrid solution is kept in third normal form to eliminate data redundancy. A normalized relational database, however, is not efficient for business intelligence reports, where dimensional modelling is prevalent. Small data marts can shop for data from the consolidated warehouse and use the filtered, specific data for the fact tables and dimensions required.
The DW effectively provides a single source of information from which the data marts can read, creating a highly flexible solution from a BI point of view. The hybrid architecture allows a DW to be replaced with a master data management solution, where operational (not static) information could reside.

The Data Vault Modeling components follow a hub and spoke architecture. This modeling style is a hybrid design, consisting of best-of-breed practices from both third normal form and the star schema. The Data Vault model is not a true third normal form, and breaks some of the rules that 3NF dictates be followed. It is, however, a top-down architecture with a bottom-up design. The Data Vault model is geared to be strictly a data warehouse. It is not geared to be end-user accessible; when built, it still requires a data mart or star schema-based release area for business purposes.
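As a rough illustration of the Data Vault style, the following sketch shows the hub/link/satellite pattern: hubs hold business keys, links hold relationships between hubs, and satellites hold descriptive attributes together with load metadata. The table and column names are assumptions for this example only.

# A minimal Data Vault-style sketch (all names are illustrative).
hub_customer = [{"customer_key": "C-10", "load_date": "2011-03-01", "source": "CRM"}]
hub_product  = [{"product_key": "P-100", "load_date": "2011-03-01", "source": "ERP"}]

link_sale = [  # relationship between the two hubs
    {"customer_key": "C-10", "product_key": "P-100",
     "load_date": "2011-03-01", "source": "POS"},
]

sat_customer = [  # descriptive attributes, historized by load date
    {"customer_key": "C-10", "name": "ACME Corp", "city": "Boston",
     "load_date": "2011-03-01", "source": "CRM"},
    {"customer_key": "C-10", "name": "ACME Corp", "city": "Cambridge",
     "load_date": "2011-06-01", "source": "CRM"},  # later changes are appended, not overwritten
]

# The most recent satellite row gives the current view of the customer.
latest = max(sat_customer, key=lambda r: r["load_date"])
print(latest["city"])  # Cambridge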

Data warehouses versus operational systems

Operational systems are optimized for preservation of data integrity and speed of recording of business transactions through use of database normalization and an entity-relationship model. Operational system designers generally follow the Codd rules of database normalization in order to ensure data integrity. Codd defined five increasingly stringent rules of normalization. Fully normalized database designs (that is, those satisfying all five Codd rules) often result in information from a business transaction being stored in dozens to hundreds of tables. Relational databases are efficient at managing the relationships between these tables. The databases have very fast insert/update performance because only a small amount of data in those tables is affected each time a transaction is processed. Finally, in order to improve performance, older data are usually periodically purged from operational systems.

Data warehouses are optimized for speed of data analysis. Frequently, data in data warehouses are denormalised via a dimension-based model. Also, to speed data retrieval, data warehouse data are often stored multiple times: in their most granular form and in summarized forms called aggregates. Data warehouse data are gathered from the operational systems and held in the data warehouse even after the data have been purged from the operational systems.
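As an illustration of aggregates, the following sketch derives a summarized daily table from granular fact rows and keeps it alongside the detail. All table and column names are hypothetical.

# A minimal sketch of keeping granular data alongside a summarized aggregate.
fact_sales_detail = [  # most granular form: one row per order line
    {"date": "2011-03-01", "product": "Widget", "amount": 100.50},
    {"date": "2011-03-01", "product": "Widget", "amount": 75.00},
    {"date": "2011-03-02", "product": "Gadget", "amount": 20.00},
]

# Pre-computed aggregate: daily totals per product, stored in addition to the detail rows.
agg_sales_daily = {}
for row in fact_sales_detail:
    key = (row["date"], row["product"])
    agg_sales_daily[key] = agg_sales_daily.get(key, 0.0) + row["amount"]

print(agg_sales_daily)
# {('2011-03-01', 'Widget'): 175.5, ('2011-03-02', 'Gadget'): 20.0}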

Evolution in organization use

These terms refer to the level of sophistication of a data warehouse:

Offline operational data warehouse : Data warehouses in this stage of evolution are updated on a regular time cycle (usually daily, weekly or monthly) from the operational systems, and the data are stored in an integrated, reporting-oriented data structure.
Offline data warehouse : Data warehouses at this stage are updated from data in the operational systems on a regular basis, and the data warehouse data are stored in a data structure designed to facilitate reporting.
On-time data warehouse : Online integrated data warehousing represents the real-time stage of data warehousing: data in the warehouse are updated for every transaction performed on the source data.
Integrated data warehouse : These data warehouses assemble data from different areas of the business, so users can look up the information they need across other systems.

Sample applications

Some of the applications data warehousing can be used for are:
  • Decision support
  • Trend analysis
  • Financial forecasting
  • Churn prediction for telecom subscribers, credit card users, etc.
  • Insurance fraud analysis
  • Call record analysis
  • Logistics and Inventory management
  • Agriculture

See also

  • Accounting intelligence
  • Anchor Modeling
  • Business intelligence
  • Business intelligence tools
  • Data integration
  • Data mart
  • Data mining
  • Data presentation architecture
  • Data scraping
  • Data warehouse appliance
  • Database management system (DBMS)
  • Decision support system
  • Data Vault Modeling
  • Executive information system (EIS)
  • Extract, transform, load (ETL)
  • Master data management (MDM)
  • Online analytical processing (OLAP)
  • Online transaction processing (OLTP)
  • Operational data store (ODS)
  • Snowflake schema
  • Software as a service (SaaS)
  • Star schema
  • Slowly changing dimension

Further reading

  • Davenport, Thomas H. and Harris, Jeanne G. Competing on Analytics: The New Science of Winning (2007) Harvard Business School Press. ISBN 978-1-4221-0332-6
  • Ganczarski, Joe. Data Warehouse Implementations: Critical Implementation Factors Study (2009) VDM Verlag ISBN 3-639-18589-7 ISBN 978-3-639-18589-8
  • Kimball, Ralph and Ross, Margy. The Data Warehouse Toolkit Second Edition (2002) John Wiley and Sons, Inc. ISBN 0-471-20024-7
  • Linstedt, Graziano, Hultgren. The Business of Data Vault Modeling, Second Edition (2010) Dan Linstedt, ISBN 978-1-4357-1914-9
