A data warehouse is a relational database that stores many months or years of historical records. It also provides support for data gathering, transportation, transformation, reporting, data mining, statistical analysis, and so on. Data warehouses are designed for analyzing data rather than for transaction processing, and they usually contain data derived from transaction data.

Data warehouses are said to be nonvolatile: once data has entered the warehouse, end users cannot update it directly (they can only read it through analytical tools), because the purpose of a warehouse is to let us examine what has happened. Data warehouses are also subject oriented. For example, if we wish to learn about a company's sales, a data warehouse that concentrates only on sales can be built. Depending on the amount of data maintained and the number of queries processed, more powerful systems may be required.
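To make "subject oriented" concrete, here is a minimal sketch of a sales-only warehouse modeled as a small star schema, using Python's built-in sqlite3 module. The table and column names (dim_product, fact_sales, and so on) are illustrative assumptions, not taken from any particular system.

```python
import sqlite3

# Illustrative sales-focused warehouse schema (star schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes about products.
cur.execute("""
    CREATE TABLE dim_product (
        product_id   INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    )
""")

# Fact table: one row per sale, keyed to the dimension and a date,
# holding the numeric measures analysts aggregate over.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        sale_date  TEXT,
        quantity   INTEGER,
        amount     REAL
    )
""")

# A typical analytical query: total revenue per category over history.
cur.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
""")
conn.close()
```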


There are many differences between OLTP (online transaction processing) systems and data warehouses. An OLTP application is typically designed or tuned to support only predefined operations, and its users can execute update statements directly against the database. OLTP databases demand performance for the current transaction and are always up to date. Data warehouses, in contrast, are optimized for a wide variety of analytical operations and queries. They are updated by an ETL (extract, transform, and load) process that applies bulk data modifications, typically nightly or weekly. They usually use a denormalized or partially denormalized schema to optimize queries, whereas OLTP systems use a normalized schema. Data warehouse queries scan millions of rows, while OLTP operations touch only a handful of rows.
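The following sketch illustrates the ETL-style bulk load described above, assuming a normalized OLTP source with hypothetical customers and orders tables and a denormalized sales_summary target; all names and data are made up for illustration.

```python
import sqlite3

src = sqlite3.connect(":memory:")   # stand-in for the OLTP database
dst = sqlite3.connect(":memory:")   # stand-in for the data warehouse

# Normalized OLTP source with a little sample data.
src.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                         customer_id INTEGER,
                         order_date TEXT,
                         amount REAL);
    INSERT INTO customers VALUES (1, 'EU'), (2, 'US');
    INSERT INTO orders VALUES (10, 1, '2024-01-05', 120.0),
                              (11, 2, '2024-01-05', 75.5);
""")

# Extract + transform: join the normalized tables into flat, query-friendly rows.
rows = src.execute("""
    SELECT o.order_id, o.order_date, c.region, o.amount
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
""").fetchall()

# Load: bulk insert into a denormalized warehouse table in one batch,
# the way a nightly or weekly ETL job would.
dst.execute("""
    CREATE TABLE sales_summary (order_id INTEGER, order_date TEXT,
                                region TEXT, amount REAL)
""")
dst.executemany("INSERT INTO sales_summary VALUES (?, ?, ?, ?)", rows)
dst.commit()
```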


Data mining is the analysis and summarization of data to turn it into useful information. Technically, it is the process of searching through large amounts of data to find correlations or patterns among the fields of relational databases. Data mining is commonly used to retrieve data and transform it into information, and data mining tools can answer time-consuming business questions within minutes. When implemented on high-performance parallel processing or client/server systems, these tools can analyze massive databases quickly.
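As a toy illustration of finding a correlation between fields, the snippet below computes the Pearson correlation between two made-up numeric columns using the standard library's statistics.correlation (available in Python 3.10+).

```python
from statistics import correlation  # Python 3.10+

# Two made-up numeric fields that might come from a warehouse table.
advertising_spend = [10, 20, 30, 40, 50]
units_sold        = [12, 25, 31, 45, 52]

# Pearson correlation: values near 1.0 indicate a strong linear relationship.
r = correlation(advertising_spend, units_sold)
print(f"correlation between spend and units sold: {r:.3f}")
```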


The steps in the data mining process are listed below; a small end-to-end sketch follows the list.

1) Problem definition – understanding the objectives and requirements,

2) Data gathering and preparation – collecting and exploring the data,

3) Model building and evaluation – applying modeling techniques and optimizing the models, and

4) Knowledge development – deriving actionable information from the data.
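The sketch below walks through the four steps on a tiny, made-up dataset, fitting a simple linear model with the standard library's statistics.linear_regression (Python 3.10+). It is only meant to show how the steps fit together, not a realistic mining workflow.

```python
from statistics import linear_regression  # Python 3.10+

# 1) Problem definition: estimate how monthly ad spend relates to sales.
# 2) Data gathering and preparation: collect and inspect the raw fields
#    (here just two made-up columns).
ad_spend = [10, 20, 30, 40, 50]
sales    = [110, 205, 310, 395, 510]

# 3) Model building and evaluation: fit a simple linear model and check its error.
model = linear_regression(ad_spend, sales)
errors = [y - (model.slope * x + model.intercept) for x, y in zip(ad_spend, sales)]
print("max absolute error:", max(abs(e) for e in errors))

# 4) Knowledge development: derive actionable information, e.g. the sales
#    level the model predicts at a spend of 60.
print("predicted sales at spend 60:", model.slope * 60 + model.intercept)
```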

