ETL Testing – Part 3

System Testing:

System testing integrates the various components and runs them as a single unit. It should include the sequences of events that enable those different components to run together and should validate the data flow.

  • Verify all the required functionality in the validation environment.
  • Run the end-to-end system test.
  • Record initialization and incremental load statistics (see the sketch after this list).
  • Measure the performance of the entire system and mitigate any bottlenecks.
  • Verify error-handling processes are working as designed.
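
A minimal sketch of what the "record load statistics" step can look like, assuming a staging table stg_orders, a target table dw_orders and a statistics table etl_load_stats (all names are hypothetical, and the run_workflow stub stands in for the ETL tool's own scheduler or command-line interface):

    import time

    def run_workflow(name):
        # Placeholder: in practice this invokes the ETL tool (scheduler, CLI or API).
        pass

    def record_load_stats(conn, run_type, workflow):
        """Run one load, then record its duration and source/target row counts."""
        start = time.time()
        run_workflow(workflow)
        elapsed = time.time() - start
        src = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
        tgt = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
        conn.execute("INSERT INTO etl_load_stats VALUES (?, ?, ?, ?, ?)",
                     (run_type, workflow, elapsed, src, tgt))
        conn.commit()

    # conn is an sqlite3-style database connection to the test environment, e.g.:
    # record_load_stats(conn, "initialization", "wf_orders_full")
    # record_load_stats(conn, "incremental", "wf_orders_delta")
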
Prerequisites:
  • Finalized Implementation Checklist.
  • All integration testing complete.
  • Migration from the Test environment to the validation environment, as applicable.
  • Production configuration and data available.

Regression Testing:

Regression testing is performed after the developer fixes a reported defect. It verifies that the identified defects are fixed and that fixing them does not introduce any new defects into the system or application. Regression testing is also performed when a Change Request (CR) is implemented on an existing production system; after the CR is approved, the testing team uses the impact analysis as input for designing the test cases for the CR.

Prerequisites:
  • Finalized Implementation Checklist.
  • All integration testing complete.

Performance Testing:

Performance testing determines the system's behavior under a particular workload / Service Level Agreement (SLA). It ensures that the system meets the performance criteria and helps detect bottlenecks.

Common types of performance testing include load, stress and volume testing.
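
As an illustration, a basic load test can be as simple as running the same load at increasing data volumes and checking the elapsed time against the SLA. The SLA value and the load_table stub below are hypothetical:

    import time

    SLA_SECONDS = 600  # assumed SLA: the load must finish within 10 minutes

    def load_table(row_count):
        # Placeholder: run the ETL session against a test data set of this size.
        pass

    for rows in (100_000, 1_000_000, 10_000_000):
        start = time.perf_counter()
        load_table(rows)
        elapsed = time.perf_counter() - start
        status = "within" if elapsed <= SLA_SECONDS else "BREACHES"
        print(f"{rows} rows loaded in {elapsed:.1f}s ({status} SLA)")
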

 

ETL Testing – Part 2

Test Estimation:

Effective software project estimation is one of the most challenging and important activities in testing, and it is an essential part of proper project planning and control. Under-estimating a project leads to under-staffing it, running the risk of low-quality deliverables and a loss of credibility as deadlines are missed. It is therefore imperative to do a proper estimation during the planning stage.

The basic steps in estimation include:
  • Estimating the size of the system to be tested.
  • Estimating the effort in person-hours (one person-day = the number of working hours in a day, i.e. 8 hours).

After receiving the requirements, the tester analyses the mappings that are created or modified and studies the changes made. Based on this impact analysis, the tester determines how much time is needed for the whole testing process, which consists of mapping analysis, test case preparation, test execution, defect reporting, regression testing and final documentation. The calculated time is entered in the estimation time sheet.
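
A worked example of the arithmetic, with hypothetical per-mapping effort figures (in practice these come from the impact analysis):

    # Assumed effort per changed mapping, in person-hours.
    EFFORT = {
        "mapping analysis": 2,
        "test case preparation": 4,
        "test execution": 4,
        "defect reporting": 1,
        "regression testing": 2,
        "final documentation": 1,
    }

    changed_mappings = 5                   # taken from the impact analysis
    total_hours = changed_mappings * sum(EFFORT.values())
    person_days = total_hours / 8          # one person-day = 8 working hours
    print(f"{total_hours} person-hours = {person_days:.2f} person-days")
    # 5 mappings x 14 hours = 70 person-hours = 8.75 person-days
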

Integration Testing:

Integration testing verifies the required functionality of a mapping (a single ETL / single session) in the environment specific to the testing team (the Test environment). This testing should ensure that the correct number of rows (validated records) is transferred from the source to the target.
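
A minimal sketch of that row-count check, assuming the mapping writes rejected records to a reject table and that conn is an sqlite3-style connection to the test database (all table names are illustrative):

    def check_row_counts(conn, source_table, target_table, reject_table):
        """Validated source rows should equal loaded rows plus rejected rows."""
        src = conn.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
        tgt = conn.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
        rej = conn.execute(f"SELECT COUNT(*) FROM {reject_table}").fetchone()[0]
        assert src == tgt + rej, f"row count mismatch: {src} != {tgt} + {rej}"
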

Integration testing is also used to verify the functionality of initialization and incremental mappings (sessions), along with the pre-session and post-session scripts for dependencies and the usage/consumption of relative indicator files that control dependencies across multiple work streams (modules). During integration testing, error-handling processes, the proper functioning of mapping variables and the relevant business requirements can be validated.

Prerequisites:
  • Access to the required folders on the network.
  • Implementation Checklist for move from development to test.
  • All unit testing completed and summarized.
  • Data available in the test environment.
  • Migration to the test environment from the development environment.


ETL Testing – Part 1

ETL Testing:

Testing is an important phase in the project life cycle. A structured, well-defined testing methodology involving comprehensive unit testing and system testing not only ensures a smooth transition to the production environment but also a system without defects.

The testing phase can be broadly classified into the following categories:

  • Integration Testing
  • System Testing
  • Regression Testing
  • Performance Testing
  • Operational Qualification

Test Strategy:

A test strategy is an outline that describes the test plan. It is created to inform the project team of the objective and high-level scope of the testing process, including the testing objective, methods of testing, resources, estimated timelines, environment, etc.

The test strategy is created based on the high-level design document. A test strategy needs to be created for each testing component; based on this strategy, the testing process is detailed out in the test plan.

Test Planning:

Test planning is key to successfully implementing the testing of a system. The deliverable is the actual “Test Plan”. A software project test plan is a document that describes the purpose, system overview, approach to testing, test planning, defect tracking, test environment, test prerequisites and references.

A key prerequisite for preparing a successful test plan is having approved (functional and non-functional) requirements. Without frozen requirements and specifications, the test plan cannot properly validate the project's testing efforts.

The process of preparing a test plan is a useful way to work out how the testing of a particular system can be carried out within the given timeline, provided the test plan is thorough enough.

The test plan outlines and defines the strategy and approach taken to perform end-to-end testing on a given project. The test plan describes the tasks, schedules, resources, and tools for integrating and testing the software application. It is intended for use by project personnel in understanding and carrying out prescribed test activities and in managing these activities through successful completion.

The test plan objectives are as follows:

  • To define the testing approach, scope, out-of-scope items and methodology that encompass integration testing, system testing, performance testing and regression testing in one plan for the business and project team.
  • To verify that the functional and non-functional requirements are met.
  • To coordinate resources and environments into an integrated schedule.
  • To provide a plan that outlines the contents of the detailed test case scenarios for each of the four phases of testing.
  • To determine a process for communicating issues resulting from the test phase.

The contents of a typical test plan consist of the following:

  • An introduction that includes the purpose, definitions & acronyms, assumptions & dependencies, in-scope and out-of-scope items, roles & responsibilities and contacts. This information is obtained from the requirements specification.
  • A system overview that explains the background and describes the system.
  • A test approach for all testing levels, covering the test objectives for each level, test responsibilities, levels of testing, types of testing, test coverage, testing tools, test data and test stop criteria.
  • Test planning details, specifying the test schedule, documentation deliverables, test communication, and critical and high-risk functions.

The test plan thus summarizes and consolidates the information necessary for the efficient and effective conduct of testing. The design specification, requirements document and project plan supporting the finalization of testing are kept in separate documents and are referenced in the test plan.


Data Warehouse & ETL Tutorial:

A data warehouse is where data from different source systems is integrated, processed and stored. Data warehouse data is non-production data, used mainly for analysis and reporting. Business users and higher management use the data warehouse to analyze data and to make important business decisions. A data warehouse collects large volumes of data from various sources with many different data formats. The ETL (Extraction, Transformation and Loading) process handles this data and transforms it into more consistent, standard-format data. The ETL process is carried out with the help of an ETL tool; the most widely used tools are Informatica, DataStage, Ab Initio, Oracle Warehouse Builder, etc.

This Data Warehouse and ETL tutorial explains data warehouse/ETL concepts, ETL tools and their usage with practical examples in the sections that follow.



ETL Concepts:

ETL stands for extraction, transformation, and loading. It refers to the methods involved in accessing and manipulating source data and loading it into the target database.

The first step in the ETL process is mapping the data between the source systems and the target database (data warehouse or data mart). The second step is cleansing the source data in the staging area. The third step is transforming the cleansed source data and loading it into the target system.
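
The three steps can be pictured with a tiny end-to-end sketch; the file names, field names and cleansing rule are made up for illustration:

    import csv

    # Step 1: the mapping - which source field feeds which target column.
    MAPPING = {"cust_name": "customer_name", "dob": "date_of_birth"}

    # Step 2: cleansing - trim whitespace, drop rows with no name.
    def cleanse(row):
        row = {k: v.strip() for k, v in row.items()}
        return row if row["cust_name"] else None

    # Step 3: transform to the target layout and load.
    with open("source.csv", newline="") as src, open("target.csv", "w", newline="") as tgt:
        writer = csv.DictWriter(tgt, fieldnames=list(MAPPING.values()))
        writer.writeheader()
        for raw in csv.DictReader(src):
            clean = cleanse(raw)
            if clean:
                writer.writerow({MAPPING[k]: clean[k] for k in MAPPING})
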

Note that ETT (Extraction, Transformation, Transportation) and ETM (Extraction, Transformation, Move) are sometimes used instead of ETL.

Data comes in different forms, structured or unstructured, in huge volumes and at high velocity.

Data in Structured Format:

  • Example: .csv files, .xls files, .dat files, .xml files, etc.
  • Structured data is stored for several years within the organization's high-end servers.
  • The need for data analysis (slicing and dicing) of these structured data formats led to the evolution of ETL.

In earlier times, data from different source systems such as .csv, .xls, .dat and .xml files was loaded into the target database using traditional programming methods. The same data can be loaded much more quickly using ETL tools, which make use of the different components present in those tools.

From the ETL perspective, an ETL developer has to know how to identify and connect to the source systems, perform the transformations and load the data into the target database.

Extraction and loading are easy compared to transformation, since transformation has to deal with the different anomalies introduced during data entry.

Common Anomalies found in the Source Systems:

1. Name: a combination of upper-case and lower-case characters, with no predefined format.
Example: kEvin CURtis, STEven CURtis
2. Indicator columns: Yes, No, Y, N
3. Gender columns: M, F, Male, Female
4. Format columns: xxx-xx-xxxx, xx-xxx-xxxx
5. Currency columns: $1234, $1234.00, $1234.50
6. Date columns: MM/DD/YYYY and DD/MM/YYYY formats
The above-mentioned problems are resolved by the ETL developer using the ETL tool, so that the right data is loaded into the target system.
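
A sketch of the standardization logic behind those fixes; real ETL tools express the same rules in their own transformation components, and the column names here are assumed:

    from datetime import datetime

    def standardize(row):
        # 1. Names: normalize mixed case, e.g. "kEvin CURtis" -> "Kevin Curtis".
        row["name"] = row["name"].title()
        # 2./3. Indicator and gender columns: collapse the variants to one code.
        row["active"] = "Y" if row["active"].upper() in ("Y", "YES") else "N"
        row["gender"] = "M" if row["gender"].upper() in ("M", "MALE") else "F"
        # 5. Currency: strip the symbol, store a number with two decimals.
        row["amount"] = round(float(row["amount"].lstrip("$").replace(",", "")), 2)
        # 6. Dates: accept either source format, store one canonical format.
        for fmt in ("%m/%d/%Y", "%d/%m/%Y"):
            try:
                row["dob"] = datetime.strptime(row["dob"], fmt).date().isoformat()
                break
            except ValueError:
                pass
        return row
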

Sample ETL Process Flow Diagram:

[Figure: sample ETL process flow diagram]

Glossary of ETL Terms (reference: www.oracle.com):

Source System:

A database, application, file, or other storage facility from which the data in a data warehouse is derived.

Mapping:

The definition of the relationship and data flow between source and target objects.

Metadata:

Data that describes data and other structures, such as objects, business rules, and processes. For example, the schema design of a data warehouse is typically stored in a repository as metadata, which is used to generate the scripts that build and populate the data warehouse. A repository contains metadata.

Staging Area:

A place where data is processed before entering the warehouse.

Cleansing:

The process of resolving inconsistencies and fixing the anomalies in source data, typically as part of the ETL process.

Transformation:

The process of manipulating data. Any manipulation beyond copying is a transformation. Examples include cleansing, aggregating, and integrating data from multiple sources.

Transportation:

The process of moving copied or transformed data from a source to a data warehouse.

Target System:

A database, application, file, or other storage facility to which the transformed source data is loaded in a data warehouse.

 


ETL Tools: What to Learn?

With the help of ETL tools, we can create powerful target data warehouses without much difficulty. The following are the various areas one has to know and learn in order to use ETL tools.

Software:

  • How to install the ETL tool on the server and client?

Working with an ETL Tool:

  • How to work with various options like the designer, mappings, workflows, scheduling, etc.?
  • How to work with sources like DBMSs, relational databases, files, ERPs, etc., and import the source definitions?
  • How to import data from data modeling tools, applications, etc.?
  • How to work with targets like DBMSs, relational databases, files, ERPs, etc., and import the target definitions?
  • How to create target definitions?
  • How to create mappings between source definitions and target definitions?
  • How to create transformations?
  • How to cleanse the source data?
  • How to create dimensions, slowly changing dimensions, cubes, etc.?
  • How to create and monitor workflows?
  • How to configure, monitor and run the debugger?
  • How to view and generate metadata reports?

 

ETL Tools

What are ETL Tools?

ETL tools are meant to extract, transform and load data into the data warehouse for decision making. Before the evolution of ETL tools, the ETL process described above was done manually using SQL code created by programmers. This task was tedious and cumbersome in many cases, since it involved many resources, complex coding and long work hours. On top of that, maintaining the code posed a great challenge to the programmers.

These difficulties are eliminated by ETL tools, since they are very powerful and, compared to the old method, offer many advantages in all stages of the ETL process: extraction, data cleansing, data profiling, transformation, debugging and loading into the data warehouse.

There are a number of ETL tools available in the market to carry out the ETL process according to business and technical requirements. The following are some of them.

Popular ETL Tools:

  • Informatica
  • DataStage
  • Ab Initio
  • Oracle Warehouse Builder

 

Data Warehouse Frequently Asked Interview Questions and Answers

1. What is a data warehouse?

A data warehouse is a collection of integrated data from one or more sources, used for data analysis and reporting. Several years of data are stored in a data warehouse. The data is static, not transactional.

2. What is a data mart?

A data mart is a subset of a data warehouse. A data mart gives a clear picture of a small portion of the data warehouse, and is better suited for viewing, analyzing, reporting and documentation.

3. What is the difference between a data warehouse and a data mart?

A data warehouse comprises all subject areas, whereas a data mart is focused on a specific subject area.

4. What is a Dimensional Data Model?

A dimensional data model contains one or more dimension tables and fact tables and is used for calculating summarized data. Dimensional data models are used in data warehouses and data marts.

5. What is a dimension?

A dimension table is also called a lookup or reference table. The data (foreign keys) in the fact table refers to the data (primary keys) in the dimension table and is used for validation and calculation purposes.

6. What is a slowly changing dimension (SCD)? What are the types of SCD?

Dimensions that change over time are called slowly changing dimensions. Type 1, Type 2 and Type 3 are the three main types of slowly changing dimension.
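
A compact sketch of the difference between Type 1 and Type 2 handling when an attribute changes (plain dictionaries stand in for dimension rows; Type 3 would instead keep the previous value in an extra column such as prev_city):

    from datetime import date

    # Type 1: overwrite in place - no history is kept.
    def scd_type1(row, new_city):
        row["city"] = new_city
        return row

    # Type 2: expire the current row and insert a new one - history is preserved.
    def scd_type2(rows, key, new_city):
        current = next(r for r in rows if r["key"] == key and r["is_current"])
        current["is_current"] = False
        current["end_date"] = date.today()
        rows.append({"key": key, "city": new_city, "start_date": date.today(),
                     "end_date": None, "is_current": True})
        return rows
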

7. What is a star schema?

A star schema is a database schema that contains a fact table and one or more dimension tables representing multidimensional data. It is called a star schema because the relationship between the fact table and the dimensions looks like a star.

8. What is OLAP data modeling?

OLAP stands for Online Analytical Processing. The approach by which data models are constructed for analyzing data is called OLAP data modeling. Examples: data warehouses and data marts.

9. What is ETL?

The acronym ETL stands for Extraction, Transformation and Loading. ETL is the process by which data stored in various sources is extracted, transformed and loaded into the target database.

10. What is a Fact table?

The centralized table in a star schema is called the fact table. A fact table contains many columns that reference dimension tables, along with standalone measure/fact columns. These fact or measure columns give useful and meaningful data based on some calculation.

11. What are the types of measure columns in a fact table?

Additive, semi-additive and non-additive columns are the three types of measure columns.

Additive: measures or facts that can be summed across all dimensions (e.g. sales amount).

Semi-additive: measures or facts that can be summed across some dimensions but not others (e.g. an account balance, which cannot be summed over time).

Non-additive: measures that cannot be summed across any dimension (e.g. ratios and percentages).

12. What are the steps to create a Data Warehouse?

The main steps are: analyzing the data from the different sources, data modeling, creating the databases, designing the ETL process, extracting data from the various sources, transforming the data, and loading it into the target data warehouse/data mart databases. From these databases, reports are generated as per the needs.
