Identify duplicates and null values in a Column using Talend Open Studio for Data Quality

After quite a long time, I am writing this post on verifying the quality of data in a column using data profiling techniques. Let's take a simple example of a table called Country, which has two columns: Code and Description.

CODE   DESCRIPTION
IND    India
US     United States of America
UK     United Kingdom
IND    INDIA
GER    Germany
AUS    <<null>>
AF     Afganistan
DZ     Algeria
Alb    Albania
Arg    Argentina

 

Possible problems in the data:

What could be the possible problems in this table with respect to data quality? (A quick code sketch of these checks follows the list.)

  1. The application might have been designed so that Code must be 3 characters in length, yet during data migration some codes of only 2 characters could have slipped in.
  2. Code might have been intended to be all caps, a rule that may have been compromised.
  3. There should not be any null values in the Description column.
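Outside of Talend, the same checks can be expressed directly in code. Below is a minimal Python sketch, with the Country rows hard-coded purely for illustration, that flags short codes, non-uppercase codes, and null descriptions:

    # Sample Country rows from the table above; None stands in for <<null>>.
    country = [
        ("IND", "India"), ("US", "United States of America"), ("UK", "United Kingdom"),
        ("IND", "INDIA"), ("GER", "Germany"), ("AUS", None), ("AF", "Afganistan"),
        ("DZ", "Algeria"), ("Alb", "Albania"), ("Arg", "Argentina"),
    ]

    # Problem 1: codes that are not exactly 3 characters long.
    short_codes = [code for code, _ in country if len(code) != 3]

    # Problem 2: codes that are not all caps.
    not_upper = [code for code, _ in country if not code.isupper()]

    # Problem 3: null values in Description.
    null_descriptions = [code for code, desc in country if desc is None]

    print("Codes not 3 chars long:", short_codes)              # ['US', 'UK', 'AF', 'DZ']
    print("Codes not in all caps:", not_upper)                 # ['Alb', 'Arg']
    print("Codes with null description:", null_descriptions)   # ['AUS']

Talend performs the same kind of checks for us through indicators, without any hand-written code, as the following steps show.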

 

Now let us see how we can identify these problems in the Country table using Talend Open Studio for Data Quality.

 

Step 1: Connect to the Oracle database using the DQ Repository
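The connection itself is configured through the DQ Repository wizard. As a quick sanity check outside the tool, the same connection details could be verified with the python-oracledb driver; the user, password, DSN and table name below are placeholders, not values taken from this example:

    import oracledb  # pip install oracledb

    # Placeholder credentials and DSN -- substitute your own Oracle connection details.
    conn = oracledb.connect(user="dq_user", password="dq_pass", dsn="localhost:1521/XEPDB1")
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM country")
        print("Rows in COUNTRY:", cur.fetchone()[0])
    conn.close()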

 

Step 2: Now we will add a simple column analysis on the Code column of the Country table

 

Step 3: Select the indicators for each column to analyze; this is essential, because without indicators we will not be able to analyze the issues. For this example I have chosen some of the indicators for analysis, as given below.

 

Step 4: Run the Analysis

When you run the analysis, you get a graphical chart as depicted in the snapshot provided below:

 

In the above picture we can see that the row count is 10, there is one duplicate value, and there are 9 distinct values.
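These three indicators are easy to reproduce by hand: with 10 rows and IND appearing twice, there are 9 distinct codes and 1 duplicated value. Here is a small Python sketch of that arithmetic, reusing the hard-coded sample codes; this is one plausible way to read the indicators, not Talend's internal implementation:

    from collections import Counter

    codes = ["IND", "US", "UK", "IND", "GER", "AUS", "AF", "DZ", "Alb", "Arg"]
    counts = Counter(codes)

    row_count = len(codes)                                        # 10
    distinct_count = len(counts)                                  # 9
    duplicate_count = sum(1 for n in counts.values() if n > 1)    # 1 (IND)
    unique_count = sum(1 for n in counts.values() if n == 1)      # 8

    print(row_count, distinct_count, duplicate_count, unique_count)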

 


 

In this picture you can also find the length-related metrics and text statistics, which also cover the case-related issues.
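The length and case statistics can likewise be approximated with a few lines of Python, again on the hard-coded codes, purely to illustrate what these indicators measure:

    codes = ["IND", "US", "UK", "IND", "GER", "AUS", "AF", "DZ", "Alb", "Arg"]
    lengths = [len(c) for c in codes]

    print("min length:", min(lengths))                 # 2
    print("max length:", max(lengths))                 # 3
    print("avg length:", sum(lengths) / len(lengths))  # 2.6

    # Simple case-related statistics, similar in spirit to the text indicators.
    print("upper case:", sum(c.isupper() for c in codes))                          # 8
    print("lower case:", sum(c.islower() for c in codes))                          # 0
    print("mixed case:", sum(not c.isupper() and not c.islower() for c in codes))  # 2 (Alb, Arg)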

 

This way we can easily identify issues in a specific column. I hope this post gives a simple example that might be useful in different contexts for ensuring data quality.

 

Data Profiling: Step by Step connection analysis using Talend

In continuation of my previous post, in this post we will look at some sample data and use Talend Open Studio for Data Quality for data profiling. You can also refer to this link as an alternative tutorial. In this blog post we will evaluate how to do a connection analysis, which will help us answer the following key questions about a database:

  • How many tables exist?
  • How many rows exist?
  • How many views exist?
  • How many rows exist in each table?

Why do we need to do a Connection Analysis:

A connection analysis helps you quickly get an overview of the database connection in the given context.

Creating a DB connection

Step 1:

Step 2: Installing the MySQL drivers

Step 3: Checking the connectivity
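Once the driver is in place, connectivity can also be checked with a one-off script outside Talend. Here is a minimal sketch using the PyMySQL driver; the host, user, password and database name are placeholders:

    import pymysql  # pip install pymysql

    # Placeholder connection details for the MySQL database being profiled.
    conn = pymysql.connect(host="localhost", user="dq_user",
                           password="dq_pass", database="classicmodels")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT VERSION()")
            print("Connected to MySQL", cur.fetchone()[0])
    finally:
        conn.close()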

Starting Connection Analysis:

Step 1: Create the Database Structure Overview as given in the steps.

Step 2: Select the DB Connections

Step 3: Select the tables you need to analyze.

Step 4: Now run the ClassicModel Connection Analysis

Step 5: Execution status after the run

Step 6: The analysis now produces statistical information such as 3864 rows across 8 tables.
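Those overview numbers can be cross-checked directly against MySQL's information_schema. The sketch below counts the base tables and views in a schema and sums the exact row counts per table; the credentials are placeholders and "classicmodels" is the sample schema assumed in the steps above:

    import pymysql  # pip install pymysql

    schema = "classicmodels"  # schema analyzed in the steps above (assumed)
    conn = pymysql.connect(host="localhost", user="dq_user",
                           password="dq_pass", database=schema)
    with conn.cursor() as cur:
        cur.execute("SELECT table_name, table_type FROM information_schema.tables "
                    "WHERE table_schema = %s", (schema,))
        rows = cur.fetchall()
        tables = [name for name, ttype in rows if ttype == "BASE TABLE"]
        views = [name for name, ttype in rows if ttype == "VIEW"]

        # Exact row counts per table, summed for the whole schema.
        total_rows = 0
        for table in tables:
            cur.execute(f"SELECT COUNT(*) FROM `{schema}`.`{table}`")
            total_rows += cur.fetchone()[0]
    conn.close()

    print(f"{len(tables)} tables, {len(views)} views, {total_rows} rows in total")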

Open source tools for Data Profiling

Data profiling is the process of analyzing the existing data available in a data source and identifying the metadata around it. This post is a high-level introduction to data profiling and simply provides some pointers.

What is the use of doing data profiling?

  1. To understand the metadata characteristics of the data under purview.
  2. To have an enterprise view of the data for the purposes of Master Data Management and Data Governance.
  3. To help identify the right candidates for source-to-target mapping.
  4. To ensure the data is fit for its intended purpose.
  5. To identify data issues and quantify them.

Typical project types where it is put to use:

  • Data warehousing/Business Intelligence Projects
  • Research Engagements
  • Data research projects
  • Data Conversion/Migration Projects
  • Source System Quality initiatives.

Some of the open source tools which can be used for Data Profiling:

Some links that point to the various commercial players that exist, along with their comparison and evaluation:

In the next post we will evaluate certain aspects of data profiling using one of the tools mentioned in this blog post.