Hemchander Thippana - Azure Data Engineer
[email protected] | 614 775 8091
Location: Columbus, Ohio, USA
Relocation: Open
Visa: H1B

Career Objective:

I intend to build my career in the data warehousing and business intelligence field with an integrated business solution provider, through a long-term commitment that contributes to the company's growth and, in turn, to my own growth within the organization. I believe my technical, functional, and communication skills equip me for the challenges ahead. I am a fast learner, keen to apply my talent in the IT industry to the best of my ability, and willing to work as a key player in a challenging and creative environment.

PROFESSIONAL SUMMARY:

12+ years of professional experience in the IT industry, working on MSBI projects with extensive use of ETL and reporting tools including Power BI, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Data Box, Azure SQL, Azure Databricks, Azure Cosmos DB, Azure Key Vault, and Azure Analysis Services.

Developed extensive ETL pipelines using Azure Data Factory, improving workflow efficiency.

Optimized Azure SQL Database performance, cutting storage costs by 18% without compromising query speed.

Implemented CI/CD pipelines in Azure DevOps to enhance team productivity and shorten release cycles.

Good experience migrating Salesforce data using SSIS adapters, KingswaySoft adapters, and Azure Data Factory.

Good experience migrating Microsoft Dynamics CRM data using SSIS adapters, KingswaySoft adapters, and Azure Data Factory pipelines.

Proficient in SQL Server Integration Services (SSIS) for ETL development.

Experience transferring data between servers using tools such as Bulk Copy Program (BCP), SSIS, and Azure Data Factory.

Experienced in developing web services in Python.

Extensive experience in SQL Server Reporting Services (SSRS) for report design and deployment.

Solid experience with T-SQL for data manipulation and querying.

Created ADF pipelines that execute SSIS packages through the integration runtime, and developed pipelines to copy data between different servers and Blob storage in both directions.

Created pipelines to consolidate data from multiple databases into a single database using ADF.

Implemented archival and deletion processes using ADF pipelines.

Extensively worked on data extraction and transformation using SSIS.

Extensively worked on reading continuous JSON data from different source systems through Azure Event Hubs into various downstream systems, using Azure Stream Analytics and Apache Spark Structured Streaming on Databricks.
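
For illustration, a minimal PySpark sketch of this pattern, written for a Databricks notebook (where spark and sc are predefined) and assuming the azure-eventhubs-spark connector is attached to the cluster; the connection string, schema, and paths below are placeholders, not values from an actual engagement:

    # Read a continuous JSON stream from Azure Event Hubs with Spark
    # Structured Streaming. Connection string, schema, and paths are
    # placeholders for illustration only.
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    conn = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."  # placeholder
    eh_conf = {
        # The connector expects the connection string to be encrypted.
        "eventhubs.connectionString":
            sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(conn),
    }

    # Hypothetical event schema.
    schema = StructType([
        StructField("deviceId", StringType()),
        StructField("reading", StringType()),
        StructField("eventTime", TimestampType()),
    ])

    raw = spark.readStream.format("eventhubs").options(**eh_conf).load()

    # The Event Hubs body arrives as binary; cast to string and parse the JSON.
    events = (raw
              .select(from_json(col("body").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Land parsed events in the lake as Delta for downstream consumers.
    (events.writeStream
           .format("delta")
           .option("checkpointLocation", "/mnt/checkpoints/events")  # placeholder
           .start("/mnt/datalake/curated/events"))                   # placeholder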

Extensive knowledge of Spark architecture, including Spark Core, Spark SQL, DataFrames, and Spark Streaming.

In-depth knowledge and hands-on experience implementing cloud data lakes such as Azure Data Lake Storage Gen1 and Gen2.
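
As a small illustration of working with ADLS Gen2 from Python, a hedged sketch using the azure-storage-file-datalake SDK; the account, credential, and paths are placeholders:

    # Upload and list files in an ADLS Gen2 file system using the
    # azure-storage-file-datalake SDK. Account URL, credential, and paths
    # are placeholders.
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://<account>.dfs.core.windows.net",
        credential="<account-key>",  # or an azure-identity credential
    )
    fs = service.get_file_system_client(file_system="raw")

    # Upload a local extract into a dated folder.
    file_client = fs.get_file_client("sales/2024/orders.csv")
    with open("orders.csv", "rb") as f:
        file_client.upload_data(f, overwrite=True)

    # Confirm what landed under the folder.
    for path in fs.get_paths(path="sales/2024"):
        print(path.name)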

Experience building Apache Spark jobs in Scala and Python for faster data processing, using the Spark Core and Spark SQL libraries for querying.

Actively involved in various project phases: Functional Specification, Design, Coding, Testing, Documentation, Deployment and Production Support.

Proven proficiency with data transformations such as Derived Column, Conditional Split, Aggregate, Merge Join, Multicast, and Sort, and with the Execute SQL task, to load data into the data warehouse.

Maintained log and audit information in SQL tables.

Hands-on experience creating jobs and scheduling SSIS packages.

CERTIFICATIONS

Microsoft Certified: Azure Fundamentals (AZ-900), 2024.

Microsoft Certified: Azure Data Engineer, 2024.

QUALIFICATIONS

Bachelor of Technology (B. Tech) from JNTU, Hyderabad.

Intermediate from the Board of Intermediate Education.

SSC from the Board of Secondary Education.

TECHNICAL SKILLS

Build Tools: Visual Studio, Azure Data Factory, SSDT tools, Azure Data Lake Storage, KingswaySoft, One Tab adaptor

Management Utilities: JIRA, Agile, Scrum, source versioning, Control-M, Bitbucket, Confluence

Databases: Oracle, MS SQL Server, Azure SQL

Languages & Scripting: T-SQL, PL/SQL, HTML, XML, Python, Spark, PySpark

IDE Tools: SSIS, SSRS, SSAS, Microsoft Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, D365, Salesforce

Continuous Integration Tools: Jenkins, Azure DevOps

Cloud: Azure, ADLS, Azure Data Box, Azure SQL DB, SQL Server, Azure Synapse, Azure Analysis Services, Databricks, Azure Cosmos DB, Azure Stream Analytics, Azure Event Hubs, Logic Apps, Event Grid, Azure DevOps, ARM templates

Reporting Tools: Power BI, SSRS, Crystal Reports

Version Control: Git

CAREER EXPERIENCE

Crane CPE, USA

Azure Lead Data Engineer, Mar 2023 - Present

Responsibilities:

Developed extensive ETL pipelines using Azure Data Factory, improving workflow efficiency.

Responsible for gathering and understanding business specifications, preparing technical specifications, and building Azure Data Factory pipelines.

Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.

Developed a deep understanding of the data sources, implemented data standards, and maintained data quality.

Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory and T-SQL, with ingestion into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL).

Created pipelines in ADF using linked services and datasets to extract, transform, and load data from sources such as Oracle into Azure SQL, Azure Data Lake Storage, and Azure Data Box.

Created pipelines, data flows, and complex data transformations and manipulations using ADF and PySpark on Databricks.

Worked on Azure Databricks, running Spark Python notebooks through ADF pipelines.

Created and maintained various shell and Python scripts to automate processes, and optimized MapReduce code and Pig scripts, including performance tuning and analysis.

Extracted large tables from Oracle to flat files using SQL*Plus and monitored the extracts continuously.

Automated script creation for data extraction from Oracle.

Used Jupyter Python notebooks to create data validation reports covering column names, data types, record counts, and sample data checks.
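
A sketch of the kind of notebook check this describes, comparing a source extract to the loaded target with pandas; the file names and join key are hypothetical:

    # Compare a source extract against the loaded target on column names,
    # data types, and record counts, then spot-check sample rows.
    # File names and the join key are hypothetical.
    import pandas as pd

    source = pd.read_csv("source_extract.csv")
    target = pd.read_csv("target_extract.csv")

    shared = source.columns.intersection(target.columns)
    report = {
        "source_rows": len(source),
        "target_rows": len(target),
        "missing_columns": sorted(set(source.columns) - set(target.columns)),
        "dtype_mismatches": {
            c: (str(source[c].dtype), str(target[c].dtype))
            for c in shared if source[c].dtype != target[c].dtype
        },
    }
    print(report)

    # Side-by-side sample comparison on a hypothetical business key.
    sample = source.sample(5, random_state=0)
    print(sample.merge(target, on="order_id", suffixes=("_src", "_tgt")))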

Involved in creating HiveQL queries on HBase tables and importing work order data into Hive tables efficiently.

Involved in converting MapReduce programs into Spark transformations using Spark RDDs in Scala and Python.

Used Pig Latin on the client-side cluster and HiveQL on the server-side cluster.

Used Databricks utilities (widgets) to pass parameters at run time from ADF to Databricks.
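
For reference, a minimal notebook-side sketch; the widget names would match whatever base parameters the ADF activity passes and are hypothetical here:

    # Databricks notebook cell: read run-time parameters passed by an ADF
    # Notebook activity via widgets. Widget and table names are hypothetical.
    dbutils.widgets.text("run_date", "")      # defaults apply in interactive runs
    dbutils.widgets.text("source_table", "")

    run_date = dbutils.widgets.get("run_date")
    source_table = dbutils.widgets.get("source_table")

    df = spark.table(source_table).where(f"load_date = '{run_date}'")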

Experience importing and exporting terabytes of data between HDFS and relational database systems using Sqoop.

Collaborated with cross-functional teams to gather requirements and produce design documents.

Worked in an Agile environment with monthly releases and daily stand-ups.

Implemented scheduled CI/CD pipelines promoting code from lower to higher environments.

Wrote Python scripts to parse XML and JSON data and load it into the database.
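
A self-contained sketch of this pattern; sqlite3 stands in for the actual target database so the example runs anywhere, and the file layouts are hypothetical:

    # Parse JSON and XML source files and load the records into a database.
    # sqlite3 keeps the sketch runnable; the real target was a SQL Server
    # database (e.g., via pyodbc). File layouts are hypothetical.
    import json
    import sqlite3
    import xml.etree.ElementTree as ET

    conn = sqlite3.connect("staging.db")
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, amount REAL)")

    # JSON: a list of order objects.
    with open("orders.json") as f:
        rows = [(o["id"], float(o["amount"])) for o in json.load(f)]

    # XML: <orders><order id="..."><amount>...</amount></order></orders>
    root = ET.parse("orders.xml").getroot()
    rows += [(o.get("id"), float(o.findtext("amount")))
             for o in root.findall("order")]

    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()
    conn.close()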

Installed the Oozie workflow engine to run multiple Hive jobs.

Implemented an ETL interface layer that abstracts the external security application from the security model implementation, using Azure Data Factory pipelines.

Implemented daily scheduled jobs using triggers in Azure Data Factory.

Performed various Azure Data Factory data flow transformations before loading, such as Derived Column and data conversion, for data scrubbing, including data validation checks during staging.

Created event handlers for error handling and debugging of the pipelines.

Environment: Oracle, Azure SQL, Azure Data Factory, Azure Data Lake, Azure Data Box, Python, MS Excel, Jupyter, Microsoft Teams, Azure Blob Storage, GitHub.

Prudential, USA

Azure Data Engineer & SSIS Developer, Sep 2021 - Mar 2023

Responsibilities:

Responsible for gathering and understanding business specifications, preparing technical specifications, and building SSIS packages and Azure Data Factory pipelines.

Implemented an ETL interface layer that abstracts the external security application from the security model implementation, using SSIS packages and ADF pipelines.

Performed various SSIS transformations before loading, such as Derived Column and data conversion, for data scrubbing, including data validation checks during staging.

Designed and developed pipelines, data flows, and complex data transformations and manipulations using Azure Data Factory (ADF) and PySpark on Databricks.

Created and provisioned multiple Databricks clusters for batch and continuous streaming data processing and installed the required libraries on the clusters.

Created external tables in Azure SQL Database for data visualization and reporting purposes.

Created Azure Data Factory pipelines to extract data from relational sources such as Oracle, SQL Server, and DB2, and non-relational sources such as flat files, JSON files, XML files, and shared folders.

Developed streaming pipelines using Apache Spark with Python.

Developed Azure Databricks notebooks to apply the business transformations and perform data cleansing operations.

Developed Databricks Python notebooks to join, filter, pre-aggregate, and process files stored in Azure Data Lake Storage.
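
A representative notebook sketch of such a job, assuming the lake is mounted under /mnt/datalake; the paths, columns, and join key are placeholders:

    # Join, filter, and pre-aggregate files stored in Azure Data Lake Storage
    # from a Databricks notebook. Mount point, columns, and keys are placeholders.
    from pyspark.sql import functions as F

    orders = spark.read.parquet("/mnt/datalake/raw/orders")
    customers = spark.read.parquet("/mnt/datalake/raw/customers")

    daily_sales = (orders
        .where(F.col("status") == "COMPLETE")                 # filter
        .join(customers, "customer_id")                       # join
        .groupBy("order_date", "region")                      # pre-aggregate
        .agg(F.sum("amount").alias("total_amount"),
             F.countDistinct("order_id").alias("order_count")))

    daily_sales.write.mode("overwrite").parquet("/mnt/datalake/curated/daily_sales")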

Implemented Spark jobs in Scala and Python, using DataFrames and the Spark SQL API for faster data processing.

Experience developing custom UDFs in Python to extend Hive and Pig Latin functionality.
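
One common way to extend Hive with Python is its streaming interface (SELECT TRANSFORM ... USING), where the script consumes tab-separated rows on stdin; a minimal sketch with a hypothetical column layout:

    #!/usr/bin/env python
    # Minimal Hive streaming "UDF": reads tab-separated rows on stdin and
    # writes transformed rows to stdout. Called from HiveQL with, e.g.,
    #   SELECT TRANSFORM(device_id, reading) USING 'python clean_udf.py'
    #   AS (device_id, reading) FROM raw_events;
    # The two-column layout is hypothetical.
    import sys

    for line in sys.stdin:
        device_id, reading = line.rstrip("\n").split("\t")
        value = max(0.0, float(reading or 0))   # clamp bad readings to zero
        print("%s\t%.2f" % (device_id.strip().upper(), value))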

Analyzed existing SQL scripts and redesigned them using PySpark SQL for faster performance.
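
Porting such a script typically means registering the source data as a temporary view and running comparable SQL through Spark; a small sketch with hypothetical names:

    # Re-run an existing SQL script's logic through PySpark SQL by exposing
    # the source data as a temporary view. Paths and names are hypothetical.
    (spark.read.parquet("/mnt/datalake/raw/transactions")
          .createOrReplaceTempView("transactions"))

    monthly = spark.sql("""
        SELECT customer_id,
               date_trunc('month', txn_date) AS txn_month,
               SUM(amount)                   AS monthly_total
        FROM transactions
        GROUP BY customer_id, date_trunc('month', txn_date)
    """)
    monthly.write.mode("overwrite").parquet("/mnt/datalake/curated/monthly_totals")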

Experience in creating reports from scratch using Power BI.

Created SSIS packages for file transfer from one location to another using the FTP task.

Created and maintained SSIS packages to extract, transform, and load data into SQL Server.

Experience creating and scheduling jobs, alerts, SQL Mail Agent notifications, and scheduled DTS packages.

Involved in ETL architecture enhancements to increase performance using the query optimizer.

Configured loading of data into slowly changing dimensions using the Slowly Changing Dimension wizard.

Designed various SSIS modules to fetch data into the staging environment based on the different types of incoming data.

Optimized Hive queries using best practices and appropriate parameters, working with Hadoop, YARN, Python, and PySpark.

Performed daily code check-ins to Bitbucket using Git Bash.

Executed and monitored the pipelines using Control-M.

Created stored procedures to load data into different tables.

Created ADF pipelines that execute SSIS packages through the integration runtime and developed pipelines to copy data between different servers and Blob storage in both directions.

Created event handlers for error handling and debugging of the packages.

Environment: SSIS, SQL Server 2019, Azure Data Factory, Azure SQL, Visual Studio, Control-M, Bitbucket, JIRA.

Abu Dhabi Ports, INDIA

BI Developer, Aug 2019 - Sep 2021

Responsibilities:

Responsible for gathering and understanding business specifications, preparing technical specifications, and building SSIS packages.

Understood the existing D365 security model and customer requirements.

Implemented an ETL interface layer that abstracts the external security application from the security model implementation in Dynamics CRM, using SSIS packages.

Performed various SSIS transformations before loading, such as Derived Column and data conversion, for data scrubbing, including data validation checks during staging.

Wrote T-SQL code to create cursors and handle data validations.

Designed and implemented stored procedures and triggers for automating tasks.

Worked as a developer creating complex stored procedures, SSIS packages, triggers, cursors, tables, views, and other SQL joins and statements for applications.

Created several lookup tables and updated a couple of existing ones.

Generated database SQL scripts and deployed databases, including installation and configuration.

Designed and implemented user login and security.

Resolved deadlock issues with the databases and servers in real time.

Responsible for ongoing maintenance of and change management for existing reports, and optimized report performance.

Utilized SSIS to extract data from the mainframe and load it into the new data warehouse.

Created eight OLAP cubes from the relational data warehouse with various measures, dimensions, and business-critical KPIs, representing aggregations in different ways, hierarchically and with custom groupings, using SSAS.

Stored the Insite data model in a SQL Server data warehouse using a star schema.

Created forty reports from the OLAP cubes using SSRS, and published and secured them with SharePoint Server.

Developed reports based on user input and published them using SharePoint Server.

Performed report testing and verification per internal reporting system requirements.

Provided application analysis, data modeling, metadata management, and integration testing for existing applications and for the new performance assessment and invoice visibility applications.

Created stored procedures to load data into different tables.

Created packages to migrate documents to SharePoint using KingswaySoft.

Created event handlers for error handling and debugging of the packages.

Created jobs to schedule the SSIS packages on a daily basis.

Environment: SSIS, SQL Server 2019, Power BI, D365, KingswaySoft.

Kansas Department of Corrections, INDIA

BI Developer, Mar 2017 - Jul 2019

Responsibilities:

Responsible for gathering and understanding business specifications, preparing technical specifications, and creating ADF pipelines.

Responsible for managing the BI environment, including the business model, data model, data sources, ETL tools, target data warehouse, data marts, OLAP analysis, and reporting tools.

Designed and developed SSIS (ETL) packages to validate, extract, transform, and load data from the OLTP system into the data warehouse and report data mart.

Understood the existing D365 security model and customer requirements.

Implemented an ETL interface layer that abstracts the external security application from the security model implementation in Dynamics CRM, using ADF pipelines.

Used various activities before loading, such as ForEach, Web, Copy Data, and Stored Procedure, for data scrubbing, including data validation checks during staging.

Developed SSIS packages using the Foreach Loop container in Control Flow to process all Excel files within a folder, the File System task to move files to an archive after processing, and the Execute SQL task to insert transaction log data into a SQL table.

Developed and deployed SSIS packages for ETL from OLTP and various sources to staging, and from staging to the data warehouse, using the For Each Loop container, Execute Package task, Execute SQL task, Send Mail task, Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Slowly Changing Dimension, and more.

Supported the production environment by scheduling the packages and making them dynamic with SQL Server package configurations.

Worked with partially and fully blocking transformations (Merge, Pivot, Union All, Aggregate, Row Sampling, Sort), with expertise in performance tuning of SSIS packages.

Created and managed event handlers, package configurations, logging, and system and user-defined variables for SSIS packages.

Used various SSIS tasks such as Conditional Split and Derived Column for data scrubbing, including data validation checks during staging, before loading the data into the data warehouse.

Designed, deployed, and maintained various SSRS Reports in SQL Server. Administered interface to organize reports and data sources, schedule report execution and delivery, and track reporting history using SSRS.

Created different types of reports such as Crosstab, Drill-down, Summary reports using SSRS.

Monitored the performance of project team members, providing and documenting performance feedback.

Created stored procedures to load data into different tables.

Handled errors and debugged the pipelines.

Created a generic template to load entity data into CRM based on load order.

Created jobs to schedule the pipelines on a daily basis.

Environment: ADF, SQL Server 2017, D365.

Comcast, INDIA

BI Developer, Aug 2016 - Mar 2017

Responsibilities:

Responsible for gathering and understanding business specifications, preparing technical specifications, and building SSIS packages.

Understood the existing D365 security model and customer requirements.

Implemented an ETL interface layer that abstracts the external security application from the security model implementation in Dynamics CRM, using SSIS packages.

Designed and developed SSIS packages for loading data from the DDM interface to staging, and performed various SSIS transformations before loading, such as Derived Column and data conversion, for data scrubbing, including data validation checks during staging.

Created stored procedures to load data into different tables.

Created event handlers for error handling and debugging of the packages.

Created jobs to schedule the SSIS packages on a daily basis.

Environment: SSDT tools, SQL Server, MS Dynamics CRM 2016 Online.

WIC Program (NMWIC), INDIA

BI Developer, Aug 2015 - Aug 2016

Responsibilities:

Responsible for gathering and understanding business specifications, preparing technical specifications, and building SSIS packages.

Understood the existing CRM 2016 security model and customer requirements.

Implemented an ETL interface layer that abstracts the external security application from the security model implementation in Dynamics CRM, using SSIS packages.

Designed and developed SSIS packages for loading data from the DDM interface to staging, and performed various SSIS transformations before loading, such as Derived Column and data conversion, for data scrubbing, including data validation checks during staging.

Created stored procedures to load data into different tables.

Created event handlers for error handling and debugging of the packages.

Created jobs to schedule the SSIS packages on a daily basis.

Environment: SSDT tools, SQL Server, MS Dynamics CRM 2016 Online.

CI Reporting, INDIA

MSBI Developer, Feb 2013 - Aug 2015

Responsibilities:

Developed the Reports module using the .NET 3.5 framework, Reporting Services, and SQL Server 2012.

Created and used SSIS packages with Slowly Changing Dimensions.

Created and used reports in Crystal Reports 2008/XI.

Created the test plan, user test cases, build control documents, and technical design documents.

Worked closely with customers and marketing to resolve issues and improve usability.

Performed unit, integration, and functional testing.

Involved in maintenance and support operations.

Environment: MSBI (SSIS, SSRS), SQL Server 2012.

Royal Bank of Scotland, INDIA

MSBI Developer, Aug 2011 - Feb 2013

Responsibilities:

Responsible for gathering and understanding business specifications, preparing technical specifications, and building SSIS packages.

Understood the existing CRM 2011 security model and customer requirements.

Implemented an ETL interface layer that abstracts the external security application from the security model implementation in Dynamics CRM, using SSIS packages.

Designed and developed SSIS packages for loading data from the DDM interface to staging.

Performed various SSIS transformations before loading, such as Derived Column and data conversion, for data scrubbing, including data validation checks during staging.

Created stored procedures to load data into different tables.

Created event handlers for error handling and debugging of the packages.

Created jobs to schedule the SSIS packages on a daily basis.

Environment: MSBI (SSIS, SSRS), SQL Server 2012, CRM 2011, CRM 2015.