
Sandeep Kumar Chinta - Azure Data Engineer.
[email protected]
Location: Irving, Texas, USA
Relocation: Yes
Visa: H1B
PROFESSIONAL SUMMARY:
Overall 9 years in IT, with 6+ years of extensive experience as an Azure Data Engineer.
Developed and maintained end-to-end ETL data pipelines and worked with large data sets in Azure Data Factory.
Worked on migrating on-premises SQL databases to Azure Synapse data warehouse and Azure Data Lake Gen2 using Azure Data Factory.
Developed Spark applications using PySpark in Databricks for data extraction, transformation, and aggregation (ETL) from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
Practical understanding of data modeling concepts such as star schema modeling and snowflake schema modeling.
Researched and implemented various ADF components such as pipelines, activities, mapping data flows, datasets, linked services, triggers, and control flow.
Performed extensive debugging, data validation, error handling, transformations, and data clean-up analysis within large datasets.
Responsible for designing, building, validating, and maintaining heterogeneous databases and consolidating them into a single integrated data source.
Expert in developing SSIS Packages to extract, transform and load (ETL) data into data warehouse/data marts from heterogeneous sources.
Expert in writing parameterized queries, drill-through reports, and formatted SQL Server reports in SSRS 2005/2008/2008 R2 using data from ETL loads, SSAS cubes, and various heterogeneous data sources.
Collaborated with data scientists to build and deploy machine learning models, performed exploratory data analysis (EDA) in Databricks, and monitored and analyzed model performance, making adjustments as needed (a brief EDA sketch follows this summary).
Collaborated with other teams within the organization, such as the Azure DevOps team, to ensure smooth deployment and operation of Databricks solutions.
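A brief illustration of the exploratory data analysis style mentioned above: a minimal PySpark sketch, assuming a Databricks notebook where spark is predefined; the table and column names are hypothetical placeholders.

from pyspark.sql import functions as F

# Hypothetical table registered in the metastore.
df = spark.table("analytics.customer_usage")

# Shape, schema, and summary statistics.
print(df.count(), "rows x", len(df.columns), "columns")
df.printSchema()
df.describe().show()

# Null counts per column and a simple categorical distribution check.
df.select([F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]).show()
df.groupBy("plan_type").count().orderBy(F.desc("count")).show()
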
WORK EXPERIENCE (9 YEARS) (2014-2023)
WIPRO Limited - Technical Lead
Srinav Info Systems Pvt. Ltd., Hyderabad (Contract to WIPRO) - Technical Lead
AAC Software & IT Pvt. Ltd., Hyderabad - Sr. Software Engineer
CERTIFICATIONS:
Microsoft Certified Database Administrator Cert No: I324-8184
Microsoft Certified Azure DevOps Engineer Expert Cert No: H810-1930
Microsoft Certified Azure Administrator Cert No: H779-4106
Microsoft Certified Azure Data Fundamentals Cert No: I323-0940

EDUCATION QUALIFICATIONS
B.Sc. (Computer Science) from Osmania University, Hyderabad, India, 2010-2013.

PROFESSIONAL EXPERIENCE:
CLIENT: MICROSOFT (2021-2023)
SR. DATA ENGINEER
Roles and Responsibilities
As a Microsoft Partner, we offer solutions to our customers across the globe.
Led end-to-end implementation of data ingestion pipeline, applying advanced enrichment and aggregation techniques.
Moved CSV files from Azure Blob Storage to an on-premises SQL Server.
Developed scalable data pipelines and ETL processes using Azure Data Factory to process large datasets with a daily volume of 10 TB.
Scheduled jobs in flows and ADF pipelines.
Created stored procedures and scheduled them in the Azure environment.
Monitored produced and consumed datasets in ADF.
Created multiple datasets in Azure.
Migrated data from different sources to the destination with the help of ADF.
Created pipelines using the GUI in Azure Data Factory.
Created linked services for both the source and the destination servers.
Moved data from Hive to Azure SQL Database with the help of pipelines and data flows.
Analyzed large datasets with SQL to identify trends and draw insights, resulting in a 29% increase in operational efficiency.
Involved in loading and transforming large sets of structured, semi-structured, and unstructured data.
Extracted, parsed, cleaned, and ingested data.
Monitored system health and logs and responded to any warning or failure conditions.
Worked on Log Analytics to create custom queries and alerts based on the collected data, and set up alerts for specific events or performance thresholds (e.g., high CPU usage, cluster scaling events) in Databricks.
Developed Spark applications using PySpark and Spark SQL for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns (a representative sketch follows this list).
Optimized data processing performance by 60% and reduced time-to-insights by 45% using Databricks distributed systems and parallel processing methods.
Integrated Databricks with Azure Monitor to create alerts that trigger actions in response to specific Databricks events.
Worked with Databricks utilities (dbutils) to perform tasks such as parameterizing notebooks, chaining notebooks, and managing secrets (also shown in the sketch below).
Collaborated with DevOps engineers to develop automated CI/CD and test-driven development pipelines using Azure, as per client standards.
Authored technical designs from functional specifications and progressed the solution from design through the software development life cycle to implementation.
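A minimal sketch of the PySpark/Spark SQL ETL and dbutils usage described in the bullets above, assuming a Databricks notebook where spark and dbutils are predefined; the paths, widget names, secret scope, server, and table names are hypothetical placeholders, not values from this engagement.

# Parameterize the notebook with widgets (hypothetical names and defaults).
dbutils.widgets.text("input_path", "/mnt/raw/usage")
dbutils.widgets.text("load_date", "2023-01-01")
input_path = dbutils.widgets.get("input_path")
load_date = dbutils.widgets.get("load_date")

# Extract: read CSV and Parquet sources (schemas assumed to be compatible).
csv_df = spark.read.option("header", "true").option("inferSchema", "true").csv(f"{input_path}/csv/")
parquet_df = spark.read.parquet(f"{input_path}/parquet/")

# Transform: combine the sources and aggregate usage per customer with Spark SQL.
csv_df.unionByName(parquet_df).createOrReplaceTempView("usage_raw")
usage_agg = spark.sql(f"""
    SELECT customer_id,
           COUNT(*)          AS events,
           SUM(duration_sec) AS total_duration_sec
    FROM usage_raw
    WHERE event_date = '{load_date}'
    GROUP BY customer_id
""")

# Load: write the aggregate to Azure SQL over JDBC, reading the password from a
# secret scope instead of hard-coding it.
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=analytics"
sql_password = dbutils.secrets.get(scope="etl-scope", key="sql-password")
(usage_agg.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.customer_usage_daily")
    .option("user", "etl_user")
    .option("password", sql_password)
    .mode("append")
    .save())

# Chain a downstream validation notebook, passing the same parameter.
dbutils.notebook.run("validate_load", 600, {"load_date": load_date})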

CLIENT: ZENSAR TECHNOLOGIES (2019-2021)
AZURE DATA ENGINEER
Roles and Responsibilities
Built pipelines to copy data from source to destination in Azure Data Factory.
Worked on creating dependencies between activities in Azure Data Factory.
Created linked services for both the source and the destination servers.
Reduced migration costs of large data sets across multiple cloud providers by 50%.
Created automated workflows with the help of triggers.
Monitored produced and consumed datasets in ADF.
Optimized data pipelines to reduce costs by 30% while ensuring data integrity and accuracy.
Migrated data from different sources to the destination with the help of ADF.
Created stored procedures using common table expressions (CTEs).
Scheduled pipelines and monitored data movement from sources to destinations.
Transformed data in Azure Data Factory using ADF transformations.
Provided support for production applications by troubleshooting issues and developing, testing, and migrating databases.
Performed performance tuning of SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard.
Troubleshot data and validation issues.
Gathered requirements from client managers and implemented them as tasks per the requirements.
Debugged scripts developed by other team members.
Delivered workloads before deadlines.
Understood the business processes and application functionality relevant to the area, as well as related applications in adjacent areas.
Authored technical designs from functional specifications and progressed the solution from design through the software development life cycle to implementation.
Provided production support for applications by troubleshooting issues, proposing solutions, developing and testing fixes, and migrating solutions.
Obtained code reviews from the Senior Application Developer and ensured that all programming standards and policies were adhered to.
Automated manual data query tasks and modified existing software programs to enhance performance through indexing, normalization, and query tuning.
Diverse experience in all phases of the software development life cycle (SDLC), especially analysis, design, development, testing, and deployment of applications.
Knowledge of various file formats in HDFS such as Avro, ORC, and Parquet (illustrated in the sketch after this list).
Performed extensive debugging, data validation, error handling, transformations, and data clean-up analysis within large datasets.
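A minimal sketch of converting data between the lake file formats mentioned above (Parquet, ORC, Avro), assuming a Spark/Databricks runtime where spark is predefined and the spark-avro package is available; the paths are placeholders.

# Read a curated Parquet dataset (placeholder path).
df = spark.read.parquet("/mnt/curated/sales/")

# Basic sanity check before re-publishing the data in other formats.
if df.count() == 0:
    raise ValueError("Empty extract - aborting the load")

# Persist the same dataset as ORC and Avro for downstream consumers.
df.write.mode("overwrite").orc("/mnt/exports/sales_orc/")
df.write.mode("overwrite").format("avro").save("/mnt/exports/sales_avro/")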

CLIENT: IRI WORLDWIDE (2014-2019)
MSBI DEVELOPER
Roles and Responsibilities
Experienced in analyzing, designing, developing, installing, configuring, and deploying the MS SQL Server suite of Business Intelligence products: SQL Server Reporting Services 2008, SQL Server Analysis Services 2008, and SQL Server Integration Services.
Imported/exported data between different sources such as Oracle, Access, and Excel using the SSIS/DTS utility.
Expert in writing parameterized queries, drill-through reports, and formatted SQL Server reports in SSRS 2008/2008 R2 using data from ETL loads and SSAS cubes.
Experience in using recursive CTEs, CTEs, temp tables, and effective DDL/DML triggers to facilitate efficient data manipulation and data consistency and to support existing applications (a brief CTE sketch follows this list).
Skilled in error and event handling: precedence constraints, breakpoints, checkpoints, and logging.
Transferred data from one server to other servers using tools such as Bulk Copy Program (BCP) and SQL Server Integration Services (SSIS) 2008.
Defined data warehouses (star and snowflake schemas), fact tables, cubes, dimensions, and measures using SQL Server Analysis Services.
Strong experience in creating ad hoc, parameterized, linked, snapshot, drill-down, and drill-through reports using SSRS 2008.
Experience in creating configuration files to deploy SSIS packages across all environments.
Expertise in writing T-SQL queries, dynamic queries, sub-queries, and complex joins for generating complex stored procedures, triggers, user-defined functions, views, and cursors.
Supported the team in resolving SQL Reporting Services and T-SQL related issues; proficient in creating and formatting different types of reports such as cross-tab, conditional, drill-down, top N, summary, form, OLAP, and sub-reports.
Experience in ETL processes involving migrations and synchronization processes between two databases.
Experience in maintaining one-to-many and many-to-one relationships while moving data between two databases.
Designed and implemented migration strategies for traditional systems on Azure (lift and shift, Azure Migrate, and other third-party tools); worked on the Azure suite: Azure SQL Database, Azure Data Lake Storage (ADLS), Azure Data Factory (ADF) V2, Azure SQL Data Warehouse, Azure Analysis Services (AAS), and Azure Blob Storage.
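A minimal sketch of the parameterized T-SQL/CTE style referred to above, run from Python with pyodbc; the server, database, table, and column names are hypothetical placeholders.

import pyodbc

# Connect to SQL Server (hypothetical server and database; Windows authentication assumed).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=SalesDW;Trusted_Connection=yes;"
)

# The CTE ranks each customer's orders so the outer query returns only the latest one;
# the date filter is passed as a parameter rather than concatenated into the SQL.
query = """
WITH ranked_orders AS (
    SELECT CustomerID,
           OrderID,
           OrderDate,
           ROW_NUMBER() OVER (PARTITION BY CustomerID ORDER BY OrderDate DESC) AS rn
    FROM dbo.Orders
    WHERE OrderDate >= ?
)
SELECT CustomerID, OrderID, OrderDate
FROM ranked_orders
WHERE rn = 1;
"""

with conn:
    cursor = conn.cursor()
    cursor.execute(query, "2014-01-01")
    for row in cursor.fetchall():
        print(row.CustomerID, row.OrderID, row.OrderDate)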
