Nirnay Reddy Rao - Data Engineer
[email protected]
469 459 6394
Location: Dallas, Texas, USA
Relocation: Yes
Visa: H1B

Summary:
9+ years of strong experience in Software Design, Analysis, Development, Implementation, and Testing of Object-Oriented Applications and Web-based Enterprise Applications.
Extensively worked with designing, developing, and implementing Big Data Applications using Microsoft Azure Cloud, AWS, and big data technologies like Apache Hive, Apache Spark & Spark SQL.
Developed ETL/ELT pipelines using big data and cloud technologies such as Apache Hive, Apache Spark, Azure Data Factory, and Azure Databricks.
Diagnosed and solved many complex data pipeline issues while building fault-tolerant and scalable ELT pipelines.
Worked with various big data file formats such as Apache Parquet, CSV, AVRO, and JSON while developing big data applications using Apache Hive and Apache Spark.
Worked with Azure Synapse Analytics to develop end-to-end ETL/ELT applications.
Developed reusable/generic pipelines that can be used for multiple data products/LOBs.
Designed and developed a generic notebook to validate the data between source and target across multiple stages of data processing (a sketch of this validation pattern appears after this list).
Created Databricks notebooks with Delta-format tables and implemented lakehouse architecture.
Worked with the Spark Core, PySpark, and Spark SQL modules of Spark.
Implemented control flow architecture for developing a secure, end-to-end big data application using Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage, Azure SQL DB, and Azure Key Vault.
Experience in the design and development of robust and highly scalable REST APIs.
Experience with cloud computing environments such as Amazon Web Services (AWS).
Managed operations and maintenance support for AWS cloud resources, including launching, maintaining, and troubleshooting EC2 instances, S3 buckets, Auto Scaling, DynamoDB, AWS IAM, Elastic Load Balancers (ELB), and API Gateway.
Experience in unified data analytics with Databricks: the Databricks workspace user interface, managing Databricks notebooks, Delta Lake with Python, and Delta Lake with Spark SQL.
Deployed Azure Databricks and Azure Data Factory (ADF) artifacts using Azure DevOps.
Proficient in using Maven scripts for building and deploying applications on web/app servers.
Hands-on experience in developing web applications using Spring framework modules such as Spring IoC, Spring MVC, Spring AOP, Spring Data, Spring Boot, and Spring Cloud Gateway.
Experience in using the code repository tools Bitbucket and GitHub.
Worked on a project involving Databricks, PySpark, and data warehousing concepts.
Good understanding of databases such as Oracle, MySQL, and MongoDB for managing tables, views, indexes, stored procedures, functions, triggers, and packages.
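For illustration, a minimal PySpark sketch of the kind of generic source-to-target validation notebook described above; the storage paths and the order_amount column are hypothetical placeholders, and the Delta read assumes a Spark session with the delta-spark package configured.

# Minimal sketch of a generic source-to-target validation step (PySpark + Delta Lake).
# Paths, container names, and the order_amount column are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("source-target-validation").getOrCreate()

source_df = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders/")
target_df = spark.read.format("delta").load("abfss://curated@account.dfs.core.windows.net/orders/")

# Row-count check: the cheapest signal that a load stage completed fully.
src_count, tgt_count = source_df.count(), target_df.count()
assert src_count == tgt_count, f"row-count mismatch: {src_count} vs {tgt_count}"

# Aggregate check on a numeric measure to catch silent truncation or casting errors.
src_sum = source_df.agg(F.sum("order_amount")).first()[0]
tgt_sum = target_df.agg(F.sum("order_amount")).first()[0]
assert src_sum == tgt_sum, f"sum mismatch on order_amount: {src_sum} vs {tgt_sum}"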


Work Experience:

Client: Ford, July 2021 - Till Date
Role: Senior Data Engineer

Roles and Responsibilities:
Integrated predefined modules that are utilized by the ihub/fsl application.
Understood and analyzed ETL requirements and identified solution elements for enhancement requests and change requests.
Predominantly worked on Azure Synapse Analytics, designing end-to-end pipelines and Spark applications using notebooks and loading the data into dedicated SQL pools.
Built fault-tolerant, scalable, and complex pipelines using Synapse pipelines.
Implemented CDC (Change Data Capture) logic for incremental data loads using Azure Synapse pipeline features (a sketch of this pattern appears after this list).
Created PySpark and Spark SQL scripts in Synapse notebooks for data transformations as per the given business requirements.
Built complex ETL/ELT pipelines for data processing in the Azure cloud using Azure Data Factory V2 and Azure Synapse dedicated SQL pools.
Delivered a complete end-to-end migration project from on-premises to the Azure cloud by developing an ELT processing framework using Azure Synapse Analytics.
Implemented data ingestion from SAP and JDE systems to read master and transactional data and process raw data.
Migrated data from sources to destinations with the help of Azure Data Factory (ADF).
Implemented curation jobs that apply business logic and compute curated data.
Developed business objects and exposed services for the TruView application (a Salesforce application).
Developed applications using CI build tools (Gradle, Maven) and continuous integration with Jenkins.
Involved in EDG (Enterprise Data Grid) application builds to support transition activities, including cutover activities, for multiple EDG projects such as Mooresville, ATOM OTC P3, and DPS Enhancements, with some interactions with the OMEGA ASPAC node.
Coordinated with other teams to take any necessary actions in case of job failures or redeployments.
Created notebooks in Databricks for loading data from source to target ADLS.
Developed Spark applications in PySpark and Spark SQL for data extraction, transformation, and aggregation from various sources.
Built and deployed the code using automated build tools and automated the testing process using Jenkins.
Set up an environment for data integration from different ERP (Enterprise Resource Planning) systems.
Developed Databricks notebooks using SQL.
Designated as the primary point of contact for production support for the implementation of the new framework using Azure Data Factory and Azure Databricks.
Built pipelines to move hashed and un-hashed data from Azure Blob Storage to the data lake.
Migrated on-premises data (SQL Server) to Azure Data Lake Store (ADLS Gen2) using Azure Data Factory (ADF) and Azure Synapse pipelines.
Created clustered and non-clustered indexes on the fact and dimension tables in the data warehouse for faster retrieval of data for complex analytical and reporting queries.
Generated daily reconciliation reports for business stakeholders.
Followed the continuous integration (CI) and continuous deployment (CD) process and ensured the developed code passed the SonarQube quality gate standards.
Worked on production issues: acknowledged and investigated the production incident tickets assigned to the group and provided resolutions based on priority and criticality within SLAs.
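As a concrete illustration of the CDC-style incremental load mentioned in the list above, here is a hedged PySpark sketch of a high-water-mark pattern in a Synapse/Databricks notebook; the audit table, paths, and column names are hypothetical, and the MERGE assumes a Delta-format target table rather than a dedicated SQL pool.

# Hedged sketch of a watermark-based incremental (CDC-style) load with a Delta MERGE.
# audit.load_log, curated.orders, and all column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cdc-incremental-load").getOrCreate()

# 1. Look up the last successful load time (the high-water mark).
last_wm = spark.sql(
    "SELECT max(loaded_at) AS wm FROM audit.load_log WHERE table_name = 'orders'"
).first()["wm"]

# 2. Read only the rows that changed since that watermark.
changes = (
    spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders/")
    .filter(F.col("modified_at") > F.lit(last_wm))
)
changes.createOrReplaceTempView("orders_changes")

# 3. Upsert the changed rows into the curated Delta table.
spark.sql("""
    MERGE INTO curated.orders AS t
    USING orders_changes AS s
      ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")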

Client: Morgan Stanley, July 2019 - June 2021
Role: Data Engineer

Responsibilities:
Worked in an Agile environment with daily stand-up meetings, pre-planning and planning meetings, and face-to-face communication.
Interacted with the client to understand the stories (problems), divide them into subtasks, and provide estimates for the tasks.
Involved in Initial Application Design Documents to support integrations with different components.
Responsible for setting up AngularJS framework to achieve binding data between HTML and JavaScript objects.
Implemented single page applications with AngularJS using custom directives and developed Angular routes using route providers.
Implemented multithreading, concurrency, exception handling, and collections wherever necessary in the code for enhancement purposes.
Developed single page applications using Angular 6.
Worked extensively with the Angular CLI to create components, services, pipes, and directives.
Designed and implemented new interfaces.
Implemented UI changes using Struts Validation Framework for UI validation and worked with Struts tag libraries.
Implemented SOAP and RESTful web services and their integration.
Integrated Hibernate ORM to populate and update customer data.
Implemented rules using the Drools engine and its integration.
Used Hudson for building and deployment of application into specific environments.
Used JIRA to keep track of bugs and issues and Maven as build tool.
Involved in Unit & Integration Testing for application.
Developed and executed unit test plans using JUnit, SoapUI, and Postman.
Involved in validating the test cases with the QA team to confirm whether they met business requirements.
Led the sprint team on deliverables and issues.
Checked logs and exception statements.
Supported the deployment team in configuring continuous integration and continuous deployments.

Client: CVS Health, June 2017 - July 2019
Role: Data Engineer

Responsibilities:
Involved in the client meetings, Architecture design and blueprinting phases of the project and provided estimates.
Developed and delivered REST-based web services (member portal and provider portal) on AWS Lambda as part of the team.
Worked on making applications more scalable and highly available in AWS (load balancing).
Migrated applications from the data center to AWS.
Performed S3 bucket creation, worked on bucket policies and IAM role-based policies, and customized the JSON templates (a boto3 sketch of this appears after this list).
Deployed the application using Jenkins and GitHub.
Merged developer copies with shared mainline server for Continuous Integration.
Created AWS CloudFormation templates for creating IAM roles and deploying the full architecture (creation of EC2 instances and their infrastructure).
Configured CloudWatch alerts.
Developed integration objects using the iWay ESB tool.
Used the Agile methodology for developing applications.
Worked on XML and JSON data structures.
Configured Data provider and schedule provider in iWay Service Manager in Dev and UAT domains.
Created SoapUI-based unit test cases for the services we developed.
Configured channels for clients in iWay (ESB).
Involved in Debugging and defect fixing for production defects.
Designed stored procedures and triggers, along with performance tuning for SQL.
Responsible for all stages of the software development process.
Implemented the application using the concrete principles laid down by several design patterns such as MVC-2, Business Delegate, Session Facade, Service Locator, Data Access Object, and Singleton.
Performed unit testing using JUnit.
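As an illustration of the S3 bucket and policy work above, a minimal boto3 sketch; the bucket name, account ID, and IAM role are hypothetical placeholders, not values from the project.

# Minimal boto3 sketch: create an S3 bucket and attach a JSON bucket policy.
# The bucket name, account ID, and IAM role below are hypothetical placeholders.
import json
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
bucket = "example-member-portal-assets"

# In us-east-1 the CreateBucketConfiguration argument must be omitted.
s3.create_bucket(Bucket=bucket)

# A role-based read policy, customized from a JSON template.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowAppRoleRead",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::123456789012:role/app-role"},
        "Action": ["s3:GetObject"],
        "Resource": f"arn:aws:s3:::{bucket}/*",
    }],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))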

Client: Atria Convergence Technologies, May 2014 - July 2016
Role: Data Engineer

Responsibilities:
Involved in discussions with client to understand existing system, technical details, scope, and their limitations.
Involved in the client meetings, Architecture design and blueprinting phases of the project and provided estimates.
Used AGILE methodology for developing the application.
Wrote low-level and high-level design documents and presented demos to the client to confirm solutions and approach.
Worked on upgrades of HealthEdge payor and connector releases and deployed the same upgrades in client environments with smoke and regression testing.
Created XSD-, XJB-, and XML-based SOAP web services, configured scripts for the new services, and deployed the Karaf files into the Camel-Karaf environment.
Designed and configured web portal customizations, merged the code pieces, maintained the Perforce repository, and deployed code into client environments.
Configured and deployed the web application using WebLogic.
Used JMS for server-side messaging.
Used XSLT for arranging the Data (XML Data) in the order required by the Client.
Wrote static queries (XML-based) using the XQuery framework.
Managed the scrum Jira tickets and tracked the scrum board for weekly client deliveries.
Developed and delivered a total of 9 SOAP based web services (Payee Bank Accounts, Supplier Terminations) as part of the team.
Created SoapUI-based unit test cases for the services we developed.
Worked on iWay (Enterprise Service Bus) and configured channels for clients using classic connectors.
Attended meetings with business analysts for new services to give delivery estimations.
Involved in Debugging and defect fixing for production defects.

Education:
Master's in Information Technology Management, Lindenwood University, MO, USA.
Bachelor's in Computer Science and Technology, SRM University, Chennai, India.

Certification:
AWS Certified Developer Associate (Amazon Web Services) - Issued Sep 2022