Mohan Palatya - Data Engineer
Email: [email protected] | Ph: +1 614-384-6978
Location: Berlin Heights, Ohio, USA
Relocation: Yes
Visa: H1B
PROFESSIONAL SUMMARY

9 years of professional experience as a Data Analyst, Data Engineer, and Data Science analyst, working with Python, Spark, AWS, and SQL to develop, test, and implement business application systems.
Proficient in SQL, SPSS, SAS, Tableau, Snowflake, Software Tool Management, and Server Administration with solid knowledge of Python, statistics, and Machine Learning models for data analysis.
Experience with data querying languages (SQL), scripting languages (Python), statistical/mathematical software (R, SAS, SPSS), JavaScript, Unix, and Unix scripting.
Seeking a challenging position as a Software Developer, BI Engineer, Data Engineer, or Data Scientist to apply skills and expertise in data modeling, statistical modeling, and statistical programming.
Experienced process-oriented Data Analyst with excellent analytical thinking, quantitative, and problem-solving skills using SQL, MicroStrategy, Advanced Excel, and Python.
Proficient in writing functional and unit tests using unittest/PyTest and integrating the test code with the build process.
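The unit-testing workflow above can be illustrated with a minimal PyTest sketch; the function and tests here are hypothetical examples, not code from any of the projects listed:

```python
# test_transforms.py - a minimal PyTest sketch (hypothetical example)

def normalize_amount(raw: str) -> float:
    """Strip currency symbols and thousands separators, return a float."""
    return float(raw.replace("$", "").replace(",", ""))

def test_normalize_plain():
    assert normalize_amount("1234.50") == 1234.50

def test_normalize_with_symbols():
    assert normalize_amount("$1,234.50") == 1234.50
```

Running `pytest` discovers and executes both tests automatically; a Jenkins build step can invoke the same command so that failing tests block a deploy.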
Experienced with version control systems like Git, GitHub, and Bitbucket to keep the versions and configurations of the code organized.
Strong knowledge of test methodologies and experience implementing performance testing best practices.
Experience in the SDLC life cycle, from requirements gathering to releasing into production.
Skilled in Python, with proven expertise in frameworks such as Django and Flask and libraries such as Pandas, NumPy, SciPy, Matplotlib, Pickle, Seaborn, NetworkX, urllib2, and MySQLdb (for database connectivity), to build microservice and monolithic applications.
Experience with Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.
Experience in using build/deploy tools such as Jenkins, Docker, and OpenShift for Continuous Integration and deployment.
Experience working with EC2 instances and S3 buckets in AWS using CloudFormation templates.
Administered business users and groups and assigned privileges accordingly in the Central Management Console; extensively used Business Objects Designer to create, maintain, and modify Universes for different business groups.
Supported daily operations by monitoring, maintaining, and promoting data quality; documenting, troubleshooting, and resolving day-to-day data issues; investigating issues in reports or analyses; and providing root cause analysis.
Followed the Data Science life cycle and SDLC methodologies (Waterfall and Agile) to develop software products.
Excellent knowledge of the market research, insurance, healthcare, and supply chain domains; experienced working with huge datasets and skilled in data analytics, data processing, and data management.
TECHNICAL SKILLS

Programming Languages: Python, R, Java, HTML, CSS, JavaScript, Scala, UNIX - Shell Scripting
Statistical Analysis Tools: SPSS, SAS, SQL, DB2, Tableau, crunch.io, Qlik, Power BI, Alteryx
Environment: SQL, PySpark, Snowflake, Agile, Jira, Java, Eclipse, Selenium WebDriver, TestNG, Cucumber, Maven, Git, GitHub, Jenkins
Cloud: AWS (EC2, S3, VPC, SNS, Lambda), Azure, Snowflake
Machine Learning and AI: Machine Learning, Deep Learning, Generative AI, predictive models, TensorFlow, PyTorch, Keras, LLMs
Database and Tools: SQL and NoSQL databases, MongoDB, Teradata, dashboard management, Docker & Kubernetes, APIs, Azure, AWS, data acquisition and management, testing and reporting, Big Data
Development and Tools: Git, Google Colab, MS Office, JIRA, EDA, ETL, web analytics tools, Elasticsearch, data structures, algorithms, CRM, CI/CD, metadata, A/B testing, customer segmentation
ETL Tools: BI Data Integrator (ETL), Ab Initio, Data Services, Informatica, DataStage
SDLC Methodologies: Agile-Scrum, Waterfall

________________________________________
PROFESSIONAL EXPERIENCE

Cigna | May 2021 - Present
Data Engineer

Extracted, Loaded, and Integrated large volumes of raw, complex healthcare data from various health plans and disparate sources.
Designed and developed ETL (Extract, Transform, Load) packages using Python to load data into the data warehouse (Teradata) from Oracle and MS SQL Server databases.
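As an illustration of such an ETL package, below is a minimal extract-transform-load sketch; sqlite3 stands in for the production Oracle/Teradata drivers, and the table and column names are hypothetical:

```python
import sqlite3  # stand-in for the Oracle / Teradata DB-API drivers in this sketch

def extract(conn) -> list[tuple]:
    """Pull raw rows from the source system."""
    return conn.execute("SELECT id, amount FROM claims").fetchall()

def transform(rows: list[tuple]) -> list[tuple]:
    """Drop null amounts and round to cents before the warehouse load."""
    return [(i, round(a, 2)) for i, a in rows if a is not None]

def load(conn, rows: list[tuple]) -> None:
    """Bulk-insert cleaned rows into the warehouse table."""
    conn.executemany("INSERT INTO dw_claims VALUES (?, ?)", rows)
    conn.commit()

# Demo with in-memory databases standing in for source and warehouse.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE claims (id INTEGER, amount REAL)")
src.executemany("INSERT INTO claims VALUES (?, ?)",
                [(1, 10.123), (2, None), (3, 7.5)])
dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE dw_claims (id INTEGER, amount REAL)")
load(dw, transform(extract(src)))
```

The same extract/transform/load split maps directly onto separate, individually testable functions in a production package.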
Performed data analysis in application and reporting databases and resolved inconsistencies; cleaned and visualized the data, analyzed it for missing values and skewness, and performed correlation analysis using heat maps, principal component analysis, and model construction.
Served as a subject matter expert and go-to person for internal and external questions on health data topics and conducted statistical analysis using SAS and SQL on large datasets.
Performed data mining and built statistical models; performed qualitative and quantitative data analytics and derived predictions on data behavior for healthcare data.
Experienced in different types of data modeling, such as star schema modeling; created joins and relationships between attributes according to the data model schema.
Generated advanced reports in Grid, Graph, and SQL mode, consisting of data analysis by combining a template with filters using MicroStrategy Web and Desktop.
Identify and modify source tables and columns in the warehouse required for creating the MicroStrategy project.
Integrated Snowflake SQL within Python and configured the Apache Airflow scheduler to automate the regular generation of business reports and transfer the outputs to Amazon S3 storage.
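The reporting pipeline described above can be sketched as follows; the Airflow DAG and the Snowflake/boto3 calls are indicated only in comments since they need live credentials, and all names (bucket, report, columns) are hypothetical:

```python
from datetime import date

def s3_report_key(report_name: str, run_date: date) -> str:
    """Build a date-partitioned S3 key for a generated business report."""
    return f"reports/{report_name}/{run_date:%Y/%m/%d}/{report_name}.csv"

def generate_report(rows: list[dict]) -> str:
    """Render query results (e.g. from the Snowflake connector) as CSV text."""
    header = ",".join(rows[0].keys())
    body = "\n".join(",".join(str(v) for v in r.values()) for r in rows)
    return header + "\n" + body

# In the Airflow DAG (hypothetical), a daily task would run the Snowflake
# query, call generate_report(), and upload the result with boto3:
#   s3.put_object(Bucket="analytics-reports",
#                 Key=s3_report_key("claims_daily", run_date),
#                 Body=generate_report(rows))
```

Keeping the key-building and rendering logic in plain functions like these makes the scheduled task easy to unit-test independently of Airflow.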

State Farm | July 2019 - May 2021
Data Engineer

Worked with the IT architect and program managers in requirements gathering, analysis, and project coordination.
Developed Data Integration Platform components/processes using Informatica Cloud Platform, Azure SQL Datawarehouse, Azure Data Lake Store, and Azure Blob Storage technologies.
Performed data mining and built statistical models; performed qualitative and quantitative data analytics and derived predictions on data behavior.
Created mapping documents with detailed source-to-target transformation logic, Source data column information, and target data column information.
Created ETL and data warehouse standards documents: naming standards, ETL methodologies and strategies, standard input file formats, and data cleansing and preprocessing strategies.
Developed Cloud integration parameterized mapping templates (DB and table object parameterization) for Stage, Dimension (SCD Type 1, SCD Type 2, CDC, and Incremental Load), and Fact load processes.
Implemented an internal Dashboard using a business intelligence tool to integrate all business transformations into a centralized dashboard, facilitating project status reviews and enhancing collaboration across teams.
Collaborated with cross-functional teams, including developers, data scientists, and project managers, to ensure the integration and success of Business Solutions.
Extracted data from Snowflake and pushed it into an Azure data warehouse instance to support reporting requirements.
Implemented processes and developed predictive models for targeted marketing strategies, including assessing distinct client categories and estimating the incremental impact of several marketing efforts by reviewing results.

KCS | April 2014 - May 2018
Senior Data Analyst

Led design, provided technical expertise, and developed a comprehensive Disney project dashboard integrating KPIs (key performance indicators), drivers, and barriers for streamlined visualization and decision-making.
Streamlined ServiceNow tickets and created an automated ticket classification system using Python and machine learning techniques such as sentiment analysis and NLP.
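A heavily simplified stand-in for such a ticket classifier is sketched below; the categories and training phrases are hypothetical, and a production system would use learned NLP models rather than this bag-of-words overlap:

```python
import re
from collections import Counter

# Hypothetical routing categories with a few example phrases each.
TRAINING = {
    "access":   ["password reset request", "cannot login to account"],
    "hardware": ["laptop screen broken", "replace faulty keyboard"],
    "network":  ["vpn connection drops", "wifi not reachable"],
}

def _bag(text: str) -> Counter:
    """Lowercase bag-of-words token counts."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

# One aggregate word-count "centroid" per category.
CENTROIDS = {label: sum((_bag(p) for p in phrases), Counter())
             for label, phrases in TRAINING.items()}

def classify_ticket(text: str) -> str:
    """Route a ticket to the category with the largest vocabulary overlap."""
    words = _bag(text)
    return max(CENTROIDS, key=lambda label:
               sum(min(words[w], CENTROIDS[label][w]) for w in words))
```

Swapping the overlap score for a trained model (e.g. TF-IDF plus a linear classifier) keeps the same `classify_ticket` interface while improving accuracy.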
Created SPSS macros to handle ad-hoc research efficiently, removing 60% of manual work, enhancing data analysis, and boosting operational effectiveness with a creative approach.
Extracted actionable insights, facilitating project improvements and performance optimization for the projects, and worked on a recommendation system and demand forecasting using ML techniques.
Identified and measured the success of product efforts through goal setting, forecasting, and monitoring of key business metrics; analyzed industry trends and provided support for technical projects.
Sourced data from many platforms to tell the story of what happened and to provide suggestions for continuously improving the business's performance in meeting consumer expectations.
Ensured high accuracy and data quality of files for optimal analysis, brand strategy, and product development by spearheading the gathering of business requirements, data manipulation, data cleaning, and preparation.
Produced compelling tables and visualizations using production reports with SPSS and SAS, elevating client presentations and enabling data-driven decisions.
Demonstrated proficiency in SPSS and Quantum scripts, skillfully executing data merging, stacking, weighting, tabulation, and hypothesis testing to enhance quality assurance and usability.
Engaged with customer needs to tailor survey questionnaires, leveraged Excel, SPSS, and SQL for precise data analysis, and conducted research and comprehensive statistical techniques to uncover significant insights.
Performed data extracts and worked with data from database sources including Oracle, SQL Server, DB2, NoSQL, PostgreSQL, and MySQL; analyzed data using tools such as Python, R, SAS, and SPSS.
________________________________________