
Harish A - Python Developer
[email protected] | 623 624 1283
Dallas, Texas, USA
SUMMARY:
Over 9 years of experience as a Python Developer; proficient in multiple languages and technologies
including Python, C, C++, SQL, REST APIs, and AWS.
Worked within an Agile framework, primarily following Scrum methodologies, to ensure iterative and incremental
delivery of software products.
Developed and deployed scalable RESTful APIs using FastAPI, delivering high-performance applications with
asynchronous request handling and fast response times.
Integrated FastAPI with various databases, including MongoDB, to create efficient data-driven applications,
ensuring seamless CRUD operations and optimized query performance.
Implemented JWT authentication and authorization in FastAPI applications, enhancing security and access control
for web services.
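For illustration, the token mechanics behind the JWT authentication described above can be sketched with only the Python standard library. This is a simplified HS256 sign/verify, not the FastAPI integration itself; a production application would typically use a vetted library such as PyJWT or python-jose, and the secret and claims shown here are hypothetical.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> bytes:
    # Base64url encoding without padding, as the JWT format requires
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign_jwt(payload: dict, secret: bytes) -> str:
    # Build header.payload, then append the HMAC-SHA256 signature
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = header + b"." + body
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

def verify_jwt(token: str, secret: bytes) -> dict:
    # Recompute the signature and compare in constant time
    signing_input, _, sig = token.rpartition(".")
    expected = _b64url(
        hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    ).decode()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    body = signing_input.split(".")[1]
    # Restore the padding stripped during encoding before decoding
    return json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))

token = sign_jwt({"sub": "user-42", "role": "admin"}, b"demo-secret")
claims = verify_jwt(token, b"demo-secret")
```

In a FastAPI route, `verify_jwt` would typically run inside a dependency that reads the `Authorization: Bearer` header and rejects requests whose signature check fails.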
Wrote Python scripts to parse XML documents and load the data into a database; developed web-based
applications using Python, CSS, and HTML.
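A minimal sketch of the parse-XML-and-load pattern above, using the standard library's `xml.etree` and an in-memory SQLite database. The sample document and table schema are hypothetical stand-ins for the actual feeds.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical sample document standing in for the real XML feeds
XML_DOC = """
<orders>
  <order id="1001"><customer>Acme</customer><total>250.00</total></order>
  <order id="1002"><customer>Globex</customer><total>99.50</total></order>
</orders>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

# Walk every <order> element and convert attributes/children to typed columns
root = ET.fromstring(XML_DOC)
rows = [
    (int(o.get("id")), o.findtext("customer"), float(o.findtext("total")))
    for o in root.iter("order")
]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

For large documents, `ET.iterparse` would stream elements instead of loading the whole tree into memory.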
Developed applications with XML, JSON, and XSL across PHP, Django (Python), and Rails environments.
Set up and managed CI/CD pipelines using Jenkins, streamlining the deployment process and reducing time-to-
production.
Integrated Jenkins with GitHub to automate build processes, ensuring code is consistently tested and deployed with
minimal manual intervention.
Developed custom Jenkins pipelines using Groovy scripts, optimizing the build and deployment workflows for
various applications.
Experienced in developing web-based applications using Python, Django, PHP, C++, XML, CSS, HTML, DHTML,
JavaScript, and jQuery
Experience in software development with Python using IDEs such as PyCharm, Sublime Text, Jupyter Notebook,
PyScripter, Spyder, PyStudio, and PyDev.
Contributed to various projects involving API Gateway, API testing using Postman, Pytest
Hands-on experience working in WAMP (Windows, Apache, MySQL, and Python/PHP) and LAMP (Linux,
Apache, MySQL, and Python/PHP) Architecture
Implemented Agile metrics and reporting, such as burn-down charts and velocity tracking, to monitor team
performance and identify areas for improvement.
Developed views and templates with Python and Django's view controller and templating language to create a user-
friendly website interface
Experienced in using Python libraries such as NumPy, SciPy, Matplotlib (for charts and graphs), BeautifulSoup,
Pandas DataFrames, NetworkX, python-twitter, PySide, Pickle, urllib2, and MySQLdb for database connectivity,
with IDEs including Sublime Text, Spyder, and PyCharm.
Experienced in Requirement gathering, Use Case development, Business Process flow, Business Process Modeling
Extensively used UML to develop use cases, class diagrams, and sequence diagrams
Well versed with the design and development of the presentation layer for web applications using technologies like
HTML5, CSS3, JavaScript/TypeScript, jQuery, AJAX, AngularJS, React JS, Bootstrap, JSON, XML
Strong background in building APIs and web services in Python.
Experience in using Adobe Flash, SVN, Eclipse, JIRA, GitHub, and CVS
Working knowledge of UNIX and Linux shell environments and their command-line utilities.
Developed and maintained complex data pipelines for processing and analyzing large datasets related to equities,
fixed income, derivatives, and other financial instruments in capital markets.

Implemented risk management models using Python and R to assess market, credit, and operational risks, ensuring
compliance with regulatory requirements.
Optimized trading algorithms and back-tested trading strategies for equities and fixed income products, resulting in
enhanced execution and profitability.
Experienced in writing SQL Queries, Stored procedures, functions, packages, tables, views, triggers
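The tables/views/triggers part of the SQL work above can be illustrated in a dialect-neutral way with SQLite via the standard library (stored procedures and packages are Oracle/PostgreSQL features not shown here; the schema is a hypothetical example).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

-- View exposing only accounts with a positive balance
CREATE VIEW active_accounts AS
    SELECT id, balance FROM accounts WHERE balance > 0;

-- Trigger that records every balance change for auditing
CREATE TRIGGER log_balance_change AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 150.0 WHERE id = 1")
conn.commit()

audit = conn.execute("SELECT * FROM audit_log").fetchall()
active = conn.execute("SELECT * FROM active_accounts").fetchall()
```

The same `CREATE VIEW` / `CREATE TRIGGER` DDL carries over to PostgreSQL and MySQL with minor syntax differences.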
Utilized GitHub Actions to create automated workflows, enabling continuous integration and deployment directly
from the GitHub repository.
Configured GitHub Actions to trigger builds, run tests, and deploy code, enhancing collaboration and efficiency in
development teams.
Implemented secure and scalable pipelines in Jenkins, managing environment-specific configurations and secrets to
protect sensitive data.
Configured and managed Elasticsearch, Logstash, and Kibana (ELK) stack for centralized logging and real-time
analytics.
Hands-on experience in using NoSQL libraries like MongoDB, Cassandra, Redis, and relational databases like
Oracle, SQLite, PostgreSQL, and MySQL
Worked on UNIX shell scripts for business processes and loading data from different interfaces to HDFS
Experience in deploying applications in heterogeneous Application Servers like Tomcat, WebLogic, and Oracle
Application Server
Good knowledge of Amazon AWS services such as EMR and EC2, which provide fast and efficient processing of
Big Data
Expertise in Python scripting with a focus on DevOps tools, CI/CD, and AWS Cloud Architecture
Utilized FastAPI's dependency injection system to manage resources such as database connections, authentication
mechanisms, and configuration settings, improving code modularity and maintainability.
Developed interactive and dynamic front-end applications using JavaScript, leveraging frameworks like React.js or
Vue.js to create responsive user interfaces and enhance user experience.
Implemented complex client-side logic in JavaScript, including DOM manipulation, event handling, and AJAX
calls, to build rich, interactive web applications.
Worked extensively with MongoDB for designing and managing NoSQL databases, handling large datasets, and
performing efficient indexing, aggregation, and querying.
Leveraged Jenkins for automated testing, code quality checks, and vulnerability scans, ensuring the codebase meets
high standards before deployment.
Managed multi-branch pipelines in Jenkins, allowing for parallel development and testing across different feature
branches.
Integrated Jenkins pipelines with Docker for containerized deployments, improving the portability and consistency
of applications across environments.
Orchestrated complex workflows in Jenkins and GitHub Actions, coordinating tasks across multiple tools and
platforms to achieve seamless CI/CD processes.
Experience in using Scikit-Learn and Statsmodels in Python for Machine Learning and Data Mining
Use of NLTK, OpenNLP & StanfordNLP for Natural Language Processing and sentiment analysis
Utilized tools like cron and systemd for job scheduling and service management in Linux environments
Experience with SQL Server Management Studio, SQL Developer, Toad, and MySQL GUI Tool
Utilized Confluence for documentation, collaboration, and knowledge sharing within the team.

TECHNICAL SKILLS:
Operating Systems: Windows 7/8/10/11, macOS, Linux (CentOS, Debian, Ubuntu)
Programming Languages: Python (3.x), R, Java (8, 11, 17)
Web Technologies: HTML5, CSS3, XML, jQuery, JSON, Bootstrap 5, React 18, JavaScript/TypeScript (ES6+)
Python Libraries/Packages: NumPy, SciPy, Pickle, PySide, PyTables, Pandas, Matplotlib, Seaborn, SQLAlchemy, httplib2, urllib2, Beautiful Soup, Boto3, Requests, cx_Oracle, NLTK, spaCy, scikit-learn
Python Frameworks: Flask, Django, FastAPI
IDEs: PyCharm, Visual Studio Code, Notepad++, Jupyter Notebook
Machine Learning and Analytical Tools: Supervised Learning (Linear Regression, Logistic Regression, Decision Tree, Random Forest, SVM, Classification), Unsupervised Learning (Clustering, KNN, Factor Analysis, PCA), NLP
Cloud Computing: AWS, Azure, GCP, Snowflake
Databases: MySQL, SQLite3, PostgreSQL, MongoDB, Teradata
Web Services/Protocols: TCP/IP, UDP, FTP, HTTP/HTTPS, SOAP, REST
Version Control: Git, GitHub, SVN
Build and CI Tools: Docker, Kubernetes, Maven, Jenkins
PROFESSIONAL EXPERIENCE:
Walgreens, Chicago, IL Oct 2022 - Till Date
Senior Python Developer
Responsibilities:
Involved in the Web/Application development using Python 3.x, HTML5, CSS3, AJAX, JSON, and jQuery
Installed and configured NVIDIA drivers to optimize GPU performance for machine learning and deep learning
tasks, ensuring compatibility with CUDA.
Deployed and managed NVIDIA CUDA Toolkit for high-performance computing, enabling the development of
GPU-accelerated applications.
Utilized NVIDIA cuDNN library for deep learning frameworks, significantly improving the training speed of neural
networks.
Used Confluence templates to standardize documentation practices and ensure consistency across projects.
Managed the integration of market data feeds from various providers (such as Bloomberg, Reuters) into trading
platforms, ensuring real-time data availability and accuracy.
Automated the generation of financial reports, including P&L, VaR, and stress testing, improving the timeliness and
accuracy of capital markets analytics.
Developed single-page application using Angular backed by MongoDB and Node.js
Designed and developed the application using Python Django and Angular Framework.
Developed Angular components to consume the endpoints created in the backend.
Designed and maintained databases and developed Python-based RESTful APIs (web services) using Flask,
SQLAlchemy, and PostgreSQL
Implemented security best practices for the ELK stack, including role-based access control (RBAC) and data
encryption.
Integrated ELK with various application and infrastructure monitoring tools to achieve end-to-end visibility.
Maintained and monitored CI/CD pipelines, troubleshooting issues and optimizing performance to ensure consistent
and high-quality software delivery.
Applied basic machine learning knowledge to develop and deploy predictive models, leveraging tools like scikit-
learn and TensorFlow.
Experimented with feature engineering, model selection, and hyperparameter tuning to improve model accuracy and
performance.
Automated trade execution processes within SWAP platforms using scripting and API integrations, reducing manual
intervention and operational risks.
Developed remote integrations with third-party platforms using RESTful web services; successfully
implemented Apache Spark and Spark Streaming applications for large-scale data processing.

Developed and optimized complex SQL queries, stored procedures, and functions in PostgreSQL.
Designed and implemented efficient database schemas, ensuring data normalization and integrity.
Managed PostgreSQL databases, including installation, configuration, upgrades, and patches.
Implemented and managed CI/CD pipelines using Jenkins, ensuring smooth integration and deployment processes.
Configured Jenkins to automate build, test, and deployment workflows, reducing manual intervention and improving
efficiency.
Integrated Jenkins with GitHub and BitBucket repositories for automated code checks and version control.
Output the parsed data as JSON/BSON and stored it in MongoDB
Used NLTK and StanfordNLP to process text data and create offline intelligence
Queried data from MongoDB and used it as input for machine learning models
Developed views and templates with the Django view controller and template language to create a user-friendly
website interface
Implemented data migration and ETL processes, transferring data between PostgreSQL and other databases
Developed data models and schema design in MongoDB, optimizing data storage and retrieval strategies for various
use cases, including real-time applications.
Implemented MongoDB replication and sharding to ensure high availability and scalability, catering to the needs of
distributed systems and large-scale applications.
Employed Agile methodologies (e.g., Scrum, Kanban) to manage software development projects, ensuring
continuous delivery of high-quality features through iterative sprints and regular feedback loops.
Utilized tools like Jenkins, GitLab CI, and Artifactory to manage continuous integration, continuous delivery, and
continuous testing workflows, enhancing development efficiency and reducing release cycles.
Integrated testing frameworks within CI/CD pipelines to ensure code quality and reliability, leveraging unit,
integration, and functional tests.
Managed infrastructure as code (IaC) with Terraform and AWS CloudFormation, automating the provisioning and
management of environments across multiple stages.
Deployed applications using Docker containers and Kubernetes, enabling scalable, fault-tolerant services across
different environments.
Use Python unit and functional testing modules such as unittest, unittest2, mock, and custom frameworks in-line
with Agile Software Development methodologies
Implement Test-Driven Development (TDD) practices and write comprehensive test cases using tools like PyTest
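The unittest/mock testing practice described above can be sketched as follows. The function under test and the fake HTTP client are hypothetical examples; only the standard library's `unittest` and `unittest.mock` are used.

```python
import io
import unittest
from unittest import mock

def fetch_status(client, url):
    """Return True when the service responds with HTTP 200."""
    response = client.get(url)
    return response.status_code == 200

class FetchStatusTest(unittest.TestCase):
    def test_ok_response(self):
        # Mock the HTTP client so no network call is made
        client = mock.Mock()
        client.get.return_value = mock.Mock(status_code=200)
        self.assertTrue(fetch_status(client, "https://example.test/health"))
        client.get.assert_called_once_with("https://example.test/health")

    def test_error_response(self):
        client = mock.Mock()
        client.get.return_value = mock.Mock(status_code=503)
        self.assertFalse(fetch_status(client, "https://example.test/health"))

# Run the suite programmatically instead of via unittest.main()
suite = unittest.TestLoader().loadTestsFromTestCase(FetchStatusTest)
result = unittest.TextTestRunner(stream=io.StringIO(), verbosity=0).run(suite)
```

In a TDD workflow, the failing test for `fetch_status` would be written first; PyTest can run the same `unittest`-style cases unchanged.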
Develop and test features for dashboards using Python, Java, Bootstrap, CSS, JavaScript, and jQuery
Integrated NVIDIA TensorRT to optimize inference workloads, reducing latency and improving throughput in AI
models.
Managed NVIDIA Docker containers to run GPU-accelerated applications in isolated environments, enhancing
resource allocation and scalability.
Implemented NVIDIA DeepStream SDK for real-time video analytics, leveraging GPU acceleration to process and
analyze video streams efficiently.
Configured and maintained NVIDIA GPU Cloud (NGC) containers for deploying AI and data science applications,
ensuring consistent performance across different environments.
Conducted performance tuning and optimization of Java applications, identifying and resolving bottlenecks.
Utilized Java for batch processing and job scheduling, ensuring efficient and reliable execution of tasks.
Manage datasets using Pandas DataFrames and MySQL; query the MySQL database from Python using the
MySQL Connector and MySQLdb packages to retrieve information
Cleaned and processed third-party spending data into manageable deliverables in specific formats using
Excel macros and Python libraries such as NumPy, SQLAlchemy, and Matplotlib
Used Pandas as an API to put the data as time series and tabular format for manipulation and retrieval of data
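A minimal sketch of the Pandas time-series/tabular workflow described above, assuming a small hypothetical set of daily price observations.

```python
import pandas as pd

# Hypothetical daily observations standing in for the managed datasets
prices = pd.DataFrame(
    {"price": [10.0, 10.5, 11.0, 10.8, 11.2, 11.5]},
    index=pd.date_range("2024-01-01", periods=6, freq="D"),
)

# Resample the daily series into a tabular weekly summary
weekly = prices.resample("W").agg(["mean", "max"])

# Retrieval: slice the time series by date range (inclusive string slicing)
jan_first_week = prices.loc["2024-01-01":"2024-01-05"]
```

The `DatetimeIndex` is what enables both the `resample` aggregation and the partial-string date slicing shown here.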
Analyze and format data using machine learning algorithms with Python's scikit-learn
Experience in Python, Jupyter, Scientific computing stack (NumPy, SciPy, Pandas, and matplotlib)

Manage code versioning with GitHub, Bitbucket, and deploy to staging and production servers and implement MVC
architecture in developing the web application with the help of Django framework.
Environment: Python 3.9, Django, HTML5/CSS, PostgreSQL, MS SQL Server, MySQL, JavaScript, Jupyter Notebook,
Vim, PyCharm, Shell Scripting, Angular, JIRA.
Paycor - Frisco, TX Nov 2020 - Sep 2022
Senior Python Software Engineer
Responsibilities:
Led projects in custom training of large language models (LLMs) using proprietary datasets to cater to domain-
specific language understanding and generation, significantly improving accuracy and relevance in responses.
Designed and executed end-to-end testing for trading systems, ensuring the reliability and performance of
applications under market conditions.
Engaged in cross-functional teams to enhance data governance practices, ensuring data quality and consistency
across all capital market applications and platforms.
Used Python packages such as Pandas, OpenCV, NumPy, SciPy, Seaborn, TensorFlow, Keras, Matplotlib,
scikit-learn, and NLTK for developing data pipelines and various machine learning (ML) algorithms.
Developed and deployed high-performance, scalable web APIs using FastAPI, ensuring fast response times and
efficient asynchronous processing.
Utilized FastAPI's robust dependency injection system for managing resources such as database connections and
authentication, leading to cleaner, more maintainable code.
Implemented advanced security features in FastAPI, including JWT authentication and role-based access control, to
protect sensitive data and enhance application security.
Integrated FastAPI with various data storage solutions, including relational and NoSQL databases, to create efficient
back-end services that handle complex data operations.
Leveraged PyTorch's DataLoader and Dataset classes to efficiently manage and batch large-scale datasets for
training and evaluation.
Applied transfer learning techniques to adapt pre-trained language models to specific domains, improving accuracy
and relevance in specialized applications.
Developed and maintained RESTful APIs using Python, Node.js, and Java, enabling seamless communication
between frontend and backend systems in full-stack applications.
Implemented ReactJS for building dynamic, responsive, and user-friendly interfaces, enhancing user experience
across web applications.
Integrated frontend ReactJS components with backend services via REST APIs, ensuring efficient data retrieval and
manipulation.
Optimized machine learning models using NVIDIA RAPIDS for GPU-accelerated data processing, reducing
training times and improving overall performance.
Developed custom applications utilizing NVIDIA's Jetson platform, enabling edge AI solutions with efficient power
consumption and high computational power.
Managed GPU clusters with NVIDIA packages, enabling parallel processing for large-scale data analysis and AI
model training.
Conducted performance tuning and troubleshooting on SWAP platforms, improving system reliability and user
experience.
Implemented risk management frameworks within SWAP platforms, ensuring compliance with regulatory
requirements and minimizing exposure to market fluctuations.
Designed and executed test cases for SWAP platform upgrades, ensuring seamless transitions and minimal
downtime during deployments.
Implemented data transformation and aggregation tasks using PySpark DataFrames and RDDs, ensuring efficient
and scalable processing of big data.

Leveraged PySpark's built-in functions and SQL integration to perform complex data queries and manipulations,
improving data analysis capabilities.
Utilized PySpark for real-time data streaming and processing, integrating with Apache Kafka and other messaging
systems for low-latency applications.
Applied machine learning algorithms using PySpark's MLlib library for tasks such as classification, regression,
clustering, and recommendation systems.
Added support for Amazon AWS S3 and RDS to host static/media files and the database in Amazon Cloud, and
involved in front end utilizing Bootstrap and React for page design.
Built interactive, responsive front-end applications using JavaScript, enhancing user experience with dynamic
content rendering and seamless integration with back-end APIs.
Leveraged modern JavaScript frameworks like React or Vue.js to develop modular, reusable UI components,
improving code maintainability and scalability.
Implemented AJAX for asynchronous data retrieval in JavaScript applications, providing real-time updates and
smooth user interactions without reloading the page.
Applied JavaScript ES6+ features, such as arrow functions, destructuring, and async/await, to write cleaner, more
efficient code and improve application performance.
Collaborated with cross-functional teams to design and implement RESTful APIs, using FastAPI as the back-end
framework and JavaScript as the front-end scripting language.
Ensured front-end and back-end integration was seamless by testing API endpoints and debugging JavaScript code,
resulting in smooth data flow and robust application functionality.
Developed RESTful APIs using Flask, integrating with various databases such as MySQL, PostgreSQL, and MongoDB.
Designed and implemented web applications using Flask, following best practices in software architecture and
design patterns.
Provided training and support to users of SWAP platforms, ensuring they are proficient in utilizing the systems to
their full potential.
Assisted in the migration of financial data and trade records to new SWAP platforms, ensuring data integrity and
continuity during transitions.
Implemented macros and plugins in Confluence to extend its functionality and tailor it to specific project needs.
Actively participated in the continuous improvement of documentation processes, leveraging Confluence's
capabilities to enhance knowledge management and team productivity.
Deployed Flask applications on cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), leveraging
services like EC2, S3, or Kubernetes.
Containerized Flask applications using Docker, facilitating easy deployment and scalability across different
environments.
Optimized AWS resource deployment and management using Boto3, achieving a 40% increase in infrastructure
provisioning efficiency.
Integrated Amazon AWS S3 and RDS to host static/media files and database instances in the cloud, enhancing data
storage and management capabilities.
Designed, deployed, and managed multiple applications on AWS, leveraging a range of services (EC2, S3, RDS,
VPC, IAM, ELB, EMR, CloudWatch, Route 53, Lambda, and CloudFormation) to ensure high availability, fault
tolerance, and scalability.
Developed and implemented comprehensive monitoring and logging solutions using Prometheus, Grafana, and the
ELK stack to ensure real-time tracking of application performance, health, and infrastructure metrics.
Configured AppDynamics agents to monitor key performance indicators (KPIs) and business transactions in real-
time.
Created custom dashboards and reports in AppDynamics to track application health, performance, and user
experience.
Analyzed AppDynamics data to diagnose application issues, optimize performance, and ensure SLA compliance.

Collaborated with development and operations teams to implement proactive monitoring and incident response
strategies using Catchpoint and AppDynamics.
Provided training and support for team members to effectively utilize ELK, Catchpoint, and AppDynamics for
performance monitoring and analysis.
Implemented a CI/CD pipeline with Docker, Jenkins, and GitHub by virtualizing servers using Docker for the Dev
and Test environments, achieving needs through configuring automation using containerization.
Designed and deployed highly available, scalable, and secure Kubernetes clusters for production environments,
ensuring seamless application performance and reliability.
Deployed and managed containerized applications using Docker and Kubernetes on Linux hosts.
Developed robust and scalable applications using Java, adhering to object-oriented design principles and best
practices.
Designed and implemented RESTful APIs and web services using Java frameworks such as Spring Boot.
Utilized Java for backend development, integrating with databases, external APIs, and third-party services.
Implemented multithreading and concurrency in Java applications to enhance performance and responsiveness.
Effectively communicated key findings and insights to multiple stakeholders, facilitating data-driven decisions
through presentations and reports using MS PowerPoint, Tableau, and Jupyter Notebook.
Environment: PL/SQL, Python, HTML5, CSS3, JavaScript, PySpark, Autosys, Azure, Databricks, Modeling and
Classification, Snowflake, Alteryx, Power BI, SQL server plus, ETL, Tableau, Git Bash, Bitbucket, React, NumPy, Pandas,
MongoDB

Barclays - New York Jan 2019 - Oct 2020
Sr. Full Stack Python Developer
Responsibilities:
Designed and implemented Django models, views, templates, and forms for dynamic and scalable web applications.
Utilized Django ORM to efficiently interact with relational databases like PostgreSQL, MySQL, and SQLite.
Implemented authentication and authorization using Django's built-in system or third-party libraries like Django
REST framework.
Designed email marketing campaigns and created web forms using the Python/Django framework, ensuring efficient
data handling.
Integrated PyTorch with AWS SageMaker, Google Cloud AI Platform, and Azure ML for scalable model training
and deployment.
Created methods (GET, POST, PUT, DELETE) to interact with API servers and tested RESTful APIs using Postman.
Loaded CloudWatch Logs to S3, then processed them using Kinesis Streams.
Implemented Django Forms and Crispy Forms for user data, login, and signup functionalities.
Applied transfer learning techniques using pre-trained models like ResNet, EfficientNet, and GPT to improve model
performance.
Implemented distributed training using PyTorch's DistributedDataParallel to leverage multiple GPUs, reducing
training times for large-scale models.
Utilized PyTorch Lightning for streamlined and flexible model training, validation, and testing workflows.
Integrated frontend frameworks such as React.js and Angular with Django templates or Django REST framework
for interactive UIs.
Utilized Power BI's REST API to automate report and dataset management tasks, enhancing operational efficiency.
Implemented CI/CD pipelines with Azure DevOps for automated build, test, and deployment processes, ensuring
continuous delivery.
Configured and managed user roles and permissions in Jenkins, GitHub, BitBucket, and Artifactory to ensure
security and compliance.

Automated notification and reporting processes in Jenkins, providing real-time feedback on build and deployment
status.
Conducted regular maintenance and updates of Jenkins, GitHub, BitBucket, Artifactory, and Salt environments to
ensure optimal performance and security.
Collaborated with development, QA, and operations teams to streamline the CI/CD processes and improve
deployment times.
Documented CI/CD processes and best practices, providing training and support to development teams on the use of
Jenkins, GitHub, BitBucket, Artifactory, and Salt.
Set up CI/CD pipelines for Angular applications using tools like Jenkins and GitHub Actions.
Strong understanding of JavaScript fundamentals, including ES6/ES7 features such as arrow functions, async/await,
destructuring, and spread/rest operators.
Designed and implemented containerized applications using Docker, ensuring consistency across development,
testing, and production environments.
Developed Dockerfiles for various applications, optimizing them for performance and security.
Managed Docker container orchestration using Kubernetes, deploying and scaling applications efficiently.
Configured and maintained Kubernetes clusters, ensuring high availability and fault tolerance of applications.
Developed scripts using Perl, Python, UNIX shell, and SQL.
Developed ETL processes in AWS Glue to migrate data from sources like S3 and Parquet/Text Files into AWS
Redshift.
Used AWS Glue catalog with crawlers to query data from S3 using AWS Athena.
Created Lambda functions to run AWS Glue jobs based on S3 events.
Utilized various AWS services (S3, EC2, AWS Glue, Athena, Redshift, EMR, SNS, SQS, DMS, and Kinesis) for
data extraction and processing.
Created PySpark to bring data from databases to Amazon S3. Optimized PySpark jobs to run on Kubernetes clusters.
Designed Azure Data Factory Pipelines for data extraction and set up triggers and monitoring alerts.
Integrated Azure Logic Apps into data pipelines for automated workflows and notifications.
Working with monitoring tools (Nagios, Zabbix, Prometheus) for Linux system performance.
Used Python libraries (Pandas, OpenCV, NumPy, Seaborn, TensorFlow, Keras, Matplotlib, Sci-kit learn, and
NLTK) for data pipelines and machine learning algorithms.
Implemented Kubernetes deployment strategies, including rolling updates and canary deployments, to minimize
downtime during releases.
Utilized Kubernetes Helm for managing application deployments and maintaining consistency across environments.
Set up and managed Kubernetes namespaces, services, and controllers to facilitate microservices architecture.
Monitored and optimized containerized applications using Kubernetes-native tools such as Prometheus, Grafana,
and Kubernetes Dashboard.
Conducted data preprocessing and feature engineering using PySpark.
Implemented data validation and quality checks with PySpark.
Optimized PySpark job performance with techniques like partitioning, caching, and broadcasting variables.
Deployed PySpark applications on cloud platforms like AWS EMR, Google Dataproc, or Azure HDInsight.
Integrated PySpark with data storage systems like HDFS, S3, and relational databases.
Collaborated with teams to design end-to-end data workflows using PySpark.
Managed AWS cloud resources, including EC2, S3, VPCs, ELB, and RDS.
Developed ETL workflows using Alteryx and cleaned data from various sources.
Visualized data analysis results with interactive dashboards and reports.
Built streaming pipelines for event analysis using Spark, Kafka, Presto, and Snowflake.
Ingested raw data into Azure Data Lake and triggered transformation workflows in Azure Databricks.

Loaded transformed data into Azure SQL Data Warehouse (Azure Synapse Analytics) and created views using T-
SQL.
Developed Airflow operators to interact with services like EMR, Athena, S3, DynamoDB, Snowflake, and Hive.
Utilized Python libraries (Pandas, NumPy, Seaborn, SciPy, Matplotlib, Scikit-learn, NLTK) for machine learning.
Employed Dependency Injection and Middleware in ASP.NET Core.
Automated scripts and workflows using Apache Airflow and shell scripting.
Proficient in writing Lambda functions for business logic and event-driven workflows.
Integrated AWS Lambda with other AWS services to build serverless architectures and microservices-based
applications.
Environment: Python 3.x, Django, NumPy, Pandas, Seaborn, Tableau, Beautiful Soup, HTML5, CSS/CSS3, Bootstrap,
XML, JSON, Apache Spark, Linux, Git, Amazon S3, Jenkins, MySQL, MongoDB, T-SQL, React, Snowflake, Apache
Airflow, AWS Glue.

ROYAL IT PARK SERVICES - Hyderabad, India Jun 2016 - Dec 2018
Python Developer
Responsibilities:
Responsible for gathering requirements and creating Python modules for internal purposes.
Designed and built entire frameworks from scratch based on the requirements.
Developed scripts using Pandas to perform read/write operations on CSV files.
Manipulated and compared data by columns for efficient data processing.
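The CSV read/write and column-comparison work above can be sketched with Pandas. The two small CSV payloads are hypothetical stand-ins for the real files, read from in-memory buffers so the example is self-contained.

```python
import io

import pandas as pd

# Hypothetical CSV content standing in for the actual files
csv_before = io.StringIO("id,qty\n1,5\n2,7\n3,9\n")
csv_after = io.StringIO("id,qty\n1,5\n2,8\n3,9\n")

before = pd.read_csv(csv_before)
after = pd.read_csv(csv_after)

# Column-wise comparison: align rows on id, then find changed quantities
merged = before.merge(after, on="id", suffixes=("_before", "_after"))
changed = merged[merged["qty_before"] != merged["qty_after"]]

# Write the differences back out as CSV
diff_csv = changed.to_csv(index=False)
```

With real files, the `io.StringIO` buffers would simply be replaced by file paths in `read_csv`/`to_csv`.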
Developed a GUI using Python and Django to dynamically display test block documentation.
Implemented features of Python code using a web browser interface.
Created and updated views using Python and Django view controller and template language.
Added new functionalities to websites by enhancing existing views and templates.
Evaluated the performance of NLP models using metrics such as accuracy, precision, recall, and F1-score.
Utilized Python for unit tests to analyze differences among hotel clusters.
Automated infrastructure and application deployments using Kubernetes, streamlining CI/CD processes.
Configured and managed persistent storage solutions for stateful applications within Kubernetes.
Ensured security and compliance by implementing network policies, secrets management, and RBAC within
Kubernetes clusters.
Troubleshot and resolved issues related to Docker containers and Kubernetes clusters, ensuring minimal
downtime and high performance.
Integrated Docker and Kubernetes with CI/CD tools like Jenkins, GitLab CI for automated deployment pipelines.
Developed web-based applications using Python and Django for large dataset analysis.
Created Python scripts for data access and analysis to support process and system monitoring and reporting.
Wrote RDBMS SQL queries to query databases, handling data retrieval and error management.
Used Python IDEs such as PyCharm and Sublime Text for development and unit testing.
Created and deployed AWS EC2 environments for proof of concept design assumptions.
Developed an embedded software data-driven test automation framework in Linux/Python.
Designed test cases and authored test plans.
Built web applications using Python, Django, AWS, J2EE, PostgreSQL, MySQL, Oracle 10g, and MongoDB.
Generated capacity planning reports using Python packages like Numpy and Matplotlib.
Developed custom features and applications using Python, Django, HTML, and CSS.
Debugged and tested applications, fine-tuning performance for optimal operation.
Worked with GCP Dataproc, GCS, Cloud Functions, and BigQuery.

Managed data migration between GCP and Azure using Azure Data Factory.
Involved in writing SQL queries, implementing functions, triggers, cursors, object types, sequences, indexes, and
more.
Developed and tested numerous features for database operations.
Managed development environments using bug-tracking tools like JIRA and version control systems such as Git,
GitLab, and SVN.
Environment: Python, SQL, Data Analysis, SQL Server Reporting Services, Azure, Microsoft Team Foundation Server,
Airflow, GCP, Bash, Shell, React

Sacrosanct Info - Hyderabad, India Jan 2014 - Jun 2016
Python Developer
Responsibilities:
Designed and implemented RESTful APIs using ASP.NET Web API, facilitating seamless integration with
third-party services.
Developed and maintained enterprise-level web applications using ASP.NET MVC and Entity Framework,
improving system performance by 30%.
Integrated cloud services using Azure for scalable and reliable data storage solutions.
Assisted in developing web applications using ASP.NET Web Forms and SQL Server, contributing to successful
project deliveries.
Involved in using collections for manipulating and looping different objects.
Designed Bash scripts to control data flow from the PostgreSQL database.
Managed URLs and application parameters using Django configuration.
Configured NoSQL databases like Apache Cassandra and MongoDB to increase compatibility with Django.
Responsible for debugging projects monitored on JIRA.
Generated Django forms to record data.
Used PyTest for writing test cases.
Extensive use of version controlling systems like GIT and SVN.
Environment: Python, Django, MySQL, NoSQL, Git, SVN, unittest, Linux, Windows.

EDUCATION:
B. Tech in CSE, JNTUH 2014