
Sahana - Python Developer
[email protected]
Location: Charlotte, North Carolina, USA
Relocation: Open to relocation
Visa: H1B
Professional Summary

10+ years of experience in Python software development, SOX compliance, KYC/AML, and SQL for data management, specializing in system automation, process streamlining, system integration, and project implementation.
Expertise in developing web-based OpenStack applications for large dataset analysis using Python Django/Flask.
Experience implementing MVT/MVC architecture in server-side applications.
Connected Python APIs to Spark Core via the Spark shell; worked with Python IDEs such as NetBeans, PyCharm, PyStudio, Eclipse, and Sublime Text.
Proficient in Python libraries such as Requests, NumPy, SciPy, Matplotlib, and Pandas throughout the development lifecycle.
Experience with Jupyter Notebooks and the Anaconda distribution for Python development.
Utilized TensorFlow and Python libraries for high-performance numerical and scientific calculations.
Extensive experience with AWS Cloud Platform, including S3, EC2, Autoscaling, RedShift, DynamoDB, Route 53, RDS, Glacier, EMR, ECS, CloudWatch, IAM, CloudFormation, ELB, CloudFront, and EBS.
Experience using Python with Boto3 to supplement automation tasks with Ansible and Terraform, including encrypting EBS volumes and scheduling Lambda functions (illustrated in the sketch at the end of this summary).
Implemented data warehouse solutions on Amazon Redshift, Oracle, and SQL Server.
Contributed to developing serverless architecture on AWS using services like Lambda, S3, API Gateway, and Step Functions.
Worked with JSON-based RESTful Web services and integrated them with AWS services like EC2, Lambda, ELB, and CloudWatch.
Utilized Apache Airflow, Kafka, and RabbitMQ for asynchronous communication and real-time updates between microservices.
Worked with Azure Storage, Data Factory (V2), Blob Storage, and Data Lake Storage for data ingestion, processing, and storage.
Orchestrated containerized applications using Kubernetes, optimized resource utilization, and implemented clusters on AWS and Azure.
Developed consumer-facing applications using Python, Django, HTML, and CSS, with experience in full SDLC, agile methodologies, and various IDEs.
Proficient in SQL and NoSQL databases, including MySQL, PostgreSQL, Oracle, MongoDB, Cassandra, and HBase.
Experience in Apache Spark Streaming jobs using PySpark for faster data processing and querying with Spark SQL.
Served as a subject matter expert, providing critical insights that guided decision-making and optimized business operations.
Extensive experience in developing RESTful APIs using Python, Flask, and FastAPI, enabling seamless integration and communication between microservices.
Skilled in using Python for data analysis, transformation, and visualization, leveraging libraries like Pandas, Matplotlib, and Seaborn.
Implemented automated testing frameworks using Python's unittest and pytest libraries, ensuring code quality and reliability.
Developed and optimized data pipelines in Python for ETL processes, improving data ingestion and processing efficiency.
Experience in writing Python scripts for data scraping, processing, and analysis, using tools like Beautiful Soup and Scrapy.
Utilized Python's threading and multiprocessing modules for parallel processing, enhancing application performance.
Proficient in containerizing Python applications using Docker, facilitating deployment and scalability across different environments.
Created custom Python packages and modules to streamline repetitive tasks and promote code reuse.
Expertise in writing Python-based CLI tools for automation and system administration tasks.
Experience in using Python for machine learning model development and deployment with libraries like scikit-learn and TensorFlow.
Integrated Python applications with databases using ORM libraries like SQLAlchemy and Django ORM, ensuring efficient data management.
Applied Python in cloud automation, utilizing AWS SDKs and APIs to automate infrastructure and application deployment.
Developed real-time data processing applications using Python and Apache Kafka for high-throughput environments.
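
Illustrative sketch of the kind of routine Boto3 automation referenced in this summary: scheduling an existing Lambda function with an EventBridge cron rule. The function and rule names are hypothetical placeholders, and AWS credentials are assumed to be configured.

# Sketch: schedule an existing Lambda function with an EventBridge cron rule.
# Names (nightly-cleanup, cleanup-handler) are hypothetical placeholders.
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

FUNCTION_NAME = "cleanup-handler"   # assumed to exist already
RULE_NAME = "nightly-cleanup"

# Create (or update) a rule that fires every night at 02:00 UTC.
rule = events.put_rule(
    Name=RULE_NAME,
    ScheduleExpression="cron(0 2 * * ? *)",
    State="ENABLED",
)

# Allow EventBridge to invoke the function.
fn = lambda_client.get_function(FunctionName=FUNCTION_NAME)
lambda_client.add_permission(
    FunctionName=FUNCTION_NAME,
    StatementId=f"{RULE_NAME}-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)

# Point the rule at the function.
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{"Id": "1", "Arn": fn["Configuration"]["FunctionArn"]}],
)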


Technical Skills

Programming Languages & Frameworks Python, Scala, SQL, Java, JavaScript, HTML/CSS, React, Angular, Django, Shell Scripting
Project management tools Microsoft Office (Word, Excel, SharePoint, Visio), JIRA
Web Technologies HTML, JSON, XSL, CSS3, JavaScript, AJAX, jQuery, XML, Web-Services
Databases Oracle, MySQL, MongoDB, SQL Server, NoSQL, SQLite, PostgreSQL
IDEs and tools Eclipse, Sublime Text, PyStudio, PyCharm, NetBeans
OS & Environment Windows XP, Windows, Linux, Unix, Ubuntu
Version controllers SVN, GitHub, GitLab
Development Methodologies Agile, Scrum, Waterfall
AWS Services EMR, S3, EC2, Redshift, Lambda, DynamoDB, RDS, SNS, Glue, SQS
ETL/BI Tools Informatica, SSIS, Tableau, Power BI, SSRS
CI/CD Azure DevOps, Jenkins, Ant, Maven
Defect tracking tools JIRA (Zephyr)
Operating Systems Linux, Windows, Ubuntu, Unix
Databases (RDBMS/NoSQL) Oracle, SQL Server, Cassandra, Teradata, PostgreSQL, HBase, MongoDB
Programming Languages/Scripting SQL, R, Python (Pandas, NumPy, SciPy, Scikit-Learn, Seaborn, Matplotlib), Shell
Reporting/BI Tools/Other MS Excel, Tableau, Tableau Server and Reader, Power BI, QlikView, Crystal Reports, MS Word, MS SharePoint, MS PowerPoint, Confluence, Microsoft Visio, Lucidchart, Balsamiq

Professional Experience

November 2022 - Current
Client: First National Bank, Pennsylvania
Role: Sr. Python Developer
Responsibilities:
Involved in the analysis, specification, design, implementation, and testing phases of the Software Development Life Cycle (SDLC), using Agile methodology to develop the application.
Designed and developed transactions and persistence layers to save/retrieve/modify data for application functionalities using Django and PostgreSQL.
Developed Python microservices with the Django framework for internal web applications.
Designed interactive front-end web pages for the application using HTML5, JavaScript, AngularJS, jQuery, and AJAX, and applied CSS3 for an improved look and feel.
Designed and documented REST APIs using the Django REST Framework for collecting and retrieving high-volume search-query results (a brief sketch follows this section).
Developed Python batch processors to consume and produce various feeds.
Created Business Logic using Python to create Planning and Tracking functions.
Wrote Python routines to log into the websites and fetch data for selected options.
Used collections in Python for manipulating and looping through different defined objects.
Created Python tools to increase efficiency of application system and operations, data conversion scripts, REST, JSON, and CRUD scripts for API Integration.
Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer, and Auto Scaling groups; optimized volumes and EC2 instances.
Utilized Python libraries such as NumPy, Pandas, and Matplotlib to read, aggregate, and update data from CSV files.
Used Python to place data into JSON files for testing Django Websites.
Utilized PyQt to provide GUI for the user to create, modify and view reports based on client data.
Created scripts for data modeling and for data import and export.
Used jQuery for selecting particular DOM elements when parsing HTML.
Created user controls and simple animations using JavaScript and Python.
Wrote object-oriented Python code with attention to quality, logging, monitoring, debugging, and code optimization.
Analyzed SQL scripts and designed PySpark implementations; created custom columns during data ingestion depending on the use case.
Created PL/SQL stored procedures, functions, and packages for moving data from the staging area to the data mart.
Followed Agile (Scrum) practices, participating in sprint planning, daily stand-ups, and sprint retrospectives to produce quality deliverables on time.

Environment: Python, Django, PyQt, HTML5, CSS3, JavaScript, AngularJS, AWS, PySpark, DOM, NumPy, Pandas, Matplotlib, AJAX, jQuery, JSON, PL/SQL, REST, PostgreSQL, Agile, Windows
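
A minimal Django REST Framework sketch of the kind of read-only search-results API described above; the SearchResult model, its fields, and the app path are hypothetical placeholders, not the client's actual code.

# Sketch: read-only DRF endpoint for paginated search results.
# The SearchResult model and its fields are hypothetical placeholders.
from rest_framework import serializers, viewsets
from rest_framework.routers import DefaultRouter

from myapp.models import SearchResult  # assumed Django model


class SearchResultSerializer(serializers.ModelSerializer):
    class Meta:
        model = SearchResult
        fields = ["id", "query", "title", "score", "created_at"]


class SearchResultViewSet(viewsets.ReadOnlyModelViewSet):
    """List and retrieve search results, newest first."""
    serializer_class = SearchResultSerializer

    def get_queryset(self):
        qs = SearchResult.objects.order_by("-created_at")
        query = self.request.query_params.get("q")
        if query:
            qs = qs.filter(query__icontains=query)
        return qs


# urls.py wiring
router = DefaultRouter()
router.register(r"results", SearchResultViewSet, basename="results")
urlpatterns = router.urls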

July 2021 - November 2022
Client: Walmart, Bentonville
Role: Sr. Python Developer
Responsibilities:
Developed applications based on the in-house mapping API (map GO API), automated tasks using batch scripts and Python on Windows and Linux, and provided Python solutions, GIS analyses, and consulting.
Utilized Python libraries such as NumPy, Twisted, PyQt, Web2py, Pygal, and Plotly.
Designed RESTful web services using Flask, improving service security with Flask-HTTPAuth over HTTPS. Also used the Hug library to develop HTTP REST APIs with validations and the CherryPy framework for HTTP modeling and binding.
Worked with Boto3 as the interface to AWS APIs and with TensorFlow for numerical computation using dataflow graphs. Developed various APIs for Django applications using Django Tastypie.
Developed Windows services to store SQL Server data in cloud-based data warehouses using Microsoft Azure and Amazon Redshift.
Designed and engineered on-premises to off-premises CI/CD Docker pipelines (integration and deployment) with ECS, Glue, Lambda, ELK, Spark on Databricks, and Kinesis streams.
Imported data from SQL Server and Azure SQL DB into Power BI to generate reports.
Hands-on experience with the Microsoft Azure cloud platform, integrating Azure services with Python to store data in the cloud with high security.
Heavily leveraged Python's graphics APIs for creating graphics and its serialization libraries for encoding data in XML/JSON formats.
Also involved in writing REST APIs using the Django framework for data exchange and business logic implementation.
Configured Postman to validate API requests and responses for error detection and data consistency checks.
Integrated AWS services using the AWS Software Development Kit (SDK) for Python, enhancing functionality and accessibility.
Developed large-scale ETL processes using SQL, Python, Scala, and PySpark, handling terabytes of data efficiently and ensuring high data quality. Also designed end-to-end data pipelines integrating SQL, Python, Scala, and PySpark for data transformation, enrichment, and loading into target data stores.
Automated ETL jobs, scheduling, and monitoring using AWS Step Functions and integrated error handling for seamless operation.
Developed Terraform and CloudFormation templates to define infrastructure resources, facilitating automated and repeatable provisioning of cloud resources.
Used Boto3 and the AWS Command Line Interface (CLI) to automate and script AWS tasks.
Designed data storage solutions with PySpark on AWS and the Hadoop Distributed File System (HDFS).
Integrated PySpark with AWS services, including S3, and built effective data pipelines for data ingestion (see the sketch following this section).
Worked with relational databases (SQLite, PostgreSQL) and the NoSQL database MongoDB for database connectivity, using DB Browser for SQLite to inspect database structure.
Maintained data lineage and metadata records for tracking the origin and transformations of data within the Hadoop ecosystem.
Utilized Databricks to visualize data and create informative and interactive dashboards to communicate insights to stakeholders.
Optimized Databricks clusters and workloads for enhancing data processing efficiency and optimal performance.
Utilized Git, Jenkins, and custom tools developed in Python for an automated continuous integration system.
Collaborated with cross-functional teams to design and implement Kubernetes-based solutions for cloud-native applications, providing architectural guidance, troubleshooting support, and performance tuning expertise.
Used collections in Python for manipulating and looping through different user-defined objects.
Designed and implemented custom Bash scripts to control the PostgreSQL database's data flow.
Upgraded the existing UI, working as an application developer experienced with templates, views, and models in Django. Generated data analysis reports using Matplotlib and Tableau, and created various data visualizations with Python and Tableau. Ensured data accuracy and handled missing values using NumPy and Pandas. Conducted exploratory data analysis in Python to support decision-making.
Responsible for data processing of large datasets using Hadoop and PySpark for extracting, transforming, and loading.
Integrated Databricks with Apache Spark to build data processing and analytics solutions.
Responsible for migrating data using AWS Glue.
Implemented data transformations and schema evolution in AWS Glue to accommodate evolving business requirements.
Implemented the application using Python Django Framework and Apache Spark and handled the security using Python security-related libraries.
Responsible for handling the integration of database systems.
Used an object/relational mapping (ORM) solution to map data representations from the MVC model to the Oracle relational data model with an SQL-based schema.
Performed performance tuning and improved the performance of stored procedures and queries.
Developed a GUI using webapp2 to dynamically display test block documentation and other Python code features in a web browser.
Set up the base Python project structure with the Create-Python-App package, SSRS, and PySpark.
Used Git for source code management.

Environment: Python, Django, AngularJS, HTML, CSS, JavaScript, jQuery, Sublime Text, Jira, Git, PyBuilder, unittest, Web Services, AWS, S3, Spark, Selenium, RedHat Linux, Jupyter, NumPy, Pandas, JSON.
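
A minimal PySpark sketch in the spirit of the S3 ingestion and transformation work above; bucket paths and column names are hypothetical placeholders.

# Sketch: ingest CSV data from S3 with PySpark, derive columns, write Parquet.
# Bucket paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

orders = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-raw-bucket/orders/")
)

# Derive use-case-specific columns during ingestion.
enriched = orders.withColumn(
    "order_total", F.col("quantity") * F.col("unit_price")
).withColumn("ingest_date", F.current_date())

(
    enriched.write
    .mode("overwrite")
    .partitionBy("ingest_date")
    .parquet("s3://example-curated-bucket/orders/")
)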

August 2017 - December 2019
Client: Amazon, India
Role: Python Developer
Responsibilities:
Designed the application using Python and Django for the backend development and front-end application using React and MySQL for the database.
Involved in developing the front end of the application using React.
Developed Python scripts to manage AWS resources from API calls with BOTO SDK and AWS CLI.
Used the Django framework to develop models, customized admin pages, views, and templates, with an effective ORM implementation against MySQL.
Responsible for creating organized, responsive design and developing user interaction screens using Django, HTML5, React, Angular, JavaScript, and jQuery.
Worked with databases such as SQLite, MySQL, and MongoDB; also configured and deployed Django projects on EC2.
Developed views and templates with Python and Django's view and templating language to create a user-friendly website interface.
Used Python 3.6 (NumPy, SciPy, Pandas, scikit-learn, Seaborn) and Spark 2.0 (PySpark, MLlib) to develop various models and algorithms for analytics.
Used Pandas, NumPy, Matplotlib, Seaborn, and scikit-learn for exploratory data analysis (EDA).
Developed GUI using webapp2 for dynamically displaying test block documentation using Python code.
Worked closely with designers, tightly integrating Flash into the CMS with Flash content stored in Django models. Also generated XML with Django for consumption by Flash.
Designed and implemented solutions using open-source AI frameworks such as PyTorch, TensorFlow, and scikit-learn.
Applied Python as the primary language, along with supporting libraries and web development expertise, to enhance the application.
Worked on a Django project from scratch to implement the application. Developed back-end components to improve responsiveness and overall performance. Integrated user-facing elements into applications.
Developed entire frontend and backend modules using Python on Django Web Framework.
Developed and architected RESTful APIs using Flask, SQLAlchemy ORM, and Python 3.4.
Developed web-based applications with Python 3.4/2.7, Django 1.4/1.3, XML, CSS3, HTML5, DHTML, JavaScript, and jQuery.
Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content. Used data types such as dictionaries and tuples, along with object-oriented inheritance, to build complex network algorithms.
Developed web applications and RESTful web services and APIs using Python Flask.
Worked with several standard Python packages such as NumPy, Matplotlib, SciPy, Pandas, and PyTables.
Implemented AWS Lambda functions in Python to improve file upload performance and merge functionality for S3 buckets within the AWS and GCP cloud environments.
Architected and developed Python and Django for the backend development and front-end application using React, Webpack, Redux, ES6/7, and PostgreSQL for the database.
Wrote Python modules to extract/load asset data from the MySQL source database.
Developed user interfaces of the application using front-end technologies and used APIs developed using the Django REST framework to pull data from the backend, showing it to the user.
Implemented a backend asynchronous task queue system for data processing pipelines using Django-Celery and RabbitMQ.
Designed and implemented complex event-driven architectures using AWS Lambda, integrating with AWS services such as S3, DynamoDB, SQS, SNS, Step Functions, API Gateway, and CloudWatch Alarms to build scalable and decoupled systems (a sketch follows this section).
Implemented various Validation Controls for form validation and implemented custom validation controls using JavaScript.
Used React features such as JSX, component creation, the virtual DOM, props, lifecycle methods, state, and events. Developed and deployed RESTful microservices using Flask and Django.
Gained valuable experience developing software projects using Agile methodologies, particularly the Scrum framework.
Set up monitoring and logging for Elasticsearch on AWS, using Amazon CloudWatch and the built-in monitoring tools of AWS Elasticsearch; wrote unit and integration tests.
Developed interactive dashboards and visualizations using Amazon QuickSight to track marketing campaign performance and extract actionable insights.
Designed and implemented ETL pipelines with AWS Glue to process large volumes of marketing data, ensuring data accuracy and reliability.
Leveraged Amazon Athena for ad-hoc analysis of marketing metrics stored in Amazon S3 buckets, facilitating quick decision-making based on real-time data.
Developed web applications using Python and Django for the backend, incorporated J2EE for Java-based development, and employed AWS services for cloud infrastructure. Managed databases with PostgreSQL, MySQL, and MongoDB. Handled continuous deployment using Jenkins.
Environment: Python 3.8, Flask, Django, Real-Time Systems, Analytics, HTML5/CSS, JavaScript, Linux/Unix, Jupyter Notebook, JUnit 4.1, JIRA, Python threading, Python crontab scheduling, RESTful API, JSON, XML, Git, GitHub, Linux, Jenkins, Kubernetes.
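
A minimal sketch of the event-driven Lambda pattern referenced above: a Python handler triggered by S3 ObjectCreated events that records object metadata in DynamoDB. The table name and environment variable are hypothetical placeholders.

# Sketch: Lambda handler for S3 ObjectCreated events; writes object metadata
# to a DynamoDB table. The table name is a hypothetical placeholder.
import os
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("METADATA_TABLE", "upload-metadata"))


def handler(event, context):
    """Record bucket, key, and size for each uploaded object."""
    records = event.get("Records", [])
    for record in records:
        s3_info = record["s3"]
        table.put_item(
            Item={
                "object_key": s3_info["object"]["key"],
                "bucket": s3_info["bucket"]["name"],
                "size_bytes": s3_info["object"].get("size", 0),
            }
        )
    return {"processed": len(records)}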

March 2014 - August 2017
Client: Cerner Corporation
Role: Software Developer
Responsibilities:
Involved in the development of PySpark applications to process and analyze text data from emails, complaints, forums, and clickstreams to achieve comprehensive customer care.
Constructed data pipelines to process streamed and chunked data on AWS by ingesting from 10+ data sources. Automated ETL Process using PySpark & Spark SQL in EMR Clusters for reporting and data transformation.
Migrated historical data for 2,000 tables from on-premises Netezza to AWS EMR Hive and Redshift.
Developed a customized API with token-based authentication for Django using the REST Framework (see the sketch following this section).
Performed retrieval and validation of data stored through APIs and DynamoDB.
Used Bash and Python (including Boto3) to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes, backing up AMIs, and scheduling Lambda functions for routine AWS tasks.
Installed data sources such as SQL Server and Cassandra on remote servers using Docker containers to provide an integrated testing environment for the ETL applications.
Developed Node-based front-end applications, an internal system with multi-level permissions for accessing, tracking, and monitoring inventory, using SSRS, PySpark, and Node.js.
Developed numerous new features and enhancements for the web application using Python/Django and HTML/CSS/JavaScript/TypeScript, as well as automating the development environment.
Developed scalable applications using the MEAN (MongoDB, Express.js, AngularJS, Node.js) stack and created POCs for REST service development using Node.js, Express.js, and MongoDB.
Developed application logic using Python, JavaScript, and C++. Used JMS for updating and tracking mailing plans, and JavaServer Pages for content layout and presentation.
Worked extensively with Bootstrap, JavaScript, and jQuery to optimize the user experience.
Created databases using MySQL and wrote several queries to extract and store data.
Developed, tested, and debugged software tools utilized by clients and internal customers.
Extracted and loaded data using Python scripts and PL/SQL packages.
Worked on Installation and configuring MongoDB Cluster nodes on different AWS EC2 instances.
Utilized AWS Glue to automate data ingestion and transformation processes, enabling seamless integration of data from multiple sources.
Integrated Amazon QuickSight to develop custom dashboards and reports for stakeholders, providing valuable insights into supply chain analytics.
Conducted exploratory data analysis using Amazon Athena to uncover trends and patterns in customer data, supporting strategic decision-making.
Developed and tested many features for a dashboard using Python, Java, Bootstrap, CSS, JavaScript, and jQuery. Created Django forms to record data from online users and used pytest for writing test cases.
Developed a server-based web traffic statistical analysis tool using RESTful APIs, Flask, and Pandas.
Enhanced the project functionality by incorporating Python XML SOAP request/response handlers, enabling seamless account additions, trade modifications, and security updates. Implemented Windows services to facilitate the storage of SQL Server data in cloud-based data warehousing solutions, leveraging both Microsoft Azure and Amazon Web Services (AWS) RedShift.
Designed and engineered on-premises to off-premises CI/CD Docker pipelines (integration and deployment) with ECS, Glue, ELK, Spark on Databricks, and Kinesis streams.

Environment: AWS, Windows, Python 3.7, SQL Server, AngularJS, libraries (NumPy, SciPy, Pandas, PySpark, Matplotlib), PyCharm, HTML5, JSON, Lambda functions, Vue.js, TypeScript, Node.js.
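
A minimal sketch of token-based authentication with the Django REST Framework, along the lines of the custom API above; the view, URL paths, and payload are illustrative assumptions, not the client's actual code. It assumes 'rest_framework' and 'rest_framework.authtoken' are listed in INSTALLED_APPS.

# Sketch: a token-authenticated Django REST Framework endpoint (illustrative).
from django.urls import path
from rest_framework.authentication import TokenAuthentication
from rest_framework.authtoken.views import obtain_auth_token
from rest_framework.decorators import api_view, authentication_classes, permission_classes
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response


@api_view(["GET"])
@authentication_classes([TokenAuthentication])
@permission_classes([IsAuthenticated])
def account_summary(request):
    """Return a small payload for the authenticated user."""
    return Response({"user": request.user.username, "authenticated": True})


urlpatterns = [
    path("api/token/", obtain_auth_token),   # exchange username/password for a token
    path("api/account/", account_summary),   # requires 'Authorization: Token <key>' header
]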

July 2013 - March 2014
Client: Cerner Corporation, Bangalore, India
Role: Software Engineer
Responsibilities:
Used Django's model, view, controller, and template language layers. Developed and architected RESTful APIs using Flask, SQLAlchemy ORM, and Python 3.x. Developed web-based applications using Python 3.4/2.7, Django 1.4/1.3, XML, CSS3, HTML5, DHTML, JavaScript, and jQuery.
Performed data migrations for the PostgreSQL database and worked on an upgrade path for applications from South to the latest version of Django. Developed reusable components with AngularJS custom directives.
Developed a scalable microservices architecture.
Wrote Python scripts to parse XML documents (see the sketch following this section). Developed tools using Python, shell scripting, and XML to automate routine tasks and data scraping.
Worked on the WAMP stack (Windows, Apache, MySQL, and Python).
Developed modules for contact management, including database design, back-end development, and front-end implementation using Bootstrap and JavaScript.
Built RESTful web services using a Python REST API framework.
Developed web pages using HTML5, CSS3, TypeScript, JavaScript, and Angular to design a user-friendly, responsive, multi-functional interface.
Conducted A/B testing on website elements built with Vue.js and AngularJS, optimizing them to improve conversion rates.
Created RESTful web services for Catalog and Pricing with Django MVT, Ansible, MySQL, and MongoDB.
Used Django APIs for database access, along with JavaScript and MySQL. Designed and developed the web application using the Django web framework in Python.
Environment: Python 3.x, SQL, Django, PySpark, MVC, NumPy, Pandas, SciPy, Matplotlib, HTML5, CSS3, JavaScript, AngularJS, Node.js, XML, jQuery, PL/SQL, Agile, TDD, Windows.
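
A minimal sketch of the kind of XML-parsing script described above, using the standard-library xml.etree.ElementTree module; the file name and element names are hypothetical placeholders.

# Sketch: parse a hypothetical catalog.xml and collect item names and prices.
import xml.etree.ElementTree as ET


def parse_catalog(path):
    """Return a list of (name, price) tuples from <item> elements."""
    tree = ET.parse(path)
    items = []
    for item in tree.getroot().iter("item"):
        name = item.findtext("name", default="")
        price = float(item.findtext("price", default="0"))
        items.append((name, price))
    return items


if __name__ == "__main__":
    for name, price in parse_catalog("catalog.xml"):
        print(f"{name}: {price:.2f}")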