
Laxmi V
Sr. Python Developer
Remote, USA
(615) 656-8597
[email protected]

Professional Summary
8+ years of experience with full software development life-cycle (SDLC), object-oriented programming, database design, and agile methodologies.
Proficient in Python and SQL, with extensive experience developing and managing web applications.
Skilled in using both Django and Flask frameworks to create robust and scalable backend systems.
Well-versed in web technologies, including HTML, CSS, JavaScript, jQuery, Angular, Angular CLI, React, React Router, Redux, NodeJS, and Bootstrap.
Experienced in working with various data interchange formats such as XML, AJAX, and JSON, facilitating seamless communication between the front and back end.
Familiar with Apache, Tomcat, and Nginx as web servers, enabling efficient deployment and hosting of web applications.
Competent in using Kafka for messaging and data streaming, enhancing real-time communication between components.
Expertise in developing and consuming REST and SOAP web services, contributing to seamless integration with external systems.
Proficient in working with SQL databases such as Oracle, MySQL, and PostgreSQL, ensuring efficient data storage and retrieval.
Adept in monitoring tools like ELK, Prometheus, and Grafana, enabling proactive system monitoring and performance optimization.
Well-versed in cloud platforms, particularly AWS, for scalable and resilient cloud-based application deployment.
Knowledgeable about essential libraries/packages, including NumPy, Pandas, and OAuthLib, for efficient data processing and security implementation.
Strong experience using IDEs like PyCharm, Eclipse, and Jupyter Notebook for streamlined code development and debugging.
Proficient in utilizing Python for networking tasks and protocols.
Developed Python scripts for networking automation, configuration, and monitoring.
Extensive experience with network socket programming and networking communication protocols (TCP/IP, UDP).
Proficient in integrating Python applications with networking APIs and SDKs.
Proficient in developing web applications using the FASTAPI web framework.
Extensive experience in building RESTful APIs and microservices with FASTAPI.
Strong knowledge of Python 3.x libraries and frameworks, including NumPy, pandas, Django, Flask, and more, for building robust and efficient applications.
Skilled in following best practices for Python 3.x, including PEP 8 coding standards, virtual environments, and package management using tools like pip.
Strong knowledge of testing methodologies, including unit testing and integration testing, in FASTAPI projects.
Experienced in data visualization tools like Tableau, Power BI, and Matplotlib for creating insightful and visually appealing data representations.
Proficient in DevOps practices, utilizing Jenkins, Docker, Kubernetes, and AWS CodePipeline for efficient development and deployment workflows.
Acquainted with NoSQL databases like MongoDB, Cassandra, and Couchbase, providing diverse data storage solutions.
Skilled in version control systems, including GIT, GitHub, and Bitbucket, enabling collaborative code management and version tracking.
Proficient in using tools like JIRA, Terraform, Ansible, and Chef to enhance project management, infrastructure provisioning, and configuration management.
Experienced in Agile, SCRUM, and TDD methodologies, contributing to efficient and iterative development processes. Skilled in testing frameworks such as PyUnit and PyTest to ensure code quality and robustness.
Strong analytical, problem-solving, and decision-making skills. Outstanding communication, documentation, knowledge-transfer, and requirement-gathering skills.
Developed Python scripts to integrate and automate data ingestion and extraction processes with Splunk.
Implemented Splunk SDKs and APIs to seamlessly integrate Python applications with Splunk for log management and analysis.
Designed and implemented Azure-based solutions for various projects, including cloud-native applications, data analytics, and machine learning models.
Utilized Azure Functions and Azure Logic Apps to build serverless applications, automating critical business processes and reducing operational costs.
Developed ETL pipelines using Python to extract data from various sources, transform it into a usable format, and load it into a data warehouse.
Implemented data cleansing and validation processes to ensure data quality and consistency within ETL workflows.
Proficient in designing and implementing DynamoDB data models for highly scalable and cost-effective NoSQL database solutions.
Extensive experience in using the AWS SDK for Python (Boto3) to interact with DynamoDB tables, perform CRUD operations, and manage database resources (an illustrative Boto3/DynamoDB sketch follows this summary).
Developed robust Python applications with seamless integration to Amazon S3, enabling efficient storage, retrieval, and management of large datasets.
Proficient in using AWS Glue to automate data ETL (Extract, Transform, Load) processes for efficient data integration and analysis.
Proficient in utilizing Amazon RDS (Relational Database Service) for managing and optimizing databases in various Python-based projects.
Proficient in using AWS Athena for querying and analyzing data stored in S3 buckets.
Developed serverless applications using AWS Lambda functions to handle various tasks, such as data processing, file uploads, and real-time data streaming.
Proficient in deploying and managing Python applications on Amazon EC2 instances.
Proficient in deploying and managing applications on Amazon Web Services (AWS) using Elastic Kubernetes Service (EKS).
Leveraged Python libraries and frameworks like Flask, Django, and FastAPI to build robust and scalable backend services for EKS-deployed applications.
Extensive experience in utilizing API Gateways to streamline API management, security, and routing.
Extensive experience with document-based databases such as MongoDB and CouchDB, including data modeling, schema design, and performance optimization.
Proficient in using GraphQL to design and implement efficient APIs for web applications.
Extensive experience in integrating GraphQL with Python-based backend systems.
Proficient in using SparkSQL to process and analyze large datasets, enabling efficient data extraction, transformation, and loading (ETL) operations.
Proficient in Python programming with a strong emphasis on object-oriented (OO) design principles.
Developed Python scripts and applications to automate firewall and proxy management tasks, improving operational efficiency.
Developed Python scripts to automate firewall rule provisioning and management, significantly reducing manual configuration errors and saving operational time.
Developed and maintained proxy server applications in Python to facilitate secure and efficient internet communication for clients.
Implemented proxy server solutions to anonymize web traffic, enhance security, and optimize network performance.
Integrated third-party ticketing APIs into Python applications to enable seamless communication and data synchronization with external ticketing platforms.
Developed a Python-based ticketing system to streamline and automate the process of creating, tracking, and managing support tickets.
Collaborated with cross-functional teams to develop Python scripts for ticket data migration and system integration, ensuring data consistency and accuracy.
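
A minimal sketch of the Boto3/DynamoDB access pattern referenced above; the "tickets" table name, key schema, and region are hypothetical examples, not details from any specific engagement.

    # Illustrative only: hypothetical "tickets" table with a "ticket_id" partition key.
    import boto3

    dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
    table = dynamodb.Table("tickets")

    def put_ticket(ticket_id: str, status: str, summary: str) -> None:
        """Create or overwrite a ticket item."""
        table.put_item(Item={"ticket_id": ticket_id, "status": status, "summary": summary})

    def get_ticket(ticket_id: str):
        """Fetch a single ticket by its partition key; returns None if absent."""
        return table.get_item(Key={"ticket_id": ticket_id}).get("Item")

    if __name__ == "__main__":
        put_ticket("T-1001", "open", "Password reset request")
        print(get_ticket("T-1001"))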

Technical Skills
Languages: Python, SQL
Frameworks: Django, FASTAPI, Flask
Web Technologies: HTML, CSS, JavaScript, jQuery, Angular, Angular CLI, React, React Router, Redux, NodeJS, Bootstrap, XML, AJAX, JSON.
Web Servers: Apache, Tomcat, Nginx
Messaging Tools: Kafka
Web Services: REST, SOAP
SQL Databases: Oracle, MySQL, PostgreSQL
Monitoring Tools: ELK, Prometheus, Grafana
Version Controls: GIT, GitHub, Bitbucket
Libraries/Packages: NumPy, Pandas, OAuthLib
Cloud Platforms: AWS, Azure
AWS Core Services: DynamoDB, S3, Glue, Athena, RDS, Lambda, SQS, EC2, EKS, API Gateway
Software Platforms: Splunk
Query Language: GraphQL
IDEs: PyCharm, Eclipse, Jupyter Notebook.
Visualization Tools: Tableau, Power BI, Matplotlib
DevOps tools: Jenkins, Docker, Kubernetes, AWS CodePipeline, CI/CD pipelines
NoSQL Databases: MongoDB, Cassandra, Couchbase
Other Tools: JIRA, Terraform, Ansible, Chef, ServiceNow, ETL
Methodologies: Agile, SCRUM, TDD
Testing: PyUnit, PyTest
Operating Systems: Windows, Linux, Mac OS


Work Experience:

Client: Elanco | Feb 2022 – Present
Location: Greenfield, IN
Python Developer
Developed entire frontend and backend modules using Python on Flask Web Framework.
Designed and implemented RESTful APIs with Flask and SQLAlchemy, enabling efficient communication between frontend and backend components (see the Flask sketch at the end of this section).
Utilized JSON and XML for data exchange, ensuring compatibility with various client applications and third-party integrations.
Deployed the application on AWS using RDS for relational databases and ECS/EKS for containerized Microservices, optimizing scalability and resilience.
Implemented IAM roles and policies to enforce security and access control, ensuring secure communication between application components.
Utilized AWS services like SQS and Lambda to enable asynchronous processing of tasks and improve overall system performance.
Integrated AWS Simple Queue Service (SQS) to create asynchronous communication between different components of the application, improving system reliability and scalability.
Implemented error handling and retry mechanisms in Lambda functions to ensure fault tolerance and maintain high availability.
Wrote Python scripts and Lambda functions to process and transform data from various sources, such as databases, APIs, and streaming platforms.
Collaborated with front-end developers to integrate HTML5, CSS3, JavaScript, and jQuery, enhancing the application's interactivity and visual appeal.
Utilized Ajax and Webpack to create dynamic and responsive user interfaces, improving user engagement and responsiveness.
Developed complex UI components using React, leveraging Redux for state management and React Router for efficient navigation.
Ensured responsive design using Bootstrap, enhancing cross-device compatibility and user satisfaction.
Utilized MySQL and MongoDB databases, optimizing schema design and query performance to meet specific application requirements.
Integrated Node.js for server-side scripting and real-time features, enhancing application responsiveness.
Managed and maintained Apache Tomcat for deploying Java-based components, enabling seamless integration with the overall system.
Designed and implemented networking solutions using Python, including client-server applications and network services.
Implemented networking security measures through Python scripts, including intrusion detection and prevention.
Set up and managed CI/CD pipelines using AWS CodePipeline, automating code integration, testing, and deployment processes.
Designed and implemented automated CI/CD pipelines using tools such as Jenkins, Travis CI, or GitLab CI/CD to streamline software development and deployment processes.
Implemented event-driven architecture using Kafka, facilitating efficient data streaming and real-time updates between application components.
Utilized PyCharm IDE for streamlined code development, debugging, and performance optimization.
Implemented infrastructure-as-code using Terraform, ensuring consistent and repeatable deployment of AWS resources.
Designed and optimized DynamoDB tables to meet application requirements, including data partitioning, indexing, and secondary indexes.
Implemented data access patterns such as single-table design and composite keys in DynamoDB for efficient querying and data retrieval.
Implemented automated S3 backup and recovery solutions, ensuring data integrity and availability in case of system failures.
Employed Python scripts to optimize data transfers to and from S3, reducing latency and costs through multipart uploads and parallel processing (see the S3 transfer sketch at the end of this section).
Conducted performance profiling and optimization of S3 operations, fine-tuning Python code to achieve faster upload and download speeds.
Utilized Python scripting for automating Athena query execution and result retrieval, enhancing workflow automation.
Integrated Athena with visualization tools like Tableau for creating interactive and dynamic data dashboards.
Implemented security best practices in Athena to safeguard sensitive data and ensure compliance with data privacy regulations.
Implemented API Gateway solutions such as AWS API Gateway and Kong to create scalable and efficient API ecosystems.
Managed project tasks and collaboration using Jira, ensuring smooth progress and clear communication among team members.
Implemented comprehensive unit tests using PyUnit and performed log analysis using ELK stack for effective troubleshooting.
Leveraged Docker and Kubernetes for containerization and orchestration, enhancing scalability and maintainability, and utilized Ansible and Chef for configuration management.
Utilized Python to preprocess and enrich data before ingestion into Splunk, improving the quality and relevance of logs.
Integrated Python scripts with firewall APIs (e.g., Cisco ASA, Palo Alto Networks) to streamline rule updates and ensure seamless network traffic management.
Utilized Python libraries and frameworks to create custom dashboards and reporting tools for firewall logs, aiding in incident response and security analysis.
Automated backup and recovery procedures for firewall configurations using Python, enhancing disaster recovery capabilities and reducing downtime.
Developed Python plugins for security information and event management (SIEM) systems to centralize firewall log data and facilitate comprehensive security analysis.
Proficient in optimizing Python 3.x code for performance enhancements and troubleshooting/debugging using tools like pdb.
Proficient in unit testing and test-driven development (TDD) using Python 3.x frameworks such as unittest and pytest.
Utilized Python 3.x for data analysis, data manipulation, and scientific computing, including libraries like SciPy and scikit-learn.
Designed custom proxy solutions to meet client requirements, including reverse proxies, transparent proxies, and forward proxies.
Integrated proxy server functionality into web applications and services to enhance privacy and security for end-users.
Monitored and analyzed proxy server logs and traffic patterns to identify and mitigate potential security threats and performance bottlenecks.
Proficient in configuring and managing SSL/TLS certificates for securing proxy server communications.
Designed and maintained a user-friendly web interface using Python web frameworks like Django to allow support agents and customers to submit and track tickets effortlessly.
Conducted regular code reviews and implemented code quality standards in Python projects to enhance the reliability and maintainability of the ticketing system.
Leveraged Python's data processing capabilities to generate detailed ticketing reports and analytics, providing insights into support team performance and customer satisfaction.
Developed Python scripts for field extractions, lookup table management, and event correlation in Splunk.
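
An illustrative sketch of the Flask + SQLAlchemy REST pattern described in this section, using the Flask-SQLAlchemy extension; the Ticket model, routes, and SQLite DSN are hypothetical examples rather than code from this engagement.

    # Illustrative only: hypothetical Ticket model and /tickets routes.
    from flask import Flask, jsonify, request
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)
    app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///tickets.db"  # placeholder DSN
    db = SQLAlchemy(app)

    class Ticket(db.Model):
        id = db.Column(db.Integer, primary_key=True)
        title = db.Column(db.String(120), nullable=False)
        status = db.Column(db.String(20), default="open")

        def to_dict(self):
            return {"id": self.id, "title": self.title, "status": self.status}

    @app.route("/tickets", methods=["POST"])
    def create_ticket():
        payload = request.get_json()
        ticket = Ticket(title=payload["title"])
        db.session.add(ticket)
        db.session.commit()
        return jsonify(ticket.to_dict()), 201

    @app.route("/tickets/<int:ticket_id>", methods=["GET"])
    def get_ticket(ticket_id):
        ticket = db.session.get(Ticket, ticket_id)
        if ticket is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(ticket.to_dict())

    if __name__ == "__main__":
        with app.app_context():
            db.create_all()
        app.run(debug=True)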

Environment: Python, Flask, SQLAlchemy, RESTful APIs, JSON, XML, AWS, HTML5, CSS3, JavaScript, jQuery, Ajax, Webpack, React, Bootstrap, MySQL, MongoDB, NodeJS, Apache Tomcat, AWS CodePipeline, Kafka, PyCharm, Terraform, Jira, PyUnit, ELK, Docker, Kubernetes, Ansible, Chef, GitHub, Matplotlib, Splunk, DynamoDB, S3, Athena, SQS, Lambda, API Gateway, Networking, Python 3.x, Firewall, Proxy, Ticketing.
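
An illustrative sketch of the S3 multipart/parallel transfer tuning mentioned in the bullets above, using boto3's TransferConfig; the bucket, key, and thresholds are hypothetical examples.

    # Illustrative only: tune multipart, parallel uploads to S3 with boto3.
    import boto3
    from boto3.s3.transfer import TransferConfig

    s3 = boto3.client("s3")

    # Switch to multipart uploads above 64 MB and use up to 8 parallel threads.
    config = TransferConfig(
        multipart_threshold=64 * 1024 * 1024,
        multipart_chunksize=16 * 1024 * 1024,
        max_concurrency=8,
        use_threads=True,
    )

    def upload_large_file(local_path: str, bucket: str, key: str) -> None:
        """Upload a large file to S3 using multipart, parallel transfers."""
        s3.upload_file(local_path, bucket, key, Config=config)

    if __name__ == "__main__":
        upload_large_file("exports/dataset.parquet", "example-data-bucket", "raw/dataset.parquet")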

Client: Erie Insurance | Sep 2020 – Jan 2022
Location: Erie, PA
Python Developer
Responsible for gathering requirements, system analysis, design, development, testing, and deployment.
Worked on complex web applications using Python and Flask framework, following the Model-View-Controller (MVC) architectural pattern.
Designed and implemented RESTful APIs, ensuring seamless communication between frontend and backend systems, utilizing XML and JSON data formats.
Leveraged Flask's ORM capabilities to manage database interactions with MySQL and Cassandra, optimizing data storage and retrieval for improved performance.
Architected and developed a responsive user interface using HTML5, CSS3, and JavaScript, enhancing user engagement and delivering a visually appealing experience.
Employed AJAX techniques and jQuery for dynamic frontend interactions, reducing page reloads and enhancing overall application responsiveness.
Collaborated with the frontend team to integrate the Angular framework, utilizing Angular CLI and Typescript to build interactive and feature-rich user interfaces.
Utilized Jinja2 templating engine for dynamic rendering of server-side templates, ensuring consistent and dynamic content delivery.
Conducted thorough testing of API endpoints using Postman, validating data exchange and refining API performance for seamless user experiences.
Employed advanced data manipulation and analysis libraries like NumPy and Pandas, facilitating data processing and insights generation.
Implemented real-time data updates and subscriptions using GraphQL subscriptions in Python applications.
Collaborated with data analysts by providing clean and structured data sets, enabling them to create comprehensive visualizations in Tableau.
Containerized the application using Docker, simplifying deployment and ensuring consistent behavior across different environments.
Developed shell scripts to automate deployment processes, enhance team productivity, and reduce manual errors during releases.
Managed source code using Bitbucket, implementing version control strategies and facilitating efficient collaboration among team members.
Utilized Python scripting to create custom automation scripts for CI/CD pipeline tasks, enhancing efficiency and reducing manual intervention.
Monitored and tracked CI/CD pipeline performance, optimizing build times, and minimizing resource consumption for cost-effective and efficient pipelines.
Implemented a complex cloud-native application on AWS, utilizing Python for backend development and a microservices architecture.
Implemented monitoring and visualization using Grafana, ensuring real-time insights into application performance and resource utilization.
Utilized AWS Glue Crawlers to automatically discover and catalog data sources from various repositories, including Amazon S3, RDS, and Redshift.
Monitored and maintained Glue ETL jobs, ensuring high availability, performance, and cost-efficiency of data pipelines (see the Glue sketch at the end of this section).
Implemented data replication and failover strategies using Amazon RDS to ensure high availability and disaster recovery for applications.
Ensured data security and compliance by implementing encryption and access control mechanisms on Amazon RDS.
Collaborated with DevOps teams to automate database deployment and maintenance tasks on RDS, streamlining development workflows.
Skilled in setting up auto-scaling groups and load balancing for Python applications on EC2.
Capable of implementing CI/CD pipelines for Python applications deployed on EC2 instances.
Proficient in optimizing EC2 instance performance for various Python frameworks and libraries.
Knowledgeable in backup and disaster recovery strategies for EC2 instances running Python applications.
Extensive experience in containerization technologies, including Docker, for packaging and deploying applications on EKS.
Worked on migration projects, moving on-premises workloads to EKS, optimizing for cloud-native architecture.
Knowledgeable in configuring and optimizing EKS clusters for performance, scalability, and cost efficiency.
Proficient in monitoring and troubleshooting EKS clusters using tools such as Prometheus, Grafana, and AWS CloudWatch.
Designed and developed asynchronous APIs using FastAPI, leveraging its asynchronous capabilities to handle concurrent requests and enhance application scalability (see the FastAPI sketch at the end of this section).
Proficient in SQL and capable of writing complex queries to extract and manipulate data stored in Hadoop clusters.
Implemented data security measures and access controls to safeguard sensitive data stored in Hadoop clusters.
Utilized Hadoop ecosystem components such as HBase for NoSQL data storage, Pig for data transformation, and Oozie for workflow scheduling and coordination.
Leveraged Hive's integration with Hadoop ecosystem components such as HDFS and MapReduce for big data processing.
Proficient in working with Hive metastore, managing schemas, and ensuring data consistency.
Automated build and frontend asset tasks using Gulp, optimizing build processes and ensuring consistent code quality and deployment.
Utilized Jupyter Notebook for data exploration, analysis, and documentation, enabling clear communication of findings and insights.
Tracked and managed project tasks using JIRA, contributing to sprint planning, backlog grooming, and overall project progress tracking.
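
A minimal sketch of the kind of asynchronous FastAPI endpoint described above; the Item model, in-memory store, and routes are hypothetical examples, not code from this engagement.

    # Illustrative only: hypothetical Item model and /items routes.
    import asyncio
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI()

    class Item(BaseModel):
        id: int
        name: str

    # In-memory store used purely for illustration.
    ITEMS: dict[int, Item] = {}

    @app.post("/items", status_code=201)
    async def create_item(item: Item) -> Item:
        ITEMS[item.id] = item
        return item

    @app.get("/items/{item_id}")
    async def read_item(item_id: int) -> Item:
        # Simulate a non-blocking call to a downstream service or database.
        await asyncio.sleep(0)
        if item_id not in ITEMS:
            raise HTTPException(status_code=404, detail="Item not found")
        return ITEMS[item_id]

    # Run locally with: uvicorn app:app --reload
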
Environment: Python, Flask, RESTful APIs, MySQL, Cassandra, HTML5, CSS3, jQuery, JavaScript, AJAX, XML, JSON, Angular, Angular CLI, Typescript, Jinja2, Postman, NumPy, Pandas, AWS, Tableau, Docker, Shell Scripts, Bitbucket, Grafana, FastAPI, Gulp, Jupyter Notebook, JIRA, CI/CD pipelines, Glue, S3, RDS, EC2, EKS, GraphQL.
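
An illustrative sketch of triggering and monitoring an AWS Glue ETL job from Python with boto3, as referenced in the Glue bullets above; the job name is a hypothetical example.

    # Illustrative only: start a Glue job run and poll it until completion.
    import time
    import boto3

    glue = boto3.client("glue", region_name="us-east-1")

    def run_glue_job(job_name: str, poll_seconds: int = 30) -> str:
        """Start a Glue job run and poll until it reaches a terminal state."""
        run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
        while True:
            state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
            if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
                return state
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        print(run_glue_job("daily-claims-etl"))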

Client: State of MO (Department of Health) | Dec 2019 – Aug 2020
Location: Kansas, MO
Python Developer
Spearheaded the backend development of a web application using Python and Django framework, ensuring efficient data processing and seamless user experiences.
Designed, implemented, and optimized RESTful APIs, enabling smooth communication between the frontend and backend using JSON and XML formats.
Developed data models and leveraged Django's ORM to manage interactions with PostgreSQL and Couchbase databases, ensuring data integrity and efficient querying (see the Django ORM sketch at the end of this section).
Collaborated closely with the frontend team to integrate HTML, CSS, and JavaScript components, delivering a cohesive and visually appealing user interface.
Implemented dynamic client-side functionality using jQuery, enhancing user interactions and application responsiveness.
Utilized Git for version control, effectively managing source code collaboration and facilitating seamless integration of new features and bug fixes.
Integrated Power BI for data visualization and reporting, empowering stakeholders with actionable insights and improving decision-making processes.
Deployed and maintained the application on WebLogic, ensuring high availability and performance for end users.
Integrated various databases (e.g., PostgreSQL, MongoDB) with FASTAPI for data storage and retrieval.
Proficient in optimizing database interactions and queries for FASTAPI applications.
Implemented continuous integration and deployment (CI/CD) pipelines using Jenkins, automating the build, testing, and deployment processes for smoother development cycles.
Developed robust unit and integration tests using PyTest, ensuring code quality and minimizing the occurrence of regressions (see the PyTest sketch at the end of this section).
Collaborated with cross-functional teams to integrate the application with ServiceNow, streamlining incident management and enhancing user support processes.
Monitored application performance using Prometheus, identifying bottlenecks and proactively optimizing backend components for improved efficiency.
Actively participated in Agile Scrum methodologies, contributing to sprint planning, daily stand-ups, and retrospectives to ensure timely project delivery and continuous improvement.
Utilized Eclipse IDE for code development and debugging, maintaining a structured and organized codebase to facilitate collaboration among team members.
Managed and optimized Azure SQL Database instances for efficient data storage and retrieval.
Implemented Azure Active Directory (Azure AD) for secure authentication and access control in applications.
Kept current with the latest Azure developments and trends to continuously improve and optimize solutions.
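
An illustrative sketch of the Django model/ORM pattern referenced above, intended to live in a Django app's models.py; the Clinic model, fields, and query helper are hypothetical examples, not actual Department of Health schema.

    # Illustrative only: hypothetical Clinic model and an ORM query helper.
    from django.db import models

    class Clinic(models.Model):
        name = models.CharField(max_length=200)
        county = models.CharField(max_length=100, db_index=True)
        created_at = models.DateTimeField(auto_now_add=True)

        class Meta:
            ordering = ["-created_at"]

    def recent_clinics_by_county(county: str, limit: int = 50):
        """Return the most recently added clinics for a county."""
        return (
            Clinic.objects.filter(county=county)
            .only("name", "county", "created_at")[:limit]
        )
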
Environment: Python, Django, HTML, CSS, RESTful APIs, JSON, XML, jQuery, JavaScript, PostgreSQL, Couchbase, GIT, Power BI, WebLogic, Jenkins, PyTest, Eclipse, ServiceNow, Prometheus, Agile, Scrum, Azure, FASTAPI, Oracle backup
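
An illustrative sketch of the kind of PyTest unit tests referenced above; the validate_ticket helper under test is a hypothetical example.

    # Illustrative only: hypothetical validation helper and its PyTest tests.
    import pytest

    def validate_ticket(payload: dict) -> dict:
        """Require a non-empty title and a known priority; normalize the result."""
        if not payload.get("title"):
            raise ValueError("title is required")
        priority = payload.get("priority", "medium")
        if priority not in {"low", "medium", "high"}:
            raise ValueError("unknown priority")
        return {"title": payload["title"].strip(), "priority": priority}

    def test_validate_ticket_defaults_priority():
        result = validate_ticket({"title": "  Printer offline  "})
        assert result == {"title": "Printer offline", "priority": "medium"}

    @pytest.mark.parametrize("payload", [{}, {"title": ""}, {"title": "x", "priority": "urgent"}])
    def test_validate_ticket_rejects_bad_input(payload):
        with pytest.raises(ValueError):
            validate_ticket(payload)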

Client: UST Global Services | March 2015 – Oct 2019
Location: Hyderabad, India
Software Developer
Developed a web application using Python and the Django framework, following the Model-View-Controller (MVC) architectural pattern.
Collaborated with the team to gather requirements, design database schema, and plan the application's structure.
Designed and implemented responsive user interfaces using HTML, CSS, and JavaScript, ensuring a seamless user experience across various devices.
Developed robust back-end components, integrating them with the front-end using Django's template engine and RESTful APIs.
Designed and optimized database schema using SQL, ensuring efficient data storage and retrieval.
Wrote complex SQL queries and utilized PL/SQL to enhance database performance and implement data integrity constraints.
Designed and optimized SQL queries for efficient data extraction and transformation in ETL processes.
Integrated third-party APIs and external data sources into ETL pipelines to enrich and extend data sets.
Managed project source code using Git, collaborating with a cross-functional team of developers and designers through version control.
Deployed the application to Heroku, configuring necessary environment variables and ensuring proper scalability and availability.
Used project management tools like JIRA to track and manage software defects and enhancement requests.
Implemented scalable and fault-tolerant architectures on AWS using services like Auto Scaling, Elastic Load Balancing, and Amazon SQS.
Managed and automated deployment pipelines using AWS CodePipeline and integrated code repositories with AWS CodeCommit.
Actively participated in agile ceremonies such as sprint planning, daily stand-ups, and retrospectives, ensuring timely delivery of features and alignment with project goals.
Designed and built data pipelines using SparkSQL, ensuring data quality, reliability, and scalability (see the SparkSQL sketch at the end of this section).
Implemented SparkSQL queries to optimize data processing tasks, improving query performance and reducing processing time significantly.
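
An illustrative sketch of a SparkSQL-based pipeline step as referenced above; the input path, view name, query, and output location are hypothetical examples.

    # Illustrative only: read a CSV extract, aggregate it with SparkSQL, write Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Read a hypothetical CSV extract and expose it as a temporary view.
    orders = spark.read.option("header", True).csv("s3a://example-bucket/orders.csv")
    orders.createOrReplaceTempView("orders")

    # Aggregate with SparkSQL; the result could then be loaded into a warehouse table.
    daily_totals = spark.sql("""
        SELECT order_date,
               COUNT(*) AS order_count,
               SUM(CAST(amount AS DOUBLE)) AS total_amount
        FROM orders
        GROUP BY order_date
        ORDER BY order_date
    """)

    daily_totals.write.mode("overwrite").parquet("s3a://example-bucket/daily_totals/")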

Environment: Python, Django, MVC, HTML, CSS, JavaScript, HTTP, Git, SQL, PL/SQL, Heroku, ETL, SparkSQL, AWS, SQS.