Anindita Bhattacharya
Python Developer
Email: [email protected]
Contact: 571 751 1099
Current Location: Henderson, Nevada, USA
Visa Status: H4 EAD
Relocation: No (open to hybrid/onsite within the state, and remote)
LinkedIn: linkedin.com/in/ani-b-7b945a242

Professional Summary:

10+ years of experience in the analysis, design, development, management, and implementation of standalone and client-server enterprise application software.
Experienced with the full software development life cycle, architecting scalable platforms, object-oriented programming, database design, and Agile methodologies.
Expertise in developing web-based applications using Python, Django, Flask, HTML, XML, Angular, AngularJS, CSS, DHTML, JavaScript, JSON, and jQuery.
Well versed in designing and developing the presentation layer for web applications using technologies such as HTML, CSS, and JavaScript.
Experience with AWS (Amazon Web Services) cloud infrastructure, including provisioning AMI-based virtual machines on Elastic Compute Cloud (EC2).
Experience configuring and developing against database servers including MySQL, MS SQL Server, Oracle, and MongoDB.
Expertise in test automation and continuous delivery of web applications, client-server applications, and web services/REST APIs using Python.
Experience developing web services (WSDL, SOAP, and REST) and consuming web services in Python.
Experience with the Requests, NumPy, SciPy, Matplotlib, Jupyter Notebook, Colab, urllib2, Beautiful Soup, and pandas libraries throughout the development life cycle.
Expertise in interacting with databases through ORM frameworks such as the Django ORM and SQLAlchemy.
Utilized integrated development environments including Visual Studio Code and PyCharm.
Experience with defect/issue/bug tracking tools such as Atlassian Jira.
Good interpersonal, analytical, and presentation skills, with the ability to work in both self-managed and team environments.
Created automated Selenium test scripts to navigate websites, interact with page elements, and validate that web applications function correctly.
Developed a price-monitoring tool for e-commerce websites using Selenium and REST APIs, with Selenium scraping product prices and details from multiple sites.
Used Selenium to automate posting content to various social media platforms.
Scraped data from websites with Selenium, then processed and analyzed it through REST APIs.
Automated data extraction from web applications such as CRM systems and data-visualization tools using Selenium.
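The scraping half of the price-monitoring work described above can be sketched as follows; the HTML snippet and CSS class names are hypothetical, and in the real tool Selenium's `driver.page_source` would supply the rendered page before parsing.

```python
from bs4 import BeautifulSoup

# Hypothetical product-page markup; in practice the rendered HTML
# would come from Selenium rather than a string literal.
HTML = """
<div class="product">
  <span class="title">Widget Pro</span>
  <span class="price">$19.99</span>
</div>
"""

def extract_price(html: str) -> float:
    """Parse the price out of a product card (assumed CSS classes)."""
    soup = BeautifulSoup(html, "html.parser")
    price_text = soup.select_one(".product .price").get_text(strip=True)
    return float(price_text.lstrip("$"))

print(extract_price(HTML))  # 19.99
```

In the full pipeline, the extracted price would then be POSTed to the alerting REST API.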

Technical Skills:

Languages: Python, SQL, Shell Scripting
Python Libraries: Beautiful Soup, NumPy, SciPy, Matplotlib, python-twitter, pandas (DataFrames), urllib2, Plotly Dash
Frameworks: Django, Flask, Angular
Web Services: SOAP, RESTful
Cloud Platforms: Amazon Web Services (AWS)
IDEs/Tools: PyCharm, Sublime Text, Spyder, NetBeans, Eclipse
Application/Web Servers: Apache Tomcat, IBM WebSphere, BEA WebLogic, Nginx
Databases: Oracle, DB2, MySQL, MongoDB
Version Control Systems: Git, GitHub
Project Management Tools: Jira, HP ALM, Miro
Operating Systems: Windows, iOS, Android

Professional Experience:
Client: Amgen Nov 2022 - Present
Role: Python Developer

Responsibilities:
Performed efficient delivery of code based on principles of Test-Driven Development (TDD) and continuous integration to keep in line with Agile Software Methodology principles.
Worked on building out page views, templates, and CSS layouts.
Modularized code through object-oriented programming.
Used the Plotly Dash library to pull the data, which was used as the data repository.
Used Databricks and Amazon One cloud to access the repository and executed all functions inside the AWS workspace.
Utilized PyUnit, the Python unit test framework, for all Python applications and used database APIs to access database objects.
Improved code reuse and performance by making effective use of various design patterns.
Involved in debugging applications tracked in Jira and Miro, following Agile methodology.
Used Python distributions such as PyMySQL to connect to databases and manipulate data in Python.
Walked through the code and the project architecture, analyzed every aspect of the project, and made the required updates and enhancements.
Modularized the code base per the client's requirements, proposed approaches and worked in those areas, explored converting models from Matplotlib to Plotly Dash for richer visualization, and built a regression model.
Evaluated Spotfire against Tableau as the data analytics platform and provided a final recommendation to the client, optimizing technology consumption and cost.
Maintained an effective delivery roadmap on the Jira board.
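The PyUnit (unittest) usage mentioned above can be sketched as follows; the function under test is invented for illustration and is not code from the client project.

```python
import unittest

def moving_average(values, window):
    """Illustrative function under test: trailing moving average
    over a fixed window size."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

class MovingAverageTests(unittest.TestCase):
    def test_window_of_two(self):
        self.assertEqual(moving_average([1, 2, 3, 4], 2), [1.5, 2.5, 3.5])

    def test_window_equals_length(self):
        self.assertEqual(moving_average([2, 4, 6], 3), [4.0])

if __name__ == "__main__":
    unittest.main(exit=False)
```

Running the module executes both tests via the standard unittest runner; the same style applies to database-API tests with fixtures.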

Environment: Python, Angular, TypeScript, AWS, HTTP, PyQt, HTML, MySQL, Jenkins, Git, Jira, Agile, Miro, Windows.
Client: Pacific Specialty Insurance (Remote) Jan 2019 - Apr 2020
Role: Data Engineer

Responsibilities:
Involved in the analysis, specification, design, implementation, and testing phases of the Software Development Life Cycle (SDLC), using Agile methodology to develop the application.
Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture; utilized Python libraries such as Boto3 and NumPy for AWS.
Responsible for developing impressive UI using HTML, jQuery, CSS, Angular and Bootstrap.
Developed views and templates with Python and Django view controller and templating language to create a user-friendly website interface.
Built the roadmap, cleared backlogs, and communicated with stakeholders via daily status calls and meetings.
Designed presentations and developed an understanding of the business, its scalability, and its blockers through detailed research.
Used Python-distributions like PyMySQL to connect with the databases and manipulate data in python language.
Involved in Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins along with Shell script.
Used GitHub for Python source-code version control and Jenkins for automating builds of Docker containers.
Utilized PyUnit, the Python unit test framework, for all Python applications and used the Django database APIs to access database objects.
Improved code reuse and performance by making effective use of various design patterns.
Wrote Selenium test scripts to automate testing of the insurance application, performing end-to-end tests and verifying the application's functionality, security, and data integrity.
Conducted regression testing with Selenium to ensure that new updates or changes to the insurance application did not introduce bugs or negatively affect existing features.
Performed cross-browser testing with Selenium to ensure the insurance application works correctly across browsers such as Chrome, Firefox, and Edge.
Used Selenium to validate data entry and processing within the insurance application, ensuring accurate and reliable insurance data.
Integrated external services such as payment gateways, document management systems, and third-party data providers into the insurance application via RESTful APIs.
Streamlined insurance claims processing through automation, using APIs to communicate with external databases, validate claim information, and automate approval or denial workflows.
Automated customer communication through APIs, including policy updates, claim-status notifications, and renewal reminders sent via email or SMS.
Prioritized security and compliance testing with Selenium, including testing for vulnerabilities, verifying data encryption, and confirming compliance with industry regulations (e.g., GDPR and HIPAA).
Conducted performance testing with Selenium, simulating large numbers of users and analyzing system response times and resource utilization.
Identified and reported issues and bugs in the insurance application, using JIRA and Confluence for issue tracking and collaboration.
Integrated Selenium tests into the CI/CD pipeline so that automated tests run with each code change.
Documented test cases, test results, and automation procedures, and provided training and guidance to team members and stakeholders on using Selenium and APIs for testing and automation.
Involved in debugging applications tracked in JIRA, following Agile methodology.
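The claims-automation bullets above can be illustrated with a small rule-based triage function; the field names and the auto-approval threshold are invented for the sketch, not taken from the Pacific Specialty system.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """Hypothetical claim record; real schemas carry far more fields."""
    policy_id: str
    amount: float
    policy_active: bool

def decide(claim: Claim, auto_approve_limit: float = 5000.0) -> str:
    """Rule-based triage: deny invalid claims, auto-approve small ones,
    and route the rest to manual review. Threshold is illustrative."""
    if not claim.policy_active or claim.amount <= 0:
        return "denied"
    if claim.amount <= auto_approve_limit:
        return "approved"
    return "manual_review"

print(decide(Claim("P-100", 1200.0, True)))  # approved
```

In the real workflow this decision step would sit behind a REST endpoint, with the denial/approval result driving the customer-notification APIs.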

Environment: Python, Django, Angular, TypeScript, Angular CLI, NPM, Node.js, AWS, Boto3, Beautiful Soup 4, HTTP, PyQt, XML, MySQL, Jenkins, Git, Jira, Agile, Selenium, API, Windows.

Client: Sun Life Financials Nov 2017 - Dec 2018
Role: Python Developer

Responsibilities:
Developed Python Django forms to record data and the Login module page for users.
Designed email marketing campaigns and created interactive forms that saved data into database using Django Framework.
Used Amazon SQS to queue work to run asynchronously on distributed Amazon EC2 nodes.
Developed responsive UIs using HTML5/CSS3, AngularJS, and JavaScript.
Extensive experience with AWS services such as S3, ELB, EBS, Auto Scaling, Route 53, CloudFront, IAM, CloudWatch, and RDS.
Implemented AngularJS controllers to maintain each view's data.
Worked in test-driven development with Behave in Python and created Behave scripts using Gherkin syntax.
Added support for Amazon S3 and RDS to host static/media files and the database in the Amazon cloud.
Troubleshooting AWS Auto scaling and EC2 instances and Redshift related issues.
Developed an automated testing framework for command-line tests on Linux using object-oriented Perl, and for Selenium-based tests using Python.
Developed and tested dashboard features using CSS, AngularJS, and Bootstrap.
Improved data-processing performance by modifying functions, queries, cursors, triggers, and stored procedures in the MySQL database, and designed the Cassandra schema for the APIs.
Used the Django framework to develop web applications implementing the MVT architecture.
Automated continuous integration and deployments using Jenkins, Docker, Ansible, and AWS CloudFormation templates.
Used Amazon Web Services (AWS) for improved efficiency of storage and fast access.
Designed and developed the UI of the website using HTML, XHTML, AJAX, CSS, and JavaScript.
Created automated Selenium test scripts to navigate the website, interact with page elements, and validate that the web application functions correctly, including filling out forms, clicking buttons, and verifying that expected elements are present; this covered both web testing and regression testing.
Developed a price-monitoring tool for e-commerce websites using Selenium and REST APIs: Selenium scraped product prices and details from multiple e-commerce sites, and the collected data was sent to a REST API that stored the information and let users set price alerts, sending a notification when a product's price dropped to the desired level.
Used Selenium to automate posting content to social media platforms such as Facebook and Twitter: the scripts logged in to the user's accounts, navigated to the posting pages, and uploaded text, images, or videos, with a REST API used to schedule posts at specific times.
Scraped data from websites using Selenium and processed and analyzed it through REST APIs, for example scraping product information from e-commerce sites and using REST APIs for sentiment analysis of customer reviews and for price comparisons.
Automated data extraction from web applications such as CRM systems and data-visualization tools using Selenium, with a REST API generating automated reports in various formats (PDF, Excel, etc.) and distributing them to stakeholders.
Used Selenium to periodically check website availability and functionality; when the site went down or displayed errors, Selenium triggered a REST API to send alerts to administrators or initiate automated recovery processes.
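The availability-monitoring idea in the last bullet can be sketched without a live browser by injecting the status check; `fetch_status` and `send_alert` are hypothetical stand-ins for the Selenium/HTTP check and the REST notification call.

```python
from typing import Callable

def monitor(url: str,
            fetch_status: Callable[[str], int],
            send_alert: Callable[[str], None]) -> bool:
    """Return True if the site looks healthy; otherwise fire an alert.
    fetch_status stands in for a Selenium or HTTP health check, and
    send_alert for the REST notification endpoint."""
    status = fetch_status(url)
    if status != 200:
        send_alert(f"{url} returned HTTP {status}")
        return False
    return True

# Usage with fake collaborators (no network or browser needed):
alerts = []
monitor("https://example.com", lambda u: 503, alerts.append)
print(alerts)  # ['https://example.com returned HTTP 503']
```

Injecting the collaborators keeps the scheduling/alerting logic unit-testable independently of Selenium and the alert API.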

Environment: Python, Django, Selenium, HTML, CSS, Bootstrap, MySQL, PostgreSQL, XML, GitHub, Jenkins, Node.js, RDBMS, REST APIs, Jira, Agile, Windows.

Client: Credit One Bank, Las Vegas, NV Aug 2015 - June 2017
Role: Data Analyst

Responsibilities:
Involved in all phases of software development life cycle for the case assignment and case management modules.
Developed frontend and backend modules in Python with Django, including the Tastypie web framework, using Git.
Implemented various functions in NumPy and Pandas for mathematical operations and arrays.
Designed the application using Python, Django, HTML, JSON and jQuery. Worked on backend of the application.
Used Python to write data into JSON files for testing Django websites; created scripts for data modelling and data import and export.
Involved in development of Web Services using SOAP for sending and getting data from the external interface in the XML format.
Created data tables utilizing PyQt to display customer information and to add, delete, and update customer records, and used PyQuery to select particular DOM elements when parsing HTML.
Used Jenkins for continuous integration and deployment. Managed continuous maintenance and troubleshooting of the Project.
Used JIRA for bug tracking and issue tracking and used Agile Methodology and SCRUM Process.
Responsible for debugging and troubleshooting the web application.
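The JSON import/export scripting described above can be sketched with the standard library; the customer records and field names are invented for the example, not taken from the client's schema.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical customer records used as a test fixture.
records = [
    {"id": 1, "name": "Alice", "balance": 120.50},
    {"id": 2, "name": "Bob", "balance": 0.0},
]

def export_records(records, path):
    """Serialize records to a JSON file."""
    Path(path).write_text(json.dumps(records, indent=2))

def import_records(path):
    """Load records back from a JSON file."""
    return json.loads(Path(path).read_text())

with tempfile.TemporaryDirectory() as tmp:
    fixture = Path(tmp) / "customers.json"
    export_records(records, fixture)
    roundtrip = import_records(fixture)

print(roundtrip == records)  # True
```

A round-trip check like this is a quick way to verify that the export and import scripts agree on the data model.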

Environment: Python, Django, HTML5, CSS3, Bootstrap, AWS, Boto3, XML, JSON, PyQt, PyQuery, Jenkins, Jira, Git, Agile, Windows.

Client: Eppendorf Sep 2009 - June 2013
Role: Intern Data Analyst

Responsibilities:
Developed and tested dashboard features using CSS, AngularJS, and Bootstrap.
Improved data-processing performance by modifying functions, queries, cursors, triggers, and stored procedures in the MySQL database, and designed the Cassandra schema for the APIs.
Used the Django framework to develop web applications implementing the MVT architecture.
Automated continuous integration and deployments using Jenkins, Docker, Ansible, and AWS CloudFormation templates.
Used Amazon Web Services (AWS) for improved storage efficiency and fast access.
Added support for Amazon S3 and RDS to host static/media files and the database in the Amazon cloud.
Designed and developed the UI of the website using HTML, XHTML, AJAX, CSS, and JavaScript.
Created automated Selenium test scripts to navigate the website, interact with page elements, and validate that the web application functions correctly, including filling out forms, clicking buttons, and verifying that expected elements are present; this covered both web testing and regression testing.
Developed a price-monitoring tool for e-commerce websites using Selenium and REST APIs: Selenium scraped product prices and details from multiple e-commerce sites, and the collected data was sent to a REST API that stored the information and let users set price alerts, sending a notification when a product's price dropped to the desired level.
Used Selenium to automate posting content to social media platforms such as Facebook and Twitter: the scripts logged in to the user's accounts, navigated to the posting pages, and uploaded text, images, or videos, with a REST API used to schedule posts at specific times.
Scraped data from websites using Selenium and processed and analyzed it through REST APIs, for example scraping product information from e-commerce sites and using REST APIs for sentiment analysis of customer reviews and for price comparisons.
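The review sentiment analysis mentioned above ran through REST APIs; as a stand-alone illustration of the idea, a naive keyword-based scorer looks like this (the word lists and scoring rule are invented for the sketch, not the API's actual method).

```python
# Hypothetical polarity word lists for the illustration.
POSITIVE = {"great", "excellent", "love", "reliable"}
NEGATIVE = {"broken", "terrible", "slow", "refund"}

def sentiment(review: str) -> str:
    """Naive bag-of-words polarity: count hits in each word list
    and classify by the sign of the difference."""
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Great product, very reliable"))  # positive
```

A production pipeline would instead send batches of scraped reviews to the sentiment REST API and aggregate the returned labels.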

Environment: Python, Django, Selenium, HTML, CSS, Bootstrap, MySQL, PostgreSQL, XML, GitHub, Jenkins, Node.js, RDBMS, REST APIs, Jira, Agile, Windows.

Eppendorf India January 2007 - May 2009
Technical Analyst

Responsibilities:
Prepared presentations on future campaigns and prospects of the company. Strong knowledge of MS Office, PowerPoint, and SQL. Strong interpersonal, verbal, and written communication skills. Experienced in all phases of the Software Development Life Cycle (SDLC).
Competent in analyzing the current business process and creating future state business process flow using industry standard BPM notations.
Proven history of success in Wealth Management.
Proficient in analyzing and creating narrative use cases, user stories, storyboards, use case diagrams, activity diagrams, and data/flow/navigational-flow diagrams using UML tools such as MS Visio.
Strong knowledge in capital market products, Treasury, Asset Liability Management (ALM), Custodianship, Asset management, Derivatives.
Excellent skills in facilitating Joint Application Development (JAD) sessions to elicit functional requirements that support high-level business requirements.
Knowledge of process modeling using a structured approach in MS Visio; designed and reviewed various documents, including Software Requirement Specifications (SRS) with use cases, Business Requirements Documents (BRD), Use Case Specifications, Functional Specifications (FSD), Systems Design Specifications (SDS), High-Level Design documents (HLD), Requirement Traceability Matrices (RTM), and testing documents.
Experience in working with Agile SAFe, JIRA, APIs and Common Reporting Standard (CRS).
Prepared invoices, reports, memos, letters, financial statements, and other documents using word processing, spreadsheet, database, and presentation software; prepared responses to correspondence containing routine inquiries.
Handled filing and records management; collected, compiled, evaluated, and reported department-specific program information; arranged travel schedules and reservations for executive management as needed.
Maintained department files and records as required, handled day-to-day operational data, and delegated tasks within the team.
Liaised with Germany office counterparts on new business developments.

Education Profile:
B.A. (Honors) in Communicative English from the University of Calcutta, India (2005).
Postgraduate degree in Mass Communication from Jadavpur University, India (2007).
Postgraduate degree in Communicative English & Literature from RBU, India (2009).
Technical training in C++, Python, SQL, and Java since 2009.