
Sukaant - Java Fullstack Developer
[email protected]
Location: Arlington, Texas, USA
Relocation: Yes
Visa: H1B
14+ years of experience in DevOps, Java, J2EE, Elixir, Kafka, Neo4j, MySQL, Cloud Computing, Hadoop, Security, Mac, and Linux.
Worked as part of a Research and Development team.
Experience as a Full Stack Developer, Backend Developer, and DevOps Engineer.
Led the design and development of a Microservices-based architecture using Java and Spring Boot, breaking down a monolithic application into independent and scalable services.
Implemented RESTful APIs adhering to best practices, facilitating seamless communication between Microservices and ensuring interoperability.
Integrated containerization using Docker, enabling consistent deployment across development, testing, and production environments, and reducing deployment time.
Orchestrated container management and scaling using Kubernetes, ensuring high availability and efficient resource utilization.
Utilized CI/CD practices with Jenkins to automate the build, test, and deployment processes, resulting in increased efficiency and shorter release cycles.
Leveraged AWS cloud platform for hosting Microservices, utilizing services like EC2, RDS, and S3 for scalable and cost-effective infrastructure.
Developed RESTful services using AWS Lambda for serverless architecture.
Deployed Java applications on AWS EC2 for scalable hosting.
Implemented centralized logging and monitoring using ELK stack, providing real-time visibility into system performance and facilitating troubleshooting.
Spearheaded the implementation of a real-time data processing system using Apache Kafka and Kafka Streams, leveraging Java for development.
Designed and developed Kafka producers and consumers, enabling seamless ingestion and processing of high-volume data streams.
Developed REST APIs using SpringBoot for scalable web services.
Implemented Microservices architecture with SpringBoot for modular systems.
Secured applications with Spring Boot Security for robust access control.
Integrated Kafka Connect for bi-directional data synchronization with external systems, ensuring data consistency and reliability.
Implemented Kafka Streams for real-time data transformation and analytics, providing actionable insights for decision-making.
Designed fault-tolerant and scalable Kafka clusters on AWS, ensuring high availability and reliability of the messaging infrastructure.
Implemented security measures such as encryption and authentication in Kafka clusters, ensuring data integrity and compliance with regulatory requirements.
Conducted performance optimization and tuning of Kafka clusters, enhancing throughput and reducing latency for real-time data processing.
Provided technical guidance and mentorship to team members on Kafka architecture and best practices, fostering knowledge sharing and collaboration.
Extensive experience with different Integrated Development Environments such as Eclipse, STS, neoEclipse, SQLYog, MySQL Workbench, and Oracle Developer.
Flexible to work with different technologies according to industry needs.

Strong communication and interpersonal skills; self-starter.
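Illustrative sketch of the Kafka producer/consumer and Kafka Streams work summarized above. This is a minimal, self-contained example rather than project code: the application id, broker address, topic names, and transformation logic are hypothetical placeholders.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class StreamProcessorSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Application id and broker address are placeholders for illustration only.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "example-stream-processor");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw events, apply a simple transformation, and write to an output topic.
        KStream<String, String> rawEvents = builder.stream("raw-events");
        rawEvents
                .filter((key, value) -> value != null && !value.isEmpty())
                .mapValues(value -> value.toUpperCase()) // stand-in for real enrichment logic
                .to("processed-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```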

PROFESSIONAL EDUCATION:
B. Tech. (Computer) (June 2011) from Bharati Vidyapeeth Deemed University College of Engineering, Pune.
Diploma in Computer Technology (June 2007) from Bharati Vidyapeeth's Jawaharlal Nehru Institute of Technology, Pune.

CERTIFICATION:
Neo4j Certified Professional.

TECHNICAL SKILLS:

Big Data: Hadoop, Cloudera
Languages: Java, J2EE, Elixir, JavaScript, Ajax, Ruby, jQuery
Frameworks: Struts, Hibernate, Spring (Core, Web, MVC, Boot, Batch), JUnit, Mockito, AngularJS, Bootstrap, Cucumber, iBatis
Design Patterns: Singleton, Factory, MVC
IDEs: Eclipse, STS, IntelliJ IDEA, VS Code, neoEclipse, SQLYog, MySQL Workbench, Oracle Developer
Web Services: REST, SOAP
Build Management Tools: Maven, Gradle, ANT, Ivy
Servers: Tomcat, Neo4j, Apifest, MySQL
Versioning Tools: SVN, Git, CVS, Stash
Testing Tools: JMeter, Postman
Tracking Tools: Jira, ZenHub
CI/CD Tools: TeamCity, Jenkins, Travis CI, GitHub Actions
Log Management Tool: Graylog
Automation Tool: ElectricCommander
Document Collaboration Tool: Confluence
Team Collaboration Tools: IdeaBoardz, Fun Retro
Repositories: GitHub, GitLab, Stash
Containerization Platform: Docker
Messaging Services: Kafka
Container Orchestrator: Kubernetes
Revision Control Tool: FishEye
Databases: Oracle, MySQL, H2, Neo4j, PrestoDB
Monitoring & Alerting Tool: Prometheus
Interactive Visualization: Grafana
Operating Systems: Linux (Ubuntu, Fedora), Windows, Mac
Cloud Computing Services: Amazon AWS (EC2, S3, EMR, RDS, EKS, IAM, Lambda), GrapheneDB, Graph Story, PCF
Google APIs: Google Calendar, Google Plus (using OAuth2)
Authentication/Authorization: JOSSO, OAuth2

PROFESSIONAL EXPERIENCE:

Client: T-Mobile, Bellevue WA Jun 2021 to Present
Position: Senior Java Developer and Lead
Project: DATA APPLICATION BACKEND
Team Size: 20
Description: Working as Senior Developer and Lead on the project Data Application Backend. The Data Application Backend is the backend part of the complete customer profile. It manages customer data such as account type, account balance, recharges, devices, mobile numbers, and plans associated with the account.
Roles & Responsibilities:
Contributed to the backend development of a large-scale project, focusing on the implementation of Microservices architecture.
Utilized Java/J2EE for application development, ensuring robust and scalable solutions.
Leveraged Spring Boot framework to create Microservice applications, enabling rapid development and deployment.
Managed dependencies efficiently via Spring Boot Starter for easy setup.
Configured properties using Spring Boot Config for flexible environments.
Deployed Spring Boot applications on the cloud for scalable infrastructure.
Implemented test-driven development (TDD) using JUnit for unit testing, ensuring code quality and reliability.
Conducted Sanity Tests using Cucumber to verify the overall functionality and behavior of the application.
Managed project build processes using Gradle, streamlining project management and ensuring efficient development workflows.
Integrated databases using Spring Boot JPA for seamless data access.
Built event-driven applications with Spring Boot for responsive designs.
Monitored applications using Spring Boot Actuator for health checks.
Employed H2 database for both embedded and server modes, facilitating easy testing and development with in-memory databases.
Practiced Agile methodologies for iterative development, enabling adaptability and responsiveness to changing requirements.
Integrated Amazon S3, Lambda, and Serverless for efficient storage, serverless computing, and managing custom logic in the application.
Utilized Postman for building and testing APIs, ensuring API reliability and functionality.
Monitored application performance and logs using Splunk, enabling real-time monitoring and troubleshooting.
Developed and managed APIs using Apigee, ensuring API security, scalability, and performance.
Managed Git repositories and version control using GitLab, facilitating collaborative development and code review.
Implemented CI/CD pipelines using GitLab CI/CD Pipeline, enabling automated testing and deployment processes.
Managed configuration secrets using Vault, ensuring secure handling of sensitive information in applications.
Utilized Jira for story tracking and project management, enabling effective collaboration and task tracking within the team.
Implemented language conversion using Motion Point, facilitating localization and multilingual support.
Generated API documentation automatically using Swagger, ensuring accurate and up-to-date documentation.
Collaborated on documentation using Confluence, enabling seamless document collaboration and knowledge sharing.
Ensured code quality using SonarQube for code quality checks, identifying and addressing code issues and vulnerabilities.
Implemented event streaming using Kafka, enabling real-time data processing and communication between Microservices.
Managed Kafka clusters using Conduktor, ensuring efficient management and monitoring of Kafka infrastructure.
Containerized applications using Docker, simplifying deployment and ensuring consistency across different environments.
Orchestrated containerized applications using Kubernetes, ensuring scalability, reliability, and efficient resource management.
Utilized Helm Charts for Kubernetes resources, enabling streamlined management and deployment of Kubernetes applications.
Monitored application metrics using Prometheus, enabling proactive monitoring and alerting for application health and performance.
Visualized application metrics and performance using Grafana, facilitating interactive visualization and analysis.
Leveraged AWS services such as Amazon S3, Lambda, and Serverless for efficient data storage, serverless computing, and running code without managing servers.
Stored data securely using AWS S3 for reliable storage.
Managed databases with AWS RDS and exposed services through AWS API Gateway for API management.
Implemented custom logic in serverless applications using Python, ensuring flexibility and extensibility in application development.
Played a key role in developing and managing APIs for a large-scale project, focusing on scalability, security, and reliability.
Utilized Java/J2EE for API development, ensuring robustness and performance of the APIs.
Leveraged Spring Boot framework for creating Microservice-based APIs, enabling rapid development and deployment.
Implemented comprehensive unit tests using JUnit, ensuring the reliability and functionality of APIs.
Conducted Sanity Tests using Cucumber, validating the overall behavior and functionality of the APIs.
Managed project build processes using Gradle, ensuring efficient project management and development workflows.
Utilized H2 database for embedded and server modes, facilitating easy testing and development with in-memory databases.
Practiced Agile methodologies for iterative development, enabling flexibility and responsiveness to changing requirements.
Integrated Amazon S3, Lambda, and Serverless for efficient storage, serverless computing, and managing custom logic in APIs.
Employed Postman for building, testing, and debugging APIs, ensuring API reliability and functionality.
Monitored API performance and logs using Splunk, enabling real-time monitoring and troubleshooting.
Developed and managed APIs using Apigee, ensuring security, scalability, and performance of APIs.
Managed Git repositories and version control using GitLab, facilitating collaborative development and code review.
Implemented CI/CD pipelines using GitLab CI/CD Pipeline, enabling automated testing and deployment processes for APIs.
Managed configuration secrets using Vault, ensuring secure handling of sensitive information in APIs.
Utilized Jira for story tracking and project management, enabling effective collaboration and task tracking within the team.
Implemented language conversion using Motion Point, facilitating localization and multilingual support for APIs.
Generated API documentation automatically using Swagger, ensuring accurate and up-to-date documentation for APIs.
Collaborated on documentation using Confluence, enabling seamless document collaboration and knowledge sharing.
Ensured code quality using SonarQube for code quality checks, identifying and addressing code issues and vulnerabilities.
Implemented event streaming using Kafka, enabling real-time data processing and communication between Microservices.
Managed Kafka clusters using Conduktor, ensuring efficient management and monitoring of Kafka infrastructure.
Containerized applications using Docker, simplifying deployment and ensuring consistency across different environments.
Orchestrated containerized applications using Kubernetes, ensuring scalability, reliability, and efficient resource management.
Utilized Helm Charts for Kubernetes resources, enabling streamlined management and deployment of Kubernetes applications.
Monitored application metrics using Prometheus, enabling proactive monitoring and alerting for API health and performance.
Visualized API metrics and performance using Grafana, facilitating interactive visualization and analysis.
Environment: Java, J2EE, PostgreSQL, Spring Boot, JPA, Microservices, JUnit, Mockito, Cucumber, Gradle, Postman, IntelliJ IDEA, GitLab, Jira, Confluence, Splunk, Kafka, Vault, Apigee, Swagger, SonarQube, Grafana, Prometheus, Docker, Kubernetes, Conduktor, Redis, Motion Point, Python, Amazon S3, AWS Lambda, Serverless.
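Illustrative sketch of the Spring Boot microservice style used on this project, combining a JPA entity, repository, and REST controller. This is not T-Mobile code: the Account entity, its fields, and the route are hypothetical, and a recent Spring Boot / Spring Data JPA setup (jakarta.persistence) is assumed.

```java
import java.util.List;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import jakarta.persistence.Entity;
import jakarta.persistence.Id;

@SpringBootApplication
public class AccountServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(AccountServiceApplication.class, args);
    }
}

// Hypothetical entity standing in for the customer account data described above.
@Entity
class Account {
    @Id
    private Long id;
    private String accountType;
    private Double balance;

    public Long getId() { return id; }
    public String getAccountType() { return accountType; }
    public Double getBalance() { return balance; }
}

// Spring Data JPA derives the query from the method name.
interface AccountRepository extends JpaRepository<Account, Long> {
    List<Account> findByAccountType(String accountType);
}

@RestController
@RequestMapping("/api/accounts")
class AccountController {

    private final AccountRepository repository;

    AccountController(AccountRepository repository) {
        this.repository = repository;
    }

    // e.g. GET /api/accounts/type/prepaid returns all accounts of that type.
    @GetMapping("/type/{accountType}")
    List<Account> byType(@PathVariable String accountType) {
        return repository.findByAccountType(accountType);
    }
}
```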

Client: Ohio Department of Transportation, Columbus OH Jun 2019 to May 2021
Position: DevOps Engineer
Project: SMART COLUMBUS OPERATING SYSTEM
Team Size: 20
Description: Worked as DevOps Engineer on the project Smart Columbus Operating System. The Smart Columbus Operating System is the nexus where the city becomes "smart". By ingesting, visualizing, and sharing open, secure data, the operating system equips public sector officials and private sector innovators to use data to activate insights. It features an Open Data Platform, which provides anytime access to the city's latest mobility data.
Roles & Responsibilities:
Led the development of a real-time data processing system from inception to deployment, overseeing all aspects of the project lifecycle.
Utilized Elixir for application development, leveraging its concurrent and fault-tolerant nature for building scalable and resilient systems.
Implemented PrestoDB for distributed database management, enabling efficient querying and analysis of large-scale datasets.
Leveraged Amazon AWS for cloud infrastructure, utilizing services such as EC2, S3, and Lambda for scalable and reliable computing and storage.
Practiced Agile methodologies for iterative and adaptive project management, ensuring responsiveness to changing requirements and continuous improvement.
Utilized VS Code IDE for coding in Elixir, providing a lightweight and efficient development environment for writing and debugging code.
Implemented continuous integration using Travis CI and GitHub Actions, ensuring automated testing and code quality checks with every code commit.
Managed continuous deployment pipelines using Jenkins, automating the deployment process and ensuring reliable and consistent deployment of application updates.
Employed Kafka for event streaming, enabling real-time data processing and communication between system components.
Managed Git repositories using GitHub, facilitating version control and collaboration among development teams.
Containerized applications using Docker, simplifying deployment and ensuring consistency across different environments.
Orchestrated containerized applications using Kubernetes, ensuring scalability, reliability, and efficient resource management.
Utilized Helm Charts for Kubernetes resources, enabling streamlined management and deployment of Kubernetes applications.
Monitored system events and performance using Prometheus, enabling proactive monitoring and alerting for system health and performance.
Visualized system metrics and performance using Grafana, facilitating interactive visualization and analysis of system data.
Managed project tasks and workflows using ZenHub, providing a seamless integration with GitHub for project management and collaboration.
Utilized Plotly for visualization, enabling the creation of interactive and visually appealing charts and graphs for data analysis and presentation.
Environment: Elixir, Phoenix, PrestoDB, Kafka, Grafana, Prometheus, Docker, Kubernetes, Terraform, ZenHub, Postman, VS Code, GitHub, Travis CI, GitHub Actions, Amazon AWS, Jenkins, Postgres, Redis.

Client: Kroger, Cincinnati OH May 2018 to Jun 2019
Position: Graph and Java Lead/Architect
Project: SUPPLIER ITEM RELATIONSHIP SOLUTION
Team Size: 12
Description: Worked as Graph and Java Lead on the project SIRS, which captures the FSMA compliance status of each item, with the compliance evaluation based primarily on the compliance status of each production facility at which the given item is made, packaged, or otherwise handled. It is the only database that knows which production facilities are owned by each vendor and which items are produced at each of those facilities.
Roles & Responsibilities:
Started project from scratch.
Worked as a lead/architect and developer in an onshore-offshore model.
Worked on
o End-to-end delivery of the project.
o Neo4j Server Installation.
o LDAP integration with Neo4j.
o PCF (Pivotal Cloud Foundry).
Worked with
o Java/J2EE for application development.
o SpringBoot for creating Microservice application.
o Spring Boot's embedded testing tools and JUnit for automated testing.
o Spring Boot DevTools for rapid development and debugging.
o Spring Boot Profiles for environment-specific configurations.
o Spring Boot's Logback and Log4j support for simplified logging.
o AWS Auto Scaling for scalable solutions under dynamic demand.
o AWS ElastiCache for caching strategies and improved performance.
o AWS DynamoDB for high-performance NoSQL database operations.
o AWS VPC for secure network isolation and management.
o AWS Elastic Load Balancing for distributing incoming traffic.
o AWS CodePipeline for automated deployments and continuous delivery.
o Spring Boot Caching to optimize performance.
o Spring Data Neo4j to map annotated entity classes to the Neo4j Graph Database.
o Spring Batch for a one-time load of data into Neo4j.
o Cypher Queries in Neo4j.
o JUnit for test-driven development.
o Mockito to add dummy functionality to mock interfaces used in JUnit.
o Cucumber for functional testing.
o Maven for building the project.
o Agile methodologies.
Used
o KIC Start (KIC) Provisioner (Kroger Internal Cloud) to create the project and set up the environment.
o Sonar, a web-based code quality analysis tool, for code quality checkpoints.
o TeamCity for build management and continuous integration.
o Stash to manage the Git repositories.
o PCF to build, develop and deploy app.
o Jira for bug tracking, issue tracking and project management.
Environment: Java, J2EE, Neo4j, SpringBoot, Spring Data Neo4j, Spring Batch, Microservices, JUnit, Mockito, Cucumber, Maven, Postman, IntelliJ IDEA, Stash, TeamCity, PCF, Jira, Graylog, Confluence.
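Illustrative sketch of the Spring Data Neo4j mapping and Cypher query work listed above. The node labels, properties, relationship type, and query are hypothetical and do not reflect the actual SIRS model; the annotations assume a recent Spring Data Neo4j release (older OGM-based versions use different annotations).

```java
import java.util.List;

import org.springframework.data.neo4j.core.schema.GeneratedValue;
import org.springframework.data.neo4j.core.schema.Id;
import org.springframework.data.neo4j.core.schema.Node;
import org.springframework.data.neo4j.core.schema.Relationship;
import org.springframework.data.neo4j.repository.Neo4jRepository;
import org.springframework.data.neo4j.repository.query.Query;
import org.springframework.data.repository.query.Param;

// Hypothetical graph model: an item produced at one or more facilities.
@Node("Item")
class Item {
    @Id @GeneratedValue
    private Long id;
    private String upc;
    private String complianceStatus;

    @Relationship(type = "PRODUCED_AT")
    private List<Facility> facilities;
}

@Node("Facility")
class Facility {
    @Id @GeneratedValue
    private Long id;
    private String name;
    private String complianceStatus;
}

interface ItemRepository extends Neo4jRepository<Item, Long> {

    // Items whose producing facilities all carry the given compliance status;
    // the Cypher is shown for illustration only.
    @Query("MATCH (i:Item)-[:PRODUCED_AT]->(f:Facility) "
         + "WITH i, collect(f.complianceStatus) AS statuses "
         + "WHERE all(s IN statuses WHERE s = $status) "
         + "RETURN i")
    List<Item> findItemsWithAllFacilitiesInStatus(@Param("status") String status);
}
```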



Client: GAP Inc, SFO CA Jun 2017 to May 2018
Position: Lead Software Development Engineer
Project: PROFILE ECOM PERSONALIZATION ATTRIBUTE SERVICE
Team Size: 9
Description: Worked as Lead Software Development Engineer for our esteemed client GAP Inc on the Profile Ecom project for their Fashion and Retail e-commerce website. PAS is a mechanism to enable on-site personalization by making visitor attributes available to the Ecom/mobile site. With PAS, we associate attributes with people based on what we know about them, so that we can show different on-site messages and discounts to different people. PAS provides the mechanism by which we can attach values to individuals and surface them to the presentation layer and promotion system for customer-facing actions.
Roles & Responsibilities:
Worked as a developer and lead for onshore-offshore model.
Worked with
o Java/J2EE for application development.
o Spring MVC for Model-View-Controller application.
o Struts and Tiles for Web application.
o JavaScript, Ajax, jQuery, and AngularJS for the frontend.
o JUnit for test-driven development.
o AWS KMS for encrypting data in transit.
o AWS Athena for querying and analyzing data in S3.
o AWS ECS for managing containerized applications.
o Mockito to add dummy functionality to mock interfaces used in JUnit.
o Cucumber for functional testing.
o Gradle for building the project.
o Ivy dependency management tool.
o Micro Services.
o H2 database for embedded, server, and in-memory modes.
o Agile methodologies.
o Transmit Security to keep the logs of authorized and unauthorized users.
Used
o GitHub, a web-based Git repository hosting service.
o SourceTree Git client.
o BrowserStack to test website for cross browser compatibility on real browsers.
o Jira for bug tracking, issue tracking and project management.
o ElectricCommander for automating the software build, test and release process.
o Confluence for document collaboration.
o Fun Retro for team collaboration.
o PlanningPoker to make agile estimating for sprint planning.
Environment: Java, J2EE, Struts, Tiles, Spring MVC, iBatis, JUnit, Mockito, Cucumber, Gradle, Oracle, H2, JavaScript, Ajax, jQuery, Postman, Eclipse, SVN, GitHub, SourceTree, BrowserStack, Jira, ElectricCommander, Confluence, Fun Retro, PlanningPoker.
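Illustrative JUnit/Mockito sketch in the spirit of the test-driven development listed above. The PersonalizationService and AttributeRepository types are hypothetical stand-ins created for this example, and JUnit 5 is assumed for brevity.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.util.Map;

import org.junit.jupiter.api.Test;

class PersonalizationServiceTest {

    // Hypothetical collaborator that supplies visitor attributes.
    interface AttributeRepository {
        Map<String, String> attributesFor(String visitorId);
    }

    // Hypothetical service under test.
    static class PersonalizationService {
        private final AttributeRepository repository;

        PersonalizationService(AttributeRepository repository) {
            this.repository = repository;
        }

        String discountTierFor(String visitorId) {
            // Trivial stand-in for the real attribute-to-offer mapping.
            return repository.attributesFor(visitorId).getOrDefault("tier", "standard");
        }
    }

    @Test
    void returnsTierFromVisitorAttributes() {
        AttributeRepository repository = mock(AttributeRepository.class);
        when(repository.attributesFor("visitor-42")).thenReturn(Map.of("tier", "gold"));

        PersonalizationService service = new PersonalizationService(repository);

        assertEquals("gold", service.discountTierFor("visitor-42"));
        verify(repository).attributesFor("visitor-42");
    }
}
```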

Client: GAP Inc, SFO CA Jan 2017 to Jun 2017
Position: Lead Software Development Engineer
Project: ACT-SAM
Team Size: 9
Description: Worked as Lead Software Development Engineer for our esteemed client GAP Inc (Fashion and Retail) on the project ACT-SAM, an Assorting and Costing Tool for Strategic Alliance Merchandizing in the Franchise Retail Import Order and Supply Chain. Responsibilities included development and handling the onshore and offshore team. The work involved importing data from different sources, such as URLs and databases, using a thread pool, mapping and loading it into our own database, and making it available on request.
Roles & Responsibilities:
Worked as a developer and lead for onshore-offshore model.
Worked with
o Java/J2EE for application development.
o SpringBoot for building and deploying applications with minimal effort.
o Spring Batch for execution of a series of jobs to read and write resources.
o Thread Pool for concurrent execution.
o Spring MVC for Model-View-Controller application.
o JUnit for test-driven development.
o Mockito to add dummy functionality to mock interfaces used in JUnit.
o Ruby for functional testing.
o AngularJS and Bootstrap for frontend development.
o Gradle for building the project.
o Agile methodologies.
Used
o Jira for bug tracking, issue tracking and project management.
o ElectricCommander for automating the software build, test and release process.
o FishEye for revision control.
o Confluence for document collaboration.
o IdeaBoardz for team collaboration.
Environment: Java, J2EE, Spring (Boot, MVC, Batch), JUnit, Mockito, Ruby, Gradle, AngularJS, Bootstrap, Oracle, Postman, Eclipse, Jira, ElectricCommander, Confluence, IdeaBoardz.
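Illustrative sketch of the thread-pool based import flow described for ACT-SAM. The source URLs, fetch logic, and persistence step are placeholders for this example only; the real project combined Spring Batch jobs with concurrent execution.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

public class ImportRunnerSketch {

    public static void main(String[] args) {
        // Hypothetical sources; the real project read from several URLs and databases.
        List<String> sources = List.of("https://example.com/feed-a", "https://example.com/feed-b");

        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Fetch each source on the pool, then map and persist the result.
        List<CompletableFuture<Void>> imports = sources.stream()
                .map(source -> CompletableFuture
                        .supplyAsync(() -> fetch(source), pool)
                        .thenAccept(ImportRunnerSketch::saveToDatabase))
                .collect(Collectors.toList());

        CompletableFuture.allOf(imports.toArray(new CompletableFuture[0])).join();
        pool.shutdown();
    }

    private static String fetch(String source) {
        // Placeholder for an HTTP or JDBC read of the given source.
        return "payload-from-" + source;
    }

    private static void saveToDatabase(String payload) {
        // Placeholder for mapping the payload and writing it to the application's database.
        System.out.println("Persisted " + payload);
    }
}
```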

Client: DirecTV (Part of AT&T), El Segundo CA Aug 2016 to Jan 2017
Position: Applications Developer
Project: PROMOTION AND OFFER APPLIANCE MANAGEMENT
Team Size: 8
Roles & Responsibilities:
Worked with
o JavaScript, Bootstrap, and jQuery for designing the frontend.
o Cloudera Platform for BigData (Apache Hadoop).
o Sqoop for transferring data from Oracle to Hive.
o Hue for accessing the data through the UI provided by Cloudera.
o Hive for storage of data for Hadoop.
o Oozie to create the CSV files.
o Neo4j CSV import to import the data from CSV file to Neo4j.
o Spring MVC and SpringBoot.
o MVC Framework.
o Singleton design pattern.
o Agile methodologies.
Used
o Neo4j for storing data in graphical format.
o Java for implementing our own processing on the data.
o Maven for building the project.
Environment: Java, J2EE, Hadoop, Sqoop, Hive, Hue, HDFS, Maven, Oracle, Postman, Cloudera, Neo4j, STS, Spring (MVC, Boot), JavaScript.
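Illustrative sketch of loading an Oozie-generated CSV into Neo4j from Java, in the spirit of the CSV import step above. The file path, node label, properties, and credentials are hypothetical, and the official Neo4j Java driver (4.x style) is assumed here; the project itself used Neo4j's own CSV import tooling.

```java
import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;
import org.neo4j.driver.Session;

public class CsvToNeo4jSketch {

    public static void main(String[] args) {
        // Connection details are placeholders; the CSV must sit in Neo4j's import directory.
        try (Driver driver = GraphDatabase.driver("bolt://localhost:7687",
                AuthTokens.basic("neo4j", "password"));
             Session session = driver.session()) {

            // LOAD CSV reads the exported file and merges one node per row.
            session.run(
                "LOAD CSV WITH HEADERS FROM 'file:///offers.csv' AS row " +
                "MERGE (o:Offer {id: row.offer_id}) " +
                "SET o.name = row.offer_name, o.channel = row.channel");
        }
    }
}
```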

Client: Lowe's, Charlotte NC Apr 2016 to Jul 2016
Position: Senior Developer
Project: ENTERPRISE LOYALTY
Team Size: 20
Roles & Responsibilities:
Worked with
o Java/J2EE for application development.
o SpringBoot for creating Microservice applications.
o JUnit for test-driven development.
o Cucumber for BDD.
o MapStruct for mapping between Java bean types.
o Jsonschema2pojo to generate Java types from JSON Schema.
o Maven for building the project.
o H2 database for embedded, server, and in-memory modes.
o PostgreSQL for object-relational database management.
o Google Cloud Platform for cloud hosting.
o Agile methodologies.
Used
o Spinnaker for Cloud Native Continuous Delivery.
o Stash (Bitbucket) to manage the Git repositories.
o Jenkins for CI/CD pipelines.
o Vault as config secrets manager for our apps.
o Jira for story tracking.
o Swagger for auto-generated API docs.
o SonarQube for code quality checks.
o Kafka to handle event streaming.
o Docker for maintaining images.
o Confluence for document collaboration.
o Kubernetes for container orchestration.
o Helm Charts for Kubernetes resources.
o Prometheus for event monitoring and alerting.
o Grafana for interactive visualization.
Environment: Java, J2EE, PostgreSQL, SpringBoot, JPA, Spring Batch, Microservices, JUnit, MapStruct, Jsonschema2pojo, Mockito, Cucumber, Maven, Postman, IntelliJ IDEA, Stash, Jira, Kibana, Confluence, Carbon, Jenkins, Spinnaker, Kafka, Vault, GCP, Swagger, SonarQube, Grafana, Prometheus, Docker, Kubernetes, DBeaver, Kafka Tool, GitHub, Redis.
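Illustrative MapStruct sketch of the bean-mapping work listed above. The MemberEntity/MemberDto types and their fields are hypothetical; MapStruct's annotation processor generates the mapper implementation at build time.

```java
import org.mapstruct.Mapper;
import org.mapstruct.Mapping;
import org.mapstruct.factory.Mappers;

// Hypothetical persistence-side and API-side representations of a loyalty member.
class MemberEntity {
    public Long id;
    public String emailAddress;
    public int loyaltyPoints;
}

class MemberDto {
    public Long id;
    public String email;
    public int points;
}

@Mapper
interface MemberMapper {
    MemberMapper INSTANCE = Mappers.getMapper(MemberMapper.class);

    // Fields whose names differ between the entity and the DTO are mapped explicitly;
    // identically named fields (id) are mapped automatically.
    @Mapping(source = "emailAddress", target = "email")
    @Mapping(source = "loyaltyPoints", target = "points")
    MemberDto toDto(MemberEntity entity);
}
```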

Client: Applied Materials, India Sep 2015 to Mar 2016
Position: Lead Software Engineer
Project: MACHINE VISUALIZATION
Team Size: 4

Client: Lukup Media, India Apr 2014 to Sep 2015
Position: Software Engineer
Project: SEARCH RECOMMENDATION VISUALIZATION AND SECURITY
Team Size: 10

YBN Technology & Marketing Solutions, India Jul 2013 to Apr 2014
Position: Senior Software Developer
Project: COMMONSLOT
Team Size: 2

CSR Global, India Nov 2012 to Jun 2013
Position: Software Engineer
Project: CORPORATE ACCOMMODATION MANAGEMENT SYSTEM
Team: 10

iStepup Services, Pune Jun 2011 to Nov 2012
Position: Software Engineer
Project: Cab Booking System
Team Size: 2

iStepup Services, Pune Jun 2009 to May 2011
Position: Junior Software Engineer
Project: ONLINE TRAINING AND PLACEMENT PORTAL
Team Size: 4
