
Prabbat - Java Developer
[email protected]
Location: Seattle, Washington, USA
Relocation: Yes
Visa:
SUMMARY:
9+ years of professional experience in the IT industry, with a focus on developing, implementing, and maintaining a variety of applications using Java, J2EE technologies, and object-oriented methodologies. Skilled in enterprise technologies, frameworks, and design patterns.
Certified Scrum Master with expertise in delivering projects through Agile and Test-Driven Development (TDD).
Experienced with J2SE technologies such as the Collections API, threads, the Executor framework, CompletableFuture and Futures, and exception handling, as well as J2EE technologies such as Servlets, Listeners, JSP, and the Java Security API.
Experience writing backends using Node.js with frameworks like Express and the MongoDB database.
Expertise in the implementation of Core concepts of Java, J2EE Technologies: JSP, Servlets, JSF, JSTL, EJB transaction implementation, Spring, Hibernate, Java Beans, JDBC.
Good experience and knowledge in various development methodologies like Test Driven Development (TDD), Extreme Programming (XP), Scrum, Agile.
Experience working with containerization technologies such as Docker, Kubernetes, and OpenShift.
Experience designing and architecting with UML-based diagrams through tools like PlantUML and Lucidchart.
Highly proficient in Spring Boot, the Spring Framework, Hibernate, Angular 8, React JS, TypeScript, and the JUnit framework.
Proficient in using Enterprise Integration Patterns (EIP) with Apache Camel, such as multicast, dynamic router, content-based router, splitter, and recipient list.
Experience developing and maintaining event-driven applications using messaging frameworks with pub-sub and point-to-point messaging providers in a multi-server environment.
Extensively worked on Collections, Generics, Enumerations, Annotations.
Have knowledge of Spring Cloud to develop Spring Boot-based Microservices interacting through REST.
Expertise in Object Oriented Analysis and Design (OOAD), OOPS using Unified Modeling Language (UML), Design Patterns, MVC Frameworks.
In-depth knowledge of Apache Subversion (SVN), Git, Bitbucket, and Jenkins continuous integration server installation, configuration, design, and administration, and of integrating these tools with other systems.
Experienced in working with Amazon Web Services such as EC2, S3, CloudWatch, DynamoDB, SQS, Lambda, and SNS.
Proficiency with big data processing frameworks like Apache Spark and Apache Kafka.
Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing. Experience optimizing ETL workflows for Big Data.
Proficient in Service-Oriented Architecture (SOA); experienced in the development and use of web services.
Proficient in using caching technologies like Hazelcast and Redis and integrating them into applications.
Worked on various J2EE applications on application servers such as WebLogic 10.3, WebSphere, JBoss Fuse 6.1, and Tomcat.
Experience writing JUnit tests using Mockito and PowerMockito, and behavior-based tests using Spock and Cucumber.
Extensive experience developing applications using React JS, Redux, Angular 6/7/8, TypeScript, ECMAScript 2015 (ES6), HTML5/4, CSS3/2, SASS, LESS, Bootstrap, and web content management (CMS) tools.
Experience in writing efficient queries in SQL performing joins, indexing and optimizing access paths.
Experience in implementing e-commerce/distributed applications using HTML, HTML5, CSS, JavaScript, Java, J2EE, Servlets, JSP, Java Beans, JDBC, EJB, XML, XPATH, JAXB, JAXP, SQL, jQuery, Unix, Linux and Windows.
Designed and implemented XML schemas, Java APIs, business logic and XML/JavaScript user interfaces.
Extensive experience with developing web and enterprise applications with development tools like Eclipse, IntelliJ and WebLogic.
Extensive experience in developing unit testing frameworks using Junit and test-driven methodology.
Experience building projects with the Maven and Ant build systems.
Proficient in Core Java concepts like Multi-threading, Collections and Exception Handling concepts.
Experience in version control tools like SVN, GitHub and BitBucket.
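As a small illustration of the Executor framework and CompletableFuture experience listed above, the sketch below composes two concurrent calls; the fetch methods are placeholders, not code from any project in this resume.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Illustrative only: two placeholder "remote calls" run concurrently on a
// small pool and their results are combined once both complete.
class AsyncDemo {
    static int fetchPrice() { return 40; } // stand-in for a remote call
    static int fetchTax() { return 2; }    // stand-in for a remote call

    static int totalCost() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        try {
            CompletableFuture<Integer> price =
                    CompletableFuture.supplyAsync(AsyncDemo::fetchPrice, pool);
            CompletableFuture<Integer> tax =
                    CompletableFuture.supplyAsync(AsyncDemo::fetchTax, pool);
            // thenCombine waits for both futures and merges their results.
            return price.thenCombine(tax, Integer::sum).get();
        } finally {
            pool.shutdown();
        }
    }
}
```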
TECHNICAL SKILLS:

Programming Languages: Java, J2EE, Shell Scripting, Python, JavaScript, TypeScript, C, C++, jQuery, HTML5, DHTML
Frameworks/Libraries: Apache Camel, Spring, Spring Boot, Angular 4+, React JS, Apache Spark, Flask, Django, Bootstrap, Dozer, YARN, Apache Kafka, Express
Java Enterprise APIs: Servlets, JSP, JUnit, EJB, JNDI, JSON, JMS, JDBC, JavaMail, RMI, Web Services
Messaging Technologies: Apache Kafka, IBM MQ, RabbitMQ, ActiveMQ, IBM WebSphere MQ, JMS
System Design: Docker, Kubernetes, OpenShift, MVC, Spring, Spring Boot, Hibernate, CSS3, Microservices, Node.js, Reactive and Event-Driven Systems
Databases & Programming: MySQL, SQL, MongoDB, NoSQL, Oracle, SQL Server, IBM DB2, Stored Procedures, PostgreSQL, AWS DynamoDB, AWS Aurora
Software Engineering: UML, Design Patterns, Object-Oriented Methodologies, Service-Oriented Architecture, Test-Driven Development, Scrum and Agile Methodologies
XML Technologies: XML, DOM, SOAP, WSDL
Application Servers: Apache Tomcat, GlassFish, JBoss, WebLogic, IBM WebSphere, Apache Karaf
IDEs & Tools: Eclipse, IntelliJ, VS Code, WinSCP, PuTTY, Jenkins, Ant, Maven, Log4j, Splunk, Datadog, Grafana

PROFESSIONAL EXPERIENCE:

Amazon, AWS S3, Seattle, Washington October 2021 - Present
Role: Java Developer

Responsibilities:

Designed the decomposition of the S3 monolith by separating the storage system's encryption process into a microservice handling more than 98% of the traffic coming to S3.
Migrated all legacy applications to the Azure cloud.
Designed the requirements and implementation strategy using PlantUML and Gliffy and presented it for review with senior engineers and principal engineers.
Designed multiple interfaces and adapters to integrate various APIs, such as GET, PUT, COPY, LIST, and DELETE, with the new encryption module.
Wrote algorithms to serialize and deserialize encryption module output refactored out of S3.
Wrote checksum logic to increase the durability guarantees of operations for all APIs, using thread-safe logic without impacting operation performance.
Refactored various implementations of encryption using factory design pattern and used them to process object bytes coming to S3.
Utilized decorator design pattern to wrap encryption module responses from encryption microservice.
Wrote a blob processing module to process object blobs which included writing new Iterator calculating range and part for GET API.
Integrated the two refactored modules above, enabling them to take 2% of customer traffic; used the Command and Strategy patterns to implement that integration.
Used log4j logger to log errors and info across the application with proper exception handling.
Wrote new unit test cases using Mockito and PowerMockito to improve class coverage to 97%.
Wrote new behavior-based end to end test cases in Spock and Cucumber to improve testing coverage from 69% to 93%.
Resolved 4 functional bugs using remote debugging and 1 security bug in S3 directly impacting customers as part of on-call duties, driving operational excellence.
Successfully utilized Agile methodology to deliver features resolving dependencies and using CI/CD pipelines.
Setup infrastructure to simulate customer traffic and workflows to perform performance testing for new implementation.
Developed User Interface by using the React JS, Redux for Single Page Application development.
Developed Spark-based data pipelines to automate log analysis tasks. Leveraged Spark's parallel processing capabilities to efficiently process and analyze log files, extracting valuable insights and generating reports or alerts. This automation significantly reduced the time and effort required for frequent log dives.
Combined Spark's machine learning libraries with performance testing data to build predictive analytics models. Leveraged Spark's MLlib or Spark ML to train and deploy machine learning models that could predict system behaviour and identify potential performance bottlenecks. This integration facilitated proactive performance optimization.
Developed large, complex applications in Angular and TypeScript.
Used React-Router to create a single page application. Applied Router Guard to deny unauthorized access.
Environment: Java 1.8, Tomcat, JDK, Log4j, Lombok, Spock, Cucumber, Mockito, PowerMockito, functional programming, design patterns, Ant build, IntelliJ, Git, SOA, SOAP, XML, Azure, RESTful web services, microservice architecture, Apache Spark, Apache MLlib.
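The factory-pattern refactoring of encryption implementations described above can be sketched as follows. This is a hypothetical illustration: the class names and the toy XOR "cipher" are stand-ins, not Amazon's actual code.

```java
// Hypothetical sketch of the factory design pattern for selecting an
// encryption implementation; callers never name a concrete class, so new
// implementations can be added without touching call sites.
interface Encryptor {
    byte[] encrypt(byte[] plaintext);
}

class NoopEncryptor implements Encryptor {
    public byte[] encrypt(byte[] plaintext) { return plaintext.clone(); }
}

class XorEncryptor implements Encryptor {
    private final byte key;
    XorEncryptor(byte key) { this.key = key; }
    public byte[] encrypt(byte[] plaintext) {
        byte[] out = plaintext.clone();
        for (int i = 0; i < out.length; i++) out[i] ^= key; // toy cipher only
        return out;
    }
}

class EncryptorFactory {
    // The factory maps an algorithm name to a concrete implementation.
    static Encryptor forAlgorithm(String name) {
        switch (name) {
            case "NONE": return new NoopEncryptor();
            case "XOR":  return new XorEncryptor((byte) 0x5A);
            default: throw new IllegalArgumentException("Unknown algorithm: " + name);
        }
    }
}
```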

Amazon, Alexa Devices, Austin, Texas September 2020 - December 2021
Role: Java Developer

Responsibilities:
Implemented rich UI applications in HTML5, CSS3, JavaScript (ES6/TypeScript), and Angular.
Worked on a Spring Boot application using Java 8 that aggregates data and metrics for 16 types of Echo devices. Set up Kafka as a distributed streaming platform to handle real-time data streams, and created a Kafka producer that reads data from the Echo devices and publishes it as messages to Kafka topics.
Developed a Spark application using Java 8 that consumes data from Kafka topics, using Spark's RDD and DataFrame APIs to aggregate data and metrics for the 16 types of Echo devices.
Used Spring Boot and Java to develop REST APIs that interact with the Oracle database; fetched device information from the Oracle database and processed requests on EMR (Elastic MapReduce) clusters using Spark, utilizing Spring's transaction management to ensure data consistency and reliability.
Used multithreading concepts and the Executor framework to manage thread pools running 200-230 Amazon Athena queries to collect performance percentiles.
Created JUnit test cases using Mockito to provide 98% test coverage; used Sonar to identify bugs and Checkstyle issues.
Used JAXB to process JSON-based responses from RESTful web services external to the application to collect driver metrics.
Worked on building front end using React JS, HTML, CSS that helps parsing device logs and generates insights for memory and CPU logs that are uploaded through web portal.
Used AWS Lambda, AWS CloudWatch, and AWS SQS to create an email notification system that sends an email with log analysis details whenever a given CloudWatch alarm fires.
Used Node.js to develop a comments repository that automatically triggers test runs, interacting with MongoDB.
Created AWS IAM roles and policies to create permissions for AWS resources.
Used AWS S3 to store compressed log files and generated insights PDFs for future reference; generated pre-signed URLs to access them.
Created SQL queries, indexes, triggers, sequences for AWS Athena to fetch results.
Created Apache Spark jobs that run on a cluster of Linux machines; these jobs process and stream performance logs into AWS Athena for further analysis and querying.
Used CI/CD pipelines to build and release features and to update libraries, minimizing security risks.
Maintained and optimized the performance of web applications, identifying and addressing bottlenecks as necessary.
Developed React components that communicate with Redux for session management and make AJAX calls to send and retrieve data.
Environment: Java, Spring Boot, J2EE (JEE), Apache Spark, Apache Kafka, Big Data, Elastic MapReduce, AWS Athena, AWS Lambda, web services, AWS SQS, React, JavaScript, Azure, Git, REST, JAXB, AWS S3, VS Code, AWS IAM, AWS CloudWatch, Sonar, Mockito, PowerMockito, Node.js, Linux
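The Executor-framework fan-out used for the Athena queries above can be sketched like this; runQuery is a placeholder for a real query call, and the pool size is arbitrary.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative fan-out of many independent queries over a fixed thread pool.
class QueryRunner {
    static int runQuery(int id) { return id * id; } // placeholder work

    static List<Integer> runAll(int count) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(8);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int i = 0; i < count; i++) {
                final int id = i;
                futures.add(pool.submit(() -> runQuery(id)));
            }
            // Collect results in submission order; get() blocks per future.
            List<Integer> results = new ArrayList<>();
            for (Future<Integer> f : futures) results.add(f.get());
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```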

Micron Technologies, Manassas, Virginia May 2020 - August 2020
Role: Java Developer Intern

Responsibilities:

Developed a web application using Angular, HTML, JavaScript, and CSS to automate the filling of Excel sheets for Micron engineers, saving 183 person-hours per week. Designed and deployed the whole application end to end.
Created a data access layer and used MySQL persistence to store changes to Excel sheets when the save button is clicked.
Used Node.js with MongoDB storage to process comments left on Excel sheets, and used the Express framework to expose REST APIs for interacting with Excel sheets via cell comments.
Wrote SQL queries, indexes, Stored procedures to maintain CRUD operations of the databases.
Developed the front end in Angular and used the Handsontable library to perform Excel-sheet-related functions.
Used JDBC to connect to the MySQL database, wrote transactions using JPA, and used multiple repository interfaces; managed connections to scale the application.
Built the Java Spring Boot backend to support database interaction using ORM principles and exposed various REST APIs for the front end (CRUD operations).
Developed another web application to allow Micron employees to submit tickets to the team; this application uses JIRA APIs to log the description and other details of each ticket.
Created a single sign-on (SSO) capability using ADFS integration and JWT tokens with OAuth 2.0 to allow all Micron employees to submit tickets to the team.
Used a paginated table to display the status, assignee, and ETA of submitted tickets, improving response time by 25%.
Deployed application on Openshift pods using Docker images of the application. Explored Kubernetes to manage those containers.
Used Log4J for logging the user actions and exceptions to find out causes of system malfunctioning and keep user action logs.
Used IntelliJ and VS Code as development environments.
Used Git for version control.

Environment: Java, HTML, JavaScript, CSS, ADFS, OpenShift, Docker, Kubernetes, MySQL, SQL, Spring, Spring Boot, Log4j, Maven, Git, Angular, Handsontable, Node.js, Linux
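The JWT handling in the SSO work above can be illustrated with a minimal payload decoder using only the JDK. Note that this sketch only base64url-decodes the payload; it does not verify the signature, which a real ADFS/OAuth 2.0 flow must validate against the identity provider's keys.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Minimal sketch: split a JWT into its three dot-separated parts and
// base64url-decode the payload. Signature verification is intentionally
// omitted here and is mandatory in any real SSO flow.
class JwtPayload {
    static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length < 2) throw new IllegalArgumentException("Not a JWT");
        byte[] json = Base64.getUrlDecoder().decode(parts[1]);
        return new String(json, StandardCharsets.UTF_8);
    }
}
```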

General Electric, Transportation, Bangalore, India July 2015 - August 2019
Role: Java Developer

Responsibilities:

Developed a data ingestion application to load real-time data coming from locomotives into an Oracle 10g database using the Apache Camel framework with Spring Boot integration.
Developed a web application reporting locomotive status using JSF, Hibernate, and J2EE technologies to access an Oracle RAC database in a multi-developer, configuration-controlled environment.
Wrote transactions using PreparedStatement and JDBC interacting with Oracle database to ingest refined details after processing messages from IBM MQ.
Used JSP templates to define email notifications and used the Velocity library to integrate that functionality into the application.
Developed three microservices using Spring Boot to categorize and process three types of requests based on data source and request structure; all microservices exposed REST endpoints.
The data rate was 4 million messages per day, serving 50-52 requests per second using IBM MQ and an event-driven architecture.
Utilized JBoss IDEs for application server environments that included JBoss AS 5.0+ and JBoss EAP.
Used Service-Oriented architecture (SOA) principle and OSGI to expose services for decoding, transforming and storing according to data source, type and use cases.
Used Enterprise Integration Patterns like multicast, content-based-router, dynamic router, recipient list and splitter to transform messages.
Deployed all OSGI web services using FUSE platform 6.1 (Apache Camel, Blueprint, IBM MQ, Karaf/OSGi container)
Used caching mechanisms like Hazelcast to cache intermediate states and store results and reduce reads on the database. Explored differences between Hazelcast and Redis before designing the application.
Developed a Spring Boot application to collect stream of real time panel images from locomotives and display them on web portal with a refresh rate of 3 sec. The system supported around 2k locomotives at a time.
Reviewed code and documented designs to drive ETL processes using RedHat technologies for real time data ingestion.
Used log4J for logging, Junit4 for unit tests and JDK8 for development.
Wrote multiple data ingestion queries and used indexes, stored procedures to facilitate data ingestion process as a configuration.
Maintained SQL queries in configuration files to perform database transactions during data ingestion process.
Used HawtIO dashboards to monitor logs and Karaf container health.
Used Sonar, Jenkins CI/CD to drive agility and quality in development process.
Used Splunk dashboards to write queries and detect anomalies from logs, driving operational excellence.
Designed and implemented business logic for tiered applications, incorporating JSF (1.2 - 3.0), EJB (3.0 - 3.2), and JSP (2.0 - 2.3).
Environment: Java EE 1.8/1.7, Spring, Spring Boot, JDBC, JSP, JBoss Fuse 6.1, Karaf 2.2, Apache Camel, ActiveMQ, SQL, Oracle 11g, Maven, Jenkins, Sonar, Git, Blueprint, Hazelcast, Redis, JBoss Developer Studio.
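The content-based-router EIP used in the Camel routes above can be illustrated in plain Java (Camel itself would express this with choice()/when()); the "type:payload" message format and the channel names here are hypothetical.

```java
import java.util.Map;
import java.util.function.Consumer;

// Plain-Java illustration of the content-based-router pattern: inspect the
// message content and dispatch to the matching channel, falling back to a
// dead-letter channel for unroutable messages.
class ContentBasedRouter {
    private final Map<String, Consumer<String>> routes;
    private final Consumer<String> deadLetter;

    ContentBasedRouter(Map<String, Consumer<String>> routes,
                       Consumer<String> deadLetter) {
        this.routes = routes;
        this.deadLetter = deadLetter;
    }

    // Returns the name of the channel the message was delivered to.
    String route(String message) {
        int sep = message.indexOf(':');
        String type = sep >= 0 ? message.substring(0, sep) : "";
        routes.getOrDefault(type, deadLetter).accept(message);
        return routes.containsKey(type) ? type : "dead-letter";
    }
}
```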


Informatica, Bangalore, India June 2014 - June 2015
Role: Java Developer

Responsibilities:
Developed a Java application to test the integration of the Informatica tool with Hadoop clusters running Spark jobs, covering over 120 scenarios.
Used Apache Kafka to collect events and process them to start and stop nodes and issue other environment-testing commands.
Set up Cloudera agents on Linux based clusters using shell scripts.
Environment: Java JDK 7, Apache Kafka, YARN APIs, Cloudera management console.
Education
University of Southern California, Los Angeles, Master's in Computer Science 2021
B.M.S College of Engineering, Bangalore, Bachelor's in Computer Science 2016