
Sai Swathi Narla - Java Developer
[email protected]
Location: Farmington Hills, Michigan, USA


Professional Summary
9 years of experience in the IT industry as a Java/J2EE developer, involved in the analysis, design, and testing of web-based and client-server multi-tier applications using Java/J2EE technologies.
Experienced in developing web applications using Spring Boot and Hibernate ORM, with the Spring MVC, Spring IoC, and Spring Security modules.
Expert in the design and development of web-based applications using J2EE technologies such as JSP, Servlets, JDBC, and XML/XSL.
Experienced in Java 8 features such as streams, lambda expressions, and functional interfaces (see the sketch at the end of this summary).
Experience with the Hibernate ORM tool for connection pooling, mappings, and transaction management.
Experienced in front-end technologies such as HTML, CSS, JavaScript, and Angular.
Experienced working with messaging systems (JMS) and REST/SOAP web services.
Experienced in implementing single sign-on (SSO) authentication using SAML.
Expertise in continuous integration and continuous deployment using Jenkins, Artifactory, SonarQube, and GitLab.
3+ years of hands-on experience developing applications and integrations with Kafka.
Experienced in Spark with Java for generating large extracts from Hadoop HDFS, Hive, and HBase.
Experienced in working with data warehouse platforms like Vertica and Snowflake.
Experienced in Python for generating analytics reports from MongoDB.
Experienced in generating communications to end users with the xPression application and its components, such as xAdmin, xDesign, xDashboard, and xResponse.
Working experience in big data, developing Spark engines using Java.
Used various custom Angular directives and developed reusable components and templates shared across the application.
Created reusable components and services to consume REST APIs using the component-based architecture provided by Angular.
Responsible for the style, look, and feel of web pages using Sass, which extends CSS with dynamic behavior such as variables, mixins, operations, and functions.
Proficient in creating Kafka producers and consumers in a microservice-based environment.
Experience in creating event-processing data pipelines using Apache Kafka.
Knowledgeable in developing and deploying enterprise applications using major components of the Hadoop ecosystem, including HDFS, MapReduce, YARN, Hive, HBase, ZooKeeper, Spark (streaming, SQL), and Kafka.
Expert in core Java development, with strong hands-on skills in writing and analyzing SQL, stored procedures, and functions.
Experienced in object-oriented design, data structures, and the Collections Framework API for framework design in Java.
Expert in multithreaded Java applications and in tuning applications for performance optimization and synchronization.
Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse.
Worked with ETL pipelines in and out of data warehouses using a combination of Java, Snowflake, and SnowSQL.
Extensive experience with core Java concepts such as OOP, arrays, collections, generics, exception handling, multithreading, garbage collection, and serialization.
Installation and administration of the JBoss and Tomcat servers.
Experienced working with databases such as Oracle, MS SQL Server, and MongoDB.
Experienced in production support activities, including deployment, bug fixing, and troubleshooting for highly critical applications.
Experience working with Maven to build and generate the JAR files deployed to the WebLogic console, App Engine, and AWS environments.
Good experience working with Amazon Web Services, including EC2, S3, and Amazon CloudWatch.
Experience with application monitoring tools such as Dynatrace, Splunk, IBM Tivoli, Grafana, Prometheus, Moogsoft, and Kibana.
6+ years of experience in Agile development, mostly with a two-week sprint cadence.
Experienced working in the Waterfall methodology.
Hands-on experience with shell scripting on Unix and Linux.
Strong contributor to knowledge management activities, including project documentation, user manuals, component user guides, and other technical documentation.
Good experience coordinating and working with developers (offshore and onsite) and end users in a team-based environment.
Team player with strong verbal and written communication skills and excellent programming skills.
Self-motivated with excellent problem-solving skills and ability to learn new technologies and tools quickly.
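
A minimal sketch of the Java 8 streams and lambdas mentioned in this summary; the Order type, its fields, and the threshold are hypothetical, illustrative names only:

    import java.util.Arrays;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class OrderSummary {
        // Plain POJO so the example stays valid on Java 8.
        static class Order {
            final String customer;
            final double amount;
            Order(String customer, double amount) { this.customer = customer; this.amount = amount; }
        }

        // Total order amount per customer for orders above a threshold,
        // using a lambda as the filter predicate and stream collectors.
        static Map<String, Double> totalsByCustomer(List<Order> orders, double minAmount) {
            return orders.stream()
                    .filter(o -> o.amount > minAmount)
                    .collect(Collectors.groupingBy(o -> o.customer,
                            Collectors.summingDouble(o -> o.amount)));
        }

        public static void main(String[] args) {
            List<Order> orders = Arrays.asList(new Order("acme", 120.0), new Order("acme", 80.0));
            System.out.println(totalsByCustomer(orders, 50.0)); // prints {acme=200.0}
        }
    }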

Skills
Technology: Java, core Java (object-oriented and multithreaded programming), J2EE, JDBC, JMS/MQ Series, JavaScript, JSP, Servlets, XML, HTML, JSON, Spark with Java, Angular, Node.js.
Languages: Java, SQL, PL/SQL, Python, C/C++ (academic experience only).
Application Frameworks: Spring (IoC, ORM, DAO, AOP), Spring MVC, Spring Security, Spring JDBC, Spring Boot, Apache Kafka.
Data Access Frameworks: JDBC, Hibernate, JPA
SOA: SOAP/RESTful web services
Testing Frameworks: JUnit, Mockito
Databases: SQL Server, Oracle, MongoDB, IBM UDB
Data Warehouses: Vertica, Snowflake
Messaging: JMS, IBM MQ, Apache Kafka (Messaging and Stream data processing)
Operating Systems: Unix, Windows, Linux
Code Repositories: IBM ClearCase, GitLab
Build/Deployment Tools: Ant, Maven, Jenkins, Artifactory, UDeploy, GitLab, Ansible
Development Tools: NetBeans, Eclipse, Oracle PL/SQL Developer, Postman, Spring Tool Suite (STS), Swagger, DB Visualizer, DBeaver
Monitoring Tools: Dynatrace, Splunk, IBM Tivoli, Grafana, Kibana, Moogsoft, AppDynamics, New Relic
Application Servers: WebLogic, Apache Tomcat, JBoss
Hadoop Ecosystem: MapReduce, HDFS, YARN, Hive, HBase, Spark (core, shell, streaming, SQL), Flume, Sqoop, Kafka
Cloud Computing: Amazon EC2, Amazon S3, Amazon CloudWatch
Methodologies: Agile methods, Waterfall

Professional Experience

TD Ameritrade, Remote July 2022-Present
Role: Full stack Java Developer
Type of Project: Implementation
TD Ameritrade is a broker that offers an electronic trading platform for the trade of financial assets including common stocks, preferred stocks, futures contracts, exchange-traded funds, forex, options, cryptocurrency, mutual funds, fixed income investments, margin lending, and cash management services.

Responsibilities
Analyzed, designed, developed, and maintained enterprise-level web applications.
Interacted with business users to gather new requirements.
Involved in solution discussions for all major and minor service release requirements.
Participated in all phases of the Software Development Life Cycle (SDLC) throughout the implementation of the project.
Provided input into estimating engagement activities and executed engagements following Agile methodology, including Scrum.
Integrated the existing application with Kafka for producing events, which are consumed by several event-driven microservices (see the producer sketch after this list).
Encrypted the payload, and specific fields containing PII data, using Voltage identities in Kafka.
Implemented different data-formatting capabilities and published to multiple Kafka topics.
Monitored events in the Kafka monitoring tool, and with the Kafka 2.0 tool, to address any issues.
Monitored the applications using Dynatrace, Grafana, Kibana, and Splunk during production support.
Created dashboards in Kibana and Splunk for better debugging during production issues.
Coordinated with the technical team on designing the architecture of the big data platform while maintaining the data pipeline.
Customized and maintained integration tools, warehouses, databases, and analytical systems.
Created and monitored dashboards in AppDynamics to find the root cause of production issues by tracing transaction events at the method level.
Used Amazon S3 to upload the files for other consumers and download the files from other source systems.
Designed and developed RESTful APIs for different modules in the project, as per requirements, using the Jersey JAX-RS framework.
Designed an ideal approach for data movement from different sources to HDFS via Apache/Confluent Kafka.
Built the logical and physical data models for Snowflake and migrated data from Vertica to Snowflake.
Created internal and external stages in Snowflake and worked on transformations during bulk data loads with COPY INTO.
Redesigned views in Snowflake to improve performance and unit-tested data between Vertica and Snowflake.
Implemented all functionality using Spring Boot and Hibernate ORM, and implemented Java EE components using the Spring MVC, Spring IoC, and Spring Security modules.
Used the Spring framework to integrate MVC components with business services.
Independently learned and worked on Angular for application UI enhancements.
Worked on enhancements and new features of an existing single-page application that binds data into specific views using Angular, leveraging the MVC pattern to organize controllers, custom directives, factories, and views.
Worked with npm commands, using package.json to manage dependencies of the Node.js application.
Integrated user-facing front-end elements with server-side logic using Node.js.
Implemented the Data Access Layer (DAL) using Spring and the Hibernate ORM tool.
Configured and deployed the application using App Engine and Ansible Tower.
Built Java applications using Maven and deployed Java/J2EE applications using GitLab, Docker, and App Engine.
Managed Java files using GitLab.
Contributed to DevOps efforts to automate the build and deployment process using GitLab, shell scripting, and App Engine.
Used Amazon CloudWatch to monitor application logs and AWS services.
Performed Docker image builds and deployments on App Engine and Ansible.
Performed web service testing between internal and third-party applications using Postman.
Prepared unit tests covering all desired functionality using JUnit and Mockito.
Prepared builds and deployments, and coordinated with the release management team to ensure the proper process was followed during each release.
Used Jira for bug and issue tracking and project management.
Used New Relic to analyze application performance, such as daily request volume and response times.
Used Contentsquare to analyze the end-user digital experience, gaining insight into customer behavior, feelings, and intent at every touchpoint of the journey, and enhanced the product accordingly.
Applied technical application design skills, translating functional requirements into technical designs for web applications.
Provided end-to-end support for testing activities during system testing and UAT.
Provided production support for the application and handled critical issues in a timely manner.
Worked with Engineers and Architects in continuous improvement initiatives.
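
A minimal sketch of the kind of Kafka event producer described above; the broker address, topic name, key, and payload are placeholders (the real payloads carried Voltage-encrypted PII fields):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventPublisher {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            props.put("acks", "all"); // wait for the full commit before acknowledging

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic, key, and event body.
                ProducerRecord<String, String> record =
                        new ProducerRecord<>("trade-events", "account-123", "{\"event\":\"ORDER_PLACED\"}");
                producer.send(record, (metadata, ex) -> {
                    if (ex != null) ex.printStackTrace();
                    else System.out.printf("sent to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                });
            }
        }
    }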
Environment: Java, J2EE, JDBC, SOAP, RESTful, Spring, Spring IoC, Spring AOP, Spring Boot, Spring MVC, Hibernate, HTML, CSS, JavaScript, XHTML, JSON, XML, Apache Tomcat, UML, JSP, Servlets, Unix shell scripting, SQL, JUnit, Mockito, Maven, Eclipse IDE, Postman, Spark, Kafka, Dynatrace, Grafana, Prometheus, Kibana, Hadoop, Linux, HDFS, Vertica, Snowflake, App Engine, Ansible, Angular, New Relic, Contentsquare

UPS, Remote December 2021-June 2022
Role: Backend Developer
Type of Project: Implementation
UPS is a global leader in logistics, offering a broad range of solutions that include the transportation of packages and shipments, the facilitation of international trade, and the deployment of advanced technology to manage the world of business more efficiently.

Responsibilities
Extensively used core Java concepts such as multithreading, the Collections Framework, file I/O, and concurrency.
Applied core Java concepts such as OOP, the Collections Framework, exception handling, the I/O system, multithreading, JDBC, and generics.
Used Java 8 features such as lambda expressions when developing code, creating resource classes, and fetching documents from the database.
Designed and developed business components integrated with the Spring Framework, and developed various reusable utility and helper classes used across all modules of the application.
Used design patterns such as Singleton, Data Access Object, Factory, and MVC.
Created POJOs and DAOs for the database entities using Spring JPA annotation mappings.
Developed microservices with Spring and tested the application using Spring Boot.
Used Spring Core annotations for Spring dependency injection, Spring MVC for REST APIs, and Spring Boot for microservices.
Configured Spring Security in the application to secure method calls and RESTful endpoints.
Developed cloud-hosted web applications and REST APIs using Spring Boot with embedded Tomcat (see the sketch after this list).
Implemented Swagger for the microservices to document the REST APIs.
Used Jenkins to integrate and deploy code into continuous integration (CI) environments for development testing.
Implemented REST-based web services using JAX-RS annotations and the Jersey provider implementation.
Wrote build scripts using Maven for automated deployment of the application, and extensively used GitLab for CI/CD.
Developed the persistence layer using the Hibernate framework, configuring one-to-one, one-to-many, and many-to-many mappings in Hibernate files, and created DAOs and POJOs.
Used Log4j to log debugging, warning, and info statements.
Fixed defects identified in production/QA environments and used Jira to track tasks and defects.
Provided end-to-end support for testing activities during QA testing and production deployment.
Provided production support for the application and handled critical issues in a timely manner.
Worked with Engineers and Architects in continuous improvement initiatives.
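
A minimal sketch of a Spring Boot REST microservice with embedded Tomcat, as described above; the class name, endpoint path, and Shipment type are hypothetical illustrations, not project code:

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    @RestController
    @RequestMapping("/api/shipments")
    public class ShipmentServiceApplication {

        public static void main(String[] args) {
            SpringApplication.run(ShipmentServiceApplication.class, args); // starts embedded Tomcat
        }

        // Hypothetical endpoint: look up a shipment by tracking number.
        @GetMapping("/{trackingNumber}")
        public Shipment getShipment(@PathVariable String trackingNumber) {
            return new Shipment(trackingNumber, "IN_TRANSIT");
        }

        // Minimal response body; a real service would back this with JPA entities.
        static class Shipment {
            public final String trackingNumber;
            public final String status;
            Shipment(String trackingNumber, String status) {
                this.trackingNumber = trackingNumber;
                this.status = status;
            }
        }
    }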
Environment: Java, J2EE, JDBC, SOAP, RESTful, Spring, Spring IoC, Spring AOP, Spring Boot, Spring MVC, Hibernate, HTML, CSS, JavaScript, XHTML, JSON, XML, Apache Tomcat, UML, JSP, Servlets, Unix shell scripting, SQL, JUnit, Mockito, Maven, Eclipse IDE, Postman, Spark, Kafka, Kibana, Linux, App Engine, Ansible, New Relic, Contentsquare


Client: ANZ (Australia and New Zealand Banking Group), Hyderabad April 2021-November 2021
Company: CTS
Role: Backend Developer
Type of Project: Implementation
The Australia and New Zealand Banking Group Limited (ANZ) is an Australian multinational banking and financial services company headquartered in Melbourne. It is Australia's second-largest bank by assets and third-largest bank by market capitalization.

Responsibilities
Involved in the design and development phases of Agile Software Development.
Participated in workshops with business users and key stakeholders to gather requirements.
Involved in sprint planning to estimate effort for user stories and bugs.
Designed and developed REST-based microservices using Spring Boot.
Developed Java modules implementing business rules and workflows using Spring Boot.
Used Spring Core for IoC, implemented via dependency injection, and developed RESTful web services.
Consumed SOAP-based web services to integrate with the web application.
Published and consumed web services using SOAP and WSDL, and deployed them on the WebLogic server.
Used a SOAP-based messaging format to transfer requests and responses, and validated the requests and responses against the XML Schema Definition.
Used the Jersey framework to implement JAX-RS (the Java API for RESTful Web Services).
Implemented Kafka high-level consumers to get data from Kafka partitions and move it into HDFS (see the consumer sketch after this list).
Used the Kafka HDFS connector to export data from Kafka topics to HDFS files in a variety of formats, integrating with Apache Hive to make the data immediately available for querying with HiveQL.
Used Elasticsearch as a secondary store holding a copy of the data produced in Oracle, to take advantage of its rapid search and analytics capabilities.
Used Kibana for data discovery and analysis, as well as its Graph tool.
Integrated Apache Kafka with Elasticsearch using the Kafka Elasticsearch connector to stream messages from different partitions and topics into Elasticsearch for search and analysis.
Imported and exported data between RDBMS and HDFS using Sqoop.
Loaded unstructured data (log files, XML data) into HDFS using Flume.
Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
Focused primarily on Spring MVC components such as the DispatcherServlet, controllers, and view resolvers.
Involved in designing and developing the JSON and XML objects with SQL Server.
Used Git for source control and version management of all code and resources.
Designed communications using xDesign and managed the template configuration in xAdmin and xDashboard.
Migrated templates across the different xPression application servers.
Used the SoapUI tool extensively to conduct regression and performance testing of services.
Utilized user stories to develop technical requirements and deliver new or modified expert-level programs, features, or modules in a timely and cost-effective manner.
Prepared author guides explaining all the templates and components in Confluence pages.
Fixed defects raised in the SIT and UAT environments.
Communicated with other teams via Jira.
Collaborated with testing team members on software upgrades, customization, and patches.
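
A minimal sketch of a Kafka high-level consumer of the kind described above; the broker address, group id, and topic are placeholders, and the real consumer staged records into HDFS rather than printing them:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class HdfsFeedConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");  // placeholder broker
            props.put("group.id", "hdfs-feed");                // consumer group for partition balancing
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("transactions")); // hypothetical topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // The real pipeline wrote these records to HDFS; printing stands in here.
                        System.out.printf("partition=%d offset=%d value=%s%n",
                                record.partition(), record.offset(), record.value());
                    }
                }
            }
        }
    }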
Environment: Java, J2EE, JDBC, SOAP, RESTful, Spring, Spring Boot, Spring MVC, Hibernate, HTML, CSS, JavaScript, XHTML, JSON, XML, Apache Tomcat, UML, SQL, JUnit, Mockito, Maven, Eclipse IDE, Postman, SoapUI, xPression, Spark, Kafka, Dynatrace, Grafana, Prometheus, Kibana, Hadoop, Flume, Hive, HBase, Linux, Sqoop, MapReduce, HDFS

Client: TransAmerica-Health, Hyderabad October 2019-March 2021
Company: TCS
Role: Backend Developer
Type of Project: Implementation
Transamerica is a US-based insurance company serving clients and their families with the solutions and support to help them with their long-term wellness. Transamerica is committed to helping people make the wealth and health connection: a whole new way of looking at everything that can affect finances, and how the little steps we take today can have a big impact tomorrow.

Responsibilities
Involved in analysis and design phase of Software Development Life cycle (SDLC).
Worked on loading data into the data warehouse system using Hive and HDFS.
Worked in an Agile environment and continuously improved the Agile processes.
Maintained existing ETL workflows, data management, and data query components.
Queried Hive and HBase as per requirements.
Wrote HiveQL statements to query Hive based on extraction requirements.
Transformed the extracted data per the requirement specification and loaded the transformed data into the data warehouse.
Performed complex joins on Hive tables based on transformation logic.
Prepared output reports by extracting data from the data warehouse system.
Prepared extracts using Spark in Java per the client's requirements (see the sketch after this list).
Used GitHub as the version control system and code repository.
Fixed defects raised in the SIT and UAT environments.
Interacted with the team to explain improvements and new requirement implementations.
Supported other team members in resolving issues and clarifying doubts.
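
A minimal sketch of a Spark-with-Java extract job of the kind described above; the table names, columns, filter, and output path are hypothetical:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class ExtractJob {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("member-extract")
                    .enableHiveSupport()   // read warehouse tables through Hive
                    .getOrCreate();

            // Hypothetical join and filter; the real logic followed the
            // client's requirement specification.
            Dataset<Row> claims = spark.sql(
                    "SELECT m.member_id, m.plan_code, c.claim_amount "
                  + "FROM members m JOIN claims c ON m.member_id = c.member_id "
                  + "WHERE c.claim_year = 2020");

            claims.write().mode("overwrite").csv("hdfs:///extracts/claims_2020"); // placeholder path
            spark.stop();
        }
    }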
Environment: Java, J2EE, JDBC, Spark with Java, MapReduce, HDFS, Hive, HBase, ZooKeeper, ETL, Maven, Eclipse IDE, PuTTY, WinSCP, FileZilla, Apache Tomcat, STS, Dev Plus.

Client: My Phoenix Wealth Advisor Portal, Hyderabad January 2019-September 2019
Company: TCS
Role: Backend Developer
Type of Project: Implementation
The My Phoenix Wealth Advisor Portal is a UK-based, off-platform pensions and investment portal for advisors. The portal launched in December 2019 to help advisors get the most out of clients' existing, feature-rich policies.
Responsibilities
Involved in solution discussions for portal development.
Designed and implemented the business requirements in a Java application using core Java, threading, Servlets, JSP, REST web services, and Spring MVC.
Queried the NoSQL database MongoDB as per requirements.
Performed CRUD operations in MongoDB using Hibernate.
Owned the development of the advisor portal login component, using SAML for single sign-on (SSO).
Created an application that allows users to log in to the portal with single sign-on through an Identity Provider using SAML.
Integrated the CA API Gateway between the front-end and back-end applications.
Involved in the design, development, and deployment of microservices using REST web services.
Prepared unit tests for all the APIs using JUnit and Mockito (see the test sketch after this list).
Tested all APIs using the Postman client and documented the API information.
Deployed APIs in the AWS environment to perform integration testing.
Prepared builds using Maven and deployed them using Jenkins.
Maintained code in GitHub using TortoiseGit or Git GUI.
Documented all the APIs developed using Swagger.
Provided full-time support for system integration testing and user acceptance testing.
Fixed defects raised in the SIT and UAT environments.
Interacted with the client to explain improvements and new requirement implementations.
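
A minimal sketch of the JUnit and Mockito unit testing described above; AdvisorRepository and AdvisorService are hypothetical stand-ins for the real API classes:

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.when;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.mockito.InjectMocks;
    import org.mockito.Mock;
    import org.mockito.junit.MockitoJUnitRunner;

    @RunWith(MockitoJUnitRunner.class)
    public class AdvisorServiceTest {

        // Hypothetical collaborators, defined inline to keep the sketch self-contained.
        interface AdvisorRepository { String findNameById(String id); }

        static class AdvisorService {
            private final AdvisorRepository repository;
            AdvisorService(AdvisorRepository repository) { this.repository = repository; }
            String advisorName(String id) { return repository.findNameById(id); }
        }

        @Mock
        private AdvisorRepository repository;   // mocked dependency

        @InjectMocks
        private AdvisorService service;         // real service with the mock injected

        @Test
        public void returnsAdvisorNameById() {
            when(repository.findNameById("A-100")).thenReturn("Jane Doe");
            assertEquals("Jane Doe", service.advisorName("A-100"));
        }
    }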
Environment: Java, J2EE, JDBC, SOAP, RESTful, Spring, Spring Boot, Spring MVC, Hibernate, Swagger, HTML, CSS, JavaScript, XHTML, JSON, XML, Apache Tomcat, UML, JSP, Servlets, Unix shell scripting, SQL, JUnit, Mockito, Maven, Eclipse IDE, TortoiseGit, Postman

Client: National Employment Savings Trust, Bangalore May 2015-December 2018
Company: TCS
Role: Backend Developer
Type of Project: Implementation
The National Employment Savings Trust (NEST) is a UK government project. This pension scheme launched in April 2011 and is the UK's first pension scheme aimed specifically at low-to-moderate earners. The Scheme Launch activity (formerly PASLA) was introduced to de-risk the impact of the Onset of Employer Duties (OED) in October 2012, from which date employers would be legally required to automatically enroll eligible employees into a qualifying pension scheme.

Responsibilities
Completed product development per requirements, contributed to team meetings, and handled troubleshooting and development and production problems across multiple environments.
Involved in solution discussions for all major and minor service release requirements.
Designed communications using xDesign.
Managed the template configuration in xAdmin and xDashboard.
Migrated templates across the different xPression servers.
Prepared batch-oriented solutions using Captiva software applications.
Developed and implemented Servlets and JSPs.
Responsible for creating, reading, updating, and deleting tables in the database as per requirements.
Developed the required XML schema documents and implemented the framework for parsing XML documents using DOM and SAX parsers (see the sketch after this list).
Reviewed Python code while running troubleshooting test cases and investigating bug issues.
Wrote Python scripts to perform analysis on data stored in MongoDB using the pandas framework.
Extracted data by querying MongoDB with Python, applied transformations, and prepared analysis reports.
Managed the xPression and Java files using the ClearCase application.
Adopted a continuous integration and continuous deployment process using Jenkins.
Used Artifactory as the artifact repository for the continuous deployment process.
Maintained the quality of developed code using SonarQube.
Involved in testing the developed Servlets and JSPs using JUnit.
Performed non-functional (performance) testing of the application.
Fixed defects raised in the SIT and UAT environments.
Communicated with other teams via HP ALM.
Implemented case management, process management, and content management using the Case360 toolbox.
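
A minimal sketch of DOM-based XML parsing of the kind described above; the enrollment document and element names are hypothetical, and the real framework also used SAX for larger documents:

    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    import org.xml.sax.InputSource;

    public class MemberXmlParser {
        public static void main(String[] args) throws Exception {
            // Hypothetical enrollment document; the real schemas were project-specific.
            String xml = "<enrollment><member id=\"42\"><name>A Member</name></member></enrollment>";

            DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(new InputSource(new StringReader(xml)));

            // Walk every <member> element and print its id attribute and name text.
            NodeList members = doc.getElementsByTagName("member");
            for (int i = 0; i < members.getLength(); i++) {
                Element member = (Element) members.item(i);
                System.out.println(member.getAttribute("id") + ": "
                        + member.getElementsByTagName("name").item(0).getTextContent());
            }
        }
    }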
Environment: Java, J2EE, JDBC, SOAP, HTML, CSS, JavaScript, XHTML, JSON, XML, Apache Tomcat, JSP, Servlets, Unix shell scripting, SQL, JUnit, Ant, Maven, Eclipse IDE, FileZilla, MQ, xPression, Captiva, Python, pandas, IBM ClearCase, Jenkins, Artifactory, SonarQube.

Education:
Master's in Computer Science & Engineering, University of Detroit Mercy
Bachelor's in Computer Science & Engineering, Acharya Nagarjuna University