
Naga Sekhar Reddy - Data Scientist / ML Engineer
[email protected]
Location: Buffalo, New York, USA
Relocation: Yes
Visa: GC
Nagasekhar Medikonda
[email protected]
+1 (716) 520 5001
PROFESSIONAL SUMMARY:
Proactively engaged with clients to decipher intricate business use cases, transforming abstract
concepts into precisely defined problem statements. These refined problem statements served as a
foundation for the development team, ensuring clarity and alignment with the clients' objectives.
Demonstrated a keen eye for data relevance by identifying and curating datasets crucial for the
development of predictive models, contributing significantly to solving both internal and external
business challenges. This strategic data selection process laid the groundwork for accurate and
impactful model outcomes.
Actively bridged data gaps within projects by orchestrating the meticulous gathering of data,
designing intuitive annotation portals, and overseeing data annotation through skilled human
annotators. This hands-on approach to data management ensured comprehensive and high-quality
datasets, facilitating robust model development.
Displayed analytical acumen by delving into diverse datasets, uncovering patterns, and
identifying nuanced data transformation and quality needs tailored to specific applications. This
insightful exploration of data paved the way for tailored and effective solutions in targeted
domains.
- Collaborated closely with stakeholders to understand business requirements and translate them
into technical specifications for AI integration, ensuring alignment with organizational goals.
- Led the development of machine learning models, from data preprocessing and feature
engineering to model training and evaluation, resulting in robust and high-performing AI
solutions.
- Worked with IT teams to design and implement infrastructure requirements for AI systems,
leveraging cloud platforms and optimizing computational resources for scalability and efficiency.
- Acted as a mentor to junior team members, providing technical guidance, feedback, and
fostering a culture of continuous learning and innovation within the team.
- Stayed updated with the latest advancements in AI technologies and methodologies,
incorporating them into project designs and driving continuous improvement in AI capabilities.
Cultivated cross-functional collaboration by partnering with various departments, leveraging
data-driven approaches to solve complex problems, identify emerging trends, and unearth
untapped opportunities. This interdisciplinary cooperation enriched the organization's
problem-solving capabilities and fostered a holistic understanding of business challenges.
Pioneered the development of a comprehensive program encompassing metrics creation, data
collection, modeling, and operational performance reporting. This program served as a
framework for evaluating and optimizing operational processes, ensuring continuous
improvement and data-driven decision-making.
Collaboratively worked across functions, defining problem statements, collecting pertinent data,
building analytical models, and offering informed recommendations.
Maintained transparent and regular communication with leadership, conveying key metrics,
project progress, and other vital indicators. This consistent communication ensured that
leadership remained informed and empowered to make strategic decisions based on real-time
insights and progress reports.
Enhanced AI applications' data retrieval capabilities by managing and optimizing vector
databases like ChromaDB and Pinecone.
Customized and fine-tuned both open-source and paid LLM models to meet specific project
requirements and achieve performance goals.
SKILLS:
Machine Learning, Deep Learning, Computer Vision, Natural Language Processing
Python, PyTorch, LangChain, LLMs (OpenAI, Gemini Pro, Llama2), Fine-tuning with custom data
HTML, CSS, Vue JS, Flask, Django, Streamlit
C, Java
MySQL, Oracle, VectorDB
AWS, Azure
EDUCATION:
Bachelor's in Computer Science and Engineering, 2012.
Master's from the University at Buffalo, 2014.
CERTIFICATIONS:
Completed Udacity certification on Full Stack Foundations
Completed Coursera certification on Programming for Everybody.
ACHIEVEMENTS:
Recognized and appreciated by clients for delivering efficient solutions.
Innovative AI Solutions: Successfully developed and deployed cutting-edge AI applications,
demonstrating a strong impact in enhancing user engagement and operational efficiency.
EMPLOYMENT HISTORY:
Client: VNS, New York
May 2023 - Till Date
Job Title: Generative AI Engineer
Enhanced AI data retrieval by managing and optimizing vector databases such as ChromaDB and
Pinecone.
Ensured optimal performance and reliability of AI applications using DataStax Cassandra DB in
production environments.
Developed scalable generative AI solutions using the LangChain and LlamaIndex frameworks.
Customized and fine-tuned both open-source and paid LLM models to meet specific project
requirements and performance goals.
Leveraged AWS Bedrock for deploying AI models, ensuring scalability and reliability through
cloud services.
Combined database management, AI development, and cloud deployment expertise for
comprehensive AI application enhancement.
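As an illustrative sketch of the vector-database retrieval pattern described above (collection name, documents, and query are hypothetical, and ChromaDB's default embedding function is assumed):

    import chromadb

    # In-memory client; a production setup would use a persistent or hosted store.
    client = chromadb.Client()
    collection = client.get_or_create_collection(name="support_docs")

    # Index a few documents; embeddings come from the default embedding function.
    collection.add(
        ids=["doc1", "doc2"],
        documents=[
            "Orders ship within two business days.",
            "Returns are accepted within 30 days with the original receipt.",
        ],
    )

    # Retrieve the most relevant document for a user question.
    results = collection.query(query_texts=["How long does shipping take?"], n_results=1)
    print(results["documents"][0])
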
Actively engaged in continuous learning and professional development, staying abreast of the
latest advancements in the fields of machine learning, artificial intelligence, and data science.
This commitment to ongoing education fostered a culture of excellence and innovation within the
team.
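For the AWS Bedrock deployments noted above, a minimal invocation sketch is shown below (model ID, region, and prompt are placeholders, and the request body schema varies by model family):

    import boto3
    import json

    # Bedrock runtime client; region and credentials come from the usual AWS config.
    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Anthropic-style request body; other model families expect different schemas.
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Summarize today's open support tickets."}],
    })

    response = bedrock.invoke_model(modelId="anthropic.claude-3-sonnet-20240229-v1:0", body=body)
    print(json.loads(response["body"].read())["content"][0]["text"])
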
Client: AAFES, Dallas, TX
Jan 2021 - Mar 2023
Job Title: Senior AI Engineer
Customer Service Chatbot using Python
The chatbot provides 24/7 customer support for resolving queries. This application was developed for a
retail application.
Roles and Responsibilities:
Actively engaged in the entire Software Development Life Cycle (SDLC) process,
meticulously analyzing business requirements, and gaining an in-depth understanding of
functional workflows. This involvement ensured a seamless transition of information
from source systems to destination systems, enhancing overall system efficiency and
performance.
Played a pivotal role in various web development projects critical to the company's
success. Through active participation in the full life cycle of these projects, consistently
maintained a near-100% rate of on-time task delivery. This dedication to project timelines
reflected a commitment to project
excellence and client satisfaction.
Demonstrated technical prowess by constructing a sophisticated rule-based customer
support chatbot application for an e-commerce portal. This innovative application utilized
web technologies and API-driven backend architecture, leveraging Python web
frameworks like Django. The chatbot significantly enhanced customer interaction and
satisfaction, contributing to the portal's overall user experience and boosting customer
engagement.
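A minimal sketch of the rule-based intent matching behind such a chatbot is shown below; the intents and replies are hypothetical, and in the actual application this logic sat behind a Django API endpoint:

    # Keyword-based intent matching: the simplest form of a rule-based chatbot.
    RULES = {
        ("order", "status", "track"): "You can track your order under My Account > Orders.",
        ("return", "refund"): "Returns are accepted within 30 days with the original receipt.",
        ("hours", "open"): "Our stores are open 9am-9pm, seven days a week.",
    }
    FALLBACK = "Sorry, I didn't understand that. A support agent will follow up shortly."

    def reply(message: str) -> str:
        text = message.lower()
        for keywords, answer in RULES.items():
            if any(word in text for word in keywords):
                return answer
        return FALLBACK

    print(reply("How do I track my order?"))
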
Spearheaded the development of new features within applications, strategically designed
to drive sales, increase revenue, and elevate the overall user experience. Through
innovative solutions and feature enhancements, contributed significantly to the
company's revenue streams and customer loyalty, positioning the organization as a
market leader.
Exhibited a strong problem-solving acumen by actively engaging in troubleshooting,
refining, and optimizing codebase. This proactive approach not only improved the
applications' performance but also streamlined operational processes, ensuring optimal
efficiency and user satisfaction.
Proficient in utilizing Azure tools like Blob Storage, Azure Functions, Virtual Machines,
AKS, Logic Apps, and Azure AD for chatbot development, tailored to specific business
needs.
Experience in leveraging Azure services such as Blob Storage, Functions, AKS, and
Logic Apps for end-to-end chatbot solutions, ensuring seamless integration and efficient
operation.
Skilled in assessing project requirements to determine the most suitable Azure services
for chatbot development, ensuring scalability, reliability, and cost-effectiveness.
Proficient in integrating chatbots with Azure services like Azure AD and Storage for
enhanced functionality and usability within the Azure ecosystem.
Proven ability to implement monitoring solutions using Azure Monitor and Application
Insights to track chatbot performance, optimize resource utilization, and improve user
experience.
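As one concrete illustration of the Azure integration listed above, the sketch below uploads a chat transcript to Blob Storage (connection string, container, and blob names are placeholders):

    from azure.storage.blob import BlobServiceClient

    # The connection string would normally come from Azure Key Vault or app settings.
    service = BlobServiceClient.from_connection_string("<AZURE_STORAGE_CONNECTION_STRING>")
    blob = service.get_blob_client(container="chat-transcripts", blob="session-1234.json")

    # Persist the finished chat session so downstream analytics jobs can pick it up.
    blob.upload_blob('{"user": "guest", "messages": []}', overwrite=True)
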
Client: Abbott Laboratories, NC
Jan 2018 - Dec 2020
Job Title: Machine Learning & AI Developer
Breast Cancer Diagnosis using Ensemble Techniques:
Breast cancer is one of the most common and dangerous diseases among Indian women and, if not
identified at an early stage, can be fatal; in many developed countries it is the second leading cause of
death in women. The goal of the project is to detect the disease from a small set of patient parameters
by applying machine learning algorithms to the patient data. The model is trained on records of
previously examined patients, and with thousands of data points and ensemble algorithms it achieves
strong diagnostic performance.
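A minimal sketch of the ensemble approach on the publicly available Wisconsin breast cancer dataset is shown below; the actual project data and chosen estimators may differ:

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Patient measurements (e.g., tumor radius, texture) and benign/malignant labels.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Soft-voting ensemble combining two complementary classifiers.
    ensemble = VotingClassifier(
        estimators=[
            ("logreg", LogisticRegression(max_iter=5000)),
            ("forest", RandomForestClassifier(n_estimators=200, random_state=42)),
        ],
        voting="soft",
    )
    ensemble.fit(X_train, y_train)
    print("Test accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
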
Roles and Responsibilities:
Conducted extensive literature reviews, delving into scholarly articles and journals to
meticulously strategize a comprehensive plan for the Readmission use case, demonstrating a keen
understanding of the subject matter.
Translated complex deep learning models into visually intuitive insights, leveraging visualization
techniques to enhance the overall performance and interpretability of AI models, thereby
fostering a deeper understanding of the underlying patterns and predictions.
Pioneered the implementation of a robust Federated Learning Architecture, focusing on
preserving user privacy while ensuring efficient collaborative model training across decentralized
nodes, showcasing a strong commitment to data security and confidentiality.
Spearheaded the integration efforts between GATE and Transformers, bridging the gap between
natural language processing and deep learning frameworks, leading to the development of more
sophisticated and contextually aware AI systems.
Conducted in-depth research on the MIMIC4 dataset, meticulously analyzing the data and
strategically incorporating additional features such as severity levels and discharge nodes. These
enhancements improved model accuracy by roughly 1%, reflecting a deep understanding of
dataset nuances and domain-specific insights.
Developed a cutting-edge deep learning model using PyTorch, intricately combining Graph
Attention Network, Graph Pooling, Transformer, and Multi-instance Multi-label Classification
techniques. This innovative approach successfully predicted cardiovascular disease (CVD)
readmission risks with a remarkable 75% accuracy during rigorous testing.
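An illustrative (not production) sketch of how these components can be combined in PyTorch is shown below, assuming PyTorch Geometric for the graph layers; dimensions, label count, and the toy input are placeholders:

    import torch
    import torch.nn as nn
    from torch_geometric.nn import GATConv, global_mean_pool

    class ReadmissionModel(nn.Module):
        def __init__(self, num_node_features: int, hidden: int = 64, num_labels: int = 2):
            super().__init__()
            # Graph attention layers encode each visit graph.
            self.gat1 = GATConv(num_node_features, hidden, heads=4, concat=True)
            self.gat2 = GATConv(hidden * 4, hidden, heads=1)
            # Transformer encoder models interactions across pooled graph embeddings.
            layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            # Multi-label head: one logit per label, trained with BCEWithLogitsLoss.
            self.head = nn.Linear(hidden, num_labels)

        def forward(self, x, edge_index, batch):
            h = torch.relu(self.gat1(x, edge_index))
            h = torch.relu(self.gat2(h, edge_index))
            pooled = global_mean_pool(h, batch)       # one vector per graph
            seq = self.encoder(pooled.unsqueeze(0))   # treat graphs as a sequence
            return self.head(seq.squeeze(0))          # logits per graph, per label

    model = ReadmissionModel(num_node_features=8)
    x = torch.randn(5, 8)                             # 5 nodes with 8 features each
    edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
    batch = torch.zeros(5, dtype=torch.long)          # all nodes belong to one graph
    print(model(x, edge_index, batch).shape)          # torch.Size([1, 2])
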
Demonstrated technical prowess by converting the trained model into the ONNX format, a step
that not only optimized performance but also offered unparalleled flexibility. This streamlined
integration facilitated seamless interactions with a Django web server, ensuring a responsive and
user-friendly experience for end-users.
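The export-and-serve pattern is sketched below with a stand-in model (the real model, input names, and shapes differ); a Django view would typically load the ONNX Runtime session once at startup and call run() per request:

    import numpy as np
    import torch
    import torch.nn as nn
    import onnxruntime as ort

    # Stand-in network; the actual exported model is project-specific.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
    model.eval()
    dummy = torch.randn(1, 16)

    # Export with named inputs/outputs and a dynamic batch dimension.
    torch.onnx.export(
        model, dummy, "model.onnx",
        input_names=["features"], output_names=["logits"],
        dynamic_axes={"features": {0: "batch"}, "logits": {0: "batch"}},
    )

    # Inference through ONNX Runtime, decoupled from the PyTorch training code.
    session = ort.InferenceSession("model.onnx")
    logits = session.run(None, {"features": np.random.randn(4, 16).astype(np.float32)})[0]
    print(logits.shape)  # (4, 2)
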
Collaborated with cross-functional teams, fostering a spirit of innovation and knowledge
exchange, thereby contributing to a dynamic and intellectually stimulating work environment.
Leveraged collective expertise to push the boundaries of technology and enhance the
organization's overall capabilities.
Designed and developed AI-powered chatbots using IBM Watson Assistant, leveraging AI, ML, and
NLP technologies for seamless integration with external APIs and cloud functions.
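A minimal sketch of calling IBM Watson Assistant (v2 stateless API) from Python is shown below; the API key, service URL, and assistant ID are placeholders:

    from ibm_watson import AssistantV2
    from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

    # Credentials and IDs come from the IBM Cloud service instance.
    assistant = AssistantV2(version="2021-06-14", authenticator=IAMAuthenticator("<API_KEY>"))
    assistant.set_service_url("https://api.us-south.assistant.watson.cloud.ibm.com")

    # Stateless message call: no server-side session to manage.
    response = assistant.message_stateless(
        assistant_id="<ASSISTANT_ID>",
        input={"message_type": "text", "text": "Where is my order?"},
    ).get_result()
    print(response["output"]["generic"][0]["text"])
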
Client: Bayer, St. Louis, MO
Aug 2014 - June 2018
Job Title: Machine Learning Developer
Plant Disease Identification using CNN
Agriculture is one of the most important sectors today. Most plants are affected by a wide variety of
bacterial and fungal diseases, which place a major constraint on production and a major threat to food
security. Early and accurate identification of plant diseases is therefore essential to ensure high yield
and quality. In recent years, the number of plant diseases and the degree of harm they cause have
increased due to variation in pathogen varieties, changes in cultivation methods, and inadequate plant
protection techniques. An automated system is introduced to identify different diseases on plants by
checking the symptoms shown on the leaves. Deep learning techniques are used to identify the diseases
and suggest precautions that can be taken against them.
Roles and Responsibilities:
Demonstrated proactive involvement in daily standup calls, showcasing strong team collaboration
and effective task management skills.
Exhibited a keen analytical mindset by meticulously analyzing and pre-processing data, ensuring
high-quality input for the subsequent CNN model development.
Utilized advanced machine learning techniques to construct a comprehensive Convolutional
Neural Network (CNN) model. This model was meticulously crafted with five essential steps:
Convolution Operation, ReLU Layer, Max Pooling, Flattening, and Fully Connected Layer,
indicating a deep understanding of neural network architectures.
Played a pivotal role in building and training the CNN model, dedicating time and expertise to
achieve an impressive accuracy rate of approximately 80%, underscoring a commitment to
precision and excellence in model development.
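A minimal sketch of that five-step architecture in PyTorch is shown below; image size, channel counts, and class count are illustrative, not the production values:

    import torch
    import torch.nn as nn

    class LeafCNN(nn.Module):
        def __init__(self, num_classes: int = 2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 1. convolution
                nn.ReLU(),                                    # 2. ReLU activation
                nn.MaxPool2d(2),                              # 3. max pooling
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),                                 # 4. flattening
                nn.Linear(16 * 32 * 32, num_classes),         # 5. fully connected
            )

        def forward(self, x):              # x: (batch, 3, 64, 64) leaf images
            return self.classifier(self.features(x))

    logits = LeafCNN()(torch.randn(1, 3, 64, 64))
    print(logits.shape)  # torch.Size([1, 2])
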
Demonstrated practical application of the developed model by creating an intuitive and user-friendly
application. This innovative application seamlessly processes photos, predicting whether
a plant is diseased or healthy, emphasizing the ability to translate technical knowledge into
impactful real-world solutions.
Showcased a passion for continuous learning and skill development, staying abreast of the latest
advancements in the field of machine learning and implementing best practices into projects.
Actively contributed to a collaborative and innovative team environment, fostering creativity,
knowledge sharing, and a proactive approach to problem-solving, resulting in successful project
outcomes and enhanced team dynamics.