| Sai Hemanth Paturi - Sr. .NET Full Stack Developer |
| [email protected] |
| Location: Charlotte, North Carolina, USA |
| Relocation: Yes |
| Visa: Green Card |
| Resume file: Sai_Hemanth .Net_Developer_1773842681301.docx |
Sai Hemanth Paturi
Sr. .NET Full Stack Developer
+1 (704) 997-9426 | [email protected] | LinkedIn

PROFESSIONAL SUMMARY:
- Senior .NET Full Stack Developer with over 11 years of experience delivering complex, secure, and scalable solutions, including machine learning, Conversational AI, and predictive analytics, across the banking, healthcare, retail, insurance, and government sectors.
- Proficient in both Agile and Waterfall SDLC models, collaborating in sprint planning, retrospectives, UAT cycles, and cross-functional team workflows using JIRA, TFS, and Azure Boards.
- Expertise in C# (5.0-11.0), .NET Core (2.2-7), ASP.NET Core Web API, and Web API 2, designing layered, modular, and reusable service-oriented backends for mission-critical systems.
- Designed enterprise platforms using modern architectural patterns such as Microservices, N-Tier, MVC, and 3-Tier, ensuring scalability, modularity, and separation of concerns across services.
- Partnered with cross-functional data scientists to build intelligent chatbot solutions using Microsoft Bot Framework, Microsoft Copilot Studio, and Google Cloud Contact Center AI (CCAI), delivering natural, engaging, and intuitive user experiences.
- Conversational AI specialist with 4+ years of dedicated experience in NLP tuning, intent matching, and Retrieval-Augmented Generation (RAG) to develop state-of-the-art intelligent agents and custom copilots.
- Proven success in migrating and supporting legacy SharePoint on-premises systems to M365 (SharePoint Online), integrating AI-driven logic via Power Automate and Microsoft Graph API.
- Led complex AI initiatives from problem framing and data discovery through feature engineering, model validation, and production rollout, ensuring full alignment with stakeholder expectations.
- Designed end-to-end ML pipelines combining data ingestion, transformation, and monitoring using Python, Scikit-Learn, Spark MLlib, and cloud-native services (Azure, AWS, GCP).
- FinTech and healthcare specialist in credit risk automation, patient stratification, and portfolio analytics, leveraging robust classification models and explainable AI techniques for regulated environments.
- Advanced statistical analyst utilizing ANOVA, hypothesis testing, and variance diagnostics to validate model assumptions and ensure interpretability in high-governance settings.
- Skilled in architecting and building pipelines that integrate APIs, relational databases, NoSQL stores, and Kafka streams to ingest structured and unstructured data for real-time AI applications.
- Big data ecosystem expert proficient in Spark, Hadoop, Hive, and HDFS, capable of processing multi-terabyte datasets to support enterprise-scale batch and streaming analytics.
- Applied cross-validation, hyperparameter tuning, and threshold calibration to maximize precision, recall, and stability in diverse production environments.
- Architected end-to-end machine learning solutions, specializing in deploying production-ready AI agents and scalable microservices within high-governance global enterprise environments and cloud platforms.
- Expert in building autonomous agentic workflows using LangChain and LangGraph, integrating large language models with vector databases for sophisticated RAG pipelines and real-time semantic retrieval (see the retrieval sketch after this summary).
- Proficient in serving machine learning models as high-performance microservices using FastAPI, managing the entire lifecycle from raw data acquisition and ETL engineering to containerized deployment with Docker and Kubernetes clusters.
- Strategic MLOps leader implementing robust CI/CD pipelines and automated quality gates in GitHub Actions, ensuring reliable model versioning, experiment tracking, and continuous performance monitoring for mission-critical production AI applications.
- Highly skilled in deep data science methodologies, utilizing TensorFlow and PyTorch for model development while applying advanced statistical validation to ensure interpretability and compliance across diverse regulated industry sectors.
- Applied emerging AI concepts, including Neural Concept Processing, to optimize model selection, ensuring that specific machine learning architectures were strategically deployed to meet unique business objectives and complex data requirements.
- Developed comprehensive EDA and model validation frameworks to track experiments and ensure robustness, utilizing statistical diagnostic tools to maintain high accuracy and interpretability for all mission-critical production AI deployments.
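For illustration, a minimal sketch of the vector-index retrieval step behind the RAG pipelines listed above, assuming sentence-transformers and FAISS; the embedding model name, documents, and query below are hypothetical placeholders, not a production configuration.

    # Minimal semantic-retrieval sketch (illustrative; model and documents are placeholders).
    import faiss
    import numpy as np
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
    docs = ["KYC document checklist ...", "Loan eligibility policy ...", "Underwriting guidelines ..."]

    # Embed the corpus once and build an in-memory vector index.
    vectors = np.asarray(model.encode(docs), dtype="float32")
    index = faiss.IndexFlatL2(vectors.shape[1])
    index.add(vectors)

    # At query time, retrieve the top-k passages used to ground the LLM's answer.
    query = np.asarray(model.encode(["What is required for a KYC check?"]), dtype="float32")
    _, hits = index.search(query, 2)
    context = [docs[i] for i in hits[0]]  # passed to the LLM as grounding context

In a full RAG pipeline the retrieved context would be injected into the model prompt; a managed vector store such as Pinecone or Milvus would replace the in-memory index at scale.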
TECHNICAL SKILLS:
Backend & .NET Stack: C# (5.0-11.0), .NET 7, .NET Core (2.2-6.0), ASP.NET Core Web API, ASP.NET MVC 5, Web API 2.2, Blazor Server, Razor Pages, WCF (SOAP), WinForms, SignalR
Frontend & UI: Angular (10-18), AngularJS 1.7, React 16, JavaScript (ES5/ES6), TypeScript (4.6-5.0), HTML5, CSS3, Bootstrap (3.3-5), RxJS 7.4, Redux, WCAG 2.1 compliance
AI & Agentic Systems: LangChain, LangGraph, AI Agents, RAG Pipelines, Neural Concept Processing (NCP) concepts, Prompt Engineering
Model Serving & APIs: FastAPI, RESTful Microservices, Flask, Request/Response Optimization, API Security, Model Compression
MLOps & DevOps: Docker, Kubernetes (K8s), GitHub Actions, CI/CD, MLflow, Model Registry, Automated Deployment
Data Engineering: Apache Spark (PySpark), AWS Glue, Databricks (dbx), Kafka, ETL/ELT Pipelines, Feature Engineering, SQL
Machine Learning: Scikit-learn, TensorFlow, PyTorch, Keras, EDA, Hyperparameter Tuning, Experiment Tracking
Cloud & Observability: AWS (SageMaker, Bedrock, Lambda), Azure, OpenTelemetry (OTel), Jaeger, CloudWatch Logs, Metrics
Vector & NoSQL: Pinecone, Milvus, FAISS, OpenSearch, DynamoDB, MongoDB, Snowflake, PostgreSQL

PROFESSIONAL EXPERIENCE:

Client: Citibank, New York, NY | August 2023 - Present
Role: Sr. .NET Full Stack Developer
Responsibilities:
- Practiced Agile methodology with JIRA for sprint planning, story grooming, and daily stand-ups while collaborating closely with BAs, QA teams, and product owners.
- Designed the platform using a microservices architecture, separating presentation, application, domain, and data access layers for clear separation of concerns and maintainability.
- Built scalable and secure RESTful APIs in ASP.NET Core 7.0 and C# 11 to support loan onboarding, credit scoring, KYC, and approval workflows.
- Implemented centralized exception handling, middleware-based logging, and policy-based authorization to enforce role and access control across endpoints.
- Developed and documented RESTful endpoints for KYC checks, loan eligibility, and approval routing, ensuring seamless integration with internal risk and underwriting engines.
- Used Entity Framework Core 7 with repository and unit-of-work patterns for structured ORM, and implemented Dapper selectively in reporting modules for performance optimization.
- Standardized data exchange using JSON with schema validation and serialization policies via System.Text.Json to maintain consistency across services.
- Introduced GraphQL using Hot Chocolate for Angular dashboards, allowing nested data retrieval for loan approvals, reducing over-fetching and improving performance.
- Published and secured APIs via Azure API Management Gateway, managing throttling, versioning, and IP filtering for internal enterprise consumers.
- Integrated OAuth2 with Azure AD B2C, issuing JWT tokens and applying role-based access for loan officers, underwriters, and support staff.
- Migrated the frontend from Angular 12 to Angular 16, gradually replacing legacy components with reusable standalone components for internal dashboards.
- Optimized AWS DynamoDB schemas and ETL pipelines to handle high-velocity credit data, facilitating efficient feature engineering and data acquisition for real-time risk scoring and generative AI workflows.
- Implemented advanced RAG pipelines using Azure OpenAI and vector search to ensure grounded, high-accuracy responses, integrating semantic retrieval layers to enhance the reliability of autonomous agents within the ecosystem.
- Architected secure cloud environments using AWS VPC and IAM roles to protect sensitive financial data, ensuring that all AI model deployments adhered to strict governance, security, and compliance standards.
- Maintained robust CI/CD pipelines via GitHub Actions, implementing automated quality gates and experiment tracking to accelerate the transition from initial prototype to continuous improvement for all production-grade AI services.
- Used AI-assisted coding tools and modern frameworks to generate modular, object-oriented Python code, prioritizing rigorous unit testing and documentation to support regulatory audits and long-term reliability.
- Performed rigorous exploratory data analysis and statistical validation to improve the interpretability of AI-driven risk detection, ensuring that all deployed models met enterprise-level performance thresholds and regulatory requirements.
- Served complex machine learning models as high-concurrency microservices via FastAPI, implementing request optimization and API security protocols to ensure seamless integration with internal enterprise applications and external stakeholder platforms (see the serving sketch after this section).
- Managed the end-to-end MLOps lifecycle by containerizing AI workloads with Docker, facilitating consistent environment replication and rapid scaling across development, staging, and production tiers within diverse global infrastructures.
- Collaborated with cross-functional Agile teams to identify high-impact use cases, participating in sprint planning and reviews to take AI concepts from raw data acquisition through to deployment.
- Developed custom Microsoft Copilot plugins and intelligent agents leveraging the Graph API to retrieve enterprise data, significantly improving worker efficiency by automating information retrieval across Outlook, Teams, and SharePoint environments.
- Evaluated emerging AI frameworks and model selection strategies to optimize hardware utilization, ensuring that specific architectures such as Neural Concept Processing were applied correctly to meet unique financial business objectives.
- Built secure authentication mechanisms using OAuth and AWS Cognito for AI-driven applications, ensuring that high-privilege operations and sensitive data access remained strictly controlled within the identity management framework.
- Documented all data models, API specifications, and agentic reasoning flows to ensure transparency and ease of maintenance, fostering a culture of continuous learning and technical excellence across the organization.

Environment: Python, SQL, FastAPI, Docker, Kubernetes, LangGraph, LangChain, React, Azure OpenAI, AWS Bedrock, Pinecone, PySpark, Databricks, Kafka, RabbitMQ, EventBridge, DynamoDB, Oracle, SQL Server, GitHub Actions, OpenTelemetry, ANOVA.
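As a minimal sketch of the FastAPI model-serving pattern described above: the route, request fields, and model artifact below are hypothetical placeholders, not the actual client implementation.

    # Hypothetical FastAPI microservice wrapping a trained scikit-learn classifier.
    import joblib
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()
    model = joblib.load("credit_risk_model.pkl")  # assumed model artifact name

    class ScoreRequest(BaseModel):
        income: float
        debt_ratio: float
        credit_history_months: int

    @app.post("/score")
    def score(req: ScoreRequest):
        # Feature order must match the training pipeline exactly.
        features = [[req.income, req.debt_ratio, req.credit_history_months]]
        risk = model.predict_proba(features)[0][1]
        return {"default_risk": float(risk)}

A service like this would typically run under uvicorn and sit behind the API Management gateway and OAuth2 layer mentioned above.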
Client: UHG Optum, Minnetonka, MN | May 2022 - July 2023
Role: .NET Full Stack Developer
Responsibilities:
- Launched conversational AI solutions and generative AI agents using AWS Bedrock and Google Cloud Contact Center AI to synthesize clinical documentation and answer patient risk inquiries across healthcare platforms.
- Streamlined healthcare workflows by integrating AI-driven logic with Power Automate and the Power Platform, replacing legacy manual processes with intelligent automation to improve worker efficiency and clinical productivity.
- Constructed stateful AI agents using LangGraph to synthesize insights from massive datasets of claims and provider notes, creating intuitive, natural, and engaging chatbot experiences for clinical teams in production.
- Created RAG approaches within healthcare data pipelines to support population health initiatives, and standardized patient data ingestion using FHIR and HL7 standards while ensuring data integrity.
- Integrated React-based web applications and custom dashboards to visualize patient stratification models and monitor chatbot performance metrics for medical and analytics teams, ensuring reliable AI-driven support.
- Formulated scalable ML infrastructure on AWS using EventBridge and DynamoDB to process multi-terabyte clinical datasets in a decoupled, secure, and compliant environment optimized for high throughput.
- Orchestrated complex workflows with Apache Airflow and implemented automated CI/CD pipelines to ensure the security, stability, and reproducibility of distributed healthcare systems across development and production cloud environments.
- Managed the end-to-end MLOps lifecycle by containerizing AI workloads with Docker, facilitating consistent environment replication and rapid scaling across development and production tiers within healthcare cloud infrastructure.
- Served complex healthcare models as scalable microservices using FastAPI, ensuring seamless integration with clinical applications and maintaining low-latency performance for real-time patient risk assessment and documentation tools.
- Collaborated with cross-functional teams to translate high-level business needs into production-ready AI solutions, ensuring all models were continuously improved and aligned with enterprise documentation standards.
- Engineered robust ingestion pipelines connecting APIs, S3, SQL Server, and Kafka streams, ensuring timely availability of clinical and provider datasets for advanced modeling and predictive analytics within secure environments.
- Performed large-scale data transformation and NLP tuning using PySpark and Pandas to address outliers and categorical encoding across diverse patient cohorts, ensuring high-quality features for downstream models (see the transformation sketch after this section).
- Implemented feature quality and integrity checks, including drift detection and bias assessment, utilizing CloudWatch and EvidentlyAI to ensure compliance-driven validation cycles for all mission-critical healthcare machine learning.
- Selected and optimized ML algorithms such as XGBoost and Random Forest for clinical risk scoring, focusing on model reliability, high-throughput processing, and strict regulatory compliance within highly regulated environments.
- Produced comprehensive technical documentation, including model cards, schema definitions, and deployment procedures, to support audits and ensure operational readiness for engineering and compliance teams.

Environment: Python, SQL, FastAPI, Docker, LangGraph, LangChain, R, C#, .NET, Power Platform, Power Automate, Google CCAI, AWS Bedrock, Oracle, Hive, MongoDB, HDFS, Hadoop, MapReduce, Kafka, Airflow, React, FHIR, HL7, MS Visio, Distributed File Systems.
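A minimal sketch of the kind of PySpark outlier capping and categorical encoding described above; the source path, column names, and percentile threshold are hypothetical examples.

    # Illustrative PySpark feature preparation (paths and columns are placeholders).
    from pyspark.sql import SparkSession, functions as F
    from pyspark.ml.feature import StringIndexer

    spark = SparkSession.builder.appName("patient-features").getOrCreate()
    df = spark.read.parquet("s3://bucket/claims/")  # assumed source location

    # Cap extreme claim amounts at the 99th percentile to limit outlier influence.
    p99 = df.approxQuantile("claim_amount", [0.99], 0.01)[0]
    df = df.withColumn("claim_amount", F.least(F.col("claim_amount"), F.lit(p99)))

    # Encode a categorical diagnosis code into a numeric index for downstream models.
    indexer = StringIndexer(inputCol="diagnosis_code", outputCol="diagnosis_idx",
                            handleInvalid="keep")
    df = indexer.fit(df).transform(df)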
Client: State of VA, Richmond, Virginia | April 2019 - May 2022
Role: .NET Application Developer
Responsibilities:
- Architected horizontal data integration frameworks to unify diverse public-domain datasets into a single source of truth for statewide, real-time analytics and strategic public sector decision support.
- Engineered secure multi-tier storage ecosystems for public-domain data within AWS VPC, ensuring strict isolation and adherence to state-level compliance standards for sensitive and open government datasets.
- Implemented production-grade classification models using public-domain records and scaled Python utilities with SciPy and NumPy to enhance machine learning workflow performance for targeted statewide public health initiatives.
- Designed and deployed integration layers for public-domain reporting and analytics dashboards using Power BI and Tableau, translating complex government datasets into actionable insights for senior leadership and stakeholders.
- Orchestrated full-scale data engineering pipelines for public-domain sources using Python and Spark while utilizing infrastructure as code for automated cloud resource provisioning within the enterprise data environment.
- Formulated distributed machine learning solutions using public-domain records and Spark MLlib, resulting in a significant improvement in analytical model precision for complex, large-scale public sector initiatives.
- Managed the modernization of legacy public-domain repositories and integrated automated data extraction workflows to support statewide diagnostic reporting and data accessibility across multiple agencies and departments.
- Formulated cloud-ready CI/CD workflows for public-domain data using Dockerized components, AWS EMR, and GitHub Actions, enabling automated deployments and real-time monitoring via CloudWatch and dashboards.
- Facilitated large-scale distributed architectures using AWS DynamoDB and EventBridge to process high-volume public-domain datasets while optimizing transformations to reduce operational costs and improve data processing efficiency.
- Collaborated with cross-functional teams to create comprehensive architectural diagrams for public-domain data systems, supporting state audits and regulatory reviews while maintaining high technical documentation standards.
- Integrated Kafka and Spark Streaming to process real-time public-domain feeds from various agencies, ensuring that the centralized analytics platform provided immediate insights for critical government operations (see the streaming sketch after this section).
- Maintained data integrity across Snowflake and Teradata environments by designing optimized schemas that supported high-performance querying of public-domain data for downstream machine learning and enterprise reporting workloads.
- Transformed unstructured public-domain data from multiple government sources into structured formats suitable for advanced analytics and predictive modeling using PySpark and Hive within a distributed, scalable infrastructure.
- Served as a technical lead for data migration projects moving on-premises public-domain data to cloud-native AWS architectures while ensuring zero downtime for critical statewide reporting and analytics.
- Applied advanced feature engineering techniques to improve the accuracy of predictive models used for public-domain resource allocation and trend analysis across diverse and complex government data sources.

Environment: Python, SQL, FastAPI, Docker, R, C#, .NET, SharePoint, Power BI, PySpark, Spark MLlib, AWS EC2, S3, EMR, Lambda, EventBridge, DynamoDB, Hive, Hadoop, HDFS, Snowflake, Teradata, Oracle, Kafka, Spark Streaming, GitHub Actions, Jenkins.
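For illustration, a minimal Spark Structured Streaming read from Kafka in the spirit of the real-time feed processing above; the broker address and topic name are placeholders, and the spark-sql-kafka connector package is assumed to be on the classpath.

    # Illustrative Kafka-to-Spark streaming sketch (broker and topic are placeholders).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("public-feeds").getOrCreate()

    stream = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "agency-feeds")
              .load())

    # Kafka delivers raw bytes; cast the payload to string before parsing downstream.
    events = stream.select(F.col("value").cast("string").alias("payload"))

    # Console sink for demonstration; a real pipeline would write to a lake or warehouse.
    query = events.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()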
Client: American Airlines, Fort Worth, TX | December 2017 - March 2019
Role: Software Developer
Responsibilities:
- Engineered Python-based operational automation tools and data processing frameworks that improved system efficiency by 32% while supporting conversational AI agents for flight operations and real-time analytics.
- Designed microservices-driven architectures using Python and React to ensure scalable, fault-tolerant integration across scheduling and crew management systems for seamless data visualization and interactive user experiences.
- Integrated automated flight data extraction workflows with enterprise SQL Server and Oracle databases, utilizing Microsoft Graph API to pull crew datasets and operational logs for downstream machine learning logic.
- Created and deployed REST APIs using Flask and Django to expose machine-learning-enabled insights and business logic to airline applications, ensuring secure and scalable consumption across internal platforms (see the API sketch after this section).
- Orchestrated near-real-time ingestion and processing of operational event streams by integrating Python applications with Spark, Kafka, and Hadoop components to drive predictive modeling and airline scheduling analytics.
- Optimized application performance through vectorization and concurrency, achieving significant reductions in runtime for high-volume processing tasks involving both structured and unstructured aviation data for complex machine learning models.
- Implemented predictive features and rule-based engines leveraging scoring logic and clustering to support decision making for operations planning and flight delay risk reduction across the global network.
- Containerized Python services using Docker for deployment in cloud-based environments, ensuring consistent runtime performance and portability for mission-critical machine learning models and aviation data processing tools.
- Managed CI/CD pipelines using Git and Jenkins to enforce continuous delivery standards and improve release reliability for mission-critical airline systems and predictive analytics applications in production.
- Produced detailed API specifications and conversational flow diagrams in Visio to facilitate long-term maintainability and team handoffs for complex machine learning and front-end development projects.
- Collaborated within an Agile Scrum methodology, contributing to sprint planning and backlog refinement to deliver high-value automation features and machine learning models for around-the-clock airline operations.
- Developed interactive front-end components using React to visualize real-time flight delay predictions and crew scheduling data, providing operational teams with intuitive dashboards for fast decision making.
- Applied advanced feature engineering techniques in Python to improve the accuracy of airline risk models, ensuring that flight operations data was properly prepared for high-performance machine learning inference.
- Transformed legacy data processing scripts into modular Python packages supporting both batch and stream processing for machine learning workflows and real-time operational reporting across the airline organization.
- Maintained data integrity across Cassandra and Oracle environments by designing optimized schemas that supported high-performance querying of aviation records for downstream machine learning and front-end analytics.

Environment: Python, React, JavaScript, SQL Server, Oracle, Cassandra, FastAPI, Flask, Django, Docker, Jenkins, Git, Spark, Kafka, Hadoop, PyTest, Matplotlib, MS Visio, Agile Scrum, Linux, Windows, JSON, XML.
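A minimal sketch of a Flask endpoint of the kind described above for exposing model-driven insights; the route, payload fields, and scoring function are hypothetical stand-ins for a trained model call.

    # Hypothetical Flask endpoint exposing a delay-risk prediction.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def predict_delay_risk(features):
        # Placeholder for a trained model; returns a fixed dummy score here.
        return 0.42

    @app.route("/delay-risk", methods=["POST"])
    def delay_risk():
        payload = request.get_json()
        score = predict_delay_risk(payload)
        return jsonify({"flight": payload.get("flight"), "risk": score})

    if __name__ == "__main__":
        app.run(port=8080)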
Client: IBM, India | June 2014 - May 2017
Role: Software Developer
Responsibilities:
- Engineered distributed data processing architectures on Linux using Python and C# components, ensuring scalable integration with internal telemetry platforms and high-performance RDBMS logging systems for engineering teams.
- Created RESTful APIs using Flask and Django to expose machine learning telemetry metrics and automation results, enabling seamless integration with monitoring systems and internal engineering applications for stakeholders.
- Optimized data retrieval by managing structured and unstructured datasets across SQL Server and MySQL through optimized queries and indexing strategies, significantly improving database performance.
- Implemented predictive analytics features using object-oriented Python to provide anomaly detection and performance summaries derived from large-scale telemetry and log data for various internal systems.
- Streamlined diagnostic workflows by optimizing Python processing pipelines on Linux with vectorization and concurrency, significantly reducing latency and improving throughput for high-volume machine learning and automation tasks.
- Formulated containerized services using Docker and maintained CI/CD pipelines using Jenkins and Git to ensure standardized deployments across development and production environments for all mission-critical tools.
- Collaborated in an Agile Scrum environment with program managers and QA teams to deliver backend enhancements and machine learning automation features while participating in sprint planning and reviews.
- Crafted comprehensive unit tests using PyTest and monitored application stability through logging frameworks and threshold-based alerts to ensure mission-critical system performance and reliable machine learning inference.
- Produced detailed technical documentation, including API specifications and architectural diagrams in Visio, to support cross-team knowledge transfer and auditing for complex machine learning and data engineering projects.
- Designed lightweight front-end interfaces using React to visualize telemetry summaries and anomaly detection results, providing engineering teams with a clear view of system health and performance metrics.
- Transformed raw telemetry logs into structured datasets suitable for machine learning using NumPy and SQLAlchemy while maintaining data integrity across diverse SQL and Linux-based file systems.
- Utilized Linux shell scripting to automate database maintenance and model deployment tasks, ensuring that all RDBMS environments remained optimized for high concurrency and heavy data processing workloads.
- Managed SharePoint repositories by implementing automated data extraction and machine learning reporting tools, facilitating seamless information flow between technical teams and leadership for data-driven decision making.
- Applied statistical sampling techniques and exploratory data analysis to validate telemetry data quality, ensuring that all diagnostic insights were grounded in accurate and reliable engineering metrics.

Environment: Python, SQL, Linux, FastAPI, Docker, React, C#, .NET, SQL Server, MySQL, SQLAlchemy, NumPy, Requests, Beautiful Soup, Jenkins, Git, SharePoint, Tableau, MS Visio, Anaconda, Spyder, JSON, XML.

Education:
Bachelor of Technology in Computer Science and Engineering, 2010 - 2014
GITAM University, India