Summary
FRONTEND | BACKEND | DEVOPS | DATA & ETL | AI/ML | Testing & Agile
Azure (Functions, AKS, App Service, Storage Accounts, Key Vault, Application Insights), GCP (Cloud Run, Compute Engine, Pub/Sub, BigQuery, Cloud Storage, Composer)

Professional Experience

Atlantic Health
Atlantic Health serves nearly half of New Jersey, and as the organization expanded through acquisitions and partnerships with independent providers, workflows became fragmented across different healthcare systems. Our project focused on building a secure, cloud-native e-prescribing and care coordination platform that unified hospitals and physician groups, automated insurance and pharmacy interactions, and reduced prescription delays through real-time, data-driven collaboration among providers, pharmacies, and payers, regardless of which EHR they used.

React:
● Migrated Angular components and views into a modular React 18 architecture using TypeScript, React Router, and React Query, implementing a reusable Material-UI design system with multilingual support and full accessibility compliance.
● Rebuilt search and directory modules using mobile-first responsive layouts with CSS Grid, Flexbox, and Material-UI breakpoints, ensuring consistent behavior across browsers and devices.
● Built dynamic forms with React Hook Form, Zod validation, masked inputs, and field-level analytics, using async/await and promises to streamline validation logic.
● Integrated workflows with routing guards, outage handling, and conditional UI toggles tied to feature flags and backend availability.
● Improved performance baselines and optimized Lighthouse scores and Core Web Vitals using code-split routes, dynamic imports, and SSR/CSR tuning.
● Centralized state and cache control using React Query, Redux Toolkit, and Context API to reduce redundant API calls and improve render performance.
● Profiled component bottlenecks using React DevTools and flame graphs, optimizing memoization and context propagation to avoid unnecessary re-renders.
● Wrote comprehensive UI and route-level tests with Jest, React Testing Library, and axe-core, automating accessibility checks and validating user flows.
● Instrumented error telemetry and validation feedback via custom React hooks, logging failures and routing issues to support QA and troubleshooting.
● Used GitHub Copilot and Cursor to accelerate scaffolding of reusable components, form hooks, and service layers, reducing boilerplate during migration.
Frontend: React 18, Angular 10, TypeScript 4.9, React Router v6, React Query v3, Material-UI v5, React Hook Form v7, Zod v3, Jest v29, React Testing Library v13

Angular:
● Migrated legacy jQuery and Bootstrap views into Angular 10+ modules using RxJS, SCSS, and Angular Material, implementing a shared component library with accessibility and responsive design baked in.
● Refactored monolithic templates into modular feature-based routing with lazy-loaded modules, route guards, and dynamic imports to improve performance and maintainability.
● Built reactive forms with custom validators, masked inputs, and conditional field logic using FormBuilder and RxJS operators such as switchMap, debounceTime, and combineLatest.
● Implemented centralized error handling, loading states, and retry logic using interceptors and service wrappers for RESTful APIs.
● Created reusable UI components for search, filters, modals, and paginated tables with Angular Material and CDK, supporting keyboard navigation and screen reader compatibility.
● Integrated feature flags and environment-based toggles to support phased rollouts and A/B testing across multiple deployments.
● Tuned performance using OnPush change detection, trackBy functions, and memoized selectors to reduce re-renders and improve responsiveness.
● Wrote unit and integration tests using Jasmine, Karma, and TestBed utilities, covering component logic, service interactions, and route-level behaviors.
● Automated CI/CD pipelines with GitHub Actions and Azure DevOps for linting, testing, and deployment across staging and production environments.
Frontend: Angular 10+, TypeScript 4.x, RxJS 6+, Angular Material, SCSS, Jasmine, Karma, GitHub Actions, Azure DevOps

FastAPI/Flask:
● Built Python 3.11 microservices using FastAPI, Flask, Pydantic v2, and SQLAlchemy to support workflows such as content suggestions, audit logging, and real-time notifications.
● Maintained and enhanced internal Go microservices responsible for scheduling, calendar management, and rule-based logic for time slots and overbooking.
● Designed and implemented a notification service to deliver real-time alerts via WebSocket streams and Kafka topics.
● Built secure REST and gRPC APIs using Backend-for-Frontend (BFF) patterns, enforcing OAuth2, idempotency, rate limiting, and circuit breakers for reliable upstream interaction.
● Used PostgreSQL (JSONB-indexed) for structured data and DynamoDB (GSI patterns) for metadata, session state, and toggle-based configuration.
● Enforced RBAC, token scopes, consent validation, and encrypted logging to ensure compliance across backend microservices.
● Improved runtime performance with pgBouncer, async FastAPI endpoints, and Redis caching, and configured autoscaling with chaos testing to ensure uptime during peak usage.
● Developed a secure context service to resolve session tokens, scopes, and access flags, enabling context-aware access across modules and departments.
● Built and maintained a content management service handling draft creation, autosave, real-time suggestion injection, version control, and final submission via standardized APIs.
● Created a lightweight feature flag and metadata service using DynamoDB and Redis to control backend behavior across environments, supporting safe rollouts and toggle-based logic.
● Integrated Apache Kafka for event-driven workflows, streaming webhook data into topic-based consumers powering async triggers and job queues.
● Orchestrated background processing using Celery and Redis, implementing dead-letter queues, retry safety, and task chaining for job handling and event-driven workflows.
● Tuned backend performance by optimizing ORM queries, applying Redis caching, and scaling Kafka consumer concurrency to meet peak traffic and ingestion loads.
● Scheduled recurring tasks via Celery Beat to perform nightly syncs, session cleanup, integrity checks, and auto-corrections.
● Wrote comprehensive tests using Pytest, Mock, and factory-based fixtures, achieving 85% test coverage across REST endpoints, Kafka consumers, job workers, and adapter layers.
Backend: Python 3.11, FastAPI, Flask, Pydantic v2, SQLAlchemy, PostgreSQL, DynamoDB, Redis, Kafka, Celery, gRPC, OAuth2, Pytest, Docker, Terraform, GitHub Actions
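A minimal sketch of the async-endpoint-with-idempotency pattern described above, assuming FastAPI with Pydantic v2 and redis-py's asyncio client; the route, header name, payload fields, and Redis endpoint are illustrative assumptions, not the project's actual API.

```python
# Minimal sketch (not production code): async FastAPI endpoint with Pydantic v2
# validation and a Redis-backed idempotency check.
from fastapi import FastAPI, Header, HTTPException
from pydantic import BaseModel, Field
import redis.asyncio as redis

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)  # hypothetical endpoint

class NotificationIn(BaseModel):
    patient_id: str = Field(min_length=1)
    channel: str = "websocket"
    message: str

@app.post("/notifications", status_code=202)
async def create_notification(payload: NotificationIn, idempotency_key: str = Header(...)):
    # SET NX returns None when the key already exists, so duplicate requests are rejected.
    first_seen = await cache.set(f"idem:{idempotency_key}", "1", nx=True, ex=3600)
    if not first_seen:
        raise HTTPException(status_code=409, detail="Duplicate request")
    # Downstream publish (e.g., to a Kafka topic) would happen here.
    return {"status": "queued", "patient_id": payload.patient_id}
```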
Django/Express:
● Built Python 3.11 microservices using Django and Django REST Framework (DRF), leveraging ORM models, serializers, and view sets to create modular and maintainable APIs.
● Developed and maintained Node.js (Express) services for scheduling logic and rule-based workflows, integrating REST and event-driven endpoints for interoperability with legacy systems.
● Designed and implemented real-time notification services using WebSocket (Socket.IO) and Kafka for streaming alerts and updates.
● Built secure REST and gRPC APIs using Backend-for-Frontend (BFF) patterns, implementing JWT/OAuth2, idempotency, throttling, and circuit breakers for reliable upstream communication.
● Utilized PostgreSQL (JSONB) and DynamoDB for structured and metadata storage, supporting low-latency reads and multi-region durability.
● Implemented RBAC, token scopes, and encrypted audit trails across Django middleware to ensure secure access and logging.
● Optimized performance using pgBouncer, Redis caching, Django async views, and Node.js clustering to handle high concurrency.
● Built context-aware services to resolve session tokens, flags, and scopes for dynamic access control and synchronization.
● Developed content management services with draft creation, autosave, version control, and standardized API submission.
● Implemented feature flag and metadata services using DynamoDB and Redis to manage rollout controls and configuration overrides.
● Integrated Apache Kafka for event-driven workflows, processing webhook data through topic-based consumers and async workers.
● Orchestrated background processing using Celery (Django) and BullMQ/Redis (Node.js), applying retries, dead-letter queues, and chaining for reliable task execution.
● Tuned backend performance by optimizing ORM queries, applying caching layers, and scaling Kafka consumer groups.
● Scheduled periodic tasks using Celery Beat and Node.js cron jobs for syncs, cleanup, and reconciliation.
● Wrote unit and integration tests using Pytest, unittest, and Jest to achieve 85% coverage across APIs, consumers, jobs, and middleware.
Backend: Python 3.11, Django 4.x, DRF, Node.js 18, Express.js 4, PostgreSQL, DynamoDB, Redis, Kafka, Celery, BullMQ, gRPC, OAuth2, JWT, Pytest, Jest, Docker, Terraform, GitHub Actions
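A minimal sketch of the retry-with-backoff and dead-letter pattern referenced above, assuming Celery with a Redis broker; the task name, upstream call, and exception type are hypothetical placeholders.

```python
# Minimal sketch: Celery task that retries transient failures with backoff and
# parks permanently failed work in a "dead-letter" task for manual replay.
from celery import Celery
from celery.exceptions import MaxRetriesExceededError

app = Celery("jobs", broker="redis://localhost:6379/0")

class TransientUpstreamError(Exception):
    """Placeholder for a retryable upstream failure (e.g., pharmacy API timeout)."""

def push_to_pharmacy(prescription_id: str) -> None:
    """Hypothetical upstream call; raises TransientUpstreamError on timeouts."""
    ...

@app.task(bind=True, max_retries=3, default_retry_delay=30, acks_late=True)
def sync_prescription(self, prescription_id: str):
    try:
        push_to_pharmacy(prescription_id)
    except TransientUpstreamError as exc:
        try:
            # Exponential-ish backoff: 30s, 60s, 120s between attempts.
            raise self.retry(exc=exc, countdown=30 * (2 ** self.request.retries))
        except MaxRetriesExceededError:
            # After the final attempt, persist the payload instead of losing it.
            dead_letter.delay("sync_prescription", prescription_id)

@app.task
def dead_letter(task_name: str, payload: str):
    # Store failed work items so operators can inspect and replay them later.
    ...
```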
AI/ML:
● Designed Retrieval-Augmented Generation (RAG) workflows using LangChain, Pinecone, and LangFlow, combining hybrid search, reranking, and prompt chaining to improve relevance and accuracy.
● Implemented graph-based orchestration using LangGraph with guardrails, retries, and streaming token output, using LangGraph UI for step-level telemetry and debugging.
● Applied NLP techniques using spaCy, NLTK, and Hugging Face Transformers for summarization, cue extraction, and style alignment, with built-in evaluation for accuracy and drift detection.
● Used NumPy, Pandas, and Matplotlib for data profiling, evaluation scoring, and slicing embedding outputs during pipeline development and model A/B testing.
● Added confidence scoring, fallback-to-human review, and redaction mechanisms before persistence, and implemented prompt templates with Pydantic-typed variables to prevent injection and format drift.
● Integrated LangChain Expression Language (LCEL) to chain modular tools and ensure deterministic routing between retrievers, summarizers, and post-processors.
● Optimized retriever performance using multi-vector stores and hybrid filters with metadata in Pinecone to improve retrieval precision.
● Connected Azure AI Foundry and OpenAI GPT-4 endpoints for model orchestration, prompt management, and observability of API usage.
● Developed FastAPI microservices exposing RAG and summarization endpoints with async inference, streaming responses, and structured logging.
● Built evaluation dashboards in Streamlit to visualize similarity scores, latency metrics, and output ranking comparisons across different retrieval and LLM configurations.
AI/ML: Python 3.11, LangChain, LangGraph, LangFlow, Pinecone, Chroma, FAISS, OpenAI GPT-4, Azure AI Foundry, Hugging Face Transformers, spaCy, NLTK, Pandas, NumPy, Matplotlib, Streamlit, FastAPI
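A minimal sketch of an LCEL retrieval chain like the ones described above, assuming the langchain-openai, langchain-community, and faiss-cpu packages and an OPENAI_API_KEY; the documents, prompt, and model name are illustrative, not the project's actual pipeline.

```python
# Minimal sketch: retriever -> prompt -> model -> parser composed with LCEL.
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

docs = [
    "Formulary update: drug X requires prior authorization.",  # placeholder content
    "Pharmacy hours: weekdays 9am-6pm.",
]
retriever = FAISS.from_texts(docs, OpenAIEmbeddings()).as_retriever(search_kwargs={"k": 2})

def format_docs(retrieved):
    return "\n\n".join(d.page_content for d in retrieved)

prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)

chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI(model="gpt-4", temperature=0)
    | StrOutputParser()
)

print(chain.invoke("Does drug X need prior authorization?"))
```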
DATA:
● Contributed to real-time ingestion of streaming events using Apache Kafka as a bridge between FastAPI services and AI pipelines, reducing latency and improving responsiveness across workflows.
● Contributed to distributed ETL jobs using PySpark on Azure Databricks and Delta Lake to process streaming data from Kafka, with schema validation, checkpointing, and batch or incremental ingestion via Azure Data Factory pipelines.
● Integrated Snowflake for normalized datasets, tuning clustering keys, warehouse sizing, and query performance to support analytical workloads and reporting.
● Built Power BI dashboards to track latency, throughput, and data quality across pipelines, monitoring ingestion delays and downstream impact using DAX and dataflows.
Data: Azure Databricks, PySpark, Delta Lake, Azure Data Factory (ADF), Snowflake, Kafka, Airflow, Great Expectations, Power BI, dbt, Python, Terraform
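A minimal, Databricks-style sketch of the Kafka-to-Delta streaming ingestion described above; the broker, topic, schema fields, and storage paths are placeholder assumptions.

```python
# Minimal sketch: read JSON events from Kafka, validate against an explicit schema,
# and append incrementally to a Delta table with checkpointing.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("rx-events").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("prescription_id", StringType()),
    StructField("status", StringType()),
    StructField("occurred_at", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "rx-events")                    # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .filter(F.col("event_id").isNotNull())               # basic quality gate
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/rx-events")  # placeholder path
    .outputMode("append")
    .start("/mnt/delta/rx_events")                               # placeholder table path
)
```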
AWS:
● Containerized web services using Docker and Kubernetes, provisioned infrastructure via Terraform, and deployed workloads on AWS ECS Fargate and EKS, serving static content through S3 and CloudFront.
● Configured EC2, Application Load Balancer (ALB), and Auto Scaling Groups to host and scale backend services for high-availability workloads.
● Automated infrastructure and CI/CD tasks using Python and PowerShell scripts within AWS and GitHub Actions, covering test execution, security scans, and blue-green deployments, with configuration managed via AWS Secrets Manager and Parameter Store.
● Deployed model inference workers on AWS Lambda and containerized services to support bursty traffic patterns, versioning model artifacts in Amazon ECR and S3.
● Integrated AWS API Gateway and RDS (PostgreSQL) with FastAPI microservices to expose secure REST endpoints and persist transactional data.
● Monitored applications using CloudWatch with structured logs, distributed tracing, and real-time alerts to detect regressions and bottlenecks.
● Applied cloud networking principles including VPC architecture, route tables, NACLs, gateways, and VPC endpoints to design secure and scalable infrastructure.
● Connected Amazon Bedrock and OpenAI GPT-4 endpoints for model orchestration, prompt management, and centralized observability of API usage and cost metrics.
Cloud & DevOps: AWS ECS Fargate, EKS, Lambda, EC2, ALB, API Gateway, RDS PostgreSQL, S3, CloudFront, CloudWatch, Secrets Manager, ECR, Bedrock, Docker, Terraform, GitHub Actions

AZURE:
● Containerized web services using Docker and Azure Kubernetes Service (AKS), provisioning infrastructure through Terraform, and serving static front-end assets via Azure Blob Storage and Azure CDN.
● Configured Azure Virtual Machines (VM Scale Sets) and Application Gateway with Load Balancer to host and scale backend APIs for high-availability workloads.
● Automated infrastructure and CI/CD pipelines using Python and PowerShell within Azure DevOps and GitHub Actions, covering unit tests, security scans, and blue-green deployments, with configurations managed via Azure Key Vault and App Configuration.
● Deployed inference workers using Azure Functions and containerized services to handle variable traffic, versioning model artifacts in Azure Container Registry (ACR) and Blob Storage.
● Integrated Azure API Management and Azure SQL Database with FastAPI microservices to expose secure REST endpoints and manage transactional data.
● Monitored distributed applications using Azure Monitor, Application Insights, and structured logging for proactive alerting and performance tuning.
● Applied cloud networking principles including Virtual Networks (VNet), NSGs, route tables, private endpoints, and ExpressRoute to design secure and scalable infrastructure.
● Coordinated model experiments using Azure ML and managed configuration versions and telemetry for A/B testing and drift monitoring.
Cloud & DevOps: Azure AKS, Azure ML, Azure DevOps, Azure Functions, ACR, Blob Storage, CDN, API Management, Azure SQL, Application Gateway, Key Vault, Terraform, Python, PowerShell

GCP:
● Containerized web services using Docker and Kubernetes, provisioning infrastructure via Terraform, and deploying workloads on Google Kubernetes Engine (GKE) and Cloud Run, serving static content through Cloud Storage and Cloud CDN.
● Configured Compute Engine instances with HTTP(S) Load Balancer and Managed Instance Groups to host and autoscale backend services for high-availability workloads.
● Automated infrastructure provisioning and CI/CD pipelines using Python and gcloud SDK scripts within Cloud Build and GitHub Actions, implementing test execution, security scans, and blue-green deployments with secrets managed in Secret Manager.
● Deployed model inference workers using Cloud Functions and Cloud Run for scalable, event-driven execution, with container images versioned in Artifact Registry and model artifacts stored in Cloud Storage.
● Integrated API Gateway and Cloud SQL (PostgreSQL) with FastAPI microservices to expose secure REST endpoints and persist transactional data with minimal latency.
● Monitored distributed microservices using Cloud Operations Suite (Stackdriver), enabling structured logging, distributed tracing, and proactive alerts for performance regressions.
● Applied knowledge of VPC design, firewall rules, Cloud NAT, Private Service Connect, and hybrid connectivity (Interconnect, VPN) to design secure, scalable, and compliant architectures.
● Integrated Vertex AI, PaLM API, and OpenAI endpoints for model orchestration, prompt lifecycle management, and centralized monitoring of token usage and latency metrics using Cloud Logging and Cloud Monitoring.
Cloud & DevOps: GKE, Cloud Run, Compute Engine, Vertex AI, PaLM API, Cloud Build, Cloud SQL PostgreSQL, API Gateway, Artifact Registry, Secret Manager, Cloud Logging, Cloud Monitoring, Terraform, Python
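A minimal sketch of pulling a pipeline credential from Secret Manager inside a Python deployment script, as mentioned in the GCP bullets above; the project ID and secret name are hypothetical.

```python
# Minimal sketch: fetch a secret version with the google-cloud-secret-manager client.
from google.cloud import secretmanager

def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")

if __name__ == "__main__":
    # Example usage inside a Cloud Build or GitHub Actions step (placeholder names).
    db_password = get_secret("my-gcp-project", "cloudsql-password")
```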
Fidelity
Fidelity manages large-scale investment, retirement, and wealth management platforms that serve millions of retail and institutional clients. As part of the Data Modernization and Analytics Enablement program, our team focused on building secure, self-service analytics tools and cloud-native pipelines to improve transparency across portfolio operations, compliance, and client insights. The initiative aimed to unify fragmented reporting workflows, automate validation of financial and operational data, and accelerate decision-making for investment managers and business analysts through real-time dashboards and governed APIs.

ReactJS
● Built dashboards using React 17, TypeScript 4.7, and React Router v6, integrating secure REST APIs and applying a reusable Material-UI v5 design system aligned to accessibility standards.
● Developed dynamic form workflows with Formik, Yup, and custom async validation hooks.
● Implemented caching and memoized selectors using React Query to reduce redundant API calls and improve render performance.
● Delivered self-service analytics and validation dashboards using Streamlit 1.x to reduce manual verification cycles.
● Wrote unit and integration tests with Jest and React Testing Library to increase UI coverage across forms, charts, and routing flows.
● Improved bundle performance with Webpack 5 and code splitting, reducing initial load time by 30%.
● Introduced ESLint and Prettier rules across teams to standardize code quality and enforce type safety through TypeScript generics.
● Collaborated with UX and QA to address accessibility gaps and color-contrast issues, ensuring full compliance with accessibility standards.
Frontend: React 17, TypeScript 4.7, React Router v6, React Query v3, Material-UI v5, Formik, Yup, Streamlit 1.x, Jest, React Testing Library, Storybook, WCAG 2.1 AA

Angular
● Maintained and refactored Angular 10 tools using RxJS, Angular Material, and shared design patterns for consistency across modules.
● Implemented reactive forms with validation services, input masking, and dynamic field behaviors.
● Optimized load times using lazy-loaded routes, PreloadAllModules, and performance profiling with Angular DevTools.
● Wrote unit and component tests with Karma, Jasmine, and Jest to validate UI states and reduce regression risk.
● Migrated legacy AngularJS pages into Angular 10 modules, removing jQuery dependencies and improving runtime consistency.
● Used RxJS Subjects and BehaviorSubjects to manage app-wide state and handle asynchronous data refresh cycles.
● Implemented routing guards, role-based component access, and global error interceptors to improve reliability.
● Reduced bundle size and improved Lighthouse scores by tree-shaking Angular Material imports and replacing redundant third-party utilities.
Frontend: Angular 10, TypeScript, Angular Material, RxJS 7.x, Reactive Forms, Jest, Karma, Jasmine

FastAPI / Flask
● Built Python 3.9 microservices using FastAPI, Flask, Pydantic v1.10, and SQLAlchemy to support dashboards, logging, and data services.
● Implemented async background jobs with Celery, Redis, and asyncio for enrichment, caching, and indexing workloads.
● Refactored and migrated legacy Java EE core APIs to FastAPI, consolidating business logic and improving latency under high load.
● Documented endpoints with Swagger/OpenAPI and integrated contract, load, and security tests into CI pipelines.
● Created shared libraries for input validation, exception handling, and authentication middleware to standardize behavior across services.
● Developed BFF-style REST APIs for frontend clients, consolidating multiple service calls and enforcing data shape consistency via Pydantic models.
● Optimized database queries and connection pooling using pgBouncer and async session management.
● Implemented lightweight caching with Redis and custom invalidation rules to reduce repeated data fetches across high-frequency endpoints.
Backend: Python 3.9, FastAPI 0.95, Flask 2.0, Pydantic v1.10, SQLAlchemy, Alembic, Uvicorn, Celery, Redis, asyncio, Pytest
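A minimal sketch of the Redis cache-aside-with-invalidation pattern described above, assuming redis-py; the key names, TTL, and data-access helper are hypothetical.

```python
# Minimal sketch: cache hot reads under a short TTL and delete the key after writes.
import json
import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
TTL_SECONDS = 120

def load_summary_from_db(portfolio_id: str) -> dict:
    # Placeholder for the real query layer (e.g., SQLAlchemy behind pgBouncer).
    return {"portfolio_id": portfolio_id, "positions": []}

def get_portfolio_summary(portfolio_id: str) -> dict:
    key = f"portfolio:{portfolio_id}:summary"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                      # cache hit
    summary = load_summary_from_db(portfolio_id)       # cache miss: fall back to DB
    cache.set(key, json.dumps(summary), ex=TTL_SECONDS)
    return summary

def invalidate_portfolio(portfolio_id: str) -> None:
    # Called after writes so the next read repopulates the cache.
    cache.delete(f"portfolio:{portfolio_id}:summary")
```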
Django / ExpressJS
● Developed Django REST Framework APIs using ORM models, serializers, and view sets for modular and maintainable service layers.
● Implemented async task orchestration using Celery with Redis, applying retries, chaining, and dead-letter handling for data syncs and ETL jobs.
● Built and maintained Node.js (Express) utilities for scheduling, synchronization, and event relay, improving interoperability with legacy systems.
● Enforced OAuth2, RBAC, and encrypted audit trails in Django middleware to ensure secure access and compliance.
● Optimized performance using pgBouncer, Redis caching, and Node.js clustering to handle high concurrency.
● Integrated PostgreSQL (JSONB) for flexible metadata storage and partial indexing for frequently queried fields.
● Developed health checks, readiness probes, and admin dashboards to monitor background tasks and queued events.
● Wrote unit tests and mock fixtures using Pytest and unittest.mock, achieving 85% coverage across serializers and view layers.
Backend: Python 3.9, Django 4.x, DRF, Node.js 18, Express 4, Celery, Redis, PostgreSQL, OAuth2, JWT, Pytest, Jest

Data
● Contributed to the development of scalable ingestion and transformation pipelines using AWS Glue, PySpark on EMR, and dbt, enabling robust data modeling and lineage tracking.
● Helped stream real-time events using Kafka, Kinesis, and RabbitMQ, delivering curated datasets into Snowflake with schema evolution and lineage visibility.
● Participated in enriching structured datasets through Python and dbt transformations to support analytics and reporting use cases.
● Assisted in integrating Palantir Foundry to unify fragmented datasets and build dashboards and predictive models for operational insights.
● Contributed to pipeline monitoring using Great Expectations and BI tools (Power BI/QuickSight), tracking data latency and validation metrics.
● Supported automation of partition compaction and clustering strategies in Snowflake to enhance batch query performance.
● Built and maintained Airflow DAGs to orchestrate ETL jobs, enabling SLA-based alerting and dynamic dependency resolution.
● Developed metadata ingestion scripts to sync external schemas into Glue Data Catalog and maintain lineage visibility.
Data & Analytics: AWS Glue, EMR (PySpark), dbt, Snowflake, Kafka, Kinesis, RabbitMQ, Palantir Foundry, Great Expectations, Power BI, QuickSight
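A minimal, Airflow 2.x-style sketch of an ETL DAG with an SLA and a simple dependency chain, as in the Data bullets above; the DAG name, schedule, and task callables are placeholder assumptions.

```python
# Minimal sketch: nightly extract -> transform -> load DAG with retries and an SLA.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): ...
def transform(): ...
def load(): ...

with DAG(
    dag_id="nightly_positions_etl",            # placeholder name
    start_date=datetime(2023, 1, 1),
    schedule_interval="0 2 * * *",              # nightly at 02:00
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "sla": timedelta(hours=1),              # hook for SLA-based alerting
    },
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```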
AWS
● Containerized microservices with Docker, provisioned infrastructure using Terraform and CloudFormation, and deployed workloads on EKS, ECS Fargate, and Lambda.
● Integrated API Gateway, ALB, and Aurora (PostgreSQL) with backend services for scalable, fault-tolerant execution.
● Automated CI/CD pipelines via GitHub Actions and Jenkins, adding blue-green deployment, rollback, and coverage gates.
● Configured S3 and CloudFront for versioned, controlled rollouts and cache control, ensuring zero-downtime releases.
● Monitored distributed systems using CloudWatch and Splunk, implementing structured JSON logs and trace propagation.
● Applied IAM, KMS, VPC endpoints, and WAF to meet security and compliance requirements.
● Implemented ECR for container image management, integrating vulnerability scans and retention policies.
● Developed infrastructure modules for multi-account deployment and parameterized Terraform workspaces.
● Used AWS Secrets Manager and Parameter Store to centralize configuration, reducing the risk of credential sprawl.
Cloud & DevOps: AWS EKS, ECS Fargate, Lambda, API Gateway, ALB, Aurora (PostgreSQL), S3, CloudFront, CloudWatch, Glue, EMR, Kinesis, Terraform, CloudFormation, GitHub Actions, Jenkins

Azure
● Deployed services on Azure Kubernetes Service (AKS) and Azure Functions, managing infrastructure with Terraform and Azure DevOps.
● Used Azure Data Factory and Databricks (PySpark) for ingestion and transformation pipelines integrating with Snowflake.
● Monitored workloads with Azure Monitor and Application Insights, enabling proactive detection of performance regressions.
● Configured Azure Key Vault, App Configuration, and Application Gateway for secure secrets management and routing control.
● Served React/Angular apps via Azure Blob Storage and Azure CDN with versioned rollouts and cache headers.
● Implemented CI/CD pipelines using GitHub Actions integrated with Azure DevOps, triggering container deployments to AKS clusters.
● Used Managed Identities for secure service-to-service authentication, removing reliance on static credentials.
● Built data processing notebooks on Azure Databricks for exploratory analysis and debugging ETL issues across staging environments.
Cloud & DevOps: Azure AKS, Azure Functions, ADF, Databricks, Azure Monitor, Application Insights, Key Vault, Application Gateway, Blob Storage, CDN, Terraform, Azure DevOps

GCP
● Deployed backend workloads to Cloud Run and Compute Engine, provisioning infrastructure through Terraform.
● Orchestrated PySpark jobs with Cloud Composer (Airflow) and processed datasets on Dataproc for reporting and analytics.
● Hosted React applications on Cloud Storage and Cloud CDN, optimizing caching and delivery across regions.
● Implemented monitoring and alerting through Cloud Logging and Cloud Monitoring for full-stack observability.
● Experimented with BigQuery and Vertex AI for large-scale data analysis and model experimentation.
● Configured Secret Manager and IAM roles for secure pipeline orchestration within CI/CD workflows.
● Implemented cross-project networking and private service connections for data exchange between GCP and AWS accounts.
● Used Artifact Registry to host versioned Docker images for reproducible builds across environments.
Cloud & DevOps: GCP Cloud Run, Compute Engine, Dataproc, Cloud Composer (Airflow), Cloud Storage, Cloud CDN, BigQuery, Vertex AI, Cloud Logging, Cloud Monitoring, Terraform

Security & Operations
● Participated in on-call rotations, triaging incidents and executing hotfix deployments to maintain uptime and SLA targets.
● Implemented structured JSON logging, correlation IDs, and trace propagation for full-stack observability across services.
● Secured APIs with OAuth2, OpenID Connect, and SAML SSO, and enforced RBAC policies tied to entitlements.
● Used AWS KMS, private VPC subnets, and WAF rules for encryption, isolation, and threat mitigation.
● Authored runbooks, postmortems, and onboarding guides in Confluence; collaborated in Agile Scrum with QA and product teams.
● Enhanced monitoring dashboards in Splunk to visualize latency spikes and dependency bottlenecks across microservices.
● Automated health checks and log scrapers to detect anomalies in job queues and ETL stages.
● Integrated alert routing with Slack and ServiceNow, ensuring timely response to production incidents.
Security & Monitoring: Splunk, OAuth2, OpenID Connect, RBAC, SAML SSO, KMS, VPC, WAF, CloudWatch, Confluence, Agile Scrum

Worked on a modernization program focused on digitizing public-health workflows and improving interoperability between clinical systems. Designed and developed internal web applications that streamlined provider registration, claims submission, and health-record validation using a secure API gateway. Implemented data integration between multiple state health systems, improving accuracy and reducing manual reconciliation. The solution automated recurring data exchange, simplified case tracking, and enhanced reporting for agency operations.

Contributed to the development of enterprise applications that automated financial workflows for billing, payments, and audit tracking across departments. Helped design APIs and batch processes to connect legacy mainframe systems with newer microservices for real-time data exchange. Developed dashboards and internal tools for transaction monitoring and performance analytics, improving operational efficiency. The system reduced manual processing and enabled faster, data-driven decision-making for financial operations.