This 6-hour intensive syllabus is designed to take participants from the foundational
mechanics of Natural Language Processing (NLP) to intermediate and advanced applications of Generative AI. By consolidating key topics into focused, one-hour modules,
learners will explore the LLM ecosystem, Retrieval-Augmented Generation (RAG), AI
Agents, Multimodal Vision, and Parameter-Efficient Fine-Tuning (PEFT).
Course Syllabus
6 modules • 15 topics
Module 1: Foundations of Generative AI & NLP (1 Hour)
2 topics
Gateway to Generative AI: Exploring New Frontiers
Decoding Language: The Mechanics of NLP and LLMs
Module 2: The LLM Ecosystem & Tooling (1 Hour)
3 topics
Open-Source AI Powerhouse: Exploring Hugging Face
Harnessing LLM APIs and the OpenAI Ecosystem
AI Tools: Building, Moderating, and Leveraging AI
Module 3: Context-Aware Systems & LangChain (1 Hour)
2 topics
Intro to LangChain: Enhancing LLMs with Contextual Memory
Dynamic AI with RAG: Building Context-Aware Systems
Module 4: Advanced RAG & Vector Stores (1 Hour)
2 topics
Understanding Embeddings
Advanced RAG Techniques and Vector Stores
Module 5: The Age of AI Agents (1 Hour)
3 topics
The Final Level of Gen AI: Agents
Agents Evolution: Connect to the Web and Integrate Tools
Professional Experience
Ingenero Technologies (for SABIC, Saudi Aramco)
As a Data Scientist, I specialized in developing predictive models for industrial applications. My primary focus was on optimizing boiler operations for our client, SABIC.
Work Performed:
Developed and deployed a Generalized Linear Model (GLM) to accurately forecast NOx emissions, achieving an 80% reduction in forecast errors through advanced data preprocessing and feature engineering.
Built robust forecasting models that achieved a Mean Absolute Percentage Error (MAPE) below 6% consistently over six months.
Created a live assessment tool using predictive analytics to identify and categorize critical parameters, which helped the team resolve the top three causes of operational delays.
Tools & Technologies: Python, SQL, Predictive Modeling, Machine Learning, AWS, SageMaker, Scikit-Learn, Statistical Analysis.
Transorg Analytics
During my time as a Data Analyst, I delivered data-driven solutions for clients in hospitality, finance, and manufacturing, focusing on enhancing sales, optimizing inventory, and reducing fraud.
Work Performed:
For IHCL (Taj Hotels), I analyzed and improved a recommendation system, boosting additional menu item sales by 25% through effective cross-selling and upselling strategies. I also created custom Business Intelligence (BI) dashboards to facilitate their data-driven marketing decisions and improve operational efficiency across their properties.
For Dalmia Cement, I executed demand forecasting by using advanced algorithms, which optimized inventory levels and enhanced strategic planning across more than 10 of their distribution centers.
For the RBL Bank Fraud Analytics Department, I designed and proposed rules for a new banking dashboard to monitor credit card activity, helping to prevent over $500,000 in potential annual fraud losses.
Tools & Technologies: SQL, Python, Advanced Forecasting Algorithms, Business Intelligence (BI) Dashboards, Data Analysis.
The Hackett Group
Americold: Outbound Pallet Prediction with Macroeconomic Indicators
This project involved creating a sophisticated time-series forecasting tool to predict outbound pallet volumes for Americold, a leader in cold storage logistics. The model's key innovation was its ability to incorporate macroeconomic indicators as external factors to improve prediction accuracy.
Project Details:
Interactive Dashboard: Built a user-friendly web application using Streamlit that allows users to upload data, configure model parameters, select target variables, and visualize forecasting results in real-time.
Advanced Forecasting Model: Implemented the SARIMAX (Seasonal AutoRegressive Integrated Moving Average with eXogenous regressors) model to capture seasonality, trends, and the influence of external economic factors on pallet volume.
Automated Model Optimization: The solution features an automated grid search capability to find the optimal SARIMAX parameters. It evaluates numerous model candidates and selects the best one based on a composite score that balances accuracy (R², MAPE) and model complexity (AIC/BIC).
Performance & Caching: To ensure speed and efficiency, the application uses an SQLite-based caching system to store model results, preventing re-computation for identical requests. It also leverages parallel processing (joblib) to accelerate the model search.
Tools & Technologies: Python, Streamlit, Pandas, Statsmodels, Plotly, Scikit-Learn, SQLite.
Z-Brain: Agentic Flow Builder Architecture
I designed the architecture for the Z-Brain Flow Builder, a core feature for creating and automating complex workflows using AI agents. The system allows users to build structured processes from natural language queries by leveraging predefined components, knowledge bases, and functions.
Architectural Highlights:
Microservice-Based Design: The system is built on a microservices architecture, where a user's query is processed through a series of specialized services, including a prompt-improving library and an intent classifier.
Intent-Driven Workflow: An Intent Classifier interprets the user's goal (e.g., Plan, Build, Update) and routes the request to the appropriate sub-flow.
Modular & Composable: The builder dynamically assembles workflows by combining different Flows (e.g., Plan Flow, Build Flow), accessing information from Knowledge Bases (like Pieces JSON KB), and executing Functions (such as Router or Code). This modularity makes the system highly flexible and scalable.
Tools & Concepts: Agentic AI, Microservices Architecture, Intent Classification, RAG (Retrieval-Augmented Generation), Knowledge Base Integration, JSON.
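The intent-driven routing can be shown with a deliberately simplified sketch. The real Z-Brain classifier is a service in its own right (not keyword matching), and the keyword table and flow stubs below are hypothetical placeholders for the Plan/Build/Update sub-flows.

```python
from typing import Callable, Dict

# Hypothetical keyword table standing in for the Intent Classifier service.
INTENT_KEYWORDS = {
    "plan": ["plan", "design", "outline"],
    "build": ["build", "create", "make"],
    "update": ["update", "modify", "change"],
}

def classify_intent(query: str) -> str:
    """Map a natural-language query to an intent label."""
    q = query.lower()
    for intent, words in INTENT_KEYWORDS.items():
        if any(w in q for w in words):
            return intent
    return "plan"  # default sub-flow when no intent matches

# Each intent routes to a dedicated sub-flow (Plan Flow, Build Flow, ...);
# real sub-flows would assemble components from the Knowledge Bases.
FLOWS: Dict[str, Callable[[str], str]] = {
    "plan": lambda q: f"Plan Flow handling: {q}",
    "build": lambda q: f"Build Flow handling: {q}",
    "update": lambda q: f"Update Flow handling: {q}",
}

def route(query: str) -> str:
    return FLOWS[classify_intent(query)](query)

print(route("Build a Slack notification workflow"))
```

In the microservice design described above, `classify_intent` and each entry in `FLOWS` would be separate services behind a queue or API gateway rather than in-process functions.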
Additional Projects
Ninjacart - Computer Vision: Developed a Convolutional Neural Network (CNN) for image classification, achieving 91.1% test accuracy by using data augmentation and transfer learning with ResNet-152V2.
Zee - Movie Recommender System: Engineered a hybrid movie recommender system using collaborative filtering and content-based methods. The system was deployed on Streamlit Cloud and served over 10,000 users.
Advanced Question Answering: Created an ensemble of Large Language Models (ALBERT, RoBERTa) that achieved a 95.10% F1-score on a question-answering task, surpassing the baseline by 3.8%.
Custom GPT Chatbot: Deployed a multi-modal chatbot platform supporting up to four LLMs simultaneously. It featured a RAG system and integrated capabilities for voice (Whisper), text-to-speech, and image generation (DALL-E).
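The hybrid approach behind the Zee recommender — blending collaborative and content-based signals — can be sketched with item-item cosine similarity. The rating matrix, genre vectors, and blend weight `alpha` below are toy values invented for illustration, not the deployed system.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: a user-item rating matrix (collaborative signal) and binary
# genre vectors (content signal); shapes and values are illustrative only.
ratings = rng.integers(1, 6, size=(5, 8)).astype(float)   # 5 users x 8 movies
ratings *= rng.random((5, 8)) < 0.4                       # ~60% left unrated (0)
genres = rng.integers(0, 2, size=(8, 4)).astype(float)    # 8 movies x 4 genres

def cosine_sim(M):
    """Pairwise cosine similarity between the rows of M."""
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    norms[norms == 0] = 1.0        # avoid division by zero for empty rows
    U = M / norms
    return U @ U.T

# Blend item-item similarities from both signals with a tunable weight.
alpha = 0.6
hybrid_sim = alpha * cosine_sim(ratings.T) + (1 - alpha) * cosine_sim(genres)

def recommend(user, k=3):
    scores = ratings[user] @ hybrid_sim   # similarity weighted by past ratings
    scores[ratings[user] > 0] = -np.inf   # exclude already-rated movies
    return np.argsort(scores)[::-1][:k]

print(recommend(0))
```

A deployed system at this scale would typically precompute `hybrid_sim` offline and serve `recommend` behind a cache, which is consistent with the Streamlit Cloud deployment described above.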