
Module 2 Lesson 1: How ChatGPT Processes Prompts
Understanding the technical journey of a prompt from text input to numerical processing and back to text.
11 articles

How are businesses actually using those big lists of numbers? In our final lesson of Module 3, we look at semantic search, recommendations, and the basics of RAG (Retrieval-Augmented Generation).

Embeddings aren't created by humans; they are learned by machines. In this lesson, we look at the intuition behind how LLMs build their conceptual map of the world.

How does a computer know that a 'King' is like a 'Queen' but not like a 'Kilometer'? In this lesson, we explore Embeddings: the mathematical heart of AI meaning (a short code sketch of this idea appears at the end of this overview).
Slicing the Data. Understanding how Bedrock breaks docs into 'Chunks' and turns them into 'Vectors' of meaning.
From Tokens to Embeddings. Understanding the mechanics of how a computer 'reads' meaning.
The Memory of AI. Understanding how we store and search 'Meaning' using embeddings and specialized databases.
The Math of Meaning. How to turn human words into a list of numbers that represent their semantic soul.
Choosing your engine. Comparing OpenAI cloud embeddings with local HuggingFace models for speed and privacy.
Hands-on: Build a local knowledge base using ChromaDB and perform semantic queries (a minimal sketch appears at the end of this overview).
Turning words into math. Understanding the 'Embeddings' that power local semantic search.
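
The 'King'/'Queen'/'Kilometer' comparison above can be made concrete in a few lines of Python. This is a minimal sketch rather than the course's own code: it assumes the sentence-transformers library and the all-MiniLM-L6-v2 model, which are just one common local choice for producing embeddings.

```python
# Minimal sketch: embed three words locally and compare them with cosine
# similarity. The model choice (all-MiniLM-L6-v2) is an illustrative assumption.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

words = ["king", "queen", "kilometer"]
vectors = model.encode(words)  # each word becomes a vector of a few hundred numbers

# Cosine similarity measures how closely two vectors point in the same direction.
print("king vs queen:    ", util.cos_sim(vectors[0], vectors[1]).item())
print("king vs kilometer:", util.cos_sim(vectors[0], vectors[2]).item())
# Expect the first score to be noticeably higher than the second.
```

The same pattern scales from single words to sentences and whole document chunks, which is what the chunking and vector-store lessons build on.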
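
For the hands-on ChromaDB item, the sketch below shows the general shape of the exercise under some assumptions: the collection name, sample documents, and query are placeholders, and the real lesson may plug in OpenAI or a specific HuggingFace embedding model instead of Chroma's built-in default.

```python
# Minimal sketch: an in-memory ChromaDB "knowledge base" with one semantic query.
# All names and texts here are illustrative placeholders.
import chromadb

client = chromadb.Client()  # in-memory; chromadb.PersistentClient(path=...) saves to disk
collection = client.create_collection(name="local_knowledge_base")

# Chroma embeds these documents with its default local embedding model.
collection.add(
    ids=["doc1", "doc2", "doc3"],
    documents=[
        "Embeddings turn text into vectors of numbers that capture meaning.",
        "RAG retrieves relevant chunks and passes them to the model as context.",
        "A kilometer is a unit of distance equal to one thousand meters.",
    ],
)

# The query matches on meaning, not exact keywords.
results = collection.query(
    query_texts=["How does retrieval augmented generation work?"],
    n_results=2,
)
print(results["documents"])
```

Running the query should typically surface the RAG document first, even though its wording differs from the question, which is exactly the point of semantic search over keyword search.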