MCP Course #8 - Build MCP SSE Client & Server, Deploy on Google Cloud

Get the code here:
MCP Client - https://theailanguage.com/onlySubscribers?id=mcp_client&site=github
MCP Server - https://theailanguage.com/onlySubscribers?id=terminal_server&site=github
Note - Please subscribe, allow pop-ups, and then log in to The AI Language website to access our GitHub repos. Access is available only to our YouTube subscribers. Thanks!

Video for setting up the project on GCP:
https://youtu.be/AfarS5kOk-U?si=EGGRWVlluYo2p5x-&t=270

Udemy Course (get a completion certificate, practice questions, Q&A):
https://www.udemy.com/course/modelcontextprotocol/?referralCode=6FADE0F85C5DB97203C6

CHAPTERS
00:00 Introduction
00:29 Preview of what we'll build
03:20 Objective
03:49 What is MCP?
05:00 STDIO vs SSE Connection
06:01 Setup project directories
07:37 Install Python & UV
12:33 Setup environment
15:01 MCP Server Code Walkthrough
23:58 MCP Client Code Walkthrough
38:24 Dockerfile walkthrough for MCP Server
39:20 Test Local MCP Server and Client
41:57 Build, Push and Deploy MCP Server to Google Cloud
44:24 Test Google Cloud Run Remote MCP Server and Client

Build & Deploy an SSE-Based MCP Client and Server with Python, Docker, and Google Cloud Run

In this step-by-step video, we build a full MCP (Model Context Protocol) server and client setup using Server-Sent Events (SSE) for real-time communication. We go from setting up the local dev environment all the way to deploying the MCP server on Google Cloud Run using Docker, Uvicorn, Starlette, and the FastMCP framework. Whether you're building AI-powered tools, LLM agents, or custom dev workflows, this guide is your starting point!

📚 What You'll Learn

✅ Overview of MCP
What is the Model Context Protocol?
How MCP clients communicate with servers
Key building blocks: @mcp.tool() decorators, FastMCP, SseServerTransport

🔁 STDIO vs SSE
Understand the pros and cons of both connection methods
Why SSE is ideal for long-lived, shared server deployments (e.g., Cloud Run)

🛠️ Environment Setup
Create a Python project using uv (a blazing-fast Python package manager)
Set up project directories for terminal_server_sse.py, client_sse.py, and requirements.txt

📦 MCP Server Implementation (a server sketch is included at the end of this description)
Use FastMCP to register tools such as add_numbers(a: float, b: float) and run_command(command: str)
Create an async Starlette app to expose the /sse and /messages/ endpoints
Run the server locally with uvicorn

🧪 MCP Client Code (a client sketch is included at the end of this description)
Use httpx-sse to connect to the MCP server over SSE
Send tool call requests and listen for streamed responses
Test real-time communication locally

🐳 Docker + Cloud Deployment
Write a minimal Dockerfile based on python:3.11-slim
Build and tag your Docker image for GCP:
docker build --platform linux/amd64 --no-cache -t gcr.io/your-project-id/mcp-sse-server .
docker push gcr.io/your-project-id/mcp-sse-server
Deploy to Google Cloud Run:
gcloud run deploy mcp-sse-server \
  --image gcr.io/your-project-id/mcp-sse-server \
  --platform managed \
  --region asia-south1 \
  --port 8081 \
  --allow-unauthenticated

🌐 Remote Testing
Run the MCP client and point it at the Cloud Run server:
python client_sse.py https://your-cloud-run-url/sse

📦 Python Packages Used
starlette
uvicorn[standard]
httpx, httpx-sse
mcp (the Model Context Protocol Python SDK, or your own custom lib)

🙌 Like, Share & Subscribe
👍 If you found this helpful, give it a like!
📢 Share with other developers or AI enthusiasts
🔔 Subscribe for more AI + backend dev content every week!

#python #MCP #CloudRun #Uvicorn #Starlette #Docker #AIAgents #SSE #ServerSentEvents #BackendDevelopment #LLMTools #FastMCP #GoogleCloud
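
Server sketch (for reference). This is a minimal illustration of the SSE server described above, assuming the official mcp Python SDK (FastMCP, SseServerTransport) plus Starlette and uvicorn. The tool bodies, the file name terminal_server_sse.py, and port 8081 follow the video, but this is a hedged sketch, not the exact course code - grab the real repo from the links above.

# terminal_server_sse.py - minimal SSE-based MCP server sketch
import subprocess

import uvicorn
from mcp.server.fastmcp import FastMCP
from mcp.server.sse import SseServerTransport
from starlette.applications import Starlette
from starlette.responses import Response
from starlette.routing import Mount, Route

mcp = FastMCP("terminal-server")

@mcp.tool()
def add_numbers(a: float, b: float) -> float:
    """Add two numbers and return the result."""
    return a + b

@mcp.tool()
def run_command(command: str) -> str:
    """Run a shell command and return its combined output."""
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout + result.stderr

# SSE transport: clients open a stream on /sse and POST messages to /messages/
sse = SseServerTransport("/messages/")

async def handle_sse(request):
    # request._send is the raw ASGI send callable the transport writes to
    async with sse.connect_sse(request.scope, request.receive, request._send) as (read_stream, write_stream):
        await mcp._mcp_server.run(
            read_stream, write_stream, mcp._mcp_server.create_initialization_options()
        )
    # Return an empty response once the stream closes so Starlette is satisfied
    return Response()

app = Starlette(
    routes=[
        Route("/sse", endpoint=handle_sse),
        Mount("/messages/", app=sse.handle_post_message),
    ]
)

if __name__ == "__main__":
    # Cloud Run expects the container to listen on the deployed --port (8081 here)
    uvicorn.run(app, host="0.0.0.0", port=8081)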
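
Client sketch (for reference). A matching illustration of the client side, using the mcp SDK's sse_client helper (which uses httpx-sse under the hood) so the session handles the JSON-RPC details. The file name client_sse.py, the URL argument, and the add_numbers tool call mirror the video; the actual course code may differ.

# client_sse.py - minimal SSE-based MCP client sketch
import asyncio
import sys

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main(server_url: str) -> None:
    # Open the SSE stream (GET /sse) and the write channel (POST /messages/)
    async with sse_client(url=server_url) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()

            # List the tools the server registered with @mcp.tool()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

            # Call a tool and print the streamed-back result
            result = await session.call_tool("add_numbers", {"a": 2, "b": 3})
            print("add_numbers(2, 3) ->", result.content)

if __name__ == "__main__":
    # Usage: python client_sse.py http://localhost:8081/sse
    #    or: python client_sse.py https://your-cloud-run-url/sse
    asyncio.run(main(sys.argv[1]))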