POC: Build a Pizza Order Chatbot with n8n and LangChain
In this post you’ll learn how to build a proof-of-concept (POC) conversational ordering assistant using n8n and LangChain components. The workflow demonstrates how an AI agent can answer menu inquiries, accept orders, and check order status by connecting an OpenAI chat model, conversation memory, and simple HTTP endpoints for product and order management.
Why this POC matters
Businesses that want to automate simple customer interactions need a reliable, transparent, and auditable pipeline. Using n8n and LangChain nodes, you can prototype a conversational ordering system without heavy engineering. This approach keeps the logic visible in the workflow UI while leveraging OpenAI for natural language understanding.
High-level architecture
The POC workflow contains the following building blocks (visible in the template and diagram):
- When chat message received: chat trigger that starts the workflow when a user sends a message.
- AI Agent: LangChain agent configured with a system prompt (Pizzaro the pizza bot) that orchestrates tools and memory.
- Chat OpenAI: the language model used to generate responses and understand intent.
- Window Buffer Memory: short-term memory so the bot can keep context within the conversation window.
- Get Products: HTTP tool to fetch available menu items.
- Order Product: HTTP POST tool to create an order.
- Get Order: HTTP tool to retrieve the status of an existing order.
- Calculator: a simple tool to compute totals, apply conversions, or multiply quantities if needed.
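If you prefer to see the pattern in code, here is a minimal, self-contained TypeScript sketch of the agent + tools idea. It is not the actual LangChain or n8n API: a naive keyword check stands in for the LLM's tool-selection step, and the base URL, endpoint paths, and hard-coded order payload are assumptions that simply mirror the tools listed above.

```typescript
// Minimal, self-contained sketch of the agent + tools pattern.
// A naive keyword check stands in for the LLM's tool-selection step so the
// control flow is easy to see. Base URL and paths are assumptions.
const BASE_URL = "http://localhost:5678/webhook"; // assumption: local n8n-style webhooks

async function callTool(method: "GET" | "POST", path: string, body?: unknown): Promise<unknown> {
  const res = await fetch(`${BASE_URL}${path}`, {
    method,
    headers: { "Content-Type": "application/json" },
    body: body ? JSON.stringify(body) : undefined,
  });
  return res.json();
}

async function handleMessage(message: string): Promise<unknown> {
  const text = message.toLowerCase();
  if (text.includes("menu")) return callTool("GET", "/get-products");   // Get Products
  if (text.includes("status")) return callTool("GET", "/get-order");    // Get Order
  if (text.includes("order")) {
    // In the real workflow the agent extracts these fields from the conversation.
    return callTool("POST", "/order-product", { name: "Alice", productId: 1, quantity: 2 });
  }
  return "I can show the menu, take an order, or check an order's status.";
}
```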
Core design principles
1. Keep intent handling in the agent
The AI agent receives the user’s message and decides which tool to call (menu lookup, order creation, or order status). Use a clear system prompt that maps user intents to available tools. Example instructions used in the POC: “If a customer asks about the menu, provide the available products. If placing an order, confirm details and call the order tool.”
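The practical lever here is the tool description text: the agent reads it alongside the system prompt when deciding what to call. A hypothetical set of descriptions (the wording is mine, so adapt it to your menu and policies):

```typescript
// Hypothetical tool descriptions: the agent reads these, together with the
// system prompt, to decide which tool matches the user's intent.
const toolDescriptions = {
  get_products: "Use when the customer asks which pizzas are available or asks about prices.",
  order_product: "Use only after the customer has confirmed name, pizza type, and quantity.",
  get_order: "Use when the customer asks about the status of an existing order.",
} as const;
```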
2. Keep state small and accessible
Window Buffer Memory stores recent messages to provide context (e.g., previous clarifying questions). This prevents the need for a heavy database in small POCs while supporting multi-turn conversations.
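For reference, this is roughly what a window buffer does under the hood. The n8n Window Buffer Memory node handles it for you; the sketch below only illustrates the idea, and the window size of 5 exchanges is an assumption.

```typescript
// Illustrative sketch of a window buffer memory: keep only the last N exchanges.
type Turn = { role: "user" | "assistant"; content: string };

class WindowBufferMemory {
  private turns: Turn[] = [];
  constructor(private windowSize = 5) {}

  add(turn: Turn): void {
    this.turns.push(turn);
    const maxMessages = this.windowSize * 2; // one user + one assistant message per exchange
    if (this.turns.length > maxMessages) {
      this.turns = this.turns.slice(-maxMessages);
    }
  }

  // Context string prepended to the next model call.
  context(): string {
    return this.turns.map((t) => `${t.role}: ${t.content}`).join("\n");
  }
}

// Usage:
// const memory = new WindowBufferMemory();
// memory.add({ role: "user", content: "Show me the menu" });
```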
3. Use HTTP tools for integration
Get Products, Order Product, and Get Order are simple HTTP endpoints. In a production setup these would be backed by your catalog and order microservices, a Google Sheet, or a database. For POCs you can point these endpoints at n8n webhook receivers, mock endpoints, or simple serverless functions.
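To make the walkthrough below concrete, here is a small mock back end you could point the three HTTP tools at while prototyping. Express, port 3000, the /webhook/* paths, and the second menu item are all assumptions; swap in your real catalog and order services later.

```typescript
// Mock back end for the three HTTP tools, handy while prototyping.
import express from "express";

const app = express();
app.use(express.json());

const products = [
  { id: 1, name: "Margherita", price: 9.99 },
  { id: 2, name: "Pepperoni", price: 11.99 }, // sample data
];

type Order = { name: string; productId: number; quantity: number; status: string; date: string };
const orders = new Map<string, Order>();

app.get("/webhook/get-products", (_req, res) => {
  res.json({ products });
});

app.post("/webhook/order-product", (req, res) => {
  const { name, productId, quantity } = req.body;
  const orderId = `ORD-${1000 + orders.size + 1}`;
  const order: Order = { name, productId, quantity, status: "processing", date: new Date().toISOString().slice(0, 10) };
  orders.set(orderId, order);
  res.json({ orderId, status: order.status, date: order.date });
});

app.get("/webhook/get-order", (req, res) => {
  const orderId = String(req.query.orderId);
  const order = orders.get(orderId);
  if (!order) {
    res.status(404).json({ error: "Order not found" });
    return;
  }
  res.json({ orderId, ...order });
});

app.listen(3000, () => console.log("Mock pizza API listening on :3000"));
```

Run it with ts-node (or compile with tsc) and point the Get Products, Order Product, and Get Order tools at the corresponding http://localhost:3000/webhook/… URLs.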
Walkthrough: setting up the workflow
Step 1 — Create the Chat trigger
Add the “When chat message received” node. Configure the initial greeting and make the webhook public if you want easy testing. Example initial message used in the template:
Hellooo! 👋 My name is Pizzaro 🍕. I'm here to help with your pizza order. How can I assist you?
📣 INFO: If you’d like to order a pizza, please include your name + pizza type + quantity. Thank you!
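Once the webhook is public, you can smoke-test it outside the chat UI. The snippet below is only a guess at the contract: the URL is a placeholder and the chatInput/sessionId field names depend on your n8n version, so check the "When chat message received" node's docs if it doesn't respond.

```typescript
// Smoke test for the public chat webhook, outside the chat UI.
// Assumptions: the URL placeholder and the chatInput/sessionId field names.
const CHAT_WEBHOOK = "<your public chat webhook URL>";

async function sendChatMessage(message: string): Promise<void> {
  const res = await fetch(CHAT_WEBHOOK, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ chatInput: message, sessionId: "test-session-1" }),
  });
  console.log(await res.json());
}

sendChatMessage("Show me the menu").catch(console.error);
```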
Step 2 — Add the AI Agent
Connect the trigger to the AI Agent node. Configure the agent with a system prompt that defines behavior, such as:
Your name is Pizzaro, and you are an assistant for handling customer pizza orders.
1. If a customer asks about the menu, provide information on the available products.
2. If a customer is placing an order, confirm the order details, inform them that the order is being processed, and thank them.
3. If a customer inquires about their order status, provide the order date, pizza type, and quantity.
Also attach the Chat OpenAI node as the language model and the Window Buffer Memory node as memory input.
Step 3 — Add integration tools
Attach the following HTTP tools as agent tools:
- Get Products (GET) — returns catalog JSON. Useful when a user asks “What’s on the menu?”
- Order Product (POST) — accepts an order payload. The agent will POST to this endpoint once the user confirms order details.
- Get Order (GET) — fetches order details given an order id or user info.
- Calculator — optional helper for totals, discounts, or quantity math.
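For clarity, here is the same tool setup collected in one place, roughly the way you would fill in each HTTP tool node's method, URL, and parameter fields. The URLs match the mock server sketched earlier and are assumptions; point them at your real endpoints in production.

```typescript
// Hypothetical summary of the three HTTP tool configurations.
const httpTools = [
  { name: "get_products",  method: "GET",  url: "http://localhost:3000/webhook/get-products",  params: {} },
  { name: "order_product", method: "POST", url: "http://localhost:3000/webhook/order-product", params: { name: "string", productId: "number", quantity: "number" } },
  { name: "get_order",     method: "GET",  url: "http://localhost:3000/webhook/get-order",     params: { orderId: "string" } },
] as const;
```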
Step 4 — Wire responses and formatting
Design your endpoints to return structured JSON. Example responses:
GET /webhook/get-products
Response: { "products": [{"id": 1, "name": "Margherita", "price": 9.99}, ...] }
POST /webhook/order-product
Request: { "name": "Alice", "productId": 1, "quantity": 2 }
Response: { "orderId": "ORD-1001", "status": "processing", "date": "2025-09-01" }
Structured responses make it easier for the AI agent to extract fields and confirm order details back to the user (order date, type, quantity).
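If you write any glue code around these endpoints, pinning the response shapes down as types helps keep them consistent. A small sketch (the field names follow the examples above; the helper is illustrative):

```typescript
// Response shapes assumed throughout this post. Pinning them down as types keeps
// the endpoints consistent and makes it obvious which fields the bot can echo back.
interface Product { id: number; name: string; price: number }
interface ProductsResponse { products: Product[] }
interface OrderResponse { orderId: string; status: string; date: string }

// Example: turning an OrderResponse into the confirmation the bot sends back.
function confirmationMessage(order: OrderResponse, pizza: string, quantity: number): string {
  return `Your order ${order.orderId} for ${quantity} x ${pizza} is ${order.status} (placed on ${order.date}).`;
}
```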
Testing the flow
To test the system:
- Trigger the chat webhook and say “Show me the menu” — the agent should call Get Products and list available pizzas.
- Send an order: “My name is Sam. I want 2 Margheritas.” — the agent should parse details, confirm, call Order Product, and return a confirmation with order id and status.
- Ask for status: “What’s the status of my order ORD-1001?” — the agent should call Get Order and return date, pizza type, and quantity.
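It also helps to hit the HTTP tools directly, separately from the chat flow, so endpoint problems are easy to tell apart from prompt problems. Assuming the mock server sketched earlier is running on port 3000:

```typescript
// Exercise the HTTP tools directly before testing end-to-end through the agent.
const BASE = "http://localhost:3000/webhook";

async function run(): Promise<void> {
  const menu = await (await fetch(`${BASE}/get-products`)).json();
  console.log("menu:", menu);

  const orderRes = await fetch(`${BASE}/order-product`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: "Sam", productId: 1, quantity: 2 }),
  });
  const order = await orderRes.json();
  console.log("order:", order);

  const status = await (await fetch(`${BASE}/get-order?orderId=${order.orderId}`)).json();
  console.log("status:", status);
}

run().catch(console.error);
```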
Tips for improving accuracy
- Use a strict system prompt and examples to teach the agent when to call each tool.
- Return consistent JSON shapes from your HTTP tools so the agent can reliably parse fields.
- Validate user-provided data (names, quantities) and ask follow-up clarifying questions when fields are missing (see the validation sketch after this list).
- Limit the memory window to recent turns and relevant facts so stale context doesn't pull the model off track.
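A concrete version of the validation tip: collect the draft order, check what's missing, and turn each gap into a follow-up question. The field names here are assumptions.

```typescript
// Minimal validation of a draft order before calling the Order Product tool.
// Returns the follow-up questions the bot still needs to ask.
interface DraftOrder { name?: string; pizza?: string; quantity?: number }

function missingFields(draft: DraftOrder): string[] {
  const questions: string[] = [];
  if (!draft.name?.trim()) questions.push("What name should I put on the order?");
  if (!draft.pizza?.trim()) questions.push("Which pizza would you like?");
  if (!Number.isInteger(draft.quantity) || (draft.quantity ?? 0) < 1) {
    questions.push("How many pizzas would you like?");
  }
  return questions;
}

// missingFields({ name: "Sam", quantity: 2 }) -> ["Which pizza would you like?"]
```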
Security and production considerations
For production deployments:
- Protect webhooks with authentication (HMAC signatures, tokens) and limit public exposure (see the HMAC sketch after this list).
- Sanitize inputs before sending to back-end systems.
- Log transactions and maintain an order audit trail in persistent storage.
- Rate-limit API calls to OpenAI and your own endpoints to avoid unexpected costs.
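As an example of the first point, here is a minimal HMAC check using Node's built-in crypto module. The header name and secret handling are assumptions; align them with whatever your webhook caller actually sends.

```typescript
// Verify an HMAC-SHA256 signature on incoming webhook calls.
// Assumptions: the caller sends the hex signature in an "x-signature" header
// and both sides share WEBHOOK_SECRET.
import { createHmac, timingSafeEqual } from "node:crypto";

const WEBHOOK_SECRET = process.env.WEBHOOK_SECRET ?? "change-me";

export function isValidSignature(rawBody: string, signatureHex: string): boolean {
  const expected = createHmac("sha256", WEBHOOK_SECRET).update(rawBody).digest("hex");
  const a = Buffer.from(expected, "hex");
  const b = Buffer.from(signatureHex, "hex");
  // timingSafeEqual throws if the buffers differ in length, so check first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```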
Extending the POC
This POC is intentionally modular. Common extensions include:
- Payment integration (Stripe) after order confirmation.
- Delivery tracking by connecting courier APIs to Get Order.
- Personalization by storing customer preferences in a longer-term database.
- Analytics dashboards for order volume and popular items.
Wrap-up
This n8n + LangChain POC demonstrates how to quickly build a conversational order assistant that understands menu queries, takes orders, and checks order status. The visual nodes make it easy to iterate and share with stakeholders while the agent + tools pattern keeps responsibilities clear.
Ready to try the template?
Import the provided workflow into n8n, configure your OpenAI credentials, and point the HTTP tools to mock endpoints or your real services. If you want a starter template or help configuring endpoints, leave a comment below or reach out — I can share example webhook handler code or a sample Google Sheet integration.
Call to action: Import the template, test a few scenarios, and share your results. Want help customizing the system prompt for your business? Reply and I’ll help tailor it.
