Oct 20, 2025

Automate Crypto Price Recording with n8n

Automate Crypto Price Recording with n8n: Step-by-Step Learning Guide

This tutorial shows you how to build and understand a complete n8n workflow that automatically records cryptocurrency prices. You will learn how to receive price data via webhook, convert it into embeddings, store it in a Supabase vector database, validate it with a RAG agent, log approved records to Google Sheets, and get Slack alerts when something goes wrong.

What you will learn

By the end of this guide, you will be able to:

  • Configure an n8n webhook to receive crypto price updates
  • Split, embed, and store text data in a Supabase vector table
  • Use a RAG-style agent with Anthropic (or another LLM) to validate prices
  • Append clean, validated records to Google Sheets for analysis
  • Set up Slack alerts for workflow errors
  • Apply best practices for schema design, security, and cost optimization

Why automate crypto price recording?

Manually logging cryptocurrency prices is slow, inconsistent, and easy to mess up. Automation with n8n and a vector database gives you:

  • Reliable historical logs that are always captured in the same format
  • Semantic context around each price event using embeddings
  • Smarter queries using a RAG agent to search and interpret past data
  • Simple analysis in Google Sheets for dashboards or exports
  • Instant visibility into failures via Slack alerts

In practice, this means you can store more than just numbers. You can save rich context (such as notes or trade conditions) and then search for similar past situations when you see a new price event.

Prerequisites

Before you start, make sure you have:

  • An n8n instance (cloud or self-hosted)
  • A Cohere API key for generating embeddings
  • A Supabase project with a vector-enabled table
  • An Anthropic API key (or another compatible chat model credential)
  • A Google Sheets account with OAuth credentials configured in n8n
  • A Slack app token or bot token for sending alerts

Concepts and architecture

Core idea of the workflow

The workflow follows a simple pattern:

  1. Receive a new crypto price event
  2. Convert its text into embeddings and store it in Supabase
  3. Use a RAG agent to compare with historical data and validate the new event
  4. If valid, append it to a Google Sheet
  5. If something fails, notify a Slack channel

Key components used

The n8n template is built from these main nodes and services:

  • Webhook Trigger – receives POSTed price data
  • Text Splitter – breaks long text into smaller chunks
  • Cohere Embeddings – converts text into vectors
  • Supabase Vector Store – stores and queries embeddings
  • Window Memory and Vector Tool – provide context to the agent
  • Chat Model (Anthropic) with a RAG Agent – validates and decides what to log
  • Google Sheets Append – writes validated rows to a sheet
  • Slack Alert – sends error notifications

How the RAG agent fits in

RAG (Retrieval-Augmented Generation) combines a language model with a vector database. In this workflow:

  • Supabase stores embeddings of past price events
  • The Vector Tool retrieves similar historical entries for the current event
  • Window Memory keeps recent context across workflow runs
  • The Chat Model uses both to decide if the new price looks valid and what status it should have in your log

Step-by-step: building the n8n workflow

Step 1 – Create the Webhook Trigger

First, set up an HTTP POST webhook in n8n that will receive incoming price data.

  • Node type: Webhook
  • HTTP Method: POST
  • Path: /record-crypto-prices

External services like price feeds, trading bots, or custom scripts will send JSON payloads to this endpoint. A typical payload might look like:

{
  "symbol": "BTC",
  "price": 61500.12,
  "timestamp": "2025-10-20T12:34:56Z",
  "source": "exchange-api"
}

Security tip: In production, protect this webhook by:

  • Requiring a secret header or token
  • Restricting IPs or using an allowlist
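For local testing, you can construct a payload like this in Python before POSTing it to the webhook. The helper below is an illustrative sketch (build_price_payload is not part of the template); it simply mirrors the JSON shape shown above.

```python
import json
from datetime import datetime, timezone

def build_price_payload(symbol: str, price: float, source: str) -> dict:
    """Build a JSON-serializable payload matching the webhook's expected shape.
    (Illustrative helper, not part of the n8n template itself.)"""
    return {
        "symbol": symbol,
        "price": price,
        # UTC timestamp in the ISO 8601 "Z" form used in the example payload
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "source": source,
    }

payload = build_price_payload("BTC", 61500.12, "exchange-api")
body = json.dumps(payload)  # send this as the POST body to /record-crypto-prices
```

From here, any HTTP client (curl, requests, a trading bot) can POST `body` with a `Content-Type: application/json` header.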

Step 2 – Split long text with the Text Splitter

If your payload includes extra context such as trade notes, orderbook snapshots, or additional metadata, the content may be long. Embedding very long text can be inefficient and costly, so you use the Text Splitter node to break it up.

  • Node type: Text Splitter
  • Recommended settings:
    • chunkSize: 400
    • chunkOverlap: 40

These values are a good balance for many use cases. Smaller chunks usually give more precise matches when you later query the vector database for similar events.
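As a rough mental model of what the Text Splitter does with these settings, here is a character-based sketch in Python. n8n's actual splitter may differ in details such as separator handling, so treat this as an approximation.

```python
def split_text(text: str, chunk_size: int = 400, chunk_overlap: int = 40) -> list[str]:
    """Character-based splitter approximating chunkSize/chunkOverlap settings.
    Consecutive chunks share `chunk_overlap` characters of context."""
    step = chunk_size - chunk_overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

chunks = split_text("x" * 1000)  # 1000 chars -> 3 overlapping chunks
```

The overlap means a sentence that straddles a chunk boundary still appears whole in at least one chunk, which helps retrieval quality later.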

Step 3 – Generate embeddings with Cohere

Next, convert each chunk of text into a vector representation using Cohere.

  • Node type: Embeddings (Cohere)
  • Model: embed-english-v3.0

Each chunk becomes a numeric vector (an array of floats). These embeddings allow you to compute semantic similarity. For example, you can later ask: “Find previous BTC price snapshots with similar context to this one.”
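Semantic similarity between two such vectors is typically measured with cosine similarity, which is also what vector stores use under the hood. A minimal Python version:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors.
    1.0 means identical direction; 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

Two embeddings of similar price-event descriptions will score close to 1.0, while unrelated text scores near 0.0.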

Step 4 – Insert and query vectors in Supabase

Once you have embeddings, store them in a Supabase table that supports vector search.

  • Node types: Supabase Insert, Supabase Query (or combined operations depending on your setup)
  • Target table/index name (template): record_crypto_prices

A typical record structure looks like this:

{
  id: uuid(),
  symbol: "BTC",
  price: 61500.12,
  timestamp: "2025-10-20T12:34:56Z",
  source: "exchange-api",
  text: "BTC price 61500.12 from exchange-api",
  embedding: [float array]
}

In the workflow:

  • The Insert operation writes the new embedding and metadata to Supabase.
  • The Query operation retrieves the nearest vectors when the agent needs historical context, for example, to compare the current price to similar past events.
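Conceptually, the Query operation is a nearest-neighbor search over the stored vectors. Supabase/pgvector performs this server-side; the pure-Python sketch below shows the idea, using cosine distance and toy 2-dimensional embeddings in place of real 1024-dimensional ones.

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    """Cosine distance: 0.0 for identical direction, up to 2.0 for opposite."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def query_nearest(query_embedding: list[float], records: list[dict], k: int = 3) -> list[dict]:
    """Return the k stored records closest to the query embedding,
    mimicking what the vector store's Query operation returns."""
    return sorted(records, key=lambda r: cosine_distance(query_embedding, r["embedding"]))[:k]

history = [
    {"symbol": "BTC", "price": 61000.0, "embedding": [1.0, 0.0]},
    {"symbol": "BTC", "price": 30000.0, "embedding": [0.0, 1.0]},
    {"symbol": "ETH", "price": 2400.0, "embedding": [0.9, 0.1]},
]
nearest = query_nearest([1.0, 0.05], history, k=2)
```

The records returned here are what the agent later receives as historical context for the current price event.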

Step 5 – Configure Vector Tool and Window Memory

To let the RAG agent use your Supabase data, you expose the vector store as a tool and add memory.

  • Vector Tool: Wraps the Supabase vector store so the agent can run similarity searches as needed.
  • Window Memory: Keeps a short history of recent interactions or processing steps.

Together, they allow the agent to reason with:

  • Recent workflow context (from Window Memory)
  • Relevant historical records (from the Vector Tool)

Step 6 – Set up the Chat Model and RAG Agent

The heart of the workflow is the RAG agent, powered by a chat model such as Anthropic. It receives:

  • The new price payload from the webhook
  • Relevant past vectors from Supabase
  • Short-term context from Window Memory

Node types:

  • Chat Model (Anthropic or compatible LLM)
  • RAG Agent (or an agent node configured to use the vector tool and memory)

Configure the agent with a clear system message so it knows its job. For example:

You are an assistant for "Record Crypto Prices".
Validate price entries and prepare the row for Google Sheets with a Status column.
Flag anomalous prices and ensure price is numeric and within expected ranges.

The agent should be responsible for:

  • Checking if the incoming price is valid or suspicious
  • Enriching the record if needed (for example, adding notes)
  • Deciding whether the record should be logged to Google Sheets
  • Returning a structured status message (such as “approved”, “rejected”, or “needs_review”)
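If you want a deterministic fallback, or a concrete reference while writing the agent's prompt, those validation rules can be sketched in plain Python. EXPECTED_RANGES and the function name are illustrative assumptions, not part of the template.

```python
from datetime import datetime

# Illustrative sanity ranges per symbol; tune these to your own expectations.
EXPECTED_RANGES = {"BTC": (1_000, 500_000), "ETH": (50, 50_000)}

def validate_price_event(payload: dict) -> str:
    """Deterministic version of the checks the RAG agent is prompted to make.
    Returns "approved", "rejected", or "needs_review"."""
    price = payload.get("price")
    # Price must be numeric (bool is excluded because it subclasses int)
    if not isinstance(price, (int, float)) or isinstance(price, bool):
        return "rejected"
    # Timestamp must be a valid ISO 8601 string ("Z" mapped to an explicit offset)
    try:
        datetime.fromisoformat(str(payload.get("timestamp", "")).replace("Z", "+00:00"))
    except ValueError:
        return "rejected"
    # Price should be within a realistic range for the symbol
    low, high = EXPECTED_RANGES.get(payload.get("symbol"), (0, float("inf")))
    if not (low <= price <= high):
        return "needs_review"
    return "approved"
```

Running this before or alongside the agent gives you a cheap first-pass filter, so the LLM only has to reason about borderline cases.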

Step 7 – Append approved records to Google Sheets

Once the agent approves a record, the workflow appends a new row to a Google Sheet. This becomes your easy-to-use audit log.

  • Node type: Google Sheets – Append
  • Sheet name (template): Log

Map the following columns from the agent output and original payload:

  • Timestamp
  • Symbol
  • Price
  • Source
  • Status (for example, Approved, Rejected, Anomaly)

This structure makes it simple to:

  • Filter by symbol or date
  • Review only failed or suspicious entries
  • Export data to BI tools or dashboards
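The column mapping itself is a simple transformation. This sketch (to_sheet_row is a hypothetical helper) shows the row order matching the columns listed above:

```python
def to_sheet_row(payload: dict, status: str) -> list:
    """Map the original payload plus the agent's status onto the Log sheet's
    columns: Timestamp, Symbol, Price, Source, Status."""
    return [
        payload["timestamp"],
        payload["symbol"],
        payload["price"],
        payload["source"],
        status,
    ]

row = to_sheet_row(
    {"symbol": "BTC", "price": 61500.12,
     "timestamp": "2025-10-20T12:34:56Z", "source": "exchange-api"},
    "Approved",
)
```

In n8n you would express the same mapping with expressions in the Google Sheets node's column fields rather than code.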

Step 8 – Add Slack alerts for error handling

Errors are inevitable, so it is important to know when they happen. Attach an error path from the RAG Agent (or another critical node) to a Slack node.

  • Node type: Slack – Send Message
  • Channel: for example, #alerts

Use a message template like:

Record Crypto Prices error: {{ $json["error"]["message"] || "Unknown error" }}

This way, you are notified quickly when:

  • Supabase is unreachable
  • The LLM request fails
  • Google Sheets cannot be updated

Designing your Supabase schema

To support vector search, you need a Supabase table with a vector column. For Postgres with pgvector, you might use:

CREATE TABLE record_crypto_prices (
  id uuid PRIMARY KEY,
  symbol text,
  price numeric,
  timestamp timestamptz,
  source text,
  text text,
  embedding vector(1024)
);

Make sure the vector dimension matches your embedding model: Cohere's embed-english-v3.0 produces 1024-dimensional vectors, so the embedding column should be declared as vector(1024). If you switch models, update the dimension accordingly, and adjust column types if your use case needs additional metadata.

Accuracy, performance, and cost tips

To keep the workflow reliable and cost-effective, consider these practices:

  • Choose a suitable chunk size. Values between 300 and 800 characters usually work well. Smaller chunks increase precision but create more embeddings.
  • Deduplicate payloads. Before inserting into Supabase, check for identical or very similar payloads to avoid unnecessary vector bloat.
  • Archive or rotate old vectors. If storage costs grow, move older data to cheaper storage or a separate table, keeping only recent or high-value records in the main vector index.
  • Tune the RAG prompt. Be explicit about validation rules, such as:
    • Price must be numeric
    • Timestamp must be a valid ISO 8601 string
    • Price should be within a realistic range for the symbol
  • Log key events. Consider logging decisions or anomalies to a separate sheet or observability tool for easier debugging.
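The deduplication tip can be implemented by fingerprinting each payload before insert. A minimal sketch (function names are illustrative; in n8n you might keep the fingerprint in a Supabase column with a unique constraint instead of in memory):

```python
import hashlib
import json

def payload_fingerprint(payload: dict) -> str:
    """Stable hash of a payload: key order does not matter, so identical
    events always produce the same fingerprint."""
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

seen: set[str] = set()

def should_insert(payload: dict) -> bool:
    """Return True the first time a payload is seen, False for exact repeats."""
    fp = payload_fingerprint(payload)
    if fp in seen:
        return False
    seen.add(fp)
    return True
```

Exact-duplicate filtering like this is cheap; catching *near*-duplicates would require a similarity query against the vector store before inserting.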

Security and best practices

Since you are handling API keys and external calls, follow these guidelines:

  • Use n8n credentials. Store API keys and tokens in n8n’s credential manager instead of hard-coding them in nodes or expressions.
  • Secure the webhook. Use a secret header, token validation node, or IP restrictions to limit who can send data.
  • Restrict Google Sheets scopes. Grant only the minimum permissions needed to append rows to the target sheet.
  • Review Slack alert rules. Make sure you send alerts only for meaningful errors to avoid alert fatigue.

Testing and rollout checklist

Before using this workflow in production, walk through this checklist:

  1. Webhook test: Send sample JSON payloads to the webhook and verify that n8n receives them.
  2. Supabase insert verification: Confirm that new records and embeddings are inserted into the record_crypto_prices table.
  3. Similarity queries: Run a Supabase Query node to check that nearest-neighbor searches return sensible historical records.
  4. RAG edge cases: Test malformed or incomplete payloads, such as missing prices or invalid timestamps, and confirm the agent handles them as expected.
  5. Google Sheets logging: Check that rows are appended with correct values, including the Status column.
  6. Slack error simulation: Temporarily break a credential or node to trigger an error and verify that a Slack alert is sent.

Use cases and extensions

Once this pipeline is running, you can extend it beyond simple logging:

  • Semantic search for events. Ask questions like “Find similar BTC volatility spikes” or “Show events with large price jumps near FOMC announcements.”
  • Automated reporting. Use the RAG agent to summarize daily or weekly price movements and send a report to Slack or email.
  • Trading strategy inputs. Feed the cleaned, validated historical snapshots into backtesting tools or live trading systems.

Recap: how the template works

Here is the full flow in plain language:

  1. A service sends a crypto price payload to your n8n webhook.
  2. The workflow optionally splits long text, embeds it with Cohere, and stores the embedding plus metadata in Supabase.
  3. A RAG agent, backed by Anthropic and the Supabase vector store, checks the new data against historical context.
  4. If the record looks valid, the workflow appends it to a Google Sheet with a clear Status.
  5. If something fails along the way, a Slack message alerts you.

This pattern gives you a robust, auditable, and extensible crypto price recording system.

Quick FAQ

Do I have to use Cohere and Anthropic?

No. The template uses Cohere for embeddings and Anthropic for the chat model, but you can swap them for other providers as long as they integrate with n8n and you adjust the vector dimension and node configuration accordingly.

Can I log extra fields, such as volume or exchange name?