ETL & Data Pipelines Templates
3 workflow templates for ETL & data pipeline automation
Very Quick Quickstart - n8n Template
This workflow retrieves customer data from a mock "Customer Datastore" node, then uses the "Edit Fields" node to extract three fields (id, name, and notes) and rename them to customer_id, customer_name, and customer_description. The workflow stops there; the reshaped data is meant to feed whatever nodes or services you attach downstream.
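For illustration, the mapping performed by the "Edit Fields" node is equivalent to the TypeScript sketch below. The record shape is an assumption (the mock datastore may return additional fields), and `transformCustomer` is a hypothetical helper, not part of the workflow:

```typescript
// Assumed shape of a record from the mock Customer Datastore.
interface CustomerRecord {
  id: string;
  name: string;
  notes: string;
}

// Target shape produced by the "Edit Fields" node.
interface TransformedCustomer {
  customer_id: string;
  customer_name: string;
  customer_description: string;
}

// Rename and restructure the fields, mirroring the Edit Fields mapping.
function transformCustomer(record: CustomerRecord): TransformedCustomer {
  return {
    customer_id: record.id,
    customer_name: record.name,
    customer_description: record.notes,
  };
}

// Example usage with a mock record.
const sample: CustomerRecord = { id: "42", name: "Ada Lovelace", notes: "VIP customer" };
console.log(transformCustomer(sample));
```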
Send Location Updates of the ISS Every Minute to a Table ...
This workflow sends position updates of the ISS to a table in Google BigQuery every minute.

- **Cron node:** Triggers the workflow every minute.
- **HTTP Request node:** Makes a GET request to `https://api.wheretheiss.at/v1/satellites/25544/positions` to fetch the current position of the ISS, then passes the response on to the next node.
- **Set node:** Ensures that only the fields set in this node get passed on to the following nodes.
- **Google BigQuery node:** Sends the data from the previous node to the `position` table in Google BigQuery. If you created a table with a different name, use that name instead.
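As a rough sketch, the same pipeline could be expressed outside n8n as the TypeScript below. It assumes Node 18+ (for global `fetch`) and the `@google-cloud/bigquery` client; the response field names (`latitude`, `longitude`, `timestamp`), the array-shaped response, and the `iss` dataset name are assumptions to adapt to your setup:

```typescript
import { BigQuery } from "@google-cloud/bigquery";

const API_URL = "https://api.wheretheiss.at/v1/satellites/25544/positions";

async function recordIssPosition(): Promise<void> {
  // HTTP Request step: fetch the current ISS position.
  // The endpoint is assumed to return an array of position objects.
  const response = await fetch(API_URL);
  const positions = (await response.json()) as Array<Record<string, unknown>>;
  const position = positions[0];

  // Set step: keep only the fields we want to store.
  const row = {
    latitude: position.latitude,
    longitude: position.longitude,
    timestamp: position.timestamp,
  };

  // BigQuery step: stream the row into the `position` table.
  // The `iss` dataset name is a placeholder; substitute your own.
  const bigquery = new BigQuery();
  await bigquery.dataset("iss").table("position").insert([row]);
}

// Cron step: run once a minute for the lifetime of the process.
setInterval(() => recordIssPosition().catch(console.error), 60_000);
```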
PDF to SQL Server - n8n Template
This workflow automates extracting data from a PDF uploaded to Google Drive, converting it to JSON with an AI agent, and inserting the extracted data into both a SQL Server database and a Supabase vector store. Text splitting and embeddings prepare the document content for the AI agent.
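Of these steps, the text splitting is the most mechanical. Below is a minimal sketch of chunking with overlap, as commonly used to prepare text for embedding; `splitText` is a hypothetical helper, and the chunk size and overlap values are illustrative rather than taken from the workflow:

```typescript
// Split text into fixed-size chunks that overlap, so adjacent
// chunks share context when embedded independently.
function splitText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    const end = Math.min(start + chunkSize, text.length);
    chunks.push(text.slice(start, end));
    if (end === text.length) break;
    // Step forward by chunkSize minus overlap.
    start = end - overlap;
  }
  return chunks;
}

// Each chunk would then be embedded and stored in the Supabase vector
// store, while the structured JSON fields go to SQL Server.
const chunks = splitText("...long text extracted from the PDF...");
console.log(`${chunks.length} chunk(s) ready for embedding`);
```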