HTTP Request node
Stripe node

Generate Stripe invoice and send it by email

Published 2 months ago

Template description

Generating Stripe invoices through the API can be tricky: it takes four separate steps to create an invoice and have it emailed to the customer.

This workflow creates Stripe invoices automatically and has Stripe email them to the customer.

How it works

To generate a Stripe invoice, you need to create a customer, specify the invoice items, create the invoice, and finalize it.

What should be a simple task involves multiple steps.

This workflow simplifies the process by providing everything pre-built for you.
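For reference, here is a rough Python sketch of those same four steps made directly against Stripe's REST API with the requests library. It mirrors what the workflow's HTTP Request and Stripe nodes do; the API key, customer details and amounts are placeholders.

```python
import requests

STRIPE_KEY = "sk_test_..."  # placeholder secret key
auth = (STRIPE_KEY, "")     # Stripe uses the secret key as the basic-auth username
base = "https://api.stripe.com/v1"

# 1. Create the customer the invoice will be sent to
customer = requests.post(f"{base}/customers", auth=auth, data={
    "email": "customer@example.com",
    "name": "Example Customer",
}).json()

# 2. Add one or more pending invoice items to that customer
requests.post(f"{base}/invoiceitems", auth=auth, data={
    "customer": customer["id"],
    "amount": 5000,          # amount in cents
    "currency": "usd",
    "description": "Consulting services",
})

# 3. Create the invoice; send_invoice makes Stripe email it instead of auto-charging
invoice = requests.post(f"{base}/invoices", auth=auth, data={
    "customer": customer["id"],
    "collection_method": "send_invoice",
    "days_until_due": 30,
}).json()

# 4. Finalize and send the invoice, which triggers Stripe's email to the customer
requests.post(f"{base}/invoices/{invoice['id']}/finalize", auth=auth)
requests.post(f"{base}/invoices/{invoice['id']}/send", auth=auth)
```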

Who is this for?

Anyone who wants to generate invoices automatically and send them to the customer's email.

Stripe will only send invoices to customers if you generate the invoice correctly through the API.

Check out my other templates

👉 https://n8n.io/creators/solomon/


More Finance workflow templates

HTTP Request node
Code node
+7

Build Your First AI Data Analyst Chatbot

Enhance your data analysis by connecting an AI Agent to your dataset, using n8n tools. This template teaches you how to build an AI Data Analyst Chatbot that is capable of pulling data from your sources, using tools like Google Sheets or databases. It's designed to be easy and efficient, making it a good starting point for AI-driven data analysis. You can easily replace the current Google Sheets tools with databases like Postgres or MySQL. How It Works The core of the workflow is the AI Agent. It's connected to different data retrieval tools to get data from Google Sheets (or your preferred database) in many different ways. Once the data is retrieved, the Calculator tool allows the AI to perform mathematical operations, making your data analysis precise. Who is this template for Data Analysts & Researchers: Pull data from different sources and perform quick calculations. Developers & AI Enthusiasts: Learn to build your first AI Agent with easy dataset access. Business Owners: Streamline your data analysis with AI insights and automate repetitive tasks. Automation Experts: Enhance your automation skills by integrating AI with your existing databases. How to Set Up You can find detailed instructions in the workflow itself. Check out my other templates 👉 https://n8n.io/creators/solomon/
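n8n wires this up visually, but the underlying pattern is ordinary LLM tool calling. A minimal Python sketch of the same idea, using OpenAI function calling with an in-memory stand-in for the Google Sheets tool (the sample rows, tool names and model choice are illustrative assumptions, not part of the template):

```python
import json
from openai import OpenAI  # assumes the official openai>=1.x client

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Stand-ins for the n8n tools: a data-retrieval tool and a calculator tool.
SHEET_ROWS = [
    {"month": "Jan", "revenue": 1200},
    {"month": "Feb", "revenue": 1550},
]

def get_sheet_rows() -> str:
    """Return the dataset the agent can analyse (hypothetical sample data)."""
    return json.dumps(SHEET_ROWS)

def calculate(expression: str) -> str:
    """Evaluate a simple arithmetic expression for the agent."""
    return str(eval(expression, {"__builtins__": {}}))  # demo only; not for untrusted input

tools = [
    {"type": "function", "function": {"name": "get_sheet_rows",
        "description": "Fetch all rows from the spreadsheet",
        "parameters": {"type": "object", "properties": {}}}},
    {"type": "function", "function": {"name": "calculate",
        "description": "Evaluate an arithmetic expression",
        "parameters": {"type": "object",
            "properties": {"expression": {"type": "string"}},
            "required": ["expression"]}}},
]

messages = [{"role": "user", "content": "What is the total revenue for Jan and Feb?"}]
while True:
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
    msg = reply.choices[0].message
    if not msg.tool_calls:
        print(msg.content)
        break
    messages.append(msg)
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments or "{}")
        result = get_sheet_rows() if call.function.name == "get_sheet_rows" else calculate(**args)
        messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
```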
solomon
Solomon
Google Sheets node
HTTP Request node
Merge node
+10

Invoice data extraction with LlamaParse and OpenAI

This n8n workflow automates the process of parsing and extracting data from PDF invoices. With this workflow, accounts and finance people can realise huge time and cost savings in their busy schedules. Read the Blog: https://blog.n8n.io/how-to-extract-data-from-pdf-to-excel-spreadsheet-advance-parsing-with-n8n-io-and-llamaparse/ How it works This workflow will watch an email inbox for incoming invoices from suppliers. It will download the attached PDFs and process them through a third-party service called LlamaParse. LlamaParse is specifically designed to handle and convert complex PDF data structures such as tables to markdown. Markdown is easier for LLM models to process, so the data extraction by our AI agent is more accurate and reliable. The workflow exports the extracted data from the AI agent to Google Sheets once the job completes. Requirements The criteria of the email trigger must be configured to capture emails with attachments. The Gmail label "invoice synced" must be created before using this workflow. A LlamaIndex.ai account to use the LlamaParse service. An OpenAI account to use GPT for AI work. Google Sheets to save the output of the data extraction process, although this can be replaced to suit your needs. Customizing this workflow This workflow uses Gmail and Google Sheets but these can easily be swapped out for equivalent services such as Outlook and Excel. Not using Excel? Simply redirect the output of the AI agent to your accounting software of choice.
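Outside of n8n, the same parse-then-extract step looks roughly like the sketch below, using the llama-parse and openai Python packages (API keys, the file name and the extracted field names are placeholders):

```python
from llama_parse import LlamaParse          # pip install llama-parse
from openai import OpenAI                   # pip install openai

# Convert the PDF invoice to markdown so the LLM gets clean table structure.
parser = LlamaParse(api_key="llx-...", result_type="markdown")  # placeholder key
documents = parser.load_data("invoice.pdf")                     # hypothetical local file
markdown = "\n\n".join(doc.text for doc in documents)

# Ask the model to pull out the fields we want as JSON.
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    response_format={"type": "json_object"},
    messages=[
        {"role": "system", "content": "Extract invoice_number, invoice_date, supplier, "
                                      "and total_amount from the invoice. Reply as JSON."},
        {"role": "user", "content": markdown},
    ],
)
print(response.choices[0].message.content)  # rows like this get appended to Google Sheets
```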
jimleuk
Jimleuk
HTTP Request node
Google Drive node
Google Calendar node
+9

Actioning Your Meeting Next Steps using Transcripts and AI

This n8n workflow demonstrates how you can summarise and automate post-meeting actions from video transcripts fed into an AI Agent. Save time between meetings by letting AI handle the chores of organising follow-up meetings and invites. How it works This workflow scans the calendar for client or team meetings which were held online. Attempts will be made to fetch any recorded transcripts, which are then sent to the AI agent. The AI agent summarises and identifies if any follow-on meetings are required. If found, the Agent will use its Calendar Tool to create the event for the time, date and place of the next meeting, as well as add known attendees. Requirements Google Calendar and the ability to fetch Meeting Transcripts (there is a special OAuth permission for this action!) OpenAI account for access to the LLM. Customising the workflow This example only books follow-on meetings but could be extended to generate reports or send emails.
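The Calendar Tool boils down to a standard Google Calendar API call. A rough Python sketch of the follow-up booking step, assuming OAuth credentials with the calendar scope are already stored in a token file (event details, attendees, time zone and file names are placeholders):

```python
import datetime
from google.oauth2.credentials import Credentials   # pip install google-auth google-api-python-client
from googleapiclient.discovery import build

# Load previously authorised credentials; token.json is a placeholder path.
creds = Credentials.from_authorized_user_file(
    "token.json", ["https://www.googleapis.com/auth/calendar"])
service = build("calendar", "v3", credentials=creds)

# In the template, these details come from the AI agent's summary of the transcript.
start = datetime.datetime(2024, 6, 12, 15, 0).isoformat()
end = datetime.datetime(2024, 6, 12, 15, 30).isoformat()
event = {
    "summary": "Follow-up: project kick-off",
    "start": {"dateTime": start, "timeZone": "Europe/London"},
    "end": {"dateTime": end, "timeZone": "Europe/London"},
    "attendees": [{"email": "teammate@example.com"}],
}

# sendUpdates="all" emails the invite to the attendees.
created = service.events().insert(calendarId="primary", body=event, sendUpdates="all").execute()
print(created.get("htmlLink"))
```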
jimleuk
Jimleuk
Webhook node
Respond to Webhook node
OpenAI node

Analyze tradingview.com charts with Chrome extension, N8N and OpenAI

This flow is supported by a Chrome plugin created with Cursor AI. The idea was to create a Chrome plugin and a backend service in N8N to do chart analytics with OpenAI. It's a good sample of how to submit a screenshot from the browser to N8N. Who is it for? N8N developers who want to learn about using a Chrome plugin, an N8N webhook and OpenAI. What opportunity does it present? This sample opens up a whole range of N8N-connected Chrome extensions that can analyze screenshots by using OpenAI. What this workflow does The workflow contains: a webhook trigger, an OpenAI node with GPT-4O-MINI and Analyze Image selected, and a response node to send back the text that was created after analysing the screenshot. All this is needed to talk to the Chrome extension which is created with Cursor AI. The idea is to visit the tradingview.com crypto charts, click the Chrome plugin and get back analytics about the shown chart in understandable language. This is driven by the N8N flow. With the new image analytics capabilities of OpenAI this opens up a world of opportunities. Requirements/setup OpenAI API key Cursor AI installed The Chrome extension. Download The N8N JSON code. Download How to customize it to your needs? Both the Chrome extension and N8N flow can be adapted to use on other websites. You can consider: analyzing a financial screen and asking questions about the data shown analyzing other charts extending the N8N workflow with other AI nodes With AI and image analytics the sky is the limit and in some cases it saves you from creating complex API integrations. Download Chrome extension
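As a rough illustration of the two halves — the extension posting a screenshot and the OpenAI node analysing it — here is a Python sketch using the requests and openai packages (the webhook URL, file name and prompt are placeholders, and the real extension does the upload in JavaScript):

```python
import base64
import requests
from openai import OpenAI

# What the Chrome extension does: POST the screenshot to the n8n webhook (placeholder URL).
with open("chart.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()
requests.post("https://your-n8n-host/webhook/chart-analysis", json={"image": image_b64})

# Roughly what the OpenAI node does: analyse the image with a vision-capable model.
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe the trend shown in this crypto chart in plain language."},
            {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)  # this text is returned to the extension
```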
thingsio
Hans Blaauw
HTTP Request node
+11

Build a Financial Documents Assistant using Qdrant and Mistral.ai

This n8n workflow demonstrates how to manage your Qdrant vector store when there is a need to keep it in sync with local files. It covers creating, updating and deleting vector store records, ensuring our chatbot assistant is never outdated or misleading. Disclaimer This workflow depends on local files accessed through the local filesystem and so will only work on a self-hosted version of n8n at this time. It is possible to amend this workflow to work on n8n cloud by replacing the local file trigger and read file nodes. How it works A local directory where bank statements are downloaded to is monitored via a local file trigger. The trigger watches for the file created, file changed and file deleted events. When a file is created, its contents are uploaded to the vector store. When a file is updated, its previous records are replaced. When the file is deleted, the corresponding records are also removed from the vector store. A simple Question and Answer Chatbot is set up to answer any questions about the bank statements in the system. Requirements A self-hosted version of n8n. Some of the nodes used in this workflow only work with the local filesystem. Qdrant instance to store the records. Customising the workflow This workflow can also work with remote data. Try integrating accounting or CRM software to build a managed system for payroll, invoices and more. Want to go fully local? A version of this workflow is available which uses Ollama instead. You can download this template here: https://drive.google.com/file/d/189F1fNOiw6naNSlSwnyLVEm_Ho_IFfdM/view?usp=sharing
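Under the hood, keeping a vector store in sync means deleting a file's old points and upserting fresh ones. A rough Python sketch of that pattern with the qdrant-client package, using OpenAI embeddings for brevity (collection name, payload keys and chunking are assumptions; the template itself builds this from n8n nodes and Mistral.ai):

```python
import uuid
from openai import OpenAI
from qdrant_client import QdrantClient
from qdrant_client.models import (Filter, FieldCondition, MatchValue,
                                  FilterSelector, PointStruct)

openai = OpenAI()
qdrant = QdrantClient(url="http://localhost:6333")  # local Qdrant instance
COLLECTION = "bank_statements"  # hypothetical collection; must already exist with a matching vector size

def _delete_records_for(path: str) -> None:
    """Remove all points whose payload says they came from this file."""
    qdrant.delete(
        collection_name=COLLECTION,
        points_selector=FilterSelector(filter=Filter(must=[
            FieldCondition(key="source", match=MatchValue(value=path))])),
    )

def sync_file(path: str, chunks: list[str]) -> None:
    """Handle the file-created / file-changed events: replace the file's records."""
    _delete_records_for(path)
    points = []
    for chunk in chunks:
        vector = openai.embeddings.create(
            model="text-embedding-3-small", input=chunk).data[0].embedding
        points.append(PointStruct(id=str(uuid.uuid4()), vector=vector,
                                  payload={"source": path, "text": chunk}))
    qdrant.upsert(collection_name=COLLECTION, points=points)

def remove_file(path: str) -> None:
    """Handle the file-deleted event: drop the file's records from the store."""
    _delete_records_for(path)
```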
jimleuk
Jimleuk
Google Sheets node
HTTP Request node
Google Drive node
+7

Invoices from Gmail to Drive and Google Sheets

Attachments Gmail to Drive and Google Sheets Description Automatically process invoice emails by saving attachments to Google Drive and extracting key invoice data to Google Sheets using AI. This workflow monitors your Gmail for unread emails with attachments, saves PDFs to a specified Google Drive folder, and uses OpenAI's GPT-4o to extract invoice details (date, description, amount) into a structured spreadsheet. Use cases Invoice Management: Automatically organize and track invoices received via email Financial Record Keeping: Maintain a structured database of all invoice information Document Organization: Keep digital copies of invoices organized in Google Drive Automated Data Entry: Eliminate manual data entry for invoice processing Resources Gmail account Google Drive account Google Sheets account OpenAI API key Setup instructions Prerequisites Active Gmail, Google Drive, and Google Sheets accounts OpenAI API key (GPT-4o model access) n8n instance with credentials manager Steps Gmail and Google Drive Setup: Connect your Gmail account in n8n credentials Connect your Google Drive account with appropriate permissions Create a destination folder in Google Drive for invoice storage Google Sheets Setup: Connect your Google Sheets account Create a spreadsheet with columns: Invoice date, Invoice Description, Total price, and Fichero Copy your spreadsheet ID for configuration OpenAI Setup: Add your OpenAI API key to n8n credentials Configure Email Filter: Update the email filter node to match your specific sender requirements Benefits Time Saving: Eliminates manual downloading, filing, and data entry Accuracy: AI-powered data extraction reduces human error Organization: Consistent file naming and storage structure Searchability: Creates a searchable database of all invoice information Automation: Runs every minute to process new emails as they arrive Related templates Email Parser to CRM Document Processing Workflow Financial Data Automation
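The extraction-plus-append step can be sketched in Python as follows, using OpenAI's JSON mode and the gspread package (the spreadsheet ID, credentials file and JSON keys are placeholders; the actual template does this with n8n's OpenAI and Google Sheets nodes):

```python
import json
import gspread              # pip install gspread
from openai import OpenAI

client = OpenAI()
gc = gspread.service_account(filename="service-account.json")  # placeholder credentials file
sheet = gc.open_by_key("YOUR_SPREADSHEET_ID").sheet1            # placeholder spreadsheet ID

def extract_and_log(invoice_text: str, filename: str) -> None:
    """Extract the tracked invoice fields and append them as one spreadsheet row."""
    result = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": 'Return JSON with keys "invoice_date", "description", "total_price".'},
            {"role": "user", "content": invoice_text},
        ],
    )
    fields = json.loads(result.choices[0].message.content)
    # Columns: Invoice date, Invoice Description, Total price, Fichero
    sheet.append_row([fields["invoice_date"], fields["description"],
                      fields["total_price"], filename])
```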
carlosgracia
Juan Carlos Cavero Gracia

More Product workflow templates

HTTP Request node
Microsoft Outlook node
+10

Create a Branded AI-Powered Website Chatbot

Create a Branded AI Website Chatbot Engage website visitors with an intelligent chat widget powered by OpenAI. This template includes: 💬 Natural conversation handling 📅 Microsoft Outlook calendar integration 📝 Lead capture and information gathering 🔄 Human handoff capabilities Simply add a JavaScript snippet to your website and configure the workflow to match your needs. Follow our detailed setup guide to get started in minutes. > Note: Widget includes a "Powered By" affiliate link
nocodecreative
Wayne Simpson
OpenAI Chat Model node

Chat with Postgresql Database

Who is this template for? This workflow template is designed for any professional seeking relevant data from a database using natural language. How it works Each time a user asks a question using the n8n chat interface, the workflow runs. The message is then processed by the AI Agent using the relevant tools - Execute SQL Query, Get DB Schema and Tables List, and Get Table Definition - if required. The Agent uses these tools to form and run the SQL queries necessary to answer the question. Once the AI Agent has the data, it uses it to form an answer and returns it to the user. Set up instructions Complete the Set up credentials step when you first open the workflow. You'll need PostgreSQL credentials and an OpenAI API key. Template was created in n8n v1.77.0
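The three tools map onto plain SQL against PostgreSQL's information schema. A rough Python sketch of what each tool runs, using psycopg2 with placeholder connection details (the real template issues these queries through n8n's Postgres node):

```python
import psycopg2  # pip install psycopg2-binary

# Connection details are placeholders.
conn = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="secret")

def list_tables() -> list[tuple]:
    """Roughly what the 'Get DB Schema and Tables List' tool runs."""
    with conn.cursor() as cur:
        cur.execute("""
            SELECT table_schema, table_name
            FROM information_schema.tables
            WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
        """)
        return cur.fetchall()

def table_definition(table_name: str) -> list[tuple]:
    """Roughly what the 'Get Table Definition' tool runs."""
    with conn.cursor() as cur:
        cur.execute("""
            SELECT column_name, data_type, is_nullable
            FROM information_schema.columns
            WHERE table_name = %s
        """, (table_name,))
        return cur.fetchall()

def execute_sql(query: str) -> list[tuple]:
    """The 'Execute SQL Query' tool: runs whatever SELECT the agent composed."""
    with conn.cursor() as cur:
        cur.execute(query)
        return cur.fetchall()
```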
kumohq
KumoHQ
HTTP Request node
Merge node
+13

AI Agent To Chat With Files In Supabase Storage

Video Guide I prepared a detailed guide explaining how to set up and implement this scenario, enabling you to chat with your documents stored in Supabase using n8n. Youtube Link Who is this for? This workflow is ideal for researchers, analysts, business owners, or anyone managing a large collection of documents. It's particularly beneficial for those who need quick contextual information retrieval from text-heavy files stored in Supabase, without needing additional services like Google Drive. What problem does this workflow solve? Manually retrieving and analyzing specific information from large document repositories is time-consuming and inefficient. This workflow automates the process by vectorizing documents and enabling AI-powered interactions, making it easy to query and retrieve context-based information from uploaded files. What this workflow does The workflow integrates Supabase with an AI-powered chatbot to process, store, and query text and PDF files. The steps include: Fetching and comparing files to avoid duplicate processing. Handling file downloads and extracting content based on the file type. Converting documents into vectorized data for contextual information retrieval. Storing and querying vectorized data from a Supabase vector store. File Extraction and Processing: Automates handling of multiple file formats (e.g., PDFs, text files), and extracts document content. Vectorized Embeddings Creation: Generates embeddings for processed data to enable AI-driven interactions. Dynamic Data Querying: Allows users to query their document repository conversationally using a chatbot. Setup N8N Workflow Fetch File List from Supabase: Use Supabase to retrieve the stored file list from a specified bucket. Add logic to manage empty folder placeholders returned by Supabase, avoiding incorrect processing. Compare and Filter Files: Aggregate the files retrieved from storage and compare them to the existing list in the Supabase files table. Exclude duplicates and skip placeholder files to ensure only unprocessed files are handled. Handle File Downloads: Download new files using detailed storage configurations for public/private access. Adjust the storage settings and GET requests to match your Supabase setup. File Type Processing: Use a Switch node to target specific file types (e.g., PDFs or text files). Employ relevant tools to process the content: For PDFs, extract embedded content. For text files, directly process the text data. Content Chunking: Break large text data into smaller chunks using the Text Splitter node. Define chunk size (default: 500 tokens) and overlap to retain necessary context across chunks. Vector Embedding Creation: Generate vectorized embeddings for the processed content using OpenAI's embedding tools. Ensure metadata, such as file ID, is included for easy data retrieval. Store Vectorized Data: Save the vectorized information into a dedicated Supabase vector store. Use the default schema and table provided by Supabase for seamless setup. AI Chatbot Integration: Add a chatbot node to handle user input and retrieve relevant document chunks. Use metadata like file ID for targeted queries, especially when multiple documents are involved. Testing Upload sample files to your Supabase bucket. Verify if files are processed and stored successfully in the vector store. Ask simple conversational questions about your documents using the chatbot (e.g., "What does Chapter 1 say about the Roman Empire?"). Test for accuracy and contextual relevance of retrieved results.
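For orientation, the chunk-embed-store portion can be sketched in Python with the supabase and openai packages; the table and column names below follow Supabase's common documents/pgvector setup and should be treated as assumptions, as should the chunk size and naive splitter:

```python
from openai import OpenAI
from supabase import create_client     # pip install supabase

openai = OpenAI()
supabase = create_client("https://YOUR-PROJECT.supabase.co", "SERVICE_ROLE_KEY")  # placeholders

def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Naive character-based splitter standing in for n8n's Text Splitter node."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

def store_document(file_id: str, text: str) -> None:
    """Embed each chunk and write it to the Supabase 'documents' vector table."""
    for chunk in chunk_text(text):
        embedding = openai.embeddings.create(
            model="text-embedding-3-small", input=chunk).data[0].embedding
        supabase.table("documents").insert({
            "content": chunk,
            "metadata": {"file_id": file_id},   # file ID kept for targeted retrieval
            "embedding": embedding,
        }).execute()
```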
lowcodingdev
Mark Shcherbakov
Google Sheets node
+5

🚀 Boost your customer service with this WhatsApp Business bot!

This n8n workflow demonstrates how to automate customer interactions and appointment management via a WhatsApp Business bot. After submitting a Google Form, the user receives a notification via WhatsApp. These notifications are sent via a template message. If the user sends a message to the bot, the text and user data are stored in Google Sheets. To reply back to the user, fill in the ReplyText column and change the Status to 'Ready'. In a few seconds n8n will fetch the unsent replies and deliver them one by one via the WhatsApp Business node. Customize this workflow to fit your specific needs, connect different online services and enhance your customer communication! 🎉 Setup Instructions To get this workflow up and running, you'll need to: 👇 Create a WhatsApp template message on the Meta Business portal. Obtain an Access Token and WhatsApp Business Account ID from the Meta Developers Portal. This is needed for the WhatsApp Business Node to send messages. Set up a WhatsApp Trigger node with App ID and App Secret from the Meta Developers Portal. Right after that, copy the WhatsApp Trigger URL and add it as a Callback URL in the Meta Developers Portal. This trigger is needed to receive incoming messages and their status updates. Connect your Google Sheets account for data storage and management. Check out the documentation page. ⚠️ Important Notes WhatsApp allows automatic custom text messages only within 24 hours of the last user message. Outside this time frame, only approved template messages can be sent. The workflow uses a Google Sheet to manage form submissions, incoming messages and prepare responses. You can replace these nodes and connect the WhatsApp bot with other systems.
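Sending an approved template notification boils down to one call against the WhatsApp Cloud API, which is what the WhatsApp Business node does for you. A minimal Python sketch with requests (access token, phone number ID, recipient and template name are all placeholders):

```python
import requests

ACCESS_TOKEN = "EAAG..."             # placeholder Meta access token
PHONE_NUMBER_ID = "123456789012345"  # placeholder WhatsApp Business phone number ID

# Send an approved template message (allowed even outside the 24-hour window).
resp = requests.post(
    f"https://graph.facebook.com/v19.0/{PHONE_NUMBER_ID}/messages",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "messaging_product": "whatsapp",
        "to": "4915112345678",  # recipient in international format (placeholder)
        "type": "template",
        "template": {"name": "form_received",        # hypothetical approved template name
                     "language": {"code": "en_US"}},
    },
)
print(resp.status_code, resp.json())
```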
eduard
Eduard
Google Sheets node
HTTP Request node
Markdown node
+7

✨ Vision-Based AI Agent Scraper - with Google Sheets, ScrapingBee, and Gemini

Important Notes: Check Legal Regulations: This workflow involves scraping, so ensure you comply with the legal regulations in your country before getting started. Better safe than sorry! Workflow Description: 😮‍💨 Tired of struggling with XPath, CSS selectors, or DOM specificity when scraping? This AI-powered solution is here to simplify your workflow! With a vision-based AI Agent, you can extract data effortlessly without worrying about how the DOM is structured. This workflow leverages a vision-based AI Agent, integrated with Google Sheets, ScrapingBee, and the Gemini-1.5-Pro model, to extract structured data from webpages. The AI Agent primarily uses screenshots for data extraction but switches to HTML scraping when necessary, ensuring high accuracy. Key Features: Google Sheets Integration: Manage URLs to scrape and store structured results. ScrapingBee: Capture full-page screenshots and retrieve HTML data for fallback extraction. AI-Powered Data Parsing: Use Gemini-1.5-Pro for vision-based scraping and a Structured Output Parser to format extracted data into JSON. Token Efficiency: HTML is converted to Markdown to optimize processing costs. This template is designed for e-commerce scraping but can be customized for various use cases.
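The screenshot-then-vision step can be sketched in Python as below, using ScrapingBee's screenshot parameters and the google-generativeai package (API keys, the target URL and the extraction prompt are placeholders; the template itself orchestrates this with n8n nodes and a Structured Output Parser):

```python
import requests
import google.generativeai as genai   # pip install google-generativeai

# 1. Full-page screenshot via ScrapingBee (API key and target URL are placeholders).
png_bytes = requests.get("https://app.scrapingbee.com/api/v1/", params={
    "api_key": "YOUR_SCRAPINGBEE_KEY",
    "url": "https://example.com/product-page",
    "screenshot": "true",
    "screenshot_full_page": "true",
}).content

# 2. Vision-based extraction with Gemini 1.5 Pro.
genai.configure(api_key="YOUR_GEMINI_KEY")
model = genai.GenerativeModel("gemini-1.5-pro")
response = model.generate_content([
    {"mime_type": "image/png", "data": png_bytes},
    "Extract the product name, price and availability from this page as JSON.",
])
print(response.text)  # the structured output would then be written back to Google Sheets
```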
dataki
Dataki

