
pipedream
Integration platform for event-driven automations
Pipedream is an integration platform for developers.
Pipedream provides a free, hosted platform for connecting apps and developing event-driven automations. The platform has over 1,000 fully-integrated applications, so you can use pre-built components to quickly send messages to Slack, add a new row to Google Sheets, and more. You can also run any Node.js, Python, Golang, or Bash code when you need custom logic. Pipedream has demonstrated SOC 2 compliance and can provide a SOC 2 Type 2 report upon request (please email support@pipedream.com).
This repo contains:
- The code for all pre-built integration components
- The product roadmap
- The Pipedream docs
- And other source code related to Pipedream.
This README explains the key features of the platform and how to get started.
To get support, please visit https://pipedream.com/support.
Key Features
- Workflows - Workflows run automations. Workflows are sequences of steps - pre-built actions or custom Node.js, Python, Golang, or Bash code - triggered by an event (an HTTP request, a timer, a new row added to a Google Sheet, and more).
- Event Sources - Sources trigger workflows. They emit events from services like GitHub, Slack, Airtable, RSS and more. When you want to run a workflow when an event happens in any third-party app, you're using an event source.
- Actions - Actions are pre-built code steps that you can use in a workflow to perform common operations across Pipedream's 1,000+ API integrations. For example, you can use actions to send email, add a row to a Google Sheet, and more.
- Custom code - Most integrations require custom logic. Code is often the best way to express that logic, so Pipedream allows you to run any Node.js, Python, Golang, or Bash code. You can import any package from the languages' package managers, connect to any Pipedream connected app, and more. Pipedream is "low-code" in the best way: you can use pre-built components when you're performing common actions, but you can write custom code when you need to.
- Destinations - Deliver events asynchronously to common destinations like Amazon S3, Snowflake, HTTP and email.
- Free - No fees for individual developers (see limits)
Demo
Watch a brief demo on YouTube.
Workflows
Workflows are sequences of linear steps triggered by an event (like an HTTP request, or when a new row is added to a Google Sheet). You can quickly develop complex automations using workflows and connect to any of our 1,000+ integrated apps.
See our workflow quickstart to get started.
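Code steps in a workflow use the same component shape shown throughout this README. For orientation, here is a minimal Node.js code step that simply returns the trigger's event data (a sketch of the default scaffold; steps.trigger.event refers to the workflow's trigger):

export default defineComponent({
  async run({ steps, $ }) {
    // "steps" exposes data exported by earlier steps;
    // the trigger's event is the raw payload that started the workflow
    return steps.trigger.event;
  },
});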
Event Sources
Event Sources watch for new data from services like GitHub, Slack, Airtable, RSS and more. When a source finds a new event, it emits it, triggering any linked workflows.
You can also consume events emitted by sources using Pipedream's REST API or a private, real-time SSE stream.
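For example, you could page through recent events for a source from your own code. The snippet below is only a sketch: the endpoint path, query parameters, and response shape are assumptions here, and the source ID is a placeholder, so consult the REST API docs for the authoritative reference.

// Sketch only: endpoint path and response shape are assumptions; see the REST API docs.
const sourceId = "dc_abc123"; // placeholder source ID
const res = await fetch(
  `https://api.pipedream.com/v1/sources/${sourceId}/event_summaries?limit=10`,
  { headers: { Authorization: `Bearer ${process.env.PIPEDREAM_API_KEY}` } },
);
const { data } = await res.json();
for (const event of data) {
  console.log(event);
}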
When a pre-built source doesn't exist for your use case, you can build your own. Here is the simplest event source: it exposes an HTTP endpoint you can send any request to, and prints the contents of the request when invoked:
export default {
  name: "http",
  version: "0.0.1",
  props: {
    http: "$.interface.http",
  },
  run(event) {
    console.log(event); // event contains the method, payload, etc.
  },
};
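Sources can also poll on a schedule instead of listening for HTTP requests. Below is a hypothetical timer-based source (a sketch, not a published component) that emits one event each time it runs; a real polling source would fetch new items from an API and pass each item's stable id so Pipedream can deduplicate events:

export default {
  name: "example-timer-source", // hypothetical name
  version: "0.0.1",
  dedupe: "unique",
  props: {
    timer: {
      type: "$.interface.timer",
      default: { intervalSeconds: 15 * 60 }, // run every 15 minutes
    },
  },
  async run() {
    const ts = Date.now();
    // A real source would fetch new records here and emit one event per record
    this.$emit(
      { message: "timer fired" },
      { id: ts, summary: "Timer fired", ts },
    );
  },
};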
You can find the code for all pre-built sources in the components directory. If you find a bug or want to contribute a feature, see our contribution guide.
Actions
Actions are pre-built code steps that you can use in a workflow to perform common operations across Pipedream's 1,000+ API integrations. For example, you can use actions to send email, add a row to a Google Sheet, and more.
You can create your own actions, which you can re-use across workflows. You can also publish actions to the entire Pipedream community, making them available for anyone to use.
Here's an action that accepts a name as input and returns a greeting:
export default {
  name: "Action Demo",
  description: "This is a demo action",
  key: "action_demo",
  version: "0.0.1",
  type: "action",
  props: {
    name: {
      type: "string",
      label: "Name",
    },
  },
  async run() {
    return `hello ${this.name}!`;
  },
};
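The run method also receives a context object. As an illustration only, a small variation on the action above could set a human-readable summary for the step with $.export:

export default {
  name: "Action Demo",
  description: "This is a demo action",
  key: "action_demo",
  version: "0.0.2",
  type: "action",
  props: {
    name: {
      type: "string",
      label: "Name",
    },
  },
  async run({ $ }) {
    // "$summary" is displayed as the step's result summary in the workflow builder
    $.export("$summary", `Greeted ${this.name}`);
    return `hello ${this.name}!`;
  },
};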
You can find the code for all pre-built actions in the components directory. If you find a bug or want to contribute a feature, see our contribution guide.
Custom code
Most integrations require custom logic. Code is often the best way to express that logic, so Pipedream allows you to run custom code in a workflow using:
- Node.js
- Python
- Golang
- Bash
You can import any package from the languages' package managers by declaring the imports directly in code. Pipedream will parse and download the necessary dependencies.
// Node.js
import axios from "axios";

# Python
import pandas as pd

// Go
import (
  "fmt"
  pd "github.com/PipedreamHQ/pipedream-go"
)
You can also connect to any Pipedream connected app in custom code steps. For example, you can connect your Slack account and send a message to a channel:
import { WebClient } from "@slack/web-api";

export default defineComponent({
  props: {
    // This creates a connection called "slack" that connects a Slack account.
    slack: {
      type: "app",
      app: "slack",
    },
  },
  async run({ steps, $ }) {
    const web = new WebClient(this.slack.$auth.oauth_access_token);
    return await web.chat.postMessage({
      text: "Hello, world!",
      channel: "#general",
    });
  },
});
Destinations
Destinations, like actions, abstract the connection, batching, and delivery logic required to send events to services like Amazon S3, or targets like HTTP and email.
For example, sending data to an Amazon S3 bucket is as simple as calling $send.s3():
$send.s3({
  bucket: "your-bucket-here",
  prefix: "your-prefix/",
  payload: event.body,
});
Pipedream supports the following destinations:
- HTTP
- Email
- Amazon S3
- Snowflake
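The same pattern applies to the other destinations. For example, forwarding the event to an external HTTP endpoint might look like the sketch below; the URL is a placeholder, and the full set of options is covered in the Destinations docs.

$send.http({
  method: "POST",
  url: "https://example.com/webhook", // placeholder endpoint
  data: event.body,
});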
Contributors
Thank you to everyone who has contributed to the Pipedream codebase. We appreciate you!
Pricing
Pipedream has a generous free tier. You can run sources and workflows for free within the limits of the free tier. If you hit these limits, you can upgrade to one of our paid tiers.
Limits
The Pipedream platform imposes some runtime limits on sources and workflows. Read more about those in our docs.
Found a Bug? Have a Feature to suggest?
Before adding an issue, please search the existing issues or reach out to our team to see if a similar request already exists.
If an issue exists, please add a reaction or add a comment detailing your specific use case.
If an issue doesn't yet exist and you need to create one, please use the issue templates.
Security
You can read about our platform security and privacy here.
If you'd like to report a suspected vulnerability or security issue, or have any questions about the security of the product, please contact our security team at security@pipedream.com.