Integrating OpenRouter with Gumloop: Custom Node Tutorial

April 5, 2025
5 min read
Learn how to build a custom Gumloop Node that leverages the OpenRouter API to effortlessly generate text completions using any supported OpenRouter text-based model.

This straightforward tutorial will walk you through creating a custom node that connects Gumloop's powerful workflow automation with OpenRouter's extensive collection of AI models. By the end, you'll have a flexible integration that works with any text-based model available on OpenRouter.

Step-by-Step Guide

Step 1: Insert Your OpenRouter API Key

First, you'll need to get your API key from OpenRouter and add it to the code. Replace the placeholder with your actual API key:

api_key = "INSERT OPENROUTER API KEY HERE"
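Hardcoding the key is fine for a quick test, but if you're running the script locally (or your environment supports it), loading the key from an environment variable keeps it out of your code. A minimal sketch — `OPENROUTER_API_KEY` is just a conventional variable name, not something the API requires:

```python
import os

# Prefer an environment variable over a hardcoded key; fall back to the
# placeholder so the node still fails loudly if neither is set properly.
api_key = os.environ.get("OPENROUTER_API_KEY", "INSERT OPENROUTER API KEY HERE")
```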

Step 2: Specify Your Model

You'll need the correct model identifier from the OpenRouter models page. Click the clipboard icon next to a model's name to copy its identifier.

Example: Google Gemini 2.5 Pro Preview uses the identifier google/gemini-2.5-pro-preview-03-25
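If you'd rather discover identifiers programmatically, OpenRouter also exposes a models endpoint (https://openrouter.ai/api/v1/models). The sketch below shows how you might pull the `id` fields out of its JSON response — the sample payload here is hand-written and trimmed for illustration, not a live response:

```python
def extract_model_ids(models_response):
    """Pull the model identifiers out of a /api/v1/models response body."""
    return [m["id"] for m in models_response.get("data", [])]

# Trimmed, illustrative example of the response shape
sample = {
    "data": [
        {"id": "google/gemini-2.5-pro-preview-03-25", "name": "Google: Gemini 2.5 Pro Preview"},
        {"id": "openai/chatgpt-4o-latest", "name": "OpenAI: ChatGPT-4o"},
    ]
}

model_ids = extract_model_ids(sample)
```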

Step 3: Custom Gumloop Node Code

Here's the complete Python script for your custom Gumloop node:

def main(Text, params):
    import json
    import requests

    # Gumloop passes the node's parameter values in the params dict
    prompt = params.get("Prompt")
    model = params.get("Model")

    api_key = "INSERT OPENROUTER API KEY HERE"

    if not prompt or not model:
        raise ValueError("Both Prompt and Model must be provided")

    api_url = "https://openrouter.ai/api/v1/chat/completions"

    # The Prompt parameter becomes the system message;
    # the node's Text input becomes the user message
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": prompt},
            {"role": "user", "content": Text}
        ]
    }

    try:
        response = requests.post(
            api_url,
            headers={
                "Authorization": f"Bearer {api_key}",
                "Content-Type": "application/json",
                # Optional headers that identify your app on OpenRouter
                "HTTP-Referer": "https://gumloop.com",
                "X-Title": "Gumloop OpenRouter Integration"
            },
            json=payload,
            timeout=120  # don't let a stalled request hang the workflow
        )

        response.raise_for_status()

        result = response.json()

        # Chat models return choices[0].message.content;
        # fall back to choices[0].text for completion-style responses
        if result.get("choices") and len(result["choices"]) > 0:
            if "message" in result["choices"][0]:
                output = result["choices"][0]["message"].get("content", "")
            else:
                output = result["choices"][0].get("text", "")
        else:
            raise ValueError("No completion was generated")

    except requests.exceptions.RequestException as e:
        raise Exception(f"API request failed: {str(e)}")
    except (KeyError, json.JSONDecodeError) as e:
        raise Exception(f"Failed to parse API response: {str(e)}")

    return output
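To make the parsing step concrete, here's a standalone sketch of the same extraction logic run against a hand-written sample following the OpenAI-compatible response schema (the `id` value is a placeholder, not a real generation ID):

```python
def extract_completion(result):
    """Mirror of the node's parsing: prefer message.content, fall back to text."""
    if result.get("choices") and len(result["choices"]) > 0:
        choice = result["choices"][0]
        if "message" in choice:
            return choice["message"].get("content", "")
        return choice.get("text", "")
    raise ValueError("No completion was generated")

# Illustrative sample, shaped like a chat-completions response
sample_response = {
    "id": "gen-abc123",  # placeholder
    "model": "google/gemini-2.5-pro-preview-03-25",
    "choices": [
        {"message": {"role": "assistant", "content": "Hello from OpenRouter!"}}
    ]
}

output = extract_completion(sample_response)  # "Hello from OpenRouter!"
```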

Node Configuration

When setting up your custom node in Gumloop, use the following configuration:

Inputs

  • Text (string type)

Outputs

  • Text (string type)

Parameters

  • Text input field for the prompt
  • Model selection with default value: openai/chatgpt-4o-latest

Quick Tips

API Key Security: Never commit or share your API key publicly. For anything beyond a quick test, load it from an environment variable or secrets manager rather than hardcoding it in the node.

Model Flexibility: Easily switch models by updating the model name.

Text-Based Models: This setup works great with text-only models, offering quick integration and solid performance.

What You Can Do With This Integration

Once you've set up your custom OpenRouter node in Gumloop, you can:

  • Access hundreds of AI models from a single integration point
  • Switch between different models without changing your workflow structure
  • Compare outputs from different models side-by-side
  • Optimize costs by selecting the most cost-effective model for each task
  • Build sophisticated multi-model workflows with minimal code
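Because the request body only differs by the model field, comparing models amounts to building one payload per identifier and posting each to the same endpoint. A small sketch of that payload-building step (the model list and prompt here are placeholders):

```python
def build_payloads(models, prompt, text):
    """Create one chat-completions payload per model; messages stay identical."""
    return [
        {
            "model": model,
            "messages": [
                {"role": "system", "content": prompt},
                {"role": "user", "content": text},
            ],
        }
        for model in models
    ]

payloads = build_payloads(
    ["openai/chatgpt-4o-latest", "google/gemini-2.5-pro-preview-03-25"],
    "Summarize the input in one sentence.",
    "Gumloop is a visual workflow automation platform.",
)
```

Each payload can then be sent with the same POST call shown in the node code, letting you collect and compare outputs side by side.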

Try It Out

Give it a try and see how this integration can enhance your Gumloop workflows! The combination of Gumloop's visual workflow builder and OpenRouter's extensive model selection creates a powerful platform for AI automation.

Need Help with Custom Integrations?

We specialize in building custom workflow automations and API integrations that help businesses leverage AI effectively.

Let's Talk