HubSpot Custom LLM Workflow Actions: Step-by-Step Guide
Using HubSpot custom LLM workflow actions, you can connect powerful language models to your workflows, send prompts with CRM data, and route AI-generated responses back into your records or automations with full control.
This guide walks you through the complete process: from enabling the integration and creating prompts, to using responses in other workflow actions.
What Are HubSpot Custom LLM Workflow Actions?
Custom LLM workflow actions let you send data from a workflow to an external large language model (LLM) and capture the response directly in HubSpot. Each use of the action triggers a request to your configured AI provider.
These actions are designed for:
- Generating or transforming content based on CRM records.
- Summarizing long notes or emails.
- Classifying or extracting structured information.
- Powering downstream workflow branches and actions.
You configure the integration once, then reuse the action across workflows.
Requirements for Using HubSpot Custom LLM Actions
Before building workflows with LLMs in HubSpot, confirm the following:
- You have access to workflows that support custom actions.
- Your integration partner or private app is set up to connect with an LLM provider.
- You understand how your LLM provider handles and stores data.
The custom LLM action uses URL-based callbacks to send prompts and receive responses through your integration or app.
How HubSpot Sends Data to the LLM
When a workflow enrolls a record and reaches your custom LLM step, HubSpot sends an HTTP request to the callback URL defined by your integration or private app.
The request includes:
- Prompt text configured in the workflow action.
- Any selected record properties (tokens) merged into the prompt.
- Configuration values such as temperature or max tokens, if supported by your integration.
Your external service takes this data, forwards it to the LLM provider, and waits for the model to respond.
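The receiving side of this exchange can be sketched as a small parsing step. The payload shape below (an `inputFields` object plus record `properties`) is an assumption for illustration, not HubSpot's documented schema — consult your app's actual callback contract:

```python
import json

def extract_prompt_request(body):
    """Parse a hypothetical callback payload delivered when a record
    reaches the custom LLM step. Field names here are illustrative,
    not HubSpot's exact schema."""
    payload = json.loads(body)
    inputs = payload.get("inputFields", {})
    return {
        "prompt": inputs.get("prompt", ""),
        "record": payload.get("object", {}).get("properties", {}),
        "temperature": inputs.get("temperature"),
    }

# Example payload a workflow step might deliver (shape assumed):
example = json.dumps({
    "inputFields": {"prompt": "Summarize this deal", "temperature": "0.2"},
    "object": {"properties": {"dealname": "Acme renewal", "amount": "12000"}},
})
parsed = extract_prompt_request(example)
```

Your service would then pass `parsed["prompt"]` (with any merged record values) to the LLM provider.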
How the LLM Response Returns to HubSpot
After the LLM provider processes the prompt, your integration sends a response back to HubSpot using the callback URL. The response body must match the data schema defined for the custom LLM action.
Typical response data could include:
- Generated text (for example, an email draft or summary).
- Structured fields (for example, sentiment, category, or score).
- Error messages if the LLM request fails.
HubSpot then makes these outputs available as tokens in the same workflow, so you can use them in subsequent actions or branches.
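A minimal sketch of assembling that response body follows. The output field names (`generated_text`, `sentiment_score`, `has_error`, `error_message`) are examples — they must match whatever output schema your own action defines:

```python
def build_action_response(generated_text, sentiment, error=None):
    """Assemble the response body returned through the callback.
    Output field names are illustrative and must match the output
    schema defined for your custom action."""
    return {
        "outputFields": {
            "generated_text": generated_text,
            "sentiment_score": sentiment,
            "has_error": error is not None,
            "error_message": error or "",
        }
    }

response = build_action_response("Thanks for reaching out!", 0.87)
```

Including an explicit error flag and message makes failures visible to workflow builders instead of silently producing empty tokens.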
Setting Up HubSpot Custom LLM Actions
The core setup work happens in your integration or private app, which defines how HubSpot talks to the LLM provider.
1. Configure the Integration or Private App
- Create or update your app to expose a custom workflow action that supports LLM prompts.
- Define the callback URL your service will use to receive prompt data from HubSpot.
- Implement the logic that takes the incoming request, calls the LLM, and returns the formatted response.
- Ensure authentication and authorization are correctly configured between HubSpot, your app, and the LLM provider.
For detailed technical requirements and payload formats, refer to HubSpot's official developer documentation on custom workflow actions.
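The four setup steps above can be exercised end to end with a small handler sketch. The LLM call is stubbed so the flow runs offline, and the request/response field names are assumptions rather than HubSpot's documented schema:

```python
import json

def call_llm(prompt, temperature=0.2):
    """Stub for your LLM provider call; replace with a real client.
    Returns canned text so the flow can be tested offline."""
    return "[model output for: " + prompt[:40] + "]"

def handle_callback(request_body):
    """Sketch of the callback endpoint: parse the incoming payload,
    call the LLM, and return the formatted response. Field names
    are assumptions, not HubSpot's documented schema."""
    payload = json.loads(request_body)
    prompt = payload.get("inputFields", {}).get("prompt", "")
    completion = call_llm(prompt)
    return json.dumps({"outputFields": {"generated_text": completion}})

result = json.loads(handle_callback(json.dumps(
    {"inputFields": {"prompt": "Draft a renewal email"}})))
```

In production this function would sit behind an authenticated HTTPS endpoint at the callback URL you registered.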
2. Define the Action Input Fields
In your app definition, specify what users can configure when they add the custom LLM action to a workflow in HubSpot:
- Prompt field (plain text or rich text).
- Optional configuration settings (for example, model name, tone, temperature).
- Record tokens allowed in the prompt (for example, contact first name, deal amount).
Every input you define becomes editable by users inside the workflow editor.
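As a sketch, an input-field definition might look like the structure below. The key names loosely mirror common field-definition schemas but are assumptions — check your app platform's documentation for the exact format it expects:

```python
# Illustrative input-field definitions for the custom LLM action.
# Key names and types are assumptions, not HubSpot's exact schema.
input_fields = [
    {"name": "prompt", "label": "Prompt", "type": "string",
     "required": True},
    {"name": "tone", "label": "Tone", "type": "enumeration",
     "options": ["formal", "friendly", "concise"], "required": False},
    {"name": "temperature", "label": "Temperature", "type": "number",
     "required": False},
]
```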
3. Define the Output Fields
Next, configure output fields that represent what the LLM will return. These are what workflow builders can use later in the automation.
Common output examples:
- Generated response text.
- Title, subject line, or call-to-action.
- Classification label or score.
- Boolean flags indicating success or error.
Outputs are exposed to HubSpot as tokens, which downstream actions and branches can reference directly.
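A matching sketch for the output side is below; each entry becomes a token that later workflow actions can reference. As with the input fields, the names and structure are illustrative assumptions:

```python
# Illustrative output-field definitions; each becomes a workflow
# token. Names and types are assumptions, not an exact schema.
output_fields = [
    {"name": "generated_text", "label": "Generated response", "type": "string"},
    {"name": "category", "label": "Classification label", "type": "string"},
    {"name": "score", "label": "Confidence score", "type": "number"},
    {"name": "has_error", "label": "Error flag", "type": "bool"},
]
```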
Using HubSpot Custom LLM Actions in a Workflow
Once the app is installed and the custom action is defined, you can add it to any compatible workflow in HubSpot.
Step 1: Add the Custom LLM Action
- Open the workflow you want to enhance in HubSpot.
- Click the plus icon (+) where you want to insert the AI step.
- In the action library, find your custom LLM action provided by the integration or private app.
- Select it to add it to the workflow canvas.
Step 2: Configure the Prompt in HubSpot
Inside the action settings panel, configure how the LLM will be triggered:
- Enter your prompt text, describing what the LLM should output.
- Insert record tokens such as contact, company, ticket, or deal properties to personalize the prompt.
- Adjust any exposed configuration fields, for example:
- Tone (formal, friendly, concise).
- Language or region.
- Model behavior parameters supported by the integration.
The more precise you are, the more reliable the LLM output will be.
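Token merging can be pictured as simple template substitution. The `{{property}}` syntax below is illustrative — HubSpot's editor handles the actual merge — but it shows how record properties personalize the prompt before it reaches the model:

```python
import re

def render_prompt(template, properties):
    """Substitute {{property}} tokens with record values, mimicking
    how the workflow editor merges CRM tokens into the prompt.
    The {{...}} syntax is illustrative, not HubSpot's merge engine."""
    def sub(match):
        return str(properties.get(match.group(1).strip(), ""))
    return re.sub(r"\{\{([^}]+)\}\}", sub, template)

prompt = render_prompt(
    "Write a friendly follow-up for {{firstname}} about the {{dealname}} deal.",
    {"firstname": "Dana", "dealname": "Acme renewal"},
)
```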
Step 3: Test the Action
Before activating your workflow in HubSpot:
- Use a single test record to run through the action.
- Review the response stored in the output fields.
- Confirm that the LLM response is well-structured and safe for your use case.
If results are not satisfactory, refine your prompt or adjust LLM settings in your integration.
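The review step can be partly automated with basic checks before any output is written to records. The checks and threshold below are examples, not required validations:

```python
def validate_output(outputs, max_len=2000):
    """Basic pre-launch checks on an LLM response before a workflow
    writes it to CRM records. Checks and limits are examples."""
    problems = []
    text = outputs.get("generated_text", "")
    if not text.strip():
        problems.append("empty response")
    if len(text) > max_len:
        problems.append("response exceeds property length limit")
    if outputs.get("has_error"):
        problems.append(outputs.get("error_message", "unknown error"))
    return problems

issues = validate_output({"generated_text": "Looks good.", "has_error": False})
```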
Using LLM Outputs Elsewhere in HubSpot Workflows
After the LLM action runs, its outputs are available as tokens in that same workflow. This is where HubSpot custom LLM actions become especially powerful.
Save LLM Responses to CRM Properties
You can map generated content back into CRM fields:
- Add a Set property value action after the LLM step.
- Choose the property you want to populate, such as:
- A custom text property for AI summaries.
- A classification property for tags or segments.
- A numeric property for AI-generated scores.
- Insert the LLM output token into the property value.
This allows HubSpot users to see the LLM outputs directly on contact, company, deal, or ticket records.
Use LLM Tokens in Emails, Tasks, and More
LLM outputs can be inserted into many other workflow actions, such as:
- Send email – use the generated subject or body content.
- Create task – include AI-produced summaries or next steps.
- Send internal email notification – share AI insights with team members.
- Create record – pre-fill notes or descriptions with generated text.
This keeps your automation fully integrated with AI-generated content inside HubSpot.
Branching Logic with LLM Outputs
LLM outputs are also useful for conditional logic:
- Add an If/then branch after the LLM step.
- Build conditions using the output tokens, for example:
- If sentiment score is below a threshold, alert a sales manager.
- If classification equals a certain category, route the ticket to a specialist.
- Use separate paths to customize follow-up actions.
This allows HubSpot workflows to respond dynamically to AI analysis.
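The branching above can be mirrored in plain code to make the routing rules concrete. The path names and the 0.3 threshold are examples you would tune for your own workflow:

```python
def route_record(outputs, sentiment_threshold=0.3):
    """Mirror of the If/then branch: choose a workflow path from the
    LLM's output tokens. Paths and threshold are examples, not
    fixed HubSpot behavior."""
    if outputs.get("sentiment_score", 1.0) < sentiment_threshold:
        return "alert_sales_manager"
    if outputs.get("category") == "billing":
        return "route_to_billing_specialist"
    return "default_follow_up"

path = route_record({"sentiment_score": 0.1, "category": "billing"})
```

Note that the sentiment check fires first, so a low score overrides category routing — branch order matters in the workflow editor, too.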
Best Practices for HubSpot Custom LLM Automation
To use LLMs safely and effectively in HubSpot workflows, follow these guidelines:
- Be explicit in prompts: Clearly specify formatting, tone, and length.
- Minimize sensitive data: Only send the fields required for your use case.
- Validate outputs: Use test records and internal review before going live.
- Log errors: Capture error messages in properties or notes so your team can troubleshoot.
- Start simple: Launch with straightforward prompts, then expand to more complex logic.
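The "minimize sensitive data" guideline is easy to enforce with an allow-list on your integration side, so only the properties a prompt actually needs ever reach the LLM provider. The field names below are examples:

```python
def select_fields(record, allowed):
    """Send only the properties a prompt actually needs; everything
    else stays out of the LLM request. The allow-list is yours to
    define per use case."""
    return {k: v for k, v in record.items() if k in allowed}

safe = select_fields(
    {"firstname": "Dana", "email": "dana@example.com", "ssn": "000-00-0000"},
    allowed={"firstname"},
)
```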
When to Use Professional Help
Implementing robust LLM workflows in HubSpot often requires careful planning, data modeling, and prompt design. If you need strategic or technical support, you can work with a specialist partner such as ConsultEvo to design scalable automations that align with your business processes.
Next Steps
With custom LLM workflow actions, HubSpot becomes a powerful AI automation hub. By connecting your LLM provider through an integration or private app, you can:
- Generate personalized content at scale.
- Analyze conversations and records automatically.
- Drive sophisticated branching logic based on AI outputs.
Review your existing workflows, identify areas where AI can add value, then implement and test custom LLM actions to enhance productivity and insight across your CRM.
Need Help With HubSpot?
If you want expert help building, automating, or scaling your HubSpot portal, work with ConsultEvo, a team with a decade of HubSpot experience.
