<!--
  Full-page Markdown export (rendered HTML → GFM).
  Source: https://neotoma.io/neotoma-with-chatgpt-connect-custom-gpt
  Generated: 2026-05-04T09:50:48.709Z
-->
# Connect via custom GPT with OpenAPI

[Neotoma with ChatGPT](/neotoma-with-chatgpt) · Full step-by-step setup: tunnel, Actions auth, instructions, OpenAPI paste.

Looking for remote MCP (developer mode) instead? See [Connect ChatGPT via remote MCP](/neotoma-with-chatgpt-connect-remote-mcp).

## Setup

You can also integrate Neotoma as an action inside a [custom GPT](https://help.openai.com/en/articles/20001049-apps-in-custom-gpts-for-business-accounts-beta). This approach uses the Neotoma API's OpenAPI spec directly and works with any ChatGPT plan that supports custom GPTs.

1.  **Start Neotoma with a tunnel:** follow the [tunnel guide](/tunnel) to expose your local Neotoma instance over HTTPS. The quickest path:
    
    ```
    neotoma api start --env prod --tunnel
    ```
    
    Add `--background` to run as a background process. Logs go to `~/.config/neotoma/logs_prod/api.log` and can be viewed with `neotoma api logs --env prod`.
    
    ```
    neotoma api start --env prod --tunnel --background
    ```
    
    The examples below write your tunnel host as `<tunnel-host>` (e.g. `abc123.ngrok.io`); each full URL is `https://` plus that host.
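
    The URLs used in later steps are just `https://` plus this host. A quick sketch, using the example host `abc123.ngrok.io` as a stand-in:

    ```shell
    # Stand-in host for illustration; substitute the host your tunnel prints.
    HOST="abc123.ngrok.io"
    BASE="https://${HOST}"

    # URLs you will paste into the GPT editor in the steps below:
    echo "API base:     ${BASE}"
    echo "Actions spec: ${BASE}/openapi_actions.yaml"
    ```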
    
2.  **Create or edit a custom GPT:** go to [chatgpt.com/gpts/editor](https://chatgpt.com/gpts/editor) and open the **Configure** tab.
3.  **Add a new action:** under Actions, click “Create new action”, then click **Import from URL**. Enter your Neotoma API's actions spec URL (reduced spec that stays within GPT Actions operation limits):
    
    ```
    https://<tunnel-host>/openapi_actions.yaml
    ```
    
    If **Import from URL** doesn't work, open the spec URL in your browser, copy its contents, and paste them into the action's schema box instead.
    
4.  **Paste recommended custom GPT instructions** into the GPT's **Instructions** field:
    
    ```
    You are an assistant that uses Neotoma MCP actions for memory persistence and retrieval.
    
    Execution order (mandatory every turn):
    1) Run bounded retrieval for entities implied by the user message.
       - Use retrieve_entity_by_identifier for names/emails/identifiers.
       - Use retrieve_entities for related types (task, contact, event, etc.).
    2) Run one store call for this turn's conversation + user message (+ implied entities).
    3) Only after steps (1) and (2), continue with other tools and compose the user response.
    
    Do not respond before completing steps (1) and (2).
    
    Storage recipes:
    
    Unified store (preferred one call):
    - store with entities:
      - index 0: { entity_type: "conversation", title? }
      - index 1: { entity_type: "conversation_message", role: "user", sender_kind: "user", content: "<exact message>", turn_key: "{conversation_id}:{turn_id}" }
      - index 2+: optional extracted entities implied by the message
    ...
    ```
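
    In the recipe above, `turn_key` is simply the two ids joined with a colon; for instance (ids here are hypothetical):

    ```shell
    # Hypothetical ids; the recipe's turn_key format is "{conversation_id}:{turn_id}".
    CONVERSATION_ID="conv-123"
    TURN_ID="turn-7"
    printf '%s:%s\n' "$CONVERSATION_ID" "$TURN_ID"
    ```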
    
5.  **Set the GPT name to “Neotoma”** in the **Name** field (optional but recommended so the assistant identifies as Neotoma).
6.  **Configure authentication:** set the auth type to **API Key** (Bearer) in the GPT Actions UI; requests will then carry `Authorization: Bearer <token>`. Neotoma's OpenAPI spec includes `bearerAuth`, and no OAuth client ID or secret is needed. Your API base URL (`https://` plus your tunnel host), for reference:
    
    ```
    https://<tunnel-host>
    ```
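
    With the API Key (Bearer) option, every Action request carries a standard bearer header. A small sketch of what that header looks like (the token value here is a placeholder):

    ```shell
    TOKEN="example-token"  # placeholder; use your real bearer token

    # Header the GPT Action attaches to each request:
    printf 'Authorization: Bearer %s\n' "$TOKEN"

    # Manual spot-check against your tunnel (fill in your real host and token first):
    # curl -sS -H "Authorization: Bearer $TOKEN" "https://<tunnel-host>/openapi_actions.yaml"
    ```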
    
    In the GPT Action's **API Key** field, paste your bearer token only (e.g. from `ACTIONS_BEARER_TOKEN` or a key-derived token from your Neotoma server). If you use **OAuth** instead, paste these into the Authentication modal:
    
    **Authorization URL**
    
    ```
    https://<tunnel-host>/mcp/oauth/authorize
    ```
    
    **Token URL**
    
    ```
    https://<tunnel-host>/mcp/oauth/token
    ```
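
    Both OAuth endpoints are fixed paths on the same tunnel base, so they can be derived from your host alone; a sketch with a stand-in host:

    ```shell
    # Stand-in host; both OAuth endpoints live under the same tunnel base.
    HOST="abc123.ngrok.io"
    echo "Authorization URL: https://${HOST}/mcp/oauth/authorize"
    echo "Token URL:         https://${HOST}/mcp/oauth/token"
    ```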
    
7.  **Save and publish:** the custom GPT now has full read/write access to your Neotoma memory graph via the API's REST endpoints.

[Back to Neotoma with ChatGPT](/neotoma-with-chatgpt) · [Install guide](/install) · [MCP reference](/mcp)