How to Chat with Your Notion Data Using Mantium & OpenAI ChatGPT Plugins

In this tutorial, we will learn how to import data from Notion, create a dataset in Mantium, and set up plugins using Mantium. We will then use these plugins in the ChatGPT interface to chat with our data. This will allow us to perform question answering, summarization, text generation, and other applications using the Mantium plugin wizard & OpenAI ChatGPT.


We understand that sometimes it's easier to learn by watching rather than reading. If you prefer a more visual explanation, feel free to check out our accompanying video tutorial below. If you prefer reading or are unable to watch the video, please continue with the text documentation.


The objective of this tutorial is to guide you on how to chat with your Notion data using Mantium and the OpenAI ChatGPT plugin. This will enable you to perform various tasks such as question answering, summarization, text generation, and other applications using the Mantium plugin wizard.


Import Data from Notion

  1. Log into your Mantium account. If you don't have an account, sign up using the beta link here.
  2. Navigate to the Data Sources section and click on Add Data Source.
  3. Select Notion from the list of data sources.
  4. If you haven't set up a connector yet, click on Add New Connector, then Connect to Notion.
  5. Grant Mantium access to your Notion pages. Click on Allow Access to complete the authorization step. Once you've granted access, Mantium will import the data from the selected Notion pages.
  6. Provide a name for the data source and click on Save and Test.

Create a Dataset in Mantium

  1. After the job is done, click on Create Custom Dataset.
  2. Provide a name for the dataset and click on Save.

Once the dataset is created, navigate to the Transform section to add transformations to the dataset.

Add Transformations

Split Text Transformation

After creating the custom dataset, navigate to the "Transform" section to add a couple of transformations. For this tutorial, we're going to be working with the "Split Text" transformation.

The "Split Text" transformation splits the content into smaller chunks, getting it ready for embedding generation so that no chunk exceeds OpenAI's recommended size.

Here's how to set it up:

  1. Click on Transform and select Split Text from the dropdown menu.
  2. In the Source Column field, add the content column. This is the column that contains the text data to be split.
  3. In the Destination Column field, type "segmented_text". This will be the new column that will hold the segmented text.
  4. Set "Split By" to "word". This means the text will be split at every word.
  5. In the "Split Length" field, type 1600. This means the text will be split every 1600 words, creating an additional row for each chunk.
  6. Leave the remaining configuration as default.
  7. Click on the Plus sign (+) to add the next transform. The job will run and split the text in the "content" column into chunks of at most 1600 words each, storing the result in a new column called "segmented_text".

This transformation is crucial as it prepares your data for the next steps, which include combining columns and generating embeddings for the ChatGPT interface.
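Under the hood, the Split Text transformation amounts to word-level chunking. Here is a minimal Python sketch of that logic; the column names (`content`, `segmented_text`) match the tutorial, but the function itself is an illustration, not Mantium's actual implementation:

```python
def split_text(content, split_length=1600):
    """Split content on whitespace into chunks of at most `split_length` words."""
    words = content.split()
    return [
        " ".join(words[i:i + split_length])
        for i in range(0, len(words), split_length)
    ]

# A hypothetical row from the imported Notion data.
row = {"content": "word " * 3500}

# Each chunk becomes, in effect, an additional row in the dataset.
row["segmented_text"] = split_text(row["content"])
print(len(row["segmented_text"]))  # 3500 words -> 3 chunks (1600 + 1600 + 300)
```

Keeping chunks at 1600 words leaves comfortable headroom under the embedding model's token limit, since a word usually maps to more than one token.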

Combine Columns Transformation

Next, we'll use the Combine Columns transformation to merge the author, title, segmented content, and source URL into a single column. This combined column will be used to generate embeddings in the ChatGPT use case.

  1. Select the Combine Columns from the dropdown menu.
  2. In the Destination Column name field, type "combined_text". This will be the new column that will hold the combined data.
  3. In the String Template field, enter Author: $created_by | Title: $title | Content: $segmented_text | Source URL: $source to define the pattern for combining columns. You can add the columns by selecting the column variables.
  4. Click on "Save and Run Transforms" to apply the transformation.

Now we have a processed dataset that is ready for use in ChatGPT!

Create a Plugin


Quick Warning

  • If you select the Standard option and have previously created a split column (such as "segmented_text"), be sure to pick that same column in the subsequent steps rather than the original text column. This prevents unnecessary expansion of your dataset and keeps your OpenAI usage costs in check.
  • To prepare your dataset, select the Advanced option only if you already have an embeddings column; otherwise, select Standard.
  1. Navigate to Apps and click on New App.
  2. Select Existing Dataset and choose the dataset you just created.
  3. Select an existing OpenAI credential or create a new one.
  4. Select Standard, because we don't have an embeddings column yet.
  5. Set the Source Column to "combined_text".
  6. Click on Continue to let Mantium run some transformations and generate embeddings.
  7. Select Managed Vector Database (Redis) as the destination.
  8. Provide a name and description for the plugin, then click on Create.
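Mantium handles the embedding step for you, but it helps to know what it looks like underneath. This is a rough sketch of a call to OpenAI's embeddings REST endpoint; the model name (`text-embedding-ada-002`) and helper functions are assumptions for illustration, not Mantium's actual code:

```python
import json
import urllib.request

OPENAI_EMBEDDINGS_URL = "https://api.openai.com/v1/embeddings"

def build_embedding_request(texts, model="text-embedding-ada-002"):
    """Build the JSON payload the OpenAI embeddings endpoint expects."""
    return {"model": model, "input": texts}

def embed(texts, api_key):
    """POST the combined-text chunks; returns one embedding vector per input."""
    req = urllib.request.Request(
        OPENAI_EMBEDDINGS_URL,
        data=json.dumps(build_embedding_request(texts)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [item["embedding"] for item in body["data"]]
```

Each "combined_text" chunk becomes one vector, and those vectors are what get stored in the managed Redis vector database for retrieval at chat time.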

Set Up the Plugin in ChatGPT

  1. Navigate to your ChatGPT account and click on the Plugins dropdown, then select Plugin Store.
  2. Click on Develop Your Own Plugin.
  3. Paste the Plugin URL from Mantium into the Enter your website domain field and click on Find Manifest File.
  4. Click on Next, then click on Install for Me.
  5. Enter your access token from Mantium to install the plugin.

Chat with Your Notion Data

Now, you can use the plugin to chat with your Notion data in ChatGPT. Here are some example prompts:

Prompt 1

Please develop a comprehensive marketing plan and captivating Twitter thread to introduce the 
Manatee Search product using the Notion plugin.
Create a referral program for the product as well and include the data that you've considered.

Result from Prompt 1

That's it! You've successfully set up a plugin to chat with your Notion data using Mantium and OpenAI's ChatGPT.