Spiritual Advisor Cocktail Recipes - OpenAI Completion

Generate fun and creative cocktail recipes with the Mantium Spiritual Advisor!

No time to walk through the tutorial? Test this prompt out here.

Use Case

It's been a long week: project deadlines, problems to solve both at work and at home. All you want is to relax with a delicious cocktail, but your brain is tired and cannot imagine what to do with the spirits (liquor) you have on hand. Sure, you could search for "delicious cocktails with vodka," but why not have a language model solve this problem for you?

Prompt Creation

When you are ready to create your prompt, click AI Manager > Prompts > Add New Prompt, and fill out the following:

  • Name of Prompt: Spiritual Advisor
  • Description: OpenAI Completion

Tags and Intelets can be left blank.

To deploy your prompt publicly at the end of this tutorial, you can add a default security policy configured by Mantium. Click Add Security Policies under Security Policies and drag Default Policies from All Policies to Selected Policies. Click Done to save. You will know the policy has been applied when its name is visible under Security Policies.

Provider Settings

  • Provider: OpenAI
  • Endpoint: Completion

Prompt Body

For this tutorial, the prompt body is a collection of real or plausible cocktail recipes. We have prepared a text snippet for you to paste into the prompt line.
When pasting, make sure the text ends with "Ingredients: " (one space after the colon) and that no additional whitespace follows it. Language models are sensitive to every character, including spaces.

Ingredients: vodka, prosecco, grapefruit, lime, strawberry

Cocktail Name: Summer Punch

Recipe:
1.5oz vodka 
1.5oz grapefruit
.75oz lime juice
.5oz Strawberry Syrup

Mix everything except prosecco, top with prosecco
### ###
Ingredients: campari, rum, pineapple, lime

Cocktail Name: Jungle Bird

Recipe:
1.5 oz rum
.75 oz Campari
1.5oz pineapple juice
.5oz lime juice
.5oz simple syrup

Stir everything, pour over ice
### ###
Ingredients: tequila, orange juice, lime la croix

Cocktail Name: Jr. Margarita

Recipe:
1.5oz tequila
1oz orange juice
3oz lime la croix
salt

Add everything over ice, starting with salt
### ###
Ingredients: whiskey, pickles

Cocktail Name: Pickleback

Recipe:
1.5oz whiskey
1.5oz pickle juice

Take a shot of whiskey, take a shot of pickle juice
### ###
Ingredients: bitters, maple syrup, fernet, whiskey

Cocktail Name: Toronto

Recipe:
2oz whiskey
.25oz maple syrup
.25oz fernet
2 dashes bitters

stir whiskey, maple syrup, and fernet. pour over fresh ice. add bitters
### ###
Ingredients: 

Then, select an engine.

  • Choose Engine: Davinci-instruct-beta

Davinci Instruct Beta is similar to the Davinci engine but better at understanding instructions, which is useful for structured, formulaic outputs such as recipes.

OpenAI’s Engine Documentation

The prompt body itself is a collection of plausible (if not real) cocktail recipes - text that reads like cocktail recipes. This serves as the "pattern" for the model to follow when generating new text.

Each entry in the prompt body includes the following:

  • Ingredients
  • Cocktail Name
  • Recipe - proportions of the ingredients, followed by instructions for combining them

Making the Prompt more User-Friendly

To make the prompt easier to interact with, it ends with "Ingredients: ". The text that follows "Ingredients: " will be the user's input, and wherever that input leaves off is where the language model takes over generating text.
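Concretely, whatever the user types is appended directly after that trailing "Ingredients: ". The sketch below only illustrates this concatenation - Mantium performs it for you - and assumes the examples above are stored in a Python string; the PROMPT_BODY and build_full_prompt names are invented for the sketch and are not part of Mantium or OpenAI.

# Hypothetical illustration of how the final prompt text is assembled.
# Mantium handles this when a user submits input; the names here are made up.
PROMPT_BODY = (
    "Ingredients: vodka, prosecco, grapefruit, lime, strawberry\n"
    "\n"
    "Cocktail Name: Summer Punch\n"
    # ... the remaining example recipes go here, exactly as pasted above ...
    "### ###\n"
    "Ingredients: "  # ends with a single space and no extra whitespace
)

def build_full_prompt(user_ingredients: str) -> str:
    """Append the user's ingredient list right after the final 'Ingredients: '."""
    return PROMPT_BODY + user_ingredients

print(build_full_prompt("whiskey, pickles"))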

Prompt Settings

Basic Settings

  • Response Length: 100
  • Temperature: .8
  • Top P: 1
  • Frequency Penalty: .1
  • Presence Penalty: 0
  • Stop Sequence: ### ###

Response Length caps the length of an output in tokens. The longer a response is allowed to be, the more likely it is to veer into nonsense. This is a creative prompt that still requires plausible output, so we restricted the response to 100 tokens to keep it from going too far off course.

Temperature controls "creativity" - higher temperatures produce more unique and creative outputs, but are also more likely to become nonsensical. A lower temperature is advised for a prompt that requires a well-defined response, because the model will choose words with a higher probability of occurrence. For creative and interesting responses, a higher temperature is suitable.

Top P is another way to control "creativity," using a different probability method (nucleus sampling). It is recommended to adjust either Top P or Temperature, but not both. The default value is 1.

Stop sequences are another method of controlling output - they let you define text sequences that force the model to stop generating. Without a stop sequence, the model may keep generating up to the full response length or cut off in the middle of a sentence.
Using "### ###" provides a delimiter that is clearly visible and rarely appears in natural language.

Advanced Settings

  • Best of: 2
  • N: 1
  • LogProbs: leave blank
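Together, the basic and advanced settings map directly onto parameters of OpenAI's Completion endpoint, which Mantium calls on your behalf. Purely as an illustration - not Mantium's actual implementation - here is roughly what an equivalent direct call looks like with the legacy openai Python library (pre-1.0), reusing the hypothetical build_full_prompt helper from the earlier sketch:

import openai  # legacy openai-python (pre-1.0) Completion interface

openai.api_key = "YOUR_OPENAI_API_KEY"  # placeholder - substitute your own key

def generate_recipe(user_ingredients: str) -> str:
    """Hypothetical helper: send the assembled prompt with the settings above."""
    response = openai.Completion.create(
        engine="davinci-instruct-beta",
        prompt=build_full_prompt(user_ingredients),  # from the earlier sketch
        max_tokens=100,         # Response Length
        temperature=0.8,        # Temperature
        top_p=1,                # Top P
        frequency_penalty=0.1,  # Frequency Penalty
        presence_penalty=0,     # Presence Penalty
        stop="### ###",         # Stop Sequence
        best_of=2,              # Best of: sample two completions server-side, keep the best
        n=1,                    # N: return a single completion
        # logprobs is left unset, matching "leave blank" above
    )
    return response["choices"][0]["text"]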

Test Prompt Text

Because these models are stochastic in nature, the output will vary even with identical input, but we have included a text snippet to run as a test:

fino sherry, aquavit, champagne, chamomile tea

We suggest clicking Test Run a few times to see the range of possible outputs, or taking a look in your fridge for new ingredient inspiration!
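If you would rather experiment from code than from the UI, the same test can be run with the hypothetical generate_recipe helper sketched above; repeating it shows the variation that a Temperature of .8 produces.

# Run the test ingredients a few times; each completion should differ.
test_ingredients = "fino sherry, aquavit, champagne, chamomile tea"

for attempt in range(3):
    print(f"--- attempt {attempt + 1} ---")
    print(generate_recipe(test_ingredients).strip())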

Once you are satisfied with the results, click Save.

Results & Conclusion

This is a fun and applicable use case for natural language generation - it can be used in real life as Happy Hour inspiration and to showcase the power of natural language processing at the same time.
An unintended feature of this model is that it also works with made-up ingredients, which would be useful in some domains (such as creative writing) but not in others, where only factual input should be accepted.
Depending on the specific use of this tool, additional prompt text may be needed to give examples of ingredients that the prompt creator does not want used in generated recipes.

One-Click Deploy

Mantium enables sharing your prompt using the One-Click Deploy feature, located in each prompt's drawer view. From your list of prompts, click the Spiritual Advisor prompt, then click Deploy.

Then, fill out the following:

  • Name: Spiritual Advisor
  • Description: Looking for cocktail inspiration? Add your ingredients and get a recipe!
  • Author Name: Your Name
  • ✅ Add Input Field
  • Public
  • Live
  • Input Placeholder Text: What ingredients do you have on hand?

To interact with this prompt, input a list of ingredients the same way that you did when testing the prompt out during setup.

To test out the prompt that we have configured, click this link!

Similar Use Cases

This example can easily be extended to other creative generation ideas that follow a rough formula. Try creating food recipes, arts and crafts projects, or even new dance moves!