Open source developer tools

All Formulas you publish publicly on Formulaic are currently licensed under Creative Commons CC-BY, which means you can download, reuse, and remix them for both personal and commercial uses, with proper attribution. We are also open sourcing the Formula standard, which we call Open Prompt Scripts, and the tooling to work with them.

Open Formulas

Open Formulas is an open source JSON format to make Formulas interoperable across models and applications. All Formulas that you create or export on Formulaic are natively stored in this format.

Open Formulas include metadata, model configuration, prompt sequences with support for variable templating, and variable definitions.

We are evaluating adding JSON Schema support in a future version.

If you would like to contribute to the Open Formulas specification, please join our Discord.

JSON format

This is an example of the Open Formulas format:

    {
      "model": {
        "id": "mistralai/Mistral-7B-Instruct-v0.1",
        "name": "Mistral 7B Instruct",
        "vendor": "Mistral",
        "provider": "Anyscale"
      },
      "sequences": [
        [
          {
            "text": "Tell me a hilarious joke about {{{subject}}} for a smart 10 year old. "
          }
        ]
      ],
      "variables": [
        {
          "name": "subject",
          "type": "text",
          "value": "Cats",
          "description": "The subject of your joke"
        }
      ]
    }
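
To make the format concrete, here is a minimal sketch of how an application might load a Formula file like the one above and substitute each variable's default value into its {{{variable}}} placeholders. The file name and the render_formula helper are illustrative only; the Formulaic libraries described below provide their own interfaces for this.

    import json

    # Illustrative helper, not part of the Formulaic library: render an Open
    # Formula's prompt sequences by substituting each {{{variable}}} placeholder
    # with that variable's default value.
    def render_formula(formula: dict) -> list[list[str]]:
        values = {v["name"]: v["value"] for v in formula.get("variables", [])}
        rendered = []
        for sequence in formula.get("sequences", []):
            prompts = []
            for prompt in sequence:
                text = prompt["text"]
                for name, value in values.items():
                    text = text.replace("{{{" + name + "}}}", value)
                prompts.append(text)
            rendered.append(prompts)
        return rendered

    # Hypothetical file name; any Open Formulas JSON file will do.
    with open("joke.formula.json") as f:
        formula = json.load(f)

    print(render_formula(formula))
    # [['Tell me a hilarious joke about Cats for a smart 10 year old. ']]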

Code libraries for running Formulas inside AI applications

Formulaic developer tools help you abstract prompts away from core application logic and make your generative AI apps both more robust and easier to update.

  • Python: The Formulaic Python library makes it simple to work with Formulas, render prompt templates, and send messages to and receive responses from a language model service. It includes a quickstart tutorial.
  • Node.js: We'll soon release the Prompted.js library, which includes core functions and a reference implementation showing how to run Open Prompt Scripts inside an application.

Run locally with llamafile

You can also run Formulas against models running on local hardware. The easiest way to do this is with llamafile, a sibling project from the Mozilla Innovation Studio.

This Python tutorial shows how, in less than 10 minutes, you can install llamafile on your local machine, start a local server, and begin running Formulas entirely on your laptop.
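
Once a llamafile server is running, a rendered Formula prompt can be sent to it over HTTP. The sketch below assumes llamafile's default local address (http://localhost:8080) and its OpenAI-compatible chat completions endpoint; the tutorial above walks through the full setup.

    import json
    import urllib.request

    # A rendered Formula prompt, e.g. the output of the sketch shown earlier.
    prompt = "Tell me a hilarious joke about Cats for a smart 10 year old."

    payload = json.dumps({
        # The model name is a placeholder; the local server serves whichever
        # model the llamafile was built with.
        "model": "LLaMA_CPP",
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")

    request = urllib.request.Request(
        # Default llamafile server address; adjust to match your setup.
        "http://localhost:8080/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(request) as response:
        reply = json.loads(response.read())

    print(reply["choices"][0]["message"]["content"])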