intelli-monad: Type-level prompts with LLMs via louter.
This allows us to define function calls and value validation using types.
Downloads
- intelli-monad-0.1.2.0.tar.gz (Cabal source package)
- Package description (as included in the package)
| Versions | 0.1.0.0, 0.1.0.1, 0.1.0.2, 0.1.1.0, 0.1.1.1, 0.1.2.0 |
|---|---|
| Change log | CHANGELOG.md |
| Dependencies | aeson (>=2.1 && <2.3), aeson-casing (>=0.2 && <0.3), aeson-pretty (>=0.8.10 && <0.9), base (>=4.0 && <5), base64-bytestring (>=1.2.1 && <1.3), bytestring (>=0.11.5 && <1.0), containers (>=0.6.7 && <0.7), exceptions (>=0.10 && <1.0), haskeline (>=0.8.2 && <0.9), http-client (>=0.7.16 && <0.8), http-client-tls (>=0.3.6 && <0.4), http-conduit (>=2.2 && <2.4), http-types (>=0.12 && <1.0), intelli-monad, JuicyPixels (>=3.3.8 && <3.4), kan-extensions (>=5.2 && <5.3), louter (>=0.1.1 && <0.1.2), megaparsec (>=9.5 && <9.7), mtl (>=2.2 && <3.0), optparse-applicative (>=0.18 && <0.19), persistent (>=2.14.6 && <2.15), persistent-sqlite (>=2.13.3 && <2.14), process (>=1.6.17 && <1.7), sixel (>=0.1.2 && <0.2), stm (>=2.5 && <2.6), temporary (>=1.3 && <1.4), text (>=2.0.2 && <2.2), time (>=1.12.2 && <1.13), transformers (>=0.6.1 && <0.7), vector (>=0.13.1 && <0.14), wai (>=3.2 && <4.0), wai-extra (>=3.1 && <3.2), warp (>=3.3 && <4.0), xml-conduit (>=1.9 && <2.0), yaml (>=0.11 && <0.12) [details] |
| License | MIT |
| Author | Junji Hashimoto |
| Maintainer | junji.hashimoto@gmail.com |
| Uploaded | by junjihashimoto at 2025-12-15T08:04:07Z |
| Category | Development |
| Home page | https://github.com/junjihashimoto/intelli-monad |
| Distributions | |
| Executables | auto-talk, calc, intelli-monad |
| Downloads | 204 total (16 in the last 30 days) |
| Rating | (no votes yet) [estimated by Bayesian average] |
| Status | Docs available [build log] Last success reported on 2025-12-15 [all 1 reports] |
Readme for intelli-monad-0.1.2.0
intelli-monad
intelli-monad provides high-level APIs for prompt engineering with the OpenAI API.
Featuring:
- Type-level function calling with JSON Schema
- Validation of return values
- REPL interface
- Persistence of prompt logs with SQLite
- Session management in the REPL
intelli-monad is based on openai-servant-gen, which is automatically generated from the OpenAPI specification.
Install
```sh
git clone git@github.com:junjihashimoto/intelli-monad.git
cd intelli-monad
export PATH=~/.local/bin:$PATH
cabal install intelli-monad
```
Usage of the REPL
After installing intelli-monad, set OPENAI_API_KEY, then run the intelli-monad command. System commands begin with the prefix ":"; anything else is sent as the user's prompt.
```
$ export OPENAI_API_KEY=xxx
$ export OPENAI_MODEL=xxx
$ intelli-monad
% :help
:quit
:clear
:show contents
:show usage
:show request
:show context
:show session
:list sessions
:copy session <from> <to>
:delete session <session name>
:switch session <session name>
:help
% hello
assistant: Hello! How can I assist you today?
```
Usage of function calling with validation
Here is an example of function calling and validation. In this example, validation is performed on the input of the function call.
Define the function call as ValidateNumber, and define the context as Math.
The JSONSchema type class renders a type as JSON Schema. Defining a HasFunctionObject instance adds a description to each field, which allows OpenAI's interface to understand the meaning of each field. The Tool type class defines the input and output types of the function call, as well as the function's implementation.
The CustomInstruction type class defines the context with headers and footers.
The runPromptWithValidation function calls the LLM. The result is validated, and a number is returned.
```haskell
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedLists #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TypeApplications #-}
{-# LANGUAGE TypeFamilies #-}

module Main where

import Data.Aeson
import Data.Proxy
import GHC.Generics
import IntelliMonad.Persist
import IntelliMonad.Prompt
import IntelliMonad.Types
import OpenAI.Types

data ValidateNumber = ValidateNumber
  { number :: Double
  }
  deriving (Eq, Show, Generic, JSONSchema, FromJSON, ToJSON)

instance HasFunctionObject ValidateNumber where
  getFunctionName = "output_number"
  getFunctionDescription = "validate input number"
  getFieldDescription "number" = "A number that system outputs."

instance Tool ValidateNumber where
  data Output ValidateNumber = ValidateNumberOutput
    { code :: Int,
      stdout :: String,
      stderr :: String
    }
    deriving (Eq, Show, Generic, FromJSON, ToJSON)
  toolExec _ = return $ ValidateNumberOutput 0 "" ""

data Math = Math

instance CustomInstruction Math where
  customHeader = [(Content System (Message "Calculate user input, then output just the number. Then call 'output_number' function.") "" defaultUTCTime)]
  customFooter = []

main :: IO ()
main = do
  v <- runPromptWithValidation @ValidateNumber @StatelessConf [] [CustomInstructionProxy (Proxy @Math)] "default" (fromModel "gpt-4") "2+3+3+sin(3)"
  print (v :: Maybe ValidateNumber)
```
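Since runPromptWithValidation returns a Maybe, the caller decides how to handle a response that failed validation. A minimal, self-contained sketch (the reportResult helper is hypothetical and not part of the library; ValidateNumber is re-declared locally so the sketch compiles on its own):

```haskell
module Main where

-- Re-declared locally for a standalone sketch; in the real example this
-- is the ValidateNumber record defined above.
data ValidateNumber = ValidateNumber { number :: Double }
  deriving (Show)

-- Hypothetical helper: turn the validated result into a report string.
reportResult :: Maybe ValidateNumber -> String
reportResult Nothing  = "validation failed: no function call was returned"
reportResult (Just v) = "validated number: " ++ show (number v)

main :: IO ()
main = do
  putStrLn (reportResult (Just (ValidateNumber 8.0)))
  putStrLn (reportResult Nothing)
```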
Configuration
To specify the OpenAI endpoint and model as data instead of environment variables, you can use a configuration file. Create an intellimonad-config.yaml file in the root directory of your project with the following content:
```yaml
apiKey: "your_openai_api_key"
endpoint: "https://api.openai.com/v1/"
model: "gpt-4"
```
Here, apiKey is your OpenAI API key, endpoint is the OpenAI endpoint, and model is the model you want to use.
Make sure to update the initializePrompt and runPrompt functions to read from the configuration file, as shown in the examples in intelli-monad/app/auto-talk.hs and intelli-monad/app/calc.hs.
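A hedged sketch of how such a file could be decoded (this is not intelli-monad's actual loader; it only shows that the three fields above map onto a record via the yaml package, which is already a dependency of intelli-monad):

```haskell
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}

module Main where

import Data.Yaml (FromJSON, decodeFileThrow)
import GHC.Generics (Generic)

-- Hypothetical record mirroring the fields of intellimonad-config.yaml.
data Config = Config
  { apiKey :: String
  , endpoint :: String
  , model :: String
  }
  deriving (Show, Generic, FromJSON)

main :: IO ()
main = do
  -- decodeFileThrow raises an exception if the file is missing or invalid.
  cfg <- decodeFileThrow "intellimonad-config.yaml"
  putStrLn ("using model: " ++ model (cfg :: Config))
```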