genai-lib: A library for interacting with various generative AI LLMs
A library for performing completions and chats with various generative AI LLMs (Large Language Models). Works today with Ollama and OpenAI with more to come in the future.
Downloads
- genai-lib-2.1.0.tar.gz [browse] (Cabal source package)
- Package description (as included in the package)
| Versions [RSS] | 1.3, 2.0, 2.0.1, 2.1.0 |
|---|---|
| Change log | CHANGELOG.md |
| Dependencies | aeson (>=2.1.2.1 && <2.3), base (>=4.18.1.0 && <5), bytestring (>=0.11.5.2 && <0.13), containers (>=0.6.7 && <0.7), genai-lib, http-client (>=0.7.16 && <0.8), http-client-tls (>=0.3.6.3 && <0.4), scientific (>=0.3.7.0 && <0.4), servant (>=0.20.1 && <0.21), servant-client (>=0.20 && <0.21), servant-client-core (>=0.20 && <0.21), string-conv (>=0.2.0 && <0.3), text (>=2.0.2 && <3), time (>=1.12.2 && <1.13) [details] |
| License | ISC |
| Copyright | 2024 Dino Morelli |
| Author | Dino Morelli |
| Maintainer | dino@ui3.info |
| Uploaded | by DinoMorelli at 2025-04-24T20:19:54Z |
| Category | AI, Library |
| Source repo | head: git clone https://codeberg.org/dinofp/genai-lib |
| Distributions | |
| Executables | ex-genai-openai, ex-genai-ollama |
| Downloads | 76 total (14 in the last 30 days) |
| Rating | (no votes yet) [estimated by Bayesian average] |
| Status | Docs available [build log] Last success reported on 2025-05-13 [all 5 reports] |
Readme for genai-lib-2.1.0
genai-lib
Synopsis
A library for interacting with various generative AI LLMs
Description
A library for performing completions and chats with various generative AI LLMs (Large Language Models). Works today with Ollama and OpenAI with more to come in the future.
This project is very much in-progress and incomplete.
Using this library
An example; more are available in src/example
import Control.Exception (Handler (..), catches)
import Data.Text.Lazy.IO qualified as TL
import GenAILib (ClientError, Request (..), jsonToText, mkRequest, numopt,
  stringopt, systemmsg, usermsg)
import GenAILib.HTTP (GenAIException, openaiV1Chat, openaiV1ChatJ,
  tokenFromFile)
import GenAILib.OpenAI (OpenAIRequest, getMessage)

main :: IO ()
main = do
  openaiJSON
  openaiData

-- A simple example with no error handling that expects an Aeson Value
-- (OpenAIRequest is an instance of ToJSON) and displays the encoded JSON
-- response
openaiJSON :: IO ()
openaiJSON = do
  let req :: OpenAIRequest = mkRequest "gpt-3.5-turbo" $ usermsg "Why is the sky blue?"
  token <- tokenFromFile "path/to/openai/key"
  TL.putStrLn . jsonToText =<< openaiV1ChatJ token Nothing req

-- Another example with exception handling that expects an OpenAIResponse
-- data structure, displaying that and also just the response text
openaiData :: IO ()
openaiData = do
  let req :: OpenAIRequest = mkRequest "gpt-3.5-turbo"
        ( systemmsg "Answer in the style of Bugs Bunny. Try to work the phrase \"What's up Doc?\" in somewhere."
        <> usermsg "Why is the sky blue?"
        <> numopt "temperature" 0.8
        <> stringopt "service_tier" "default"
        )
  token <- tokenFromFile "path/to/openai/key"
  res <- openaiV1Chat token Nothing req `catches`
    [ Handler (\(e :: GenAIException) -> error . show $ e)
    , Handler (\(e :: ClientError) -> error . show $ e)
    ]
  print res                       -- The entire OpenAIResponse
  TL.putStrLn . getMessage $ res  -- Just the response Message Content
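For comparison, here is a sketch (not taken from the package's own examples) of the same request handled with try from Control.Exception instead of catches, pattern matching on the Either result rather than turning the exception into a fatal error. The name openaiTry is made up for this illustration; everything else reuses the functions imported above, plus try added to the Control.Exception import.

-- Sketch only: handles just GenAIException via Control.Exception.try;
-- a ClientError would still propagate as an exception.
openaiTry :: IO ()
openaiTry = do
  let req :: OpenAIRequest = mkRequest "gpt-3.5-turbo" $ usermsg "Why is the sky blue?"
  token <- tokenFromFile "path/to/openai/key"
  result <- try (openaiV1Chat token Nothing req)
  case result of
    Left (e :: GenAIException) -> putStrLn $ "Request failed: " <> show e
    Right res                  -> TL.putStrLn . getMessage $ res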
Getting source
Source code is available from Codeberg at the genai-lib project page: https://codeberg.org/dinofp/genai-lib
Contact
Dino Morelli dino@ui3.info