
Version 0.8.0

@mandel mandel released this 06 Sep 01:59
Tag: v0.8.0 (commit 82ba569)

New Features

The main changes in this release are:

  • AutoPDL: automatic optimization of PDL programs
  • Calling PDL functions from Jinja and Python
  • Map/reduce block

AutoPDL: automatic optimization of PDL programs

Given a PDL program with some free variables, a search space for these variables, a loss function, and a training set, you can optimize the program with the pdl-optimize command. This makes it possible to optimize any part of a PDL program, such as the textual prompts, the few-shot examples, or the prompting patterns. This work is based on the paper AutoPDL: Automatic Prompt Optimization for LLM Agents. A new section of the manual describes how to use the optimizer.
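As a rough illustration (the structure and variable names below are assumptions, not taken from the AutoPDL documentation; see the optimizer section of the manual for the actual format), a PDL program might leave its prompting pattern as a free variable, here called `prompt_pattern`, for the optimizer to search over:

```yaml
# Sketch only: `prompt_pattern` is a hypothetical free variable whose
# value (e.g. zero-shot vs. few-shot phrasing) the optimizer would select
# from the declared search space.
description: Program with a free variable to optimize
text:
- ${ prompt_pattern }
- "Question: ${ question }"
- model: ollama_chat/granite3.2:2b
```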

Calling PDL functions from Jinja and Python

It is now possible to call a function defined in PDL from a Jinja expression or a Python code block. Here is an example where the function translate is called in the Jinja expression ${ translate("Hello", language="French") }:

description: Calling a PDL function from Jinja
defs:
  translate:
    function:
      sentence: string
      language: string
    return:
      lastOf:
      - |
        Translate the sentence '${ sentence }' to ${ language }.
        Only give the result of the translation.
      - model: ollama_chat/granite3.2:2b
text: |
  The way to say hello in French is ${ translate("Hello", language="French") }.
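The same function can also be called from a Python code block. A minimal sketch, assuming PDL functions are visible as ordinary callables inside Python `code:` blocks and that the block's output is assigned to the `result` variable, as in other PDL Python blocks:

```yaml
description: Calling a PDL function from Python (sketch)
defs:
  translate:
    function:
      sentence: string
      language: string
    return:
      lastOf:
      - |
        Translate the sentence '${ sentence }' to ${ language }.
        Only give the result of the translation.
      - model: ollama_chat/granite3.2:2b
text:
- lang: python
  code: |
    # Call the PDL function `translate` like a regular Python function.
    greeting = translate("Hello", language="French")
    result = f"The way to say hello in French is {greeting}."
```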

Map/reduce block

Similar to the repeat loop, PDL now offers a map block. The difference from repeat is that the context is not accumulated between iterations; each iteration is executed with the same context. Here is an example of a map block:

lastOf:
- "Hello, "
- for:
    name: [Alice, Bob, Charlie]
  map:
    lastOf:
    - my name is ${ name }
    - model: ollama/granite3.2:2b
  join:
    as: array
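The `join` clause plays the reduce role: with `as: array` above, the per-iteration results are collected into a list. A hedged variant (assuming `as: text` with a `with:` separator is accepted here, as in PDL's repeat loops) that concatenates the answers into a single string instead:

```yaml
lastOf:
- "Hello, "
- for:
    name: [Alice, Bob, Charlie]
  map:
    lastOf:
    - my name is ${ name }
    - model: ollama/granite3.2:2b
  join:
    as: text        # concatenate results instead of building an array
    with: "\n"      # assumed separator option, mirroring repeat's join
```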

What's Changed

Full Changelog: v0.7.1...v0.8.0