Conjure - taking AI coding to the next level

You describe a function. An LLM hallucinates it. The BEAM compiles and runs it live in your runtime with no restart, no compile step, no test suite, no PR reviews, no deploy. The source also lands in your codebase for further use. Straight from user input into production. Probably fine?

iex> Conjure.run("roman numeral for n", args: [2026])
"MMXXVI"

# Zig is faster than Elixir for some things:
iex> Conjure.run("sum integers from 1 to n", args: [1_000_000_000], lang: :zig)
500000000500000000

# or why not use C to calculate the distance between London and Paris:
iex> Conjure.run("distance in km between two gps coordinates", args: [51.5074, -0.1278, 48.8566, 2.3522], lang: :c)
343.5560603410416

Why?

Every AI coding tool is doing some version of this. The difference is how many layers of shiny UI, “sandboxing”, and permission dialogs stand between you and the moment an LLM’s output actually executes. Conjure removes them all.

Software has spent decades building walls between users and raw execution. The stated reason is safety. The actual effect is that developers decide what users are allowed to want.

Conjure explores what the opposite feels like. Neal Stephenson asked this question in 1999, in In the Beginning... Was the Command Line. We still haven't figured it out.

Of course, the emperor has no clothes. You’re running a model that doesn’t answer your request but just produces text in the shape of an answer, statistically resembling code written by whoever happened to be overrepresented in the training data. Sometimes that’s String.myers_difference("puppy", "kitten"). Sometimes it’s System.cmd("rm", ["-rf", "/"]). Conjure will compile and run either with equal enthusiasm.

Most AI coding workflows paper over this with a reverse centaur: the human reviews and rubber-stamps LLM output, absorbing responsibility for failures they were never equipped to catch. Conjure removes the human from the loop entirely. Pure, honest vibe coding.

The technical curiosity

slopc does something similar in Rust. The question here was whether this could be pushed further, not just at compile time via macros but at runtime: live code generation, compilation, and execution inside a running OTP node, with no restart. And not just Elixir code: thanks to language interoperability tooling, the same trick works for Zig and C, so you can go from idea to hallucinated code to computed result with a single Conjure.run/2 call.


I admire the idea; it has similar vibes to my Handwave lib.

Do you have a good use case for it in mind or is it more of an art project right now?

Definitely more of a "how did we end up here" piece, and a question of where we should go next, not just where we can go, which seems to be driving much of this.
