Need help understanding the Spark library

Questions about Spark

I am working on an Elixir library that provides a Domain-Specific Language (DSL). I plan to use the Spark library and have a few questions regarding its implementation in a project. My questions are as follows:

  1. With Spark, how do you implement the actual functionality of your DSL? Specifically, once you have defined your sections and entities, how do you define the code that will be quoted by each section and entity?
  2. Am I correct in assuming that the sections and entities are macros that quote code, or is there something fundamental about Spark’s operation that I’m missing?
  3. My library focuses more on using a single macro with numerous options rather than nesting multiple macros. I’m primarily interested in Spark for its tooling capabilities. Given this context, do you think Spark would be a good fit for my library?

Any help is welcome.

Hello! So Spark is primarily designed to help build “structural” DSLs, i.e. DSLs that define data.

For example, in the Ash Framework, on an Ash.Resource:

attributes do
  attribute :name, :string, allow_nil?: false
end

That is a section called :attributes, with an entity called :attribute. We then extract this using:

Ash.Resource.Info.attributes(module)

which gives us, for example:

%Attribute{
  name: :name,
  type: :string,
  allow_nil?: false
}

(the examples are greatly simplified from real life).
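
For context on how such a section and entity get defined in the first place: in Spark they are declared as plain data structures inside a DSL extension, rather than as hand-written macros. A minimal sketch (the module names MyLibrary.Attribute and MyLibrary.Dsl are made up for illustration):

defmodule MyLibrary.Attribute do
  # Target struct that each `attribute` entity is built into.
  defstruct [:name, :type, allow_nil?: true]
end

defmodule MyLibrary.Dsl do
  @attribute %Spark.Dsl.Entity{
    name: :attribute,
    target: MyLibrary.Attribute,
    args: [:name, :type],
    schema: [
      name: [type: :atom, required: true],
      type: [type: :atom, required: true],
      allow_nil?: [type: :boolean, default: true]
    ]
  }

  @attributes %Spark.Dsl.Section{
    name: :attributes,
    entities: [@attribute]
  }

  use Spark.Dsl.Extension, sections: [@attributes]
end

A front-end module then does (roughly) use Spark.Dsl, default_extensions: [extensions: [MyLibrary.Dsl]], and users of your library use that module to get the attributes do ... end syntax.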

So the DSL simply holds the description. The “actual functionality” of the DSL lives in “something else” (whatever you want) that consumes that description. You can leverage it at compile time to do code generation, or at runtime to interpret the data structure however you like.
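
As a concrete example of the runtime side, an “info” module can just read the stored entities back out. A sketch (Spark.Dsl.Extension.get_entities/2 is the generic accessor; the module name is made up):

defmodule MyLibrary.Info do
  # Returns the entity structs recorded under the :attributes section
  # of any module that used the DSL.
  def attributes(module) do
    Spark.Dsl.Extension.get_entities(module, [:attributes])
  end

  # Looks up a single attribute by name.
  def attribute(module, name) do
    module |> attributes() |> Enum.find(&(&1.name == name))
  end
end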

If you want things to happen “in the module”, then you can use transformers. For example:

defmodule YourTransformer do
  use Spark.Dsl.Transformer

  # Runs at compile time with the accumulated DSL state; any quoted code
  # passed to eval/3 is injected into the module that used the DSL.
  def transform(dsl_state) do
    {:ok,
     Spark.Dsl.Transformer.eval(dsl_state, [], quote do
       # ...your quoted code here
     end)}
  end
end
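
The transformer then needs to be registered on the extension so it runs at compile time. Roughly, reusing the @attributes section from the sketch above:

use Spark.Dsl.Extension,
  sections: [@attributes],
  transformers: [YourTransformer]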

If you’re looking to write something like a single macro that contains some kind of customized imperative/declarative/expression language, then Spark is unlikely to help you.

For example, if you’re looking to make something like Nx or Ecto’s Ecto.Query macros.


Thank you very much for responding to my questions. Yes, I want to write something like a single macro with multiple declarative configurations, so I’ll be looking into other solutions beyond Spark.