Anonymous functions with Spark DSL?

Hi All.

This isn’t really an Ash question but more of a Spark question, so I hope it’s OK to post here.

I’m writing yet another admin-type library. The library uses a map to describe various entities, like input fields or columns in a table (index view). Below is a simplified example:

def table() do
  %{
    columns: [
      %{
        name: :first_name,
        icon: %{
          name: "hero-envelope",
          position: :after,
          # a function plus the list of argument names to call it with
          colour: [&get_icon_colour/2, [:record, :some_other_arg]]
        }
      },
      %{
        name: :last_name,
        icon: [&get_icon/1, [:record]],
        colour: "yellow"
      },
      %{
        name: :age,
        icon: "age-icon"
      }
    ]
  }
end

The nice thing about this approach is that a key’s value can be a string, a map, or even a function (see the icon definitions for the three columns above).
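Roughly, a value like that could be resolved at render time with something like the following sketch (the module name, resolve_value/2 and the assigns map are just illustrative names, not part of the library):

defmodule MyAdmin.Resolver do
  # Illustrative resolver: a column attribute may be a plain value,
  # a nested map, or a [function, argument_names] pair.
  def resolve_value([fun, arg_names], assigns) when is_function(fun) and is_list(arg_names) do
    # Look each named argument up in the assigns (e.g. :record) and apply the function.
    apply(fun, Enum.map(arg_names, &Map.fetch!(assigns, &1)))
  end

  def resolve_value(%{} = map, assigns) do
    # Recurse into nested maps such as the icon definition.
    Map.new(map, fn {k, v} -> {k, resolve_value(v, assigns)} end)
  end

  # Plain values (strings, atoms) pass through unchanged.
  def resolve_value(value, _assigns), do: value
end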

However, I really like the Spark DSL approach, so before I dive into Spark: is the above possible? I was thinking of something like this:

columns do
  column :first_name do
    icon do
      name "hero-envelope"
      position :after

      colour do
        function &get_icon_colour/2
        args [:record, :some_other_arg]
      end
    end
  end

  column :last_name do
    icon do
      function &get_icon/1
      args [:record]
    end

    colour "yellow"
  end

  column :age do
    icon "age-icon"
  end
end

cheers

Dave

:wave: hello! I’ve moved this to general questions, just to keep things well organized.

With that said: everything you’ve described above is possible with Spark, including the anonymous functions :slight_smile:
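For anyone finding this thread later, here is a rough sketch of what such an extension could look like, built from Spark’s entity/section structs. The module names (MyAdmin.Dsl, MyAdmin.Dsl.Column) are made up, and the schema types shown are just one way to accept either a literal value or a function:

defmodule MyAdmin.Dsl.Column do
  # Target struct that each `column` entity is built into.
  defstruct [:name, :icon, :colour]
end

defmodule MyAdmin.Dsl do
  # A `column` entity: takes the name as a positional argument and
  # accepts either a plain string or a function for icon/colour.
  @column %Spark.Dsl.Entity{
    name: :column,
    target: MyAdmin.Dsl.Column,
    args: [:name],
    schema: [
      name: [type: :atom, required: true],
      icon: [type: {:or, [:string, {:fun, 1}]}],
      colour: [type: {:or, [:string, {:fun, 1}]}]
    ]
  }

  # The top-level `columns do ... end` section containing the column entities.
  @columns %Spark.Dsl.Section{
    name: :columns,
    entities: [@column]
  }

  use Spark.Dsl.Extension, sections: [@columns]
end

A consuming module would then do something along the lines of use Spark.Dsl, default_extensions: [extensions: [MyAdmin.Dsl]] to get the columns do ... end syntax, and the nested icon do ... end / colour do ... end blocks would be modelled as nested entities in the same way.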

Great stuff, many thanks @zachdaniel
