A new update: https://twitter.com/josevalim/status/1333450544409612289?s=19
Sounds more and more like something like ML
Judging by the Erlang ML (mailing list), yes, it seems like ML (machine learning). A TensorFlow NIF, to be exact.
GPU hardware acceleration?
I have no idea, but it would be so great to have access to the GPU in Elixir to help with heavy computation.
Doubtful it’s a TensorFlow NIF. If you use Erlang binaries with TensorFlow, you could be in trouble, if my understanding of how the VM works is correct (it may be wrong).
That said, I am pretty sure it’s some sort of (optionally) GPU-accelerated linear algebra library.
And I think (hope) we’re getting closer to some more insights: https://twitter.com/josevalim/status/1336315014982979584
At least we can try to push @josevalim a little to give us more information
Is anyone able to see any clues in the image from the tweet?
Maybe it’s a 3D image generated by Elixir?
well that’s certainly a |>
To me the x looks like an upside-down lambda stacked on top of a right-side-up lambda.
Is it automatic differentiation in Elixir?
Guys, only 5 days left to solve the mystery
I should mention that we are RECORDING the podcast Friday morning but it won’t be released immediately. I have to edit it first.
It will probably be released 2/9.
But I’m excited for it!
Love your podcast, keep up the good work! The news section at the start is something I always look forward to listening to.
Rx extensions with OTP characteristics.
The mascot looks close enough to the ReactiveX thingy
Whatever it is, it will probably be defined with defn
and defnp
Do you have a link? I looked on the master and 1.11 branches but nothing came up.
I don’t mind listening to the unedited version