Over the last several months, I’ve been using Tableau to rebuild my long-neglected website. Tableau is part of the Elixir Tools project headed by @mhanberg and is built on top of MDEx by @leandrocp. Along the way I built ten libraries and Elixir extensions, most of them with the assistance of Kiro; Prosody is one of them.
Prosody is a simple content-analysis library that attempts to measure the reading effort and cognitive load of mixed text and code content, such as a technical blog. It provides a Tableau extension, Prosody.Tableau, which returns the reading time for code blocks.
It implements text processing in three stages:
- `parse` — content is parsed with awareness of its content type (extensible, just like Tableau itself) and converted into typed blocks (`text` or `code`, initially)
- `analyze` — runs over the blocks to count words, with cognitive load adjustments applied to code blocks
- `summarize` — runs over the analysis results to aggregate them into reading time and metrics
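To make the three stages concrete, here is a minimal sketch of the same pipeline shape. Everything here is an assumption for illustration — the block format, the code weighting, and the 200 wpm reading speed are my own stand-ins, not Prosody's actual API or defaults:

```elixir
defmodule PipelineSketch do
  # Illustrative only: block shapes, the code weight, and the 200 wpm
  # reading speed are assumptions, not Prosody's API.
  @fence String.duplicate("`", 3)

  # parse: split content into {:text, _} and {:code, _} blocks
  def parse(content) do
    content
    |> String.split(~r/`{3}.*?`{3}/s, include_captures: true, trim: true)
    |> Enum.map(fn chunk ->
      if String.starts_with?(chunk, @fence) do
        # drop the fence markers, keep the code itself
        {:code, String.replace(chunk, ~r/`{3}[a-z]*/, "")}
      else
        {:text, chunk}
      end
    end)
  end

  # analyze: count words per block, weighting code blocks more heavily
  def analyze(blocks, code_weight \\ 2.0) do
    Enum.map(blocks, fn
      {:text, s} -> {:text, s |> String.split() |> length()}
      {:code, s} -> {:code, round(code_weight * length(String.split(s)))}
    end)
  end

  # summarize: aggregate word counts into totals and a reading time
  def summarize(analyzed, wpm \\ 200) do
    words = analyzed |> Enum.map(&elem(&1, 1)) |> Enum.sum()
    %{words: words, reading_time: max(1, ceil(words / wpm))}
  end
end

fence = String.duplicate("`", 3)
content = "Hello there\n#{fence}\nIO.puts(1)\n#{fence}\nBye"

content
|> PipelineSketch.parse()
|> PipelineSketch.analyze()
|> PipelineSketch.summarize()
# words: 5 (2 text + weighted 2 code + 1 text), reading_time: 1
```

Keeping each stage a plain function over plain data is what makes the real pipeline extensible: a new parser or an extra analysis step slots in without touching the others.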
The current analysis is limited to word counting, but other analyses could be performed in the future (such as calculating content complexity with a metric like the Flesch-Kincaid readability score), and steps could be added to merge blocks prior to analysis.
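The Flesch-Kincaid grade level, for example, is a straightforward function of word, sentence, and syllable counts, so a future analysis step could compute it from counts the parser already produces. The module below is a hypothetical sketch, not part of Prosody:

```elixir
defmodule ReadabilitySketch do
  # Hypothetical future step. The standard Flesch-Kincaid grade level is:
  #   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
  def fk_grade(words, sentences, syllables) do
    0.39 * words / sentences + 11.8 * syllables / words - 15.59
  end
end

# 100 words in 5 sentences with 140 syllables comes out around grade 8.7
ReadabilitySketch.fk_grade(100, 5, 140)
```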
Counting words is more complex than one might imagine: the text `two words and/or fast-paced 1,234.56 www.example.com` produces 6, 8, or 12 words depending on the algorithm used. The default “balanced” algorithm produces 8, but this is configurable.
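The divergence comes down to how aggressively tokens are split. The three rules below reproduce the 6/8/12 counts for that text, but they are my own illustrative reconstructions, not Prosody's actual algorithms:

```elixir
# Illustrative splitting rules only -- Prosody's algorithms may differ.
text = "two words and/or fast-paced 1,234.56 www.example.com"

# Whitespace only: compounds, numbers, and URLs count once each -> 6
simple = text |> String.split() |> length()

# "Balanced"-style: additionally split tokens on "/" and "-" -> 8
balanced =
  text
  |> String.split()
  |> Enum.flat_map(&String.split(&1, ~r{[/-]}, trim: true))
  |> length()

# Aggressive: split on every non-alphanumeric run,
# so 1,234.56 and www.example.com become three tokens each -> 12
aggressive =
  text
  |> String.split(~r/[^[:alnum:]]+/, trim: true)
  |> length()

{simple, balanced, aggressive}
# {6, 8, 12}
```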
Example
````elixir
content = """
# Hello World
This is some text.
```elixir
IO.puts("Hello")
```
"""

Prosody.analyze!(content, parser: :markdown)
````
This generates:
```elixir
%{
  code: %{words: 10, lines: 1},
  text: %{words: 6},
  metadata: %{},
  words: 16,
  reading_time: 1
}
```