StreamState - stateful testing implemented on top of StreamData

There is a long-running issue on the stream_data library about implementing such functionality. In the meantime I have created this small library (still a WIP), using @alfert's counter statem implementation as a base.

It implements an API similar to eqc_statem; however, there are still a lot of unimplemented hooks. For now the supported hooks are listed below (a rough usage sketch follows the list):

  • */n
  • *_args/1
  • *_command/1
  • *_pre/2
  • *_post/3
  • *_next/3
  • weight/2
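
To give a rough picture of how these hooks fit together, here is a sketch of a counter model in the grouped eqc_statem style (the same shape as @alfert's counter example). Only the hook names and arities come from the list above; everything else, including how the module is wired into StreamState and the use of StreamData.constant/1 as the argument generator, is my assumption rather than documented API:

```elixir
# System under test: a minimal Agent-backed counter, just to make the
# sketch self-contained. It is not part of StreamState.
defmodule Counter do
  use Agent

  def start_link(_opts \\ []), do: Agent.start_link(fn -> 0 end, name: __MODULE__)
  def increment, do: Agent.get_and_update(__MODULE__, fn n -> {n + 1, n + 1} end)
end

defmodule CounterModel do
  # Only the hook names/arities below come from the list above; how the module
  # is registered with StreamState (a `use` line, generate_commands/run_commands)
  # and the exact return values it expects are guesses on my part.

  # conventional eqc_statem-style starting model state (not in the hook list)
  def initial_state, do: 0

  # */n - the command itself, executed against the system under test
  def increment, do: Counter.increment()

  # *_args/1 - generator for the command's argument list (none here)
  def increment_args(_state), do: StreamData.constant([])

  # *_pre/2 - precondition: may this command run in the current model state?
  def increment_pre(_state, _args), do: true

  # *_post/3 - postcondition: compare the real result with the model
  def increment_post(state, _args, result), do: result == state + 1

  # *_next/3 - model state transition after the command has run
  def increment_next(state, _args, _result), do: state + 1

  # weight/2 - relative frequency of this command in generated sequences
  def weight(_state, :increment), do: 1
end
```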

Mind documenting this library a little further? I looked at your example but couldn’t quite grok it, e.g. what generate_commands does exactly, and how run_commands works under the hood. I looked at the code, but it seemed I would have to read the entire file to understand it, which at the time I didn’t do.

Are you willing to provide a few more examples in the repo and attach detailed explanations to them?

Yes, I was working on it. I should release a version with updated documentation today.

After fighting a lot with PropCheck last week, I went and checked the long-standing stream_data issue. Naturally it led me to stream_state.

I greatly prefer the ergonomics and documentation of stream_data and would love to use stream_state for some models. My assumption is that the hooks behave similarly to PropEr’s, but there are still ambiguities such as whether it uses symbolic variables.

Thanks for releasing it, I’ll be watching the progress.


What issues were you having with PropCheck? Personally I think there’s some room for making PropCheck’s API a bit more friendly, and I’d love to see more effort go in that direction since the internals are already very solid.

My biggest issue is the license: in some projects I am not allowed to use it (legal does not allow GPL code), and in others it is a PITA to have to dual-license the project or use a different license for the tests than for the code.


Funny you ask: in addition to Fred’s book, I learned how to make stateful models from your Redis model example =)

My primary issues are:

  1. While I’m working out the generators, it stores the failing examples in a file and immediately runs them when I start the next pass. That is a great workflow when I’m fixing failure cases, but not when the issues are coming from my generators and model code.
  2. The built-in generators are very limited in comparison to stream_data’s, so I have to build generators up from lower-level primitives. If I want to test ASCII strings with a minimum and maximum length I have a lot of work to do, whereas with stream_data I can just use string/2 (see the sketch after this list).
  3. Assertion errors aren’t reported as well, which could just be user error on my part.
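
To make the second point concrete, here is roughly what the ASCII-string case looks like on each side. The stream_data call is its actual string/2 API; the PropCheck version is just one way I would piece it together from lower-level primitives (ascii_string/2 is a made-up helper name), so treat it as a sketch rather than the canonical approach:

```elixir
# stream_data: bounded ASCII strings are built in; generators are enumerable,
# so we can peek at a few sample values directly.
StreamData.string(:ascii, min_length: 3, max_length: 16)
|> Enum.take(3)
|> IO.inspect()

# PropCheck: roughly the same generator assembled from primitives.
# ascii_string/2 is a hypothetical helper, not part of PropCheck itself.
defmodule MyGenerators do
  use PropCheck

  def ascii_string(min, max) do
    # pick a length, then a charlist of that length, then turn it into a binary
    let len <- integer(min, max) do
      let chars <- vector(len, range(?\s, ?~)) do
        List.to_string(chars)
      end
    end
  end
end
```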

I’ve definitely felt some of these same pain points.

I’m sure that a PR to add some more generators would be welcome. I end up needing the same kinds of things in all of my projects with PropEr and haven’t taken the time to open PRs for them. I find the assertions for stateless tests in PropEr to be fine. I think in theory the stream_data reporting is nicer, but I have so much trouble getting stream_data to shrink failing cases to a reasonable size that it’s hard to make a real comparison. The reporting for PropEr’s stateful tests could certainly use some love, though, just to make it more readable. That would probably go a long way towards making them more approachable.

Yeah, this tends to come up. We got permission to use it at work since we’re never shipping the test code to our servers. The same is true for my open-source stuff. But I’m not a lawyer, and I get that GPL tends to scare people.


It is more than that. GPL is very opinionated, and there is no consensus on how to treat some of its clauses. Besides, I prefer to give freedom to developers, so I lean toward more permissive licenses; I am more of an OSI/BSD guy than an FSF/GNU one. So for me this is also a slightly ideological problem.


Fair enough.
