Why is defmodule line marked as executable on mix test --cover?

When I run mix test --cover in my Phoenix project, I get some files with poor coverage because they are tiny modules containing little more than a use Something, and some of them have the defmodule X do line marked as executable.

e.g.

Any idea why?


I’m guessing it has something to do with use adding functions that are not being used anywhere… But just wanted to make sure that’s it…

This really piqued my curiosity so I put together a little test, and it seems you're… half right? I'm not exactly known for coming up with the most accurate scenarios for these things, but it appears you just need to call something injected by the dependency module.

With this test:

defmodule CoverTest do
  use ExUnit.Case

  test "foobars the world" do
    assert Cover.hello() == "foobar"
  end
end

…and this dependency module:

defmodule Cover.Dep do
  defmacro __using__(_) do
    quote do
      def foo, do: "foo"
      def bar, do: "bar"
    end
  end
end

…this gets 50% coverage:

defmodule Cover do
  use Cover.Dep

  def hello do
    "foobar"
  end
end

…and this gets 100%:

defmodule Cover do
  use Cover.Dep

  def hello do
    foo() <> bar()
  end
end

…which I thought proved your theory right until I tried this:

defmodule Cover do
  use Cover.Dep

  def hello do
    foo() <> "bar"
  end
end

…which also scores 100% :face_with_spiral_eyes:

So yeah, I know your post is only 5 hours old but BUMP for anyone who knows! :slight_smile:

EDIT: Oh, perhaps it’s because the coverage tool doesn’t consider string literals executable, so calling anything injected by the module is enough to mark the line as covered?


It scores 100% because foo and bar are defined by Cover.Dep, so calling any function it injects marks the defmodule line as covered. The defs inside the quote aren’t marked as executable by the coverage tool at all.
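For what it’s worth, quote has a location: :keep option that annotates the generated code with the macro’s own file and line instead of the call site’s, which may shift where cover attributes the injected defs. A minimal sketch (whether it changes the report can depend on the cover tool version):

```elixir
defmodule Cover.Dep do
  defmacro __using__(_) do
    # location: :keep makes the generated defs carry this file/line,
    # rather than being attributed to the caller's defmodule line
    quote location: :keep do
      def foo, do: "foo"
      def bar, do: "bar"
    end
  end
end

defmodule Cover do
  use Cover.Dep

  def hello do
    foo() <> bar()
  end
end
```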

Anyway, thanks for the investigation! That’s definitely why I posted here, because I knew there would be someone as curious as I am - but not as lazy as I was - who would test it :smile:


I feel so used :sweat_smile:

Right, that makes sense, thanks!


Sorry to resurrect this thread. Is the conclusion that we don’t know what is causing this or how to fix it? I’m running into this same thing. And 98.33% coverage on a file just doesn’t release those endorphins like it should.

You could fix it by improving the Erlang cover tool, although it might also be time to take a few steps back and meditate. Code coverage is a poor metric to begin with, especially if the goal is to get to 100%.

It’s great for getting an idea of whether there is something you have missed testing, or even dead code. But that is all it is. I really wish the report were inverted instead of fueling an addiction, which I’m 100% guilty of.
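That said, if a tiny use-only module really shouldn’t count, recent Elixir versions let you exclude it from the report via the test_coverage option in mix.exs (app and module names here are hypothetical):

```elixir
# mix.exs
def project do
  [
    app: :my_app,
    # ...
    # ignore_modules excludes the listed modules from the coverage report
    test_coverage: [ignore_modules: [MyApp.TinyModule]]
  ]
end
```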


I’m only 98.33% guilty of this, but otherwise agree!
