Best practices for testing a database?

Hello. I am writing my first server in Elixir, and I am using InfluxDB to store time-series data. So far I have enjoyed programming in Elixir, but I am struggling to test my system.

I have read this article: http://blog.plataformatec.com.br/2015/10/mocks-and-explicit-contracts/
and it helped me write mock modules for the HTTP services my server talks to, so I could test my GET/POST requests.
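For context, the technique from that article is a behaviour plus swappable implementations, looked up from the application environment. A minimal sketch (all `MyApp.*` module names, the `:my_app` key, and the example URL are placeholders, not from the original post):

```elixir
# Hypothetical behaviour that both the real client and the test mock implement.
defmodule MyApp.HTTPClient do
  @callback get(url :: String.t()) :: {:ok, map()} | {:error, term()}
end

# In-memory implementation used in tests instead of real HTTP calls.
defmodule MyApp.HTTPClient.InMemory do
  @behaviour MyApp.HTTPClient

  @impl true
  def get("https://example.com/ok"), do: {:ok, %{status: 200, body: "hello"}}
  def get(_url), do: {:error, :not_found}
end

# The caller resolves the implementation from the application environment,
# so config/test.exs can swap in the in-memory module.
defmodule MyApp.Fetcher do
  def fetch(url) do
    client = Application.get_env(:my_app, :http_client, MyApp.HTTPClient.InMemory)
    client.get(url)
  end
end

Application.put_env(:my_app, :http_client, MyApp.HTTPClient.InMemory)
{:ok, %{status: 200}} = MyApp.Fetcher.fetch("https://example.com/ok")
```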

But I am struggling to see how I can simulate my database with a mock module. If I go that route, I will also have to parse the SQL queries so that the mock supports whatever queries I may add in the future (and so that the mock is reusable by anybody else who wants to test the same way).

The other, simpler and less time-consuming way is to have the DB running in the background while the tests run, with a separate database created just for test purposes. But is this bad test design?
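The "separate database per Mix environment" part of that approach is usually just a config file. A sketch of what `config/test.exs` might look like (the `:my_app` key, the `MyApp.InfluxConnection` module, and the option names are hypothetical; the exact keys depend on which InfluxDB client library you use):

```elixir
# config/test.exs -- loaded only when MIX_ENV=test
import Config

config :my_app, MyApp.InfluxConnection,
  host: "localhost",
  port: 8086,
  # A separate database used only by the test suite, so tests
  # never touch development or production data.
  database: "my_app_test"
```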

What are the best practices for testing database queries in Elixir? Can somebody point me to an article, GitHub project, etc. that I can refer to and learn from?

Thank you.

This is very normal, and it is also how people test when using Postgres as the database.
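For the Postgres case specifically, Ecto ships with tooling for exactly this style: each test checks out a connection whose work is wrapped in a transaction and rolled back when the test ends. A minimal sketch (the `Ecto.Adapters.SQL.Sandbox` calls are Ecto's real API; `MyApp.Repo` and the test module name are placeholders):

```elixir
# test/test_helper.exs
ExUnit.start()
Ecto.Adapters.SQL.Sandbox.mode(MyApp.Repo, :manual)

# In a test module: check out a sandboxed connection per test.
# Anything written during the test is rolled back afterwards,
# so tests stay isolated even against a real database.
defmodule MyApp.SomeTest do
  use ExUnit.Case

  setup do
    :ok = Ecto.Adapters.SQL.Sandbox.checkout(MyApp.Repo)
  end
end
```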


Yes, I see that some other projects take this approach as well. I will do the same. Thanks!

I do not mean to hijack this topic, but has anybody tried to incorporate pgTAP into their workflow (in Elixir projects) and reached any meaningful conclusions, successes or failures?

Personally, I like setting up a Docker environment.

An example from a pet project:

docker-compose.yml

version: '3.3'

services:

  app:
    build:
      context: .
      dockerfile: Dockerfile
    working_dir: /app
    command: 'mix phx.server'
    ports:
      - '4000:4000'
    volumes:
      - '~/.bash-history:/.bash-history'
      - '.:/app:cached'
      - '/tmp:/tmp:delegated'
      - '.bashrc:/root/.bashrc:cached'
      - '/var/run/docker.sock:/var/run/docker.sock:cached'
      - '/var/tmp:/var/tmp:cached'
    depends_on:
      - db

  db:
    image: postgres:11-alpine
    volumes:
      - 'pgdata:/var/lib/postgresql/data'

volumes:
  pgdata:

Makefile

start:
	docker-compose up -d

stop:
	docker-compose down

build:
	docker-compose build app

shell:
	docker-compose run --rm app bash

install:
	docker-compose run --rm app mix deps.get

install-assets:
	docker-compose run --rm app bash -c "cd assets && npm install && node node_modules/webpack/bin/webpack.js --mode development"

compile:
	docker-compose run --rm app bash -c "mix do compile, phx.digest"

db-setup:
	docker-compose run --rm app mix ecto.setup

db-reset:
	docker-compose run --rm app mix ecto.reset

start-interactive:
	docker-compose run --rm --service-ports app iex -S mix phx.server

test:
	docker-compose run --rm app mix test

setup: build install install-assets compile db-setup

.PHONY: test

The difficulty lies in the details.

If you could assert correctness purely on the query itself, a mock would work, but at least for SQL this is really hard: even a simple change in the order in which a query is built up is likely to make a comparison against the expected query fail, while still producing exactly the same results on an actual database. Such false failures defeat the purpose of a test suite that lets you refactor safely; avoiding them would mean essentially reimplementing the database's query engine. This is different from a mocked HTTP API (the subject of the blog post), where the inputs you assert against are far less complex than SQL.
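To illustrate the ordering problem described above: two queries that are logically equivalent compare as unequal strings, so a mock that asserts on raw query text breaks under a harmless refactor (the queries below are made up for the example):

```elixir
# Logically equivalent WHERE clauses, built up in a different order.
q1 = "SELECT value FROM cpu WHERE host = 'a' AND region = 'eu'"
q2 = "SELECT value FROM cpu WHERE region = 'eu' AND host = 'a'"

# A real database returns the same rows for both; a mock comparing
# query strings sees two different queries and fails the test.
IO.inspect(q1 == q2)  # false
```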
