Monitoring NVR ports

The purpose of this question is to ask how you would handle this and what the better options are for solving my port-monitoring problem.

I am building an application for myself which will monitor whether an NVR's port is open or closed. Currently we have 100 NVRs (Hikvision). The application is built entirely in Phoenix, front end and back end.

We are going to monitor at a frequency of once per minute for now, so there will be approximately 100 INSERT queries to the DB each minute. I am using PostgreSQL, and for checking whether the port is open I am using:

  # Attempts a TCP connection with a 1-second timeout; true means the port accepted it.
  def port_open?(address, port) when is_binary(address) and is_integer(port) do
    case :gen_tcp.connect(to_charlist(address), port, [:binary, active: false], 1_000) do
      {:ok, socket} ->
        :gen_tcp.close(socket)
        true

      {:error, _error} ->
        false
    end
  end

As you can see, the method above is as simple as it gets, but in the {:ok, socket} case is there any information we can get from the socket, such as how long the connection took, or anything about the server whose port we are testing?

Right now my plan is to have fields in the table such as:

nvr_id :integer
status :boolean
done_at :DateTime
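
For reference, the Ecto migration I have in mind for that table looks roughly like this (just a sketch; the table name and the index are my own guesses, nothing final):

  defmodule MyApp.Repo.Migrations.CreateNvrPortChecks do
    use Ecto.Migration

    def change do
      create table(:nvr_port_checks) do
        add :nvr_id, :integer
        add :status, :boolean
        add :done_at, :utc_datetime
      end

      # Index on nvr_id + done_at so per-NVR history queries stay fast as rows pile up
      create index(:nvr_port_checks, [:nvr_id, :done_at])
    end
  end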

My question is: as the data in the DB grows, will we run into query timeouts or other issues with PostgreSQL? In the past, when we had some millions of records in the DB, the subsequent INSERT queries never worked entirely fine; most of it was SSL/TLS issues or the occasional timeout. Should I use the same DB, separate it from the actual project DB, or switch to Mongo or RethinkDB, considering the amount of data and the INSERT queries every minute?

Also, would it be suitable to create a cron job that runs every minute for every NVR and does the whole process? I am trying to learn GenServer and OTP, so what path should I take to make use of Elixir's concurrency? A rough sketch of what I was imagining is below.
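
To make the question concrete, this is roughly the GenServer-based polling I had in mind instead of a cron job (only a sketch built around the port_open?/2 above; MyApp.Checks.record/3 is a placeholder for my own context function):

  defmodule MyApp.NvrMonitor do
    use GenServer

    @interval :timer.minutes(1)

    # One of these processes would be started per NVR, e.g. under a Supervisor
    def start_link(nvr) do
      GenServer.start_link(__MODULE__, nvr)
    end

    @impl true
    def init(nvr) do
      schedule_check()
      {:ok, nvr}
    end

    @impl true
    def handle_info(:check, nvr) do
      status = port_open?(nvr.address, nvr.port)
      # Placeholder for the INSERT into PostgreSQL (e.g. via an Ecto context)
      MyApp.Checks.record(nvr.id, status, DateTime.utc_now())
      schedule_check()
      {:noreply, nvr}
    end

    defp schedule_check do
      Process.send_after(self(), :check, @interval)
    end

    # Same check as port_open?/2 above
    defp port_open?(address, port) do
      case :gen_tcp.connect(to_charlist(address), port, [:binary, active: false], 1_000) do
        {:ok, socket} ->
          :gen_tcp.close(socket)
          true

        {:error, _} ->
          false
      end
    end
  end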

You can capture the time yourself; I often use a little helper function like this:

  # Runs a zero-arity function and returns how long it took, in seconds
  def time(function) do
    :timer.tc(function)
    |> elem(0)
    |> Kernel./(1_000_000)
  end

And then pass what I want timed as a function. You could also do it with a simple "fetch the current time, subtract it from the time at that later point" bit of code. But AFAIK there is no information offered on the actual time of the socket connection, and these in-Elixir methods will only be approximate and include the runtime of your Elixir code as well.
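
For example, wrapping your check in it (the address and port here are made up):

  # Seconds the whole check took, including the Elixir overhead around the connect
  seconds = time(fn -> port_open?("192.168.1.10", 8000) end)

Note that time/1 as written throws away the result of the function it runs, so you would want to adapt it if you need both the status and the duration.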

As for getting some information on the socket itself, there is :inet.getstat/1, but that is fairly limited. If you want to do a ping after the connect to see how the network is behaving, you could always use procket, which includes a ping example.
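
A minimal sketch of what that could look like inside your connect code (the counters are per-socket, so for a connection you immediately close they will mostly be zeros):

  case :gen_tcp.connect(to_charlist(address), port, [:binary, active: false], 1_000) do
    {:ok, socket} ->
      # :inet.getstat/1 returns counters such as recv_cnt, recv_oct, send_cnt, send_oct
      {:ok, stats} = :inet.getstat(socket)
      :gen_tcp.close(socket)
      {true, stats}

    {:error, _} ->
      {false, []}
  end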

Inserting new rows into a table with millions of rows really shouldn’t be an issue for pgsql unless you are running it on seriously underwhelming hardware. But it is really hard to say anything about this, or what alternatives would make sense, without seeing your table definitions and SQL in question.

That said, I would tend to keep this in a separate DB of its own, just so that it is neatly movable later as needed, since its requirements (due to data size) may end up being different from those of the rest of your data.
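
With Ecto that can be as simple as a second repo in the same application, pointed at its own database (a sketch; :my_app and the module name are placeholders):

  defmodule MyApp.MonitoringRepo do
    use Ecto.Repo,
      otp_app: :my_app,
      adapter: Ecto.Adapters.Postgres
  end

  # In config/config.exs, give it its own database:
  #
  #   config :my_app, MyApp.MonitoringRepo,
  #     database: "my_app_monitoring",
  #     hostname: "localhost"

You would also add it to your supervision tree next to your main repo, so it can be moved to its own server later without touching the rest of the app.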

You may also want to look at something like Riak TS (“TS” stands for time series) if you really want high-speed timestamped data. Takes a bit to set up, but it is quite fast and very solid. Still requires a reasonable amount of server resources.

RethinkDB is inappropriate for your needs from what I know of that project (it's more about live queries than high-volume time series data), and I wouldn't touch MongoDB for any project I actually cared about.
