Filter JSON by date

Is there a way to show only the results within a specific date range?
For example, 2020-12-01 to 2020-12-31?
Or show only two days of data?

Below are the results.

Enum.map(json_atoms.messages, &%{
  ts: DateTime.from_unix!(String.to_integer(String.slice(&1[:ts], 0..9))),
  text: "#{&1[:attachments]}"
})

Result =

[
  %{
    text: "test1 ",
    ts: "1609403162.123900"
  },
  %{
    text: "test2",
    ts: "1609398007.123800"
  },
  %{
    text: "test3",
    ts: "1609379878.123700"
  },
  %{
    text: "test4",
    ts: "1609375674.123600"
  },
  %{
    text: "test5",
    ts: "1609370666.123500"
  },
  %{
    text: "test6",
    ts: "1609358560.123400"
  },
  %{
    text: "test7",
    ts: "1609348573.123300"
  }
]

Hi @gilbertosj, what have you tried so far?

Enum.filter/2 and Enum.reject/2 can be used to filter out the items you do not want.

You might need to convert your boundary dates into the timestamp format of your data first, or the other way around.
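
For example, a rough sketch, assuming the result list from your post is bound to messages and the timestamps are Unix seconds in UTC:

from = ~U[2020-12-01 00:00:00Z]
to = ~U[2021-01-01 00:00:00Z]

# keep only the messages whose timestamp falls between 2020-12-01 and 2020-12-31 (UTC)
Enum.filter(messages, fn %{ts: ts} ->
  {seconds, _rest} = Integer.parse(ts)
  datetime = DateTime.from_unix!(seconds)
  DateTime.compare(datetime, from) != :lt and DateTime.compare(datetime, to) == :lt
end)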

I tried to do this with
Enum.sort_by(&(&1.ts), {:asc, Date})

I tried with
|> Date.diff(&1[:ts], current_date)

But without success.

Your issue is that your ts key is not a datetime; it’s a string representing a float, which I’d guess is a Unix timestamp in seconds. You need to parse it into a DateTime, and then you can sort it as a datetime.
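
For example (a rough sketch, assuming your result list is bound to messages):

Enum.sort_by(messages, fn %{ts: ts} ->
  # take the integer seconds before the dot and turn them into a DateTime
  {seconds, _rest} = Integer.parse(ts)
  DateTime.from_unix!(seconds)
end, DateTime)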

You could also simply parse it into a float and sort it by the float value.
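
Something like this (String.to_float/1 works here because every ts string contains a decimal point):

Enum.sort_by(messages, &String.to_float(&1.ts))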


I see. I still haven’t been able to do it that way,
but I will keep trying.

@gilbertosj Here is an example:

defmodule Example do
  # pattern match on a map containing the :messages atom key
  def sample(%{messages: messages}, from, to)
      # check that the messages value is a list and that from and to are NaiveDateTime structs
      when is_list(messages) and is_struct(from, NaiveDateTime) and is_struct(to, NaiveDateTime) do
    # a simple comprehension that pattern matches every message,
    # filters by timestamp, and maps each match (in the do … end block)
    for %{attachments: text, ts: ts} <- messages, filter_ts(ts, from, to) do
      %{text: text, ts: ts}
    end
  end

  defp filter_ts(ts, from, to) do
    # with lets us focus on the happy path without handling every error individually
    # first, extract the integer (seconds) part from the string containing a float
    with {integer, _rest} <- Integer.parse(ts),
         # next, try to convert it to a DateTime, assuming the integer is a Unix timestamp in seconds
         {:ok, datetime} <- DateTime.from_unix(integer),
         # convert the DateTime to a NaiveDateTime - skip this if you also want to work with time zones
         ndt = DateTime.to_naive(datetime) do
      # compare our result with from and to i.e. from < ndt < to
      # left-inclusive
      # NaiveDateTime.compare(from, ndt) in [:eq, :lt] and NaiveDateTime.compare(ndt, to) == :lt
      # right-inclusive
      # NaiveDateTime.compare(from, ndt) == :lt and NaiveDateTime.compare(ndt, to) in [:eq, :lt]
      # not inclusive
      NaiveDateTime.compare(from, ndt) == :lt and NaiveDateTime.compare(ndt, to) == :lt
    else
      # if any clause written with <- fails, it is handled here
      # important: it does not catch raised exceptions
      # that's why we use DateTime.from_unix/1 rather than DateTime.from_unix!/1
      # false here means that we do not want any message with invalid ts
      _ -> false
    end
  end
end

# an example map based on the original post
json_atoms = %{
  messages: [
    %{
      attachments: "test1 ",
      ts: "1609403162.123900"
    },
    %{
      attachments: "test2",
      ts: "1609398007.123800"
    },
    %{
      attachments: "test3",
      ts: "1609379878.123700"
    },
    %{
      attachments: "test4",
      ts: "1609375674.123600"
    },
    %{
      attachments: "test5",
      ts: "1609370666.123500"
    },
    %{
      attachments: "test6",
      ts: "1609358560.123400"
    },
    %{
      attachments: "test7",
      ts: "1609348573.123300"
    }
  ]
}

# since the provided ts values are for 30th and 31st December 2020,
# I would filter x where: 31st December 2020 < x < 1st January 2021
from = ~N"2020-12-31 00:00:00"
to = ~N"2021-01-01 00:00:00"
Example.sample(json_atoms, from, to)
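
With those bounds, the call should return only the first four messages (test1 to test4), since their timestamps fall on 31st December 2020 UTC; the other three fall on 30th December and are filtered out.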

That’s right, thank you very much.