
PassiveDNS update (v0.2.4)

It has been a while since I've had time to code on my C projects, but last week I got some time and used it to get PassiveDNS into a state where I'm more relaxed about it. The previous version (v0.1.1) used to spit out all the DNS data it saw. The latest version caches DNS data internally in memory and only prints out a DNS record when it sees it for the first time, or, if it is an active domain, prints it out again after 24 hours and so on (once a day). The previous version would give me gigabytes of DNS data daily in my test setup, while this version gives me about 2 megabytes. This version also only gives you A, AAAA, PTR and CNAME records at the moment. I'm open to suggestions for more (use cases would be great too!).
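To make the idea concrete, here is a minimal sketch of the "log on first sighting, then at most once every 24 hours" logic. It is not the actual PassiveDNS code; the names (pdns_record, observe) and the toy linked-list cache and log format are made up for illustration.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define EXPIRE_SECONDS (24 * 60 * 60)   /* re-log an active record once a day */

typedef struct pdns_record {
    char   domain[256];
    char   answer[256];
    time_t last_printed;
    struct pdns_record *next;
} pdns_record;

static pdns_record *cache = NULL;       /* real code would use a hash table */

/* Called for every DNS answer seen on the wire: log it the first time,
 * then at most once every 24 hours while the domain stays active. */
static void observe(const char *domain, const char *answer, time_t now)
{
    pdns_record *r;

    for (r = cache; r != NULL; r = r->next) {
        if (strcmp(r->domain, domain) == 0 && strcmp(r->answer, answer) == 0) {
            if (now - r->last_printed >= EXPIRE_SECONDS) {
                printf("%ld||%s||%s\n", (long)now, domain, answer);
                r->last_printed = now;
            }
            return;                     /* seen recently: stay quiet */
        }
    }

    /* First sighting: log it and add it to the cache. */
    printf("%ld||%s||%s\n", (long)now, domain, answer);
    r = calloc(1, sizeof(*r));
    if (r == NULL)
        return;
    strncpy(r->domain, domain, sizeof(r->domain) - 1);
    strncpy(r->answer, answer, sizeof(r->answer) - 1);
    r->last_printed = now;
    r->next = cache;
    cache = r;
}

int main(void)
{
    time_t now = time(NULL);
    observe("example.com", "93.184.216.34", now);          /* printed */
    observe("example.com", "93.184.216.34", now + 60);     /* suppressed */
    observe("example.com", "93.184.216.34", now + 90000);  /* printed again */
    return 0;
}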

In my tests and in feedback from people who have tried it, PassiveDNS is very resource friendly when it comes to CPU usage (more or less idling). The current version (v0.2.4) does not implement any limit on memory usage, so if your network sees a lot of DNS traffic, you might end up using a few hundred megabytes of RAM for the internal cache. The most I've seen is around 100 MB at the moment. My plan is to implement some sort of "soft limit" on memory usage, so that you can specify the maximum amount of memory PassiveDNS should use. The "downside" of this, though, is that PassiveDNS would have to expire domains from its cache faster, which might result in bigger log files with duplicate entries. When I say "downside", it is not a real downside as I see it. From my tests with the example scripts pdns2db.pl and search-pdns.pl, keeping up with insertions into the DB (MySQL) is not much of a problem, and your last-seen timestamp will be a bit more accurate. I guess this kind of data is better suited for a NoSQL solution, though, if you are collecting lots of it.
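One way such a soft limit could work (just a sketch of the idea, building on the toy cache above, not a committed design) is to keep a running estimate of the cache size and expire entries earlier and earlier until you are back under the limit:

/* cache_bytes would be updated wherever records are allocated and freed;
 * cache_limit would come from a command line option. Both names and the
 * default value are made up for illustration. */
static size_t cache_bytes = 0;
static size_t cache_limit = 64 * 1024 * 1024;

static void enforce_soft_limit(time_t now)
{
    time_t max_age = EXPIRE_SECONDS;

    /* Halve the allowed age until we are back under the limit. Expiring
     * entries sooner means more duplicate log lines, but also a more
     * accurate "last seen" timestamp once the logs reach the database. */
    while (cache_bytes > cache_limit && max_age > 60) {
        pdns_record **pp = &cache;
        while (*pp != NULL) {
            if (now - (*pp)->last_printed > max_age) {
                pdns_record *old = *pp;
                *pp = old->next;
                cache_bytes -= sizeof(*old);
                free(old);
            } else {
                pp = &(*pp)->next;
            }
        }
        max_age /= 2;
    }
}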

If you have read this, and you are into Network Security Monitoring, and you don't use passive DNS in your work, I recommend you Google it and read a bit about it.


3 thoughts on “PassiveDNS update (v0.2.4)”

  1. Jens-Harald Johansen says:

    I wrote my own passive DNS (pdnsd) based on output from http://www.enyo.de/fw/software/dnslogger/.
    When both programs are running they’re using less than 2MB of memory and, as you say, more or less idling.

    So far I’ve added support for A, AAAA, PTR, CNAME, TEXT, MX and NS. Not sure when I’ll get time to add more to it though.

    Currently the data is just pushed into text files which are parsed by a Ruby script and put into a Postgres database.


    • Great! Do you output all requests/answers, or do you aggregate them in any way? Is your project on GitHub? And do you have any numbers on how much data each query type consumes? I'm a bit concerned about the amount of data TXT records can produce, etc. I'm currently playing with the types DNAME, NAPTR, RP and SRV. With MX, NS, SOA and TXT I'm mainly concerned about the amount of data they produce, compared to real use. My plan is to implement them all, and have the user decide what he cares about, with a reasonable default though 🙂


  2. Jens-Harald Johansen says:

    Each record is logged to a text file (max 10MB), then parsed and uniq-sorted by the Ruby script before being added to the DB.

    Each file is normally filled up within 20-30 minutes. Existing DNS records are only updated with the date in rr_date_last.

    Database columns are: rr_key, rr_type, rr_value, rr_ip, rr_date_first, rr_date_last, rr_text.
    TEXT records are saved in hexadecimal and average length is roughly 150 bytes.

    Here’s some stats from the database (took about 5 minutes to run):
    7.138.388;”PTR”
    9.275.702;”NS”
    13.088.170;”A”
    733.132;”MX”
    26.966;”AAAA”
    1.660.748;”CNAME”
    56.333;”TEXT”

    DB size is atm at 8.776MB with 31.979.439 records.
    The first record is from 2010-11-12, and I have a clean-up script which I try to run a few times every six months just to clean up really old data (typically records not seen in the last 3 months).

    The code is not on GitHub. Not sure if I'll get that far after having seen your passive DNS, which is simpler since you're using the libpcap library.

    Testing so far shows that the queries respond fairly quickly, while the Rails GUI spends more time on rendering.

