PowerDNS service to get coordinates for IPv4 addresses

Bert Hubert from PowerDNS made an interesting announcement today: retrieving the coordinates for an IPv4 address with just a DNS query.
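A sketch of how such a lookup likely works, assuming the usual convention for IP-indexed DNS zones: reverse the octets of the address and query the result as a TXT record. The zone name geo.example.net below is a placeholder, not the real service endpoint.

```shell
# Build the query name by reversing the IPv4 octets; the zone name is a
# placeholder -- see the PowerDNS announcement for the real domain.
ip="192.0.2.1"
name="$(echo "$ip" | awk -F. '{print $4"."$3"."$2"."$1}').geo.example.net"
echo "$name"
# The actual lookup would then be something like: dig +short TXT "$name"
```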


Hopefully the country code will also be included, but this is an interesting way of using DNS as a directory service with public data.

Monitoring GitHub for new releases

Big sites like GitHub or GitLab host a lot of projects and see numerous releases a day. And while you can watch a repository on GitHub, you can’t easily filter out new releases; at least, no easy way is to be found in the interface, and checking all the repositories manually because they aren’t part of a build process is too much hassle and will fail in the end. That happened to me with highlight.js, which had been updated from version 9.11.0 to 9.12.0 months ago.

Some of the solutions people describe, on StackOverflow for example, parse the HTML and use that as a basis for actions to be executed. A quick check and grep of the output shows that we only get links to releases, but no structured data we can easily parse.

$ curl -s https://github.com/isagalaev/highlight.js/releases | grep -i releases\/tag
    <a href="/isagalaev/highlight.js/releases/tag/9.12.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.11.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.10.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.9.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.8.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.7.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.6.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.5.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.4.0">
    <a href="/isagalaev/highlight.js/releases/tag/9.3.0">

If we take the same URL and add the extension .atom, GitHub presents the same data in a consumable feed format. Now we have structured data with timestamps, URLs and descriptions.

$ curl -s https://github.com/isagalaev/highlight.js/releases.atom
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/" xml:lang="en-US">
  <id>tag:github.com,2008:https://github.com/isagalaev/highlight.js/releases</id>
  <link type="text/html" rel="alternate" href="https://github.com/isagalaev/highlight.js/releases"/>
  <link type="application/atom+xml" rel="self" href="https://github.com/isagalaev/highlight.js/releases.atom"/>
  <title>Release notes from highlight.js</title>
  <updated>2017-05-31T02:46:46Z</updated>
  <entry>
    <id>tag:github.com,2008:Repository/1213225/9.12.0</id>
    <updated>2017-05-31T02:46:46Z</updated>
    <link rel="alternate" type="text/html" href="/isagalaev/highlight.js/releases/tag/9.12.0"/>
    <title>9.12.0</title>
    <content type="html"><p>Version 9.12.0</p></content>
    <author>
      <name>isagalaev</name>
    </author>
    <media:thumbnail height="30" width="30" url="https://avatars2.githubusercontent.com/u/99931?v=4&s=60"/>
  </entry>
...

This data can be used by a custom parser or RSS readers like TT-RSS, but also by platforms like IFTTT to trigger actions such as adding an item to a backlog or posting it to a Slack channel.
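As a minimal sketch of such a custom parser, the entry titles can be extracted with standard shell tools. The sample feed below is inlined and modeled on the output above; in practice you would pipe in the output of curl -s https://github.com/isagalaev/highlight.js/releases.atom, and a real implementation should use a proper XML parser instead of grep.

```shell
# Inline sample modeled on the feed above; the first <title> belongs to
# the feed itself, entry titles follow newest-first.
feed='<feed><title>Release notes from highlight.js</title><entry><title>9.12.0</title></entry><entry><title>9.11.0</title></entry></feed>'
latest=$(printf '%s' "$feed" | grep -o '<title>[^<]*</title>' | sed 's/<[^>]*>//g' | sed -n '2p')
echo "$latest"
```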

Removing SPF Resource Records

With the creation of RFC 4408, a new DNS record type (type 99) was also created to identify SPF Resource Records, and the advice was to publish both TXT and SPF records with the same content. RFC 4408 was obsoleted by RFC 7208 in 2014, whose paragraph 3.1 states the following:

SPF records MUST be published as a DNS TXT (type 16) Resource Record (RR) [RFC1035] only.  The character content of the record is encoded as [US-ASCII].  Use of alternative DNS RR types was supported in SPF's experimental phase but has been discontinued.

Now that the SPF Resource Record has been discontinued for a while, the time has come to remove it from DNS (if not done already) and make sure it never comes back. Luckily most code libraries already preferred the TXT variant, but this is still one to put on the maintenance checklist: remove it from any application code and/or infrastructure.
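As a quick sketch for that checklist item, leftover type 99 records can be spotted in a BIND-style zone file with a simple grep. The zone content below is made up for illustration; against live DNS, a query like dig example.org TYPE99 would show the same thing.

```shell
# Made-up zone fragment that still publishes both variants
cat > /tmp/spf-check.zone <<'EOF'
example.org. 3600 IN TXT "v=spf1 mx -all"
example.org. 3600 IN SPF "v=spf1 mx -all"
EOF
# List the deprecated SPF (type 99) records that should be removed
grep -E '[[:space:]]IN[[:space:]]+SPF[[:space:]]' /tmp/spf-check.zone
```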

Increasing Inotify Watches Limit

After upgrading to PyCharm 2017.2, a notice appeared that the inotify watch limit was too low and that the IDE would fall back to recursive directory scanning. The following commands increase the inotify limit to 512k watched files.

$ cat <<EOF | sudo tee /etc/sysctl.d/idea.conf
fs.inotify.max_user_watches = 524288
EOF
$ sudo sysctl -p --system
...
* Applying /etc/sysctl.d/idea.conf ...
fs.inotify.max_user_watches = 524288
...
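To confirm the new limit took effect, the active value can be read back from procfs (assuming a Linux system):

```shell
# Read the active inotify watch limit back from procfs (Linux only)
cat /proc/sys/fs/inotify/max_user_watches
```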

It remains interesting why PhpStorm wasn’t complaining, even though the Symfony projects are much larger.

Is CWE-525 still relevant?

During a code upgrade of a web application from Symfony 2.8 to 3.3, it also became time to do some basic tests with Zed Attack Proxy. While most findings were logical and easy to fix, one was different, and it started with the finding below.

Description:
The AUTOCOMPLETE attribute is not disabled on an HTML FORM/INPUT element containing password type input. Passwords may be stored in browsers and retrieved.
Solution:
Turn off the AUTOCOMPLETE attribute in forms or individual input elements containing password inputs by using AUTOCOMPLETE=’OFF’.

Included was a reference to CWE-525, which describes the risk and how to fix it. The finding leads back to the login section in a Twig template file, shown below, which doesn’t have the proper attributes set on the form and/or input elements.

<form class="navbar-form navbar-right" action="{{ path('login') }}" method="post">
    <div class="form-group">
        <input type="text" id="username" name="_username" placeholder="Email" class="form-control">
    </div>
    <div class="form-group">
        <input type="password" id="password" name="_password" placeholder="Password" class="form-control">
    </div>
    <button type="submit" class="btn btn-success">Sign in</button>
</form>

Mozilla Developer Network has an article about turning off form autocomplete for sensitive fields that browsers shouldn’t cache; storing credit card data, for example, isn’t a good idea unless it lives in a secured storage area. After updating the template file with the right attribute, ZAP doesn’t complain anymore, as we now tell the browser not to store the password field.

<form class="navbar-form navbar-right" action="{{ path('login') }}" method="post">
    <div class="form-group">
        <input type="text" id="username" name="_username" placeholder="Email" class="form-control">
    </div>
    <div class="form-group">
        <input type="password" id="password" name="_password" placeholder="Password" class="form-control" autocomplete="off">
    </div>
    <button type="submit" class="btn btn-success">Sign in</button>
</form>

The story doesn’t end with adding the right attribute to resolve the finding, because browser makers have moved on and now ignore the attribute for input elements of type password. The reasoning is that letting people store a strong password in a password manager carries a better risk score than people remembering and (re)using weak passwords. With this in mind, we basically fixed a false positive to make the audit finding green.

For this reason, many modern browsers do not support autocomplete=”off” for login fields:

  • If a site sets autocomplete=”off” for a form, and the form includes username and password input fields, then the browser will still offer to remember this login, and if the user agrees, the browser will autofill those fields the next time the user visits the page.
  • If a site sets autocomplete=”off” for username and password input fields, then the browser will still offer to remember this login, and if the user agrees, the browser will autofill those fields the next time the user visits the page.

Here lies the problem: ZAP generates a finding that a developer has to spend time on, investigating it and then either implementing a solution or explaining why it isn’t worth fixing. Security scanners are handy for detecting low-hanging fruit, but they shouldn’t become a time waster through outdated rules, or developers will start ignoring findings or let them slowly starve on their backlog.