Is CWE-525 still relevant?

During an upgrade of a web application from Symfony 2.8 to 3.3, it also became time to do some basic tests with OWASP Zed Attack Proxy (ZAP). While most findings were logical and easy to fix, one was different, and it started with the finding below.

Description:
The AUTOCOMPLETE attribute is not disabled on an HTML FORM/INPUT element containing password type input. Passwords may be stored in browsers and retrieved.
Solution:
Turn off the AUTOCOMPLETE attribute in forms or individual input elements containing password inputs by using AUTOCOMPLETE='OFF'.

Included was a reference to CWE-525, which describes the risk and how to fix it. The finding leads back to the login section in the Twig template below, which doesn't have the proper attributes set on the form and/or input elements.

<form class="navbar-form navbar-right" action="{{ path('login') }}" method="post">
    <div class="form-group">
        <input type="text" id="username" name="_username" placeholder="Email" class="form-control">
    </div>
    <div class="form-group">
        <input type="password" id="password" name="_password" placeholder="Password" class="form-control">
    </div>
    <button type="submit" class="btn btn-success">Sign in</button>
</form>

Mozilla Developer Network has an article about turning off form autocomplete for sensitive fields that browsers shouldn't cache. Storing credit card data, for example, isn't a good idea unless it ends up in a secured storage area. After updating the template with the right attribute, ZAP doesn't complain anymore, as we now tell the browser not to store the password field.

<form class="navbar-form navbar-right" action="{{ path('login') }}" method="post">
    <div class="form-group">
        <input type="text" id="username" name="_username" placeholder="Email" class="form-control">
    </div>
    <div class="form-group">
        <input type="password" id="password" name="_password" placeholder="Password" class="form-control" autocomplete="off">
    </div>
    <button type="submit" class="btn btn-success">Sign in</button>
</form>

The story doesn't end with adding the right attribute to resolve the finding, because browser makers have moved on and now ignore the attribute on password input elements. The reasoning is that letting people store a strong password in a password manager carries a lower risk than people remembering and (re)using weak passwords. With this in mind, we basically fixed a false positive to make the audit finding green.

For this reason, many modern browsers do not support autocomplete="off" for login fields:

  • If a site sets autocomplete="off" for a form, and the form includes username and password input fields, then the browser will still offer to remember this login, and if the user agrees, the browser will autofill those fields the next time the user visits the page.
  • If a site sets autocomplete="off" for username and password input fields, then the browser will still offer to remember this login, and if the user agrees, the browser will autofill those fields the next time the user visits the page.
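If the intent really is to keep the browser from filling in a credential field, one hint that some browsers do honour is autocomplete="new-password" on fields where a new password is chosen, for example in a registration or password-change form; browsers may then skip auto-filling the saved password and offer a generated one instead. The snippet below is only an illustrative sketch of that hint, not part of the login template above, and behaviour still differs per browser.

<input type="password" id="new_password" name="new_password" placeholder="New password" class="form-control" autocomplete="new-password">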

Here lies the problem: ZAP generates a finding that a developer has to spend time on, investigating it and then either implementing a solution or explaining why it isn't worth fixing. Security scanners are handy for detecting low-hanging fruit, but they shouldn't become a time waster with outdated rules, as developers will start ignoring findings or let them slowly starve on the backlog.

Integrity checking for JavaScript

Including JavaScript files from a CDN can be beneficial in many ways: you don't have to ship the code with your application, and caching can be done by the browser or a proxy server. It also opens the door to untrusted code being injected into a web page, as someone else is hosting the code you rely on. Luckily Firefox, Chrome and Opera already support Subresource Integrity (SRI) checking on script and link tags. Hopefully both Safari and Edge (or Internet Explorer) will support it soon.

But how does it work? First let's calculate the SHA-256 hash of jQuery version 3.2.1 as hosted by CloudFlare. Also keep in mind to verify this value against the official version offered by jQuery. In this example we download the minified version of jQuery with curl and pipe it twice through openssl: once to generate the checksum and once to encode the result in base64.

$ curl -s https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js | openssl dgst -sha256 -binary | openssl enc -base64 -A
hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=

Now that we have the hash, we can add the integrity attribute to the script tag, prefixing the hash with "sha256-" to indicate the algorithm used. From this point forward a browser that supports Subresource Integrity will require that the provided hash matches the calculated hash of the downloaded file.

<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=" crossorigin="anonymous"></script>

Besides SHA-256, the specification also allows SHA-384 and SHA-512 to be used. The calculation is the same as with SHA-256; we only change the algorithm that openssl uses.

$ curl -s https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js | openssl dgst -sha512 -binary | openssl enc -base64 -A
3P8rXCuGJdNZOnUx/03c1jOTnMn3rP63nBip5gOP2qmUh5YAdVAvFZ1E+QLZZbC1rtMrQb+mah3AfYW11RUrWA==

We could put only the SHA-512 hash in the attribute, but we can also put multiple algorithm results in the same attribute by separating them with a space. This leaves a lot of room for proper lifecycle management of hashing algorithms, as you can present multiple hashes while migrating to a stronger algorithm instead of doing it big-bang style and hoping for the best.

<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4= sha512-3P8rXCuGJdNZOnUx/03c1jOTnMn3rP63nBip5gOP2qmUh5YAdVAvFZ1E+QLZZbC1rtMrQb+mah3AfYW11RUrWA==" crossorigin="anonymous"></script>

The next step is to have a fallback for when the CDN you rely on goes down or serves corrupt files. You could add a second source attribute (noncanonical-src), as in the example below, that tells the browser to use the Google CDN when CloudFlare has issues serving the correct files.

<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js" noncanonical-src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4= sha512-3P8rXCuGJdNZOnUx/03c1jOTnMn3rP63nBip5gOP2qmUh5YAdVAvFZ1E+QLZZbC1rtMrQb+mah3AfYW11RUrWA==" crossorigin="anonymous"></script>

The next step is to get the Content-Security-Policy header right, but for now only Firefox 49 and higher can act on the require-sri-for directive. This would basically force the browser to only load scripts and style sheets if the SRI checks succeed, but a lot of developers will first need to optimise their build pipeline to produce correct hashes and set up monitoring to detect problems.
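As a rough sketch, and assuming the draft syntax with the script and style tokens, the response header could look like the line below; a supporting browser would then refuse to load scripts and style sheets that lack a valid integrity attribute.

Content-Security-Policy: require-sri-for script style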

Someone Is Learning How to Take Down the Internet

Bruce Schneier has an interesting article about a development that brings back memories from when Stuxnet was discovered.

Over the past year or two, someone has been probing the defenses of the companies that run critical pieces of the Internet. These probes take the form of precisely calibrated attacks designed to determine exactly how well these companies can defend themselves, and what would be required to take them down. We don’t know who is doing this, but it feels like a large nation state. China and Russia would be my first guesses.

This may be in line with findings that a new, larger team is developing malware and exploits on an international scale. It also urges a lot of parties to take security more seriously and not only go for compliance. It may also put the announcement from GCHQ about the great British firewall in a new light.

Security Weekly: The State Of Healthcare Security

Security Weekly episode 479 has an interesting section about the State of Healthcare Security.


The most interesting question is how we as a sector are going to convince people to buy new equipment every 3 to 5 years, or how we can make something that will last at least 20 to 30 years.

Kali Linux 2016.2

Last week Kali Linux 2016.2 was released, so it was time to make a new VirtualBox instance to see how it differs from the January release. But let's automate it a little bit so virtual machines for Kali Linux can be rebuilt quickly.

$ cd ~/Downloads
$ wget http://cdimage.kali.org/kali-2016.2/kali-linux-2016.2-amd64.iso
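Kali publishes checksum files alongside the images, so it doesn't hurt to verify the download first; the SHA256SUMS location below is an assumption based on the same cdimage.kali.org directory.

$ wget -q http://cdimage.kali.org/kali-2016.2/SHA256SUMS
$ grep kali-linux-2016.2-amd64.iso SHA256SUMS | sha256sum -c -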

Let's create the virtual machine and boot it. In this example it is bridged to the wireless network card and gets a 16 GB disk image, as the default 8 GB size for Debian is too small and 10 GB is the minimum advised.

$ export VM="Kali Linux 2016.2"
$ VBoxManage createhd --filename "$HOME/VirtualBox VMs/$VM/$VM.vdi" --size 16384
$ VBoxManage createvm --name "$VM" --ostype "Debian_64" --register
$ VBoxManage storagectl "$VM" --name "SATA Controller" --add sata --controller IntelAHCI
$ VBoxManage storageattach "$VM" --storagectl "SATA Controller" --port 0 --device 0 --type hdd --medium "$HOME/VirtualBox VMs/$VM/$VM.vdi"
$ VBoxManage storagectl "$VM" --name "IDE Controller" --add ide
$ VBoxManage storageattach "$VM" --storagectl "IDE Controller" --port 0 --device 0 --type dvddrive --medium "$HOME/Downloads/kali-linux-2016.2-amd64.iso"
$ VBoxManage modifyvm "$VM" --ioapic on
$ VBoxManage modifyvm "$VM" --boot1 dvd --boot2 disk --boot3 none --boot4 none
$ VBoxManage modifyvm "$VM" --memory 1024 --vram 128
$ VBoxManage modifyvm "$VM" --nic1 bridged --bridgeadapter1 wlp1s0
$ VBoxManage startvm "$VM"

After the installation is completed and the machine is powered down, it is safe to remove the virtual DVD and create a snapshot that we can always quickly return to.

$ VBoxManage storageattach "$VM" --storagectl "IDE Controller" --port 0 --device 0 --type dvddrive --medium none
$ VBoxManage snapshot "$VM" take "Snapshot 1"
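Returning to that clean state later is then a single command on the powered-off machine, using the snapshot name chosen above.

$ VBoxManage snapshot "$VM" restore "Snapshot 1"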

I can now continue to prepare for the Offensive Security Certified Professional (OSCP) training. Hopefully I can also join the CTF organized by Platform voor Informatie Beveiliging.