Don’t Judge an Ebook by Its Cover

Interesting times lead to interesting opportunities. The current pandemic is proving no exception, but, sadly, it’s an opportunity for some attackers who have laid a rather cunning trap. As you no doubt know, supply chain security typically focuses on firmware and installers. However, in the course of researching vendor documentation, we discovered a clever technique being used by attackers targeting critical infrastructure and industrial asset owners.

Instead of modifying software, the bad guys are going after ICS documentation. They plant modified documents built from the legitimate titles and content of OEM manuals, polluting search engine results so they can deliver a tampered PDF or drive-by download that could compromise ICS systems. What we observed began about two weeks ago, and it has been getting worse since.

Why is this scheme clever?

  • It targets specific industries susceptible to already known vulnerabilities.
  • It mimics real content on Google by stealing legitimate hosting on a valid domain, such as a university.
  • It blends in with system integrators hosting various information and installers outside of the official OEM ecosystem.
  • It targets end users who might not be the most cyber-savvy, especially if they are in a hurry or struggling during this pandemic.

In case you still think air gaps are a thing, this type of attack ends that debate. Any well-meaning technician seeking a manual or other deployment document, or perhaps an application installer, can download what they believe is a legitimate file right into their facility — bypassing the facility’s “secure” perimeter. The Google search results look reasonable enough, and that may seem like a faster way to get your hands on a document than navigating through a complex vendor website.

Let me share an example. On our hunt for the manual for an Emerson DeltaV DCS, we ran into a downright weird site that seemed pretty sketchy (not to pick on Emerson: we ran into variants of this issue with other vendors as well). Googling for the manual returned results that looked fairly reasonable:

We went ahead and clicked the first result (as one might do if they are in a hurry, had trouble tracking down the manual via other searches, or just generally consider .edu sites to be trustworthy). We then found ourselves here:

Suspicious, we checked out that original domain from the Google results and something definitely looked strange (and each time we reloaded the page, the list of seemingly random titles was different):

No way were we going to click on that Click Here link.

We had a similar experience when Googling for a GE training manual. (I won’t bother with all the screen captures from that adventure!) The manual showed up in the results multiple times, using the exact text and product names from the authentic GE document, but the copies were hosted on a bunch of .ru sites. And a comparison of one of the Russian versions with the real document showed that the PDF had indeed been tampered with: the XML metadata inside the PDF had been rewritten, and the trailer (the tail end of the file) had been removed. We haven’t yet done a full analysis of the tampering, but you obviously don’t want technicians downloading such a file onto a secure plant floor.
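As a rough illustration of how obvious that particular sign of tampering is: a well-formed PDF ends with a trailer section terminated by an `%%EOF` marker, so even a trivial check (my own sketch below, not a substitute for real analysis) would have flagged the doctored GE manual:

```python
def has_pdf_trailer(path: str) -> bool:
    """A well-formed PDF ends with an %%EOF marker after its trailer.
    A missing trailer, as in the tampered manual, is an immediate red flag."""
    with open(path, "rb") as f:
        f.seek(0, 2)                      # jump to the end of the file
        f.seek(max(0, f.tell() - 1024))   # inspect only the last 1 KB
        return b"%%EOF" in f.read()
```

Passing this check proves nothing on its own, of course; failing it means the file is damaged or has been altered.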

Most OT sites are soft targets on the inside, so the impact could be highly disruptive (e.g., delivery of ransomware), and the infected user would instantly become an unwitting malware delivery service.

So what to do?

  • For vendors: use FACT (🙂) to fingerprint your files, including your PDFs, and encourage your customers to always validate the origins of their software and to use your OEM portal.
  • For asset owners: use FACT and authenticate files before letting them anywhere near your critical systems.
  • For everyone: deploy endpoint protection to block commodity malware, and prevent unauthorized software or documentation from being installed.
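The fingerprinting idea in the first two bullets boils down to comparing a file’s cryptographic digest against a value the vendor publishes. A minimal sketch (the manifest format here is hypothetical, invented for illustration; FACT’s actual fingerprinting is richer than a single hash):

```python
import hashlib
import os

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_against_manifest(path: str, manifest: dict) -> bool:
    """Compare a downloaded file's digest to the vendor-published value.
    `manifest` maps filename -> expected SHA-256 hex digest."""
    expected = manifest.get(os.path.basename(path))
    return expected is not None and expected == sha256_of(path)
```

An asset owner would fetch the manifest from the OEM portal over a trusted channel and refuse to move any file toward the plant floor until it verifies.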

FACT is free (and easy) to use, so there’s really no reason to take risks with files of dubious origin. Try out FACT here.

Windows 10 Certificate Validation Bug Exposes a Fundamental Weakness

The announcement Tuesday from the NSA about the new cryptographic vulnerability in the Microsoft Windows operating system sent ripples of shock through our entire community. In case you missed it, this devastating vulnerability (CVE-2020-0601) allows attackers to bypass trust mechanisms and forge certificates, making them appear to come from a trusted source. It also allows attackers to falsely authenticate themselves on vulnerable HTTPS connections and remotely execute code. Let’s hope everyone is on top of their Microsoft security patches, or there could be some serious damage done.

This week’s warning isn’t the usual story of forged certificates or somebody using stolen keys. We all remember Stuxnet (read more on that here), but that exploit required the attackers to penetrate and then steal the code signing keys from two trusted software manufacturers. The theft was non-trivial and the stolen keys were only dangerous while the theft remained undiscovered. Once the world learned about the theft, any certificate created from the stolen keys could be revoked and rendered useless. In other words, the Stuxnet code signing problem was serious but the fix was simple.

But what happens to trust when you can’t trust the trust system? With this latest vulnerability, we’re talking about the very underpinnings of digital signing and software validation for any software running on any current Windows-based platform. And while the vulnerability doesn’t impact the actual controllers on the plant floor, I’m willing to bet that 99.9% of today’s industrial systems are running the Windows operating system for all the operator HMIs, engineering stations, data historians, and management servers. In other words, while this vulnerability doesn’t impact the actual PLCs, it will allow counterfeit and malicious software to sneak onto all the computers that communicate with, manage, or report on industrial processes.
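To see why the flaw is so fundamental, here is a toy discrete-log analogy (my own simplification with made-up numbers; the real bug involves explicit ECC curve parameters, not this multiplicative group). A verifier that matches certificates by public key alone, without checking that the generator is the standard one, accepts parameters the attacker chose:

```python
# Toy analogy of CVE-2020-0601 (NOT real ECC, just the idea).
# A trusted public key is Q = g^d mod p, where d is secret. The flawed
# verifier compared only Q and never checked that the generator g was
# the standard one.
p = 2**61 - 1            # a prime modulus for the toy group
g = 5                    # the "standard" generator everyone should use
d = 123456789            # the CA's secret key (unknown to the attacker)
Q = pow(g, d, p)         # the CA's public key, published in its certificate

# The attacker supplies custom parameters: generator g' = Q, private key d' = 1.
g_fake, d_fake = Q, 1
Q_fake = pow(g_fake, d_fake, p)

# The forged public key matches the trusted one, yet the attacker knows d',
# so anything "signed" under these parameters looks like it came from the CA.
assert Q_fake == Q
```

The fix is exactly what you would expect: validate every parameter of the trust anchor, not just the part that is convenient to compare.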

This isn’t the first time the limitations of code signing have been laid bare. In 2017, researchers at the University of Maryland showed that there were, at the time, over one million signed malware files in the wild. Attackers sign such files to fool poorly written antivirus software into treating the malware as legitimate and skipping over it.

So, as I point out frequently at conferences, code signing and digital certificates are necessary but not sufficient to ensure software is tamper-free and legitimate. This is especially true in critical infrastructures, where the use of code-signing is limited* and multiple validation mechanisms are necessary to keep our industrial processes reliable and our people safe.

This all ties back to why, over a half-decade ago, I became interested in alternative methods of validating software. My current project, the Framework for Analysis and Coordinated Trust (FACT), provides a collection of validation checks for vulnerabilities, malware, and subcomponent analysis, and does a deep dive into a file’s full certificate chain. Then, after thorough scrutiny, the platform provides a “FACT trust score” that technicians and managers can use to be confident in the decision to install a package (or the decision not to).

Certainly, any single test that FACT performs could be misled by a vulnerability like this latest one. However, by combining multiple tests and enabling the community to share intelligence, we stand a much better chance of outing rogue packages, counterfeits, and deprecated versions.

The ICS world needs ways to trust software and firmware that cannot be signed (e.g., controller binaries) and to confirm the validity of files that are signed but carry invalid certificates. I hope you’ll join the FACT community and help make ICS safer and more secure.

If you want to learn more, check out a quick video on how FACT handles Code Signing Validation.

If you want to kick the tires for yourself, try the FACT platform for free.


* For most embedded devices in the industrial world, code signing isn’t even an option. The operating systems found in most industrial devices don’t have the ability to validate certificates. ICS vendors are making progress in having the newest controllers offer validation features, but it will be many years before we can expect code signing to be broadly deployed in ICS.

Sniffing Out Fakes: From Saffron in Marrakech to Digital Certificates

Eric Byres in Morocco

I’m writing this blog from Marrakech, a city in the western foothills of Morocco’s High Atlas Mountains. Marrakech has been a trading city since it was established by a clan of Berber warriors (the Almoravids) in the 11th century. The heart of the city (where Joann and I are staying) is the medina, a densely packed, walled medieval city with over 9000 maze-like alleys full of noisy, chaotic souks (marketplaces) that sell everything from traditional textiles, pottery, and jewelry to food and spices to motorcycle parts. There is probably nothing you can’t buy, either legal or illegal, in the Marrakech medina.

Like all unregulated marketplaces, the Marrakech medina has its share of fakes and counterfeits. Some are very obvious (Armani bags for $50), some are highly amusing (anyone need an official Louis Vuitton football?), and some are subtle. But the one fake that really interested me was counterfeit saffron.

If you aren’t familiar with saffron, it is a vivid crimson spice created by collecting the stigmas and styles of crocus flowers. According to Wikipedia, saffron is the world’s most costly spice by weight. I won’t disagree: the saffron we ended up buying in the medina cost $6 USD per gram, but we heard of some higher-quality stuff selling for over four times that price.

Now, at those sorts of prices, it isn’t surprising that some crooked merchants might start making fake product to swindle the unsuspecting consumer. Joann and I wanted to learn a bit about both the real and the fake saffron, so we spoke to a reputable spice merchant in the souk. He showed us both the real and fake product and what he looks for when buying wholesale.

I won’t go into the details of selecting good authentic saffron in this blog, but the fake stuff fascinated me. While there are some good fakes that require pretty sophisticated testing, many fakes are easily spotted impostors. These are made by simply dyeing corn silk with either red food colouring or paprika. The tests to spot them are simple, as this YouTube video shows.

This got me wondering: how can it be profitable to make and sell such poor quality fakes? After all, if one can detect them so easily after a few minutes of education, wouldn’t anyone selling these fakes be immediately discovered and never make a sale? (Or much worse: in the Middle Ages, those found selling adulterated saffron were executed under the Safranschou code.)

Sadly, there must be a large enough cohort of people (tourists, perhaps?) who buy saffron without knowing the first thing about it. Or, to put it another way, counterfeit saffron doesn’t need to be a quality imitation to be an effective scam; it just needs to be good enough to fool a person who lacks the knowledge, time, or incentive to perform the simple tests (such as a tourist rushing to get back to a tour bus).

Later that night I realized that the cybersecurity world has been seeing this same situation playing out in the area of digital signatures for executable files (aka code signing). In 2017, Doowon Kim, Bum Jun Kwon, and Tudor Dumitraș at the University of Maryland published a paper investigating malware that carried digital signatures.

Some of the malware they investigated had been digitally signed with keys stolen from legitimate companies: Stuxnet being the most famous example of this sort of trickery. In other cases, malware was signed using certificates that had been mistakenly issued to malicious actors impersonating legitimate companies. For example, in 2001 VeriSign issued two code signing certificates with the common name of “Microsoft Corporation” to an adversary who claimed to be a Microsoft employee. Both of these types of exploits require a considerable amount of expertise and effort to carry out.

However, the authors discovered a third unbelievably simple exploit that accounted for almost one-third of the signed malware in the wild. In the words of the authors:

We find that simply copying an Authenticode signature from a legitimate file to a known malware sample may cause anti-virus products to stop detecting it, even though the signature is invalid, as it does not match the file digest. 34 anti-virus products are affected, and this type of abuse accounts for 31.1% of malware signatures in the wild.

This is the digital equivalent of a border officer who lets you pass simply because you have a passport in your hand. Not taking the time to see if the passport actually belongs to you completely invalidates the integrity of the passport system.
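The minimum check the affected products skipped can be sketched in a few lines (a deliberate simplification: real Authenticode hashes only specific regions of a PE file and verifies a full certificate chain, but the core comparison is the same). A signature lifted from a legitimate file carries that file’s digest, so it cannot match anything else:

```python
import hashlib

def signature_digest_matches(file_bytes: bytes, signed_digest_hex: str) -> bool:
    """Recompute the file's digest and compare it to the digest the
    signature actually covers. A signature copied from another file
    fails this check immediately."""
    return hashlib.sha256(file_bytes).hexdigest() == signed_digest_hex

# A signature lifted from a legitimate binary carries that binary's digest:
legit = b"legitimate executable bytes"
malware = b"malicious executable bytes"
stolen_digest = hashlib.sha256(legit).hexdigest()

assert signature_digest_matches(legit, stolen_digest)        # genuine file passes
assert not signature_digest_matches(malware, stolen_digest)  # copied signature fails
```

That the Maryland team found 34 antivirus products skipping a check this cheap is what makes the 31.1% figure so damning.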

Now, in all my travels, I’ve never actually seen a border officer do this. They are trained to follow the approved validation processes that ensure all but the most skillfully constructed fake passport is detected.

But, like saffron purchasers, much of the IT and OT world has been assuming that the mere existence of a digital signature is proof of that software being trustworthy. This is a terrible assumption that allows malicious actors an easy attack path. Unless we start to properly test software signatures, the bad guys will penetrate our systems just as quickly as the scam artist in the medina separates tourists and their money.

For More Information:

Doowon Kim, Bum Jun Kwon, and Tudor Dumitraș, “Certified Malware: Measuring Breaches of Trust in the Windows Code-Signing PKI,” in Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security (CCS ’17), pp. 1435–1448, October 2017.

Watch how FACT handles code signing/certificate checking:

