Don’t Judge an Ebook by Its Cover

Interesting times lead to interesting opportunities. The current pandemic is proving no exception, but, sadly, it’s an opportunity for some attackers who have laid a rather cunning trap. As you no doubt know, supply chain security typically focuses on firmware and installers. However, in the course of researching vendor documentation, we saw a clever technique used by attackers targeting critical infrastructure and industrial asset owners.

Instead of modifying software, the bad guys are going after ICS documentation. They are planting modified documents that reuse legitimate titles and content from OEM manuals, polluting search engine results so they can deliver a tampered PDF or drive-by download and potentially compromise ICS environments. What we observed began about two weeks ago, and it has been getting worse since then.

Why is this scheme clever?

  • It targets specific industries susceptible to known vulnerabilities.
  • It mimics real content on Google by hijacking legitimate hosting on a valid domain, such as a university.
  • It blends in with system integrators hosting various information and installers outside of the official OEM ecosystem.
  • It targets end users who might not be the most cyber-savvy, especially if they are in a hurry or struggling during this pandemic.

In case you still think air gaps are a thing, this type of attack ends that debate. Any well-meaning technician seeking a manual or other deployment document, or perhaps an application installer, can download what they believe is a legitimate file right into their facility — bypassing the facility’s “secure” perimeter. The Google search results look reasonable enough, and that may seem like a faster way to get your hands on a document than navigating through a complex vendor website.

Let me share an example. On our hunt for the manual for an Emerson DeltaV DCS, we ran into a downright weird site that seemed pretty sketchy (not to pick on Emerson: we ran into variants of this issue with other vendors as well). Googling for the manual returned results that looked fairly reasonable:

We went ahead and clicked the first result (as one might do if they are in a hurry, had trouble tracking down the manual via other searches, or just generally consider .edu sites to be trustworthy). We then found ourselves here:

Suspicious, we checked out that original domain from the Google results and something definitely looked strange (and each time we reloaded the page, the list of seemingly random titles was different):

No way were we going to click on that Click Here link.

We had a similar experience when Googling for a GE training manual. (I won’t bother with all the screen captures from that adventure!) The manual showed up in the results multiple times, using the exact text and product names from the authentic GE document, but the copies were hosted on a bunch of .ru sites. And a comparison of one of the Russian versions with the real document showed that the PDF had indeed been tampered with: the XML in the PDF had been rewritten and the trailer (tail end) was removed. We haven’t yet done a full analysis on the details of the tampering, but you obviously don’t want technicians downloading it onto a secure plant floor.
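A stripped trailer is exactly the kind of damage a quick structural check can catch before a file ever reaches the plant floor. Here is a minimal Python sketch (our own illustration, not a substitute for full validation or a hash comparison) that flags a PDF whose header or trailer markers are missing:

```python
def pdf_looks_intact(data: bytes) -> bool:
    """Cheap structural sanity checks on a PDF byte stream.

    A well-formed PDF starts with a %PDF- header, and its trailer
    section near the end of the file contains a startxref keyword
    followed by an %%EOF marker. A file with its trailer removed
    (as in the tampered GE manual) fails these checks.
    """
    if not data.startswith(b"%PDF-"):
        return False
    tail = data[-2048:]  # the trailer lives at the tail end of the file
    return b"startxref" in tail and b"%%EOF" in tail

# A stub of a legitimate PDF vs. one with its trailer stripped:
good = b"%PDF-1.7\n...content...\ntrailer\n<<>>\nstartxref\n123\n%%EOF\n"
bad = good.split(b"trailer")[0]  # tail end removed

print(pdf_looks_intact(good))  # True
print(pdf_looks_intact(bad))   # False
```

This only proves a file is structurally plausible, of course; a competent attacker can produce a tampered PDF with a perfectly valid trailer, which is why fingerprint comparison against a known-good source is still the real test.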

Most OT sites are soft targets on the inside, so the impacts could be highly disruptive (e.g., delivery of ransomware) and the infected user instantly becomes an unwitting malware delivery service.

So what to do?

  • For vendors: use FACT (🙂) to fingerprint your files, including your PDFs, and encourage your customers to always validate the origins of their software and to use your OEM portal.
  • For asset owners: use FACT and authenticate files before letting them anywhere near your critical systems.
  • For everyone: deploy endpoint protection to block commodity malware and prevent unauthorized software or documentation from being installed.

FACT is free (and easy) to use, so there’s really no reason to take risks with files of dubious origin. Try out FACT here.
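FACT’s own interface is not shown here, but the core idea behind fingerprinting is simple enough to sketch: look up each downloaded file’s cryptographic hash against a list of known-good values published by the OEM or a validation service. A minimal Python illustration (the filename and hash table are hypothetical):

```python
import hashlib

# Hypothetical known-good fingerprints, as an OEM or a service like
# FACT might publish. This entry is the SHA-256 of the stand-in
# bytes b"test" used below, purely for illustration.
KNOWN_GOOD_SHA256 = {
    "deltav-manual.pdf":
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def verify_download(name: str, data: bytes) -> bool:
    """Return True only if the file's SHA-256 matches the published value."""
    digest = hashlib.sha256(data).hexdigest()
    return KNOWN_GOOD_SHA256.get(name) == digest

print(verify_download("deltav-manual.pdf", b"test"))      # True
print(verify_download("deltav-manual.pdf", b"tampered"))  # False
```

Even a one-byte change to the document produces a completely different digest, so the poisoned .ru copies would fail this lookup no matter how convincing their content looked.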

Sniffing Out Fakes: From Saffron in Marrakech to Digital Certificates

Eric Byres in Morocco

I’m writing this blog from Marrakech, a city in the western foothills of Morocco’s High Atlas Mountains. Marrakech has been a trading city since it was established by a clan of Berber warriors (the Almoravids) in the 11th century. The heart of the city (where Joann and I are staying) is the medina, a densely packed, walled medieval city with over 9000 maze-like alleys full of noisy, chaotic souks (marketplaces) that sell everything from traditional textiles, pottery, and jewelry to food and spices to motorcycle parts. There is probably nothing you can’t buy, either legal or illegal, in the Marrakech medina.

Like all unregulated marketplaces, the Marrakech medina has its share of fakes and counterfeits. Some are very obvious (Armani bags for $50), some are highly amusing (anyone need an official Louis Vuitton football?), and some are subtle. But the one fake that really interested me was counterfeit saffron.

If you aren’t familiar with saffron, it is a vivid crimson spice created by collecting the stigmas and styles from crocus flowers. According to Wikipedia, saffron is the world’s most costly spice by weight. I won’t disagree: the saffron we ended up buying in the medina cost $6 USD per gram, but we heard of some higher quality stuff selling for over four times that price.

Now, at those sorts of prices, it isn’t surprising that some crooked merchants might start making fake product to swindle the unsuspecting consumer. Joann and I wanted to learn a bit about both the real and the fake saffron, so we spoke to a reputable spice merchant in the souk. He showed us both the real and fake product and what he looks for when buying wholesale.

I won’t go into the details of selecting good authentic saffron in this blog, but the fake stuff fascinated me. While there are some good fakes that require pretty sophisticated testing, many fakes are easily spotted impostors. These are made by simply dyeing corn silk with either red food colouring or paprika. The tests to spot them are simple, as this YouTube video shows.

This got me wondering: how can it be profitable to make and sell such poor quality fakes? After all, if one can detect them so easily after a few minutes of education, wouldn’t anyone selling these fakes be immediately discovered and never make a sale? (Or much worse: in the Middle Ages, those found selling adulterated saffron were executed under the Safranschou code.)

Sadly, there must be a large enough cohort of people (aka tourists?) who buy saffron without knowing the first thing about it. Or, to put it another way, counterfeit saffron clearly doesn’t need to be a quality imitation to be an effective scam; it just needs to be good enough to fool a person who lacks the knowledge, time, or incentive to perform the simple tests (such as a tourist in a rush to get back to a tour bus).

Later that night I realized that the cybersecurity world has been seeing this same situation playing out in the area of digital signatures for executable files (aka code signing). In 2017, Doowon Kim, Bum Jun Kwon, and Tudor Dumitraș at the University of Maryland published a paper investigating malware that carried digital signatures.

Some of the malware they investigated had been digitally signed with keys stolen from legitimate companies: Stuxnet being the most famous example of this sort of trickery. In other cases, malware was signed using certificates that had been mistakenly issued to malicious actors impersonating legitimate companies. For example, in 2001 VeriSign issued two code signing certificates with the common name of “Microsoft Corporation” to an adversary who claimed to be a Microsoft employee. Both of these types of exploits require a considerable amount of expertise and effort to carry out.

However, the authors discovered a third unbelievably simple exploit that accounted for almost one-third of the signed malware in the wild. In the words of the authors:

We find that simply copying an Authenticode signature from a legitimate file to a known malware sample may cause anti-virus products to stop detecting it, even though the signature is invalid, as it does not match the file digest. 34 anti-virus products are affected, and this type of abuse accounts for 31.1% of malware signatures in the wild.

This is the digital equivalent of the border officer who lets you pass simply because you have a passport in your hand. Not taking the time to check that the passport actually belongs to you completely invalidates the integrity of the passport system.

Now, in all my travels, I’ve never actually seen a border officer do this. They are trained to follow the approved validation processes that ensure all but the most skillfully constructed fake passports are detected.

But, like saffron purchasers, much of the IT and OT world has been assuming that the mere existence of a digital signature is proof that the software is trustworthy. This is a terrible assumption that gives malicious actors an easy attack path. Unless we start to properly test software signatures, the bad guys will penetrate our systems just as quickly as the scam artists in the medina separate tourists from their money.
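To make the Maryland finding concrete, here is a toy Python sketch of the check those 34 anti-virus products skipped. Real Authenticode verification means parsing a PKCS#7 blob embedded in the PE file; this simulation stands in a plain dictionary for the signature, but the core step is the same: recompute the file’s digest and compare it to the digest the signature actually covers.

```python
import hashlib

def make_signature(file_bytes: bytes) -> dict:
    """Toy stand-in for an Authenticode blob: records the signed digest."""
    return {
        "signer": "Legit Software Inc.",
        "digest": hashlib.sha256(file_bytes).hexdigest(),
    }

def signature_matches(file_bytes: bytes, sig: dict) -> bool:
    """The step the affected AV products skipped: recompute and compare."""
    return hashlib.sha256(file_bytes).hexdigest() == sig["digest"]

legit = b"legitimate installer bytes"
malware = b"malicious payload bytes"

sig = make_signature(legit)          # signature issued for the legit file
print(signature_matches(legit, sig))    # True
print(signature_matches(malware, sig))  # False: copied signature fails
```

An attacker who pastes `sig` onto the malware has a file that *carries* a signature, exactly like the tourist holding a passport, but any verifier that bothers to recompute the digest rejects it immediately.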

For More Information:

Doowon Kim, Bum Jun Kwon, and Tudor Dumitraș, “Certified Malware: Measuring Breaches of Trust in the Windows Code-Signing PKI,” CCS ’17: Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security, pp. 1435–1448, October 2017.

Watch how FACT handles code signing/certificate checking:


Who Infected Schneider Electrics’ Thumbdrive?

Infected USB Drive

On 24 August 2018, Schneider Electric issued a security notification alerting users that the Communications and Battery Monitoring devices for their Conext Solar Energy Monitoring Systems were shipped with malware-infected USB drives.

First of all, kudos to Schneider Electric for alerting their customers and providing information on how to remedy the situation. According to Schneider, the infected files would not affect the devices themselves. Schneider also noted that the particular malware was easy to detect and remove by common virus scanning programs.

Provided that all of Schneider’s customers read these alerts, this should remain a minor security incident. Unfortunately, this is a big assumption. Due to the complexities of modern distribution channels, I’m pretty certain no one in the world knows if the Schneider notice is getting to the people who actually use the Conext product. It could be getting stuck on some purchasing manager’s desk, never to be forwarded to the technicians in the field. Or it could be languishing in the inbox of an engineering firm that is no longer working at the location where the Conext product is deployed. If ZDNet and CyberScoop had not reported on the story, it might have stayed off everyone’s radar. Clearly, both vendors and asset owners need better ways of sharing urgent security information.

The Conext Battery Monitor (Source: Schneider Electric)

But what is especially interesting is that the thumb drives were not infected at Schneider’s facilities. They were infected via a third-party supplier during the manufacturing process. Like ALL major ICS vendors, the supply chain for Schneider hardware, software (and even the media upon which it is shipped) is exposed to many hands.

This situation highlights an alarming reality in the ICS world. Just because a digital file comes from a trusted vendor doesn’t mean you can trust all the other companies that touched that file.

Who knows which “third-party supplier’s facility” was involved in contaminating those USB drives? Was it the USB manufacturer… or a duplication company… or even a graphics company that added some branding? Schneider Electric will no doubt be re-thinking that relationship, but the fact remains that they have to work with third parties to get their products to market.

The worrisome question is, what other ICS vendors use that same third-party supplier? How widespread is the infection? It seems unlikely that Schneider Electric is this supplier’s only customer. Naming and shaming the supplier may be fraught with legal consequences (or perhaps they are still tracking down the specific vendor) so Schneider has remained silent for now on the source of the malware. That means all the other vendors out there and their customers may be exposed as well. Or not. We don’t know – and that is a problem.

One hopes that if other vendors have detected issues with their USB drives, they will follow Schneider Electric’s lead and issue prompt alerts. Some vendors are better than others at transparency, and there will likely be some who choose to lie low instead to avoid bad publicity. It is a pity, because vendors like Schneider are as much a victim in this scenario as the end users.

This is one of the reasons aDolus is developing a platform for ICS asset owners and vendors that offers an ecosystem of trust where they can verify software of, let’s call it, “complicated origin” and ensure it hasn’t been tampered with BEFORE they install it. We’re also looking at ways vendors can get early warnings about security issues occurring at their clients’ sites and not have to wait until hundreds or thousands of facilities have been infected.

Interested in learning more about protecting yourself from compromised software? Let us know if you are an end user interested in validating ICS software or an ICS vendor interested in protecting your distribution mechanisms to ensure they are clean.

Building (or Losing) Trust in our Software Supply Chain

Back in 2014, when I was managing Tofino Security, I became very interested in the Dragonfly attacks against industrial control systems (ICS). I was particularly fascinated with the ways that the attackers exploited the trust between ICS suppliers and their customers. Frankly, this scared me because, as I will explain, I knew that all the firewalls, antivirus, whitelisting, and patching in the world would do little to protect us from this threat.

If you are not familiar with the Dragonfly attacks, they were launched against the pharmaceutical industry (and likely the energy industry) in 2013 and 2014. The attacks actually started in early 2013 with a spear phishing campaign against company executives. But the part that concerned me began later, starting in June 2013 and ending in April 2014.

During that period, the Dragonfly attackers penetrated the websites of three ICS vendors: vendors who supply hardware and software to the industrial market. Once the bad guys controlled these websites, they replaced the vendors’ legitimate software/firmware packages with new packages that had Trojan malware called Havex embedded in them (Attack Stage #1).

When the vendors’ customers went to these websites they would see that there was a new version of software for their ICS products. They would then download these infected packages, believing them to be valid updates (Attack Stage #2). And because one of the messages we give in the security world is to “keep your systems patched,” these users pushed out the evil updates to the control systems in their plants (Attack Stage #3).

Once these systems were infected, the Havex malware would call back to the attackers’ command and control server, informing the attackers that they had penetrated deep into a control system. The attackers then downloaded tools for ICS reconnaissance and manipulation into the infected ICS hardware (Attack Stage #4). These new attack tools focused on the protocols we all know well in the ICS world, such as Modbus, OPC, and EtherNet/IP.

As far as we know, the attackers were most interested in stealing industrial intellectual property — not destroying equipment or endangering lives. However, there was nothing that would have restricted the attackers to just information theft. Their tool sets were extremely flexible and could have easily included software that would manipulate or destroy a process.

The Dragonfly attacks were particularly insidious because they took advantage of the trust between suppliers and end users. The engineers and technicians in industrial plants inherently trust their suppliers to provide safe, secure, and reliable software. By downloading software and installing it, the Dragonfly victims were doing what they had been told would improve their plant’s security. In effect, these users were unwittingly helping the attackers bypass all the firewalls, circumvent any whitelisting or malware detection, and go directly to the critical control systems.
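One concrete defense at Attack Stage #2 is to verify every downloaded update against a digest obtained through a *separate* trusted channel, such as signed release notes or an OEM portal. In Dragonfly’s case a hash published on the same compromised website could have been swapped along with the installer, which is why the channel matters as much as the check. A minimal Python sketch (the firmware bytes and published value are illustrative):

```python
import hashlib
import hmac

# Digest the vendor publishes out-of-band (e.g. in signed release notes).
# For illustration, we compute it here from the "real" firmware bytes.
PUBLISHED_SHA256 = hashlib.sha256(b"firmware v2.1").hexdigest()

def safe_to_install(update_bytes: bytes, published_hex: str) -> bool:
    """Refuse to install unless the download matches the published digest."""
    actual = hashlib.sha256(update_bytes).hexdigest()
    # compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(actual, published_hex)

print(safe_to_install(b"firmware v2.1", PUBLISHED_SHA256))     # True
print(safe_to_install(b"firmware + havex", PUBLISHED_SHA256))  # False
```

Had the Dragonfly victims been able to run a check like this against digests from an uncompromised channel, the trojanized packages would have been rejected at the download step instead of being pushed out to the plant floor.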

This is what I call “Exploiting the Supplier-User Trust Chain” — and I think it is one of the most serious security risks facing our world today. It is not only a problem for ICS-focused industries like energy or manufacturing, but also for any person or company that uses “smart” devices… which is pretty well all of us. Aircraft, automobiles, and medical devices are all susceptible to this sort of attack.

So with the help of Billy Rios, Dr. Jonathan Butts, a great team of researchers, and the DHS Silicon Valley Initiatives Program, I’ve been working on finding a solution to the Chain-of-Trust challenge. aDolus and FACT™ (Framework for Analysis and Coordinated Trust) are the result of thousands of hours of our systematic investigation into the problem and its possible solutions. Join me on this blog over the next few months as I share what we have learned and where we still have to go to ensure trust in our software.

For more on Dragonfly:
