PerezBox

Tony Perez On Security, Business, And Life

3 Tips to Secure Your Home Network

Published in Security on March 24, 2020

Whether we like it or not, we have all become the network administrators of our own home networks. As such, our responsibilities extend beyond protecting our families to being good stewards of the networks we're connecting to (e.g., work).

Here are a few tips that should help you create a safe environment for you, your loved ones, and your company.

1. Establish a Separate Network

Over the past 10 years I have spent a great deal of time working with consumers and businesses around the world, helping them prevent and remediate hacks. More often than not, they missed one very simple principle in security – functional isolation.

This simply speaks to the idea of isolating environments to a specified function. In the world I come from, that meant ensuring you didn't have a server serving multiple functions (e.g., web server, DB server, file server, key server, etc.).

The same rule applies to your home network. Without getting into the weeds, a very simple trick is to create a dedicated subnet for your own use. A very easy way to do this is to purchase a second router, connect it to the one your Internet Service Provider (ISP) provided, and restrict access to that network.

Example of a Home Network with two subnets – one for work and one for the family

In addition to isolating traffic, it will have the added benefit of addressing some of the network saturation you experience if you have kids that love playing video games.

2. Configure the Router

These days I spend a lot of my time helping parents and organizations alike configure their networks through CleanBrowsing. In that process one thing has become apparent: routers are rarely configured correctly.

A couple of things to consider when configuring your router:

  1. Check if your router supports automatic updates. Let's face it, you're likely not going to keep up with them yourself. At a minimum, subscribe to get notifications of updates.
  2. Disable Wi-Fi Protected Setup (WPS) and the Universal Plug and Play (UPnP) protocol. This is especially good if you live in close quarters with someone else (e.g., townhouses, apartments, etc.) or if you have kids. :) (The sketch after this list shows a quick way to spot UPnP devices on your LAN.)
  3. Enable the router firewall if it's available; at a minimum, leave the default settings. Only mess with the defaults if you know what you're doing.
  4. For all that is holy, please update the default login credentials and save them in the password vault you're using.
  5. DNS is a critical piece of the puzzle. Learn how I use CleanBrowsing to provide a safe browsing experience at home, and how DNS can be used to mitigate security threats.
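
On the UPnP point in item 2, here is a minimal sketch (assuming Python 3 on a machine attached to your LAN) that broadcasts an SSDP discovery probe; any device that answers is advertising UPnP services to everyone on your network:

import socket

# SSDP M-SEARCH probe: any UPnP device on the LAN is expected to answer.
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: ssdp:all",
    "", "",
]).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH, ("239.255.255.250", 1900))

try:
    while True:
        data, addr = sock.recvfrom(65507)
        # Print the responder's IP and the first line of its reply.
        print(addr[0], data.split(b"\r\n")[0].decode())
except socket.timeout:
    pass  # no (more) UPnP responders

If your router's IP shows up in the output, its UPnP (Internet Gateway Device) service is answering anyone who asks – a good reason to turn it off.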

3. Force Good Online Behavior

One of the biggest contributing factors to small businesses getting hacked is poor online behavior. I encourage you to pay special attention to how you interact online. Here are a few tips to help:

  1. Try to separate activities by browser if possible. For instance, dedicate one browser to your social sites (e.g., Twitter, Instagram, Facebook, etc.), another to your work-related sites, and try to restrict when you access financial institutions.
  2. It's never a bad time to do an audit of your passwords. Are they the same across all your sites (e.g., financial, social, company, etc.)? If so, it might be good to invest in randomly generated passwords and a password vault (e.g., LastPass, Dashlane, 1Password) to help you remember them (see the sketch after this list).
  3. Are you leveraging the Multi-Factor Authentication (MFA) features provided by the various platforms you interact with? If not, it would be a good time to lean into that. My buddy Jesper has been writing an exceptional series on MFA that I encourage you to read if you have time.
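
On the password point in item 2, you don't need anything fancy to produce random passwords. Here is a minimal sketch using Python's standard secrets module; tweak the length or alphabet to match what a given site accepts:

import secrets
import string

def generate_password(length: int = 24) -> str:
    """Build a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # store the result in your password vault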

Be a Responsible Network Steward

We're in uncharted waters these days, and each of us has a responsibility to help keep our networks safe from bad actors. This dramatic shift to Work From Home (WFH) has shattered the last form of network perimeter most corporations have been holding onto, and we all need to do our part in helping to protect them.

In the process you might find yourself intrigued by what networks have to offer. :)


Mitigating Web Threats with DNS Security | CleanBrowsing

Published in Security on December 24, 2019

On December 18th, DeepInstinct put out a great article outlining the latest Legion Loader campaign. Whether you're a parent or an organization, it serves as a great example of the effectiveness of DNS security in mitigating this type of attack.

Legion Loader Campaign

This campaign is suspected of being a dropper-for-hire campaign because of the number of different malware payloads it's distributing (e.g., info-stealers, backdoors, crypto-miners, etc.).

Read More


DNS Firewall to Enhance Your Network's Security | CleanBrowsing

Published in Security on October 7, 2019

DNS is the internet's lookup table; it builds a bridge between a domain name (e.g., perezbox.com) and an IP address (e.g., 184.24.56.17), the IP address being where you can find the server that hosts the domain. In addition to its job as a lookup table, it can also serve as an effective security control.

DNS is lightweight, requires no installation, is highly effective, maps directly to the TTPs employed by attackers, and, more importantly, is affordable.

This article will introduce the concept of a DNS Firewall (Protective DNS) and encourage you to think of it as an additional layer in your security governance program.

Mitigating Attack Tactics

Understanding how attackers leverage domains in their attacks allows us to appreciate how effective DNS can be. Here are a few tactics, techniques, and procedures (TTPs) leveraged by attackers that help illustrate the point:

Benign Website – An attacker compromises a benign site (domain), and it's used to distribute malware or perform other nefarious activity (e.g., phishing, SEO spam, etc.).
Malicious Site – An attacker creates a malicious site (domain) whose sole purpose is to distribute malware or perform other nefarious activity (e.g., phishing, SEO spam, droppers, etc.).
Command & Control (C&C) – A Command and Control (C&C) is what an attacker uses to facilitate their orchestration. Payloads phone home to C&Cs for instructions on what to do next.

All of the scenarios above leverage Fully Qualified Domain Names (FQDNs) for the site to render.

Example 1: The 2019 Mailgun Hack

In 2019 there were a number of WordPress hacks that exploited a vulnerability in a well-known plugin. This exploit affected thousands of sites, including the popular Mailgun service.

Attackers used their access to embed JS code on the sites that would initiate calls to a number of different domains: hellofromhony[.]org, jqueryextd[.]at, adwordstraffic[.]link. These domains would then initiate different actions (including stealing credit card information) depending on the request.

The embedded JS payload initiates a DNS request.

Example 2: Managing Multiple Servers

Assume you are an organization responsible for hundreds, if not thousands, of servers. An attacker bypasses your defenses and moves laterally through the network. In the process, the attacker sprinkles droppers across the network designed to phone home to their C&C.

The phone home initiates a DNS request.

Example 3: Mitigating User Behavior (Phishing)

If there is one thing we can always count on, it's that curiosity kills the cat. Users always click.

Clicking the link initiates a DNS request.

The Effectiveness of DNS

The examples above are only a few that quickly illustrate how DNS can be leveraged to mitigate attacks. To help support the case, we can look at the Verizon Data Breach Investigations Report (DBIR).

Analyzing a five-year period, 2012 through 2017, you find that close to a third of the 11,079 confirmed data breaches involved threat actions that DNS could have mitigated (source: Global Cyber Alliance, 2018). Having a security control relevant to a third of breaches is pretty impactful for any organization.

With DNS as the backbone of how the internet works, any time a domain is queried, DNS is, by design, triggered. Consider the different scenarios above and you quickly realize that DNS is the gateway all of these requests have to pass through.

DNS Firewall (Protective DNS) as a Security Control

The illustration above shows how and where a DNS Firewall might fit in your network's architecture.

The DNS firewall inspects the initial query, verifying that it's safe, before allowing the rest of the DNS communication chain to proceed. There are a number of great DNS Firewall services; I personally leverage the CleanBrowsing Security Filter (it's free and highly effective).

  • IPv4 addresses: 185.228.168.9 and 185.228.169.9
  • IPv6 addresses: 2a0d:2a00:1::2 and 2a0d:2a00:2::2
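
If you want to see the filter in action before pointing your router at it, here is a minimal sketch (assuming Python 3 with the dnspython package) that sends queries directly to the resolvers listed above. How a blocked domain answers (NXDOMAIN vs. a block-page IP) is an assumption worth verifying against a domain you know the filter blocks:

import dns.resolver  # pip install dnspython

# Query the CleanBrowsing Security Filter instead of the system resolver.
resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["185.228.168.9", "185.228.169.9"]

for domain in ["perezbox.com", "known-bad.example"]:  # second one is a placeholder
    try:
        answer = resolver.resolve(domain, "A")
        print(domain, "->", [r.address for r in answer])
    except dns.resolver.NXDOMAIN:
        print(domain, "-> blocked or non-existent (NXDOMAIN)")
    except dns.resolver.NoAnswer:
        print(domain, "-> no A records returned")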

If you run your own internal DNS, you will want to look into leveraging Response Policy Zones (RPZ). RPZ is a security specification and protocol that enhances DNS resolvers with security intelligence about the domains they handle, allowing a local DNS resolver to restrict access to content that is malicious or unwanted. In effect, it lets you build your own DNS Firewall.

This deployment is applicable to large organizations and homes alike. :)


Mozilla Introduces Mechanism to Hijack all DNS Traffic in the Name of Privacy

Published in Security on September 16, 2019

In September of 2019 Mozilla will begin rolling out DNS over HTTPS (DoH) in Firefox via their Trusted Recursive Resolver (TRR) program. (If you need background on the protocol, see my primer on DNS and security.)

The change is based on a theme we've heard before: a) the old protocols don't take security and privacy into consideration, and b) there is the threat that people can see what you are searching for.

This should sound familiar; we saw a similar campaign driven by Google with their #httpseverywhere campaign from 2014 to 2018.

In both instances, these organizations are trying to tackle fundamental flaws in the technology fabric we all depend on. The difference lies in how the problem is being approached.


Technically speaking, I don't have an issue with the idea of making DoH available. I do question whether a system-level control should shift to the web layer. What gives me heartburn is their implementation – they are enabling it ON by default, without asking the consumer. They are also partnering with CloudFlare as their default DoH service provider; this means every request you make on Firefox will go to a private organization that the consumer has not chosen. For me, this is a serious breach of trust by an organization that is waving the trust banner.

In contrast, Google's implementation will be set OFF by default. It will also allow users to choose their own DoH provider.

Why Should You Care About Mozilla's DoH Implementation

If you are someone that is responsible for controlling what happens on your network, you should care a lot. The default implementation by Mozilla is, for lack of a better word, a Virtual Private Network (VPN) that allows anyone using Firefox to bypass whatever controls exist on a network.

A few examples of what this means:

  • Let's assume you are a school. You have hundreds of kids on your school WiFi. You have implemented your own DNS resolver to protect kids from malicious sites or to stop them from accessing pornographic or obscene content. This new implementation will make it so that your kids can now bypass your web controls.
  • Let's assume you are a parent. You worry about what your kids have access to when they are surfing the web. You deploy a network tool to help you control what they can and can't access and how they interact on the web. This new implementation will make it so that your kids can now bypass your web controls.
  • Let's assume you are addicted to porn, a very real problem. You deploy controls on your network to prevent yourself from accessing obscene content (something that is sometimes uncontrollable for the afflicted). This new implementation will make it so that you can now bypass your own web controls.
  • Let's assume you are a security engineer inside an enterprise NOC. You are chartered with analyzing traffic to ensure malicious traffic is not coming in, or going out. This new implementation will allow anyone on your network to bypass whatever controls you might have in place.
  • Let's assume you are a government trying to implement new regulations to hold ISPs responsible for child pornography and other nefarious acts online. This new implementation would prevent this.

These are only a few, crude, examples meant to highlight the seriousness of the chosen deployment by Firefox.

What Can You Do About Mozilla’s Implementation

If you prefer to retain control of your network and not allow Mozilla to make default choices for you, you have a few options:

Option 1: Leverage a Network Content Filtering Service That Disables DoH By Default

If you are a parent, school, or large organization you can use a cloud-based DNS content filtering service like CleanBrowsing to help mitigate this change.

If you are a large enterprise, you can signal to Firefox that you have specific controls in place and that the DoH deployment should be disabled.

Network administrators may configure their networks as follows to signal that their local DNS resolver implemented special features that make the network unsuitable for DoH.

DNS queries for the A and AAAA records for the domain “use-application-dns.net” must respond with either: a response code other than NOERROR, such as NXDOMAIN (non-existent domain) or SERVFAIL; or respond with NOERROR, but return no A or AAAA records.
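
To check that your resolver is actually sending that signal, here is a minimal sketch (assuming Python 3 with dnspython) that performs the same canary-domain lookup Firefox does:

import dns.resolver  # pip install dnspython

CANARY = "use-application-dns.net"

def network_signals_doh_off() -> bool:
    """True if the local resolver tells Firefox to keep DoH disabled."""
    try:
        dns.resolver.resolve(CANARY, "A")
    except dns.resolver.NXDOMAIN:       # non-existent domain
        return True
    except dns.resolver.NoAnswer:       # NOERROR, but no A records
        return True
    except dns.resolver.NoNameservers:  # e.g., SERVFAIL from every server
        return True
    return False

print("DoH signaled off:", network_signals_doh_off())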

Make note of this very important caveat in their release notes: if a user has chosen to manually enable DoH, the signal from the network will be ignored and the user's preference will be honored. Depending on your organization's position on this, you might want to consider Option 3.

Option 2: Disabling DNS-over-HTTPS in Firefox

You can disable DoH in your Firefox connection settings:

  1. Click the menu button and choose Preferences.
  2. In the General panel, scroll down to Network Settings and click the Settings… button.
  3. In the dialog box that opens, scroll down to Enable DNS over HTTPS.
    • On: Select the Enable DNS over HTTPS checkbox. Select a provider or set up a custom provider.
    • Off: Deselect the Enable DNS over HTTPS checkbox.
  4. Click OK to save your changes and close the window.
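
For a managed fleet, clicking through preferences doesn't scale. As a rough sketch of the same change pushed from code – assuming Mozilla's network.trr.mode preference (5 = DoH explicitly off) and a hypothetical profile path you would replace with your own:

from pathlib import Path

# Hypothetical profile directory; find the real one under ~/.mozilla/firefox.
profile = Path.home() / ".mozilla/firefox/xxxxxxxx.default"

# network.trr.mode = 5 tells Firefox that DoH is explicitly disabled.
with (profile / "user.js").open("a") as prefs:
    prefs.write('user_pref("network.trr.mode", 5);\n')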

Option 3: Remove Firefox

As extreme an option as this might sound, I have spoken with a few enterprise CISOs that are considering removing Firefox from their network. Their reasoning revolves around two distinct positions: browsers are assuming too much control, and Firefox should be treated as a VPN – which seems to be the direction Mozilla is intentionally heading.

The Evolution of a Critical Piece of the Web – DNS

A critical piece of the web is evolving, and most consumers have no understanding of, or appreciation for, what that means; the implications, however, can be dramatic.

Regardless of which side of the fence you're on, there is a mutual desire amongst technologists to ensure a more secure, private web; the question, however, is how you implement it.

I'll dive deeper into the specifics of the community politics, and the technical details of the options, in future articles. If you absolutely can't wait, I encourage you to read this great article by one of my colleagues at GoDaddy, Brian Dickson of our DNS team – DNS-over-HTTPS: Privacy and Security Concerns.


Rethinking the Value of Premium SSL Certificates

Published in Security on August 12, 2019

There is an active campaign, led by the browsers and security practitioners, to reshape how online consumers see SSL certificates – with special interest in shutting down premium certificates. This article will shed some light on what is going on and provide some context as to why it's happening; it will also offer my own personal opinions and recommendations for the future.

In summary, premium certificates – specifically EVs – offer more value than we're letting on, because we're allowing the wrong things to cloud the conversation.

Making Sense of the SSL Ecosystem

I recommend reading my primer on SSL, specifically how HTTPS works.

An SSL Certificate is a digital file that binds an identity to a cryptographic key pair (a public and private key). This file is used to verify and authenticate the owner (an identity) of the certificate during a communication exchange between two systems. The SSL Certificate is also what allows you to make use of the HTTPS / TLS protocols on your website.

A site that is leveraging HTTPS/TLS makes use of an SSL certificate to accomplish two goals:

  • Authenticates the identity of the website to the site visitor;
  • Protects, via encryption, information as it's transmitted from the web browser to the web server, ensuring that data in transit cannot be intercepted by a bad actor (e.g., via a MitM attack);

Here is a great example:

What you see in this example is that this certificate was issued to the godaddy.com domain by the GoDaddy Certificate Authority (CA). CAs are responsible for the creation, issuance, revocation, and management of SSL certificates.
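
You can inspect these fields yourself. Here is a minimal sketch (using Python 3's standard ssl and socket modules) that pulls the subject and issuer a browser relies on when displaying certificate identity; an OV/EV certificate's subject will typically carry an organizationName that a DV certificate lacks:

import socket
import ssl

def get_cert_identity(host: str, port: int = 443) -> dict:
    """Fetch a site's certificate and return its identity fields."""
    ctx = ssl.create_default_context()  # validates against the trusted root store
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return {
        "subject": dict(pair[0] for pair in cert["subject"]),
        "issuer": dict(pair[0] for pair in cert["issuer"]),
    }

print(get_cert_identity("godaddy.com"))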

How SSL Certificates Are Created

How they go about performing these duties is defined by a voluntary organization known as the Certificate Authority / Browser (CA/B) Forum. The output of this forum is something known as the Baseline Requirements (BRs), and these BRs are the rules CAs must abide by if they want their certificates to be recognized by the browsers' root stores.

Being in a browser's root store is critical for a CA. To appreciate its importance, look back to September 2017, when Chrome distrusted Symantec's root certificate. Being distrusted results in every certificate issued by the CA rendering a page like this:

So yes, having a publicly trusted root is the lifeblood of every CA. These root certificates are used in the issuance of certificates, and as long as the CA follows the rules defined by the BRs, browsers will continue to trust the CA's root certificate in their root stores.

Types of SSL Certificates

Under the rules set forth by the BRs, CAs have the ability to issue a number of different certificate types.

For the purposes of this article I’ll focus only on three:

Domain Validation (DV) – Validates the applicant's ownership or control of the domain.
Organization Validation (OV) – Validates the applicant's identity as a company or individual, as well as the domain.
Extended Validation (EV) – Validates the legal entity that controls the website. This is the most stringent validation process available.

A couple of things to clarify:

  • All certificates function the same in protecting information in transit. You're not getting a higher or lower degree of encryption with any of these types; the encryption ciphers are set by the web servers, and the minimum values are defined by the BRs;
  • The thing that has always differentiated these certificates to the public has been their treatment on browsers;
  • The treatment of DV / OV certificates is the same on browsers, and EVs have always been that special option;

Treatment of SSL Certificate Types

The thing that has always set the certificate types apart has been their treatment in the browser User Interface (UI). The original premise of the treatment was to enable web users, like you, to quickly delineate those sites that had gone through additional scrutiny in their validation process.

For these examples I'm going to focus on Chrome because it's the most widely adopted browser in the market (55% market share as of July 2019). Google is also the one leading the fight against premium certificates and the changes I'll highlight below.

Here is an example of what a DV / OV certificate might look like in the URL bar of the Chrome browser today (in 2019):

Here is an example of what an EV certificate might look like in the URL bar of the Chrome browser today (in 2019):

Here is an example of what the certificates used to look like:

As you look through the examples above you can quickly see what is happening: the treatment of EV certificates is changing dramatically. In earlier versions it was easy to point out those sites that had gone through higher scrutiny in their validation process, which in theory should have given web users a higher degree of confidence in the legitimacy of the site.

Here is an example of what you can expect in future releases of the Chrome browser:

What you see above is work being done by Google to remove the indicator altogether. You can expect the final iteration to look very different from the proposal above.

The genesis of the change can be found in Google's release of a research paper titled The Web's Identity Crisis: Understanding the Effectiveness of Website Identity Indicators.

The entire paper boils down to this:

In 14 iterations on browsers’ EV and URL formats, no intervention significantly impacted users’ understanding of the security or identity of login pages.

Authors: Christopher Thompson, Martin Shelton, Emily Stark, Maximilian Walker, Emily Schechter, Adrienne Porter Felt – Google

In other words, there was no perceived value in the UI indicators. Because there is no value, Google will proceed with removing them (by burying them deep in secondary panels). You can expect the next analysis to show that users do not click on the secondary panels, and as such their value will be further diminished.

Discourse Makes a Solution Difficult

Here are some of my personal observations, points of contention, and positions from both sides of the aisle as to why premium certificates are ineffective:

  • Even amongst security professionals, few truly understand the difference between certificate types;
  • We never really raised good awareness of what these indicators were meant to signify;
  • The CA/B Forum is comprised of a lot of attorneys, which creates a very CYA-like approach to developing the BRs – in other words, anything that might imply liability is avoided. This framing makes things difficult; we shy away from words like "assurance" and "trust," and it creates an environment of extreme interpretations;
  • Massive commercial entities were built around these SSL certificates, such that any perspective from a CA is immediately dismissed because it's believed to be partial and beholden purely to commercial interests;
  • There are real challenges, like collisions in the systems, where two entities could exist with the same name, established under different jurisdictions – which, technically, isn't really a problem if each is a legitimate entity;
  • We inaccurately place the value of premium certificates on things like security (e.g., claiming premium certificates curtail phishing). This narrative derails and distracts the conversation;
  • Perceived issues exist with the fact that you can have a validated entity that is not the same as the domain (e.g., domains owned by franchises) – which technically isn't a problem if we refine the meaning of the premium certificate's value and the assurances it provides;
  • As a community there is an "us" vs. "them" mentality, where the browsers are good and the CAs are bad. This has led to a contentious, toxic relationship between both parties, which does little for the web;
  • We lean on security whenever there is no valid answer, never differentiating between practical and theoretical security;
  • We claim to be considerate of the greater web, but share very little empathy for the challenges we're introducing to its consumers (micro-businesses, large organizations, and passive consumers alike);
  • The advent of social platforms has given pundits around the world – experts and influencers alike – a megaphone that amplifies and convolutes the conversation in the name of goodness, fairness, and security, while adding emotion and unreasonable candor that makes it impossible to collaborate for a better outcome. Then again, this affects almost every industry these days;
  • The validation process requires humans; humans are fallible, and this precludes us from automating validation and making it available to the masses in a scalable manner;
  • Traditionally, CAs have been perceived to be stuck in their ways, my own organization included, incapable of keeping up with the evolution of the web – we are probably our own worst enemy.

The Unrecognized Value

Studies have been conducted on both sides of the aisle. On the browsers' side, a study by Google (The Web's Identity Crisis: Understanding the Effectiveness of Website Identity Indicators) showed that web users don't recognize value in UI indicators. On the CA side, you have a study by Georgia Tech, funded by Sectigo (Understanding the Role of Extended Validation Certificates in Internet Abuse), which tries to show a low propensity for validated domains to be used for malicious purposes. Whether you agree with the methodologies or the outcomes, I believe the unrecognized value is somewhere in between.

I believe Google is right: in today's incarnation of the UI indicators, it is absolutely realistic to believe that web users have no understanding of what they mean. I also believe, to an extent, that Georgia Tech's study (while a bit limited) speaks to a truth about the low propensity of a validated organization to be used for malicious purposes.

I believe we are missing some really interesting opportunities to help bridge the trust gap online through a structure that is already in place:

  • The validations being done for certificates like EVs, whether we like it or not, and regardless of what the BRs state, should facilitate a level of assurance of legitimacy to web users.
  • While not perfect, the public Web Trust ecosystem built between browsers and CAs can be the building blocks for something that has a dramatic impact on the great problem of identity assurance and trust on the web.
  • There is some validity to the idea that a site with a premium certificate, specifically an EV, has a lower propensity to be used for malicious purposes. It's not so much the cost, but the level of effort required to forge all the required documents and forms of proof (which sometimes requires updating government systems).
  • Validating an entity is valuable, whether they are doing something malicious or not. The process of validating helps collect real information that can be used later if required.
  • Another anecdotal insight comes from what the idea of "validating" actually means to a domain holder. It's arguable that an organization that goes through the process of validating its domain cares enough about its identity and security to have more controls than the average Joe to ensure the integrity of its site. This is especially important when you think about how most phishing attacks happen today (i.e., benign sites being hacked and used maliciously).

Where I disagree is with the statements that removing the UI indicator is the solution, or that EVs deter phishing attacks.

A failure to understand the indicator doesn’t mean the indicator isn’t valuable, but rather that we should work harder to pull the value forward.

Ironically, there is probably no greater example of the power of awareness and education than Google's very own #httpseverywhere campaign, in which Google drove home the importance of an HTTPS/SSL indicator by leveraging their greatest asset – SERP rankings. This initiative worked to educate consumers to look for the "lock" and "secure" indicators, which makes me believe we can educate web consumers.

We live in a world where trust online is growing in importance. As such, we should be leaning into solutions that help pull that value forward. There are over a billion websites live on the web, and growing. Web consumers struggle every day to understand which websites they can and should interact with.

As a community we should revisit the value and purpose of the premium certificates, specifically EV’s, and place emphasis around things like “trust” and “assurance.” We should work to pull that value forward in a way that we can help consumers differentiate and recognize easily.

Disclaimer

In full disclosure, I'm GoDaddy's General Manager (GM) for the Security product group. This business line includes GoDaddy's Certificate Authority (CA), which means we sell SSL certificates. The portfolio also has considerable depth in the web presence domain, with features like a Web Application Firewall (WAF), Content Delivery Network (CDN), security scanning, brand monitoring, incident response, Premium DNS, website backups, and the Sucuri brand.


Analyzing Sucuri's 2018 Hacked Website Trend Report

Published in Security on April 15, 2019

The Sucuri team recently released their second annual security report – Hacked Website Report 2018. It looks at a representative sample of infected websites from the Sucuri customer base only. This report helps us understand the actions taken by bad actors once they penetrate a website.

This report analyzed 25,466 infected websites and 4,426,795 cleaned files, aggregating data from the Threat Research Group. This is the team that works side-by-side with the owners of infected websites on a daily basis; its members also generate a lot of the research shared on the Sucuri Blog and Sucuri Labs.

This report is divided into the following sections:

  • Top affected open-source CMS applications
  • Outdated CMS risk assessment
  • Blacklist analysis and impact on webmasters
  • Malware family distribution and effects

This post will build on the analysis found in the report and share additional insights from the report's webinar.


CMS ANALYSIS

The analysis shows that in 56% of the hacked sites, the core platform was patched with the latest version. The real insights, however, come into focus as we dive into the distribution of the specific CMSs in the sample base.

2018 – Sucuri CMS Distribution of Out-of-Date Core at Point of Infection

Although WordPress is the platform that is most up-to-date at the point of infection, it continues to be the #1 platform we see in our environment.

2018 – Platform Distribution in Sucuri Sample Base

This is undoubtedly related to Sucuri's popularity in the platform's ecosystem, but with 60% market share of CMS applications and 34% of the websites on the web, its representation is also understandable. What this also highlights is that something else is contributing to these hacked sites.

2018 – WordPress Out-of-Date State at point of infection

WordPress Version – In the 2016 report, 61% of the WordPress sites had been out of date at the point of infection. In 2018, this number dropped to 36.7% (2017: 39.3%). Overall I'd say that's pretty amazing, and a direct reflection of the hard work by the WordPress security team to introduce and deploy auto-updates.

E-commerce Platforms – The platforms that concern me the most are the ones used for online commerce. They represent a big percentage of the platforms that are out-of-date at the point of infection – Magento (83.1%), OpenCart (91.3%), and PrestaShop (97.2%). These are the core applications users are leveraging to perform online commerce transactions. This is especially concerning because, unlike WordPress, these platforms are still experiencing critical vulnerabilities in their core. Coincidentally, these are also the platforms that have security obligations set forth by the Payment Card Industry (PCI) Data Security Standards (DSS), one of which is keeping software up-to-date (Requirement 6).

PCI Requirement 6.2: Ensure that all system components and software are protected from known vulnerabilities by installing applicable vendor-supplied security patches. Install critical security patches within one month of release. – 2018 Payment Card Industry (PCI) Data Security Standard, v3.2.1

Another theme you'll find with this cohort is that they are also the platforms that struggle with backwards compatibility. This speaks directly to the complexity of upgrading these platforms, which, when coupled with human behavior, is a recipe for disaster.

Common Issues & Threats

While the report does show an increase in WordPress sites year over year, it's not indicative of the platform being more or less secure. The leading contributors to website hacks, holistically speaking, can be boiled down to two key categories:

  • Credential Stuffing (Brute Force Attacks)
  • Exploitation of Vulnerabilities in Third Party Ecosystems

I won't spend much time talking about credential stuffing, the act of stuffing access control points with different username/password combinations; instead I want to focus our discussion on the third-party ecosystems.

The accompanying webinar did peel back the layers on the threats posed by the third-party ecosystem (e.g., plugins, modules).

2018 – Identified and Analyzed Vulnerabilities in CMS Third-Party Ecosystems

Of the 116 WordPress vulnerabilities Sucuri identified, 20 were categorized as severe (17%), as were another 28 in Joomla! (50%). Of the 196 total vulnerabilities, 35 had an installation base of over 1 million users. 2019 has seen a spike in the number of vulnerabilities hitting the market; to date, severe WordPress vulnerabilities are already at 50% of the total identified in 2018.

The one platform you don't see in this analysis is Magento. For that, I would leverage insights from Willem's Lab. His insights on the platform and its ecosystem are spot on. Unlike WordPress, Magento has predominantly been plagued by core vulnerabilities (e.g., ShopLift, circa 2015), but the end of 2018 and the beginning of 2019 are seeing a shift in which the platform's third-party ecosystem is becoming the attack vector of choice.

Note: If you’re a Magento operator, I encourage you to leverage the new central repository of insecure modules released by a group of Magento professionals. A similar repository exists for WordPress.

BLACKLIST ANALYSIS

The report highlights the distribution of blacklisted and non-blacklisted sites at the point of infection. This illustrates a) the different indicators of compromise and b) the effectiveness and reach of blacklist authorities.

2018 – % of Hacked Websites Blacklisted at Point of Cleanup

This year we saw a 6% drop (17% -> 11%) in the blacklist state of the sites we worked on. It's difficult to say exactly why, but it's likely related to how these blacklists operate. It does highlight the need to have a comprehensive monitoring solution as part of your security controls; depending solely on authorities like Google, Norton, and McAfee is not enough.

This becomes even more evident when you look at the detection effectiveness across the different authorities.

This year we saw Google drop from 12.9% to 10.4%, and we saw Yandex join the top 4 (previously it was not material enough to rank). We also saw McAfee drop about 4%, while Norton continues to lead the detection rate at 46.1%.

Not all blacklist authorities are the same.

Google is the most prominent because it's the one most browsers leverage, most notably Chrome. The Sucuri team put together a great guide to understanding the different Google warnings. When Google detects an issue it presents users with a red splash page, stopping a visitor dead in their tracks.

2018 – Example Google Blacklist Block

Other entities are effective for a different reason; for instance, when Norton or McAfee flags you, anyone using their desktop AV client will be prevented from visiting the site, or at least notified of an issue. These entities also share their APIs with a number of different services and products; a great example is Facebook's use of McAfee to parse malicious domains.

2018 – Example AV Blacklist Block

Being blacklisted by one doesn't necessarily mean the others will follow, and being removed from one doesn't mean the others will respect that state change. This introduces a lot of stress and frustration for website owners. The best approach to managing this is to register with as many of them as possible so that you can maintain a direct relationship with each:

  • McAfee Site Advisor: http://trustedsource.org/
  • Norton SafeWeb: https://safeweb.norton.com/tags/show?tag=WebMaster
  • Yandex Webmaster: https://webmaster.yandex.com/
  • Google Webmaster: https://www.google.com/webmasters/#?modal_active=none
  • Bing Webmaster: https://www.bing.com/toolbox/webmaster

MALWARE FAMILIES

This section shows you what attackers are doing once they have access to your environment. It helps shed light on “intent”.

2018 – Malware Family Distribution (Sucuri Labs)

It is very common for sites to have more than one payload, which is why the report represents sites with multiple malware families. Backdoors are a great example of the type of thing you can expect to find in any compromise.

Backdoors are payloads designed to give attackers continued access to an environment, bypassing existing access controls. They were found in 68% (a modest 2% drop from 2017) of the infected sites analyzed. A backdoor is one of the first things an attacker will deploy, ensuring that even if their actions are discovered, they retain access and can continue to use the site for nefarious actions. It is one of the leading causes of reinfection, and the most commonly missed payload.

2018 – SEO Spam Growth

Last year I called out the continued rise of SEO spam; this year was no different.

This is the result of a Search Engine Poisoning (SEP) attack, in which an attacker attempts to abuse the ranking of your site for something they are interested in. Years ago this was almost synonymous with the Pharma Hack, but these days you see attackers leveraging it in a number of other industries (e.g., fashion, loans, etc.). You can expect this in any industry where impression-based affiliate marketing is at play.

Example Site with SEO SPAM

The team's analysis highlighted an impressive increase (78%) in the number of files being cleaned in each case. This shows the pervasiveness you should expect after every compromise.

2018 – Sucuri Report – # of Files Affected Post-Compromise

It's not enough to clean the files you see; instead, perform a deep scan across files and databases to ensure everything has been removed.
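
As an illustration of what such a scan can look like, here is a minimal sketch (assuming Python 3, a baseline snapshot taken from a known-clean install, and placeholder paths you would adjust) that diffs current file hashes against that baseline:

import hashlib
import json
from pathlib import Path

def snapshot(root: str) -> dict:
    """Hash every PHP file under root so later scans can diff against it."""
    return {
        str(p): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(root).rglob("*.php")
    }

# On a known-clean install, save a baseline first:
#   Path("baseline.json").write_text(json.dumps(snapshot("/var/www/html")))

baseline = json.loads(Path("baseline.json").read_text())
current = snapshot("/var/www/html")

for path, digest in current.items():
    if path not in baseline:
        print("NEW FILE:", path)   # possible dropper or backdoor
    elif baseline[path] != digest:
        print("MODIFIED:", path)   # e.g., index.php, functions.php, wp-config.php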

Of the files affected, there were some trends in the file types targeted. The top three modified files were index.php (34.5%), functions.php (13.5%), and wp-config.php (10.6%).

Every file saw an increase over 2017, and there was a change in the top three – .htaccess dropped out to make room for wp-config.php.

2018 – Top Three Files Modified Post-Compromise

The report outlines how each of these files is being leveraged, and by which malware families. These three files are not a surprise; they are popular because they load on every site request, belong to core files, and are often ignored by integrity monitoring systems.

Fio Cavallari & Denis "Unmaskparasites" Sinegubko provide great details on what attackers are using these files for.

Index.php

  • Approximately 34.5% of sites had their index.php files modified after a compromise.
  • The index.php file is modified by attackers for a variety of reasons, including malware distribution, server scripts, phishing attacks, blackhat SEO, conditional redirects, and defacements.
  • 24% of index.php files were associated with PHP malware responsible for hiding a file inclusion.
    • This malware calls PHP functions like include and include_once, replacing the file path characters with corresponding hexadecimal and mixed-up alphabetic characters (see the sketch after this list).
  • 15.8% of index.php files were affected by malicious PHP scripts disguised using absolute paths and obfuscated characters, hidden within seemingly innocent files.
    • Instead of injecting the full malware code into a file, this method makes the malware more difficult to detect by using PHP includes and obfuscation.
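
A crude heuristic for spotting that hex-escape obfuscation – a minimal sketch, not a substitute for a real malware scanner, with a placeholder web root you would adjust:

import re
from pathlib import Path

# Obfuscated includes often hide code as runs of hex escapes, e.g. "\x69nclude".
# Flag PHP files containing four or more consecutive hex-escaped bytes.
HEX_RUN = re.compile(rb"(\\x[0-9a-fA-F]{2}){4,}")

for path in Path("/var/www/html").rglob("*.php"):
    if HEX_RUN.search(path.read_bytes()):
        print("suspicious hex-escaped string in:", path)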

Functions.php

  • 13.5% of compromised sites had modified functions.php files, which are often used by attackers to deploy SEO spam and other malicious payloads, including backdoors and injections.
  • Over 38% of functions.php files were associated with SEO spam injectors:
    • Malware that loads random content from a third-party URL and injects it into the affected site.
    • Able to update configurations through a remote command.
    • Doesn't explicitly act as a backdoor, but can use the function to load any kind of code – including a backdoor.
    • Usually found in nulled or pirated themes and plugins.
  • 8.3% of functions.php files were impacted by generic malware.
  • 7.3% of files were associated with PHP.Anuna, which injects malicious code into PHP files.
  • Malicious payloads vary from spam injection, backdoors, and the creation of rogue admin users to a variety of other objectionable activities.

WP-Config.php

  • wp-config.php was the third most commonly modified file (10.6%).
  • It contains sensitive information about the database, including name, host, username, and password. It is also used to define advanced settings, security keys, and dev options.
  • 11.3% of wp-config.php files were associated with PHP malware responsible for hiding a file inclusion, also commonly seen with index.php.

CryptoMining and Ransomware

As we talk about what attackers are doing post-compromise, it’s worth spending a few minutes on Cryptomining and Ransomware.

In 2017, we saw the rise of ransomware across the entire InfoSec ecosystem. Its impact on websites, however, was marginal because of its ineffectiveness; mitigating a ransomware attack on a website is relatively straightforward: have a backup.

Cryptomining, however, is a bit of a different story.

Relationship Between Crypto Currency and CryptoMining Activity (2018 CheckPoint Report)

Cryptomining is the act of verifying and adding cryptocurrency transactions to the blockchain ledger. Under this model, the spoils belong to the group that processes the request the fastest. To achieve that you need processing power, and this is where sites and their associated hosts come into play.
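
To make "processing power" concrete, here is a toy proof-of-work loop – a minimal sketch; real mining runs at vastly higher difficulty and scale – showing why attackers want your CPU cycles:

import hashlib
import itertools

# Find a nonce whose SHA-256 hash of (block_data + nonce) starts with N zero
# hex digits. Each extra digit multiplies the work by 16, which is why miners
# steal cycles from your visitors' browsers or your web server.
block_data = "example-transactions"
difficulty = 5

for nonce in itertools.count():
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    if digest.startswith("0" * difficulty):
        print(f"nonce={nonce} hash={digest}")
        break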

Although we saw a decrease in cryptomining activity in 2018, it’s an interesting payload to pay special attention to.

What you see in CheckPoint's report is the correlation between the "value" of a coin and cryptomining activity. In other words, as the price of cryptocurrency increased (think back to the 1 BTC = $19k days), so did the cryptomining activity.

Analyzing this behavior (Thanks Thiago), you also find the following actions:

  • 67% of all cryptomining signatures were related to client-side infections with JavaScript-based miners like CoinHive.
    • This means these payloads are abusing your browser's processing power on your users' local machines (ever visit a site and have your browser die, or start chewing up a lot of local resources?).
  • The remaining 33% of cryptominers were server-side and used PHP to mine digital currencies.
    • This means these payloads are abusing the host server, the server housing your website. This can lead to your hosting provider shutting down your site, or to degraded site performance.

I am particularly fond of this payload because it's a great example of what we can expect from attackers when incentives are aligned. While I don't expect to see much activity with website ransomware, I do expect to see more cryptomining when the incentive increases (e.g., the value of cryptocurrency rises again).


I encourage you to read through Sucuri's Hacked Website Report for 2018. It's perfect for website owners who want to understand the threats they face as they get their ideas online.

If you're an online consumer wondering how you can protect yourself from falling victim to hacked websites, I encourage you to spend some time learning how DNS resolvers, like CleanBrowsing, can help keep infected websites from reaching your devices.

Watch Sucuri’s webinar, with yours truly, below:

Sucuri 2018 Hacked Website Webinar

The Evolving World of DNS Security

Published in Security on March 2, 2019

I was recently at an event listening to representatives of ICANN and CloudFlare speak on DNS security, and it occurred to me that very few of us really understand or appreciate its nuances. It also so happens that the past 5 years have brought forward a lot of curious and interesting developments in one of the last untouched founding components of the internet.

DNS Primer

The Domain Name System (DNS) is comprised of a number of different domain name servers. I wrote an article that offers an illustration and a better understanding of how the entire DNS ecosystem works together. There is an even cooler illustration explaining how DNS works.

Read More


Installing OSSEC on Linux Distributions

Published in Security on January 3, 2019

The last few posts have been about deploying and configuring OSSEC as an important tool in your security suite. In this article I provide a script I wrote to help you quickly deploy OSSEC.

This script assumes you are deploying on a Linux distribution (e.g., Fedora, Ubuntu, CentOS, or Debian). It will force you to choose a distribution before it runs; this ensures it installs the appropriate dependencies for the distribution type.

Read More


Tips to Protect Your Domain[s] Investments

Published in Security on November 20, 2018

A few months back I was working with a customer that was having the worst day of their life. Attackers had taken full control of their most critical digital assets – their domains and the domains of their customers.

The organization affected was an agency. They built and managed sites for their customers, and in a relatively short period they lost access to their sites and their email. In this article I'll share what happened and offer tips that would have made it a lot harder for the attackers to hijack their domains.

Read More


A Primer on DNS and Security

Published in Security on November 4, 2018

If you’re reading this article you’ve interacted with DNS. In fact, you’d be hard pressed to spend any time online and not interact with DNS.

Many of us spend very little time thinking about it. By design, it’s a “set-it and forget-it” tool that is often set up on our behalf (e.g., our home network, local ISP, office network). Ironically, it’s a critical piece of our security landscape.

This post will explain what DNS is and highlight some of its key security considerations.

Read More


How HTTPS Works – Let’s Establish a Secure Connection

Published in Security on October 28, 2018

The need to use HTTPS on your website has been spearheaded by Google for years (since 2014), and in 2018 we saw massive improvements as more of the web became encrypted by default. Google now reports that 94% of its traffic on the web is encrypted.

What exactly does HTTPS mean, though? And how does it relate to SSL or TLS? These are the most common questions I get when working with customers, and in this article I hope to break it down for the everyday website owner.

Read More


The 2018 Facebook Data Breach

Published in Security on October 20, 2018

On September 28th, 2018, Facebook announced its biggest data breach to date. They estimated 50 million accounts were affected at the time of the disclosure. Subsequent to the disclosure, security professionals from all verticals took to the interwebs to provide what most would consider sensible advice:

  1. Time to update your passwords;
  2. Time to enable Two Factor Authentication (2FA);

Neither, however, offers users an appropriate response to this type of data breach. This knee-jerk reaction speaks to a bigger problem we have in our community: misinformation, often a result of our own lack of understanding of a problem.

Read More


Analyzing Sucuri’s 2017 Hacked Website Trend Report

Published in Security on April 6, 2018

The Sucuri team just released their first annual security report that looks at telemetry from hacked websites – Hacked Website Report 2017. It uses a representative sample of infected websites from the Sucuri customer base to better understand end-user behavior and bad-actor tactics.

It specifically focuses on 34,371 infected websites, aggregating data from two distinct groups – the Remediation and Research teams. These are the teams that work side-by-side with the owners of infected websites on a daily basis; their members also generate a lot of the research shared on the Sucuri Blog.

In this post I will expand on the analysis shared, and add my own observations.

Read More


Diving Into the Dark Web and Understanding the Economy Powering Cyber Attacks

Published in Security on March 20, 2018

This morning, Armor, a cloud security provider, released a great report on the cybercrime black market. Armor was formerly known as FireHost – one of the leading hosts boasting security first – and has dramatically evolved over the years. The report was put together by the Armor Threat Resistance Unit (TRU), who extrapolated data from a number of dark web sources, focusing specifically on the fourth quarter of 2017 (2017/Q4).

The report strives to give us a view into an otherwise elusive world, specifically highlighting the economic foundation of cybercrime. Understanding the criminal economy is critical to understanding the ease of use, motivations, and behaviors of bad actors.

Effective security takes more than technology; it requires realtime knowledge of the threat landscape and risks to your data. – The Black Market Report

Read More


Phishing and Ransomware Lead Security Concerns for Organizations

Published in Security on August 22, 2017

The SANS Institute recently released their 2017 Threat Landscape Survey: Users on the Front Line, in which they interviewed 263 IT and security professionals about the things that keep them up at night. The survey was conducted in May/June of 2017, so it's no surprise ransomware was top of mind (e.g., WannaCry and Petya dominated the media). I am constantly amazed at the continued impact of phishing threats.

This survey helps provide a deeper appreciation for what the security domain is faced with, while also providing insights into what the SMB market should be aware of (but often is not). This specific audience is technically capable, with a vested interest in security (it's their job as security/IT professionals), and it stands in stark contrast to the SMB market.

Read More


Google Begins Campaign Warning Forms Not Using HTTPS Protocol

Published in Security on August 17, 2017

In August 2014, Google released an article sharing their thoughts on how they planned to focus on their "HTTPS everywhere" campaign (originally initiated at their Google I/O event).

The premise was that every website, regardless of what it was doing, should communicate securely between point A and point B. To help motivate website owners, Google went right for the carotid artery by making HTTPS a ranking factor in search.

In December 2015, Google adjusted their crawlers to start prioritizing and indexing HTTPS pages by default. If you had both HTTP and HTTPS, they would give more weight to your HTTPS pages.

Read More


Password Management

Published in Security on June 27, 2017

The year is 2017 and we continue to give advice on the process of creating passwords. This must stop. The phrase "these are the tips for creating a secure password" should be stricken from all presentations, articles, tips, and side-bar conversations.

Managing passwords has never been more streamlined. Organizations have invested countless hours and resources into building solutions that seamlessly integrate into our habits, and every business owner and individual should invest energy into integrating password managers into their overarching security program. So ask yourself: why are you, or your organization, not employing the tools designed to protect you from yourself?

Read More


We Must Improve the HTTPS Message

Published in Security on December 4, 2016

HTTPS is as important today as it has ever been. If you are transferring sensitive data you should use HTTPS to encrypt data in transit; that is not up for debate. Understand, though, that it is but one piece of a larger security conversation, and that's where the message falls flat on its face.

I shared my thoughts last year on how HTTPS does not secure websites, and in the time since, the message has only grown, as is to be expected. You've seen exponential growth of the LetsEncrypt initiative, which we fully support at Sucuri (making us one of the first cloud CDN / firewall solutions to do so). Additionally, organizations of all sizes have been adamantly pushing the importance of SSL, hosting and service providers alike. The WordPress Foundation, the organization spearheading the growth of the WordPress platform (currently at 27% market share of all websites), also recently announced that it would only promote hosting companies that offer SSL by default:

Read More


Google Introduces New Repeat Offender Blacklist

Published in Security on November 9, 2016

On November 8th, 2016, Google introduced a new feature to Chrome that would blacklist repeat offenders.

Once Safe Browsing has designated a site as a Repeat Offender, the webmaster will be unable to request additional reviews via the Search Console. Repeat Offender status persists for 30 days, after which the webmaster will be able to request a review.

The feature was introduced to address a trend Google noticed in which websites would remove infections just long enough to have warnings lifted, only to reintroduce the harmful content once cleared. They go on to say that this won't affect websites that have been hacked; in other words, it's built to address intentional misuse rather than unintentional.

Please note that websites that are hacked will not be classified as Repeat Offenders; only sites that purposefully post harmful content will be subject to the policy.

I can’t help but ask myself – how are they going to differentiate between intentional and unintentional misuse?

Read More


Defense in Depth And Website Security

Published in Security on October 23, 2016

The concept of Defense in Depth is not new. It's been leveraged in the InfoSec domain for a long time and has its roots deeply embedded in military strategy and tactics. That, however, doesn't mean that even those in the InfoSec domain explain or implement it correctly. To fully appreciate the idea of Defense in Depth you have to subscribe to a very simple idea:

There is no single solution capable of providing 100% protection for any environment.

I recently wrote an article on the Sucuri blog sharing some thoughts on how I feel we should think about the concept, and how we should go about deploying it within our technology stacks and organizations. I expanded my thoughts this past weekend at the BadCamp Hack The Planet summit in Berkeley where I shared some of the challenges we face in the website security domain pertaining to the subject.

The idea of Defense in Depth is simple: employ as many complementary defensive controls as makes sense for you and your organization. The operative word is "complementary." It's based on the idea that every tool has a weakness, so find tools that address each other's weaknesses and work in unison with one another. This does not mean you deploy every tool available; instead, you must strategically map out the threats that concern you most and pose the biggest impact to your organization, and prioritize your defensive posture accordingly.

Today's threats are evolving at a faster clip than any one solution or team can account for. It's not a matter of finding the 100% solution, but of deploying the things we need to help reduce the growing risk. This has never been truer than in the website security domain. If employed correctly, we should be better prepared to quickly identify issues, mitigate threats, and respond to incidents if required. Attackers only need to win once. As defenders, we have to win every time.

 

