Application Whitelisting

Application whitelisting is a proactive security control built on a simple principle: only explicitly trusted applications are allowed to run. Instead of trying to detect and block everything malicious, it permits only approved software; everything else is denied by default. This deny-by-default approach significantly reduces the risk of malware, shadow IT, and unauthorised activity.

When implemented correctly, whitelisting strengthens protection, enhances control, and simplifies endpoint monitoring. However, like any high-assurance control, it demands structure, ongoing oversight, and organisational commitment.

Benefits of Application Whitelisting:

  • Reduces the risk of malware infection by blocking unapproved executables before they can run.
  • Blocks unauthorised software installations that could introduce vulnerabilities or licensing exposure.
  • Limits shadow IT, ensuring that only vetted tools are used within the business.
  • Streamlines monitoring by enabling teams to focus on deviations from a known-good baseline.
  • Supports compliance by enforcing clear software control policies.

At its core, whitelisting software distinguishes between approved and unapproved applications using a range of attributes:

  • File name, location, and file size
  • Digital signature from the software publisher
  • Cryptographic hash of the file

Among these, the cryptographic hash provides the highest assurance of integrity. It ensures that only an exact version of an approved file can execute. Any modification, even minor, would result in a different hash and be blocked. This makes it far harder for an attacker to substitute or tamper with software undetected.

Relying solely on weaker attributes such as file name or file path introduces opportunities for circumvention. A malicious file could simply be renamed or placed in a whitelisted directory. Strong whitelisting solutions use multiple attributes in combination, with hashes providing definitive assurance of integrity.
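To make the hash comparison concrete, here is a minimal sketch in Python of hash-based allow-listing. The `APPROVED_HASHES` set and the helper names are hypothetical; a real whitelisting product enforces this check at the operating system or agent level, not in a script.

```python
import hashlib

# Hypothetical allowlist of SHA-256 hashes for approved executables.
# (The hash below is the SHA-256 of the bytes b"test", for illustration.)
APPROVED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hash of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_approved(path: str) -> bool:
    """Allow execution only if the file's exact contents are on the allowlist."""
    return sha256_of(path) in APPROVED_HASHES
```

Because the hash covers the file's entire contents, renaming the file or moving it into an approved directory changes nothing: any byte-level modification produces a different hash and the check fails.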

Key considerations when building an effective application whitelist:

  • Establish a clean baseline – use a standard build as a reference point. Perform a full scan to define an initial whitelist that can govern other endpoints.
  • Identify other software – business environments evolve, the software actually in use is not always documented, and legacy applications may underpin critical business processes. A whitelist built solely from the standard build would likely block legitimate activity, inconveniencing staff and disrupting business operations.
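The baselining step above amounts to scanning a clean reference build and recording a cryptographic hash for every executable. The sketch below is illustrative only: the scanned root and the file extensions are assumptions, and a real product would also capture publisher signatures and other metadata.

```python
import hashlib
import json
from pathlib import Path

def hash_file(path: Path) -> str:
    """SHA-256 hash of a file, read in chunks to handle large binaries."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(root: str, extensions=(".exe", ".dll")) -> dict:
    """Walk a clean reference build and record a hash per executable file."""
    baseline = {}
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            baseline[str(path)] = hash_file(path)
    return baseline

# The baseline can then be persisted to seed the whitelist on other endpoints:
# json.dump(build_baseline("C:/Program Files"), open("baseline.json", "w"), indent=2)
```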

Challenges and considerations:

  • Operational disruption – if the whitelist is incomplete or poorly maintained, legitimate users or systems may be blocked from essential tools.
  • Ongoing maintenance – software updates, new business requirements, and evolving job roles all require updates to the whitelist.
  • Resource demand – maintaining accuracy requires time, policy oversight, and either dedicated staff or vendor support.

Phased deployment minimises disruption. Begin with a pilot group to identify issues before implementing a wider rollout. A robust whitelisting policy should define approval criteria, update procedures, and exception handling. Regular audits of the whitelist are essential to avoid drift and reduce unnecessary entries.
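A regular audit largely reduces to comparing a fresh endpoint scan against the approved baseline. The sketch below assumes both inventories are simple path-to-hash mappings (hypothetical structure); it reports new unapproved files, entries that can be pruned, and files whose contents have changed.

```python
def audit_whitelist(baseline: dict, current: dict) -> dict:
    """Compare a current scan against the baseline and report drift.

    Both arguments map file paths to their SHA-256 hashes.
    """
    return {
        # Present on the endpoint but never approved.
        "unapproved": sorted(set(current) - set(baseline)),
        # Approved but no longer present: candidates for pruning.
        "missing": sorted(set(baseline) - set(current)),
        # Present in both, but the contents have changed since approval.
        "modified": sorted(
            p for p in set(baseline) & set(current) if baseline[p] != current[p]
        ),
    }
```

Running such a comparison on a schedule keeps the whitelist honest: stale entries are pruned rather than accumulating, and tampered files surface immediately.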

Application whitelisting is a powerful control that shifts the focus from blocking threats to enforcing trusted execution. Implemented correctly, it can reduce the attack surface, mitigate malware threats, and bring clarity to what software runs across your environment. A whitelist must be carefully constructed, actively maintained, and consistently enforced: treated not as a one-time project but as an ongoing control.

Inadequate SAM during Mergers & Acquisitions

Buying a company, or merging to create a new one, without understanding the current software licensing position can lead to significant and avoidable costs. Financial and legal due diligence is standard practice in mergers and acquisitions, but the IT implications seldom receive the same level of attention. Software Asset Management (SAM) is a recurring blind spot in IT due diligence, often leading to licensing exposure and unforeseen integration costs. Unplanned and unnecessary IT costs can significantly affect the overall economic viability of the deal, and IT due diligence deserves to become a standard part of the mergers and acquisitions process.

The post-merger environment brings significant change: role changes, redundancies, new computer systems and business processes, new geographical locations, and new legal jurisdictions. Licence compliance is likely to be at its lowest during this period, so it is not surprising that merger and acquisition announcements often prompt software licence audits by vendors.

Key questions to ask include:

  • What is the current position with software licensing across both companies?
  • What is the software licence shortfall value of the target company?
  • Has the software licence shortfall cost been factored into the sale price of the business?
  • How mature are SAM processes in both companies?
  • What will the software licence position look like after two companies have merged?
  • How much will it cost to purchase new licences?
  • Who owns the software licences?
  • Can the software licences be reallocated to a new organisation?
  • Are existing licences transferable under the terms of the original agreement, and are they valid in the new legal entity?
  • What opportunities are available to renegotiate licences?
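Several of these questions become simple arithmetic once the entitlement and installation data exists. The sketch below uses entirely illustrative company data, product names, and figures to show how a combined licence position and shortfall could be estimated before the deal closes.

```python
# Hypothetical per-product licence counts for each party to the deal.
company_a = {"OfficeSuite": {"owned": 500, "installed": 620}}
company_b = {"OfficeSuite": {"owned": 300, "installed": 250}}

def merged_shortfall(*companies: dict) -> dict:
    """Combine entitlements and installations, then report shortfall per product.

    A shortfall is the number of installations exceeding owned licences;
    a surplus in one company can offset a deficit in the other only if the
    licence terms actually permit transfer to the new entity.
    """
    totals: dict = {}
    for company in companies:
        for product, counts in company.items():
            t = totals.setdefault(product, {"owned": 0, "installed": 0})
            t["owned"] += counts["owned"]
            t["installed"] += counts["installed"]
    return {
        product: max(t["installed"] - t["owned"], 0)
        for product, t in totals.items()
    }
```

With the illustrative figures above, the merged entity owns 800 OfficeSuite licences against 870 installations: a shortfall of 70 that should be priced into the deal, subject to the transferability caveat in the questions above.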

Important points to remember are:

  • Proactive dialogue with software vendors during the merger or acquisition process will result in increased bargaining power and strengthen supplier relationships.
  • Reactive dialogue in response to a software vendor licence audit puts the licensee in a relatively weak position.

IT due diligence must become non-negotiable in any acquisition strategy. As software landscapes grow more complex and licensing models evolve, asset visibility is not optional; it is essential.

Improving Software Purchasing Decisions

Vendor software solutions usually have configurable settings to adapt to a particular environment. However, businesses should avoid solutions that require extensive custom development simply to meet baseline expectations. Without understanding the required level of configuration or customisation, the cost of a new solution can quickly skyrocket. Investigate the solution thoroughly before making a buying decision; using the system ‘out of the box’ should be a viable option.

Here are some examples, but these will vary depending on the type of system:

  • Identity and Access Management – does the solution have a built-in option to integrate with Active Directory or other directory services? Given how widely Active Directory is used, this should be standard for this category of software. Configuring the system to point at the right domain is expected; having to pay extra for the development of an integration module after purchase is not.
  • Support packages used to fix products – bundling support with software is common, but selling consultancy just to make the software function properly reflects poor product maturity. Sadly, this is often a fact of life in the IT sector, and the consultancy services can quickly become more expensive than a built in-house solution.
  • Existing integration modules – what systems does the solution already integrate with as standard out of the box?
  • Identify and assess all customisation requirements upfront – a viable solution should be functional out of the box. If there are special requirements that no other organisation shares, the software may need tailoring; assess whether that level of customisation justifies choosing the solution at all.

If the product needs modifications, consider carefully whether it is the right choice and investigate other options.

Stakeholder Engagement with HAM and SAM

Data silos emerge when teams operate on disconnected datasets, each assuming theirs is the definitive version and carrying out their work on the assumption that the data they use is complete, reliable, and trustworthy.

Organisations often drift into this state when different teams manage their areas, their hardware, and their software in isolation. Attempts may occasionally be made to collate the data into a more accurate holistic view, but without processes to maintain it, that data quickly becomes out of date. And where such aggregation is driven by the initiative of an individual team or team member rather than by senior-level strategic direction, data from some areas may be missing altogether.

If you allocated time and resources to building or buying a new solution:

  • The system must become the authoritative source of truth for hardware and software assets across the enterprise. If other teams need a subset of the data, it should come from this source rather than be compiled independently.
  • Where multiple teams manage hardware and software, the new HAM/SAM delivery project should have buy-in from all relevant departments to ensure that the system becomes the central system of record. Without this, the new solution will quickly become just another of many competing systems.
  • Understand any specific requirements that individual teams may have for data on hardware and software. If the new solution doesn’t meet the needs of all stakeholders, they are likely to continue compiling their data independently. Reports should not be generated using separate data sources but should come from a consistent data source.

Consider the following actions:

  • Make sure all stakeholders are involved in, or at least aware of, the new project.
  • Maintain regular communication with stakeholders, ensuring visibility into project status, decisions, and expected outcomes.
  • Catalogue all inventories or processes currently in use by teams throughout the business and identify any specific requirements at a local level.