Key qualities to look for in emerging specs
By Terry Gold, Founder, IDAnalyst LLC
I engage in a variety of conversations with manufacturers, integrators and end users in the physical access control space about systems that must live for a decade or two. Traditionally, technology decisions have been made with heavy influence from existing relationships, cost sensitivity and feature sets, from the perspective of those who will operate these systems. But for end users conducting long-term strategic planning, that conversation is changing, and part of this new conversation surrounds physical access standards.
Physical security professionals have to make decisions that serve the overall organization rather than a closed group. Their decisions are being driven by the need to offer value and become more than just a corporate cost center. Physical access control systems need to reduce risk and cost, increase efficiency and add value. They also need to collect better intelligence and enable collaboration with other departments for improved incident response and remediation.
These requirements aren’t unique to physical access; they reflect a common maturity cycle for organizations mandated to increase profitability. In turn, these pressures place demands on vendors to design products that enable this goal.
The legacy challenge
Years of regional decision-making and acquisitions have led to a collection of disparate physical access control systems. The physical security industry has an uneven record when it comes to driving interoperability with the implementation of standards, which has resulted in silos of infrastructure.
Traditional approaches to dealing with this have been limited – forcing organizations to “rip and replace” silos with yet another proprietary technology from a single vendor. But this isn’t sustainable. It leaves end users highly dependent, if not “locked in,” for the lifespan of the investment – a commitment that cultivates the same long-term dependencies, limits adoption of competing innovation, results in uncompetitive pricing and reduces pressure on the incumbent vendor to innovate.
IT has been working under the same set of drivers for more than a decade and the solutions available to them are much more advanced. The bottom line is that IT is far ahead of the physical access control side and pressure to align with them means playing catch-up by taking a page from their playbook. This is why standards are the key ingredient.
What physical access control standards can bring to the table
Think of standards like Bluetooth headsets. As a consumer, I really don’t want to purchase a specific Bluetooth headset that only works with my handset model. Rather, I prefer to just know that anything called Bluetooth works with other devices that use Bluetooth. I can select from a range of vendors offering a variety of features, price, quality, performance and design.
If I need to get another phone, my previous investment will still work and the same benefits will remain even when paired with a phone from another manufacturer. Physical access infrastructure should work in a similar manner.
Shades of gray
In physical access, standards are still the subject of confusion. What constitutes a standard? It’s a topic with room for variance of opinion and shades of gray. Books can – and have – been written on the topic in an attempt to clarify. There are, however, some universally accepted principles that can guide readers to their own conclusions when assessing whether something is a standard or not.
Standards are always specifications, but not all specifications are standards. Think of a specification like a blueprint explaining “how” to build or execute. And then look at a standard as a common agreement for that particular blueprint.
There are different paths that standards can take from birth to maturity. Typically they start with interested parties getting together to solve a problem, defining a charter and working on an initial specification. From here, it starts to get fuzzy. In general, mature efforts define processes for reasonably inclusive participation, neutrality in competing interests and control, appeals and common agreement.
The measure of ‘open’
Standards typically contain a defined set of specifications that govern data formats, protocols and interfaces. If any one vendor solely controls any of these, it fosters many of the problems we have spent years trying to remediate. Single-vendor control limits choice, and it constrains not only interoperability but also the functions and services one can interface with. At best, a “toll” is paid to that vendor for very specific and often limited use.
Openness is important because it can be used constructively to determine how benevolent a standard really is. To gauge that openness, one should ask several questions:
- What is the process by which the standard was created?
- Who maintains it after the initial version?
- Is there a commitment to backward compatibility for early and subsequent adopters?
- Is it extensible? If so, does it impact compliance?
- Are the specifications reasonably accessible?
- Is it too restrictive to achieve desired goals?
From my own perspective, I lean toward standards that are transparent in their mission and process. I also look for those that are democratic and engage a broader community, are less restrictive to access, and can be implemented, extended and reused free of encumbrances.
The good news is that there has been a great deal of focus and progress in this area over the past few years, bringing tangible results. What follows is not a complete list, but a few examples that illustrate cooperation and execution across a subset of the physical access community.
OSDP (Open Supervised Device Protocol) is a specification that addresses key limitations of the legacy Wiegand protocol, which defines data transfer between access control readers and systems. Using serial RS-485 cabling, OSDP enables readers to communicate bi-directionally with a control panel. Where updating firmware and settings was a laborious process performed locally at each door, it becomes possible to do so centrally and remotely, and to push out useful notifications, among other things. OSDP also leverages GlobalPlatform’s Secure Channel Protocol, a secure-messaging method widely accepted in the smart card industry, to protect traffic between readers and controllers, making up for where Wiegand increasingly falls short. It is also extensible to allow transport over TCP/IP, so it is both backward compatible and forward capable.
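To make the contrast with Wiegand concrete: OSDP frames addressed, integrity-checked packets on a shared bus, rather than streaming raw card bits one way. The sketch below shows OSDP-style framing in pure Python. The start-of-message byte (0x53), header layout and the poll command code (0x60) follow the published packet structure, but this is an illustration only, not a compliant implementation – real OSDP adds sequence numbers, a CRC-16 option and secure channel blocks.

```python
# Illustrative sketch of OSDP-style packet framing (not a compliant
# implementation; real OSDP adds sequence numbers, a CRC-16 option
# and secure channel blocks).

SOM = 0x53  # start-of-message byte defined by the OSDP spec


def build_packet(address: int, command: int, data: bytes = b"") -> bytes:
    """Frame a command for one reader on the shared RS-485 bus."""
    # header: SOM, device address, 2-byte little-endian length, control byte
    length = 5 + 1 + len(data) + 1  # header + command + data + checksum
    header = bytes([SOM, address, length & 0xFF, (length >> 8) & 0xFF, 0x00])
    body = header + bytes([command]) + data
    # single-byte checksum: two's complement of the byte sum (checksum mode)
    checksum = (-sum(body)) & 0xFF
    return body + bytes([checksum])


def verify_packet(packet: bytes) -> bool:
    """A controller or reader can validate integrity before parsing."""
    return packet[0] == SOM and sum(packet) & 0xFF == 0


# e.g. an osdp_POLL command (0x60) to the reader at bus address 1
pkt = build_packet(0x01, 0x60)
assert verify_packet(pkt)
```

Because every packet carries an address and an integrity check, a single panel can supervise many readers on one cable run and detect tampered or corrupted traffic – neither of which Wiegand can do.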
ONVIF (Open Network Video Interface Forum) was started back in 2008 to harmonize interoperability between network video vendors for the benefit of end users and integrators implementing the devices. It is a good example of an open approach in governance and leveraging existing IT standards, such as Web Services, as opposed to reinventing something new and obscure.
PSIA (Physical Security Interoperability Alliance) is focused more broadly on interoperability across various IP-enabled devices to achieve plug-and-play functionality and, in turn, enable a variety of services to be shared for greater actionable intelligence.
PIV (Personal Identity Verification) is an initiative on the identity and credentialing front, created by and for the U.S. government. PIV filled a void, particularly in the smart card market, where the lack of interoperability standards had long been a barrier to adoption and maintenance. It also opened a path to using PKI in physical access, which is valuable because it offers an alternative to symmetric key implementations.
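The value of that alternative is easiest to see against the symmetric model it replaces. Below is a minimal, stdlib-only sketch of a symmetric challenge-response of the kind many legacy credential systems use; the key size and function names are illustrative assumptions, not drawn from any specification, and the comments note what changes under a PIV-style PKI flow.

```python
# Symmetric challenge-response, as used in many legacy contactless systems:
# every door reader must hold a copy of the same shared secret, so one
# compromised reader can expose every credential keyed to that secret.
import hashlib
import hmac
import secrets

SHARED_KEY = secrets.token_bytes(16)  # illustrative; provisioned to card AND readers


def card_respond(challenge: bytes, key: bytes) -> bytes:
    # the card proves knowledge of the key without revealing it
    return hmac.new(key, challenge, hashlib.sha256).digest()


def reader_verify(challenge: bytes, response: bytes, key: bytes) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


challenge = secrets.token_bytes(16)  # fresh nonce per transaction
response = card_respond(challenge, SHARED_KEY)
assert reader_verify(challenge, response, SHARED_KEY)

# Under a PIV-style PKI flow the shape is the same, but the card signs the
# challenge with a private key it never shares, and the reader verifies it
# with the public key from the card's certificate. Readers then hold no
# long-term secrets, which removes the key-distribution burden entirely.
```

That key-distribution difference is the practical payoff: adding or revoking a credential no longer means touching the secret material in every reader in the estate.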
OPACITY (Open Protocol for Access Control, Identification and Ticketing with PrivacY) addresses the performance and complexity challenges that PKI presents on a contactless platform, while also providing the openness and security the market increasingly demands, such as protection against leakage of identifiers.
PLAID (Protocol for Lightweight Authentication of Identity) is a contactless standard developed by the Australian government to address its requirement for stronger contactless security.
Both OPACITY and PLAID are open source – the source code can be downloaded and has flexible terms for use and reuse – and share similar goals and principles. OPACITY has the edge for adoption in the U.S. given its approach in alignment with existing U.S. government specifications and registration with ISO as an authentication protocol. It is currently under review by ANSI (American National Standards Institute) for standards adoption.
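The privacy property both protocols pursue rests on establishing a fresh session key over the contactless interface before any identifier is exchanged; OPACITY does this with elliptic-curve Diffie-Hellman. As a concept illustration only, here is a toy finite-field Diffie-Hellman exchange in pure Python. The classic (non-elliptic) construction and the small, insecure parameters are deliberate simplifications, and nothing below is taken from either specification.

```python
# Toy Diffie-Hellman key agreement, illustrating the idea behind
# OPACITY-style session establishment. Real OPACITY uses elliptic-curve
# Diffie-Hellman with standardized curves; these parameters are
# insecurely small and purely illustrative.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; far too small for real use
G = 5           # illustrative base


def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)


# card and reader each generate an ephemeral keypair for this transaction
card_priv, card_pub = keypair()
reader_priv, reader_pub = keypair()

# each side combines its own private key with the other's public value;
# an eavesdropper sees only the public values
card_secret = pow(reader_pub, card_priv, P)
reader_secret = pow(card_pub, reader_priv, P)
assert card_secret == reader_secret  # both arrive at the same shared secret

# derive a per-transaction session key; subsequent traffic, including any
# card identifiers, can then be encrypted, addressing identifier leakage
session_key = hashlib.sha256(card_secret.to_bytes(16, "big")).digest()
```

Because the keypairs are ephemeral, each tap of the card yields a different session key, so a passive sniffer near the reader cannot correlate transactions or harvest a stable identifier from the air.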
There is still a choice
So does all this mean that there’s no place for proprietary technology? Not at all. A vendor may have an approach that is patented and exclusive but incredibly valuable in a given situation. And vendors that leverage standards can still decide how they carry out the specification, how well they do it, and what additional services and functions they offer that, used in conjunction, make a stronger value proposition.
The choice is up to the customer, and there is no right or wrong choice. In terms of security, open standards promote accessibility and, in turn, peer review and testing across a large and competent community. This process is very good at discovering and correcting vulnerabilities. It is therefore only logical, when considering proprietary approaches, to demand similar transparency and to ensure that claims can be validated rather than being told to “just trust us.”
The benefit of community
Standards can, if executed properly, bring together a community wanting to solve the same problem. The individual standards development efforts are important, but more significant are the communities being built to solve the long-standing challenges that have prevented real progress and created the chasm between IT and physical access control.
While we will have to wait and see which specifications are accepted and widely adopted, the participation of vendors, integrators, end users, trade organizations and others at the table is the new reality. This in itself will foster innovation and accelerate progress across an industry that had become complacent.