Knowing employees, authenticating users and provisioning access across networks and in the cloud
But in the last five years this has changed. It’s still about authenticating and enabling access, but it’s also about provisioning access within an application, says Vishvas Patel, vice president and chief architect at IdenTrust. Instead of inserting the smart card, entering a PIN and having access to everything, federal enterprises are fine-tuning that access.
Enterprises have more applications that employees need to access, and they want to control what can be accessed within those apps. “Organizations are moving away from a one-size-fits-all approach,” says Patel. “Users are only provided access to what they need.”
This makes logical access control complicated and time consuming. When onboarding a new employee who needs access to 25 different applications, it can take time to enable that access initially and then fine-tune what the employee can do within each one, Patel says.
Also, access to the system or application is provisioned separately from access to the data within that application. For example, Joe from IT might grant access to an ordering application, but it’s the department director, Steve, who provisions specific access within the ordering app.
Another trend that has emerged is altering the requirements for authentication depending on the application and the risk involved, Patel says. For example, authorizing payments for contracts might require authentication factors beyond the smart card and PIN. “Additional challenges will be presented depending on the risk involved,” he explains.
The concept of traditional perimeters and firewalls is out the window as employees need access to all types of cloud-based applications
Layering different authentication methods and making them variable is another new technique. Instead of asking for a fingerprint every time, for example, an authentication system can use the fingerprint once, a one-time passcode the next time and a voice biometric after that. Since a hacker doesn’t know which authenticator will be requested, the system is more secure, says Pam Dingle, principal technical architect at Ping Identity. “It used to be about using a smart card or biometric before access, but now you can layer things and make them more complex and variable,” she explains. “It’s harder to hack things if you don’t know which authentication factor will be requested.”
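The rotation Dingle describes can be sketched in a few lines. This is a minimal illustration, not any vendor’s implementation: the factor names and the no-repeat rule are assumptions, and a real system would wire each factor to an actual verifier (fingerprint reader, OTP service, voice-biometric engine). The key point is using a cryptographically secure random choice so an attacker cannot predict which factor comes next.

```python
import secrets
from typing import Optional

# Illustrative pool of supported factors -- names are hypothetical.
FACTORS = ["fingerprint", "one_time_passcode", "voice_biometric"]

def next_challenge(previous: Optional[str] = None) -> str:
    """Pick the next authentication factor unpredictably.

    secrets.choice is cryptographically strong, so the sequence of
    requested factors cannot be anticipated. Optionally skip the
    factor used on the previous attempt to force variety.
    """
    pool = [f for f in FACTORS if f != previous] or FACTORS
    return secrets.choice(pool)
```

In use, the login service would call `next_challenge()` on each attempt and route the user to the matching verifier; because the choice is random, a stolen fingerprint template alone is not enough to guarantee access.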
Feds redefining logical access for contractors
In the past, federal contractors who worked with multiple agencies needed different logical access credentials for each project. Even when the underlying credentials were the same, agencies weren’t able to provision them into their systems. This created a cumbersome and expensive situation.
This has changed, says Vishvas Patel, vice president and chief architect at IdenTrust. Now contractors can be vetted once, receive a credential and then have it provisioned appropriately wherever necessary.
“Contractors interacting with civilian federal agencies are able to use a single authenticator across different agencies, be it a one-time passcode or PKI-based smart card,” Patel explains. “This makes their lives easier as they’re dealing with fewer – or even just one – authenticator.”
Many of these contractors are using PIV-I smart cards, Patel says. Enabling the cards to work on agency networks requires additional middleware, but after that, deployment is straightforward.
Then there are techniques that don’t even require the user to actively participate – provide a biometric, passcode or other interaction – in the authentication at all. Adaptive authentication, sometimes referred to as passive authentication, is an oft-discussed topic when it comes to securing logical resources, Dingle says. Adaptive systems require no input from a user, but instead look at a variety of other factors – IP address, time of login, device being used, etc. – to determine risk associated with the login attempt. If something is out of the ordinary the system asks for additional authentication factors before enabling access, but if all is well work can proceed without further interaction.
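The signals Dingle lists can be combined into a simple risk score. The sketch below is illustrative only: the signal weights, thresholds, and the per-user baselines (`KNOWN_DEVICES`, `TRUSTED_NETWORKS`) are assumptions; real adaptive systems learn these baselines from login history rather than hard-coding them.

```python
from dataclasses import dataclass

@dataclass
class LoginContext:
    ip_address: str   # source address of the attempt
    login_hour: int   # local hour, 0-23
    device_id: str    # identifier of the device being used

# Hypothetical per-user baselines; a real system derives these from history.
KNOWN_DEVICES = {"alice": {"laptop-1234"}}
TRUSTED_NETWORKS = {"alice": {"10.0."}}  # trusted IP prefixes

def risk_score(user: str, ctx: LoginContext) -> int:
    """Sum simple risk signals; weights here are illustrative."""
    score = 0
    if ctx.device_id not in KNOWN_DEVICES.get(user, set()):
        score += 2  # unrecognized device
    if not any(ctx.ip_address.startswith(p)
               for p in TRUSTED_NETWORKS.get(user, set())):
        score += 2  # unfamiliar network
    if ctx.login_hour < 6 or ctx.login_hour > 22:
        score += 1  # off-hours login
    return score

def decide(user: str, ctx: LoginContext) -> str:
    """Map the score to the behavior described above."""
    score = risk_score(user, ctx)
    if score == 0:
        return "allow"      # nothing unusual: no extra interaction
    if score <= 2:
        return "step_up"    # ask for an additional factor
    return "deny"
```

A familiar device on a trusted network during working hours scores zero and sails through; one anomaly triggers a step-up challenge; several anomalies together block the attempt outright.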
Modern systems are being configured for continuous authentication that occurs in the background, says Ryan Zlockie, vice president of authentication at Entrust Datacard. “We’re taking the intelligence we use for the initial authentication and also using it to make sure nothing happens during that session,” he explains. “We’re making sure someone doesn’t do something inappropriate during that session, and if something weird happens they can be asked to step up their authentication.”
The cloud disrupts
As more enterprises move applications to the cloud, access has become more complicated, says Patel. “The cloud has complicated the management of logical access, specifically when part of the app is in the enterprise’s control and parts are in the cloud,” he explains.
Enterprises have to administer two different domains, the cloud and the network, which can be difficult. “The concept of traditional perimeters and firewalls is out the window as employees need access to all types of cloud-based applications,” says Zlockie.
Before the emergence of cloud-based applications, the corporate enterprise owned and managed applications. Controlling access was easy and done through traditional directories. As access to more and more applications and data was necessary, single sign-on came onto the scene to ease the burden of having to remember so many different usernames and passwords.
Then there’s the cloud. It is not owned by the enterprise, but the enterprise still must manage access to its resources. Standards exist to enable use of existing systems for access to cloud apps, but the integrations can be complicated and take time to build.
The evolution of logical access standards
Standards have helped ease some of the complexity surrounding logical access to the cloud at the enterprise level. Prior to the introduction of the iPhone, most applications were delivered via web browsers using SAML. “SAML was the right technology, delivered at the right time, to secure web sessions across domains,” says Ping Identity’s Dingle.
SSO manages hundreds of username and password combos in one secure login process
But, she explains, two things happened to alter this landscape. First, there was an explosion of smart phones and mobile devices. Second, the application programming interfaces (APIs) changed to meet the need for multi-tenant cloud platforms to interact with many clients simultaneously. Soon after, the market shifted to native applications running on smartphones. “There was no browser involved, so SAML was no longer appropriate,” Dingle explains.
The initial solution that developers came up with was far from ideal because the credentials were passed through multiple services, Dingle says. “The opportunities for abuse were terrible with that scheme, because the credentials were passed all over the place. The native application sees them, the API sees them, and if the transport layer isn’t secured, interception is a risk too,” she explains.
Enter the OAuth protocol family, which solved the problem of storing user credentials and passing them to APIs by replacing the actual credential with temporary “access tokens,” Dingle explains. Users would authenticate once, preferably at their home authentication service, and then the application would receive an access token that could be stored and used to call APIs on behalf of the user, without storing their password.
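The core of the pattern Dingle describes can be shown with a toy in-memory issuer. This is a conceptual sketch, not the OAuth protocol itself: real deployments validate tokens via signed JWTs or an introspection endpoint rather than a shared dictionary, and the function names here are hypothetical. What it demonstrates is the essential property: the API checks an opaque, expiring token and never sees the user’s password.

```python
import secrets
import time

# --- Authorization server side (toy in-memory issuer) ---
ISSUED = {}  # token -> (user, expiry timestamp)

def issue_access_token(user: str, ttl_seconds: int = 3600) -> str:
    """After the user authenticates once, hand the app an opaque token."""
    token = secrets.token_urlsafe(32)
    ISSUED[token] = (user, time.time() + ttl_seconds)
    return token

# --- Resource server (API) side ---
def call_api(token: str) -> str:
    """The API validates the token; the password never travels with the call."""
    record = ISSUED.get(token)
    if record is None or record[1] < time.time():
        return "401 Unauthorized"   # unknown or expired token
    user, _ = record
    return f"200 OK: orders for {user}"
```

If the token leaks, only a time-limited, revocable credential is exposed rather than the password itself, which is exactly the risk reduction Dingle credits to the OAuth family.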
Identity professionals now have a comprehensive set of standards and supporting tools with which to work, Dingle explains. These tools can work under most circumstances to keep a user’s credential the secret that it should be, while giving IT departments tools to reduce risk.
Next steps for logical access control
“We need a single trusted identity that can be used to access all the requisite digital resources,” he adds. “Logical access will not hold true in the future; we’re talking about digital rights management in a digital world.”