Use cases target everyone from children to seniors
Internet2’s involvement with the national strategy is no surprise as it is an identity ecosystem in itself. “When NSTIC first came out, they had an animation on their web site about how they saw the world, and I remember going, ‘wow, that’s just like a diagram we drew on a place mat in an Ethiopian restaurant in 2001,'” says Kenneth Klingenstein, Director of Middleware for Internet2. “So we got to see our vision being adopted by a much broader constituency.”
The organization will use the $1.8 million grant to build a privacy infrastructure through common attributes, user-effective privacy managers and anonymous credentials using Internet2’s InCommon Identity Federation service. It will also encourage the use of multi-factor authentication and other technologies.
Internet2’s partners include the Carnegie Mellon and Brown University computer science departments, University of Texas, the Massachusetts Institute of Technology, and the University of Utah. The intent is for the research and education community to create a scalable privacy infrastructure for the nation’s identity ecosystem.
Part of the pilot will include promotion of multi-factor authentication, Klingenstein says. “Because good privacy begins with good security.”
The project will create a set of user attributes that represent use cases for the marketplace. But before that can be done, there needs to be an understanding of the contexts in which an identity might be used.
Fundamental to the approach is the group’s concept of user contexts or roles. Klingenstein says there are four typical contexts: employee, consumer, citizen and anonymous. “Our goal is to provide a consistent privacy infrastructure that spans all of those contexts so whether you’re functioning as a citizen, a consumer or a worker, a common set of tools is used in a common way to manage privacy,” he explains.
These tools include privacy manager software that enables the user to control the release of attributes and give informed consent. “When somebody asks for attributes, it gives you the opportunity to say ‘who’s asking and why do they need it?’” he explains.
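The consent flow Klingenstein describes can be sketched in a few lines. This is a hypothetical illustration, not Internet2's implementation; the requester name, purpose string and policy structure are all invented for the example.

```python
# Minimal sketch of a privacy manager mediating attribute release.
# The user's policy decides, per requester, which attributes may be released;
# the request carries "who is asking and why" so consent is informed.

def release_attributes(requester, purpose, requested, user_policy):
    """Return only the attributes the user's policy permits for this requester."""
    allowed = user_policy.get(requester, set())
    # Informed consent: surface the requester and the stated purpose.
    print(f"{requester} requests {sorted(requested)} for: {purpose}")
    return {attr for attr in requested if attr in allowed}

# Hypothetical policy: this wiki may see affiliation and zip code, nothing else.
policy = {"neighborhood-wiki.example.org": {"affiliation", "zip_code"}}

released = release_attributes(
    "neighborhood-wiki.example.org",
    "verify neighborhood residency",
    {"name", "zip_code"},
    policy,
)
# The name is withheld; only the zip code is released.
```

The point of the sketch is that release decisions sit with the user's policy rather than with the party asking, which is the inversion of control the privacy manager is meant to provide.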
The pilot will also be developing metadata mechanisms that can support attributes, context and privacy management. Metadata describes the context, content and structure of records and provides information about format, date, authority and other information. “It’s really metadata that’s going to give us the scaling to a national and global level,” Klingenstein says. “So we have to press very hard on the metadata boundaries that we currently see and evolve those.”
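As an illustration of the kind of record such a mechanism might carry, the fields the article names (context, format, date, authority) can be captured in a simple structure. The field names and values here are assumptions made for the example, not a real federation schema.

```python
# Hypothetical metadata record describing one attribute and its governing
# policy: what it is, the context it belongs to, who vouches for it, and
# whether releasing it requires the user's consent.

attribute_metadata = {
    "attribute": "zip_code",
    "context": "citizen",               # one of the four contexts in the pilot
    "format": "US ZIP string",
    "date_issued": "2013-10-01",
    "authority": "federation.example.org",
    "privacy": {"release_requires_consent": True},
}
```

Records like this are what would let relying parties across a national federation interpret attributes consistently, which is the scaling role Klingenstein attributes to metadata.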
Internet2 will also work on integration of anonymous credentials into the identity ecosystem. The focus is usually on low- and high-assurance credentials, while credentials that enable a user to remain anonymous are often neglected. These credentials exist but have not been widely deployed because of engineering gaps between the credential and the attribute authorities.
Klingenstein offers a real-world example of how an identity ecosystem might work with anonymous credentials. Imagine an individual acting in the context of a citizen, participating in a neighborhood wiki conversation about landscaping concerns. The conversation pertains only to people whose lawns are actually affected, but the participants should also have anonymity in the conversation.
In this situation the identity ecosystem can know something about where the individual lives to ensure that those involved in the conversation have a stake in it. But it can do so while concealing the participants’ identities, thus preserving their anonymity.
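The core idea, stripped of the cryptography, is that an issuer vouches for an attribute without binding it to an identity. The sketch below uses a plain HMAC as a stand-in for the issuer's signature; real anonymous-credential schemes use blind signatures or zero-knowledge proofs, and every name and key here is invented for illustration.

```python
import hashlib
import hmac

# Stand-in for the issuer's signing key (illustration only; a real scheme
# would use asymmetric, blindable cryptography, not a shared secret).
ISSUER_KEY = b"demo-issuer-secret"

def issue_credential(attribute_value):
    """Issuer signs only the attribute; no identity is included."""
    tag = hmac.new(ISSUER_KEY, attribute_value.encode(), hashlib.sha256).hexdigest()
    return {"attribute": attribute_value, "tag": tag}

def verify_credential(cred):
    """The wiki checks the issuer vouched for the attribute, learning nothing else."""
    expected = hmac.new(ISSUER_KEY, cred["attribute"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(cred["tag"], expected)

# The holder proves residency on the affected street without revealing a name.
cred = issue_credential("neighborhood:elm-street")
```

The wiki can admit the holder to the landscaping conversation on the strength of the issuer's attestation alone, which is exactly the separation of attribute from identity the pilot is after.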
There are also use cases around the Children’s Online Privacy Protection Act. “In the electronic world there’s no good mechanisms to protect the privacy of children and make sure that when they enter a chat room for kids there’s only kids in there,” Klingenstein says. “I think these technologies can get us a fair bit of the way there.”
The anonymous credentials will likely lead to policy issues, the final area in which Internet2 is focused. “Any good technology exposes policy gaps,” Klingenstein says. “It may be the inconsistencies of what people think of as personally identifiable information, it may be inconsistencies of privacy regimes and consent at state, federal and international levels.”