Thursday, March 22, 2012

Adobe Moving Forward with Smartcard Usage ... or is it moving forward?

Over the last couple of days I have had numerous people point me to a post on Adobe's blog about PIV card usage. For those of you not familiar, the PIV card is the US government's implementation of an end-to-end specification for identity issuance to its employees and approved contractors. The standards that support PIV grew out of the work that followed Homeland Security Presidential Directive 12 (HSPD-12); they cover the technical specifications for the card, including the digital credentials it carries, as well as the issuance process and most of what is built around it. The PIV specification was then leveraged to define a credential for non-Federal entities that wish to interoperate with the US Federal government; this is called PIV-I, for PIV Interoperable. PIV and PIV-I credentials have been rolling out over the last few years, and the PIV-I market is growing quickly, with interest from companies that provide products and services to the government, from state and local governments, and increasingly from the healthcare arena.

The post by Adobe was very good in that it talked about usage - how these cards can be used to sign documents electronically and how those signatures can then be validated - but there were a few things in there that hit the wrong chord. Let me explain .....

First off, the post talks about validating credentials per the US Federal Common Policy. Well, the US Federal Common Policy does not tell you how to validate a credential. It specifies the policy under which you would operate a PKI such that it could be trusted by the Federal agencies. NIST did create a set of tests, PKITS, that will tell you whether your product can properly validate a certificate, and thereby a signature, through the Federal PKI architecture ... maybe that is what was being thought of.

But another bad chord .... the post goes on to say "A recommendation to make this easier is for all of the issuing certificate authority public key certificates to be stored on the smartcard and available to the OS+applications." The example they give has a signature that is tied through two bridges to the Common Policy. So if I am reading this correctly, I need to put the Common Policy Root, the Federal Bridge certificate, the Certipath certificate, the root of the issuing architecture and the issuing CA certificate all on my card. The idea is that this is everything I need to validate the signature. Well yes, I need that data, but what about the revocation data for that chain? Do I put that on the card as well - a rhetorical question, since revocation data has to be fresh - so I need network connectivity anyway. And if I have network connectivity, why not just use the data in the certificates in my trust chain, the issuer and its root, to discover and validate the appropriate path based on policy identifiers and business rules?

That was the idea behind the PKITS tests - to do path validation in real time using software that does it completely. Do I need this software on my desktop? The answer is no - numerous solutions also provide this capability on server-based systems using implementations of SCVP. There are desktop solutions that do it right, but they are not the only way.
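To make that concrete, here is a minimal sketch of that dynamic discovery and validation using the standard Java PKIX APIs - the builder chases AIA pointers, CRL distribution points and OCSP on its own, and the result is constrained to an acceptable policy. The file names and the policy OID are placeholders for illustration, not a recommendation of any particular product.

    import java.io.FileInputStream;
    import java.security.Security;
    import java.security.cert.*;
    import java.util.List;
    import java.util.Set;

    public class PathValidationSketch {
        public static void main(String[] args) throws Exception {
            // Let the JDK's PKIX implementation follow AIA pointers, CRL
            // distribution points and OCSP while it builds the chain.
            System.setProperty("com.sun.security.enableAIAcaIssuers", "true");
            System.setProperty("com.sun.security.enableCRLDP", "true");
            Security.setProperty("ocsp.enable", "true");

            CertificateFactory cf = CertificateFactory.getInstance("X.509");

            // Trust anchor: the Federal Common Policy root (file name is a placeholder).
            X509Certificate commonPolicyRoot = (X509Certificate)
                    cf.generateCertificate(new FileInputStream("fcpca.cer"));

            // The signer's certificate, taken from the signed document (placeholder file).
            X509Certificate signerCert = (X509Certificate)
                    cf.generateCertificate(new FileInputStream("signer.cer"));

            X509CertSelector target = new X509CertSelector();
            target.setCertificate(signerCert);

            PKIXBuilderParameters params = new PKIXBuilderParameters(
                    Set.of(new TrustAnchor(commonPolicyRoot, null)), target);

            // Seed the builder with whatever intermediates are already at hand;
            // the rest can be discovered through the AIA extension.
            params.addCertStore(CertStore.getInstance("Collection",
                    new CollectionCertStoreParameters(List.of(signerCert))));

            // Check revocation for the whole chain and require an acceptable policy,
            // e.g. id-fpki-common-authentication (OID shown for illustration only).
            params.setRevocationEnabled(true);
            params.setExplicitPolicyRequired(true);
            params.setInitialPolicies(Set.of("2.16.840.1.101.3.2.1.3.13"));

            PKIXCertPathBuilderResult result = (PKIXCertPathBuilderResult)
                    CertPathBuilder.getInstance("PKIX").build(params);
            System.out.println("Valid path anchored at: "
                    + result.getTrustAnchor().getTrustedCert().getSubjectX500Principal());
        }
    }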

The other issue that comes up is that the cross-certificates used between these CAs have shorter lifetimes than the cards, and they are certainly not in sync with user credential updates, so how do I update those root, issuing and cross certificates on the card?
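To see how quickly that goes stale, a simple comparison of the notAfter dates makes the mismatch visible. This little sketch assumes both certificates are sitting in files, and the file names are placeholders.

    import java.io.FileInputStream;
    import java.security.cert.CertificateFactory;
    import java.security.cert.X509Certificate;

    public class LifetimeCheckSketch {
        public static void main(String[] args) throws Exception {
            CertificateFactory cf = CertificateFactory.getInstance("X.509");

            // The credential on the card and one cross-certificate from its trust
            // path (both file names are placeholders).
            X509Certificate cardCert = (X509Certificate)
                    cf.generateCertificate(new FileInputStream("piv-auth.cer"));
            X509Certificate crossCert = (X509Certificate)
                    cf.generateCertificate(new FileInputStream("bridge-cross.cer"));

            if (crossCert.getNotAfter().before(cardCert.getNotAfter())) {
                // A copy of this cross-certificate stored on the card would go
                // stale while the card credential is still valid.
                System.out.println("Cross-certificate expires before the card credential: "
                        + crossCert.getNotAfter());
            }
        }
    }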

Yes, Adobe, you did the right thing by presenting a usability case that truly is needed - we just need to make sure that the system is truly usable in the end-to-end implementation. I think that is where this has fallen short.


- Posted using BlogPress from my iPad

Thursday, March 15, 2012

Some Thoughts from IDTrust 2012

I spent the last two days at the IDTrust Conference, which was held at NIST in Gaithersburg. This conference started about 11 years ago as a PKI-centric conference, but over the years it has evolved into a broader discussion on identity. Ian Glazer did a great job of laying this out in his presentation early on the first day. The move from an almost pure PKI discussion to a broader identity discussion was visible right from the opening, with the initial presentation given by Jeremy Grant, who leads the NSTIC program, reinforcing the desire to get industry to move ahead with innovative ways to improve authentication and to move towards real implementations.

The discussions held over the two days were great. There was a good focus on authentication, but also very broad discussions around attributes and their role in improving the confidence levels of the parties involved in transactions. The two days generated some interesting thoughts, three of which are discussed here.

There appears to be a growing need to handle the lexicon for attributes - something I wrote about quite a while back. The context for my previous discussion was a broker for managing the lexicon - handling the differences between the varying attribute terms and definitions that are in use. This does require considerable cooperation between organizations, but a managed central service that is participatory and leverages the involvement of recognized standards groups should address the majority of the interoperability issues.
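As a rough illustration of what such a broker would maintain, here is a toy mapping in Java. The schema names and attribute names are invented for the example and are not drawn from any particular standard.

    import java.util.Map;
    import java.util.Optional;

    // A toy lexicon broker: it maps a shared, canonical attribute term onto each
    // community's local name. The schema and attribute names are invented for
    // illustration and are not drawn from any particular standard.
    public class LexiconBrokerSketch {

        private static final Map<String, Map<String, String>> LEXICON = Map.of(
                "surname", Map.of("saml-ldap", "sn", "hr-system", "last_name"),
                "emailAddress", Map.of("saml-ldap", "mail", "hr-system", "work_email"));

        // Translate a canonical attribute name into a given schema's local name.
        public static Optional<String> localName(String canonical, String schema) {
            return Optional.ofNullable(
                    LEXICON.getOrDefault(canonical, Map.of()).get(schema));
        }

        public static void main(String[] args) {
            System.out.println(localName("surname", "hr-system").orElse("<no mapping>"));
        }
    }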

Identity management appears to be taking on a new scope. When we speak of identity management today we speak of things like registration for authentication credentials, usage of those credentials and their maintenance. It does appear, though, that even within this there is some aspect of attribute management as part of the identity. Now, there are some who feel that everything is an attribute, including your name, and I will not debate that here, but whatever we cover as an attribute we must contextualize: its reliability, relevance and effectiveness, and how those may change over time. A simple example is something like an address. Even today I can go to a store that has a record of me from an online purchase, and they will still have my address from 4 years ago, even though it is no longer relevant or accurate. Management of these elements of data, including weighting them, is becoming a critical element of the personal data economy. Companies need to know what is current, and also what is more likely to be accurate, when they access these elements.
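To illustrate what weighting these elements might look like, here is a small sketch that discounts an attribute value based on how long ago it was asserted. The record layout and the two-year half-life are assumptions made purely for the example.

    import java.time.Duration;
    import java.time.Instant;

    // A toy weighting of an attribute value by age: confidence decays with a
    // half-life. The record layout and the two-year half-life are assumptions
    // made purely for the example.
    public class AttributeWeightSketch {

        record AttributeValue(String name, String value, String source, Instant asserted) {}

        static double recencyWeight(AttributeValue attr, Instant now) {
            double ageDays = Duration.between(attr.asserted(), now).toDays();
            double halfLifeDays = 2 * 365;
            return Math.pow(0.5, ageDays / halfLifeDays);
        }

        public static void main(String[] args) {
            AttributeValue oldAddress = new AttributeValue(
                    "postalAddress", "123 Main St", "retailer-crm",
                    Instant.now().minus(Duration.ofDays(4 * 365)));
            // An address last confirmed four years ago carries far less weight.
            System.out.printf("weight = %.2f%n", recencyWeight(oldAddress, Instant.now()));
        }
    }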

A third, and final, thought for this post follows from the prior two points - how do we effectively manage the attribute lexicon and the data represented within it? One would assume that the data is the user's, but is the user the only one who can manage it? Do existing attribute brokers/holders such as Equifax and Experian have some level of control or responsibility for the weighting or accuracy of the data? Do we provide an easy interface for the user to manage their data, and how do we link that to the brokers?

As you can see, there was considerable discussion of attributes and attribute management during the sessions and in between them. There was a lot more data and information as well, and some of the presentations are available on the NIST/OASIS IDTrust 2012 site.


Let's get the discussions going and let's see if we can help move this yardstick forward some.

- Posted using BlogPress from my iPad

Tuesday, March 6, 2012

Is my smart-phone smart enough?

I read an interesting article this morning that came out of the RSA 2012 conference. Two researchers found that cell phones leak data through their transistors, which can reveal the private keys in use within a running application. One might think from the headline that this was a case of poor implementation, but the researchers demonstrated it on multiple platforms.


Should we be worried? Is there now an easy way for people to get at your data? The research did show that it is achievable to gain access to the keys that are protecting data. An overall successful attack would of course require multiple elements: the attacker needs to recover the keys and then gain access to the data, either over the air or through a hosted server. Again, none of this is impossible, but it would certainly take a coordinated attack. So should we be worried? Well, if you or your employees are using phones to protect sensitive data, then there may be reason to start looking at protection mechanisms and procedures that would mitigate some of the risk:

- be aware of your surroundings when you use applications where sensitive data is accessed;
- limit the sensitivity of the information that is stored on the device;
- start looking at phone vendors that have external validation of their devices or cryptographic implementations, whether that be a FIPS-style validation or Common Criteria;
- have a plan in place to update keys on a regular basis if you need to store sensitive data on your phone (a rough sketch of forcing rotation follows below).
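On that last point, here is a rough sketch of what forcing rotation could look like on a current Android device, using the platform keystore to give a data-protection key a hard expiry. The alias and the 90-day window are arbitrary choices for the example, not a recommendation.

    import android.security.keystore.KeyGenParameterSpec;
    import android.security.keystore.KeyProperties;
    import java.util.Calendar;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;

    // Sketch: generate a data-protection key in the Android keystore with a hard
    // expiry so that it has to be rotated. The alias and the 90-day window are
    // arbitrary choices for the example.
    public class KeyRotationSketch {

        static SecretKey newRotatableKey() throws Exception {
            Calendar expiry = Calendar.getInstance();
            expiry.add(Calendar.DAY_OF_YEAR, 90); // force a rotation at least every 90 days

            KeyGenerator kg = KeyGenerator.getInstance(
                    KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore");
            kg.init(new KeyGenParameterSpec.Builder(
                            "app-data-key", // placeholder alias
                            KeyProperties.PURPOSE_ENCRYPT | KeyProperties.PURPOSE_DECRYPT)
                    .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
                    .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
                    .setKeyValidityEnd(expiry.getTime()) // the key is unusable after this date
                    .build());
            return kg.generateKey();
        }
    }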

The news of this research is fresh, so there is still a lot to learn about the risk and the mitigations, but the guidelines above are common sense and will help to reduce some of the risk.


- Posted using BlogPress from my iPad