Tuesday, December 6, 2011

Looks Can be Deceiving ....

One of the issues I personally run into is walking into a new environment and having people ask me about my experience. The issue, or so I am told, is that I look young for my age. The problem this creates is that when I state that I have 25 years of experience in the secure data world, people begin to wonder if I am overstating things or intentionally inflating them. No, the reality is I am progressing in age, and my college-age kids will attest to that. I just seem to be gifted with good genes.

Now what does that have to do with security and identity? Well, for those that have been around I think it will be obvious - things are not always what they seem. "That can't be - it's too good to be true" and "Really, you saw that online?" are just different ways of saying the same thing. Security is about diligence. In my example the customers are being diligent; they are making sure they are getting what they paid for. In these other cases, it is quite likely someone will get what they deserve if they were not diligent.

In the corporate and government security world we try to make diligence programmatic. We blacklist or whitelist things. We take out what is bad, or we close everything down and open up only what meets the security requirements and need. We define policies and practices to adhere to. We have firewalls, intrusion detection, intrusion prevention, virus scanners and on and on. But we still need to remember that we have people involved.

In the past year we have had lots of cases where the person was the problem - a Citi exec charged with wire fraud, and UBS and Countrywide execs also charged or sentenced. The lesson from these is broader than just the specific financial mess that falls out; it is a lesson of diligence and oversight. It is a lesson of education. These are lessons that equally apply in building a security plan. No matter how good the technology and policy, these things still have to get implemented. And that implementation needs to be done by well-trained people with appropriate oversight, dependent upon the associated risk profile.

Tuesday, November 22, 2011

Internet Privacy ... the discussion is happening

Over the last year or so I have been involved in a number of initiatives that have privacy as a key aspect, whether it was work on Attribute Exchange, NSTIC or FICAM Federal PKI policies. I am one of the people whose eyes have been opened more and more to the many aspects of privacy.

Yes, I do understand there is a balance when it comes to privacy. Usability is a factor, as are governance and oversight. In that regard I read an interesting interview with Viviane Reding, Vice-President of the European Commission. It brings up some interesting ideas on privacy, especially with regard to the individual and data protection as well as governance/oversight.

One of the interesting aspects of this is the difference in governmental views on how to deliver on privacy. Recent White House discussions center around self-governance/monitoring while EC initiatives are driven centrally through the government. This tends to reflect the traditional view of European governments and identity, while the US has been careful about any form of national identity. The US political views seem to focus on commercial delivery of identity solutions. Not a bad thing when you are in the identity business, but that business comes with risk, especially when federated identity requires interoperability of these identity infrastructures. How does one guarantee compliance without external oversight? Yes, there are great organizations that can manage and police, ones like those structured around the Trusted Framework Providers program within the US Government, but how does that match what is happening in Europe and elsewhere? If an IDP has to build separate infrastructures for separate markets, then how does that business truly operate globally?

I am not suggesting that the US approach is wrong or right - nor am I suggesting the EC has the perfect answer - but there does need to be a way to marry the discussions so the questions of risk mitigation for companies, both IDPs and RPs, can be managed. Let's hope that the discussions happening today get us toward that business nirvana.

Wednesday, November 9, 2011

Is the sky really falling on the security world?

One has to wonder if this is the year when someone mentions security and we get a collective guffaw, or if it is truly a case of people starting to pay real attention. I like to believe the latter, but then I see an article from Dark Reading on an Ernst & Young report, "Security is Still an Afterthought...", and I am not sure. Certainly attention in the media can be a good thing in that it should get people thinking. I certainly start thinking when I see news of breaches, attacks or vulnerabilities - "Do I have to worry about my environments?" "Is this an opportunity to share knowledge?" or "Is this an opportunity to look at a system differently?" Part of the issue I have with the coverage, though, is that there is usually the sensationalist article "<insert technology here> is Broken" that gets mainstream attention, and explaining the real story to people then takes lots of time.

Experience has taught me that it is not always a case of technology being broken. Now granted, we have had those cases, but generally speaking what we have seen lately is not technology being broken but technology being poorly leveraged or poorly implemented. Let's take some examples:

- The RSA breach: Why was it as bad as it was? Well, someone had left critical data on a networked computer. Is the RSA two-factor solution a bad technology - NO! Did the implementation of their infrastructure have some fundamental design/implementation issues - YES.
- The Comodo attack: Is PKI a broken technology - NO! Did Comodo miss some fundamental implementation rules by not having strong multi-factor authentication for their RAs and not having back-end checking for domain use during issuance - It certainly looks like it.
- BEAST: Is SSL/TLS a bad technical specification - fundamentally NO! Have browser, server and other vendors that leverage SSL/TLS done a good enough job in keeping abreast of updates based on enhancements to the specification - obviously NOT! (A minimal sketch of enforcing newer protocol versions follows this list.)
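
To make that last point concrete, here is a minimal Python sketch - written against a current Python long after this post, purely as an illustration, with a placeholder host name - of a client simply refusing to negotiate the older protocol versions that BEAST-style attacks depend on:

    import socket
    import ssl

    # Illustration only: insist on TLS 1.2 or later at the client so that
    # protocol versions vulnerable to BEAST-style attacks are never negotiated.
    context = ssl.create_default_context()
    context.minimum_version = ssl.TLSVersion.TLSv1_2

    host = "example.com"  # placeholder host
    with socket.create_connection((host, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            # show what protocol version and cipher suite were actually negotiated
            print(tls.version(), tls.cipher())

The point is not the particular API; it is that the fix for this class of problem is configuration and currency, not a redesign of TLS.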

So what do we learn from this - yes, we do need to pay attention to the press, whether it be a trusted blogger or a trusted news source. The data that they deliver, though, is only one piece of the picture, so we need to make sure we take that data, add to it and then assess what it means in our personal, corporate or organizational context.

It is an old adage but it is not just technology - you need to consider the people and process around it. That includes education, policy, implementation and all the other elements that make up a good security plan.

So CL you are safe - the sky is not falling - but make sure you are looking all around you and not just up - problems can come from any direction.

Friday, August 26, 2011

Is Authenticating to the Cloud different than anything else?

I was reading an interesting article the other day on a new Government Cloud service being offered by Amazon: "Security advances and budgetary pressures draw agencies to cloud" (Nextgov). This raised a number of thoughts for me, including the cost of compliance for Amazon to maintain the system to meet some very broad and detailed government requirements. Now do not get me wrong, I think that Amazon has the capability to do this; the question becomes whether there is the long-term desire to maintain the things that the government will require of them. The flip side of this is that it may encourage the government agencies to rethink how they look at maintaining systems and may in turn help them to reduce some of their costs internally as well.

The other thought I had was one of protecting access to the data. The federal agencies have broadly moved to smart card based authentication systems and are now looking at how to enhance that with attribute-based authorization using architectures like BAE (Backend Attribute Exchange). I wonder how Amazon intends to leverage the authentication infrastructures that have been put in place. Does the Amazon offering now allow extension of the user platform beyond the traditional desktop to tablets and smartphones, both of which have become very relevant in the government market? How will Amazon handle the enhanced checking of credentials and interoperation with these systems? How open will they be to the acceptable government profiles for SAML, OpenID and Kantara? There are lots of questions here and, dependent on which requirements the government has been testing the Amazon service against, these may already be in the forefront or they may start to appear as people use the service.

Of course there is no lack of technology that will enhance the architecture - systems that provide for multiple authentication device types, which may be required dependent on the resource accessed, combined with the ability to roll out strong authentication credentials to smartphones or tablets (whether PKI, OTP or others), along with a variety of smart card/chip capabilities that can use various communication technologies, certainly open the field of use.

These are all things we are working with today and implementing for a broad audience. The technologies are there, the systems just need to leverage them appropriately.



- Posted using BlogPress from my iPad

Friday, August 12, 2011

Those who cannot remember the past are condemned to repeat it

This quote from George Santayana has been somewhat skewed over time .... "Those who do not learn history are apt to repeat it" being one popular variant, but I believe Santayana's words are as true today as ever.

No, this is not a post about politics - it is about security, and it ties into some of the recent thoughts on planning. As I had a coffee today I began to think about security from a network perspective. Not network security, but the perspective of security being established through the interconnection of people, technology and events. As I did this I remembered some of my history, and it dawned on me how similar some things from the past are to what has been happening today. Let me try to explain.

Most of you are familiar with the Comodo attack from earlier this year. The attack was perpetrated by going after the platform used by an administrator. The success of this attack allowed the attacker to create credentials in the name of some very significant companies, which would have allowed very broad attacks on potentially hundreds of thousands of users. Thankfully the latter part of the attack was not executed and the breach was discovered before major widespread damage was done. The point here is that the attack was against the management plane of the system, and an attack at that level can be hard to discover. A similar management plane attack occurred in 2010 that allowed someone to take control of a private Certificate Authority, which caused major problems for a very large contracting firm in the US. Two examples of management plane attacks that created great havoc.

So where is the history linkage? Well, 25 years ago there was a manager at a firm in California who discovered an accounting error in a system. He asked one of his people, Clifford, to look into it. It took some time, but Cliff was able to discover a sophisticated attack against the burgeoning defense and other networks. He eventually traced the perpetrator to a network connection coming from Germany. For some time he got nowhere working with the Deutsche Bundespost, who ran the networks, and then one day they called with the data he needed. What Cliff did not know at the time was that the perpetrator had used a management plane attack within the Bundespost system. He was able to connect to the DBP network, carry out his attacks against networks all throughout the US and then go back and delete the accounting record on the network switch before it was uploaded to the accounting system. To the DBP the user was never there. A young guy living in Ottawa worked with the DBP and they found the switch coding issue that allowed the perpetrator to delete the accounting records. The hole was closed and a couple of months later Markus Hess was caught.

Now closing the accounting hole at the DBP was only one piece of the puzzle - to learn the rest read The Cuckoo's Egg - but it did show that an attack against the management plane provides a mechanism to hide the real attack.

The lessons learned here are many, but the big ones: understand system connectivity (the network); plan to protect hierarchically, making sure the high-value management systems get attention; and leverage new technologies that provide strong two- or three-factor authentication on the highest value assets, since a breach there will either bring the entire system down or create a security gap that is not even known about.

- Posted using BlogPress from my iPad

Monday, August 8, 2011

Timely ....

A few weeks back I wrote a piece on planning. The context may have seemed odd to some, unless you are a runner, but the basic idea is that for any challenging undertaking you need to plan not just for completion but for events that may hinder that completion, whether those events occur in the preparation or the execution of your plan.

The timeliness aspect comes in light of a lot of recent articles and commentary around breaches, social engineering attacks and announced vulnerabilities. It should be no surprise that we are seeing an increase in articles on this with DefCon in Vegas this past weekend, and the events that led up to it, but I think we are also seeing the recognition of a true problem even from outside of the technical community.

As I read some of the articles that come out I see a consistent theme - little opportunities that are missed that either created the gap that was taken advantage of or created a gap that made the initial event so much worse. One of the best pieces I have read that begins to address some of the issues with actionable ideas was written by Jeffrey Carr on Shady Rat. In it he identifies a four-step process that starts to address the "gap that made the initial event worse". This type of direct action, taken in conjunction with the development or revisiting of a broader plan, is what is needed for organizations big and small. (For those small organizations that think this is a non-issue, take a look at the Anonymous hack of rural sheriff offices - and that is the new stuff; the older stuff would really scare you.)

What is that broader plan? I wrote on some of this a few weeks back and I still contend that it is bigger than this or any blog is, but there are some basics. I hate simple graphics as they can be so empty but I think in this case if we go beyond simple we are writing a book - so here goes:


For most organizations, five simple areas are what need to be looked at. I was going to do the loop-back diagrams but, being a car guy, I like gears better and they get the point across. The point is that all five elements need to work together. The four outer gears, although smaller, are as important to get right as the overall strategy. All of these working together is what drives the organization's business purpose. Mess with one gear or implement it poorly and the overall plan suffers. As I mentioned a few weeks back, ignoring the surprises, or not being prepared to respond to the unknowns, will also cost dearly.

I think that most companies will say that they have these basic elements in their plans, but based on what is happening in the real world we are seeing that either they are not well implemented or they are not being effectively updated and monitored. A plan is only as good as its execution. So take a look at your plans, update them as needed and have in place a regular review - and that does not mean every 5 years. In today's world it should, at least, be part of your quarterly reviews.


- Posted using BlogPress from my iPad

Monday, August 1, 2011

NSTIC Privacy - a better understanding

A few weeks ago I wrote about the NSTIC privacy conference. It was one of those events where a lot of ideas and concerns were discussed, which I always find helpful. Sometimes, however, you need to live some of these things to appreciate them. Here is a case in point .....

Yesterday I took two of my kids to have lunch in Baltimore. We all like the city, and being a short drive away it is nice to swing by to see what is new and what is going on. We decided to grab lunch at a little pizza place just east of the main Inner Harbor area. When we got there I did a quick Yelp check-in and that got posted to my Twitter feed. Three hours later I got a message from the Twitter account of a Baltimore-targeted restaurant deal site. My conclusion is that because of my Twitter post I was marketed to directly for this service.

Now I am the first one to admit that Twitter, Yelp, etc. are very open with limited privacy controls. But I had never been targeted so directly before. Simply because I had dined in a city, I was now a marketing target. Do I have a right to let my friends know where I dine and whether I enjoy it without the worry of being bombarded with service offerings? It is a simple case, but I do see why the privacy groups are concerned with tracking of activities based around our identity.

How do we solve this? Of course there are those out there that will say - do not tweet everything you do. But I am not sure that is a sane business model for someone like Yelp or for the businesses that rely on what is effectively word-of-mouth advertising. I will give kudos to the guys that developed the system to target market me, but that being said I want control of what I am getting or seeing. I strongly believe that a combination of attribute release capabilities and opt-in/opt-out mechanisms needs to be built into the provider systems so I can turn on or turn off these types of activities.
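
As a thought experiment on what "turn on or turn off" could mean in practice, here is a tiny Python sketch of a provider-side, per-attribute, per-purpose opt-in record. Everything in it - the class name, the attribute and purpose labels - is hypothetical; the only point is that the default is closed and nothing is released without an explicit opt-in.

    from dataclasses import dataclass, field

    @dataclass
    class ReleasePolicy:
        # attribute name -> set of purposes the user has opted in to
        opt_ins: dict[str, set[str]] = field(default_factory=dict)

        def opt_in(self, attribute: str, purpose: str) -> None:
            self.opt_ins.setdefault(attribute, set()).add(purpose)

        def may_release(self, attribute: str, purpose: str) -> bool:
            # closed by default: no opt-in means no release
            return purpose in self.opt_ins.get(attribute, set())

    policy = ReleasePolicy()
    policy.opt_in("checkin_location", "friends_feed")
    print(policy.may_release("checkin_location", "friends_feed"))  # True
    print(policy.may_release("checkin_location", "marketing"))     # False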

Hopefully NSTIC will drive the different parties to cooperate on an interoperable way to achieve this, which I believe reduces risk for users and certainly makes them feel more confident about their privacy.


- Posted using BlogPress from my iPad

Friday, July 29, 2011

The Economics of Security

Sitting here in the Washington, DC area these days gives one much reason to ponder economics. As someone who has worked with the Government for over 15 years, there is always the economic factor in the back of one's mind. Most people think that the government just buys things and the process is easy and straightforward. However, this is generally not the case, especially when we talk technology. Programs need to be defined, planned and budgeted. These budgets need to get into a budgetary cycle that can last 18 months and beyond, and then the budgets need to be passed. Even after that it is always possible that things change, and 2011 has offered lots of opportunity for change.

Now you may ask what this has to do with the economics of security. Well, we have all seen the numbers - and while they do vary, they are significant. Data breaches are up over the last 5 years, and up substantially. Over the last couple of years the costs of these breaches have been up as well, with average cost increases per incident somewhere around 15%. Average individual incident costs have been stated to be anywhere from 3-7 million dollars.

With this backdrop Congress has been working on a couple of new bills covering Data Security and Breach Notification. The work done here is to be applauded as it is a step in the right direction. The question becomes what happens to these efforts and other government security efforts as Congress moves to reduce spending?

It is correct that corporations and individuals need to be aware of risks and implement proper mitigation strategies, but what happens to programs like NSTIC, which advocates increasing personal security, and to legislation that defines an oversight framework? Are bills that require action by industry and government effective without oversight and enforcement? On the other front, what happens to government programs within agencies that are looking to improve security with the goal of reducing costs? There are numerous technical advancements that could reduce costs while improving security, such as moving from expensive radio systems within DOD to smartphone-based systems. With the ability today to enable smartphones as strong authentication systems, the technology can be more broadly deployed at lower cost than existing systems. The question is how such an idea moves forward if the funding is not there for it.

The goal here is to mitigate the new risks by improving the baseline systems. Improved authentication systems based on open standards, enhanced authorization systems that leverage existing standards and are interoperable, and validated COTS products are all ways to improve security while controlling costs. Will these get lost in today's new realities? I hope not. Seeing how an NFC-enabled smartphone can be used to access multiple applications using a variety of authenticators, in a way that is easy for a person to understand and use, will only improve the overall security posture and reduce the number of these breaches. And that is good for our economy.


- Posted using BlogPress from my iPad

Monday, July 25, 2011

Authentication Beyond the "Norm"

A week of vacation always gives one time to do some extra reading - catching up and looking at a broader base of topics. This past week I had that opportunity and a theme came through in some of the articles I stumbled upon ... Authentication is generally thought of as a person authenticating to an application or two applications authenticating to allow processing of data. However, there are other things that we need to keep in mind when we think of authentication.

Some of the major issues now being encountered center around the source of base technology. Malicious code embedded in devices manufactured in other countries is one example. To date we have seen this method leveraged in devices used within the power grid, and it has also been seen within components in laptops. This type of extended attack is much deeper and more worrisome for many as it can go undetected for years before being launched. Within the software application environment we of course have code signing, which is intended to mitigate the similar style of attack seen through code delivery. Code signing itself cannot stop attacks that are embedded further down in the base technology, and to some extent it does not mitigate all software-based code attacks. There is a need in both environments to design more complete end-to-end protection of systems.
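
For the code delivery case, the consumer-visible piece of code signing is just a signature check before anything gets installed or run. Here is a minimal sketch using the pyca/cryptography package (my choice of library, not anything referenced above) with placeholder file names; a real code-signing scheme also has to validate the signer's certificate chain and its revocation status, which is exactly the gap discussed above.

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.exceptions import InvalidSignature

    # Verify a detached RSA signature over a downloaded package before using it.
    with open("publisher_key.pem", "rb") as f:   # placeholder: publisher's RSA public key
        public_key = serialization.load_pem_public_key(f.read())
    with open("package.bin", "rb") as f:         # placeholder: the downloaded code
        package = f.read()
    with open("package.sig", "rb") as f:         # placeholder: the detached signature
        signature = f.read()

    try:
        public_key.verify(signature, package, padding.PKCS1v15(), hashes.SHA256())
        print("signature OK - proceed to further install checks")
    except InvalidSignature:
        print("signature check FAILED - do not install")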

Part of the issue in the design of such systems comes down to ensuring compliance in an environment where base technology is being delivered from many sources, and in many cases to a consumer that is first concerned with immediate technological gratification - how many of us click through user agreements before installing software? Certainly there is an attempt at resolution here with the Trusted Computing Group's platform approach, but broad acceptance of this across the many technology platforms has not happened.

So how do consumers and users protect themselves? That may be the multi-billion dollar question. Certainly there are simple things a consumer can do to mitigate risk:

- Where possible buy a platform with a TPM (Trusted Platform Module)
- Only execute signed code
- Ensure proper validation options are turned on (CRL/revocation status checking, for example - see the sketch below for a starting point)
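
On that last bullet, here is a minimal sketch (again using the pyca/cryptography package, with a placeholder file name) that pulls the CRL distribution points out of a certificate. Knowing where the CRL lives is the starting point for turning revocation checking on; the actual fetch-and-check step is left out here.

    from cryptography import x509
    from cryptography.x509.oid import ExtensionOID

    with open("some_cert.pem", "rb") as f:  # placeholder file name
        cert = x509.load_pem_x509_certificate(f.read())

    try:
        cdp = cert.extensions.get_extension_for_oid(
            ExtensionOID.CRL_DISTRIBUTION_POINTS
        ).value
        for point in cdp:
            for name in point.full_name or []:
                print("CRL distribution point:", name.value)
    except x509.ExtensionNotFound:
        print("Certificate carries no CRL distribution points")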

The broader answer is something like TC. Is it signed object code? But how do you validate that code? Who authorizes the signature? Certainly there are many more questions and solutions, but it is a problem that is growing, and it will take a cooperative solution between hardware technology manufacturers, software distribution companies and software developers at a minimum. Education for the consumer is also important and something that has to be considered.

Maybe initiatives like NSTIC will get the conversation going - that would at least be a start.

Sunday, July 10, 2011

What is Security?

I read an interesting article this morning on mobile banking. Now do not get me wrong - I am looking forward to mobile banking but the article raised another question in my mind - what exactly is security?

There are lots of people around the world using mobile payments today. With the advent of Google Wallet, Serve, VISA Wallet, ISIS and others, combined with the rollout of NFC-enabled phones, this will only grow. Of course as this begins to grow these companies will focus their attention on making sure that their systems cannot be breached. The last thing they need is someone paying for something they did not buy, or a repeat of the Bitcoin situation of suspected account breaches. The use of strong authentication and fraud detection improves the security posture, and the retailers themselves are bound to protect data to maintain their agreements with payment systems such as VISA and MasterCard, so one would say the system should be as secure as the payment systems in use today.

But is it truly a secure system? The WSJ article discusses one of the big concerns that came out of the NSTIC privacy conference - reuse of data. What information is the payment system I am using or the retailer I am dealing with collecting? How are they using this? If a retailer collects my cell phone number as part of the authentication process can they keep that data tied to what I bought and the next time I walk near their store text me a coupon? That of course is the most minor case here.

This is yet another demonstration of the need for privacy controls at the device and within the relying party. At the device it should come as a form of information or attribute release option, defaulted for some transaction types and interactive for others. At the relying party, in this case the retailer, it should be some form of opt-in mechanism for data handling. Maybe I want the coupons - but give me the option.

So is a system secure if the user does not have control of their privacy, or is it good enough to say that the system will not have a serious breach? It is an interesting question.

As can be seen, there are still lots of areas where specifications need to be written. Are these areas for policy definition within something like PCI, or are they something that needs to be included in legislative discussions? Lots to be done, and how it happens, I believe, will determine how successful these systems are.

- Posted using BlogPress from my iPad

Wednesday, July 6, 2011

Planning is important

This week is my last week of baseline running in getting prepped for the Marine Corps Marathon. Next week the real training plan kicks in. As I was doing a few miles at the end of last week I had a long discussion with myself on the similarities between marathon training and security planning - and the glaring similarity is that you need a plan and you need to stick to it.

Everyone sees a marathon as 26.2 miles. Yes, that is accurate, but it is not the whole story. It does not talk about the weeks of planning and the months of 30 or 40 mile weeks that you are running. Those 26.2 miles do not talk about the tempo runs, the interval runs or the long distance runs. They do not talk about the handling of injuries, or the planning for hydration and nutrition when running 16, 20 or 26.2 miles. Those are all details that get lost in the vision of a marathon.

Security planning is no different. There is no silver bullet in security planning. It is a long hard slog. Planning simply makes most events more controllable. Yes, there will always be challenges, a breach due to a new zero-day attack or a true APT, but the plan should also include how you handle these, much like the marathon plan includes how you handle a strained muscle or a bad cold. Security planning needs to involve all relevant parties - business owners, CxOs, developers, HR, and where possible relying parties and end users. At the least, the needs and concerns of relying parties and end users need to be assessed and addressed. Security planning is also broad as well as deep. It is not enough to protect the boundaries, as we have all seen from recent attacks like those at the National Labs; the plan must also consider subversive attacks. The plan must also cover things like: how do I know who is entering my network; what happens when their means of authentication has to be challenged, and how does that happen; how do I detect attacks - on the periphery and inside the network; how do I control what hits a desktop - balancing between business function and protection; and of course this list could go on for days' worth of reading.

The point here is that it is the plan that is important - and that plan needs to be constantly assessed based on new needs, new data points and new attacks - and execution of the plan has to be the responsibility of one part of the organization with the assistance, cooperation and input of the others. Just like my marathon training plan, which does not succeed without a lot of help, input and support from my wife and kids, an organization's security plan will not succeed without help and support from your organization and, when needed, some outside experts to give that independent view of how you are doing.

Tuesday, June 28, 2011

Report from NSTIC Privacy at MIT

I have attended the initial two NSTIC conferences and I think I can safely say that things are ..... interesting. My first comment is that it is quite obvious that there is still a lot of work to be done. I firmly believe that in the governance and privacy areas there is one big thing that needs to be done: take what has been accomplished in other areas, map those to see where there is intersection, and then see if that is something that will be useful for NSTIC. I say this because there are lots of good things going on in a number of areas, but right now I think trying to take each one of those and map it to NSTIC, or to see which piece is useful, may be overwhelming. I hope that as responses to the NOI come in, a process like this will help to ease the burden.

The conference itself was thought provoking. Two ideas in particular struck me: data ownership and pseudonymity versus anonymity. There were many others, but these two stood out.

On the data ownership side there was much discussion. Certainly it is easier to define ownership of some elements of data, including things like credit card numbers, social security numbers, birthdate, address, weight, height, etc. .... but what of other types of identifying data? When I buy something online at apple.com, does the fact that I bought something make that my data element, or does it belong to Apple? Certainly Apple needs to know who to charge and what and where to ship, but outside of that, once the transaction is complete, do they need to keep that data if I do not want them to? Should they be allowed to tell Verizon that I just bought a 3G iPad 2 with a Verizon chip in it? These "data breadcrumbs" are left by all kinds of transactions and the question of ownership is interesting.

But of course it is not just ownership - once I have ownership, how do I protect that data from improper use or, for that matter, any use that I do not want? This is an interesting challenge in terms of privacy, and in the process does it step on things like tracking (web tracking being looked at legislatively today)? Does it also step on business models? Experian, TransUnion and others keep data on me that they use to provide market targets to other service and product providers. What happens to these entities and the downstream providers, who use the information, if we change how those breadcrumbs get picked up?

The other interesting data point was the discussion of pseudonymity versus anonymity. For those of us that believe we can be anonymous on the Internet, I present an excerpt from an LA Times blog on the possible exposing of LulzSec. "The A-Team said LulzSec's members were a product of the hacking culture found on the Website 4chan, which is rooted in anonymity, making some feel invincible. .... "The Internet by definition is not anonymous," the group said. "Computers have to have attribution. If you trace something back far enough you can find its origins.""

So do we accept that we will at best be pseudonymous? Does that lead to multiple identities or multiple personae within a single identity? In either case it becomes critical that we prevent linkage between these unless that linkage is driven by the identity owner. This idea is one I will be thinking about some more - it is definitely interesting.

There were many more great ideas shared and I would encourage anyone with interest to visit the NIST NSTIC site to follow the updates.

- Posted using BlogPress from my iPad

Sunday, June 26, 2011

Re-application of Technology

As I was thinking about the upcoming NSTIC Privacy Conference my mind wandered to some of the technical challenges that exist. Some of these are things that we have been discussing for some time and are directly related to privacy. One of the core ideas is that in our online lives we have different degrees of relationships with other entities. Some of these relationships require a high degree of assurance as to who I am - accessing my health records online, for example; others do not require a high degree of assurance (commenting on blogs is one of them). That being said, the question becomes: how do I maintain a single identity but use it differently in different places?

There are a couple of technical solutions that are out there today that are being discussed:

  • Backend Attribute Exchange (BAE) leverages SAML 2.0 and a cooperative architecture to build a system whereby a relying party can request further information about an authenticated entity using a set of standard protocols (a minimal sketch of the request side follows below). This system works very well in an environment like that of first responders, where the community is well defined and the sources of attribute information are well recognized. I also think it can be extended to more general use, but a broader infrastructure for identifying Attribute Providers needs to be architected and a mechanism for predefined release needs to be implemented within these. I do believe all the pieces are there, and I know some of this work is ongoing, so we may not be that far away.
  • uProve is driven largely by Microsoft, but there are a number of open working groups, including one on Claims Agents, that are looking at open source variants of elements of the system. The basic tenet of uProve, from an architecture perspective, is not that different than BAE. Granted, the underlying technology is very different, but the architecture of a relying party communicating with an end entity for authentication and then using a third party to validate claims is not that different than what BAE is achieving. 
There is a third one that comes to mind and it relates to the idea of re-application of technology. ePassport systems are based upon either Basic Access Control (BAC) or Extended Access Control (EAC) mechanisms for access to information on the ePassport. At Entrust, where I work, we have implemented both types of solutions in multiple countries and have been involved in the standards work around ePassports for a number of years. As the BAE and uProve technologies have come to the surface I began to think about how EAC has the same basics in terms of architecture. The EAC architecture is more closed today due to the application, but with the ongoing definition of EAC 2.0 there are a lot more similarities in terms of architecture. Can EAC 2.0 be extended even further such that its protocols go beyond the chip-reader communication set, reusing the ideas there to allow for chip-to-relying-party communication in general? Does this provide the end entity with greater control in terms of information release, and do so in a self-contained environment? Today information release through uProve and BAE has release notification at the attribute service, but could EAC bring that to the credential holder in a self-contained way?
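
To make the BAE bullet above a bit more concrete, here is a minimal sketch of the request side of an attribute pull under my reading of SAML 2.0: the relying party sends an AttributeQuery naming the authenticated subject and the attribute it wants. The issuer, subject and attribute names are made up, and a real BAE-style deployment adds message signing and a SOAP binding, both omitted here.

    import uuid
    import datetime
    import xml.etree.ElementTree as ET

    SAMLP = "urn:oasis:names:tc:SAML:2.0:protocol"
    SAML = "urn:oasis:names:tc:SAML:2.0:assertion"
    ET.register_namespace("samlp", SAMLP)
    ET.register_namespace("saml", SAML)

    def build_attribute_query(subject: str, attribute_name: str) -> bytes:
        # Bare SAML 2.0 AttributeQuery for one subject and one attribute.
        query = ET.Element(f"{{{SAMLP}}}AttributeQuery", {
            "ID": "_" + uuid.uuid4().hex,
            "Version": "2.0",
            "IssueInstant": datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ"),
        })
        issuer = ET.SubElement(query, f"{{{SAML}}}Issuer")
        issuer.text = "https://rp.example.gov"  # hypothetical relying party
        subj = ET.SubElement(query, f"{{{SAML}}}Subject")
        name_id = ET.SubElement(subj, f"{{{SAML}}}NameID")
        name_id.text = subject
        ET.SubElement(query, f"{{{SAML}}}Attribute", {"Name": attribute_name})
        return ET.tostring(query)

    print(build_attribute_query("jane.responder@example.org",
                                "urn:example:attributes:jurisdiction").decode())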

I am planning to explore this a bit more over the next two days at the NSTIC conference in Cambridge so look for a follow-up.

Friday, June 24, 2011

Why PIV-I?

I wanted to follow up on the discussion of PIV-I that I started a couple of weeks back. That discussion centered around what PIV-I is. As is always the case, the "what" has to be paired with a "why". Over the last couple of weeks I have also brought up a couple of other issues that tie in here and can be summarized with "authentication architectures".

The last few months have seen a significant proliferation of attacks against authentication systems: RSA SecurID; a defense contractor having their Active Directory administration services breached; and the Comodo attack against an administrative function. These attacks, and more, were intended to achieve one thing: to get deeper into systems than a normal attack would and to potentially extend the attack's reach outside of the initial target.

So how does this play to PIV-I?

PIV-I does a number of things: it defines a standard issuance process; it defines a standard token; and it defines a standard token interface. The PIV-I specification allows organizations to readily see that other organizations have implemented an authentication mechanism that is measurable. Once you know that the mechanism is measurable, you can decide how it meets your requirements and whether you can trust these credentials, and for what purpose. A good example of this is US Government agencies looking to PIV-I as a way to easily work with external parties - contractors, suppliers, partners and other governments like State and Local. An example of this happening is that DOD has created a list of commercial SSPs that issue PIV and PIV-I credentials that can be trusted within DOD. So there is a business reason to use PIV-I: a strong mechanism to allow for digital interoperability at a high degree of assurance.

There is also a business reason for PIV-I based on the above mentioned attacks. A PIV-I token is a FIPS-validated credential with strong protection mechanisms. Too many wrong passwords and the token is no longer usable. Private keys cannot leave the card. It is a PIN-protected device and has a biometric capability for additional assurance requirements. On top of these, the credential is issued through a process that is well defined, requires in-person proofing, and is separate from the relying party application. This last piece is the important element as it provides a safety barrier in the case of an attack against the authentication system, as discussed in the last post. Another good read on the subject can be found here.

So PIV-I is an authentication mechanism that provides a well-defined process for issuance and a strong credential for identity (physically and logically), and it mitigates some of the risk factors that were in place with the attacks against authentication systems that have happened over the last many months.

Next will be a discussion on using PIV-I in a solutions architecture.

Tuesday, June 21, 2011

APT and Layered Authentication

I was recently speaking with someone about their infrastructure and an issue they were addressing. Their infrastructure is based around Active Directory. It is a standard implementation that uses AD to identify end entities, grant privilege and push policy. The issue is that they are faced with an Advanced Persistent Threat against this existing AD implementation. The question becomes: how does one move from this existing infrastructure to a new one while ensuring that the existing end entity population (machines and persons) can be safely ported, and that the strategy implemented mitigates the risk of ongoing and future APTs?

It seemed to me that the initial strategy has to be one around the segregation of the authentication infrastructure. In a traditional layered security approach we speak of separation of duty, roles and in some cases networks, but infrequently do I see people look at the authentication infrastructure as a separate element - it normally is embedded into some other element, usually preceded by comments like "well, Microsoft gives me a CA embedded in AD". No knock against Microsoft, but we need to be careful of architectural implementations.

The authentication infrastructure is a key component of usability but also of defense. Knowing who is accessing a resource is critical, and in the case of an APT even more so, as the traditional ways to protect a resource may already be compromised. There was a very good example of this within the last year when a major technology/defense firm had their AD compromised. In the process of that compromise the attacker was able to act as an administrator for the certificate authority that was part of the AD implementation and was able to issue credentials that allowed broader access, possibly to a wider audience.

It is becoming critical in light of these persistent attacks that additional precautions be taken. Layering the authentication infrastructure is one element of this; it allows for migration of the core elements with less impact on end entity credentials. Similarly, if the authentication infrastructure becomes compromised, such as in the RSA breach, then you can also take a layered approach to credential replacement. This is more challenging if you do not know how deep the attack went after the credential breach, but that is another reason to stress the importance of the authentication infrastructure. One step beyond that would be to look at layers of credentials within your infrastructure that would allow levels of access to resources based upon risk and the type of credential presented .... but that is another topic.


- Posted using BlogPress from my iPad

Saturday, June 18, 2011

Is security less important to some companies?

I know the above question is a dangerous one. The answer in a general sense is yes - security is more important to some companies than others. But my question is a bit more esoteric.

I read an article yesterday saying that while RSA has come out to say that they will replace SecurID tokens for customers, it will take months for that to happen for all of them, and actually only about a third of customers will have their tokens replaced.

Yes, I, like most people, understand that there are economic sensibilities here and that there may even be uncertainty as to the breadth of the breach and what was taken - but really - only a third will have their tokens replaced? If you are in that 67%, what are you thinking? If it were me I would be thinking .... Is RSA certain I am not vulnerable? Will they warrant that ... with insurance?

Given the mixed messages coming from RSA on the breach overall (Jeffrey Carr did a couple of great pieces on this - see this one), one would have to wonder about the strategy behind the decision on who gets replaced.

It will be interesting to see what follows from all of this. Token-based OTP has been a ubiquitous element of multi-factor authentication for some time, and one wonders if this whole RSA mess hurts the market or just RSA. Only time will tell.


- Posted using BlogPress from my iPad

Thursday, June 16, 2011

Perfect

I went out just before 6 AM for a five mile run .... I have a few gravel roads I can run on and took one of those today. 

As I ran the geese were taking off from a field as light rain drops fell on my face. The sky was cloud covered to the west and a bit overhead but to the east it was clear enough that you could see the sun coming up. I looked and to the south I see a near perfect rainbow ... end to end. I saw that rainbow for the next mile.

As I ran by the school on the hill the sun was now on my right ... I looked left and I could see my shadow on the green field by the school and arching over the school ... framing it ... the perfect rainbow.

I think I found new motivation to get out and run

Tuesday, June 14, 2011

What is PIV-I?

I have been involved with credentialing in the Federal Government for many years, coming on multiple decades to be honest, and it has been an interesting ride. Over the last few years there has been a substantial change, starting with the signing of HSPD-12 in 2004. What HSPD-12 did was codify credential issuance within the Federal Government. HSPD-12 brought in not just a technical specification but also a process specification. The combination of these two elements was intended to address the differences between agencies' issuance processes as well as allowing interoperability of credentials, at a logical and physical level, between agencies.

As part of the extended process, the Federal working groups within the Federal Identity, Credential and Access Management community also developed a set of requirements for organizations external to the Federal Government to create credentials based upon a well-defined set of standards, both from a technical perspective and from an issuance perspective.

So what are the differences between a PIV card and a PIV-I card? I hate to give a bulleted list but I am not sure how else to do it so here goes: (Note the left side will always be PIV and the right PIV-I)

  • Identity Verification: While both PIV and PIV-I strongly authenticate individuals there is a slightly higher requirement for PIV. The PIV IV process requires that the end entity have a National Agency Check with Inquiries (NACI). 
  • 10 vs 2: For PIV cards, ten fingerprints are collected, where possible, while for PIV-I only two are required to be collected
  • FASC-N vs UUID: These are identifiers for the end entity normally used as part of the access control decision for physical access. It is possible that the government will move towards the UUID but for now it is FASC-N. The structure does allow for quick differentiation between a PIV and PIV-I card at a PACS system.
  • Common Policy vs FBCA mapping: From a logical perspective the trust anchor for PIV is the Federal PKI Common Policy Root. For PIV-I the trust anchor is the issuer's root, but the issuer's policy must be mapped to the Federal Bridge CA. This is demonstrated in the different OIDs used in the certificates on the cards (a sketch of checking these OIDs follows below). The intent here is to allow users under the Common Policy to trust users of PIV-I cards through the linkage at the FBCA.
  • Content Signing: Both PIV and PIV-I cards contain signed objects. These objects are signed using certificates that carry specific indicators that they be used only for content signing. The identifiers used are different for PIV and PIV-I and therefore it provides another way to differentiate the card.
  • Physical Layout: A PIV-I card MUST look different than a PIV card. 
So as you can see there are differences but the intent is to make the PIV-I card usable within an organization and to also allow it to interoperate with other organizations, both physically and logically. The publication of the specifications, with a validation process, allows application vendors to build systems, logical and physical, to use the cards and also allows relying parties to be able to determine their policy on use of PIV-I cards.
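
Since the certificate policy OIDs are what a relying party keys on to tell a PIV certificate from a PIV-I one, here is a minimal sketch of that check using the pyca/cryptography package. The OID value shown is deliberately a placeholder; the real values come from the FPKI Common Policy and FBCA certificate policy documents, not from this post.

    from cryptography import x509
    from cryptography.x509.oid import ExtensionOID

    # Placeholder: populate from the Common Policy / FBCA documents you choose to trust.
    ACCEPTED_PIV_I_POLICY_OIDS = {
        "2.16.840.1.101.3.2.1.0.0",  # placeholder value, not a real policy OID
    }

    def cert_asserts_accepted_policy(pem_bytes: bytes) -> bool:
        """Return True if the certificate asserts a policy OID we chose to trust."""
        cert = x509.load_pem_x509_certificate(pem_bytes)
        try:
            policies = cert.extensions.get_extension_for_oid(
                ExtensionOID.CERTIFICATE_POLICIES
            ).value
        except x509.ExtensionNotFound:
            return False
        asserted = {p.policy_identifier.dotted_string for p in policies}
        return bool(asserted & ACCEPTED_PIV_I_POLICY_OIDS)

A full relying-party decision of course also includes path validation up to the appropriate trust anchor and revocation checking; the policy OID check is just the piece that distinguishes the card types.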

So who should think about PIV-I and why? I will leave that to another day.

Friday, June 10, 2011

Continued thoughts on NSTIC Conference

Today is day two of the NSTIC conference and most of today is an extension of the discussions that started to happen in the working groups yesterday afternoon. The discussions themselves are interesting and will lead to a consolidated set of ideas that will be published and then taken into account with the responses to the NOI. All of that will hopefully be the basis of a governance structure that will allow NSTIC to move ahead.

As I sit through these discussions I hear the familiar refrains: Government should not be involved - this should be a Private sector driven effort; government needs to handle liability and provide funding for pilots; the governance needs to be all inclusive with industry, government, relying parties, consumers, education ....; the steering committees need to be small - but represent all parties. Some of these certainly seem to be contradictory so it will be interesting to see where the arrow lands as it spins through the options.

I will note that even when people say that the market will decide how this will happen, I get a little bit nervous. FFIEC defined some very strong requirements for banking systems and authentication, as did HIPAA for healthcare systems; however, these requirements were later overshadowed by other events and the strong requirements were never implemented or enforced. The lack of implementation came from industry and the lack of enforcement was a reaction by government.

We need to get better at this - we need to make rules that are implementable and enforceable and we must hold feet to the fire to have it happen. The end goal is to make the environment more secure but that does not happen unless someone starts and it does not start unless there is a relying party need or a mandate from industry or the government. The reaction of FDIC's Harper is a good example of the right direction - we just need to see it get implemented.

None of this will happen overnight but, as I mentioned in previous posts, that should not stop people from moving forward; it should only help them to understand that they need to be involved and to build systems that can grow and be flexible. Today's authentication solutions will likely not be tomorrow's. I never thought when I got my first RSA SecurID two decades ago that today I would be using a phone-based OTP not from RSA. In fact, after the last 3 months I am kind of glad it is not from RSA.

A Response to a Breach .... Is it enough?

As I was preparing to step into day two of the NSTIC conference I came across an interesting article on the Citi breach that was announced yesterday. Citi Data Theft Points Up a Nagging Problem

Now I would have said the new breach at Citi, but as we know it was not new, and that of course raises a number of questions: When they discovered it, were they worried about the effect of the subsequent press on their reputation, so they kept quiet? Did they want to handle their customers first? Were they worried that they might be targeted further, possibly because they had not resolved the "how it happened" questions? There are lots of questions and likely we will never know the answers.

What was interesting was Sheila Harper's response - that some banks need to strengthen their authentication mechanisms. Now I do have concern that she said "some" banks, but if some large ones do move ahead then it is likely that market forces will convince the small and medium-sized banks to also move the bar forward.

All of this is very interesting given my last couple of days: Wednesday's discussion on the US ideas on international cyberspace and the last two days working on the NSTIC governance structure. NSTIC certainly is very relevant here in that it embodies the administration's idea that enhanced credentialing improves security, which in turn improves commerce. Certainly this is what the chairwoman of the FDIC sees as well, even above and beyond the protection of consumers. Banks will continue to face struggles until they get a handle on making online transactions and access more secure and reliable for their users. NSTIC certainly is a medium- to long-term solution. I say medium to long term in that it will take at least a couple of years to see broad implementations that will get the interest of banks. In the interim, what do banks do?

Well, one thing they can do is look to the government. Treasury Direct today implements enhanced security with its consumers through the use of an additional token. This token is inexpensive, easy to use and has even been implemented in Braille. Today Treasury has issued over a million of these, so yes, it is scalable. A near-term solution that can later interoperate with the NSTIC model - an idea that should be thought about.

Maybe Citi needs to look to Treasury for more than financial bailouts - seems Treasury also has innovative ideas for dealing with customers and keeping their transactions secure.


- Posted using BlogPress from my iPad

Thursday, June 9, 2011

Highlights From Day One of NSTIC Conference

Today was the first day of the NSTIC Governance Conference in Washington. The intent of today and tomorrow is to start to generate the discussion about the governance model for the broad NSTIC program; that, combined with the NOI released yesterday (see http://www.nist.gov/nstic/nstic-frn-noi.pdf), should allow a workable governance model to emerge.

Today started with introductions from Howard Schmidt and Jeremy Grant. Probably the biggest news out of it was the multiple mentions of the newly publicized breach at Citi. Along with that, Jeremy threw out some interesting stats when he introduced NSTIC:
- when DOD moved from passwords to their Common Access Card for authentication they saw a near-immediate 46% drop in breaches
- last year there were 8.1 million US citizens affected by ID theft, totaling $37 billion in losses.
Clear reasons why moving to stronger credentialing is important.

Howard made it clear that this is not an effort that means government credentialing of the citizenry. In fact he was very clear to say, as does NSTIC, that the credentialing needs to come from the private sector and that the government can help that advancement with some funding for pilots and by leading by example, such as the FICAM credentialing efforts and being an early adopter. This plays into something that Jeremy also brought up: there need to be services that use the credential for interest to occur. In discussing this idea he invoked Metcalfe's Law, the idea that the value of a network (telecom in Metcalfe's case) is proportional to the square of the number of connected users. A graphic he showed, with the pinnacle being economic benefit based on trusted identities leading to enhanced security and improved privacy, reinforces the idea that the end goal here is to improve the capabilities to deliver services electronically. That of course needs services that trust the credentials and people using the credentials that are available.

After these introductions the agenda switched to the discussion on governance. The latter part of the morning was focused on how different parts of the technology and application sphere built governance and saw people like Chris Louden discuss FICAM and Joni Brennan of Kantara discuss Kantara and the OIX linkage. Other speakers covered efforts within NACHA, SmartGrid and OMB.

The afternoon saw another element of the thought process, with Tom Smedinghoff kicking things off by discussing elements of the framework and how legal/policy/contracts and the technical side intersect and become somewhat co-dependent (my words - not Tom's). Other speakers included some interesting views on privacy from the ACLU, and other views on governance structure from the eCitizens Foundation and OASIS.

The afternoon then split into work groups and I will give a synopsis of that tomorrow after round 2 concludes.

All in all an interesting set of discussions and it is obvious that there is still much work to be done.


- Posted using BlogPress from my iPad

International Strategy in Cyberspace

I consider myself a fortunate person. Of course that starts with a wonderful and supportive family (it would be nice for the kids to get part-time jobs so they can contribute to that college education ..... but that aside), but I also enjoy my work. It gives me some great opportunities, and I do not mean just the traveling; it gives me the opportunity to be involved with people who make a difference.

Today I was in a meeting of a group that I became involved with over a decade ago that was doing some great work on Critical Infrastructure Protection. That group morphed over the years, and in particular after the creation of DHS, into a group that focused on economic security. Now that may not have sounded all that interesting to many people 7 or 8 years ago, but with the importance of the cyber world to our economy .... things get a bit more interesting now.

Today David Edelman from the National Security Staff was there to discuss the White House's document on International Strategy in Cyberspace. This paper had input from lots of people, including many that I had worked with that decade ago, among them Dan Hurley from NTIA and Howard Schmidt, who a decade ago was with Microsoft and of course is now at the White House. The document itself is interesting, and I saw in it and in the discussion a lot of ideas that Richard Clarke (some would say one of Howard's predecessors) had brought out in Cyber War.

I will likely mention this paper again in the coming week as it is interesting, but I wanted to focus for a minute on one thing that David brought up - the idea of the nationalistic Internet. In the paper the administration promotes an open and interoperable cyberspace. The nationalistic Internet is of course not that. When you think of the nationalistic Internet you can think of China and the restrictions it has placed around information and around companies operating there. Another example is the recent discussion coming out of Iran of an Iranian Internet. This view of controlling the citizenry by limiting what they have the opportunity to see is certainly not what was envisioned oh so many years ago when people like Vint Cerf started to think about how to extend what was being done in closed communities out into a bigger world.

Now there are definitely times when we want to know where information is coming from and who is gaining access to it, but generally speaking, having access to a broad set of information, so that we can evaluate sources and ideas around a single topic before we make a decision, sounds like it would lead us to a better place. Restricting information, in most cases, is simply wrong. I say most cases because there are sets of information, things like child pornography, that should never be available.

Of course we also need to be careful, because the Internet is not solely about information; it has become a major commerce tool. Restricting access or sites at a national level changes some of that paradigm as well, and without broad access the overall success and growth of commerce will, in the long term, be limited.

The paper is a great starting point for the discussion on how to successfully use the Internet as a tool for growth, both economic and social. Tying this paper into what is being done in the US with NSTIC makes it even more interesting and I will talk about that over the next few days.


- Posted using BlogPress from my iPad

Thursday, May 19, 2011

Google Authenticator

Yesterday I sat in on the Interagency Advisory Board meeting. The first topic of conversation was the Google two-factor solution, which made me think I should write a brief piece on my experience with it, as I have used it for a few months now on my personal Google account.


The widely available two-factor solution is based upon an OATH implementation. It supports both the HMAC-based (HOTP) and time-based (TOTP) one-time password algorithms. Today it has two major components: the apps, including an Android app, a BlackBerry app and an iOS app; and a PAM module that can add two-factor authentication to PAM-enabled applications. I use the Android app, which allows me to use counter-based or time-based OTP. The photo is of the iPad (iOS) app.
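
For those curious about what the app is actually doing, here is a minimal sketch of the two OATH algorithms (HOTP per RFC 4226 and TOTP per RFC 6238) using only the Python standard library; the base32 secret below is a made-up example, and Google's production implementation obviously involves more than this.

```python
import base64, hashlib, hmac, struct, time

def hotp(secret_b32: str, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based variant: the counter is the current 30-second window."""
    return hotp(secret_b32, int(time.time()) // period, digits)

# Example secret (made up) in the base32 form shown during enrollment.
print(totp("JBSWY3DPEHPK3PXP"))
```

The server side does the same computation with the shared secret and typically accepts a small window of counters or time steps to allow for clock drift.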

Setting up two-factor can be a bit of an issue - but only from a time perspective. You need to have the app downloaded so you can get going right away, and then you need to think about the applications where you are using your Google identity as your login credential. In some cases these apps cannot prompt for the second factor, so you will be generating application-specific random passwords for those accounts. For example, on my Apple TV I access my YouTube account, but since it cannot handle the second factor I generate a random password and use that to register the account. These are 16-character passwords generated by the service, so changing them as frequently as I change my other passwords is not needed, especially since it is just YouTube. In my case, because of the mix of applications and devices (Gmail on my iPad and on my Android phone, for example), I have 20 apps registered with these random passwords.
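
Just to illustrate the idea behind those application-specific passwords (this is my own sketch, not how Google actually generates them), a random 16-character password carries plenty of entropy for a single, revocable, per-device credential:

```python
import secrets, string

def app_specific_password(length: int = 16) -> str:
    """Illustrative only: a random 16-character, letters-only password of the
    general shape used for application-specific passwords."""
    return "".join(secrets.choice(string.ascii_lowercase) for _ in range(length))

print(app_specific_password())   # e.g. 'qkzvwrmtpbxcnale'
```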

In terms of use, the application-specific passwords are not an issue, as those apps are configured not to require re-entry every time. For the applications that do prompt for a one-time code, it is as easy as starting up the Google Authenticator app on my phone, and I am more likely to have that with me than any hardware-based token. If for some reason I do not have my phone and I REALLY need to access the app, then I have the option of using one of ten single-use, pre-generated backup codes, or I can use voice-based OTP and have them call my pre-registered number. I have never had to do either of these, as I always have my phone.

So far I would say it has been a good experience and I am on the verge of converting my work-related account as well.

Where are things headed? Well, from the conversation yesterday it does sound like Google is looking at alternatives for authentication besides the Authenticator app, but I will leave that conversation for another day.


- Posted using BlogPress from my iPad

Thursday, April 28, 2011

Leveraging existing work to deliver on NSTIC

I was fortunate enough to sit in on a session at the Department of Defense IPM conference a couple of weeks ago given by Andy Ozment, the White House Director for cybersecurity policy. Andy was speaking on the just-released NSTIC (National Strategy for Trusted Identities in Cyberspace).

Andy's talk was a general one that covered the background, the need, the plan at a high level and the intended benefits. I will be upfront here: I believe NSTIC is a good idea, especially when combined with other initiatives being undertaken inside the Federal government today. NSTIC as a program is intended to move the bar forward by incentivizing industry to provide a better credential set to their users. The options it lays out will allow some vendors to offer a highly interoperable credential that addresses a very broad range of services, with implementations built following guidelines developed jointly by government and industry. This truly becomes a public-private initiative: government provides some base requirements to ensure interoperability and improved security, and then provides a framework for credential providers and relying parties to use these credentials. At the same time the government will provide seed money for pilot programs, where interoperability with government applications becomes the carrot for end users to get involved.

Again, I think this is a great idea, and we need to make sure the initiative leverages existing work that has been done and is underway. The existing PIV card is a good example of a federated credential, and its extension into PIV-I broadens the user base for that federation. The PIV and DoD CAC programs today cover somewhere under a few million users; PIV-I is targeted at organizations that need a well-defined credential for physical and logical access that also allows for a high level of interoperability. This type of credential is well suited to medium and large corporations, first responders in all market segments, and state and local governments. That is a very large user population.

Of course PIV and PIV-I are not credentials for the masses. Today people use credentials at varying levels of assurance, some with a true identity linkage and others totally anonymous. Some of these credentials have second-factor capabilities, such as the ability to tie into Google Authenticator, or hardware tokens such as the eBay and PayPal tokens. As smartphones become more capable and secure, we could also see applications such as those offered by Visa and even Starbucks expand into capabilities that would allow interoperability with other relying parties.

The idea that a single relying party will accept credentials from multiple providers using multiple technologies may seem far-fetched, but even today government applications such as NIH's PubMed allow the use of different trust provider platforms for access to their resources.

Of course, as we extend credential acceptance we need to remember that identification does not mean authorization. A federated identity makes things easy for the user, and in some regards easier for the relying party, since it does not need to be an identity provider as well, but we still need to know who we are letting perform transactions and why. Attribute exchange allows identity providers and relying parties to communicate so that authorization decisions can be appropriately made at the relying party's end. This capability could also allow end users to better control what information they share with relying parties, by giving them the ability to release information only in certain cases. Some examples: a blog I am commenting on does not need to know my real name; an online wine store does not need my birthdate, only an assurance that I am 21. There are initiatives today, which I have previously discussed in other posts, that build the basis for this.
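
As a small, hypothetical sketch of that minimal-disclosure idea (the attribute names and data here are invented, not taken from any specific attribute exchange profile), the identity provider can answer a derived question like "is this person over 21" without ever releasing the birthdate itself:

```python
from datetime import date

# Hypothetical subject data held by the identity provider.
USER_RECORD = {
    "name": "Jane Example",
    "birthdate": date(1985, 6, 1),
    "email": "jane@example.com",
}

def release_attributes(requested):
    """Release derived claims where possible instead of the raw values."""
    released = {}
    birthdate = USER_RECORD["birthdate"]
    for attr in requested:
        if attr == "over_21":
            today = date.today()
            age = today.year - birthdate.year - (
                (today.month, today.day) < (birthdate.month, birthdate.day))
            released["over_21"] = age >= 21
        elif attr in USER_RECORD:
            released[attr] = USER_RECORD[attr]
    return released

# The wine store asks only for an age assurance, never the birthdate.
print(release_attributes(["over_21"]))        # {'over_21': True}
```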

Again this is not a case of building something new but rather a case of taking what is out there, leveraging some updated policies and guidance and then architecting an interoperable system.

- Posted using BlogPress from my iPad

Wednesday, April 6, 2011

Rumblings in the identity world

These last few weeks have almost seemed like the coming of the apocalypse .... a breach at EMC opening vulnerabilities in SecurID tokens, someone taking control of a Comodo registration account and improperly issuing certificates, and a breach at Epsilon opening the door to mass phishing attacks.

Of course these events have generated lots of press about their impacts, including pieces like "The Public Key Infrastructure Under Siege" and many others. What I find interesting about much of the coverage is the very negative slant, including the implication that the underlying technology is flawed.

Looking at each of these events points to a set of issues in implementation and relying party applications.

- The EMC breach was the result of an employee opening a file attached to an email. Yes, it was a zero-day attack, but employees receiving emails with attachments need to be aware of the threat and take proper action - such as validating the source address and judging whether such a file would plausibly come from that party. There is no technology issue here.
- The Comodo attack also likely involved a multi-step process, with malware allowing capture of the RA credentials. This is easily preventable by issuing RAs smart cards that ensure the authentication credentials cannot be extracted from the card. In that case an attacker needs both the card and the passphrase that unlocks it.
- The Epsilon attack is still being evaluated, but the message for the consumer is to be aware and smart. Do not click on links in messages you are not confident in. If you deal with a company and get an email purportedly from them, go directly to their website - do not use links in the email. And when you go directly to the site, make sure it is a protected site before giving up information.

Can we stop all attacks? The answer is no, and that will likely always be true. So with that in mind, let's be smart and let's make the end user smart through education. The messages need to be simple though:
- Do not click on links in emails unless you have very high confidence in them. Go directly to company websites rather than using links in email.
- When you go to company sites, look for green! That means look for companies using Extended Validation certificates, which cause the browser address bar to turn green.
- Do not accept certificates that require you to install a new root unless they come from an EV-protected site. Doing so opens you to long-term vulnerabilities.

The education also extends to companies:
- Educate employees on the above.
- Review your logs regularly - daily for sensitive systems such as those issuing credentials.
- If you are building software that uses digital credentials such as PKI, then implement complete solutions. NIST provides test suites that can be used to check implementations (a small illustration of what complete validation means follows this list).
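
As one small example of what "complete" means in practice (this is my own illustration, not one of the NIST test suites), relying-party code should lean on the platform's full chain building and hostname checking rather than shortcutting it:

```python
import socket, ssl

def fetch_validated_cert(host: str, port: int = 443) -> dict:
    """Connect over TLS using the platform trust store; the handshake raises
    an error if chain validation or hostname checking fails."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

print(fetch_validated_cert("example.com").get("subject"))
```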

These are just some starting points. The main point here is that the technology is not in itself broken, but we as developers, users and relying parties need to make sure we are using it correctly. There are no silver bullets, so let's make sure we educate people on how to use the bullets they have.


- Posted using BlogPress from my iPad

Friday, March 4, 2011

The move to the Cloud

It sometimes makes me smile when everyone starts talking about cloud computing. All the buzzwords are out there - no longer TLAs but now four-letter acronyms such as SaaS, IaaS and NaaS, with dual meanings for these and more. Is SaaS software as a service or storage as a service? What we end up with are a few great ideas, some good ideas, and then a whole lot of misunderstanding that leads to bad paths taken and poor implementations.

All of that being said, the idea behind cloud computing - shared services that reduce implementation and operating costs in a standard way - is a great one. Honestly, I use it every day; my Google services are one easy example.

Now that good idea is gaining attention within the US Federal Government, and I believe it is the right move - if attention is paid. Reducing operating costs through consolidation of data centers is a prudent thing to be looking at. I know the government knows how to protect resources, so if it leverages that expertise then consolidation can be achieved not just with cost savings but with improved service and reliability of the infrastructure. Achieving this will take some push from Capitol Hill, OMB or maybe even the White House to ensure that implementations are not hindered by agency politics, but I think that in today's environment of shrinking budgets this will be easier to deliver than it would have been 10 years ago.

One of the other big considerations, of course, will be security. Again, this is not a case of me believing that the government does not understand security; it is instead a case of ensuring that the implementation covers the areas of concern. In my eyes this includes things like:

- appropriate means of authentication based on the resource as well as the operational environment
- implementation of an open authorization model that considers government employees, contractors and non-governmental personnel with a need for access, such as law enforcement and other first responders (a rough sketch of such a model follows this list)
- implementation of fraud detection within the transactional environment to ensure that information remains under the control of the parties authorized to see and use it
- an understanding of the ability to open certain resources to the public using appropriate authentication means
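
As a rough, hypothetical sketch of what I mean by an open authorization model (all of the attribute names and values here are invented), the decision is driven by asserted attributes of the requester and the resource rather than by a fixed list of government accounts:

```python
from dataclasses import dataclass

@dataclass
class Subject:
    org_type: str     # e.g. "federal", "contractor", "state_local", "first_responder"
    assurance: int    # level of assurance of the presented credential

@dataclass
class Resource:
    min_assurance: int
    allowed_org_types: frozenset

def authorize(subject: Subject, resource: Resource) -> bool:
    """Grant access only if the credential is strong enough and the
    requester's organization type is permitted for this resource."""
    return (subject.assurance >= resource.min_assurance
            and subject.org_type in resource.allowed_org_types)

incident_feed = Resource(min_assurance=3,
                         allowed_org_types=frozenset(
                             {"federal", "state_local", "first_responder"}))
print(authorize(Subject("first_responder", assurance=3), incident_feed))  # True
print(authorize(Subject("contractor", assurance=4), incident_feed))       # False
```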

These things, as you can see, are not technology specific but should be considered as part of the government's overall programs with regard to identity within the federal space as well as identity within the non-federal space, such as that defined by NSTIC. Of course, as part of this there is also the consideration of information sharing with other national partners.

This is a topic that has much interest to me so I will be following and discussing this a bit more in the near future.



- Posted using BlogPress from my iPad

Wednesday, February 9, 2011

Traveling and Identity

I was reading an interesting article today from Aviation Week on airport security screening. Of course we have all heard about, and large numbers of us have complained about, the recent changes at the airport. We jockey between lines to try to avoid the full-body scanners and possible pat-downs. I know most people are not doing this to avoid security but to avoid embarrassment, or at least perceived embarrassment. This article made me think more about identity and traveling, and about some of the work being done today to improve identity and authorization decisions within the government sector.

We have all heard that air travel is a privilege - one generated out of the convenience of time. I can go anywhere without having to fly; it just may take a lot longer. In that vein I begin to wonder if people are ready to accept the need for better identity assurance at a security screening checkpoint in order to make traveling easier and maybe safer. What if we had a card-based credential that the traveller scanned prior to entering the metal detector? The credential would ideally carry the same level of assurance as any ICAO travel document and could be issued by a nation, as a passport is, or by the private sector. The traveller would scan their card immediately before entering the metal detector. While the traveller passes through the scanning device, an automated check of the credential would let personnel know whether the credential presented is valid, and combined with a visual check of the card it would allow the person to be matched to the credential. Using this combination could increase the level of assurance in the traveller's identity. If the credential does not properly validate, then additional screening could be performed.

If we take it one step further, we could incorporate some of the work being done in the government with the Backend Attribute Exchange, aka BAE, which would allow the system to reach back to the credential issuer, or potentially a National Travel Blacklist, to see if there are any reasons to further screen the individual. These checks could be performed in seconds - the time it takes for a person to step through a metal detector and make it through the security area. A rough sketch of what that flow might look like is below.
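
Purely as an illustration of that flow (every class and check below is hypothetical; there is no actual TSA or BAE interface behind it), the logic that would have to finish in those few seconds looks something like this:

```python
from dataclasses import dataclass

@dataclass
class Credential:
    subject_id: str
    signature_valid: bool   # issuer signature checked successfully
    revoked: bool           # result of a revocation/status check

def screen_traveller(cred: Credential, flagged_subjects: set) -> str:
    """Decide, while the traveller walks through the metal detector, whether
    additional screening is needed; the officer still does the visual match."""
    if not cred.signature_valid or cred.revoked:
        return "additional screening"           # credential did not validate
    if cred.subject_id in flagged_subjects:     # stand-in for a BAE reach-back
        return "additional screening"
    return "proceed"

print(screen_traveller(Credential("traveller-123", True, False), set()))  # proceed
```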

Of course, in such an idealized system one would need to consider the issues of privacy and the need to ensure tracking information is not maintained in the system. Integrity and availability of the system would be critical to keep additional screening to a minimum. For the frequent, trusted traveler this may mean a faster and easier trip through airport security, and it could provide the basis for a system that adds to the security environment for all air passengers.

Something to think about.

- Posted using BlogPress from my iPad

Sunday, February 6, 2011

Mixed Messages

At the beginning of this past week I was getting on a plane to head to the left coast when I saw what I thought was a good piece on Headline News about the National Strategy for Trusted Identities in Cyberspace, aka NSTIC. It was a quick overview, but they seemed to have gotten the message right - that it is an effort to get industry to improve the capabilities for online identity. The idea behind NSTIC is that government and industry would work together to define and refine standards to ensure that we do not end up with a set of stovepiped identity solutions that cannot interoperate; to work together so that the systems would be secure; and, in the process, to protect privacy as appropriate.

I was somewhat encouraged - the mainstream media seemed to actually understand the effort .... and then a few hours later I saw the headline "Why You Should Trust Apple More Than the U.S. Commerce Dept. With Your Universal Online ID"

POP goes the bubble. Here we have someone writing for Fast Company proposing a corporately patented idea as the right approach. Now, we know that as the standards evolve there will be patent issues and, as in the past, I expect them to be resolved for the greater good, but this article seems to suggest that Commerce is going to hold your identity and that everything will be in the government's control. That is not the vision of NSTIC that I see, or that anyone I know and work with sees.

If you want a vision of what NSTIC is, look no further than the US Government employee ID, the PIV card. Here a standard was developed for internal government use. It had technical and policy aspects and required the use of Government-run or Government-contracted identity providers. But then industry realized that the technical specifications of the card provided a good base for non-governmental people. What happened next? Government and industry worked on refining the standard for non-governmental use. The identity issuers were now private industry. The card issuers were now private industry. The only thing the government did to stay involved was to create a test program around the standard to ensure that the credential could be trusted ... and PIV-I was born.

This is what NSTIC envisions - a public-private partnership where good standards are made better through joint discussion; testing programs are put in place to ensure that products and services meet those standards; and private industry provides these products and services to the masses.

How hard is that to understand? I hope Fast Company spends some time researching so they can criticize where it is deserved.

- Posted using BlogPress from my iPad

Location: the stratosphere

Friday, January 28, 2011

Moving ahead with reduced identities

I had lunch last week with an old colleague. During the lunch we had a good chat about NSTIC. One of the points she brought up was the education aspect - the fact that to most people this whole cybersecurity thing is very foreign, and that there are many things in place today that people do not realize bring additional trust to what they do online.

I have been thinking about that for a while, especially in relation to the work I have been doing on what is effectively credential re-use. She was correct that most people do not know to look for the green bar in the browser that indicates the extra diligence taken to validate the site. I think it is a case of people not knowing why it is there and what it brings. But I also think the same is true of identity re-use. People do it today and do not realize it.

Now, I do not mean just the re-use of driver's licenses, Social Security numbers and the like, but of online identities. I started to think about my own environment. I use my Google identity for many things today - the normal Gmail, Calendar, etc., but also to log onto other applications on my iPad, laptop and Android phone. I use my OpenID to access ToodleDo and other sites, and many of the applications I use leverage SAML, OAuth and OpenID to let me take advantage of credential re-use. Of course, in a lot of these cases I also see Facebook and Twitter options for login, so I have to imagine that people are using those rather than creating yet another account. A rough sketch of how that delegated login pattern typically works is below.
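
This is a very rough sketch of the OAuth 2.0 authorization-code pattern that sits behind many of those "log in with ..." options; the endpoints, client ID and scopes are placeholders I made up, not any real provider's, and real deployments add state/CSRF protection and more.

```python
import json, urllib.parse, urllib.request

AUTHORIZE_URL = "https://idp.example.com/authorize"    # hypothetical identity provider
TOKEN_URL     = "https://idp.example.com/token"
CLIENT_ID     = "my-client-id"
REDIRECT_URI  = "https://app.example.com/callback"

# Step 1: the application never sees my password; it just sends the browser
# to the identity provider to log in and consent.
login_url = AUTHORIZE_URL + "?" + urllib.parse.urlencode({
    "response_type": "code",
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": "openid email",
})
print("Send the browser to:", login_url)

# Step 2: after the redirect back, the application exchanges the one-time
# code for a token it can use on my behalf.
def exchange_code(code: str, client_secret: str) -> dict:
    data = urllib.parse.urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": client_secret,
    }).encode()
    with urllib.request.urlopen(TOKEN_URL, data=data) as resp:
        return json.loads(resp.read())
```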

I think the strength and advantage of NSTIC is the possibility that in a few years I will be able to do all of my online functions using a small set of identities. I may always have that Yahoo mail address, so getting down to just one is unlikely, but if I can get to three that would be great. And at that point I hope I will be able to do what I do today with my OpenID identity and use functions like CallVerifID when the transaction calls for that level of additional authentication.

So yes, there is possibility - and now we just need to decide what to do first: get the infrastructure available, or get people educated as to what they can do today as a starting point.

Some things for thought .... I hope.


- Posted using BlogPress from my iPad