
NSTIC IDESG “layers”

Today at the 3rd Plenary of the IDESG, the Chair of the IDESG (Bob Blakley) presented a high-level vision slide of what the IDESG should be working on. It's a very good slide for the purposes of uniting the work of the IDESG. Each industry area (or stakeholder group) would end up with its own Trust Framework Provider that covers the IdPs, users, and RPs in that space.


NSA introduces two new lightweight ciphers (SIMON and SPECK)

MIT Media Lab – 2013 Legal Hack-a-thon on Identity

Today we had the privilege of hearing a presentation by Louis Wingers and Stefan Treatman-Clark on a couple of lightweight ciphers from the NSA. These are called SIMON and SPECK. The algorithms are not yet published, but they have a paper (pdf copy here) that shows some numbers on the performance of the proposed ciphers.

The SIMON and SPECK algorithms come in a family that ranges from 48 bits to 128 bits. Since the target deployment area is low-power, low-memory devices (e.g. RFID devices), the requirement is that these algorithms use no more than 2000 gates. The paper has a table showing the gate equivalents (GE) and throughput.

Louis and Stefan presented SIMON and SPECK at the 2013 Legal Hack-a-Thon at the MIT Media Lab.
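The algorithms were not public at the time of the talk, but since the design has now appeared in the paper, the SPECK round function is simple enough to sketch in a few lines. Below is a minimal, unoptimized Speck128/128 encryption sketch in Python (64-bit words, rotation amounts 8 and 3, 32 rounds), checked against the test vector in the SIMON and SPECK paper. It is purely illustrative; a real deployment would use a hardened implementation.

```python
# Minimal Speck128/128 sketch: 64-bit words, rotations 8 and 3, 32 rounds.
MASK = (1 << 64) - 1  # keep all arithmetic in 64-bit words

def ror(x, r):
    return ((x >> r) | (x << (64 - r))) & MASK

def rol(x, r):
    return ((x << r) | (x >> (64 - r))) & MASK

def expand_key(k, l, rounds=32):
    """Key schedule: reuses the round function on the two key words."""
    round_keys = [k]
    for i in range(rounds - 1):
        l = ((k + ror(l, 8)) & MASK) ^ i
        k = rol(k, 3) ^ l
        round_keys.append(k)
    return round_keys

def encrypt(x, y, round_keys):
    """One add-rotate-xor (ARX) round per round key."""
    for k in round_keys:
        x = ((ror(x, 8) + y) & MASK) ^ k
        y = rol(y, 3) ^ x
    return x, y

# Test vector from the SIMON and SPECK paper (Speck128/128)
rks = expand_key(0x0706050403020100, 0x0F0E0D0C0B0A0908)
ct = encrypt(0x6C61766975716520, 0x7469206564616D20, rks)
assert ct == (0xA65D985179783265, 0x7860FEDF5C570D18)
```

Note how small the whole cipher is: two rotations, one addition, and two XORs per round, which is exactly why it fits in so few gates.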


The 4 questions on transparency in personal data (disclosure management)

MIT Media Lab – 2013 Legal Hack-a-thon on Identity

Ray Campbell argues quite elegantly and convincingly that the “data ownership” paradigm is not the correct paradigm for achieving privacy and control over personal data. The notion that “I own my data” can be impractical, especially in light of 2-party transactions, where the other party may also “own” portions of the transaction data and may be legally bound to keep copies of “my data”.

Instead, the better approach is to look at “transparency” and visibility into where our data reside and who is using it. Here are the four questions that Ray poses:

  • Who has my data?
  • What data do they have about me?
  • How did they acquire my data?
  • How are they using my data?

Transparency thus becomes an important tool for disclosure management of personal data. These questions could be the basis for the development of a trust framework on data transparency, one which can be used to frame Terms of Service that both I and the Relying Party must accept.


Limitations of the OAuth 2.0 definition of “Client”

I believe the OAuth 2.0 definition of the “client” is too restrictive, and in so doing it has effectively closed off any possibility of OAuth 2.0 supporting true third-party access on the Internet. Although OAuth speaks in terms of Alice-to-Bob sharing of resources, in reality it caters only as far as Alice-to-client sharing (where the “client” is a piece of application software possibly operated by a third party). This point jumps out clearly when we compare the OAuth view of Alice-to-Bob sharing against the UMA view.

The definition of “client” in OAuth 2.0 (RFC6749) is as follows:

 An application making protected resource requests on behalf of the resource owner and with its authorization.

The UMA (draft-06) definition of the “client” is as follows:

 An application making protected resource requests with the resource owner’s authorization and on the requesting party’s behalf.

UMA makes the clear distinction between the “Requesting Party” and the Client (or the “Requester”). The Requesting Party is considered to be the human being (or organization, or a human legally representing an organization), while the Client in UMA is the “proxy” entity through which the Requesting Party accesses the resources hosted at the Resource Server. In the UMA view, Bob is the human person who is using the client but may not be in full control of all aspects of the client’s operation.

Nat Sakimura (from the OpenID Foundation) in his recent blog corrects the common misconception that the client is “Alice” the resource owner. I absolutely agree with this view. However, in order to truly support realistic Alice-to-Bob sharing, OAuth 2.0 needs to expand its definition and understanding of the client.

The following diagram illustrates this further. In this diagram, Alice wants to let Bob access her calendar so that Bob can adjust his travel itinerary to match Alice’s. Alice is the owner of her resource (her calendar file), which resides at the Resource Server (operated by MyCalendarz.com). Bob is using the client (the application operated by Tripitz.com) in his desire to access Alice’s calendar file at MyCalendarz.com. The client is therefore acting on behalf of Bob.

The OAuth 2.0 definition of “client” fails to recognize that a (legal) relationship may exist between the human person (Bob who is driving the “client” application at Tripitz.com) and the company called Tripitz.com.  Thus, in the Alice-to-Bob sharing, OAuth assumes that Bob is directly accessing the resources, whereas in reality Bob is more likely to be using his browser to “remotely manipulate” the client application (being operated by a third-party Tripitz.com) to access Alice’s resources at the Resource Server (MyCalendarz.com). The UMA architecture recognizes the real-world reality that Bob will likely need to have an account at Tripitz.com, in which Bob will be required to accept the Terms of Service (TOS) of Tripitz.com.

UMA recognizes (i) the Bob-to-Tripitz relationship and (ii) the Tripitz-to-CopMonkey relationship by requiring two (2) types of OAuth tokens to be presented/wielded by the client:

  • The Authorization API Token (AAT): this is the OAuth token that belongs to the client (TripItz.com) and which authenticates the client to the Authorization Server.
  • The Requesting Party Token (RPT): this is the OAuth token that belongs to the Requesting Party (Bob) and which authenticates Bob to both the Authorization Server and the Resource Server.

This distinction between the Requester and the Requesting Party in UMA allows legal agreements (i.e. trust framework) to recognize Bob as distinct from Tripitz.com, and accord different legal obligations to these two entities. And from a risk management perspective, it allows finer grain analysis and risk assignments to these entities.
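The two-token split can be made concrete with a small sketch. The structure, names, and values below are purely illustrative (the UMA spec treats tokens as opaque strings, not structured objects); the point is only that each token has a distinct subject, which is what lets a trust framework assign obligations and risk separately.

```python
from dataclasses import dataclass

# Toy model of the two-token split described above.
# All values are illustrative, not from the UMA spec.

@dataclass(frozen=True)
class UmaToken:
    value: str     # the opaque OAuth token string
    subject: str   # the entity the token belongs to
    audience: str  # the server(s) that validate it

aat = UmaToken("aat-example", "TripItz.com (client)",
               "Authorization Server")
rpt = UmaToken("rpt-example", "Bob (requesting party)",
               "Authorization Server and Resource Server")

# Legal/risk analysis can now treat the two subjects separately:
assert aat.subject != rpt.subject
```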

In summary, in order to address the true Alice-to-Bob sharing of resources, OAuth 2.0 needs to:

  • expand its understanding of “client” to mean an application being owned and operated by a third party (not Bob).
  • add another player to the ecosystem, namely Bob the Requesting Party.
  • define that the client is acting on behalf of Bob.


IDESG Membership, ROA and IPR

After more than 6 weeks of drafting by the IDESG Governance subgroup, the IDESG Membership and ROA-related documents are finally complete.

(1) Proposed Membership Agreement

(2) Proposed Intellectual Property Rights Policy

(3) IDESG Rules of Association


Key Dates and Times

  • Ballot on the Membership Agreement & IPR Policy opened at 12:00 noon ET on December 3, 2012
  • Ballot on the Membership Agreement & IPR Policy will close at 12:00 noon ET on December 17, 2012

UMA Tutorial – High Level (Part 1)

Since I’m editing the User-Managed Access (UMA) Core spec, I always seem to be getting questions about UMA. I’ve also noticed that some folks in the IETF OAuth 2.0 WG have not really understood the UMA flows (not surprising, since UMA Core Rev 5C is now over 40 pages long). So I thought a very high-level tutorial would benefit lots of people.

So here goes Part 1 of the tutorial.

When a Requester (acting on behalf of a Requesting Party) seeks access to resources sitting at the Host, the Requester must first obtain an OAuth 2.0 token from the Authorization Manager (AM). In this exchange the Requester behaves like an OAuth client, while the AM behaves like an OAuth 2.0 Authorization Server (AS). The resulting token is referred to in the UMA specs as the Authorization API Token (AAT). This token allows the Requester to use the other APIs at the AM at a later stage; it signals authorization by the AM for the Requester to access those APIs.

After the Requester gets its own AAT, it must now get another token specific to the Requesting Party (call him Bob). This token is referred to as the Requester Permission Token (RPT). Later, when Bob (the Requesting Party) seeks to access resources at the Host (e.g. MyCalendarz.com) by employing the Requester (e.g. TripItz.com), the Requester must forward both tokens (the AAT and the RPT) to the Host.

The reason two tokens are needed is that there might be multiple Requesting Parties (e.g. Bob, Billy, Becky) at the same Requester (TripItz.com) seeking to access the same resources at the same Host. However, the Requester needs to be authorized only once by the AM to access the AM’s APIs.
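To make the flow concrete, here is a minimal sketch of the requests involved. Everything concrete in it is an assumption for illustration: the endpoint URL, the header and field names, and the use of a plain client-credentials grant for the AAT are all hypothetical; the UMA Core spec defines the actual profile.

```python
# Illustrative sketch of the Requester's token dance (endpoint URL,
# field names, and header names below are hypothetical, not from the spec).

AM_TOKEN_ENDPOINT = "https://am.example.com/token"  # hypothetical AM endpoint

def aat_request(client_id, client_secret):
    """Step 1: the Requester authenticates to the AM as an ordinary
    OAuth 2.0 client and asks for its (one-per-Requester) AAT."""
    return {
        "grant_type": "client_credentials",   # RFC 6749 grant type
        "client_id": client_id,
        "client_secret": client_secret,
    }

def rpt_request(aat):
    """Step 2: using the AAT, the Requester asks the AM for an RPT
    tied to the current Requesting Party (Bob)."""
    return {
        "headers": {"Authorization": f"Bearer {aat}"},
        "body": {"token_type": "rpt"},        # hypothetical field
    }

def host_request_headers(aat, rpt):
    """Step 3: the Requester forwards both tokens to the Host, as
    described in the post (header names here are hypothetical)."""
    return {"Authorization": f"Bearer {rpt}", "X-UMA-AAT": aat}

# One AAT per Requester, but a distinct RPT per Requesting Party:
bob_hdrs = host_request_headers("aat-tripitz", "rpt-bob")
billy_hdrs = host_request_headers("aat-tripitz", "rpt-billy")
assert bob_hdrs["X-UMA-AAT"] == billy_hdrs["X-UMA-AAT"]
assert bob_hdrs["Authorization"] != billy_hdrs["Authorization"]
```

The last two assertions capture the point of the paragraph above: TripItz.com is authorized once (shared AAT), while each of Bob, Billy, and Becky carries a distinct RPT.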

[In the next post, we’ll see how the Host has to do the same basic registration to the AM]


On the survival of the NSTIC Privacy Standing Committee

Aaron Titus writes an interesting piece based on his analysis of the recent proposal from Trent Adams (PayPal) to modify the NSTIC governance rules. The abolition of the NSTIC Privacy Standing Committee may have unforeseen impact on the acceptance of the whole NSTIC Identity Ecosystem idea, notably from the privacy front.

During the last decade, starting from the Liberty vs. Passport kerfuffle, we have seen a number of proposals for components of an “identity infrastructure” for the Internet. All in all, there has been little adoption (by consumers) of these technologies for high-value transactions, due IMHO to the lack of privacy-preserving features.

So far I have yet to see a sustainable business model for identities which is focused on the “individual” (i.e. individual-centric) and which preserves his/her personal data. All the agreements and EULAs that we click “yes” to seem to be tilted in favor of the provider. If a provider “loses” my personal information (including credit-card information), there is really little incentive (positive or negative) for them to recover my data. The individual suffers all the losses. Little wonder there is no buy-in from the consumer 🙂


NSTIC Identity Ecosystem Steering Group

Today NSTIC started its 2-day Ecosystem Steering Group meeting in Chicago. I never thought that dialing in all day would be so tiring. Glad that the group (of about 300 people, half in person and half virtual) got over the initial confusion about voting for the candidates and dealing with proposed changes to the Charter and Bylaws.


ZDnet interview

It’s kind of late, but here is a link to the recent interview with Dana Gardner from ZDNet. On the panel were Jim Hietala (VP of Security, Open Group), Dazza Greenwood (Civics.com & MIT), and myself. This was in preparation for the Open Group conference in DC.

More info on the Open Group conference can be found here.


Eran bails out of the OAuth2.0 Spec

So the news this week was that Eran has decided that OAuth 2.0 is a bad specification and wants nothing to do with it. It’s a bit too late to complain about OAuth 2.0. It’s out there, and it’s being used as the basis for many other protocols, such as OpenID Connect and UMA. It’s going to stay around for a while, and perhaps even evolve further. It’s a workable solution for the current generation of web-application APIs.

John Bradley got it right: the OAuth 2.0 sky is not falling.

PS. I don’t know why people are so upset about the IETF process (see comments by Eran and responses by other folks). How many people in the OAuth WG were around for the creation of the IPsec protocol? What about the IKE protocol (starting from Hilarie Orman’s OAKLEY draft)? All in all it took at least 5 years. Not to mention the PKIX WG (still around after 15 years). This *is* the IETF process. Love it or leave it.
