API Authentication News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API authentication conversation and I wanted to include in my research. I'm using all of these links to better understand how the space is approaching authentication for their APIs, going beyond just monitoring and understanding the details of each request and response.

The Information You Get When Allowing Developers To Sign Up For An API Using Github

I'm a big Github user. I depend on Github for managing all my projects, and Github Pages for the presentation layer around all my research. When anything requires authentication, whether for accessing an API, or gaining access to any of my micro apps, I depend on Github authentication. [I have a basic script that I deploy regularly after setting up a Github OAuth application](https://gist.github.com/kinlane/00db3d871b615c8b1c43dbc60ae41f86), which I use to enable authentication for my API portals and applications, handling the OAuth dance, and returning the information I need for my system. After a user authenticates I am left with access to the following fields: id, avatar_url, gravatar_id, url, html_url, followers_url, following_url, gists_url, starred_url, subscriptions_url, organizations_url, repos_url, events_url, received_events_url, type, site_admin, name, company, blog, location, email, hireable, bio, public_repos, public_gists, followers, following, created_at, updated_at, private_gists, total_private_repos, owned_private_repos, disk_usage, collaborators, two_factor_authentication, and plan. Not all of these fields are filled out, and honestly I don't care about most of them for my purposes, but it does provide an interesting look at what you get from Github, compared to a basic email and password approach to authentication. I'm just looking for baseline information to validate that someone is a human being when signing up. Usually a valid email is this baseline. However, I prefer some sort of active profile for a human being, and have chosen Github as the baseline. When anyone signs up I also quickly calculate some other considerations regarding how long they've had a Github account, how active it is, and some numbers regarding this history and activity.
I don't expect everyone to have a full blown public Github profile like I do, but if you are looking to use one of my APIs, or API-driven micro tools, I'm looking for something more than just a valid email--I want some sign of life. I will be evolving this algorithm, and enforcing it in different ways at different times. I always hesitate to use Github as the default login for my API portals and applications, but honestly I think it is a pretty low bar to expect folks to have a Github account. I feel like we should be raising the bar a little when it comes to who is accessing our resources online. The APIs and tooling I'm making available are mine, and I just want to make sure you are human, and are verifiable on some level, and I find that the links available as part of your Github profile provide me with more reliable and verifiable aspects of being human in the tech space. This makes the fields returned as part of Github authentication pretty valuable for verifying humans in my self-service, and increasingly automated world.
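
This "sign of life" evaluation can be sketched as a simple scoring function. The field names here match what Github's user endpoint returns, but the thresholds and weights are hypothetical, purely for illustration, and not the actual algorithm being evolved above:

```python
from datetime import datetime, timezone

def github_signs_of_life(profile):
    """Score a Github profile dictionary (as returned by the /user
    endpoint) using a few of the fields listed above. The thresholds
    are illustrative -- tune them for your own baseline."""
    created = datetime.fromisoformat(profile["created_at"].replace("Z", "+00:00"))
    age_days = (datetime.now(timezone.utc) - created).days
    score = 0
    if age_days > 365:                       # account is older than a year
        score += 1
    if profile.get("public_repos", 0) > 0:   # has shared some code
        score += 1
    if profile.get("followers", 0) > 0:      # other humans follow them
        score += 1
    if profile.get("public_gists", 0) > 0:   # has shared some snippets
        score += 1
    return score
```

A score of two or more might clear the "more than just a valid email" bar, while a zero suggests a throwaway account.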


API Developer Account Basics

I’m helping some clients think through their approach to API management. These projects have different needs, as well as different resources available to them, so I’m looking to distill things down to the essential components needed to get the job done. The first element you need to manage API access is the ability for API consumers to sign up for an account, which will be used to identify, measure the usage of, and engage with each API consumer.

Starts With An Account

While each company may have more details associated with each account, each account will have these basics:

  • account id - A unique identifier for each API account.
  • name - A first name and last name, or organization name.
  • email - A valid email address to communicate with each user.

Depending on how we enable account creation and login, there might also be a password. However, if we use existing OpenID or OAuth implementations, like Github, Twitter, Google, or Facebook, this won’t be needed. We are relying on these authentication formats as the security layer, eliminating the need for yet another password. However, we still may need to store some sort of token identifying the user, adding these two possible elements:

  • password - A password or phrase that is unique to each user.
  • token - An OAuth or other token issued by 3rd party provider.

That provides us with the basics of each API developer account. It really isn’t anything different than a regular account for any online service. Where things start to shift a little, specifically for APIs, is that we need some sort of keys for each account that is signing up for API access. The standard approach is to provide some sort of API key, and possibly a secondary secret to complement it:

  • api key - A token that can be passed with each API call.
  • api secret - A second token, that can be passed with each API call.

Many API providers just automatically issue these API keys when each account is created, allowing consumers to reset and regenerate them at any point. These keys are more about identification than they are about security, as each key is passed along with each API call, providing identification via API logging, which I’ll cover in a separate post. The only security keys deliver is that API calls are rejected if keys aren’t present.
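
Automatically issuing a key and secret at account creation can be as simple as generating two random, URL-safe tokens. This is a minimal sketch, not any particular provider's implementation; resetting is just a matter of calling it again and storing the new pair:

```python
import secrets

def issue_api_credentials():
    """Generate an API key and secret for a new developer account.
    The key identifies the caller in logs with each API call; the
    secret is the optional second token some providers pass alongside."""
    return {
        "api_key": secrets.token_urlsafe(24),    # ~32 URL-safe characters
        "api_secret": secrets.token_urlsafe(32), # ~43 URL-safe characters
    }
```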

Allow For Multiple Applications

Other API providers will go beyond this base level of API account functionality, allowing API consumers to generate multiple sets of keys, sometimes associated with one or many applications. This opens up the question of whether these keys should be associated with the account, or with one or many registered applications:

  • app id - A unique id for the application.
  • app name - A name for the application.
  • app description - A description of the application.
  • api key - A token that can be passed with each API call.
  • api secret - A second token, that can be passed with each API call.

Allowing for multiple applications isn’t something every API provider will need, but it does reduce the need for API consumers to sign up for multiple accounts when they need an additional API key for a separate application down the road. This is something API providers should consider early on, reducing the need to make additional changes down the road when needed. If it isn’t a problem, there is no reason to introduce the complexity into the API management process.
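
Pulling the two bullet lists above together, the account-with-applications model might look like the following sketch. The field names mirror the elements listed above; everything else (class names, the `register_app` helper) is hypothetical scaffolding:

```python
import secrets
from dataclasses import dataclass, field
from typing import List

@dataclass
class Application:
    """One registered application, carrying its own key pair."""
    app_id: str
    app_name: str
    app_description: str = ""
    api_key: str = field(default_factory=lambda: secrets.token_urlsafe(24))
    api_secret: str = field(default_factory=lambda: secrets.token_urlsafe(32))

@dataclass
class Account:
    """The account basics, with zero or more applications attached."""
    account_id: str
    name: str
    email: str
    token: str = ""  # OAuth token when using Github/Twitter/etc. for login
    applications: List[Application] = field(default_factory=list)

    def register_app(self, name, description=""):
        """Issue a fresh application (and key pair) under this account."""
        app = Application(app_id=secrets.token_hex(8), app_name=name,
                          app_description=description)
        self.applications.append(app)
        return app
```

Keeping keys on the application rather than the account means a consumer can add a second application later without a second signup.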

This post doesn’t even touch on logging and usage. It is purely about establishing developer accounts, and providing what they’ll need to authenticate and identify themselves when consuming API resources. Next I’ll explore the usage, consumption, and billing side of the equation. I’m trying to keep things decoupled as much as I possibly can, as not every situation will need every element of API management. It helps me articulate all the moving parts of API management for my readers, and customers, and allows me to help them make sensible decisions, that they can afford.

Ideally, developer accounts are not a separate thing from any other website, web, or mobile application accounts. If I had my way, API developer accounts would be baked into the account management tools for all applications by default. However, I do think things should be distilled down, standardized, and kept simple and modular for API providers to consider carefully, and think about deeply when pulling together their API strategy, separate from the rest of their operations. I’m taking the thoughts from this post and applying them in one project I’m deploying on AWS, and another that will be delivered by custom deploying a solution that leverages Github, and basic Apache web server logging. I’m keeping the approach standardized, but something I can do with a variety of different services, tools, and platforms.


That Point Where API Session Management Becomes API Surveillance

I was talking to my friend’s TC2027 Computer and Information Security class at Tec de Monterrey via a Google hangout today, and one of the questions I got was around managing API sessions using JWT, which was spawned from a story about securing JWT. A student was curious about managing sessions across API consumption, while addressing security concerns, making sure tokens aren’t abused, and making sure API consumption from 3rd parties who shouldn’t have access doesn’t go unnoticed.

I feel like there are two important, and often competing interests occurring here. We want to secure our API resources, making sure data isn’t leaked, and prevent breaches. We want to make sure we know who is accessing resources, and develop a heightened awareness regarding who is accessing what, and how they are putting it to use. However, the more we march down the road of managing sessions, logging, analyzing, tracking, and securing our APIs, the more we are simultaneously ramping up the surveillance of our platforms, and the web, mobile, network, and device clients who are putting our resources to use. Sure, we want to secure things, but we also want to think about the opportunity for abuse, as we are working to manage abuse on our platforms.

To answer the question around how to track sessions across API operations, I recommended thinking about that identification layer, which includes JWT and OAuth, depending on the situation. After that, you should be looking at other dimensions for identifying sessions, like IP address, timestamps, user agent, and any other identifying characteristics. An app or user token is much more about identification than it ever provides actual security, and to truly identify a valid session you should have more than one dimension beyond that key--identifying what healthy sessions look like, as well as unhealthy, or unique sessions that might be out of the realm of normal operations.
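
Combining several of those dimensions into a single session fingerprint is one rough way to notice when a token shows up from an unfamiliar client. This is a sketch of the idea, assuming the caller extracts the token subject, IP, and user agent from each request; the dimensions chosen and the flagging rule are illustrative:

```python
import hashlib

def session_fingerprint(token_subject, ip_address, user_agent):
    """Hash several identifying dimensions into one session id.
    The token identifies the caller, but pairing it with IP and
    user agent surfaces when the same token appears from a
    different client -- one signal of a leaked or abused token."""
    raw = "|".join([token_subject, ip_address, user_agent])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def looks_suspicious(known, token_subject, ip_address, user_agent):
    """Flag a request whose token has only been seen with other
    fingerprints. `known` maps token subjects to sets of
    previously observed fingerprints."""
    fp = session_fingerprint(token_subject, ip_address, user_agent)
    seen = known.get(token_subject, set())
    return bool(seen) and fp not in seen
```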

To accomplish all of this, I recommend implementing a modern API management solution, but also pulling in logging from all other layers, including DNS, web server, database, and any other system in the stack. To be able to truly identify healthy and unhealthy sessions you need visibility, and synchronicity across all logging layers of the API stack--do the API management logs reflect the DNS and web server logs? This is where access tiers, rate limits, and overall consumption awareness really come in, along with having the right tools to lock things down, and freeze keys and tokens, as well as being able to identify what healthy API consumption looks like, providing a blueprint for what API sessions should, or shouldn’t be occurring.

At this point in the conversation I also like to point out that we should stop and consider at what point all of this API authentication, security, logging, analysis, reporting, and session management becomes surveillance. Are we seeking API security because it is what we need, or just because it is what we do? I know we are defensive about our resources, and we should be going the distance to keep data private and secure, but at some point, by collecting more data, and establishing more logging streams, we actually begin to work against ourselves. I’m not saying it isn’t worth it in some cases, I am just saying that we should be questioning our own motivations, and the potential for introducing more abuse, as we police, surveil, and secure our APIs from abuse.

As technologists, we aren’t always the best at stepping back from our work, and making sure we aren’t introducing new problems alongside our solutions. This is why I have my API surveillance research, alongside my API authentication, security, logging, and other management research. We tend to get excited about, and hyper focused on the tech for tech’s sake. The irony of this situation is that we can also introduce exploitation and abuse around our practices for addressing exploitation and abuse around our APIs. Let’s definitely keep having conversations around how we authenticate, secure, and log to make sure things are locked down, but let’s also make sure we are having sensible discussions around how we are surveilling our API consumers, and end users along the way.


The Concept Of API Management Has Expanded So Much That The Concept Should Be Retired

API management was the first area of my research I started tracking on in 2010, and has been the seed for the 85+ areas of the API lifecycle I’m tracking on in 2017. It was a necessary vehicle for the API sector to move more mainstream, but in 2017 I’m feeling the concept is just too large, and the business of APIs has evolved enough that we should be focusing in on each aspect of API management on its own, and retire the concept entirely. I feel like at this point it will continue to confuse, and be abused, and that we can get more precise in what we are trying to accomplish, and better serve our customers along the way.

The main concepts of API management at play have historically been authentication, service composition, logging, analytics, and billing. There are plenty of other elements that have often been lumped in there, like portal, documentation, support, and other aspects, but securing, tracking, and generating revenue from a variety of APIs and consumers has been center stage. I’d say that some of the positive aspects of the maturing and evolution of API management include more of a focus on authentication, as well as the awareness introduced by logging and analytics. Some areas that worry me are that security discussions often stop with API management, and we don’t seem to be having evolved conversations around service composition, billing, and monetization of our API resources. You rarely see these things discussed when we talk about GraphQL, gRPC, evented architecture, data streaming, and other hot topics in the API sector.

I feel like the technology of APIs conversations have outpaced the business of APIs conversations as API management matured and moved forward. Logging, analytics, and reporting have definitely advanced, but understanding the value generated by providing different services to different consumers, seeing the cost associated with operations and the value generated, then charging or even paying consumers involved in that value generation in real-time, seems to be getting lost. We are getting better at the tech of making our digital bits more accessible, and moving them around, but we seem to be losing the thread about quantifying the value, and associating revenue with it in real-time. I see this aspect of API management still occurring, I’m just not seeing the conversations around it move forward as fast as the other areas of API management.

API monetization and plans are two separate areas of my research, and are something I’ll keep talking about, alongside authentication, logging, analysis, and security. I think the reason we don’t hear more stories about API service composition and monetization is that a) companies see this as their secret sauce, and b) there aren’t service providers delivering in these areas exclusively, adding to the conversation. How to rate limit, craft API plans, and set pricing at the service and tier levels are some of the most common questions I get--partly because there isn’t enough conversation and resources to help people navigate, but also because there is insecurity, and skewed views of intellectual property and secret sauce. People in the API sector suck at sharing anything they view as their secret sauce, and with no service providers dedicated to API monetization, nobody is pumping the story machine (beyond me).

I’m feeling like I might be winding down my focus on API management, and focus in on the specific aspects of API management. I’ve been working on my API management guide over the summer, but I’m thinking I’ll abandon it. I might just focus on the specific aspects of conducting API management. IDK. Maybe I’ll still provide a 100K view for people, while introducing separate, much deeper looks at the elements that make up API management. I still have to worry about onboarding the folks who haven’t been around in the sector for the last ten years, and help them learn everything we all have learned along the way. I’m just feeling like the concept is a little dated, and is something that can start working against us in some of the conversations we are having about our API operations, where some important elements like security, and monetization can fall through the cracks.


Github OAuth Applications As A Blueprint

I was creating a very light-weight API management solution for one of my projects the other day, and I wanted to give my API consumers a quick and dirty way to begin making calls against the API. Most of the API paths are publicly available, but there were a handful of POST, PUT, and DELETE paths I didn’t want to just have open to the public. I didn’t feel like this situation warranted a full blown API management solution like Tyk or 3Scale, but if I could just let people authenticate with their existing Github account, it would suffice.

This project has its own Github organization, with each of the APIs living as open source API repositories, so I just leveraged Github, and the ability to create Github OAuth applications, to do what I needed. You can find OAuth applications under your Github organizational settings, and when you are creating one, all you really need is to give the application a name, description, and a home page and callback URL, and then you are given a client id and secret you can use to authenticate individual users with their Github accounts. I didn’t even have to do the complete OAuth dance to get access to resources, or refresh tokens (maybe I will soon), I was just able to implement a single page PHP script to accomplish what I needed for this version.
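
The original post embeds that PHP script as a gist; as a rough Python equivalent (not the author's code), the core of the flow is just two steps against Github's documented OAuth endpoints. The `CLIENT_ID` and `CLIENT_SECRET` placeholders are the values Github issues when you create the OAuth application:

```python
import json
import urllib.parse
import urllib.request

CLIENT_ID = "your-client-id"          # from the Github OAuth app settings
CLIENT_SECRET = "your-client-secret"  # from the Github OAuth app settings

def login_url(callback_url):
    """URL to send the user to; Github redirects back with ?code=..."""
    query = urllib.parse.urlencode({
        "client_id": CLIENT_ID,
        "redirect_uri": callback_url,
    })
    return "https://github.com/login/oauth/authorize?" + query

def exchange_code(code):
    """Trade the callback code for an OAuth access token."""
    data = urllib.parse.urlencode({
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "code": code,
    }).encode()
    req = urllib.request.Request(
        "https://github.com/login/oauth/access_token", data=data,
        headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

With the token in hand, a call to `https://api.github.com/user` returns the profile fields for the authenticated user.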

I am wiring this script up to a Github login icon on my developer portal, and each API consumer will be routed to Github to authenticate, and then the page will handle the callback where I capture the valid Github OAuth token, and the login, name, email, and other basic Github information about the user. Right now the API is open to anyone who authenticates, but eventually I will be evaluating the maturity of the Github account, and limiting access based upon a variety of criteria (number of repos, account creation date, etc.). For now, I’m just looking for a quick and dirty way to allow my API consumers to get access to resources without creating yet another account. Normally I would be using OAuth.io for this, but I’m trying to minimize dependencies on 3rd party services for this project, and Github OAuth applications plus this script worked well.

Once a user is authenticated they can use their Github user name as the appid, and the valid Github OAuth token as the appkey, which are both passed through as headers, leveraging encryption in transport. I’m not overly worried about the security of my APIs--this is more about a first line of defense and identifying consumers--however I will be validating the token with particular API calls. I’m also considering publishing API consumption data to a Github repository created within each user’s account as part of API activity, publishing it as YAML, with a simple dashboard for viewing (authenticated with Github, of course). I’ve had this model in my head for some time, and have written about it before, but I’m just now getting around to having a project to implement it in. I’m calling it my poor man’s API management--something that can be done on a budget (FREE)--but if my needs grow any further I will be using a more professional grade solution like 3Scale or Tyk.
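
Validating that appid/appkey pair server-side amounts to asking Github's user endpoint who the token belongs to, and checking it matches the supplied login. This is a sketch of that check, not the project's actual code; the `fetch` parameter is a hypothetical hook so the lookup can be stubbed out without hitting the network:

```python
import json
import urllib.request

def verify_github_credentials(app_id, app_key, fetch=None):
    """Return True when the appkey header is a Github OAuth token
    belonging to the appid (the Github login) that was supplied."""
    if fetch is None:
        def fetch(token):
            # Ask Github who this token belongs to.
            req = urllib.request.Request(
                "https://api.github.com/user",
                headers={"Authorization": "token " + token})
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)
    try:
        profile = fetch(app_key)
    except Exception:
        return False  # invalid/revoked token, or Github unreachable
    return profile.get("login") == app_id
```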


The ElasticSearch Security APIs

I was looking at the set of security APIs over at Elasticsearch as I was diving into my API security research recently. I thought the areas where they provide security APIs for the search platform were worth noting and including, not just in my API security research, but also in my search and deployment research, and they probably overlap with my authentication research.

  • Authenticate API - The Authenticate API enables you to submit a request with a basic auth header to authenticate a user and retrieve information about the authenticated user.
  • Clear Cache API - The Clear Cache API evicts users from the user cache. You can completely clear the cache or evict specific users.
  • User Management APIs - The user API enables you to create, read, update, and delete users from the native realm. These users are commonly referred to as native users.
  • Role Management APIs - The Roles API enables you to add, remove, and retrieve roles in the native realm. To use this API, you must have at least the manage_security cluster privilege.
  • Role Mapping APIs - The Role Mapping API enables you to add, remove, and retrieve role-mappings. To use this API, you must have at least the manage_security cluster privilege.
  • Privilege APIs - The has_privileges API allows you to determine whether the logged in user has a specified list of privileges.
  • Token Management APIs - The token API enables you to create and invalidate bearer tokens for access without requiring basic authentication. The get token API takes the same parameters as a typical OAuth 2.0 token API except for the use of a JSON request body.
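
As a concrete example of the first of these, the Authenticate API is just a GET with a basic auth header. This sketch only builds the request rather than sending it; note the path shown matches recent Elasticsearch releases (`/_security/_authenticate`), while older X-Pack era versions used `/_xpack/security/_authenticate`, so check the docs for your version:

```python
import base64

def authenticate_request(host, username, password):
    """Build the basic-auth request for Elasticsearch's
    Authenticate API, which returns information about the
    authenticated user."""
    credentials = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {
        "method": "GET",
        "url": f"{host}/_security/_authenticate",
        "headers": {"Authorization": "Basic " + credentials},
    }
```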

Come to think of it, I’ll add this to my API management research as well. Much of this overlaps with what should be a common set of API management services. Like much of my research, there are many different dimensions to my API security research. I’m looking at how API providers are securing their APIs, as well as how service providers are selling security services to API providers. I’m also keen on aggregating common API design patterns for security APIs, and quantifying how they overlap with other stops along the API lifecycle.

While the cache API is pretty closely aligned with delivering a search API, I think all of these APIs provide potential building blocks to think about when you are deploying any API, and represent the Venn diagram that is API authentication, management, and security. I’m going through the rest of the Elasticsearch platform looking for interesting approaches to ensuring their search solutions are secure. I don’t feel like there are any search specific characteristics of API security that I will need to include in my final API security industry guide, but Elasticsearch’s approach has reinforced some of the existing security building blocks I already had on my list.


Requiring ALL Platform Partners Use The API So There Is A Registered Application

I wrote a story about Twitter allowing users to check or uncheck a box regarding sharing data with select Twitter partners. While I am happy to see this move from Twitter, I feel the concept of information sharing simply being a checkbox is unacceptable. I wanted to make sure I praised Twitter in my last post, but I’d like to expand upon what I’d like to see from Twitter, as well as ALL other platforms that I depend on in my personal and professional life.

There is no reason that EVERY platform we depend on couldn’t require ALL partners to use their API, resulting in every single application of our data being registered as an official OAuth application. The technology is out there, and there is no reason it can’t be the default mode for operations. There just hasn’t been the need amongst platform providers, as well as no significant demand from platform users. Even if you don’t get full access to delete and adjust the details of the integration and partnership, I’d still like to see companies share as many details as they possibly can regarding any partner sharing relationships that involve my data.

OAuth is not the answer to all of the problems on this front, but it is the best solution we have right now, and we need to talk more about how we can make it more intuitive, informative, and usable by average end-users, as well as 3rd party developers, and platform operators. APIs plus OAuth are the lowest cost, most widely adopted, standards-based approach to establishing a pipeline for ALL data, content, and algorithms to operate within--one that gives a platform the access and control they desire, while opening up access to 3rd party integrators and application developers, and most importantly, gives a voice to end-users--we just need to continue discussing how we can keep amplifying this voice.

To the folks who will DM, email, and Tweet at me after this story: I know it’s unrealistic and the platforms will never do business like this, but it is a future we could work towards. I want EVERY online service that I depend on to have an API. I want all of them to provide OAuth infrastructure to govern identity and access management for personally identifiable information. I want ALL platform partners to be required to use a platform’s API, and register an application for any user on whose behalf they are accessing data. I want all internal platform projects to also be registered as applications in my OAuth management area. Crazy talk? Well, Google does it for (most of) their internal applications, why can’t others? Platform apps, partner apps, and 3rd party apps all side by side.

The fact that this post will be viewed as crazy talk by most who work in the technology space demonstrates the imbalance that exists. The technology exists for doing this. Doing this would improve privacy and security. The only reason we do not do it is because the platforms, their partners, and investors are too worried about being this observable across operations. There is no reason why APIs plus OAuth applications can’t be universal across ALL platforms online, with ALL partners being required to access personally identifiable information through an API, and with end-users at least involved in the conversation, if not given full control over whether or not personally identifiable information is shared.


Making An Account Activity API The Default

I was reading an informative post about the Twitter Account Activity API, which seems like something that should be the default for ALL platforms. In today’s cyber insecure environment, we should have the option to subscribe to a handful of events regarding our account or be able to sign up for a service that can subscribe and help us make sense of our account activity.

An account activity API should be the default for ALL the platforms we depend on. There should be a wealth of certified aggregate activity services that can help us audit and understand what is going on with our platform account activity. We should be able to look at, understand, and react to the good and bad activity via our accounts. If there are applications doing things that don’t make sense, we should be able to suspend access, until more is understood.

The Twitter Account Activity API Callback request contains three levels of detail:

  • direct_message_events: An array of Direct Message Event objects.
  • users: An object containing hydrated user objects keyed by user ID.
  • apps: An object containing hydrated application objects keyed by app ID.

The Twitter Account Activity API provides a nice blueprint other API providers can follow when thinking about their own solution. While the schema returned will vary between providers, it seems like the API definition, and the webhook driven process can be standardized and shared across providers.
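
A webhook subscriber working with those three levels might start with something like this summary step. The three key names follow Twitter's documented payload; the summary shape itself is just an illustration of what an aggregate activity service could compute:

```python
def summarize_activity(payload):
    """Summarize one Account Activity callback payload using the
    three levels of detail listed above."""
    events = payload.get("direct_message_events", [])  # array of events
    users = payload.get("users", {})   # hydrated users keyed by user ID
    apps = payload.get("apps", {})     # hydrated apps keyed by app ID
    return {
        "event_count": len(events),
        "user_ids": sorted(users.keys()),
        "app_ids": sorted(apps.keys()),
    }
```

A dashboard watching these summaries could flag an unfamiliar app id the moment it starts generating events against your account.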

The Twitter Account Activity API is in beta, but I will keep an eye on it. Now that I have the concept in my head, I’ll also look for this type of API available on other platforms. It is one of those ideas I think will be sticky, and if I can kick up enough dust, maybe other API providers will consider. I would love to have this level of control over my accounts, and it is also good to see Twitter still rolling out new APIs like this.


Defining OAuth Scope Inline Within The API Documentation

I am working on a project using the YouTube API, and came across their inline OAuth 2.0 scopes, allowing you to explore what the API does as you are browsing the API docs. I am a huge fan of what interactive documentation like Swagger UI and Apiary brought to the table, but I'm an even bigger fan of the creative ways people are evolving upon the concept, making learning about APIs a hands-on, interactive experience wherever possible.

To kick off my education of the YouTube API, I started playing with the search endpoint for the YouTube Data API. As I was playing with it, I noticed they had an API explorer allowing me to call the search method and see the live data.

Once I clicked on the "Authorize requests using OAuth 2.0" slider, I got a popup that gave me options for selecting the OAuth 2.0 scopes that would be applied by the API explorer when I make API calls.

The inline OAuth is simple, intuitive, and what I needed to define my API consumption, inline within the YouTube API documentation. I didn't have to write any code or jump through a bunch of classic OAuth hoops. It gives me what I need for OAuth, right in the documentation--simple OAuth is something you don't see very often.
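
Behind that popup, the scope selection just ends up in the `scope` parameter of Google's OAuth 2.0 authorize URL. This sketch shows what the explorer is assembling for you; the endpoint and the `youtube.readonly` scope are Google's, while the client id and callback are placeholders:

```python
import urllib.parse

def youtube_auth_url(client_id, redirect_uri, scopes):
    """Build Google's OAuth 2.0 authorize URL with the chosen
    scopes -- the same choice the docs' popup makes for you."""
    return "https://accounts.google.com/o/oauth2/v2/auth?" + urllib.parse.urlencode({
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": " ".join(scopes),  # scopes are space-delimited
    })

url = youtube_auth_url(
    "your-client-id", "https://example.com/callback",
    ["https://www.googleapis.com/auth/youtube.readonly"])
```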

I'm a supporter of more API documentation being an attractive static HTML layout like this, with little interactive modules embedded throughout the API docs. I'm also interested in seeing more web literacy being thrown in at this layer as well, pulling common web concepts and specification details, and providing popups, tooltips, and other inline API design learning opportunities.

I'm adding YouTube's approach to OAuth to my list of modular approaches to delivering interactive API documentation, for use in future storytelling.


Adding An OAuth Scope Page As One Of My API Management Building Blocks

I've had a handful of suggested building blocks when it comes to authentication, as part of my API management research, but after taking a look at the OAuth Scopes page for the Slack API, I'm going to add another building block just for listing out OAuth scopes.

For platforms that provide OAuth, scopes are how access to users' content and data is broken down and negotiated. At the industry level, OAuth scopes are how power and influence are brokered, so I'm going to start tracking how leading providers define their scopes--I am sure there are some healthy patterns we can all follow here.
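To make the negotiation concrete, here is a minimal sketch of server-side scope enforcement. The scope names loosely follow Slack's "resource:action" convention, but this particular method-to-scope mapping is hypothetical:

```python
# A minimal sketch of server-side OAuth scope enforcement. The scope names
# loosely follow Slack's "resource:action" convention, but this particular
# method-to-scope mapping is hypothetical.
REQUIRED_SCOPES = {
    "channels.list": {"channels:read"},
    "chat.postMessage": {"chat:write"},
}

def is_authorized(granted_scopes, method):
    # The call is allowed only when the token's granted scopes cover
    # everything the method requires.
    return REQUIRED_SCOPES.get(method, set()) <= set(granted_scopes)
```

A scopes page is essentially the human-readable version of that mapping, which is why publishing it matters so much for developers.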

I have had the pleasure of sitting in on OAuth negotiations between major utility providers, as part of my work with the White House and Department of Energy in the past. This work has given me a glimpse into how the access and sharing of data will be negotiated in the future, with OAuth scopes and APIs playing a central role.

It will take me some time to standardize how I gather, store, and publish the OAuth scopes for each API, but I can get started by bookmarking any provider who shares their OAuth scopes, and encouraging other API providers to do the same, by suggesting a formal OAuth scopes page as one possible building block to consider when crafting your API strategy.


Providing An oAuth Signature Generator Inline In Documentation

I talked about Twitter's inclusion of rate limits inline with documentation the other day, which is something I added as a new building block that API providers can consider when crafting their own strategy. Another building block I found while spending time in the Twitter ecosystem was an oAuth signature generator inline within the documentation.

While browsing the Twitter documentation, right before you get to the example request, you get a little dropdown that lets you select from one of your own applications, and generate an oAuth signature without leaving the page.
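For context, Twitter's API relies on OAuth 1.0a, where every request carries an HMAC-SHA1 signature, and producing that signature is exactly what the inline generator saves you from. A rough sketch of the signing step described in RFC 5849; the parameter values are illustrative, not real credentials:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

# A rough sketch of the OAuth 1.0a HMAC-SHA1 signing step from RFC 5849,
# which is what Twitter's inline generator performs behind the scenes.
def oauth1_signature(method, url, params, consumer_secret, token_secret=""):
    # 1. Percent-encode and sort every request parameter
    encoded = sorted(
        (quote(k, safe=""), quote(str(v), safe="")) for k, v in params.items()
    )
    param_string = "&".join(f"{k}={v}" for k, v in encoded)
    # 2. Build the signature base string: METHOD&url&params, each encoded
    base_string = "&".join(
        [method.upper(), quote(url, safe=""), quote(param_string, safe="")]
    )
    # 3. The signing key is consumer secret + "&" + token secret
    signing_key = quote(consumer_secret, safe="") + "&" + quote(token_secret, safe="")
    digest = hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()
```

Getting this dance right by hand is fiddly enough that having the documentation generate a signature for one of your own applications is a real on-boarding win.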

I am seeing oAuth signature generators emerge on a number of API platforms, but this is the first inline version I’ve seen. I’ve added this to my tentative list of oAuth and security building blocks I recommend, but will give it some time before I add it. I like to see more than one provider do something before I put it on the list, but sometimes when it is Twitter, that can be enough.


Reddit API Methods Listed Alpha or By oAuth Scope

I was taking another look at the Reddit API over the weekend, and thought their listing of API endpoints was pretty interesting. They provide two ways of looking at the platform APIs:

  • API Section - Listing of Reddit APIs by account, apps, links, listings, and other groupings.
  • oAuth Scope - Listing of Reddit APIs by each endpoint's oAuth scope.

I'm not sure it is something all APIs should follow, I just thought it was an interesting approach, something I haven’t seen in any other API area. Reddit has a number of APIs, and I can see that if you have specific goals with the API, it might be useful to have the APIs sorted in this way.


When I Remix APIs Using Swagger How Do I Deal With Authentication Across Multiple APIs

One of the things I’m loving about where the API space is going is the simplicity and remixability of available API resources when they are defined with machine readable API definitions like Swagger. An example of this can be found in my recent work to make federal government APIs more discoverable.

I generated machine readable API definitions using Swagger for four separate APIs out of the GSA. The APIs were spread across two separate domains: usa.gov & explore.data.gov. You can follow the details of my research at each of the project repositories, but as I continue with my work, I keep thinking about the power of having a machine readable definition for all four of these APIs, and my ability to now remix these simple, powerful API resources into virtual stacks. After I work my way through the 120 government APIs I have targeted, I will have an amazing index of government API resources to compose from.

My vision around all of this goes beyond just API discovery and finding government APIs. I want to make it so we can compose virtual stacks of API resources that can be used in different scenarios. If you are building a public engagement app for an election, you can assemble exactly the API resources your developers will need, aggregating them using their API definitions into a single developer area—even though the APIs may span multiple federal agency developer areas.

In this scenario, developers don't have to go find all the API resources they need, an architect, or API lead can aggregate everything into a one-stop-shop for what the developers will need. This evolution in API delivery makes me very happy, right up until I come up against the current state of on-boarding with APIs, to get the credentials you need to use the API resource. In this particular scenario, you would have to sign up for an account with 10 or 20 separate agencies, or outside groups—stripping away any benefits gained through the remixing and aggregation of APIs.

How do we solve API consumer access for the API economy? I know that API providers like Mashape and Mashery have tried to solve this with their platforms. These touch on the problem, but provide very siloed solutions--I feel we need one that works for API consumers who are using many APIs, across many providers, in an ad hoc manner. As the lead architect for a mobile project, I shouldn't have to spend hours signing my application up for many different API consumer accounts.

As part of the APIs.json specification, we encourage API providers to include a link to their signup page, but this is just a link. Eventually we need a machine readable API registration flow, where trusted developers can ping, receive authentication instructions, and easily obtain keys to use an API resource. I’m sure there are legacy SOA scenarios my readers will school me on, and I’m hoping to also learn about any existing standards available that address this. My motivation here, is to push my understanding of what is available, and hopefully jump start a conversation with API providers, about any viable solutions to authenticating across multiple APIs as seamlessly as possible, when you are developing distributed apps.
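To make the current state of things concrete, here is what a signup link looks like in an APIs.json file. The property names follow the draft APIs.json specification as I understand it; the agency name and URLs are placeholders:

```json
{
  "name": "Example Agency APIs",
  "url": "https://example.gov/apis.json",
  "apis": [
    {
      "name": "Example Data API",
      "humanURL": "https://developer.example.gov",
      "baseURL": "https://api.example.gov/v1",
      "properties": [
        { "type": "X-signup", "url": "https://developer.example.gov/signup" }
      ]
    }
  ]
}
```

The "X-signup" property is just a pointer for a human to follow; a machine readable registration flow would need that URL to respond with instructions a client could act on automatically.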


My Response To How Can the Department of Education Increase Innovation, Transparency and Access to Data?

I spent considerable time going through the Department of Education RFI, answering each question in as much detail as I possibly could. You can find my full response below. In the end I felt I could provide more value by summarizing my response, eliminating much of the redundancy across different sections of the RFI, and just cutting through the bureaucracy as I (and APIs) prefer to do.

Open Data By Default
All publicly available data at the Department of Education needs to be open by default. This is not just a mandate, this is a way of life. There is no data that is available on any Department of Education websites that should not be available for data download. Open data downloads are not separate from existing website efforts at Department of Education, they are the other side of the coin, making the same content and data available in machine readable formats, rather than available via HTML—allowing valuable resources to be used in systems and applications outside of the department’s control.

Open API When There Are Resources
The answer to whether or not the Department of Education should provide APIs is the same as whether or not the agency should deploy websites—YES! Not all individuals and companies will have the resources to download, process, and put downloadable resources to use. In these situations APIs can provide much easier access to open data resources, and when open data resources are exposed as APIs it opens up access to a much wider audience, even non-developers. Lightweight, simple, API access to open data inventory should be default along with data downloads when resources are available. This approach to APIs by default, will act as the training ground for not just 3rd party developers, but also internally, allowing Department of Education staff to learn how to manage APIs in a safe, read-only environment.

Using A Modern API Design, Deployment, and Management Approach
As usage of the Internet matured in 2000, many leading technology providers like SalesForce and Amazon began using web APIs to make digital assets available to 3rd party partners, and 14 years later there are some very proven approaches to designing, deploying, and managing APIs. API management is not a new and bleeding-edge approach to making assets available; in the private sector there are numerous API tools and services available, and this has begun to extend to the government sector with tools like API Umbrella from NREL, employed by api.data.gov and other agencies, as well as other tools and services being delivered by 18F at GSA. There are many proven blueprints for the Department of Education to follow when embarking on a complete API strategy across the agency, allowing innovation to occur around specific open data and other program initiatives, in a safe, proven way.

Use API Service Composition For Maximum Access & Control
One benefit of 14 years of evolution around API design, deployment, and management is the establishment of sophisticated service composition of API resources. Service composition refers to the granular, modular design and deployment of APIs, while being able to manage who has access to these resources. Modern API access is not just direct, public access to a database. API service composition allows for designing exactly the access to resources that is necessary, in alignment with business objectives, while protecting the privacy and security of everyone involved. Additionally, service composition allows for real-time awareness of how all data, content, and other resources at the Department of Education are accessed and put to use, allowing new APIs to be designed to support specific needs, and existing APIs to evolve based upon actual demand, not just speculation.

Deeper Understanding Of How Resources Are Used
A modern API service composition layer opens up the possibility of a new analytics layer that is not just about measuring and reporting access to APIs; it is about understanding precisely how resources are accessed in real-time, allowing API design, deployment, and management processes to be adjusted in a more rapid and iterative way that contributes to the roadmap, while providing the maximum enforcement of security and privacy for everyone involved. When the Department of Education internalizes a healthy, agency-wide API approach, a new real-time understanding will replace this very RFI-centered process that we are participating in, allowing for a new agility, with more control and flexibility than current approaches. An RFI cycle takes months and contains a great deal of speculation about what could be, whereas API access, coupled with healthy analytics and feedback loops, answers the questions being addressed in this RFI in real-time, reducing resource costs and wasted cycles.

APIs Open Up Synchronous and Asynchronous Communication Channels
Open data downloads represent a broadcast approach to making Department of Education content, data, and other resources available—a one-way street. APIs provide two-way communication, bringing external partners and vendors closer to the Department of Education, while opening up feedback loops with the agency, reducing the distance between it and its private sector partners—potentially bringing valuable services closer to students, parents, and the companies or institutions that serve them. Feedback loops at the Department of Education are currently much wider, occurring annually, monthly, or at the speed of email and phone calls, with the closest being in person at events, which can be a very expensive endeavor. Web APIs provide a real-time, synchronous and asynchronous communication layer that will improve the quality of service between the Department of Education and the public, at a much lower cost than traditional approaches.

Building External Ecosystem of Partners
With the availability of high value API resources, coupled with a modern approach to API design, deployment, and management, an ecosystem of trusted partners can be established, allowing the Department of Education to share the workload with an external partner ecosystem. API service composition allows the agency to open up access to resources only to the partners who have proven they will respect the privacy and security of resources, and are dedicated to augmenting and helping extend the mission of the Department of Education. As referenced in the RFI, think about the ecosystem established by the IRS modernized e-file system, and how the H&R Blocks and Jackson Hewitts of the world help the IRS share the burden of the country's tax system. Where is the trusted ecosystem for the Department of Education? The IRS ecosystem has been in development for over 25 years; the Department of Education needs to get to work on theirs now.

Security Fits In With Existing Website Security Practices
One of the greatest benefits of web APIs is that they utilize the same web technologies that are employed to deploy and manage websites. You don’t need additional security approaches to manage APIs beyond those for existing websites. Modern web APIs are built on HTTP, just like websites, and security can be addressed right alongside current website security practices—instead of delivering HTML, APIs deliver JSON and XML. APIs go even further: by using modern API service composition practices, the Department of Education gains an added layer of security and control, which introduces granular levels of access to all resources, something that does not exist for websites. With a sensible analytics layer, API security isn’t just about locking down, it is about understanding who is accessing resources and how they are using them, striking a balance between the security and accessibility of resources, which is the hallmark of APIs.

oAuth Gives Identity and Access Control To The Student
Beyond basic web security, and the heightened level of control modern API management delivers, there is a 3rd layer to the security and privacy of APIs that does not exist anywhere else—oAuth. Open Authorization, or oAuth, provides an identity and access layer on top of APIs that gives end-users, the owners of personal data, control over who accesses their data. Technology leaders in the private sector are all using oAuth to give platform users control over how their data is used in applications and systems. oAuth is the heartbeat of API security, giving API platforms a way to manage security, and how 3rd party developers access and put resources to use, in a way that gives control to end users. In the case of the Department of Education APIs, this means putting the parent and student at the center of who accesses and uses their personal data, something that is essential to the future of the Department of Education.

How Will Policy Be Changed?
I'm not a policy wonk, nor will I ever be one. One thing I do know is you will never understand the policy implications in one RFI, nor will you change policy to allow for API innovation in one broad stroke--you will fail. Policy will have to be changed incrementally, a process that fits nicely with the iterative, evolutionary life cycle of API management. The cultural change at the Department of Education, as well as evolutionary policy change at the federal level, will be the biggest benefits of APIs at the Department of Education.

An Active API Platform At Department of Education Would Deliver What This RFI Is Looking For
I know it is hard for the Department of Education to see APIs as something more than a technical implementation, and you want to know, understand, and plan everything ahead of time—this is baked into the risk-averse DNA of government. Even with this understanding, as I go through the RFI, I can’t help but be frustrated by the redundancy, bureaucracy, over-planning, and waste that is present in this process. An active API platform would answer every one of the questions you pose, with much more precision than any RFI can ever deliver.

If the Department of Education had already begun evolving an API platform for all open data sets currently available on data.gov, the agency would have the experience in API design, deployment and management to address 60% of the concerns posed by this RFI. Additionally the agency would be receiving feedback from existing integrators about what they need, who they are, and what they are building to better serve students and institutions. Because this does not exist there will be much speculation about who will use Department of Education APIs, and how they will use them and better serve students. While much of this feedback will be well meaning, it will not be rooted in actual use cases, applications and existing implementations. An active API ecosystem answers these questions, while keeping answers rooted in actual integrations, centered around specific resources, and actual next steps for real world applications.

The learning that occurs from managing read-only API access, to low-level data, content and resources would provide the education and iteration necessary for the key staff at Department of Education to reach the next level, which would be read / write APIs, complete with oAuth level security, which would be the holy grail in serving students and achieving the mission of the Department of Education. I know I’m biased, because of my focus on APIs, but read / write access to all Department of Education resources over the web and via mobile devices, that gives full control to students, is the future of the agency. There is no "should we do APIs", there is only the how, and I’m afraid we are wasting time, and we need to just do it, and learn to ask these questions along the way.

There is proven technology and processes available to make all Department of Education data, content and resources available, allowing both read and write access in a secure way, that is centered around the student. The private sector is 14 years ahead of the government in delivering private sector resources in this way, and other government agencies are ahead of the Department of Education in doing this as well, but there is an opportunity for the agency to still lead and take action, by committing the resources necessary to not just deploy a single API, but internalize APIs in a way that will change the way learning occurs in the coming decades across all US institutions.


A. Information Gaps and Needs in Accessing Current Data and Aid Programs

1. How could data sets that are already publicly available be made more accessible using APIs? Are there specific data sets that are already available that would be most likely to inform consumer choice about college affordability and performance?

Not everyone has the resources to download, process, and put open datasets to use. APIs can make all publicly available datasets more accessible to the public, allowing for easy URL access, deployment of widgets and visualizations, as well as integration with existing tools like Microsoft Excel. All datasets should have the option of being published this way, but ultimately the Dept. of Ed API ecosystem should speak to which datasets are the highest value, and warrant API access.

2. How could APIs help people with successfully and accurately completing forms associated with any of the following processes: FAFSA; Master Promissory Note; Loan Consolidation; entrance and exit counseling; Income-Driven Repayment (IDR) programs, such as Pay As You Earn; and the Public Student Loan Forgiveness program?

APIs will help decouple each data point on a form. Introductory information, each question, and other supporting resources can be broken up and delivered via any website or mobile application, evolving a linear, two-dimensional form into an interactive application that people can engage with, providing the assistance needed to properly achieve the goals surrounding a form.

Each form initiative will have its own needs, and a consistent API platform and strategy from the Department of Education will help identify each form's unique requirements, and the custom delivery of just the resources that are needed for a form's target audience.

3. What gaps are there with loan counseling and financial literacy and awareness that could be addressed through the use of APIs to provide access to government resources and content?

First, APIs can provide access to the content that educates students about the path they are about to embark on, before they do, via the web and mobile apps they already frequent, rather than requiring them to visit the source site to learn. Putting the information students need into their hands, via their mobile devices, will increase the reach of the content and the chances that students will consume it.

Second, APIs plus oAuth will give students control over their own educational finances, forcing them to better consider how they will manage all the relationships they enter into, the details of their loans and grants, and their relationships with the schools they attend. With more control over data and content will come a forced responsibility in understanding and managing their finances.

Third, this process will open students' eyes to the wider world of online data and information, and show them that APIs are driving all aspects of their financial life, from their banking and credit cards to managing their online credit score.

APIs are at the heart of the digital economy. The gift of API literacy, given to students when they first leave home, would carry with them throughout their lives, allowing them to better manage all aspects of their online and financial lives—and the Department of Education gave them that start.

4. What services that are currently provided by title IV student loan servicers could be enhanced through APIs (e.g., deferment, forbearance, forgiveness, cancellation, discharge, payments)?

A consistent API platform and strategy from the Department of Education would allow the evolution of a suite of verified partners, such as title IV student loan servicers. A well planned partner layer within an ecosystem would allow student loan servicers to access data from students in real-time, with students having a say in who has access to the data, and how. These dynamics, introduced by and unique to API platforms that employ oAuth, provide new opportunities for partnerships to be established, evolve, and even be terminated when not going well.

API platforms using oAuth provide a unique 3-legged relationship between the data platform, 3rd party service providers, and students (users), one that can be adopted to bring in existing industry partners, but more importantly provides a rich environment for new types of partners to evolve, improving the overall process and workflow a student experiences.

5. What current forms or programs that already reach prospective students or borrowers in distress could be expanded to include broader affordability or financial literacy information?

All government forms and programs should be evaluated for the pros and cons of an API program. My argument within this RFI response is focused on a consistent API platform and strategy from the Department of Education. APIs should be part of every existing program change, and every new initiative in the future.

B. Potential Needs to be Filled by APIs

1. If APIs were available, what types of individuals, organizations, and companies would build tools to help increase access to programs to make college more affordable?

A consistent API platform and strategy from the Department of Education will have two essential components: a partner framework, and service composition. A partner framework defines which external, 3rd party groups can work with Department of Education API resources. Service composition defines how these 3rd party groups can access and ultimately use Department of Education API resources.

All existing groups that the Department of Education interacts with should be evaluated for where they exist in the API partner framework, defining levels of access from the general public and students up to certified and trusted developer and business partnerships.

The partner framework and service composition for the Department of Education API platform should be applied to all existing individuals, organizations, and companies, while also allowing new actors to enter the game, potentially redefining the partner framework and adding new formulas for API service composition, opening up the possibilities for innovation around Department of Education API resources.

2. What applications and features might developers, schools, organizations, and companies take interest in building using APIs in higher education data and services?

As with the questions of which Department of Education forms and programs APIs might apply to, and which individuals, organizations, and companies will use them, what applications developers, schools, organizations, and companies might build cannot be known until the platform is in place. These are the questions an API-centric company or institution asks of its API platform in real-time. You can’t define who will use an API and how they will use it; it takes iteration and exploration before successful applications will emerge.

3. What specific ways could APIs be used in financial aid processes (e.g., translation of financial aid forms into other languages, integration of data collection into school or State forms)?

When a resource is made available via an API, it is broken down into the smallest possible parts and pieces, allowing them to be re-used and re-purposed into every possible configuration. When you make form questions independently available via an API, it allows you to reorder them, translate them, and ask them in new ways.
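A minimal sketch of the idea, with hypothetical question identifiers and translations, showing how independently addressable form questions can be translated or reordered without touching the form as a whole:

```python
# A hypothetical sketch of form questions as individually addressable API
# resources; the question identifiers and translations are made up.
QUESTIONS = {
    "q1": {"en": "What is your legal name?", "es": "¿Cuál es su nombre legal?"},
    "q2": {"en": "What was your household income last year?"},
}

def get_question(question_id, lang="en"):
    # Serve the requested translation, falling back to English when one
    # has not been added yet.
    translations = QUESTIONS[question_id]
    return translations.get(lang, translations["en"])
```

Because each question stands alone, a translation or a reordered presentation is just a different sequence of calls, not a new form.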

This approach works well with forms, allowing each entry of a form to be accessible, transferable, and open for access, with the proper permissions and access levels, owned by the person who owns the form data. This opens up not just the financial aid process, but all form processes, to interoperate with other systems, forms, agencies, and companies.

With the newfound modularity and interoperability introduced by APIs, the financial aid process could be broken down, allowing parents to take part for their role, schools for theirs, and allow multiple agencies to be engaged such as IRS or Department of Veterans Affairs (VA). All of this allows any involved entity or system to do its part for the financial aid process, minimizing the friction throughout the entire form process, even year over year.

4. How can third-party organizations use APIs to better target services and information to low-income students, first-generation students, non-English speakers, and students with disabilities?

Again, this is a question that should be asked in real-time of a Department of Education platform. Discovering how 3rd party organizations can better target services and information to students is the reason for an API platform. There is no way to know this ahead of time; I will leave it to domain experts to attempt an answer.

5. Would APIs for higher education data, processes, programs or services be useful in enhancing wraparound support service models? What other types of services could be integrated with higher education APIs?

A sensibly designed, deployed, managed, and evangelized API platform would establish a rich environment for existing educational services to be augmented, and also allow entirely new types of services to be defined. Again, I will leave it to domain experts to speak to specific service implementations based upon their goals and understanding of the space.

C. Existing Federal and Non-Federal Tools Utilizing APIs

1. What private-sector or non-Federal entities currently offer assistance with higher education data and student aid programs and processes by using APIs? How could these be enhanced by the Department’s enabling of additional APIs?

There are almost 10K public APIs available in the private sector. These should be viewed as a palette for developers, something they draw from as they are painting their apps. It is difficult for developers to know what they will be painting with, without knowing what resources are available. The open API innovation process is rarely able to articulate what is needed and then make that request for resources—API innovation occurs when valuable, granular resources are available from multiple sources, and developers assemble them and innovate in new ways.

2. What private-sector or non-Federal entities currently work with government programs and services to help people fill out government forms? Has that outreach served the public and advanced public interests?

This is another question that should be answered by the Department of Education, with the agency providing us with the answers. How would you know this without a properly defined partner framework? Stand up an API platform, and you will have the answer.

3. What instances or examples are there of companies charging fees to assist consumers in completing otherwise freely available government forms from other agencies? What are the advantages and risks to consider when deciding to allow third parties to charge fees to provide assistance with otherwise freely available forms and processes? How can any risks be mitigated?

I can't speak to what is already going on in the space regarding companies charging fees to consumers; I am not an expert on the education space at this level. This is just such a new paradigm made possible via APIs and open data, and there just aren’t that many examples in the space built around open government data.

First, the partner tiers of API platforms help verify and validate the individuals and organizations who are building applications and charging for services in the space. A properly designed, managed, and policed partner tier can assist in mitigating risk in the evolution of such business ecosystems.

Second, API-driven security layers using oAuth give access control to end-users, allowing students to decide which applications, and ultimately which service providers, have access to their data, revoking it when services are done or a provider is undesirable. With proper reporting and rating systems, policing of the API platform can be something that is done within the community, with the last mile of policing done by the Department of Education.

Proper API management practices provide the identity, access, and control layers necessary to keep resources and end-users safe. Ultimately, who has access to data, who can charge fees, and who plays a role in the ecosystem is up to the Department of Education and end-users, when applications are built on top of APIs.

4. Beyond the IRS e-filing example, what other similar examples exist where Federal, State, or local government entities have used APIs to share government data or facilitate participation in government services or processes - particularly at a scale as large as that of the Federal Student Aid programs?

This is a new, fast growing sector, and there are not a lot of existing examples, but there are a few:

Open311
An API-driven system that allows citizens to report and interact with municipalities around issues within their communities. While Open311 is deployed in specific cities such as Chicago and Baltimore, it is an open source platform and API that can be deployed to serve a market of any size.

Census Bureau
The US Census provides open data and APIs, allowing for innovation around government census survey data, which is used across the private sector in journalism, healthcare, and many other ways. The availability of government census data continually spawns new applications, visualizations and other expressions that wouldn't be realized, or even imagined, if the platform wasn't available.

We The People
The We The People API allows for 3rd-party integration with the White House petition process. It currently allows only read-only access to the information and the petition process, but is possibly one way that write APIs will emerge in the federal government.

There are numerous examples of open APIs and data being deployed in government, even from the Department of Education. All of them are works in progress, and will realize their full potential over time, with maturation, much iteration, and engagement with the public.

D. Technical Specifications

1. What elements would a read-write API need to include for successful use at the Department?

There are numerous building blocks that can be employed in managing read-write APIs, but a couple will be essential to successful read-write APIs in government:

Partner Framework
Defined access tiers for consumers of API data, with appropriate public, partner and private (internal) levels of access. All write methods are accessible only at the partner and internal levels, requiring verification and certification of the companies and individuals who will be building on top of API resources.

Service Management
The ability to compose many different types of API resource access, and create service bundles that are made accessible to different levels of partners. Service management allows for identity and access management, but also billing, reporting, and other granular control over how services are composed, accessed and managed.
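As a rough sketch of service composition, assuming invented tier and service names: access tiers bundle which services a consumer may use, keeping write methods behind the partner and internal levels.

```python
# Sketch of service composition: each access tier bundles the API services
# a consumer may use, so write methods never reach the public tier.
# Tier and service names are hypothetical, for illustration only.

SERVICE_BUNDLES = {
    "public":   {"aid-status:read"},
    "partner":  {"aid-status:read", "aid-forms:read", "aid-forms:write"},
    "internal": {"aid-status:read", "aid-forms:read", "aid-forms:write", "admin:read"},
}

def can_use(tier, service):
    """Write methods are only reachable from partner and internal tiers."""
    return service in SERVICE_BUNDLES.get(tier, set())

assert not can_use("public", "aid-forms:write")  # public stays read-only
assert can_use("partner", "aid-forms:write")     # verified partners can write
assert can_use("internal", "admin:read")
assert not can_use("unknown-tier", "aid-status:read")
```

The same lookup is where billing and reporting hooks would attach, since every request resolves through a tier.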

Open Authorization (oAuth 2.0)
All data made available via Department of Education API platforms that involves personally identifiable information will require the implementation of an open authorization, or oAuth, security layer. oAuth 2.0 provides an identity layer for the platform, requiring developers to use tokens that throttle applications' access to resources, a process that is initiated, managed and revoked by end-users—providing the highest level of control over who has access to data, and what they can do with it, to the people whose personal data is involved.
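A minimal sketch of the oAuth pattern described here, with hypothetical token and scope names: an end-user grants scopes to an application, every request is checked against them, and the user can revoke access at any time.

```python
# Minimal sketch of oAuth-style token checks: a token carries the scopes an
# end-user granted to an application, every request is validated against
# those scopes, and the user can revoke the token whenever they choose.
# Token values, app names, and scope strings are invented for illustration.

tokens = {}  # token string -> {"user": ..., "app": ..., "scopes": set(...)}

def grant(token, user, app, scopes):
    """End-user authorizes an app with a specific set of scopes."""
    tokens[token] = {"user": user, "app": app, "scopes": set(scopes)}

def authorize(token, required_scope):
    """Gate every API request: the token must exist and hold the scope."""
    record = tokens.get(token)
    return record is not None and required_scope in record["scopes"]

def revoke(token):
    """End-user revokes access when a service is done or undesirable."""
    tokens.pop(token, None)

grant("abc123", user="student-1", app="aid-helper", scopes=["read:aid-status"])
assert authorize("abc123", "read:aid-status")      # granted scope
assert not authorize("abc123", "write:aid-forms")  # never granted
revoke("abc123")
assert not authorize("abc123", "read:aid-status")  # revoked by the user
```

The point is that authorization decisions hang off what the end-user granted, not off what the developer requested.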

Federated API Deployments
Not all APIs should be deployed and managed within the Department of Education firewall. API platforms can be made open source so that 3rd-party partners can deploy them within their own environments. Then, via a sensible partner framework, the Department of Education can decide which partners it will allow not just to write to its APIs, but also to pull data from, via their trusted systems and open API deployments.

APIs provide the necessary access to federal government resources, and a sensible partner framework and service management layer, in conjunction with oAuth, will provide the necessary controls for a read / write API in government. If agencies are looking to further push risk outside the firewall, federated API deployments with trusted partners will have to be employed.

2. What data, methods, and other features must an API contain in order to develop apps accessing Department data or enhancing Department processes, programs, or services?

There are about 75 common building blocks for API deployments (http://management.apievangelist.com/building-blocks.html), aggregated after looking at almost 10K public API deployments. Each government API will have different needs when it comes to other supporting building blocks.

3. How would read-only and/or read-write APIs interact with or modify the performance of the Department’s existing systems (e.g., FAFSA on the Web)? Could these APIs negatively or positively affect the current operating capability of such systems? Would these APIs allow for the flexibility to evolve seamlessly with the Department’s technological developments?

There are always risks with API access to resources, but with a partner framework, service management, oAuth, and other common web security practices, these risks can be drastically reduced and mitigated in real time.

Isolated API Deployments
New APIs should rarely be deployed and directly connected to existing systems. APIs can be deployed as an isolated interface, with an isolated data store. Existing systems can use the same API interface to read / write data and keep in sync with internal systems. API developers will never have access to existing systems and data stores, just the isolated, defined API interfaces made available as part of a secure partner tier, accessing only the services they have permission to use, and the end-user data that end-users themselves have granted access to.
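A minimal sketch of this isolation pattern, with hypothetical record structures: the public-facing API touches only an isolated store, and an internal job, running behind the firewall, decides what syncs into the existing system of record.

```python
# Sketch of an isolated API deployment: the public API writes only to its
# own isolated store, and an internal sync process pulls records into the
# existing system. API consumers never touch internal_system directly.
# Record fields and names are hypothetical, for illustration only.

isolated_store = []   # what the public-facing API reads and writes
internal_system = []  # existing system of record, behind the firewall

def api_write(record):
    """3rd-party apps write to the isolated store only."""
    isolated_store.append({**record, "synced": False})

def sync_to_internal():
    """Internal process decides what flows into existing systems."""
    for record in isolated_store:
        if not record["synced"]:
            internal_system.append(
                {k: v for k, v in record.items() if k != "synced"}
            )
            record["synced"] = True

api_write({"form": "FAFSA", "field": "income", "value": 42000})
assert internal_system == []  # nothing reaches internal systems on write
sync_to_internal()
assert len(internal_system) == 1
assert internal_system[0]["value"] == 42000
```

In a real deployment the sync step is where validation, review, and business logic would sit, so a compromised API consumer can never reach the system of record.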

Federated Deployments
As described above, if government agencies are looking to further reduce risk, API deployments can be designed and deployed as open source software, allowing partners within the ecosystem to download and deploy them. A platform partner framework can provide a verification and certification process for federated API deployments, allowing the Department of Education to decide who it will pull data from, reducing the risk to internal systems and providing a layer of trust for integration.

Beyond these approaches to deploying APIs, one of the biggest benefits of web API deployments is that they use the same security as other government websites, with just an additional security layer determining who has access, and to what.

It should be a rare instance when an existing system has an API deployed with direct integration. API automation will provide the ability to sync API deployments with existing systems and data stores.

4. What vulnerabilities might read-write APIs introduce for the security of the underlying databases the Department currently uses?

As stated above, there should be no compromise in how data is imported into existing databases at the Department of Education. It is up to the agency to decide which APIs it pulls data from, and how that data is updated as part of existing systems.

5. What are the potential adverse effects on successful operation of the Department’s underlying databases that read-write APIs might cause? How could APIs be developed to avoid these adverse effects?

As stated above, isolated and external, federated API deployments will decouple the risk from existing systems. This is the benefit of APIs: they can be deployed as isolated resources, and then integration and interoperability, internally and externally, is up to the consumer, who decides what is imported and what isn't.

6. How should APIs address application-to-API security?

A modern API partner framework, service management and oAuth provide the necessary layers to identify who has access, and which resources can be used, not just by a company and user, but by each application they have developed.

Routing all API access through the partner framework, plus the associated service level, will secure access to Department of Education resources by applications, with user- and app-level logging of what was accessed and used within an application.

OAuth provides a balance to this application-to-API security layer, allowing the Department of Education to manage the security of API access, and developers to request access for their applications, while ultimate control remains in the hands of end-users, who define which applications have access to their data.

7. How should the APIs address API-to-backend security issues? Examples include but are not limited to authentication, authorization, policy enforcement, traffic management, logging and auditing, TLS (Transport Layer Security), DDoS (distributed denial-of-service) prevention, rate limiting, quotas, payload protection, Virtual Private Networks, firewalls, and analytics.

Web APIs use the exact same infrastructure as websites, allowing for the re-use of existing security practices employed for websites. However, APIs provide added layers of security, logging, auditing and analytics through the lens of the partner framework and service composition, limited only by the service management tooling available.

8. How do private or non-governmental organizations optimize the presentation layer for completion and accuracy of forms?

Business rules. As demonstrated as part of a FAFSA API prototype, business rules for each form field, along with rejection codes, can also be made available via API resources, allowing developers to build a form validation layer into all digital forms.

After submission, beyond the first line of defense provided by API developers building next generation forms, platform providers can provide further validation, review, and ultimately a status workflow that allows forms to be rejected or accepted based upon business logic.
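As a rough sketch of how such field-level business rules might be exposed and applied, assuming invented field names, limits, and rejection codes (not the actual FAFSA rules):

```python
# Sketch of form-field business rules served via an API: each field carries
# its rule and a rejection code, so developers can validate before
# submission. Field names, limits, and codes are hypothetical.

business_rules = {
    "adjusted_gross_income": {"type": int, "min": 0, "reject_code": "R-101"},
    "email": {"type": str, "contains": "@", "reject_code": "R-205"},
}

def validate(form):
    """Return the rejection codes for fields that break the rules."""
    rejections = []
    for field, rule in business_rules.items():
        value = form.get(field)
        if not isinstance(value, rule["type"]):
            rejections.append(rule["reject_code"])
        elif "min" in rule and value < rule["min"]:
            rejections.append(rule["reject_code"])
        elif "contains" in rule and rule["contains"] not in value:
            rejections.append(rule["reject_code"])
    return rejections

# A valid form passes; an invalid one returns the codes to show the user.
assert validate({"adjusted_gross_income": 42000, "email": "a@b.edu"}) == []
assert validate({"adjusted_gross_income": -5, "email": "nope"}) == ["R-101", "R-205"]
```

Serving `business_rules` itself from an API endpoint is what lets every 3rd-party form stay in sync with the platform's validation logic.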

9. What security parameters are essential in ensuring there is no misuse, data mining, fraud, or misrepresentation propagated through use of read- only or read-write APIs?

A modern API service management layer allows the platform provider to see all API resources that are being accessed, and by whom, and to easily establish patterns of healthy usage, as well as patterns of misuse. When misuse is identified, service management allows providers to revoke access, and take action against companies and individuals.
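A minimal sketch of how a service management layer might spot and act on misuse, assuming an invented per-window threshold and consumer names:

```python
# Sketch of a service-management layer spotting misuse: log every call per
# consumer, compare against a healthy-usage threshold, and revoke access
# when the pattern breaks. The threshold and names are illustrative only.

from collections import Counter

HEALTHY_CALLS_PER_WINDOW = 100
call_log = Counter()
revoked = set()

def record_call(consumer):
    """Log every access; revoke consumers whose pattern looks like misuse."""
    if consumer in revoked:
        return False  # already cut off by service management
    call_log[consumer] += 1
    if call_log[consumer] > HEALTHY_CALLS_PER_WINDOW:
        revoked.add(consumer)  # take action against the offender
        return False
    return True

for _ in range(150):
    record_call("scraper-app")

assert "scraper-app" in revoked          # misuse pattern detected
assert record_call("normal-app")         # healthy consumers unaffected
```

Real service management tooling would key these patterns per time window and per service, but the revoke-on-anomaly loop is the same idea.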

Beyond the platform provider, APIs allow for management by end-users through common oAuth flows and management tools. Sometimes end-users can identify that an app is misusing their data even before a platform provider might. oAuth gives them the control to revoke access to their data via the API platform.

oAuth, combined with API service management tooling, has allowed for a unique security environment in which the platform can easily keep operations healthy, while end-users and developers help police the ecosystem as well. If platform providers give users the proper rating and reporting tools, they can help keep API and data consumers in check.

10. With advantages already built into the Department’s own products and services (e.g., IRS data retrieval using FAFSA on the Web), how would new, third-party API-driven products present advantages over existing Department resources?

While existing products and services developed within the department do provide great value, the Department of Education cannot do everything on its own. Because of the access the Department has, some features will be better by default, but this won't be the case in all situations.

The Department of Education and our government do not have unlimited resources, and with access to ALL resources available via the department, the private sector can innovate, helping share the load of delivering vital services. It's not about whether public sector products and services are better than private sector ones, or vice versa; it is about the public sector and private sector partnering wherever and whenever it makes sense.

11. What would an app, service or tool built with read-write API access to student aid forms look like?

Applications will look like TurboTax and TaxACT, developed within the IRS ecosystem, and like the tools developed by the Sunlight Foundation on top of government open data and APIs.

We will never understand what applications are possible until the necessary government resources are available. All digital assets should be open by default, with a consistent API platform and strategy from the Department of Education, and the platform will answer this question.

E. Privacy Issues

1. How could the Department use APIs that involve the use of student records while ensuring compliance with potentially applicable statutory and regulatory requirements, such as the Family Educational Rights and Privacy Act (20 U.S.C. § 1232g; 34 CFR Part 99) and the Privacy Act (5 U.S.C. § 552a and 34 CFR Part 5b)?

As described above, the partner framework, service management and oAuth layers provide the control and logging necessary to execute and audit as part of any applicable statutory and regulatory requirements.

I can't articulate enough how this layer provides a tremendous amount of control over how these resources are accessed, giving control to the involved parties who matter the most—end-users. All API traffic is throttled, measured and reviewed as part of service management, enforcing privacy through a partnership between the Department of Education, API consumers and end-users.

2. How could APIs ensure that the appropriate individual has provided proper consent to permit the release of privacy-protected data to a third party? How can student data be properly safeguarded to prevent its release and use by third parties without the written consent often required?

As articulated above, the partner framework, service management and oAuth address this. This is a benefit of API deployment: breaking down existing approaches to digital access, and providing granular control, combined with oAuth and the logging of all access—APIs take control to a new level.

oAuth has come to represent this new balance in the security and control of digital resources, allowing the platform, developers and end-users to execute within their defined roles on the platform. This balance introduced by APIs and oAuth allows data to be safeguarded, while also opening it up for the widest possible use in the next generation of applications and other implementations.

3. How might read-only or read-write APIs collect, document, and track individuals’ consent to have their information shared with specific third parties?

oAuth. Period.

4. How can personally identifiable information (PII) and other financial information (of students and parents) be safeguarded through the use of APIs?

Access to personally identifiable information (PII) via Department of Education APIs will be controlled by students and their parents. The most important thing you can do to protect PII is to educate the owners of that data about how to allow developers access to it in responsible ways that will benefit them.

APIs open up access, while oAuth gives students and parents the control they need to integrate with apps and existing systems to achieve their goals, while retaining the greatest amount of control over safeguarding their own data.

5. What specific terms of service should be enabled using API keys, which would limit use of APIs to approved users, to ensure that information is not transmitted to or accessed by unauthorized parties?

A well-designed partner layer defining multiple levels of access, combined with sensible service packages, will establish the terms of service levels that are bundled with API keys and oAuth-level identity and access to personally identifiable information.

Common approaches to deploying partner layers with appropriate service tiers, using oAuth, have been well established over the last 10 years in the private sector. Controlling access to API resources at a granular level, providing the greatest amount of access that makes sense, while knowing who is accessing data and how they are using it, is what APIs are designed for.

6. What are the relative privacy-related advantages and disadvantages of using read-only versus read-write APIs for student aid data?

You will face many of the same privacy concerns whether an API is read or write. If it is personally identifiable information, read or write access by the wrong parties violates a student's privacy. Ensuring that data is updated only via trusted application providers is essential.

A properly defined partner layer will separate who has read and who has write access. Proper logging and versioning of data is essential to ensure data integrity, allowing end-users to manage their data via an application or system with confidence.

F. Compliance Issues

1. What are the relative compliance-related advantages and disadvantages of using read-only versus read-write APIs for student aid data?

APIs provide a single point of access to student aid data. With the implementation of a proper partner framework, service management and oAuth, every single action via this doorway is controlled and logged. When it comes to auditing ALL operations, whether from the public, partners or internal consumers, APIs excel at satisfying compliance concerns.

2. How can the Department prevent unauthorized use and the development of unauthorized products from occurring through the potential development of APIs? How might the Department enforce terms of service for API key holders, and prevent abuse and fraud by non-API key holders, if APIs were to be developed and made available?

As described above, the partner framework, service management and oAuth will provide the security layer needed to manage 99% of potential abuse, but overall enforcement via the API platform is a partnership between the Department of Education, API consumers, and end-users. The last mile of enforcement will be executed by the Department of Education, but it will be up to the entire ecosystem and platform to police and enforce in real-time.

3. What kind of burden on the Department is associated with enforcing terms and conditions related to APIs?

The Department of Education will handle the first line of defense, defining the partner tiers and service composition that wrap all access to APIs. The Department will also be the last mile of decision making and enforcement when violations occur. The platform should provide the data the department needs to make decisions, as well as the enforcement mechanisms necessary, in the form of API key and access revocation, and banning apps, individuals and businesses from the ecosystem.

4. How can the Department best ensure that API key holders follow all statutory and regulatory provisions of accessing federal student aid funds and data through use of third-party products?

The first line of defense to ensure that API key holders follow all statutory and regulatory provisions will be the verification and validation of partners upon registration, when applications go into production, and before availability in application galleries and other directories in which students discover apps.

The second line of defense will be reporting requirements and the usage patterns of API consumers and their apps. If applications regularly meet self-reporting requirements and their real-time patterns establish healthy behavior, they can retain their certification. If partners fail to comply, they will be restricted from the API ecosystem.

The last line of defense is the end-users, the students and parents. All end-users need to be educated regarding the control they have, and given reporting and ranking tools that allow them to file complaints and rank the applications that are providing quality services.

As stated several times, enforcement will be a community effort, something the Department of Education has ultimate control of, but one that requires giving the community agency as well.

5. How could prior consent from the student whom the data is about be provided for release of privacy- protected data to third party entities?

An API with an oAuth layer is this vehicle, providing the access, logging all transactions, and holding all partners to a quality of service. All the mechanisms are there in a modern API implementation; the access just needs to be defined.

6. How should a legal relationship between the Department and an API developer or any other interested party be structured?

I’m not a lawyer. I’m not a policy person. Just can’t contribute to this one.

7. How would a legal relationship between the Department and an API developer or any other interested party affect the Department’s current agreements with third-party vendors that operate and maintain the Department’s existing systems?

All of this will be defined in each partner tier, combined with appropriate service levels. With isolated API deployments, this should not affect current implementations.

However, a benefit of a consistent API strategy is that existing vendors can access resources via APIs, increasing the agility and flexibility of existing contracts. APIs are a single point of access, not just for the public, but for 3rd-party partners as well as internal consumers. Everyone involved can participate and receive the benefits of API consumption.

8. What disclosures should be made available to students about what services are freely available in government domains versus those that could be offered at a cost by a third party?

A partner tier for the API platform will define the different levels of partners. Trusted, verified and certified partners will get different recommendation levels and access than lesser known services and applications from 3rd parties with less trusted levels of access.

9. If the Department were to use a third-party application to engage with the public on its behalf, how could the Department ensure that the Department follows the protocols of OMB Memorandum 10-23?

Again, the partner tier determines the level of access for each partner, and the protocols of any OMB memorandum can be built in, requiring that all data, APIs and code be open sourced, and that appropriate API access tiers show how data and resources are accessed and put to use.

API service management provides the reporting necessary to support government audits and regulations. Without this level of control on top of an API, auditing just isn't possible in the scalable way that APIs plus web and mobile applications offer.

G. Policy Issues

1. What benefits to consumers or the Department would be realized by opening what is currently a free and single-point service (e.g., the FAFSA) to other entities, including those who may charge fees for freely-available services and processes? What are the potential unintended consequences?

Providing API access to government resources is an efficient and sensible use of taxpayer money, and reflects the mission of all agencies, not just the Department of Education. APIs introduce the agility and flexibility needed to deliver the next generation of government applications and services.

The economy in a digital age will require a real-time partnership between the public sector and the private sector, and APIs are the vehicle for this. Much like it has done for private sector companies like Amazon and Google, APIs will allow the government to create new services and products that serve constituents with the help of the private sector, while also stimulating job growth and other aspects of the economy.

APIs will not be all upside; each program and initiative will have its own policy problems and unintended consequences. One problem that plagues API initiatives is a lack of resources, in the form of money and skilled workers, to make sure efforts are successful. Without proper management, poorly executed APIs can open up huge security holes and introduce privacy concerns at a scale never imagined.

APIs need to be managed properly, with sensible real-time controls for keeping operations in check.

2. How could the Department ensure that access to title IV, HEA student aid programs truly remains free, even amidst the potential development of third-party apps that may charge a fee for assistance in participating in free government programs, products, and services with or without providing legitimate value-added services?

Partner Framework + Service Management = Quality of Service Across Platform

3. What other policy concerns should the Department consider with regard to the potential development of APIs for higher education data and student aid processes at the Department?

I am not a policy or education expert, so I will leave this to others to determine. It is also something that should be built into API operations, and discovered on a program-by-program basis.

4. How would APIs best interact with other systems already in use in student aid processes (e.g., within States)?

The only way you will know is if you do it. The IRS e-file system offers some guidance here, but it isn't a perfect model to follow. We will never know the potential until a platform is stood up and resources are made available. All signs point to APIs opening up a huge amount of interoperability, not just between states and the federal government, but also with cities and counties.

5. How would Department APIs benefit or burden institutions participating in title IV, HEA programs?

If APIs aren't given the proper resources to operate, they can introduce security, privacy and support concerns that would not have been there before. A properly run API initiative will provide support, while an underfunded, understaffed initiative will just further burden institutions.

6. While the Department continues to enhance and refine its own processes and products (e.g., through improvements to FAFSA or the IDR application process), how would third-party efforts using APIs complement or present challenges to these processes?

These two things should not be separate. The internal efforts should be seen as just another partner layer within the API ecosystem. All future services and products developed internally within the Department of Education should use the same API infrastructure developed for partners and the public.

If APIs are not used internally, API efforts will always fail. APIs are not just about providing access to external resources; they are about opening up the Department to think about its resources in an external way that benefits the public and partners, as well as those within the government.


Zapier Looks To Educate Everyone With An Introduction To APIs

API interoperability and reciprocity provider Zapier is looking to get everyone up to speed on the world of APIs by providing an introduction to APIs that is meant for developers who are new to APIs, and easy enough for non-developers to follow.

To help get people up to speed on APIs, Zapier’s introduction has eight chapters covering the big picture:

  • Chapter 1: Introduction
  • Chapter 2: Protocols
  • Chapter 3: Data Formats
  • Chapter 4: Authentication, Part 1
  • Chapter 5: Authentication, Part 2
  • Chapter 6: API Design
  • Chapter 7: Real-Time Communication
  • Chapter 8: Implementation

It is good to see providers like Zapier develop an introductory course for people to learn about APIs. It is in Zapier's interest to help get people up to speed—the more folks who understand that APIs exist, the more likely they are to use Zapier as the platform they depend on to orchestrate their lives in the cloud.

Zapier's move reflects my original vision behind API Evangelist, and my belief that everyone should understand APIs, and I mean everyone! Much like personal finance, not everyone needs to understand the inner workings of APIs, but every individual should understand that they exist, and how they can use services like Zapier to take control of their digital selves.


The Future Of Public Private Sector Partnerships Being Negotiated At The API oAuth Scope Level

A couple of weeks ago I attended a two-day API specification session between the major California utilities Southern California Edison (SCE), San Diego Gas & Electric (SDG&E), and Pacific Gas and Electric (PG&E), organized by Hypertek Inc. for the National Institute of Standards and Technology (NIST), which is looking to push forward the Green Button data and API agenda of the White House and Department of Energy.

The Green Button API and open data model already exist, but the current hurdle for the initiative is to get leading utilities to agree to specific implementation definitions that serve their customers, so it can be ratified as a standard. This entire two-day session was dedicated to walking through, line by line, and establishing the oAuth scope that each of the three utility companies would implement when providing Green Button data via the API to 3rd-party developers who are building solutions for utility customers.

An example of this in the wild: if a utility customer wanted a quote for a rooftop solar installation, the installer could use a software solution developed by a 3rd-party vendor that pulls the customer's energy usage via the Green Button API and generates a precise quote for the number of solar panels needed to cover their energy usage. This is just one example of how energy data could be used, but a very powerful and relevant one in 2014, one that gives customers access to, and control over, their data.

Before we even get there, the Green Button API definition, data model, and oAuth scope have to be finalized with utility providers before they can be submitted as a standard—with the final oAuth scope being what each 3rd-party developer agrees to when integrating with each utility's Green Button API. This oAuth scope sets the tone for the conversation that 3rd-party software providers and vendors can ultimately have with utility customers around their private utility data.

In my opinion, this is one potential model for how industry operations will be negotiated in the future. Not all API definitions, data models and oAuth scopes will be defined by the government and then negotiated with industries, but oAuth will be where this scope is negotiated between industry leaders and governments, no matter which side leads the conversation. Governments and industries working together to define API standards is nothing new; what is new is the presence of the oAuth layer, which gives end-users and citizens an active role. 3rd-party developers and end-users do not get a place in the negotiations, but they do get a vote when it comes to putting valuable industry data to use in new ways, or not, and that will flow back upstream and influence future iterations of APIs, data models and oAuth scopes.
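As a loose illustration of how a ratified oAuth scope can act as the contract between industry, government, and end-users, assuming invented scope names (not the actual Green Button scope):

```python
# Sketch of an agreed oAuth scope acting as an industry-wide contract:
# each utility implements the same ratified scope list, and a 3rd-party
# request is honored only for scopes that are both ratified by the
# standard AND granted by the customer. Scope names are invented.

RATIFIED_SCOPES = {"usage:read", "billing-summary:read"}

def allowed(requested_scopes, customer_granted):
    """A request succeeds only for scopes ratified AND customer-approved."""
    requested = set(requested_scopes)
    return requested <= RATIFIED_SCOPES and requested <= set(customer_granted)

# A solar installer's quoting tool requesting a customer's usage history:
assert allowed(["usage:read"], ["usage:read", "billing-summary:read"])
assert not allowed(["account-details:read"], ["usage:read"])  # never ratified
assert not allowed(["billing-summary:read"], ["usage:read"])  # not granted
```

The ratified list is what the utilities and NIST negotiate; the granted list is what each customer decides, which is how end-users get a seat at the table.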

This introduction of the oAuth layer into how industries operate, potentially coupled with the access and innovation that can occur via API platforms when they are executed in alignment with modern API implementations, will change how industries grow and ultimately how power flows (pun intended). This evolution in how the public and private sector partner will not always be positive by default. I'm not an API solutionist who believes APIs bless everything as good, but with the right players at the table, a little sunlight and transparency in the process, and a feedback loop that includes 3rd-party developers and end-users, we might be able to implement meaningful change across many different industries around the globe.


APIs And oAuth In A Future Where You Control All Your Data

Open Authorization, or oAuth, is a standard used by API providers to identify who is accessing data made available via APIs, in a way that allows for granular access to this data, and empowers the end-users who created and own the data with some control in the process.

As a Facebook user, you get to decide who has access to your Facebook content via the Facebook API. If a mobile application wants access to your Facebook account, using oAuth you can give the developer of the application access, and decide exactly what data they can have.

If done right, API access secured using oAuth can give platforms like Facebook control over their platform's data, while also opening it up for developers to build new applications on top of, and giving the right amount of control and decision making to the platform's end-users—a seemingly perfect balance!

In reality, oAuth gets used only for identification, not fully realizing its potential for granular control over data, and rarely gives developers and end-users the tooling and control that would truly bring balance to operations. There are some really good uses of oAuth out there, but there are also some really bad uses, and obviously not enough use of oAuth overall.

While I'm not delusional enough to think oAuth is a perfect solution, it holds one potential blueprint for what the future could look like. The reason our digital landscape looks like it does right now, with NSA spying and tech companies taking in billions, is that things are not balanced, all the way up the data food chain.

End-Users
The users of applications like Facebook, Instagram, and other leading online platforms are not at all educated about the value they generate each day, and are given very little control over what they generate. The average user generates data, is completely unaware of its worth, and has no ability to access, manage, or own that data.

Developers
When building web and mobile applications, developers are not always given the benefit of oAuth access, and are instead required to play mediator between users and the platform, knowing users' login credentials, and having full access to their data. When oAuth is available, developers are required to employ it as part of development, but are given very little oAuth education, few best practices, and none of the granular level control oAuth affords, distilling oAuth down to just authentication, with access control left out entirely.
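One way to see where authentication ends and access control begins is to compare what an application asked for with what it was actually granted. A small sketch, with scope names that are purely hypothetical (real providers each name their scopes differently):

```python
def granted_scope_report(requested, granted):
    """Compare the scopes an application requested with the scopes that
    were actually granted, and flag tokens that only prove identity."""
    requested, granted = set(requested), set(granted)
    return {
        "granted": sorted(requested & granted),
        "denied": sorted(requested - granted),
        # Hypothetical identity-only scopes; a token limited to these
        # authenticates the user but grants no access to their content.
        "identity_only": granted <= {"openid", "profile", "email"},
    }

# An app asked to read posts, but the user only granted basic identity.
report = granted_scope_report(["email", "read_posts"], ["email"])
```

An application that inspects the grant this way can degrade gracefully when a user narrows access, rather than assuming authentication means full access.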

Platforms
Online platforms are not using oAuth to its fullest, employing it only for basic identity, if they are using it at all. oAuth access and flows are rarely master planned, considering all 3 legs of the oAuth flow, leaving a platform that tilts in the provider's favor. Many of these platforms' business models depend on keeping end-users generating valuable content with as little control and ownership as possible, and on developers building applications with as little education and skin in the game as possible, treating user generated content as purely the intellectual property of the platform.

That represents the 3 legs of oAuth as we know it today. I paint a somewhat grim picture to showcase the systemic illness that exists in a space that is often painted as all positive by tech companies and their investors. What is more worrisome is that, right behind the curtain, there are other players lurking who don't even play by the oAuth rules, even when they are in place.

Big Data
Right past your field of vision, beyond the horizon of your Facebook home page, is a growing wave of big data opportunists. All the data you generate on Facebook, your friends, your wall posts, the latitude and longitude given by your mobile phone, is gathered and used to generate revenue for these platforms. Either a platform has various big data projects of its own, or it works with external partners to extract revenue from the data you generate.

Government
Beyond the next mountain range in this new digital landscape, beyond where you can see, our government is pulling data across the Internet, introducing another actor that doesn't play by the oAuth rules, even when they are in place. Post-Snowden, we all now know that the NSA is using all of our online data, to protect us, right?

None of these big data or government projects show up as a blip on oAuth, and it is unlikely that they ever will. First, not all systems use APIs; many rely on traditional network and database connections to exchange and move data around. Second, even where APIs and oAuth are in place, platforms aren't going to give up their core revenue generator, and the government doesn't want to give away what it is doing, because it is all hush-hush secret.

Let’s try looking at this differently. Let’s play the record above, backwards. What does this look like, in a utopian API-oAuth land?

Government
Government at all levels would be required to use APIs for accessing online content, leaving entries in an oAuth registry for all access, recording what was accessed. Of course, if access is part of an ongoing investigation it wouldn't show up immediately, but always with the understanding that at some point a user would be able to see who accessed their information, and what they looked at. As soon as a citizen gets online, they begin interacting with their government to get healthcare and education, and ultimately become a taxpaying citizen, and trust is essential to that lifecycle.
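No provider offers anything like this registry today, but the shape of an entry is easy to imagine. A hypothetical sketch, with field names of my own invention:

```python
from datetime import datetime, timedelta, timezone

def registry_entry(accessor, resource, purpose, disclose_after_days=90):
    """One hypothetical entry in an oAuth access registry: every access by an
    outside party is logged, with a date after which the end-user can see it,
    so even access tied to an ongoing investigation eventually surfaces."""
    now = datetime.now(timezone.utc)
    return {
        "accessor": accessor,                  # who made the request
        "resource": resource,                  # what was accessed
        "purpose": purpose,                    # stated reason for access
        "accessed_at": now.isoformat(),
        # The delayed-disclosure window balances investigations with the
        # end-user's eventual right to an audit trail.
        "visible_to_user_after": (now + timedelta(days=disclose_after_days)).isoformat(),
    }

# Accessor name and resource path are invented for illustration.
entry = registry_entry("some-agency", "/users/42/messages", "records request")
```

The key design choice is the disclosure date: access is never secret forever, only deferred.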

Big Data
Projects of all shapes and sizes would have to register their usage of data, even if it is anonymized and/or used in aggregate. If my social profile is part of your big data study, I should know, and have the opportunity to opt in or out. Hell, I'm playing this record backwards in my utopia; I want all users to get a piece of the action. There are plenty of co-op models to choose from. You generate data for a report that gets sold for 10K? 50% of that revenue should be divided up and paid out to all participants who shared data. (Oh yes, Silicon Valley will love this idea.)

Platforms
I am a big fan of Twitter, Flickr, Evernote, and the other online platforms I depend on and generate data through daily. I'm all for them making money through their innovation, algorithms, and secret sauces, but I also want them to acknowledge where a portion of the value comes from. Imagine if all online platforms had data portability and APIs by default. Imagine if they all didn't just have oAuth, but designed and deployed oAuth with developers and end-users in mind. A world where oAuth is the default for any online platform, not because they are told to, but because it helps them meet their mission and the company bottom line, with a nod towards the greater good.

Developers
oAuth can be hard, and developers need to be brought up to speed, educated, and provided with the resources they need to properly implement oAuth on behalf of a platform. So much end-user education can be achieved by empowering developers to implement properly, using intuitive oAuth flows and consistent, granular level access to resources. This type of developer empowerment will not just deliver value to end-users, it will deliver value to the core platform, driving the value and revenue generation that will keep things growing in a healthy way.

End-Users
Whether it's sharing a random thought on Facebook or applying for student financial aid from the Department of Education, end-users should be educated about the platforms they use, with an understanding of what data is generated in the process, and given a say in who has access to this data and how they can use it. As more of our lives move online and are managed via our mobile phones, the need for APIs with oAuth to deliver valuable data, content, and other resources has grown exponentially. To make this sustainable, end-users need to be educated, and platforms will benefit from it.

We have to begin to look at content and data access online in this new API economy in the same ways we look at the legacy aspects of the economy. Markets don't work if we don't have a large base of educated investors making trades and investments. The financial world ceases to operate if consumers aren't literate in how to manage their finances, make investments, and generate savings.

APIs are touching all aspects of our economy, from healthcare and education to newer sectors like cloud computing and the Internet of Things (IoT). While this story may seem crazy to many technologists and people who are heavily invested in the current Silicon Valley paradigm, it shows two possible futures that are within our grasp. I don't know about you, but after a taste of what the NSA and Silicon Valley have dished up, I like some of what's on the menu, but most of it is shit that I'm not interested in being fed.

I don’t mind companies making money off my use of their platforms (paid or free), but acknowledge my role in the algorithm that generates your revenue. If I’m the product, give me some revenue share and a say in the process. If you don’t, just as you were the disruptive innovator, so will you be disrupted. You can’t extract value out of a platform without generating and giving value back. Shit moves too fast on the Internet in 2014; you need to balance things out with your developers, as well as your end-users.

I don’t mind the government keeping track of what its citizens are doing, but acknowledge us as citizens. We do need to be protected, but we also need to be responsible for the role we play in society. Leave a trail of what you are doing so it can be audited by your watchers, whoever they are, and be accountable. Last time I checked, this was how our country operates. Stop being such douche bags and treating us like children. If you are pulling data, at some point we should know, as a country and as individuals, and you the government should know you are never above this, and will at some point have to answer for what you are up to.

We have the technology to make all of this possible, we just have to have the will. In all of this I do not see APIs or oAuth as the solution; I see the will of governments, enterprises, startups, developers, and end-users as the solution. This isn’t technical solutionism, it’s human solutionism. We just need to position technology in the right way, augment the best in all of us, work together, and we will find a balance that can work.


Details About The 52 Online Services I Depend On

I went through all the online services I use, made sure all of them are listed, and made sure I understood more about why and how I use each one. So far I have 52 services I depend on, providing a pretty good map of my online domain.

3Scale

  • Why do I use this service? API Management
  • What content do I generate via this service? user, access and traffic logs
  • Do I pay for this service? - Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes via email
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? No
  • Do I currently integrate with this services API? - No

about.me

  • Why do I use this service? - Profile Page
  • What content do I generate via this service? None
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes - http://about.me/developer/sdk/docs/
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Alchemy API

  • Why do I use this service? - Text Extraction
  • What content do I generate via this service? None
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes 
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Amazon Web Services (AWS)

  • Why do I use this service? - Central hosting and storage
  • What content do I generate via this service? None
  • Do I pay for this service? Yes
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Angellist

  • Why do I use this service? - Business Profile
  • What content do I generate via this service? None
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes  https://angel.co/api
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Anypoint Platform

  • Why do I use this service? API Management
  • What content do I generate via this service? APIs
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Apiary.io

  • Why do I use this service? API Design
  • What content do I generate via this service? APIs
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

AT&T

  • Why do I use this service? - Mobile Phone
  • What content do I generate via this service? None
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Crunchbase

  • Why do I use this service? - Business Profile
  • What content do I generate via this service? Profile
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Disqus

  • Why do I use this service? - Commenting
  • What content do I generate via this service? Comments
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Dropbox

  • Why do I use this service? - Storage
  • What content do I generate via this service? Files
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - Yes
  • Do I currently integrate with this services API? - No

Drupal

  • Why do I use this service? - Nothing
  • What content do I generate via this service? None
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Dwolla

  • Why do I use this service? - Payments
  • What content do I generate via this service? Payments
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

DZone

  • Why do I use this service? - Blog Syndication
  • What content do I generate via this service? Blog Syndication
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

EventBrite

  • Why do I use this service? - Event Management
  • What content do I generate via this service? Events
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Evernote

  • Why do I use this service? - Notetaking
  • What content do I generate via this service? Notes
  • Do I pay for this service? Yes
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - Yes
  • Do I currently integrate with this services API? - No

Facebook

  • Why do I use this service? - Social
  • What content do I generate via this service? Messages, Photos, Videos, Friends
  • Do I pay for this service? No
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Flickr (Yahoo)

  • Why do I use this service? - Manage photos
  • What content do I generate via this service? Photos
  • Do I pay for this service? No
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Foursquare

  • Why do I use this service? - Track my locations
  • What content do I generate via this service? Checkins, Photos
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

FullContact

  • Why do I use this service? - Contact Profiling
  • What content do I generate via this service? Profile
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Geeklist

  • Why do I use this service? - Developer Profile
  • What content do I generate via this service? Profile
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Github

  • Why do I use this service? - Manage all projects
  • What content do I generate via this service? Websites, Code
  • Do I pay for this service? Yes
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Gliffy

  • Why do I use this service? - Diagramming
  • What content do I generate via this service? Diagrams
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

GoDaddy

  • Why do I use this service? - Domain Management
  • What content do I generate via this service? Domains
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - Yes
  • Do I currently integrate with this services API? - No

Google

  • Why do I use this service? - Primary account
  • What content do I generate via this service? Email, Contacts, Calendar, Documents
  • Do I pay for this service? No
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - Yes
  • Do I currently integrate with this services API? - Yes

Hacker News

  • Why do I use this service? - News syndication
  • What content do I generate via this service? Bookmarks
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Hover

  • Why do I use this service? - Domain Management
  • What content do I generate via this service? Domains
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

IFTTT

  • Why do I use this service? - Automation
  • What content do I generate via this service? Jobs
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Instapaper

  • Why do I use this service? - Reading service
  • What content do I generate via this service? Bookmarks
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Klout

  • Why do I use this service? - Social Ranking
  • What content do I generate via this service? None
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Lanyrd

  • Why do I use this service? - Event discovery
  • What content do I generate via this service? Events
  • Do I pay for this service? No
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

LinkedIn

  • Why do I use this service? - Social
  • What content do I generate via this service? Messaging, Links
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Mashape

  • Why do I use this service? - API Management
  • What content do I generate via this service? API Profiles
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Meetup

  • Why do I use this service? - Event Discovery
  • What content do I generate via this service? Events
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Mega

  • Why do I use this service? - File Storage
  • What content do I generate via this service? Files
  • Do I pay for this service? No
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Noun Project

  • Why do I use this service? - Image Discovery
  • What content do I generate via this service? Images
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Paypal

  • Why do I use this service? - Payments
  • What content do I generate via this service? Payments
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Pinboard

  • Why do I use this service? - Bookmarking
  • What content do I generate via this service? Bookmarks
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Plancast

  • Why do I use this service? - Event discovery
  • What content do I generate via this service? Events
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Quora

  • Why do I use this service? - QA
  • What content do I generate via this service? Questions, Answers
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Reddit

  • Why do I use this service? - Bookmarking
  • What content do I generate via this service? Bookmarks
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Serve

  • Why do I use this service? - Payments
  • What content do I generate via this service? Payments
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Soundcloud

  • Why do I use this service? - Audio Discovery
  • What content do I generate via this service? Audio Files
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Square

  • Why do I use this service? - Payments
  • What content do I generate via this service? Payments
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

Stack Overflow

  • Why do I use this service? - QA
  • What content do I generate via this service? Question, Answers
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

StumbleUpon

  • Why do I use this service? - Bookmarks
  • What content do I generate via this service? Bookmarks
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Thingiverse

  • Why do I use this service? - 3D Printing
  • What content do I generate via this service? 3D Designs
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Tumblr

  • Why do I use this service? - Blogging
  • What content do I generate via this service? Blog
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - Yes
  • Do I currently integrate with this services API? - No

Twitter

  • Why do I use this service? - Tweeting
  • What content do I generate via this service? Tweets, Friends
  • Do I pay for this service? No
  • Does this service provide data portability? - Yes
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - Yes

VectorStock

  • Why do I use this service? - Stock Images
  • What content do I generate via this service? Images
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - No
  • Does this service have an API? - No
  • Does this service offer oAuth? - No
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this services API? - No

Yahoo

  • Why do I use this service? - Profile
  • What content do I generate via this service? Profile
  • Do I pay for this service? No
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this service's API? - No

Zapier

  • Why do I use this service? - Automation
  • What content do I generate via this service? Jobs
  • Do I pay for this service? Yes
  • Does this service provide data portability? - No
  • Can I terminate my use of this service? - Yes
  • Does this service have an API? - Yes
  • Does this service offer oAuth? - Yes
  • Does this service offer 2 Factor Authentication? - No
  • Do I currently integrate with this service's API? - Yes

I think I will need to build some sort of tracking system for the services I use. Something that runs on Github, and can be forked by anyone, and made public or private. 
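As a starting point, that tracking system could be little more than a machine-readable version of the checklist, kept in a repo and queried with a few lines of code. A minimal sketch, assuming nothing beyond the checklist itself (the field names are my own; the two example entries are copied from the profiles above):

```python
# A machine-readable version of the service profile checklist above.
# Each service maps the checklist questions to simple values.

REQUIRED_FIELDS = [
    "why",               # Why do I use this service?
    "content",           # What content do I generate via this service?
    "paid",              # Do I pay for this service?
    "data_portability",  # Does this service provide data portability?
    "can_terminate",     # Can I terminate my use of this service?
    "has_api",           # Does this service have an API?
    "offers_oauth",      # Does this service offer oAuth?
    "offers_2fa",        # Does this service offer 2 Factor Authentication?
    "integrated",        # Do I currently integrate with this service's API?
]

services = {
    "Twitter": {
        "why": "Tweeting", "content": "Tweets, Friends", "paid": False,
        "data_portability": True, "can_terminate": True, "has_api": True,
        "offers_oauth": True, "offers_2fa": False, "integrated": True,
    },
    "Zapier": {
        "why": "Automation", "content": "Jobs", "paid": True,
        "data_portability": False, "can_terminate": True, "has_api": True,
        "offers_oauth": True, "offers_2fa": False, "integrated": True,
    },
}

def missing_fields(profile):
    """Return any checklist questions a profile has not answered."""
    return [f for f in REQUIRED_FIELDS if f not in profile]

# Example query: which tracked services do not offer 2FA?
no_2fa = [name for name, p in services.items() if not p["offers_2fa"]]
```

Keeping the profiles as JSON or YAML in a Github repo would make them forkable, and small scripts like the query above can flag services that fail whatever bar gets defined.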

I'll update this as I add new ones. I know there are more services I use, but it is hard to remember all of them.


Reclaim Profile For Each Service I Depend On

As I go through each of the 50+ services I depend on and change my passwords this weekend, I want to apply a little critical reclaim-my-domain thinking to each service as I pass through. I'm not so naive as to think I will be able to reclaim 100% of my domain, but I should have a defined bar for what I expect from each provider in the area of domain management.

What is the minimum I need to know about each online service I depend on?

  • Why do I use this service?
  • What content do I generate via this service? (ie. Messages, Images, Videos)
  • Do I pay for this service?
  • Does this service provide data portability? (Download of settings and content)
  • Can I terminate my use of this service?
  • Does this service have an API?
  • Does this service offer oAuth?
  • Does this service offer 2 Factor Authentication?
  • Do I currently integrate with this service's API?

That will do for now. Eventually I'm sure I will have more questions I need to ask, but for this round I am just looking to generate a profile for each service and identify whether or not they have an API.

By establishing a Reclaim Profile for each online service, I will have mapped out my online domain--where I exist online, and potentially where I generate content and other information I may want to reclaim.


API Definitions: How Do They Model REST?

Last week at #APIStrat Amsterdam, I moderated and presented in a session called API Service Descriptions. I gave the talk for the first 15 minutes, then Sumit Sharma (@sumitcan), Ole Lensmar (@olensmar), and Ruben Verborgh (@RubenVerborgh) followed me--the full video is on YouTube if you are interested.

Over the last couple of months I've been doing a deeper dive into the area of API design, with a specific look at the API definition formats API Blueprint, RAML and Swagger, so the session was intended to help me continue the conversation, in person, on the stage at #APIStrat Amsterdam. I'm happy I did, because Ole came to the table with some valuable data on API definitions that saved me some valuable research hours.

I'm breaking up his work into several smaller posts (you can find his full deck on SlideShare). Next up, after API Definitions: What Is Behind The Name?, is a side-by-side comparison of how API Blueprint, RAML and Swagger each model REST:

 

| | API-Blueprint | RAML | Swagger |
|---|---|---|---|
| Resources | X | X | X ("api") |
| Methods/Actions | X ("action") | X ("method") | X ("operation") |
| Query Parameters | X | X | X |
| Path/URL Parameters | X | X | X |
| Header Parameters | X | X | X |
| Representations (status codes, mime-types) | X | X | X |
| Documentation | X | X | X |
| Authentication | | Basic, Digest, OAuth 1&2, (*) | Basic, API-Key, OAuth 2 |
| Representation Metadata | &lt;any&gt; (inline) | &lt;any&gt; (inline/external) | JSON Schema (subset) |
| Nested Resources | X | X | |
| Composition/Inheritance | Resource Models | Traits, Resource Types | |
| File inclusions | | X | |
| API Version metadata | | X | X |
| Sample Representations | X | X | |

Ole provides a nice overview of the three leading API definition formats, giving API providers a good side-by-side summary that can be used when deciding which format to support. I will work with Ole to help keep the details up to date, and include them in my final API design research white paper when it is finished.
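To make the comparison concrete, here is a minimal RAML 0.8 sketch of a single resource, with comments mapping each element to the terminology used by the other two formats (the /songs resource and its fields are purely illustrative, not from any real API):

```yaml
#%RAML 0.8
title: Example Music API
version: v1                # the "API Version metadata" row
/songs:                    # a resource ("api" in Swagger terms)
  get:                     # a method ("action" in API Blueprint,
                           # "operation" in Swagger)
    queryParameters:       # the "Query Parameters" row
      genre:
        type: string
    responses:             # "Representations": status codes...
      200:
        body:
          application/json:              # ...and mime-types
            example: '{ "songs": [] }'   # a "Sample Representation"
```

The same resource would be a Markdown section in API Blueprint and a JSON structure in Swagger; the vocabulary differs, but the comparison shows the underlying REST model is largely the same.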

Thank you to Ole Lensmar (@olensmar) and SmartBear Software for doing this research, and allowing me to share it with you.


I Am Up To 34 APIs Out Of The Netherlands

As we gear up for API Strategy & Practice in Amsterdam, March 24-26th, I’m spending my time getting to know API companies across Europe. I published a listing of APIs I found from the United Kingdom a few days ago, and next up is taking a look at what APIs are coming out of the Netherlands.

Last week I did a roundup of 17 APIs out of the Netherlands, but thanks to Twitter, and specifically Gerard van Enk (@gvenk), who was curating a spreadsheet of APIs, I was able to double the number of APIs I'm tracking on out of the country.


Arts Holland

Arts Holland is a consortium of three leading players in the field of arts, culture, new media and tourism. The institutes that are part of this consortium are the Netherlands Board of Tourism & Conventions, Netherlands Uitburo and Waag Society. Together with organizations in the field of arts, culture, communication, transport, creative industry and technology, a series of tools will be developed that will guide any arts lover through the high-brow cultural landscape Holland has to offer its visitors. The Arts Holland data platform, developer's site and SPARQL tutorial are created and hosted by Waag Society.


BOL.com

The bol.com Open API gives you access to the complete range of the biggest online store in the Netherlands. Use bol.com data in your own concept or application, and earn money through the bol.com affiliate program.


CitySDK

The CitySDK Linked Open Data Distribution API is a linked data distribution platform. Developed by Waag Society, the distribution API is a component of the CitySDK toolkit. This toolkit supports the development of open and interoperable interfaces for open data and city services in eight European cities (Amsterdam, Helsinki, Manchester, Lisbon, Istanbul, Lamia, Rome and Barcelona).


Cloudspeakers

The Cloudspeakers API is a REST API and makes it possible for developers to access the data of Cloudspeakers. Cloudspeakers tries to match all found reviews, audio and video files to the MusicBrainz database. MusicBrainz is a community music metadatabase that attempts to create a comprehensive music information site.


Democracy API

The Democracy API enables developers to interact with the Democracy web site programmatically. It's designed to make it possible for anyone to improve Democracy or integrate Democracy into other applications. You can develop a Democracy interface for a mobile phone, build a Democracy widget for your blog, or develop an application that makes it easy to post photos to your feed from your iPhone.


Distimo

Distimo has a very clear objective: to make the app market transparent. The company was born out of the frustration of a lack of insights into the performance of apps and the manual work needed to track important metrics. Our goal is to provide the best and most actionable app intelligence for anyone who wants to compete in the app market. Our data-driven team seeks to help developers, brands and financial services companies gain actionable, timely and factual knowledge of what’s happening daily in the global app market.


Drillster

The Drillster API enables any developer to write applications that interact with Drillster. The API is based on the principles of REST, and comes in both an XML and a JSON flavor. Authentication is taken care of by the OAuth 2.0 protocol. Non-commercial use is free of charge, but commercial use requires prior arrangement.


Dutch Parliament

Digitization of collections creates new opportunities for cooperation. Are you interested in offering collections of the KB through your own online channels? Do you want your digital collection to enrich our datasets?


Dutch Schools

The educational system in the Netherlands is complex, and many public bodies are tasked with keeping schools and students on the right track. However, data regarding this task is made available to the general public in a way that is not easily processable, which makes checks by the public (and press) more difficult. The OpenOnderwijs API alleviates this problem by providing a unified interface to data on education collected from several institutions with different responsibilities with regard to the Dutch educational system, such as DUO, Onderwijsinspectie and Vensters voor Verantwoording. The OpenOnderwijs API is built on open source technology, such as Scrapy for scraping school data (the scrapers can be found at Github), Sphinx for the documentation, and ElasticSearch for efficient data storage and retrieval. Also, all school addresses are geocoded using the BAG42 service, a Dutch initiative aiming to integrate high quality, official data sources with “regular” geocoding.


ElasticSearch

Elasticsearch is on a mission to organize data and make it easily accessible. The company delivers the world's most advanced open source search and analytics engine, making real-time data exploration available to anyone. By having a laser focus on achieving the best user experience imaginable, Elasticsearch has become one of the most popular and rapidly growing open source solutions in the market. Today, Elasticsearch is used by thousands of enterprises in virtually every industry. We take good care of our customers and users, providing production support, development support and training worldwide.


Geosophic

Geosophic is a gaming platform that allows you to get behavior analytics from your players while offering them new engagement triggers in the form of geolocated leaderboards. Geosophic is your ‘neighbourhood arcade’ of the mobile age. Our location based leaderboards allow gamers to show off that they’re the best in their cities, their countries, and the whole world. They also include customized game recommendations (performance advertising), which creates a new monetization channel for developers.


Ikdoe API

The Ikdoe API is a RESTful API. Since ikdoe is all about activities, the only resource we provide through this API is the activity resource. At the moment we only support XML as a response format, but we might support additional formats in the future.


Incubate

Incubate has launched the Incubate API to make all festival-data available for use in apps, websites or even art created by Incubate-minded developers. The Incubate API is an Ally API, a REST oriented JSON Web Service Interface to communicate with the Ally database. Information about artists, time schedules, venues or the latest festival news can be easily collected using the API.


Layar

Layar B.V. designs and develops mobile phone applications. It offers Layar, a mobile augmented reality browser that provides information on top of the camera display view in various categories, including eating and drinking, entertainment and leisure, games, government, health care, local search and directory services, real estate, retail, schools and universities, social networks and communities, tourism, transportation, and weather.


Mobypicture

Share your adventures instantly with your friends. Mobypicture enables users to share photos, video, text and audio directly to their friends on their favorite social sites like Flickr, Facebook, YouTube, Wordpress, Twitter and many more. Mobypicture supports groups, geolocation and a lot of extra features.


MoneyBird

MoneyBird is an online service providing fast and easy billing. You can create, manage and send invoices, but also manage your contacts, send recurring invoices and manage your expenses. PayPal integration available to enable your contacts to pay even faster!


MovieMeter

The MovieMeter API is an XML-RPC web service which you can use to retrieve film information in Dutch. This API is not intended to provide a complete set of MovieMeter functionalities; for instance, the API doesn't provide operations to log in, vote or place new messages. You are allowed to create a website or application which has a primary goal of showing MovieMeter information. However, you must make it clear that while your application uses our information, it is not made by MovieMeter. Also, don't name your application "MovieMeter" or something similar. The application or website must be free of charge and ads. If you don't comply with these rules, your access to the API could be revoked.


Nimbuzz

Nimbuzz is the free call and messaging app for the connected generation. Nimbuzz combines the powers of the Internet and mobile communications into one, and lets you meet, share and connect with family and friends on any mobile device. Nimbuzz users enjoy the freedom of communicating with friends between any internet enabled device, from mobile to mobile, mobile to PC/Mac and vice versa, harnessing the power of the Internet (Wi-Fi, 3G, 2G, GPRS). With its mobile, Web, Wap and desktop clients, Nimbuzz is available on thousands of the world's most popular devices across all major platforms -- Nokia Symbian, iPhone, iPod touch, Android, BlackBerry, J2ME, as well as Windows and Mac desktop computers. Social networks & communities supported by Nimbuzz include Facebook®, GoogleTalk, Twitter and Yahoo!.


NS API

NS has a large amount of data with information on planned and actual schedules. We make this information available to developers with a RESTful API. The NS API currently features the following services: prices, live departures, faults and activities, the station list of all stations in the Netherlands including geodata, and travel advice from station to station.


OIPA

OIPA is a framework that provides a rich and usable API for parsing, ingesting, storing and searching IATI standard compliant datasets.


Open Images

Open Images is an open media platform that offers online access to audiovisual archive material to stimulate creative reuse.


Openkvk

openkvk.nl is a database created from a collection of data from various sources, including kvk.nl, belastingdienst.nl and rechtspraak.nl. These sources are public and subject to the applicable legal publication conditions. To integrate the data from these sources into this system and make it searchable, we run daily maintenance queries, and quarterly we refresh the sources completely.


PeerReach

Peerreach is an Amsterdam-based social media startup that provides an influence metric measuring your influence within different areas of expertise. The Peerreach algorithm is similar to the PageRank algorithm that Google uses to identify influential websites.


Postcode API

Dutch postal codes, addresses and locations have been public data since early 2012. We believe that public data should be accessible via accepted standards--really open--and that everyone should have access to that data and be able to use it. The Postcode API gives you free and easy access to the postcode database of the Land Registry.


React.com

React.com offers web services to connect your web site or mobile application with a continuously growing broad selection of social networks. For quick integration we offer hosted services which will allow you to directly use social network functionality by just adding HTML to your website. No need to ask your users for passwords and profiles, let them register and log in with their social network accounts through social sign-in integration.


ReadSpeaker

ReadSpeaker speech-enables online content on the fly in 35+ languages and 100+ voices. In 1999, ReadSpeaker pioneered the first-ever speech-enabling application for websites. Today, the company provides a portfolio of web-based text-to-speech solutions for websites, mobile sites, mobile apps, RSS feeds, online documents and forms, as well as online campaigns. Its solutions are used by over 5000 corporate, media, government, and nonprofit customers around the world.


Rijksmuseum API

The Rijksmuseum API (Application Programming Interface) is a new, state-of-the-art service for application developers. The API makes the Rijksmuseum collection and other content, including (high resolution) images, available for use in, for example, apps or web applications.


Springest

Springest is a Dutch continuing education service that helps users find and compare courses, books, articles, training courses using search and filter capabilities. The Springest API opens up all of the searchable data from Springest to users. Users can query for training guides, courses, and other content as well as integrate it into programs. The service uses REST calls and returns XML or JSON.


The Wrds API

With the Wrds API you can link your own program to Wrds. Your program can then add word lists to Wrds, and view, edit and remove them. For example, if you want an MSN bot that lets you hear derogatory words said about you, you can use this API.


Total Film

Total Film provides developers free access to its movies-on-TV database through an API. With it, the complete listing of films on television for today and tomorrow can be obtained. More APIs will be released in the future.


TwitterCounter

 

TwitterCounter is Feedburner for Twitter.

TwitterCounter tracks Twitter users and displays attractive stats to everybody interested. Bloggers can add a simple button displaying the number of Followers they have on Twitter.

Thousands of bloggers currently display the TwitterCounter button, which links to their personal stats page at TwitterCounter.com. All these buttons drive traffic to TwitterCounter, which shows simple text ads next to the Twitter stats. This creates a positive feedback loop of increasing members, traffic and ad views.

 


Viewbook

Viewbook exposes an API for 3rd party tools to enhance Viewbook, using the Flickr REST style API for its request and response formats. Viewbook uses Flickr's REST implementation. Requests and responses are the same (with a few minor differences, see below), so you can follow the documentation at Flickr to implement the Viewbook API.


Weather API

This is a very basic JSON API for very basic weather data from the Royal Netherlands Meteorological Institute. The KNMI offers no simple API for simple data, so I built this one, which scrapes their website. Currently, it only offers a single API call, for the latest weather observations at 36 monitoring stations.


Webservices.nl

Webservices.nl has a mission: to raise the data quality of Dutch companies to the highest level. To make this possible, it offers a variety of convenient online data services with which every company can cheaply and simply validate and enrich its data.


What I see when I look through these 34 APIs is a wealth of creative talent exposing their resources as APIs. You have art, museums, images, photos, music and other right-brain activity. 

With multiple Paris API events, and a single API event in Madrid so far, I'm very pleased with the types of API conversations going on in Europe, and getting really excited to see who comes together to talk APIs in Amsterdam. 

I'm sure there are still other Dutch API efforts I'm missing, so if you know of any companies with a significant presence in the Netherlands, doing cool stuff with APIs, make sure to let me know @kinlane.


APIs Coming Out Of The Netherlands

As we gear up for API Strategy & Practice in Amsterdam, March 24-26th, I’m spending my time getting to know API companies across Europe. I published a listing of APIs I found from the United Kingdom a few days ago, and next up is taking a look at what APIs are coming out of the Netherlands.

You tend not to think about what country an API is from, unless it's attached to public infrastructure, or a company is extremely vocal about its home country. I confess that I couldn't name a single API from Amsterdam before this week, but now I notice that Distimo, Drillster, ElasticSearch, Peerreach, and TwitterCounter are all out of the Netherlands, and are APIs I was already familiar with.

I was able to easily find 17 separate APIs out of the Netherlands:


Arts Holland

Arts Holland is a consortium of three leading players in the field of arts, culture, new media and tourism. The institutes that are part of this consortium are the Netherlands Board of Tourism & Conventions, Netherlands Uitburo and Waag Society. Together with organizations in the field of arts, culture, communication, transport, creative industry and technology, a series of tools will be developed that will guide any arts lover through the high-brow cultural landscape Holland has to offer its visitors. The Arts Holland data platform, developer's site and SPARQL tutorial are created and hosted by Waag Society.


BOL.com

The bol.com Open API gives you access to the complete range of the biggest online store in the Netherlands. Use bol.com data in your own concept or application, and earn money through the bol.com affiliate program.


Democracy API

The Democracy API enables developers to interact with the Democracy web site programmatically. It's designed to make it possible for anyone to improve Democracy or integrate Democracy into other applications. You can develop a Democracy interface for a mobile phone, build a Democracy widget for your blog, or develop an application that makes it easy to post photos to your feed from your iPhone.


Distimo

Distimo has a very clear objective: to make the app market transparent. The company was born out of the frustration of a lack of insights into the performance of apps and the manual work needed to track important metrics. Our goal is to provide the best and most actionable app intelligence for anyone who wants to compete in the app market. Our data-driven team seeks to help developers, brands and financial services companies gain actionable, timely and factual knowledge of what’s happening daily in the global app market.


Drillster

The Drillster API enables any developer to write applications that interact with Drillster. The API is based on the principles of REST, and comes in both an XML and a JSON flavor. Authentication is taken care of by the OAuth 2.0 protocol. Non-commercial use is free of charge, but commercial use requires prior arrangement.


Dutch Parliament

Digitization of collections creates new opportunities for cooperation. Are you interested in offering collections of the KB through your own online channels? Do you want your digital collection to enrich our datasets?


Dutch Schools

The educational system in the Netherlands is complex, and many public bodies are tasked with keeping schools and students on the right track. However, data regarding this task is made available to the general public in a way that is not easily processable, which makes checks by the public (and press) more difficult. The OpenOnderwijs API alleviates this problem by providing a unified interface to data on education collected from several institutions with different responsibilities with regard to the Dutch educational system, such as DUO, Onderwijsinspectie and Vensters voor Verantwoording. The OpenOnderwijs API is built on open source technology, such as Scrapy for scraping school data (the scrapers can be found at Github), Sphinx for the documentation, and ElasticSearch for efficient data storage and retrieval. Also, all school addresses are geocoded using the BAG42 service, a Dutch initiative aiming to integrate high quality, official data sources with “regular” geocoding.


ElasticSearch

Elasticsearch is on a mission to organize data and make it easily accessible. The company delivers the world's most advanced open source search and analytics engine, making real-time data exploration available to anyone. By having a laser focus on achieving the best user experience imaginable, Elasticsearch has become one of the most popular and rapidly growing open source solutions in the market. Today, Elasticsearch is used by thousands of enterprises in virtually every industry. We take good care of our customers and users, providing production support, development support and training worldwide.


Geosophic

Geosophic is a gaming platform that allows you to get behavior analytics from your players while offering them new engagement triggers in the form of geolocated leaderboards. Geosophic is your ‘neighbourhood arcade’ of the mobile age. Our location based leaderboards allow gamers to show off that they’re the best in their cities, their countries, and the whole world. They also include customized game recommendations (performance advertising), which creates a new monetization channel for developers.


Nimbuzz

Nimbuzz is the free call and messaging app for the connected generation. Nimbuzz combines the powers of the Internet and mobile communications into one, and lets you meet, share and connect with family and friends on any mobile device. Nimbuzz users enjoy the freedom of communicating with friends between any internet enabled device, from mobile to mobile, mobile to PC/Mac and vice versa, harnessing the power of the Internet (Wi-Fi, 3G, 2G, GPRS). With its mobile, Web, Wap and desktop clients, Nimbuzz is available on thousands of the world's most popular devices across all major platforms -- Nokia Symbian, iPhone, iPod touch, Android, BlackBerry, J2ME, as well as Windows and Mac desktop computers. Social networks & communities supported by Nimbuzz include Facebook®, GoogleTalk, Twitter and Yahoo!.


PeerReach

Peerreach is an Amsterdam-based social media startup that provides an influence metric measuring your influence within different areas of expertise. The Peerreach algorithm is similar to the PageRank algorithm that Google uses to identify influential websites.


Postcode API

Dutch postal codes, addresses and locations have been public data since early 2012. We believe that public data should be accessible via accepted standards--really open--and that everyone should have access to that data and be able to use it. The Postcode API gives you free and easy access to the postcode database of the Land Registry.


ReadSpeaker

ReadSpeaker speech-enables online content on the fly in 35+ languages and 100+ voices. In 1999, ReadSpeaker pioneered the first-ever speech-enabling application for websites. Today, the company provides a portfolio of web-based text-to-speech solutions for websites, mobile sites, mobile apps, RSS feeds, online documents and forms, as well as online campaigns. Its solutions are used by over 5000 corporate, media, government, and nonprofit customers around the world.


Rijksmuseum API

The Rijksmuseum API (Application Programming Interface) is a new, state-of-the-art service for application developers. The API makes the Rijksmuseum collection and other content, including (high resolution) images, available for use in, for example, apps or web applications.


Springest

Springest is a Dutch continuing education service that helps users find and compare courses, books, articles, training courses using search and filter capabilities. The Springest API opens up all of the searchable data from Springest to users. Users can query for training guides, courses, and other content as well as integrate it into programs. The service uses REST calls and returns XML or JSON.


TwitterCounter

 

TwitterCounter is Feedburner for Twitter.

TwitterCounter tracks Twitter users and displays attractive stats to everybody interested. Bloggers can add a simple button displaying the number of Followers they have on Twitter.

Thousands of bloggers currently display the TwitterCounter button, which links to their personal stats page at TwitterCounter.com. All these buttons drive traffic to TwitterCounter, which shows simple text ads next to the Twitter stats. This creates a positive feedback loop of increasing members, traffic and ad views.

 


Weather API

This is a very basic JSON API for very basic weather data from the Royal Netherlands Meteorological Institute. The KNMI offers no simple API for simple data, so I built this one, which scrapes their website. Currently, it only offers a single API call, for the latest weather observations at 36 monitoring stations.


I noticed numerous open data portals as I was doing my research into APIs out of the Netherlands, which I will showcase in a separate post—you can tell there is a serious passion for open data in the country.

I’m sure there are other Dutch API efforts I’m missing, so if you know of any companies with a significant presence in the Netherlands, doing cool stuff with APIs, make sure and let me know @kinlane.


What Are The Common Building Blocks of API Integration?

I started API Evangelist in 2010 to help business leaders better understand not just the technical side, but specifically the business of APIs, helping them be successful in their own API efforts. As part of these efforts I track on what I consider the building blocks of API management. In 2014 I'm also researching what the building blocks are in other areas of the API world, including API design, deployment, discovery and integration.

After taking a quick glance at the fast growing world of API integration tools and services, I've found the following building blocks emerging:

Pain Point Monitoring
Documentation Monitoring - Keeping track of changes to an APIs documentation, alerting you to potential changes in valuable developer API documentation for single or many APIs
Pricing Monitoring - Notifications when an API platform's pricing changes, which might trigger switching services or at least staying in tune with the landscape of what is being offered
Terms of Use Monitoring - Updates when a company changes the terms of service for a particular platform and providing historical versions for comparison
Authentication
oAuth Integration - Provides oAuth integration for developers, to one or many API providers, and potentially offering oAuth listing for API providers
Provider / Key Management - Management of multiple API platform providers, providing a secure interface for managing keys and tokens for common API services
Integration Touch Points
API Debugging - Identifying of API errors and assistance in debugging API integration touch points
API Explorer - Allowing the interactive exploring of API providers registered with the platform, making calls and interacting and capturing API responses
API Feature Testing - The configuring and testing of specific features and configurations, providing precise testing tools for any potential use
API Load Testing - Testing, with added benefit of making sure an API will actually perform under a heavy load
API Monitoring - Actively monitoring registered API endpoints, allowing real-time oversight of important API integrations endpoints that applications depend on
API Request Actions
API Request Automation - Introducing other types of automation for individual, captured API requests like looping, conditional responses, etc.
API Request Capture - Providing the ability to capture an individual API request
API Request Commenting - Adding notes and comments to individual API requests, allowing the cataloging of history, behavior and communication around API request actions
API Request Editor - Allowing the editing of individual API requests
API Request Notifications - Providing a messaging and notification framework around individual API requests events
API Request Playback - Recording and playing back captured API requests so that you can inspect the results
API Request Retry - Enabling the ability to retry a captured API request and play back in current time frame
API Request Scheduling - Allowing the scheduling of any captured API request, by the minute, hour, day, etc.
API Request Sharing - Opening up the ability to share API requests and their results with other users via email, or other means
Other Areas
Analytics - Visual analytics providing insight into individual and bulk API requests and application usage
Code Libraries - Development and support of code libraries that work with single or multiple API providers
Command Line - Providing a command line (CL) interface for developers to interact with APIs
Dashboard - Web based dashboard with analytics, reports and tools that give developers quick access to the most valuable integration information
Gateway - Providing a software gateway for testing, monitoring and production API integration scenarios
Geolocation - Incorporating location data when testing and proxying APIs from potentially multiple locations
Import and Export - Allowing for importing and exporting of configurations of captured and saved API requests, allowing for data portability in testing, monitoring and integration
LocalHost - Opening up of a local web server to a public address, allowing for webhooks and other interactions
Publish - Providing tools for publishing monitoring and alert results to a public site via widget or FTP
Rating - Establishment of a ranking system for APIs, based upon availability, speed, etc.
Real-Time - Adding real-time elements to analytics, messaging and other aspects of API integration
Reports - Common reports on how APIs are being used across multiple applications and user profiles
Teams - Providing a collaborative, team environment where multiple users can test, monitor and debug APIs and application dependencies
Workflow - Allowing for the daisy chaining and connecting of individual API request actions into a series of workflows and jobs
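The workflow building block can be sketched as ordinary function composition, where each request action consumes the result of the previous one. A minimal illustration in Python (the step names and data are made up):

```python
def run_workflow(steps, initial=None):
    """Run each step in order, feeding its result into the next step."""
    result = initial
    for step in steps:
        result = step(result)
    return result

# Hypothetical steps standing in for captured API request actions:
def fetch(_):
    # In a real workflow this would replay a captured API request.
    return [{"id": 1, "ok": True}, {"id": 2, "ok": False}]

def keep_ok(items):
    return [item for item in items if item["ok"]]

def count(items):
    return len(items)

print(run_workflow([fetch, keep_ok, count]))  # → 1
```

A production workflow engine would add scheduling, error handling, and notifications around the same basic chaining pattern.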

What else are you seeing? Which tools and services do you depend on when you are integrating one or many APIs into your applications? What tools and services would you like to see?

I'm looking at the world of API design right now, but once I'm done with that research, I will be diving into API integration again, trying to better understand the key players, tools, services and the building blocks they use to get things done.


An API Evangelist Review Of Your API

Over the last three years I have looked at all of the APIs available in the ProgrammableWeb API directory, about 2,500 of which I monitor regularly. Throughout this process I've evolved an eye for what building blocks go into a successful API program.

When I review an existing API area or program, public or private, I spend as little or as much time as I need to look at an API initiative through the following 20 lenses:

  • Overview - The general look, feel and initial impression of an API at the high level.
  • Endpoints - Review of API endpoints, looking at the detail.
  • On-boarding - How easy is it to get up and going? Where is the friction?
  • Documentation - General API documentation review and critique.
  • Authentication - What is involved with authentication and security.
  • Code - A look at all available code libraries, SDKs, apps and language or platform availability.
  • Mobile - Is there a mobile fit and how does an API address this world.
  • Support - What do direct and indirect support practices look like and what are activity levels.
  • Communications - How are communications handled, and is it done in a transparent way.
  • Change Practices - How are updates, changes, and communication around the roadmap handled.
  • Business Model - What is the business model of an API, how does it make ends meet.
  • Resources - What resources are provided from how-to, case studies to videos and workshops.
  • Research & Development - Does an API reflect a research & development group.
  • Legal Department - Covering the legal areas of terms of use, privacy and branding.
  • Embeddable - How portable and embeddable are aspects of the API? Can it be distributed easily?
  • Environment - A look at the underlying environment of the API, sandboxing, testing, monitoring, etc.
  • Developers - What does this look like for a developer? What tools do we get to ensure success?
  • Consistency - How consistent are API endpoints and the support resources, and the overall operations.
  • Openness - If the API is public, to what degree are API efforts open?
  • Evangelism - What does outreach around the API look like, regarding events, social, storytelling, etc.

You can look across the building blocks I have listed for API management to get a better idea of the detail I'm looking for across API operations. These building blocks have been assembled through over 3 years of reviewing APIs, and can provide a good checklist to use when applying the above lenses.

Each review involves visiting an API area, applying the above 20 lenses, registering, and oftentimes hacking on the API. When done I write up a review report, which in many cases is privately shared with the company and key stakeholders for review and discussion.

In some cases, when approved, I will publicly write up a review, providing a more polished view of an API area based upon the final review report, and publish it on my API Evangelist network of blogs.

The goal with a review is to better understand the balance of technology, business and politics going on within an API ecosystem, providing feedback that will help a company better achieve balance and success with their API initiative.


If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one-person show, I miss quite a bit, and I depend on my network to help me know what is going on.