Database APIs

Blog Posts on Databases and APIs

These are posts from the API Evangelist blog that are focused on databases and APIs, providing a filtered look at my analysis on the topic. I rely on these posts, along with the curated organizations, APIs, and tools, to help paint a picture of what is going on.

Your API Should Reflect A Business Objective Not A Backend System

I'm in the middle of evolving a data schema into a living, breathing API. I just finished generating 130 paths, all with the same names as the schema tables and their fields. It's a natural beginning for any data-centric API. In these situations, it is easy to let the backend system dictate our approach to API design, rather than considering how the API will actually be used.

I'm taking the Human Services Data Specification (HSDS) schema, and generating the 130 create, read, update, and delete (CRUD) API paths I need for the API. This allows the organizations, locations, services, and other details to be managed as part of any human services API in a very database-driven way. This makes sense to my database administrator brain, but as I sit in a room full of implementors I'm reminded that none of this matters if it isn't serving an actual business objective.
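As a rough sketch of what that generation step looks like, here is a minimal Python example (the table names and path layout are illustrative placeholders, not the actual HSDS schema):

```python
# Generate CRUD paths for each table in a schema, mirroring the
# database structure -- the "natural beginning" described above.
# Table names are illustrative stand-ins, not the real HSDS tables.
tables = ["organizations", "locations", "services"]

def crud_paths(table):
    """Return the standard create/read/update/delete paths for a table."""
    return {
        f"/{table}": ["GET", "POST"],                  # list + create
        f"/{table}/{{id}}": ["GET", "PUT", "DELETE"],  # read + update + delete
    }

paths = {}
for table in tables:
    paths.update(crud_paths(table))

print(sorted(paths))
```

It is exactly this mechanical mapping, one path per table, that makes the backend-reflecting design so tempting, and why the business objective has to be layered in afterward.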

If my API endpoints don't allow a help desk technician to properly search for a service, or a website user to browse the possibilities and find what they are looking for, my API means nothing. The CRUD is the easy part. Understanding the many different ways my API paths will (or won't) help someone find the services they need, or assist a human service organization in better reaching their audience, is what the API(s) are all about--not just reflecting the backend system, and then walking away calling the job done.


I Am Keeping My Mind Open And Looking Forward To Learning More About GraphQL

I wrote a post the other day sharing my thoughts about GraphQL seeming like we were avoiding the hard work of API design. Shortly after publishing, Sashko Stubailo (@stubailo) from Apollo, a GraphQL solution provider, wrote a very thoughtful response to my comments and questions about GraphQL. First, I wanted to say that I really dig this approach to responding to other people's blog posts--with a blog post of your own, on your own personal or company domain.

I don't think Sashko has convinced me 100% that GraphQL is the solution we are looking for, but he has convinced me that I should be learning more about it, keeping a closer eye on the technology, and better understand how people are putting it to use.

Regarding my primary question about how GraphQL could benefit non-technical folks and end-users--I would say he answered it 50% of the way:

I’m a frontend developer. GraphQL makes my life easy.

It doesn't touch on whether or not non-technical users will be able to reverse engineer it and put it to work for them, but that's ok for now. One thing Sashko touched on that moves GraphQL closer to being simple enough for non-technical users is that he helped differentiate GraphQL from SQL:

GraphQL is a query language, like SQL? While GraphQL looks like a query language at first, I think its name might be one of the things that gets people off on the wrong foot. GraphQL is not at all like SQL...So GraphQL is a “query language” just like URLs are the “query language” of REST—it’s a contract that describes how to tell the API server what you’re looking for.

I like the separation from "structured" query language, and moving us to something that augments HTTP and the URL, and doesn't just tunnel into the backend database. This has the potential to move REST forward, not drag the database out front for me--which leaves me more hopeful.
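To make that comparison concrete, here is a hypothetical illustration of the same request expressed as a REST URL and as a GraphQL query (the endpoint and field names are made up):

```python
# The same "give me a service's name and phone" request, expressed as
# a REST URL (the "query language" of REST) and as a GraphQL query.
# Endpoint and field names are hypothetical.
rest_url = "https://api.example.com/services/42?fields=name,phone"

graphql_query = """
{
  service(id: 42) {
    name
    phone
  }
}
"""

# Both are contracts telling the API server what you're looking for;
# neither tunnels into the backend database the way raw SQL would.
print(rest_url)
print(graphql_query.strip())
```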

Another area Sashko answered for me was regarding GraphQL seeming like it was too hard:

This is a very real concern whenever a new technology is introduced. Is this going to make stuff more complicated for everyone who isn’t in the know? 

Fair enough. It makes me happy to hear this from a service provider who is leading the charge when it comes to GraphQL. His stance seems pragmatic, and aware of how important it is that GraphQL be accessible to as wide an audience as possible--building on the momentum REST has in this area.

What really pushed away my concern, and got me more interested in paying attention to GraphQL, was when Sashko talked about this just being the beginning:

The most exciting thing to me is that GraphQL has been publicly available for barely more than a year, and already a huge number of people I respect for their technical abilities and design thinking are trying to figure out ways to add it to their architecture. 

Ok. Now I want to see where this goes. I've been tracking GraphQL as a subset of my data API research, but will spend some extra cycles each week keeping an eye on who is doing interesting things with GraphQL. I've added Apollo to my research, and I will work on a roundup of other providers, and any open source tooling I can find out there. I also wanted to thank Sashko for taking the time to answer some of my questions, and respond to my uncertainty around GraphQL. I dig it when API service providers and API providers respond to my storytelling on API Evangelist--it makes for good conversation in the API community.


Create a PHP Script To Generate An OpenAPI Specification From The ClinicalTrials.Gov Database I Created

One of my objectives in importing the ClinicalTrials.gov data is to create an API. The first step in creating an API, before we ever program anything, is to create an OpenAPI Spec to use as a version 1.0 scaffolding for the clinical trials data we now have stored in a MySQL database.

I sure wasn't going to be hand-crafting an OpenAPI Spec for this fairly large data set, so I got to work creating a crude PHP script that would do the heavy lifting for me.

This script loops through all the tables in my clinical trials database and auto-generates the necessary JSON schema for the data structure, combined with an OpenAPI Spec describing the API interface for the clinical trials database. I have the result in a single OpenAPI Spec file, but will most likely be breaking it up to make it easier to work with.
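The original script was PHP against MySQL, but the same loop-over-the-tables approach can be sketched in Python against an in-memory SQLite database (the tables here are simplified stand-ins for the actual clinical trials schema):

```python
import json
import sqlite3

# Sketch of the approach described above: loop through the tables in a
# database and emit an OpenAPI (Swagger 2.0) definition plus JSON schema.
# Uses an in-memory SQLite database with made-up tables for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clinical_study (nct_id TEXT, brief_title TEXT)")
conn.execute("CREATE TABLE sponsors (nct_id TEXT, agency TEXT)")

spec = {
    "swagger": "2.0",
    "info": {"title": "Clinical Trials API", "version": "1.0"},
    "paths": {},
    "definitions": {},
}

tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

for table in tables:
    # PRAGMA table_info returns (cid, name, type, ...) for each column.
    columns = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    # One JSON schema definition per table, one GET path per table.
    spec["definitions"][table] = {
        "type": "object",
        "properties": {col: {"type": "string"} for col in columns},
    }
    spec["paths"][f"/{table}"] = {
        "get": {
            "summary": f"List {table} records",
            "responses": {"200": {"schema": {"$ref": f"#/definitions/{table}"}}},
        }
    }

print(json.dumps(spec, indent=2)[:120])
```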

This OpenAPI Spec for the clinical trials API gives me a base blueprint I can use to generate server side code, client side code, documentation, and other essential building blocks of the API operations I put in place to support accessing the clinical trials API.

I will be adding better descriptions for paths, parameters, schema, and other elements in the future, with this definition acting as the contract for the clinical trials API.


We Need An API For The Chronology of Data Breaches Database

I came across the Privacy Rights Clearinghouse while conducting a search that turned up the chronology of data breaches, which provides details on 4,725 data breaches that have been made public since 2005. The website allows you to search for data breaches by type of breach, type of organization, and the year in which it occurred--some very valuable information.

In 2016, as breaches continue to be commonplace across almost all industries, we are going to need to take the chronology of data breaches up a notch. I would like to see an API made available for this valuable database. As I do, I write stories about what I'd like to see in the space, forward the link to key actors, and tell the story to the public at large, in hopes of inciting something to happen.

Making the data breach information available via API would encourage more storytelling around these events, which could include much more meaningful visualizations using solutions like D3.js. Information about companies could be included in other business search and discovery tooling, and more push notification networks could be set up to keep industry experts informed about what is happening across the sector.
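As a hypothetical illustration of what such an API's search might offer, mirroring the site's existing filters by breach type, organization type, and year (the records below are fabricated, not real breaches):

```python
# A sketch of the search the chronology of data breaches offers --
# by breach type, organization type, and year -- as a simple API-style
# filter function. Records are fabricated examples for illustration.
breaches = [
    {"org": "Acme Health", "org_type": "MED", "breach_type": "HACK", "year": 2014},
    {"org": "Example Bank", "org_type": "BSF", "breach_type": "PORT", "year": 2015},
    {"org": "Sample U", "org_type": "EDU", "breach_type": "HACK", "year": 2015},
]

def search_breaches(records, breach_type=None, org_type=None, year=None):
    """Filter breach records the way an API endpoint's query parameters might."""
    results = records
    if breach_type:
        results = [r for r in results if r["breach_type"] == breach_type]
    if org_type:
        results = [r for r in results if r["org_type"] == org_type]
    if year:
        results = [r for r in results if r["year"] == year]
    return results

hacks_2015 = search_breaches(breaches, breach_type="HACK", year=2015)
print(hacks_2015)
```

Responses like these, in machine-readable form, are what would feed the visualizations and push notifications described above.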

Now that I am on the subject, it would make sense to make all the privacy topics and other resources available via the Privacy Rights Clearinghouse accessible through a simple API interface. If you work with the Privacy Rights Clearinghouse and would like to talk about making this happen, feel free to reach out. If you are someone who would like to help work on this project, or possibly fund this work, please also let me know.

The type of information the Privacy Rights Clearinghouse is aggregating is only going to become more important in the insecure cyber world we have created, and making it accessible for reading and writing via a simple API would significantly help the organization make a bigger impact and educate a larger audience.


State of Popular Database Platforms And Their Native API Publishing Features

I had a reminder on my task list to check in on where some of the common database platforms were when it came to APIs. I think it was a Postgres announcement from a while back that put the thought in my notebook, but as an old database guy I tend to check in regularly on the platforms I have worked with most.

The point of this check-in is to see how far along each of the database platforms is when it comes to easy API deployment, directly from tables. The three relational database platforms I'm most familiar with are:

  • SQL Server - The platform has APIs for management, and you can deploy an OData service, as well as put .NET to work, but nothing really straightforward that would allow any developer to quickly expose a simple RESTful API.
  • PostgreSQL - I'd say PostgreSQL is furthest along with their "early draft proposal of an extension to PostgreSQL allowing clients to access the database using HTTP", as they have the most complete information about how to deploy an API.
  • MySQL - There was a write-up in InfoQ about MySQL offering a REST API, but from what I can tell it is still in MySQL Labs, without much movement or other stories I could find to show any next steps.

The database that drives my API platform is MySQL running via Amazon RDS. I haven't worked with Postgres for years, and jumped ship on SQL Server a while back (my therapist says I cannot talk about it). I automate the generation of my APIs using Swagger and the Slim framework, then do the finish work, like polishing the endpoints to look less like their underlying database and more like how they will actually be used.
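That finish work might look something like this sketch, where database-flavored paths in a generated definition get renamed to the resources consumers actually think in (the table and path names are hypothetical):

```python
# Polishing auto-generated endpoints so they look less like the
# underlying database tables and more like business resources.
# Table and path names here are hypothetical placeholders.
generated_paths = {
    "/tbl_service_location": {"get": {"summary": "List tbl_service_location"}},
    "/tbl_org_contact": {"get": {"summary": "List tbl_org_contact"}},
}

# Map database-flavored paths to consumer-friendly resource names.
renames = {
    "/tbl_service_location": "/locations",
    "/tbl_org_contact": "/contacts",
}

polished = {renames.get(path, path): ops for path, ops in generated_paths.items()}
print(sorted(polished))
```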

Maybe database platforms shouldn't get into the API game, leaving API deployment to providers like SlashDB and DreamFactory? It just seems like really low-hanging fruit for these widely used database solutions to make it dead simple for developers to expose and craft APIs from existing data sources.

If you are using any database-to-API solutions for SQL Server, PostgreSQL, or MySQL, please let me know.



The Next Steps For The Recreation Information Database (RIDB) API

I referenced the Recreation Information Database (RIDB) in my story late last year, when I was asking for your help to make sure the Department of Agriculture leads with APIs in their parks and recreation RFP. I'm not exactly sure where it fits in with the RFP, because the RIDB spans multiple agencies.

Here is the description from the RIDB site:

RIDB is a part of the Recreation One Stop (Rec1Stop) project, initiated as a result of a government modernization study conducted in 2004. Rec1Stop provides a user-friendly, web-based resource to citizens, offering a single point of access to information about recreational opportunities nationwide. The web site represents an authoritative source of information and services for millions of visitors to federal lands, historic sites, museums, and other attractions/resources.

When I wrote the post last October, I referenced the PDF of the REST API Requirements for the US Forest Service Recreation Information Database (RIDB), but this week I got an update, complete with fresh links to a preview of a pretty well-designed API, including documentation developed using Slate.

I haven't actually hacked on the endpoints, but I took a stroll through the docs, and my first impression was that it is well designed and robust, including resources for organizations involved with the RIDB, recreational areas, facilities, campsites, permit entrances, tours, activities, events, media, and links. The RIDB documentation also includes details on errors, pagination, and versioning, plus a data dictionary, and the on-boarding was frictionless when looking to get a key.

Everything is in dev mode with the RIDB API, and the team is looking for feedback. I'm not entirely sure they wanted me to publish a story on API Evangelist, but I figure the more people involved the better, as I'm not sure when I'll get around to actually hacking on the API. I'm happy to see such a quick iteration towards the next generation of the RIDB API, and it makes me hopeful to be covering so many new API efforts out of the federal government in a single day.


Reworking My API 101 Content And First Up Is The 100K View

The home page of API Evangelist has always been my API 101 page, where any new visitor can land, not knowing a thing about APIs, read the page, and walk away understanding at least what an API is, and hopefully also with a greater understanding of how it impacts their world. From the beginning, the API Evangelist home page has had a lean toward API providers, and over time I've added information about consuming APIs, as well as trends I'm seeing in the API space.

In my opinion my 101 content has become too lengthy, and doesn't properly onboard people to the multiple areas of APIs, like providing APIs vs consuming APIs. I think it is time to overhaul my 101 section and produce some newer, more relevant API 101 content for 2014. To support this, I'm working on a series of 15 API 101 segments, with each one having a story version as well as a slideshow edition, complete with audio.

I finished the first one, which is my 100K view API 101 introduction. I'm still working on the slideshow version. Once I'm done I'll move on to providing APIs, and as I finish each one I'll hang it on the homepage, as well as publish it out via the blog and Twitterz.


API 101

APIs, also known as Application Programming Interfaces, at their most basic level allow applications to talk to other applications, but they are so much more than this when you begin to explore the world of APIs further. APIs are slowly becoming part of every aspect of our personal and business worlds, often without us even knowing. If you surf the web and use a mobile phone, this explanation of what an API is, is for you. APIs are affecting your life right now, and with a little more understanding you can learn to identify APIs and put them to work for you, even if you are not a geek.

861,379,000 Websites

There are 861,379,000 websites registered in 2014, making up what we know as the World Wide Web. In 2014 it is easier to build a website than at any other point in the history of the Internet, allowing any individual, company, or organization to publish a website and even point a custom address at it.

In 1995 only the most tech-savvy had websites, and by 2000 many businesses realized having a website was the normal way of doing business. In 2010 many of the most tech-savvy companies had APIs, and by 2015 many businesses are realizing that having an API is the normal way of doing business--imagine what the Internet will look like by 2020.

Facebook

Websites like Facebook allow us to interact with friends and family, sharing messages, photos, links, and other essential life bits, establishing a social platform for us to engage in our online lives.

Facebook is a daily destination for users around the world, with much of the interaction happening via the increasing number of mobile devices we all have in our pockets. These mobile devices all use APIs to connect our mobile apps to Facebook.com.

Yahoo

Yahoo has long been a staple of the Internet, providing users with access to news, sports, weather, stocks, and other important bits of our daily lives. Yahoo was an early player on the web, and continues to find new ways to compete in 2014 via the web and mobile.

Yahoo may not be what it used to be, but the company continues to innovate through acquisitions, and still plays a role as the home page for many users who were first introduced to the World Wide Web through Yahoo search and directories.

Youtube

YouTube provides us with hours of online entertainment, bringing home user-generated, and sometimes professionally produced videos, ranging from family home videos to music videos, and live concerts. YouTube is quickly replacing the television of the previous century, re-inventing how we are entertained in our homes.

YouTube users often spend time watching videos on the main YouTube website, but also consume a considerable number of videos using the universal embeddable video player, which allows YouTube videos to be embedded on any website around the web, all driven by APIs.

Amazon

Amazon has made e-commerce part of the mainstream conversation, allowing us to order almost any physical or digital product we can imagine from the vast warehouses that Amazon possesses, as well as those of its affiliates. Amazon got started with physical products like books, but in 2014 has led the charge in delivering virtual goods such as movies, e-books, and other online products consumers are craving.

Selling you products is just the first part of the conversation when it comes to Amazon; they also want to provide consumers with any possible channel they could choose to purchase their products, including providing infrastructure for web, mobile, and tablet developers, and even producing their own tablet device and supporting platform.

Twitter

Twitter allows us to send short, SMS-style messages across the public Internet, and follow, or be followed by, the people we find most interesting. Twitter has gone beyond just a messaging platform, or even a social network, and has become the pulse of the online world through its API-driven platform.

Everything you know about Twitter was originally developed using the Twitter API; its mobile applications, desktop applications, buttons, widgets, and other analytics and visualizations are all developed using the Twitter public API, which makes the entire Twitter platform available to developers.

Pinterest

Pinterest provides users with a simple, yet addictive way to share photos on the open Internet, and with friends. Photos play an important role in our lives, and Pinterest stepped up as a new way to express ourselves using these valuable photos and images.

Only part of the interaction with Pinterest occurs via their website; the majority of engagement occurs through sharing on other sites and social networks, and via mobile applications that allow users to quickly share and express themselves with photos.

Reddit

Reddit is the home page of the Internet, providing users with links to the important stories, products, and conversations of the day. Redditors curate the best links that are submitted through the site, and through user engagements the best of the best floats to the top for everyone to explore.

Reddit demonstrates the real-time network effect that we often associate with the Internet, providing a heartbeat of the best links being viewed, and engaged with across the Internet, making the trillions of web pages, across billions of websites accessible each day by average people.

Bits Of Our Life Across These Sites

All of these websites are built by other people, but as users of these sites we all leave a little bit of ourselves in these locations each time we visit. Whether it's just a comment or photo, all the way up to sustained engagement via our mobile phones, we are consistently leaving an ever-growing footprint across this digital landscape of websites and mobile applications.

It has taken decades to go from just a handful of the first sites on the World Wide Web to the 861,379,000 we have in 2014. It has taken the hard work of website designers and developers to make these sites a reality, but increasingly, through the use of API-driven interactions via 3rd party websites and mobile devices, all the information on these sites is being generated by YOU, the user.

Hand Written Websites Using HTML

In the early days, all websites were hand-coded, with web designers writing all the HTML and building their own images. Individuals spent numerous hours maintaining each page across vast sites, and often hired large teams to help maintain single sites that could span millions of actual web pages.

As the number of websites grew this became even more tedious to maintain, but savvy designers turned developers were finding innovative ways to create, publish, and maintain the growing number of websites. While some websites are still hand-coded, the majority of sites are generated, published, and maintained by an ever evolving set of web development software, and Software as a Service (SaaS) platforms.

Website Content Was Coming From Databases

Eventually the concept of the database-driven website came to be, allowing websites to evolve from sites into applications, with all the content across a website stored in a database. Now your site was more than just publishing content from a database; you also allowed users to interact, purchase, and generate content as well.

Databases allowed websites to evolve and go beyond just content publishing, making websites not just about the delivery of valuable content; you could also encourage users to create content, either knowingly or unknowingly, and not just on a single website, but across hundreds or thousands of sites around the world.

Adding An Amazon Product Widget

Along the way it became clear that it wasn't enough to have just your products on your website--or maybe you were a content site and didn't have your own products. Either way, users needed a way to put widgets and other syndicated content on their sites, which would pull products from other sites--something Amazon became very, very good at.

API driven content widgets became a great way to publish products onto the growing number of blogs, review, and other sites that were popping up across the Internet. These approaches to content delivery proved to work not just for product owners, but eventually any content could be syndicated across websites using simple, small, embeddable widgets, benefitting an entire network of connected sites.

Add A Delicious Bookmark Widget

As the social evolution of the Internet began, links to valuable content, and the social influence behind these curated groups of links, became apparent. It wasn't just about bookmarking your favorite sites; it was about grouping, curating, and sharing this content across any website that would have you, using embeddable tools.

Curated links often show up as spammy advertised links, but they can also surface relevant links based on what you are searching for, what your friends are doing, and more--making the embedding of links, or groups of links, an important way to share information across the Internet.

Add Twitter Widget

With the social movement in full swing, just posting links to your favorite sites was not enough. Users wanted to be able to communicate in real-time about the sites they were visiting, what they were reading and watching online, and increasingly what people were doing offline. With the introduction of Twitter, users could show what was on their mind, no matter what website they were on.

Widgets now allowed every piece of your life that was generated online to live beyond just the website where you created it, traveling from any website to any other site or application that would have it. Embeddable content and tools elevated the World Wide Web beyond any single site--it was a distributed Internet now.

Content Comes From Many Databases & APIs

If you were a website developer, your website content no longer came from any single database. You were pulling your content from a variety of databases and API resources that you might or might not own. Product sites like Amazon, and social networks like Twitter, were emerging, equipped with APIs; they were generating content, and allowing it to come in from, and be published back to, any website on the open Internet.

Having a direct database connection to your content wasn't enough anymore. You needed APIs so that trusted partners, as well as potentially the public, could get access to data, content, and other information for use across other websites, anywhere on the Internet. Just as this concept was spreading across the Internet, another thing happened that would forever change how we interact online.

Then The iPhone Happened

On June 29, 2007 the first edition of the iPhone was released, and slowly over the next 3 years computing would undergo a transformation, building on existing approaches that were first used to distribute content to websites, but would also prove to be a low-cost, efficient way to distribute content to a growing number of mobile devices.

While the iPhone is just one of many mobile devices being manufactured today, it did set in motion a new way of designing and developing applications that could run on the web, or on mobile devices, anywhere in the world using simple web-based APIs. Mobile application development has brought many changes to the world of computing, but simple API access to valuable data, content, and other resources has been the most significant.

Foursquare

Mobile applications came with simple knowledge of your location, as well as where you were online, enabling a new form of communication called the check-in. A new breed of mobile apps like Foursquare allowed users to see where they were, find friends, and discover places and experiences nearby, making the Internet a more local affair.

From this point forward, the longitude and latitude of each online user would be an option when receiving messages, photos, videos and other life bits from users, changing how we think about developing applications--it was a mobile app world now.

Instagram

In addition to knowing the location of each application user, each user was potentially given a high-quality, internet-enabled camera on their mobile device. With the ability to take a picture anywhere, anytime, new applications like Instagram emerged, providing users with new features and social capabilities built around photos.

Photos continue to be one of the most important, and valued life bits that we can generate across the Internet pipes, providing a very basic ingredient for use in online communication. This is something that would prove essential to users of all ages, from teens all the way up to grandparents discovering the potential of web and mobile apps.

Pandora

As with photos, music plays a central role in our life, and it is no accident that along with the introduction of the iPhone, Apple brought along its successful model for a music platform centered around their portable music player, the iPod. Music is an extremely valuable space when it comes to our life bits, something developers and platforms are looking to capitalize on by delivering the next generation of music apps.

APIs are stimulating our musical tastes beyond just our Pandora or Rdio playlists, by allowing us to interact with friends, watch videos, purchase concert tickets, buy merchandise, and much, much more. The music industry has been slow to change, but ultimately consumer interests are winning out, and the industry is slowly understanding that mobile is where today's music experience is at.

WhatsApp

In an online, mobile world, staying in touch is one of the most important elements of our digital self. Staying in touch with friends and co-workers, and the ability to organize into groups and meet for sports practice, or to protest the government, is the focus of leading messaging apps like WhatsApp.

Companies like WhatsApp are seeing record investment, valuations, and usage by users around the world--showing that messaging is an important part of the online and mobile experience, and that building apps that connect users in real-time, when they want and how they want, is a huge opportunity for start-ups.

APIs Are Powering Websites

APIs have been powering websites for almost 15 years now, delivering data, content, and other digital information to the websites we use everyday like Amazon, Twitter, and Pinterest. The average website gets its content from a variety of API driven resources, both from public and private sources, making the aggregation of content a viable business model in 2014.

Using content management systems (CMS) like WordPress, combined with API-driven plugins, anyone can be a curator of content, providing a website where users can find stories, photos, videos, and other valuable resources. In 2015 WordPress will also launch its own API, which will give 65 million websites their own API, turning many average site owners into API providers.

APIs Are Powering Mobile Apps

The number of mobile phones in our pockets will reach 7.3 billion in 2014--equaling the population of the world. Of course not everyone has a cell phone, but it shows the ubiquitous nature of these devices, and the size of the opportunity to provide API-driven resources like messaging, products, images, videos, and the other life bits we generate and depend on each day.

Web APIs are well suited for sending and receiving the small, bite-size chunks of data, content, and other digital resources we use throughout our personal and business worlds. Mobile apps are not just being developed by the latest tech start-ups; mobile apps are becoming the normal mode of operations for companies, organizations, institutions, and government agencies of all shapes and sizes.

APIs Are Powering Buttons, Badges, and Widgets

APIs break up our valuable data and content into small bite-size chunks, for transmission to mobile devices, but also so they can be moved around the web and published where we need them. APIs are powering a wide range of buttons, badges, and other widgets that are turning any single website into a globally distributed network of content publishing and creation.

Buttons, badges, and widgets are extending likes, follows, and favorites from your favorite social network to any other site you may be reading or participating on, providing a bridge between all of the applications you use on the web and your mobile devices. Facebook is a big part of the identity of any online user, and being able to log in, share, and participate on the open Internet using our Facebook profiles has become normal for the average Internet user.

APIs Are Powering Spreadsheets

APIs are quickly finding their way beyond web and mobile applications, and are being used in spreadsheets to provide the data needed for business calculations, analysis, and visualizations. A large amount of the world's data exists in spreadsheets, and the next wave of API developers are currently spreadsheet administrators--they just have not been informed that they can get the data they need via APIs.

The days of emailing spreadsheets around, with out-of-date Excel documents strewn here and there, are going away. Spreadsheets are connecting to APIs using Excel connectors, and online using Google Spreadsheets. Spreadsheets are being constructed to make sense of Tweet streams, aggregate comments or messages, and number-crunch the latest industry numbers from the government.
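A minimal sketch of that kind of spreadsheet connection, flattening API-style records into CSV rows the way a connector might (the records are fabricated for illustration):

```python
import csv
import io

# Flatten JSON-style API records into spreadsheet-friendly CSV rows,
# the way a spreadsheet connector might. Records are fabricated.
api_response = [
    {"user": "alice", "message": "hello", "retweets": 3},
    {"user": "bob", "message": "api time", "retweets": 7},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["user", "message", "retweets"])
writer.writeheader()
writer.writerows(api_response)

csv_text = buffer.getvalue()
print(csv_text)
```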

APIs Are Powering Devices

I'm not exaggerating when I say APIs are penetrating almost every aspect of our world. APIs are being used to connect physical devices in our lives like thermostats, smoke detectors, automobiles, clothing, glasses, and just about every object you can imagine, and some you can't.

In 2014 a wave of new start-ups, calling themselves Internet of Things companies, has emerged to build on the API movement, extending the low-cost, simple approach to delivering resources to the physical world around us. Using APIs, the Internet is moving offline, and wiring up almost every aspect of our personal, public, and professional lives.

Not Just For Developers

APIs are not just for developers. If you can use a web page, you can use most APIs. There is a great deal of educational material (like this) to get you up to speed with APIs. API providers are also making it a priority to focus on non-developer users, such as data journalists, analysts, and many other non-programming user types.

Even if an API has a more developer-focused feel to it, oftentimes you can just ask questions of the community, or the API provider, and they will help you understand the value an API delivers, and how you can possibly put it to use. One characteristic of the Software as a Service (SaaS) applications people are using is that there is an API behind the scenes, something only the savviest of users are aware of.

Orchestrate Using APIs With IFTTT And Zapier

A new generation of services has emerged, acknowledging the wealth of API resources available and the growing number of SaaS applications that have APIs, and delivering tools that help anyone put APIs to work. Services like IFTTT and Zapier provide very simple recipes that allow anyone to migrate content, data, and other digital resources between online platforms--no code necessary.


These API-enabled orchestration platforms allow anyone to automate, migrate, and keep their online worlds in sync. These services target non-developers, but provide such instant value that they are often used by programmers to make their worlds go around. This approach to using online services is allowing companies, organizations, and even individuals to find newfound freedom and control in their digital lives.

Making Companies More Nimble & Agile

Companies are learning that when they make all data, content, and other digital assets available via well designed, simple, and modular APIs over the Internet, they find a new nimbleness and agility in how they operate internally, engage with partners, and even interact with the public at large.

While APIs are technical in nature, their simplicity and efficiency are meant to make things more accessible, in a simple but secure way, using the path of least resistance. Historically, valuable digital resources have been locked up by software developers and information technology groups; API developers are looking to change this, making resources available to as wide an audience as possible.

Making Companies More Open & Transparent

Another aspect that transcends the technology of APIs is the openness and transparency it can introduce into company, organizational, institutional, and government operations. This isn't always good by default, but when done properly it can let a little sunlight into a potentially closed process, and eliminate some of the illnesses that occur behind closed doors.

Of course, when it comes to business, sometimes transparency can work against you, but many corporations and young start-ups find that being closed doesn't just keep out potential competitors, it can also slow your own company in finding success. As we've learned from open source software, openness can actually improve security, privacy, and many other concerns, and a truly open approach to APIs can bring some of the same effects to organizations of any size.

Allowing Companies To Be More Competitive

The ability to rapidly design, develop, deploy, and iterate on websites and mobile applications, while also empowering workers to be more effective via API-driven spreadsheets and widgets, is giving companies a competitive edge. APIs open up a new way of doing business that the savviest of online businesses are participating in. There is a stunning amount of investment in start-ups looking to disrupt even the most entrenched industries and companies, but it doesn't take a start-up to make change, and many large businesses are also taking notice and getting to work on their own API efforts.

In the next five years, it will become evident which companies and organizations do not have APIs, as they will not be participating at the levels of API-driven companies. Organizations without APIs will not be able to keep up in the always-on, real-time business environment that APIs power across multiple channels, including web, mobile, devices, and other ways of doing business online.

Empowering Individuals To Take Control Online

The most important thing APIs bring to the table is the empowerment of the average online individual. APIs provide access to much of the data and content we are generating online. APIs are how we can get access to the photos of our children on Facebook, and the home videos we've shared on YouTube. APIs are not just about driving business innovation; they are how the average user is taking back their digital self from a growing wave of data-hungry start-ups.

API literacy is much like financial literacy. You don't have to understand the inner workings of the banking and credit card industries, but you damn sure need to be educated enough to understand who has access to your bank account and credit card statement. With API literacy you don't have to understand how APIs work, but you should only use online services that have an API, know who has access to your data and content, and be able to download and walk away with the valuable information you have generated via ANY online service.

APIs Are Important, Please Keep Learning

This is just the first of a series of 101 tutorials on the world of APIs. On the home page of API Evangelist you can find other tutorials on providing APIs or consuming APIs, and the research that went into this material. I have other research areas like real-time and voice APIs, for which I will be generating 101 tutorials as I have time--let me know if there are any of them I should prioritize.

I will be adding new tutorials as I have time. Until then you can tune into the news stream from each area of research on the API Evangelist network, as well as the API Evangelist blog for the latest analysis on the business of APIs, API Voice for the latest on the politics of APIs, and API.Report for the latest news from the API space.


Expanding API Gateway Connectors Into A World of API Deployment Startups

I’m seeing an increase in the number of API deployment services this year, from startups like StrongLoop and API Spark. These companies are looking to help all of us deploy APIs from common systems, often without the need for IT or programming resources.

The providers I’m seeing emerge are catering to some of the lowest hanging fruit for deploying APIs: the commonly used, easiest-to-access systems that contain the valuable content, data, and media we need to make accessible via APIs.

The common sources for many of these API deployment solutions are:

These common information sources represent the places where the average person in any size company, organization or government agency will be storing their valuable resources. It makes sense that this new wave of API deployment startups would target these services.

If you consider every system integration option that classic API gateways have delivered, it provides a good reference for finding opportunities to build independent API deployment solutions, each of which, if done right, could be a startup all by itself.

Not all companies can afford a full API gateway solution, and many have needs too small for a gateway. There is an emerging opportunity to help people quickly deploy APIs from common sources like spreadsheets, databases, and file stores, as well as more unique sources like specialty CMS and CRM systems.

Sometimes I like to look at the expanding API universe as a result of the SOA big bang, with all the tools that were in your SOA toolbox being liberated, and made available as independent services that you can get from an increasing number of new API deployment startups.


APIs Can Open Up Your Company To Outside Ideas

I talk about this concept often, but couldn't find any definitive post on APIs opening up a company, organization, or agency to outside ideas. This is something I hear from startups, up to federal government agencies, and from well known business brands, such as Absolut Vodka.

Absolut was one of the keynotes at API Strategy & Practice in Amsterdam this last March. Eva Sjokvist, from Absolut, walked us through the development of their Absolut Drinks Database API. An amazing story by itself, but one thing she said that stood out to me, an interestingly common by-product of the API lifecycle, was that the process itself opened up the company, making it more receptive to outside ideas and collaboration.

I hear this sentiment often from groups who are developing API programs at their companies, and you see shining examples, like Edmunds.com with their internal API Summit, but rarely do I see a metric to measure API-driven cultural change. APIs don't just open up your company’s assets and resources for access by trusted partners, or potentially the public, establishing a sort of external R&D lab--they also have the potential to open up a valuable feedback loop, bringing in new ideas that can change how a group sees the world, evolving internal culture.

The potential for opening up a company, bringing it closer to its partners and customers, or a government agency, opening up healthier dialogue with constituents, is the most important result of an API program—greater than any single application that will get built on an API.


Building Blocks Of API Deployment

As I continue my research into the world of API deployment, I'm trying to distill the services and tooling I come across down into what I consider to be a common set of building blocks. My goal in identifying API deployment building blocks is to provide a simple list of the moving parts that enable API providers to successfully deploy their services.

Some of these building blocks overlap with other core areas of my research like design, and management, but I hope this list captures the basic building blocks of what anyone needs to know, to be able to follow the world of API deployment. While this post is meant for a wider audience, beyond just developers, I think it provides a good reminder for developers as well, and can help things come into focus. (I know it does for me!)

Also, there is some overlap between some of these building blocks, like API Gateway and API Proxy, both doing very similar things, but labeled differently. Identifying building blocks can be very difficult for me, and I'm constantly shifting definitions around until I find a comfortable fit--so some of these will evolve, especially with the speed at which things are moving in 2014.

CSV to API - Text files containing comma-separated values (CSV) are one of the quickest paths to converting existing data to an API. Each row of a CSV can be imported as a record in a database, and used to easily generate a RESTful interface that represents the data stored in the CSV. CSV to API can be very messy depending on the quality of the data in the CSV, but can be a quick way to breathe new life into old catalogs of data lying around on servers or even desktops. The easiest way to deal with CSV is to import it directly into a database, then generate the API from the database, but the process can also be done at the time of API creation.
Database to API - Database to API is definitely the quickest way to generate an API. If you have valuable data, generally it will reside in a Microsoft SQL Server, MySQL, PostgreSQL or other common database platform. Connecting to a database and generating a CRUD (create, read, update and delete) API on existing data makes sense for a lot of reasons. This is the quickest way to open up product catalogs, public directories, blogs, calendars or any other commonly stored data. APIs are rapidly replacing database connections; when bundled with common API management techniques, APIs allow for much more versatile and secure access that can be made public and shared outside the firewall.
Framework - There is no reason to hand-craft an API from scratch these days. There are numerous frameworks out there that are designed for rapidly deploying web APIs. Deploying APIs using a framework is only an option when you have the necessary technical and developer talent to understand the setup of the environment and follow the design patterns of each framework. When planning the deployment of an API using a framework, it is best to select one of the common frameworks written in the preferred language of the available developer and IT resources. Frameworks can be used to deploy data APIs from CSVs and databases, content from documents, or custom code resources that allow access to more complex objects.
API Gateway - API gateways are enterprise quality solutions that are designed to expose API resources. Gateways are meant to provide a complete solution for exposing internal systems and connecting with external platforms. API gateways are often used to proxy and mediate existing API deployments, but may also provide solutions for connecting to other internal systems like databases, FTP, messaging and other common resources. Many public APIs are exposed using frameworks; most enterprise APIs are deployed via API gateways, supporting much larger deployments.
API Proxy - API proxies are commonplace for taking an existing API interface and running it through an intermediary, which allows for translations, transformations and other added services on top of the API. An API proxy does not deploy an API, but can take existing resources like SOAP and XML-RPC and transform them into more common RESTful APIs with JSON formats. Proxies provide other functions such as service composition, rate limiting, filtering and securing of API endpoints. API gateways are the preferred approach for the enterprise, and the companies that provide them support larger API deployments.
API Connector - Contrary to an API proxy, there are API solutions that are proxyless, simply allowing an API to connect or plug in to the advanced API resources. While proxies work in many situations, allowing APIs to be mediated and transformed into required interfaces, API connectors may be preferred in situations where data should not be routed through proxy machines. API connector solutions only connect to existing API implementations, and are easily integrated with existing API frameworks as well as web servers like Nginx.
Hosting - Hosting is all about where you are going to park your API. Usual deployments are on-premise within your company or data center, in a public cloud like Amazon Web Services, or a hybrid of the two. Most of the existing service providers in the space support all types of hosting, but some companies, who have the required technical talent, host their own API platforms. With HTTP being the transport that modern web APIs put to use, sharing the same infrastructure as websites, hosting APIs does not take any additional skills or resources if you already have a website or application hosting environment.
API Versioning - There are many different approaches to managing different versions of web APIs. When embarking on API deployment you will have to make a decision about how each endpoint will be versioned and maintained. Each API service provider offers versioning solutions, but generally it is handled within the API URI or passed as an HTTP header. Versioning is an inevitable part of the API life-cycle, and is better integrated by design, as opposed to waiting until you are forced to make an evolution in your API interface.
Documentation - API documentation is an essential building block for all API endpoints. Quality, up-to-date documentation is essential for on-boarding developers and ensuring they successfully integrate with an API. Documentation needs to be derived from quality API designs, kept up to date, and made accessible to developers via a portal. There are several tools available for automatically generating documentation, and even what is called interactive documentation, which allows developers to make live calls against an API while exploring the documentation. API documentation is part of every API deployment.
Code Samples - Second only to documentation, code samples in a variety of programming languages are essential to a successful API integration. With quality API design, generating samples that can be used across multiple API resources is possible. Many of the emerging API service providers, and the same tools that generate API documentation from JSON definitions, can also auto-generate code samples that can be used by developers. Generation of code samples in a variety of programming languages is a requirement during API deployment.
Scraping - Harvesting or scraping of data from an existing website, content or data source. While we would all like content and data sources to be machine readable, sometimes you just have to get your hands dirty and scrape it. I don't support scraping of content in all scenarios and business sectors, but in the right situations scraping can provide a perfectly acceptable content or data source for deploying an API.
Container - The new virtualization movement, led by Docker, and supported by Amazon, Google, Red Hat, Microsoft, and many more, is providing new ways to package up APIs, and deploy them as small, modular, virtualized containers.
Github - Github provides a simple, but powerful way to support API deployment, allowing for publishing of a developer portal, documentation, code libraries, TOS, and all the supporting API business building blocks that are necessary for an API effort. At a minimum Github should be used to manage public code libraries, and engage with API consumers using Github's social features.
Terms of Use / Service - Terms of Use provide a legal framework for developers to operate within. They set the stage for the business development relationships that will occur within an API ecosystem. TOS should protect the API owner's company, assets and brand, but should also provide assurances for developers who are building businesses on top of an API. Make sure an API's TOS passes inspection with the lawyers, but also strikes a healthy balance within the ecosystem and fosters innovation.
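To make the CSV to API and Database to API building blocks above a bit more concrete, here is a minimal sketch in Python, using SQLite and hypothetical table and column names, of importing a CSV into a database and serving the records back as JSON--the read side of a CRUD API. A real deployment would sit behind a framework or gateway, but the core mechanics look like this.

```python
import csv
import io
import json
import sqlite3

def import_csv(conn, table, csv_text):
    """Load CSV rows into a SQLite table, one record per row.
    Returns the number of rows imported."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    cols = list(rows[0].keys())
    conn.execute("CREATE TABLE %s (%s)" % (table, ", ".join(cols)))
    placeholders = ", ".join("?" for _ in cols)
    conn.executemany(
        "INSERT INTO %s VALUES (%s)" % (table, placeholders),
        [[r[c] for c in cols] for r in rows],
    )
    return len(rows)

def get_records(conn, table):
    """The read side of a CRUD API: return all records as JSON."""
    cur = conn.execute("SELECT * FROM %s" % table)
    cols = [d[0] for d in cur.description]
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

conn = sqlite3.connect(":memory:")
import_csv(conn, "services", "id,name\n1,Food Bank\n2,Shelter")
print(get_records(conn, "services"))
# → [{"id": "1", "name": "Food Bank"}, {"id": "2", "name": "Shelter"}]
```

The messiness I mention above shows up right here: real CSVs need type coercion, header cleanup, and de-duplication before the data is fit for an API.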

If there are any features, services or tools you depend on when deploying your APIs, please let me know at @kinlane. I'm not trying to create an exhaustive list; I just want to get an idea of what is available across the providers, and where the gaps potentially are.

I feel like I'm finally getting a handle on the building blocks for API design, deployment, and management, and on understanding the overlap between the different areas. I will revisit my design and management building blocks, and evolve my ideas of what my perfect API editor would look like, and how this fits in with API management infrastructure from 3Scale, and even API integration.

Disclosure: 3Scale is an API Evangelist partner.


API Virtual Stack Composition Like The Absolut Drinks Data API

If you read my blog regularly, you know I am constantly pushing the boundaries of how I see the API space, and sometimes my ramblings can be pretty out there, but API Evangelist is how I work through these thoughts out loud, and hopefully bring them down to a more sane, practical level that everyone can understand.

My crazy vision for the day centers around virtual API stack composition, as beautiful as the Absolut Drinks Database API. Ok, before you can even begin to get up to speed with my crazy rant, you need to be following some of my rants around using virtual cloud containers like we are seeing from Docker, AWS and OpenShift, and you need to watch this video from APIStrategy & Practice about the Absolut Drinks Database API deployment.

Ok, you up to speed? Are you with me now?

Today, as I was playing around with the deployment of granular API resources using AWS CloudFormation, I was using their CloudFormer tool, which allows me to browse through ALL of my AWS cloud resources (i.e. DNS, EC2 servers, S3 storage, RDS databases), and assemble them into a CloudFormation template, which is just a JSON definition of the stack I’m going to generate.
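To give an idea of what that JSON definition looks like, here is a minimal illustrative sketch composed in Python--the resource names and properties are hypothetical and pared down, not a complete working template.

```python
import json

# A CloudFormation template is just a JSON document describing a
# stack of AWS resources. Resource logical names ("ApiServer",
# "ApiStorage") and the properties shown are illustrative only.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "A small stack: one EC2 server and one S3 bucket",
    "Resources": {
        "ApiServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {"InstanceType": "t1.micro"},
        },
        "ApiStorage": {
            "Type": "AWS::S3::Bucket",
        },
    },
}

# Dump the stack definition as the JSON CloudFormation expects.
print(json.dumps(template, indent=2))
```

Hand this JSON to CloudFormation and it will stand up (and tear down) the whole stack as a unit, which is exactly what makes the composition idea below possible.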

Immediately I think of the presentation from Absolut, and how they spent years assembling the image and video parts and pieces that went into the 3500 drinks they wanted available via their API, for developers to use when developing interactive cocktail apps. They had 750 images, and video clips, with a combination of 30K mixing steps, that went into the generation of the 3500 drink combinations. * mind blown *

Now give me this same approach, but for composing virtual API stacks instead of cocktails. With this approach you could define individual API resources, such as a product API or a screen capture API. These are all independent API resources, with open source server and client side code, openly licensed interfaces via API Commons, and virtual container definitions for both AWS CloudFormation and OpenShift.

Imagine if I have hundreds of high value, individual API resources available to me when composing a virtual stack. I could easily compose exactly the stack my developers need, composed of new and existing API resources. I wouldn’t be restricted to building directly on top of existing data stores or APIs, I could deploy external API deployments that depend on sync to stay up to date, providing the performance levels I demand from my API stack--I could mix virtual API stacks, like I would a cocktail. 

Anyhoooo, that is my rant for now. I’ll finish doing the work of deploying AWS CloudFormation and OpenShift containers for my screen capture API, rounding off all the architectural components I outlined in my API operational harness, and then rant some more.

Thanks for reading my rant. I hope it made some sense! ;-)


Common Building Blocks of Cloud APIs

I’ve been profiling the API management space for almost four years now, and one of the things I keep track of is what some of the common building blocks of API management are. Recently I’ve pushed into other areas like API design, integration and into payment APIs, trying to understand what the common elements providers are using to meet developer needs.

Usually I have to look through the sites of leading companies in the space, like the 38 payment API providers I’m tracking on, to find all the building blocks that make up the space, but when it came to cloud computing it was different. While there are several providers in the space, there is but a single undisputed leader—Amazon Web Services. I was browsing through AWS yesterday and I noticed their new products & solutions menu, which I think has a pretty telling breakdown of the building blocks of cloud APIs.

Compute & Networking

Compute - Virtual Servers in the Cloud (Amazon EC2)

Auto Scaling - Automatic scaling service (Auto Scaling)

Load Balancing - Automatic load balancing service (Elastic Load Balancing)

Virtual Desktops - Virtual Desktops in the Cloud (Amazon WorkSpaces)

On-Premise - Isolated Cloud Resources (Amazon VPC)

DNS - Scalable Domain Name System (Amazon Route 53)

Network - Dedicated Network Connection to AWS (AWS Direct Connect)

Storage & CDN

Storage - Scalable Storage in the Cloud (Amazon S3)

Bulk Storage - Low-Cost Archive Storage in the Cloud (Amazon Glacier)

Storage Volumes - EC2 Block Storage Volumes (Amazon EBS)

Data Portability - Large Volume Data Transfer (AWS Import/Export)

On-Premise Storage - Integrates on-premises IT environments with Cloud storage (AWS Storage Gateway)

Content Delivery Network (CDN) - Global Content Delivery Network (Amazon CloudFront)

Database

Relational Database - Managed Relational Database Service for MySQL, Oracle, SQL Server, and PostgreSQL (Amazon RDS)

NoSQL Database - Fast, Predictable, Highly-scalable NoSQL data store (Amazon DynamoDB)

Data Caching - In-Memory Caching Service (Amazon ElastiCache)

Data Warehouse - Fast, Powerful, Fully Managed, Petabyte-scale Data Warehouse Service (Amazon Redshift)

Analytics

Hadoop - Hosted Hadoop Framework (Amazon EMR)

Real-Time - Real-Time Data Stream Processing (Amazon Kinesis)

Application Services

Application Streaming - Low-Latency Application Streaming (Amazon AppStream)

Search - Managed Search Service (Amazon CloudSearch)

Workflow - Workflow service for coordinating application components (Amazon SWF)

Messaging - Message Queue Service (Amazon SQS)

Email - Email Sending Service (Amazon SES)

Push Notifications - Push Notification Service (Amazon SNS)

Payments - API based payment service (Amazon FPS)

Media Transcoding - Easy-to-use scalable media transcoding (Amazon Elastic Transcoder)

Deployment & Management

Console - Web-Based User Interface (AWS Management Console)

Identity and Access - Configurable AWS Access Controls (AWS Identity and Access Management (IAM))

Change Tracking - User Activity and Change Tracking (AWS CloudTrail)

Monitoring - Resource and Application Monitoring (Amazon CloudWatch)

Containers - AWS Application Container (AWS Elastic Beanstalk)

Templates - Templates for AWS Resource Creation (AWS CloudFormation)

DevOps - DevOps Application Management Services (AWS OpsWorks)

Security - Hardware-based Key Storage for Regulatory Compliance (AWS CloudHSM)

The reason I look at these spaces in this way is to better understand the common services that API providers offer, the ones that are really making developers' lives easier. Assembling a list of the common building blocks allows me to look at the raw ingredients that make things work, and not get hung up on just companies and their products.

There is a lot to be learned from API pioneers like Amazon, and I think this list of building blocks provides a lot of insight into the API-driven resources that are truly making the Internet operate in 2014.


So You Wanna Do a Spreadsheet or Database To API Startup

A question I get regularly at API Evangelist is around the need for spreadsheet to API, and database to API services. Every couple weeks I get DMs and emails from someone asking what tools are available, and saying they were thinking about developing a solution. Right now I have three separate people asking, so rather than just reply in email I figured I’d write a piece responding, based upon my experience.

First, let me state that there is a need for a simple spreadsheet to API solution. Much of the world's data is locked up in spreadsheets, in the control of data stewards who don't have access to API deployment resources. However, I'm not sure there is a lot of money to be made in providing this as a service; it feels like something that should just be a default for all spreadsheet operations—you know, like Google Spreadsheets.

Google Spreadsheets allows XML and JSON access to spreadsheets by default, in addition to the full-blown Google Spreadsheet API. The only problem is that with Google Spreadsheets you have some size limitations that make it a solution for only the simplest of data sources. I have written several posts on deploying APIs from Google Docs, for both public and private spreadsheets, and have received numerous requests for a cloud version of this functionality that would allow anyone to quickly deploy APIs from Google Spreadsheet sources.
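To show how little code this default access requires, here is a Python sketch of pulling rows from a published, public Google Spreadsheet via its JSON feed--the feed URL pattern and the sample feed structure reflect my understanding of what Google currently offers, so treat them as assumptions.

```python
import json
import urllib.request

def sheet_json_url(key):
    """Build the public JSON feed URL Google offers for published
    spreadsheets (key is the spreadsheet key from its URL)."""
    return ("https://spreadsheets.google.com/feeds/list/"
            "%s/od6/public/values?alt=json" % key)

def parse_feed(feed):
    """Reduce each feed entry to a plain dict, dropping the gsx$
    prefixes Google puts on spreadsheet column names."""
    rows = []
    for entry in feed["feed"]["entry"]:
        rows.append({k[len("gsx$"):]: v["$t"]
                     for k, v in entry.items() if k.startswith("gsx$")})
    return rows

def fetch_rows(key):
    """Fetch a public spreadsheet and return its rows as dicts."""
    with urllib.request.urlopen(sheet_json_url(key)) as resp:
        return parse_feed(json.load(resp))
```

With something this small sitting between a spreadsheet and the web, you can see why a paid spreadsheet-to-API service is a tough sell.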

I think the fact that Google Spreadsheets is free to use, and has API access to its spreadsheets, makes this a real tough space to make money in, even if you do smooth out the process. To further convince people not to reinvent the wheel, there is already the “Wordpress” of data management, called CKAN, which is, "a powerful data management system that makes data accessible – by providing tools to streamline publishing, sharing, finding and using data.” CKAN is open source, and a tool that city, state, county and federal governments around the globe are downloading and deploying in growing numbers.

Even with the availability of Google Spreadsheets and its API, and data portal solutions like CKAN, this is still a problem I try to tackle from time to time, when I get the same itch as the people who have triggered this post. I’ve created what I call my OnGithub solutions, which allow for data management via Github, and API deployment using a simple AWS-hosted API framework.

  • Data.OnGithub.com - My attempt at providing users with a free (if publicly licensed) place to upload and manage their open data that originates in spreadsheets and CSVs within Github repositories. The entire app runs on Github Pages, using HTML, CSS, and JavaScript for core functionality, making it something that can be forked and implemented by anyone.
  • API.OnGithub.com - My attempt to provide the API layer on top of data.ongithub.com, separating the data management from the API deployment. This application runs as a simple PHP API framework, deployed on Linux on AWS.

While this work was interesting, it still didn’t solve the problem of allowing the average data steward to easily manage their data that is warehoused in spreadsheets, and deploy APIs with no technical experience required. However it was a fun exercise to see what is possible for developing open tooling around this problem.

When it actually comes to deploying services that address data management, I’ve watched providers like InfoChimps try and end up pivoting, and others such as Kasabi and Emergent One try, then go out of business. There are successful providers like SlashDB, Junar, and Socrata, who focus on providing open data platforms, whether the data originates from a database or a spreadsheet. In 2014 I predict we will continue to see new startups emerge that promise to solve this problem, but they will run into walls when it comes to monetization. There is money to be made in delivering APIs from spreadsheets; there just won't be VC-level money in converting spreadsheets to APIs. But if you want to provide some open tooling and offer professional services, I think you can make an acceptable living.

The moral of this story is that if you want to do a startup, get funding, and offer a data solution, you had better demonstrate you have something really new and innovative; otherwise don't bother, the space is saturated. If you want to develop some open tools, or use existing open tools to help organizations tackle their open data problems, I think there is plenty of opportunity to both make significant change and make some money offering sensible consulting services.


Setting Up My Database for FAFSA API

Before I got to work on the development of the FAFSA API, I needed to select what my storage architecture would be for the API. When I'm developing an API around smaller sets of data I've been using JSON as the datastore, but in this case I think I will go with a MySQL backend.

This prototype of the FAFSA API will be running on an AWS EC2 Linux instance, so the MySQL backend will run using AWS RDS. I created a new database, and then, using my JSON list of FAFSA fields, I generated a MySQL table script.

Each FAFSA form can have well over 100 fields, so making sure the columns use small, precise datatypes is pretty critical. I will keep adjusting this as I build out the FAFSA API.
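As an illustration of the kind of table script generation I'm describing, here is a Python sketch that turns a JSON list of form fields into a MySQL CREATE TABLE statement--the field names, types, and sizes below are hypothetical stand-ins, not the actual FAFSA schema.

```python
import json

# Hypothetical sample of a JSON field list (not real FAFSA fields).
fields_json = """[
    {"name": "student_last_name", "type": "string", "max": 16},
    {"name": "student_dob", "type": "date"},
    {"name": "adjusted_gross_income", "type": "int"}
]"""

# Map each abstract field type to a small, precise MySQL datatype.
TYPE_MAP = {
    "string": lambda f: "VARCHAR(%d)" % f.get("max", 255),
    "int": lambda f: "INT",
    "date": lambda f: "DATE",
}

def create_table_sql(table, fields):
    """Generate a CREATE TABLE script from a JSON field list."""
    cols = ["  %s %s" % (f["name"], TYPE_MAP[f["type"]](f))
            for f in fields]
    return "CREATE TABLE %s (\n%s\n);" % (table, ",\n".join(cols))

print(create_table_sql("fafsa", json.loads(fields_json)))
```

Keeping the type mapping in one place makes it easy to keep tightening datatypes as the field list evolves, without re-writing the table script by hand.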


Automatic REST API for Databases As Complete Amazon Machine Image

SlashDB, aka /db, has recently been added to the Amazon Marketplace, providing a complete database-to-API solution as an Amazon Machine Image (AMI).

Companies can use the /db Amazon image to automatically generate REST APIs from common relational databases like Microsoft SQL Server, Oracle, MySQL, and PostgreSQL, including databases hosted on Amazon RDS.

/db charges based upon the number of databases you launch and the number of users using the API for each database, and you will also pay the regular charges for any Amazon EC2 instances.

I like the idea of building API solutions and launching as Amazon Machine Images. I think ready-to-go platform solutions like this for AWS, Google, Heroku and other top cloud platforms are good for potential API providers.


Take That Data Dump Access To Your Organization's Database And Build Your API

I have a long, winding history of database administration in my past. I've been managing databases since my first job working on school district databases in the State of Oregon in 1988. So I've been that database administrator you are asking for data access from, and I've personally managed and interacted with too many of these DBAs to keep track.

I was discussing strategies for getting access to your organization's central database today, and realized I don't have enough of these data acquisition stories on API Evangelist.

This data access quest will start with a burning need to build a website or mobile app, or to provide a 3rd party partner access to some data. You know that this data is in your organization's database; all you need is access to make your project come to life.

In many organizations this can be easier said than done. There could be numerous reasons why this will be difficult, ranging from technical to political, and be fully prepared for it to often be political, masked as technical.

Database access bottlenecks can range from security concerns to cranky, unfriendly DBAs, or possibly just that the central database system has been in operation for many years and cannot handle a large number of additional requests.

Unwinding the politics of database access can often be more time consuming than the technical side of database access. In many organizations your time is better spent just getting what you need and moving on. Make it only about access, and not looking to change the legacy process.

This legacy is why the most common way of getting at your organization's data will be a data dump to a network or FTP location, in a CSV, XML, or JSON format. If your database administrator offers this to you, take it. You can make use of it and design the application you need, without the headache of battling to change things or getting direct access.

A data dump is a great way for DBAs to give you the data you requested while offloading the processing and responsibility to someone else. So take it, if it will meet your minimal requirements.

Once you get your data dump, build your application's datastore, API, and site or application. Then at a later date, you can take your blueprint and sell it back to your database administrator, IT, or further up the management chain, showing how things can be done better.
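As a concrete sketch of that first step, loading a CSV data dump into a local datastore might look something like this. SQLite stands in here for whatever backend your application actually uses, and the column handling is deliberately naive:

```python
import csv
import io
import sqlite3

# Load a CSV data dump into a local SQLite datastore, the first step
# toward building your own API on top of it. Column names come from
# the CSV header; everything is stored as TEXT for simplicity.
def load_dump(csv_text, table, conn):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    cols = list(rows[0].keys())
    conn.execute("CREATE TABLE %s (%s)" % (table, ", ".join(cols)))
    conn.executemany(
        "INSERT INTO %s VALUES (%s)" % (table, ", ".join("?" for _ in cols)),
        [tuple(r[c] for c in cols) for r in rows],
    )
    return len(rows)
```

From there, the API layer is just queries against this local copy, with no dependency on the central database or its DBA.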

While a data dump may not be optimal, it might be all you can get for now. A great deal of education of IT and database folks is needed, to help them understand that APIs are not all publicly available like Twitter, and that they actually provide much more granular security and access control than traditional IT access levels. But many years of education must occur before we will change the tides.

So take that data dump, build your own API and application and make evolving IT a separate battle from your project goals.


The Resource Stack

I've been organizing much of my research around APIs into groupings that I call "stacks". The term allows me to loosely bundle common API resources into meaningful "stacks" for my readers to learn about.

I'm adding a new project to my list of 30+ stacks, one intended to bring together the most commonly used API resources into a single, meaningful stack that any web or mobile developer can quickly put to use.

So far I have compiled the following APIs in 29 separate groups:

  • Compute
    • Amazon EC2
    • Google AppEngine
    • Heroku
  • Storage
    • Amazon S3
    • Dropbox
    • Rackspace Cloud Files
  • Database
    • Amazon RDS
    • Amazon SimpleDB
  • DNS
    • Amazon Route 53
    • Rackspace Cloud DNS
    • DNS Made Easy
    • DNSimple
  • Email
    • SendGrid
    • Amazon SES
    • Rackspace Email
  • SMS
    • Twilio
    • AT&T SMS
  • MMS
    • Mogreet
    • AT&T MMS
  • Push Notifications
    • Urban Airship
    • AT&T SMS
  • Chat
    • Skype
    • Facebook Chat
    • Google Talk
  • Social
    • Twitter
    • Facebook
    • Google+
    • LinkedIn
  • Location
    • Google Directions
    • Google Distance Matrix
    • Google Geocoding
    • Google Latitude
    • Geoloqi
  • Photos
    • Flickr
    • Facebook
    • Instagram
  • Documents
    • Box
    • Google Drive
  • Videos
    • YouTube
    • Flickr
    • Facebook
    • Viddler
    • Vimeo
    • Instagram
  • Audio
    • SoundCloud
    • Mixcloud
  • Music
    • Echo Nest
    • Rdio
    • Mixcloud
  • Notes
    • Evernote
  • Bookmarks
    • Delicious
    • Pinboard
  • Blog
    • Wordpress
    • Blogger
    • Tumblr
  • Content
    • ConvertAPI
    • AlchemyAPI
  • Contacts
    • Google
    • Facebook
    • LinkedIn
    • FullContact
  • Businesses / Places
    • Factual
    • Google Places
  • Checkins
    • Foursquare
    • Facebook
  • Calendar
    • Google
  • Payments
    • Dwolla
    • Stripe
    • Braintree
    • Paypal
    • Google Payments
  • Analytics
    • Google
    • Mixpanel
  • Advertising
    • Adsense
    • Adwords
    • Facebook
    • Twitter
    • AdMob
    • MobClix
    • InMobi
  • Real-time
    • Google Real-time
    • Firebase
    • Pusher
  • URL Shortener
    • Bit.ly
    • Google URL Shortener

This is just a start. I will publish a full stack, complete with logos, descriptions, and links. For now I'm just fleshing out my thoughts regarding some of the top resources that are currently available for developers.

I will be making another pass over the APIs I track on in the coming weeks, as well as adding to the list each week as part of my monitoring.

If you see anything missing, that should be in there...let me know!


I Want An API

I feel we have done a good job explaining what an API is and why people need APIs, and providing services to manage APIs, but we are falling short on delivering information, tools, and services for deploying APIs.

First I want to acknowledge that many companies have established IT resources, in the form of technology talent, and internal systems they can put to use when deploying APIs. This post isn’t for you. This is about individual problem owners looking to deploy APIs, who most likely have little or no access to traditional IT resources for a variety of reasons.

You want to deploy an API because you have some data, content, or other resources you need to make accessible to partners or the public, allowing for integration into other systems, use in the development of web or mobile applications, or possibly even data analysis and visualizations.

Machine Readable

As much as I love APIs, I suggest everyone consider your overall objective behind launching an API. You may not need an API, you may just need to make your information machine readable and available on the open Internet. Technically, this isn’t an API, but publishing your data or content in a machine readable format like XML or JSON can accomplish many of the same goals of an API, without the overhead of an API.

Many tools you use every day, like Microsoft Office, Google Apps, and Salesforce, provide tools for publishing data and content in machine readable formats like CSV, XML, or JSON, much like publishing or printing as a PDF. Not everyone needs an API, but everyone needs to understand the how-to, and the benefits, of making machine readable a default in all business, organization, government, and even individual processes.

PDF made data and content portable, for humans. But a PDF does very little good for other machines, websites and mobile apps. CSV, XML and JSON make data and content portable, while also making them machine readable. Please consider this simple evolution, while planning your API.

Machine readable will go far to increase efficiencies, interoperability, transparency, and innovation within many existing government, organizational, and business operations. By providing data, content, and other resources in CSV, XML, and JSON, they can be easily used across mobile, web, and data implementations. However, in other situations there will be a higher need for more advanced interaction with data and content, in the form of querying, filtering, and transformations, as well as the need to secure resources, limiting who has access and to what scale. This is where we start moving beyond just machine readable, and into truly needing an API.
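As a small illustration of how low the bar for machine readable can be, converting a CSV export into JSON takes only a few lines. The columns in the example are made up:

```python
import csv
import io
import json

# Convert a CSV export into machine readable JSON -- often all that
# is needed before reaching for a full API.
def csv_to_json(csv_text):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows, indent=2)
```

Publish the resulting file at a public URL and you have accomplished many of the goals of an API, without any of the overhead.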

When it comes to API deployment I would put users into two distinct groups:

  • Technical - You have skills and resources to architect, design, develop and deploy infrastructure, code to meet your API needs
  • Non-Technical - You do not have the resources to assemble and execute an API strategy, making a cloud service an optimal route

APIs are about making critical assets and resources available to technical and non-technical groups. APIs are often about circumventing some of the ways technology bottlenecks can slow us down. It is important that there is not just a wealth of cloud services for non-technical folks to deploy APIs, but also a buffet of openly licensed, downloadable tools that anyone can put to use when deploying an API.

Technical

You have some technical resources and the know how (or willingness to learn), and you want an API. Ideally there would be open source tools available for you to download and implement:

  • CSV to API 
  • Database to API
  • Developer Portal 
    • Docs 
    • Code 
    • Support 
  • Registration
  • Authentication
  • Key Management 
  • App Management
  • Billing

These tools would allow you to assemble your own API strategy in the language and platform of your choice. There should be educational materials helping you understand how to implement individually or collectively as a single platform.
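As a sketch of what the "Database to API" item on that list might look like at its absolute simplest, here is a toy handler that maps a path like /people or /people/2 onto a database table. The routing convention is my own assumption, not any particular product's:

```python
import json
import sqlite3

# Toy "Database to API" handler: map a path like "/people" or
# "/people/2" onto a table and return JSON. A real tool would add
# registration, authentication, and key management, and would
# validate the table name instead of trusting the raw path.
def handle(conn, path):
    parts = [p for p in path.split("/") if p]
    table = parts[0]
    sql, args = "SELECT * FROM %s" % table, ()
    if len(parts) == 2:
        record_id = int(parts[1]) if parts[1].isdigit() else parts[1]
        sql, args = sql + " WHERE id = ?", (record_id,)
    cur = conn.execute(sql, args)
    cols = [d[0] for d in cur.description]
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])
```

Wrap this in any HTTP server and you have the skeleton of a database-to-API tool; the remaining items on the list above (docs, registration, billing) are where the real work lives.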

Non-Technical

You do not have the technical resources or time to implement your own API strategy. You need a dead simple software-as-a-service implementation. You can see examples of this with the latest breed of API service providers: Emergent One, API Spark, and API Engine.

We are just seeing this new breed of API service providers rise to meet the demand of not just API management, but now also API deployment, making API deployment accessible to anyone, not just developers or companies with the necessary resources. This will bring the benefits of APIs to everyday problem owners, who are often restricted by IT bottlenecks or an entire lack of IT resources.

Making data and content from a spreadsheet or existing database, machine readable by default or deployed as an API, allowing for more sophisticated interactions, should be as easy as deploying Wordpress for your blog. You should be able to go to a service provider and get exactly the hosting, deployment and management options you desire, or you should be able to download and install on your own infrastructure, giving you the control you desire over storage, hosting and access.

I’m adding a new project area to API Evangelist called I Want An API, where I’ll be working to help define this space, and bring together other people to assist in developing the resources and tools we will all need to easily launch an API.


From ETL to API Reciprocity, Looking at 20 Service Providers

I spent time this week looking at 20 of what I'm calling API reciprocity providers, who are delivering a new generation of what is historically known as ETL in the enterprise: connecting, transferring, transforming, and pushing data and content between the cloud services we are increasingly growing dependent on.

With more and more of our lives existing in the cloud and via mobile devices, the need to migrate data and content between services will only grow more urgent. While ETL has all the necessary tools to accomplish the job, the cloud democratized IT resources, and the same will happen to ETL, making these tools accessible to the masses.

There are quite a few ETL solutions, but I feel there are 3 solutions that are starting to make a migration towards an easier to understand and implement vision of ETL:


These providers are more robust, and provide much of the classic ETL tooling the enterprise is used to, but also have a new emphasis on API-driven services. But there are 10 new service providers, which I'm calling reciprocity platforms, that demonstrate the potential by offering very simple tasks, triggers, and actions that provide interaction between two or more API services:

I consider reciprocity an evolution of ETL, because of three significant approaches:

  • Simplicity - Simple, meaningful connections with transfers and transformations that are meaningful to end users, not just a wide array of ETL building blocks an IT architect has to implement
  • API - Reciprocity platforms expose meaningful connections users have with the cloud services they depend on. While you can still migrate from databases or file locations as with classic ETL, reciprocity platforms focus on APIs, while maintaining the value for end users as well as the originating or target platforms
  • Value - Reciprocity focuses not just on transmitting data and content, but on identifying the value of the payload itself, and the relationships and emotions in play between users and the platforms they depend on

This new generation of ETL providers began the migration online with Yahoo Pipes, which resonated with the alpha developers looking to harvest, migrate, merge, mashup, and push data from RSS, XML, JSON, and other popular API sources--except Yahoo lacked the simplicity necessary for wider audience appeal.

While I feel the 10 reciprocity providers listed above represent this new wave, there are six other incumbents trying to solve the same problem:

While studying the approach of these 20 reciprocity providers, it can be tough to identify a set of common terms to refer to the value created. Each provider has their own approach and their own terminology. For my own understanding, I wanted to try and establish a common way to describe how reciprocity providers are redefining ETL. While imperfect, it gives me a common language to use, while also being a constant work in progress.

For most reciprocity providers, it starts with some encompassing wrapper, in the form of an assembly, which describes the overall recipe or formula that contains all the moving ETL parts.

Within this assembly you can execute workflows, usually in a single flow, but with some of the providers you can daisy-chain together multiple (or endless) workflows to create a complex series of processes.

Each workflow has a defining trigger, which determines the criteria that will start the workflow, such as a new RSS post or a new tweet. With each trigger comes a resulting action, which is the target of the workflow: publishing the RSS post to a syndicated blog, adding the tweet to a Google Spreadsheet or Evernote, or any other combination of trigger and action a user desires.
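Sketched as code, the assembly / workflow / trigger / action model described above might look like this. The class names are my own shorthand, since every provider uses its own terminology:

```python
# A sketch of the assembly / workflow / trigger / action model.
class Workflow:
    def __init__(self, trigger, action):
        self.trigger = trigger  # predicate: should this event start the workflow?
        self.action = action    # callable: what to do with the event

    def run(self, event):
        if self.trigger(event):
            return self.action(event)

class Assembly:
    """The encompassing recipe: workflows daisy-chained in order."""
    def __init__(self, workflows):
        self.workflows = workflows

    def run(self, event):
        results = []
        for wf in self.workflows:
            out = wf.run(event)
            if out is not None:
                results.append(out)
                event = out  # daisy chain: feed the output to the next workflow
        return results
```

A "new RSS post publishes to a blog" recipe is then just an Assembly containing one Workflow whose trigger matches RSS events and whose action does the publishing.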

Triggers and actions represent the emotional connections that are the underpinnings of ETL's evolution into a more meaningful reciprocation of value that is emerging in the clouds. These new providers connect to the classic lineup of ETL interfaces to get things done:

  • Databases
  • Files
  • Messaging
  • Web Service

While also providing the opportunity to develop open connectors for any custom database, file, messaging, or web service. But these connectors are not described in boring IT terms; they are wrapped in the emotion and meaning derived from the cloud service--which could have different meanings for different users. This is where one part of the promise of reciprocity comes into play: empowering average problem owners and everyday users to define and execute against these types of API-driven agreements.

All of these actions, tasks, formulas, jobs, or other types of processes require the ability to plan, execute, and audit, with providers offering:

  • Scheduling
  • History / Logging
  • Monitoring

With data being the lifeblood of many of these efforts, of course we will see "big data" specific tools as well:

  • Synchronization
  • Data Quality
  • Big Data
  • Analytics

While many reciprocity providers offer interoperability between two specific services, moving data and resources from point A to point B, others are bringing in classic ETL transformations:

  • Reformat
  • Aggregate
  • Sort
  • Dedupe
  • Filter
  • Partition
  • Merge
  • Join
  • Split
  • Convert
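A few of these transformations, sketched as plain functions over lists of records, just to make the vocabulary concrete:

```python
# Classic ETL transformations over lists of dicts (records).

def dedupe(rows, key):
    """Drop records whose key has already been seen."""
    seen, out = set(), []
    for r in rows:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

def filter_rows(rows, predicate):
    """Keep only records matching the predicate."""
    return [r for r in rows if predicate(r)]

def sort_rows(rows, key):
    """Order records by a field."""
    return sorted(rows, key=lambda r: r[key])

def merge(rows_a, rows_b, key):
    """Join two record sets on a shared key."""
    by_key = {r[key]: r for r in rows_b}
    return [{**r, **by_key.get(r[key], {})} for r in rows_a]
```

The difference with reciprocity platforms is that these operations are wrapped in end-user terms ("only new tweets", "combine with my contacts") rather than exposed as raw building blocks.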

After the trigger and before the action, there is also an opportunity for other things to happen, with providers offering:

  • Push
  • Events

During the trigger, action, or transformation there are plenty of opportunities for custom scripting and transformations, with several approaches to custom programming:

  • Custom Scripts
  • JavaScript
  • Command Line
  • API

In some cases the reciprocity provider also provides a key-value store, allowing the storage of user-specified data extracted from trigger or action connections, or during the transformation process, introducing a kind of memory store during the reciprocal cycle.

With the migration of critical resources, many of the leading providers are offering tools for testing the process before live execution:

  • Test
  • Debugger
  • Sandbox
  • Production

With any number of tasks or jobs in motion, users will need to understand whether the whole apparatus is working, with platforms offering tools for:

  • Performance
  • Monitoring
  • Optimization

While there are a couple of providers offering completely open source solutions, there are also several providing OEM or white label solutions, which allow you to deploy a reciprocity platform for your partners, clients, or other situations that require it to be branded in a custom way.

One area that will continue to push ETL into this new category of reciprocity providers is security. Connectors will often use OAuth, respecting a user's already established relationship with a platform on either the trigger or action side, ensuring their existing relationship is upheld. Beyond this, providers are offering SSL for secure transmissions, but in the near future we will see other layers emerge to keep agreements intact and private, and to maintain the value of not just the payload but the relationships between platforms, users, and reciprocity providers.

Even though reciprocity providers focus on the migration of resources in this new API-driven, cloud-based world, several of them still offer options for deployment in both environments:

  • Cloud
  • On-Premise

There is not one approach, either in the cloud or on-premise, that will work for everyone and all their needs. Some data will be perfectly fine moving around the cloud, while other data will require a more sensitive on-premise approach. It will be up to problem owners to decide.

Many of this new breed of providers are in beta, and pricing isn't available. A handful have begun to apply cloud-based pricing models, but most are still trying to understand the value of this new service and what the market will bear. So far I'm seeing pricing based upon:

  • Seat
  • Assembly
  • Tasks
  • Connections
  • Extension
  • Sync
  • Support
  • Training

Much like IaaS, PaaS, SaaS, and now BaaS, reciprocity providers will have a lot of education and communication to do with end users before they'll fully understand what they can charge for their services--forcing them to continue to define and differentiate themselves in 2013.

One of the most important evolutionary areas, I’m only seeing with one or two providers, is a marketplace where reciprocity platform users can browse and search for assemblies, connectors and tasks that are created by 3rd party providers for specific reciprocity approaches. A marketplace will prove to be how reciprocity platforms serve the long tail and niches that will exist within the next generation of ETL. Marketplaces will provide a way for developers to build solutions that meet specific needs, allowing them to monetize their skills and domain expertise, while also bringing in revenue to platform owners.

I understand this is a lot of information. If you are still reading this, you most likely either already understand this space, or, like me, feel it is an important area to understand and help educate people about. Just like with API service providers and BaaS, I will continue to write about my research here, while providing more refined materials as GitHub repos for each research area.

Let me know anything I'm missing or your opinions on the concept of API reciprocity.


AWS API Reference Architecture

I was checking out the updates to the AWS Reference Architecture, where they provide blueprints for how you can use AWS. In this version AWS provides an e-commerce architecture reference--providing a system overview, a detailed architectural diagram, and a list of the AWS services used in the architectural approach.

The AWS e-commerce architecture reference provides three separate areas:

  • Web Frontend - This is a reference architecture for the web frontend of an e-commerce site. It makes use of Route 53, CloudFront, Elastic Beanstalk, S3, ElastiCache, DynamoDB and CloudSearch
  • Checkout Pipeline - This is a reference architecture for a secure and highly available checkout pipeline service for an e-commerce site. It uses the Virtual Private Cloud, the Simple Workflow Service, Elastic Beanstalk, the Relational Database Service, and the Simple Email Service
  • Marketing & Recommendations Service - This is a reference architecture for a marketing and recommendation service for an e-commerce site. It uses Elastic MapReduce, S3, Elastic Beanstalk, the Relational Database Service, DynamoDB, and the Simple Email Service

Beyond providing the detailed reference card, AWS provides more information on a three-day AWS Architecture Training course that will be held in numerous U.S. cities this spring.

I will add API architecture references as a building block for API owners to consider when planning and managing an API. If you can provide clear blueprints for developers to follow, you can increase the chances developers will be successful with integrating an API into their applications or systems.


Which BaaS Pricing Model Is Better?

I've been processing a conversation over at Branch that was triggered by a story in TechCrunch by Sarah Perez (@sarahintampa) called, "StackMob Ratchets Up The Competition: Makes API Calls Free, Launches A Marketplace For Third-Party Mobile Services".

There are several layers to the conversation; the part I'm focusing on is how BaaS providers are structuring their pricing, on which Sarah Perez writes:

StackMob is now making API calls free. This latter news will impact the competitive landscape, which includes startups like Parse, Kinvey and others, all of which have traditionally charged developers as API calls increase alongside an app’s popularity.

Making the argument that:

Today, developers have to have an understanding of how many API calls they make, and if an app becomes really successful, developers are penalized for that with increased pricing.

Sarah quotes StackMob CEO Ty Amell, saying that:

“this isn’t really all that fair, explaining that it doesn’t matter to StackMob how many API calls the app makes – the cost to them doesn’t really go up. “It doesn’t matter to us whether someone’s making a million API calls or 20 million API calls – the cost is fairly manageable on our parts. So we felt like this was a blocker to innovation on the developer’s side,” he explains. “And we feel like API calls are starting to become a commodity, where it’s really how can you provide value beyond that?”

The conversation immediately begins on Twitter, with:

  • Steve Gershnik (@StackMobSteve) of StackMob - @stackmob doesn't charge for 'active users' either. No success tax for you.
  • Joe Chernov (@jchernov) of Kinvey - but you charge for features. In the end, everyone charges for something. It's business!

After this, the conversation moves to Branch. I’m not including full comments, because there is a lot of banter back and forth, discussing various claims BaaS providers make, that I feel are irrelevant to the more important, pricing story.

The conversation continues:

Joe Chernov (@jchernov) of Kinvey:
If a developer prefers to pay by number of users, our model may work for them; if someone would rather buy features, then StackMob may make more sense. It's all good. Everyone has their own twist on pricing. Vendors should strive to highlight their own value without misrepresenting others'. Fair?
Steve Gershnik (StackMob):
We don't charge per user. We don't charge per API call. We think those are outdated and archaic pricing models.
Miko Matsumura (@mikojava) of Kii:
I respect what @StackMob is doing but in looking at their pricing page, isn't it just segmenting everyone into a free tier and a "custom" Enterprise tier?
Steve Willmott(@njyx) of 3Scale:
Which model is best in our experience depends a lot on what type of market you're serving - individual developers tend to like by the drop because (start cheaply and scale). Large companies tend to like predictability so they can budget properly.
Box.net is a storage company - yet, for all their enterprise accounts storage is unlimited. Variable scared their customers. Instead they charge by number of seats/users - something their enterprise customers CAN predict.
Thinking more abstractly: If a developer is successful, there will be some sort of increase in volume. Unless the PAAS service is truly magical that means extra cost for the PAAS provider.
You can call this a success tax, but in my view what matters is not if there is more cost to the developer but if the unit cost goes down over time. I.e. As I succeed, I'm happy to pay you a bit more as long as you are not getting an increasing %age of the pie (and ideally if the %age is decreasing). If that %age increases I'll most likely ditch you for someone who offers me better terms at that point.
QuickBlox(@QuickBlox):
Our pricing model is per API calls as well, we do it like that because it is clear for developers (and for our business model), also allows to create apps and prototypes that, in case they are not successful, the developers won't be charged. according to @jchernov

Only StackMob, Kinvey, Kii, and QuickBlox are represented in the conversation on Branch, so I wanted to see how each of the 20 BaaS providers I'm tracking on structure their pricing:

I couldn't find pricing for five of the BaaS providers, because they are in beta, but pricing by API call is definitely the preferred way for the rest, with 11 of the 15 using this approach. The others charge based upon a mix of the number of users, number of apps, data and file storage, push messages, and email messages, with StackMob going with pricing based upon features, add-ons, or premium services, which can also come from 3rd party services via the StackMob marketplace.

Personally, I don't think there is one approach to API or BaaS pricing. It seems to me that different models will work for different industries, companies, and even at the application level. But I wanted to document this conversation for my larger research project on the BaaS space--maybe I will have more opinions after I dive in deeper.

Let me know what you think about BaaS pricing.


Digital Strategy: 20 Federal Agencies, 76 data API and 75 Mobile API Initiatives

It has been a while since I provided an update on the White House Digital Strategy. I monitor the progress of federal agencies' participation programmatically, using the JSON reports published by each agency at its own domain, at /digitalstrategy.

After running the script today, I see 20 federal agencies with active footprints. There are about five more, but there are issues with the JSON versions of their digital strategies--I really want to focus on the programmatic value. So, across these 20 agencies I find 76 data API initiatives and 75 mobile API projects.
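The tallying itself is simple once the JSON is fetched. The report structure below (an "items" list keyed by milestone ID) is a simplified assumption for illustration, not the exact /digitalstrategy schema:

```python
import json

# Tally data (2.2) and mobile (7.2) API initiatives from an agency's
# digital strategy JSON report. The "items"/"systems"/"services"
# field names are hypothetical stand-ins for the real schema.
def count_initiatives(report_json):
    report = json.loads(report_json)
    counts = {"data": 0, "mobile": 0}
    for item in report.get("items", []):
        if item.get("id") == "2.2":
            counts["data"] += len(item.get("systems", []))
        elif item.get("id") == "7.2":
            counts["mobile"] += len(item.get("services", []))
    return counts
```

Running a loop like this over each agency's report is what produces the 76 data and 75 mobile totals above.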

To bring you up to speed, there are two specific milestones in the Digital Government Strategy that specifically address API deployment:

  • 2.2 (Data) - Make high-value data and content in at least two existing major customer-facing systems available through web APIs, apply metadata tagging, and publish a plan to transition additional high-value systems
  • 7.2 (Mobile) - Optimize at least two existing priority customer-facing services for mobile use and publish a plan for improving additional existing services

To help shed light on where these progressive agencies are going with both their data and mobile initiatives, I wanted to break them out into separate groups for each agency.

National Science Foundation (NSF)

2.2 Data

  • Scientists and Engineers Statistical Data System (SESTAT) - The Scientists and Engineers Statistical Data System (SESTAT) provides longitudinal information on the education and employment of the college-educated U.S. science and engineering workforce, collected through three biennial surveys. These surveys capture trends in employment opportunities and salaries in various degree fields, and help federal researchers to evaluate the effectiveness of equal opportunity efforts.
  • NSF Award Information - Information pertaining to NSF awards from 1959 through the present. Data includes principal investigator, awardee institution, NSF program and associated NSF organizations, award amount, award dates, award abstract, and publications and conference proceedings produced as a result of the research.
  • NSF Research Grant Funding Rates - NSF funding rates for competitive research proposals by organizational unit. (Funding rates constitute the number of awards divided by the number of actions for a given year by organizational unit).
  • Science & Engineering Indicators - Science and Engineering Indicators (SEI) provide U.S. and international data on the following subjects: 1) Elementary and Secondary Mathematics and Science Education 2) Higher Education in Science and Engineering 3) Science and Engineering Labor Force 4) Research and Development: National Trends and International Comparisons 5) Academic Research and Development 6) Industry, Technology, and the Global Marketplace 7) Science and Technology: Public Attitudes and Understanding 8) State Indicators.
  • Project Outcome Reports - Section 7010 of the America COMPETES Act requires that researchers funded in whole or in part by NSF report on the outcomes of the funded research for the general public. Project Outcomes Reports describe the project outcomes or findings, addressing the intellectual merit and broader impacts of the work as defined in NSF merit review criteria.
  • Graduate Research Fellowship Program Awardees and Honorable Mention Recipients - Demographic information for recipients of NSF GRFP awards. The GRFP provides three years of graduate education support for individuals who have demonstrated the potential for significant achievements in science and engineering research

7.2. Mobile

  • News - NSF press releases and other agency news products
  • Funding - Catalog of NSF’s funding opportunities with links to Grants.gov System
  • Discoveries - Articles describing notable results from NSF funded research
  • Awards - Searchable database of NSF awards.
  • Staff Directory - Senior management list, organization list, browse-able and searchable staff list.
  • Events - Calendar of NSF events.
  • Vacancies - Current job vacancies at NSF.
  • Directions - Directions for visiting NSF headquarters in Ballston.

Department of State

2.2 Data

  • Bibliographical Metadata of the Foreign Relations of the United States Series - Raw bibliographical metadata for the nearly 500 official historical documentary volumes of U.S. foreign policy in the Foreign Relations of the United States published since 1861
  • aoprals.state.gov - Foreign Travel Per Diem rates
  • J1 Visa Exchange Visitor Program website - Provides accessible, plain-language information on the J1 visa Exchange Visitor Program
  • ForeignAssistance.gov - The goal of the Foreign Assistance Dashboard is to make all U.S. Government foreign assistance investments available in an accessible and easy-to-understand format.
  • Travel.State.gov - Provides an informational portal into Consular Affairs pages on international travel, passports, visa, international child abduction, and international law and policy.
  • U.S. Passport Issuance Data - U.S. Passports Issued per Fiscal Year (2010-1996); U.S. Passports Issued per Calendar Year (1995-1974); U.S. Passport Applications Received by Fiscal Year (1986-2010); Passport Issuance by State per Fiscal Year (2007-2010)

7.2. Mobile

  • Bibliographical Metadata of the Foreign Relations of the United States Series - Raw bibliographical metadata for the nearly 500 official historical documentary volumes of U.S. foreign policy in the Foreign Relations of the United States published since 1861
  • aoprals.state.gov - Foreign Travel Per Diem rates
  • Foreign Service Mobile App - The goal of this app is to serve as a learning tool to educate diverse university students (undergraduate and graduate), alumni and mid-career professionals about the various career opportunities in the Foreign Service and provide the information and resources to help them prepare for the selection and hiring processes.
  • Usembassy.gov (450+ websites under the usembassy/usconsulate.gov domain) - Provides public affairs/diplomacy information and critical citizen services (passports, visa, and business opportunities) for in-country American citizens.
  • J1 Visa Exchange Visitor Program website - Provides accessible, plain-language information on the J1 visa Exchange Visitor Program
  • ForeignAssistance.gov - The goal of the Foreign Assistance Dashboard is to make all U.S. Government foreign assistance investments available in an accessible and easy-to-understand format.
  • U.S. Passport Issuance Data - U.S. Passports Issued per Fiscal Year (2010-1996); U.S. Passports Issued per Calendar Year (1995-1974); U.S. Passport Applications Received by Fiscal Year (1986-2010); Passport Issuance by State per Fiscal Year (2007-2010)
Department of Housing and Urban Development (HUD)

2.2 Data

  • HUD User - HUD USER provides interested researchers with access to the original data sets generated by PD&R-sponsored data collection efforts, including the American Housing Survey, HUD median family income limits, as well as microdata from research initiatives on topics such as housing discrimination, the HUD-insured multifamily housing stock, and the public housing population.
  • Low Rent Apartment Search - The government gives funds directly to apartment owners, who lower the rents they charge low-income tenants. You can find low-rent apartments for senior citizens and people with disabilities, as well as for families and individuals.
  • Fair Market Rents - The Fair market rents (FMR) and Income limits data app will provide users the ability to easily obtain statistics for FMR and income limits by their present or other locations.
  • PD&R Edge - Provides access to PD&R's online magazine "The Edge". The Edge is an online magazine providing news, a message from the Assistant Secretary, and a wide range of information on housing and community development issues, regulations, and research, updated biweekly. The Edge is available on Apple iOS and Android powered smartphones. The PD&R Edge app directs users to the PD&R mobile webpage where up-to-date content can be accessed. The app allows users to share content on social media networks like Facebook and Twitter, and via email.
  • Public Housing (PHA) Contact - View contact information for Public Housing Agencies in your city and state
  • Enterprise GIS - Data is a key component to any location-based application. Enterprise GIS aims to remove data acquisition, integration, and maintenance obstacles for geo-developers by providing access to premium HUD data sets.

7.2. Mobile

  • Housing Counselor - HUD sponsors housing counseling agencies throughout the country that can provide advice on buying a home, renting, defaults, foreclosures, and credit issues. This app allows you to select a list of agencies for each state below. You may search more specifically for a reverse mortgage counselor or, if you are facing foreclosure, search for a foreclosure avoidance counselor.
  • File a Fair Housing Complaint - Federal law prohibits housing discrimination based on your race, color, national origin, religion, sex, familial status, or disability. By creating a mobile app for the HUD Form 903, FHEO will expand the ability to reach anyone that feels that their civil rights have been violated and have them submit a fair housing complaint easily and confidentially.
  • FHEO HUD.gov mobile adaptive web content - Expanding access to Fair Housing an
  • Housing Discrimination Investigative Checklist - Developing and updating the design of the FHEO investigative checklist that is used by FHEO investigative resources.
  • HUDMaps - HUD has developed a number of Geospatial Information Systems (GIS) that are currently available on the web and are listed at http://egis.hud.gov/. The HUDMap tool allows HUD Employees and Contractors to pull information from various internal and external sources to assist programmatic and disaster response projects. HUDMaps access is being expanded to provide for mobile device capabilities.
  • GMP Monitoring Exhibits Handbook - The CPD Monitoring Handbook includes all programs and technical functions for which CPD Field staff have monitoring responsibilities.
Federal Energy Regulatory Commission (FERC)

2.2 Data

  • Decisions and Notices - Provides a monthly collection of Delegated Orders, Notices, and Commission Decisions from Commission Meetings or Notational Voting arranged by date.
  • eTariff - Allows for tariffs, tariff revisions and rate change applications to be filed electronically in the manner prescribed by Order No. 714. The affected regulated entities are: • Public utilities and Power Marketing Administrations under Parts 35 and 300; • Natural gas pipelines under Parts 154 and 284; • Intrastate gas pipelines under Part 284; and • Oil pipelines under Part 341.
  • eLibrary - eLibrary is a records information system that contains: 1. Electronic versions of documents issued by FERC from 1989-Present; 2. Documents received and issued by FERC: a. A description of documents from 1981-Present; b. Microfilm and aperture cards of documents for 1981-1995; c. Scanned images of paper documents from 1995-Present; and d. Native files electronically submitted from November 2000-Present
  • eService - Provides users with official mailing list or service list for a docketed proceeding.
  • Electric Quarterly Reports (EQR) - Allows users to access data submitted by utilities and power marketers. Access is achieved through the following means: 1. Download Spreadsheets - Contract and transaction data by company by quarter in a structure similar to the one used for data import. 2. Summary Reports - Short summaries of each company's EQR filings (beginning Q1 2005) identifying the products they sell, the customers they sell to and the control areas where deliveries are made. 3. Filing Inquiries - EQR data can be retrieved using standard queries which can be customized by the user. 4. Selective Filings Download - Retrieves data on multiple companies and quarters with one request. Processed overnight and sent via email the next day. 5. Download Database - Provides for the download of the full EQR database. It should be used only by advanced users with several gigabytes of disk space.

7.2. Mobile

  • Decisions and Notices - Provides a monthly collection of Delegated Orders, Notices, and Commission Decisions from Commission Meetings or Notational Voting arranged by date.
  • What’s New - The “What’s New” RSS feed provides news and information details about the events at FERC.
  • eSubscription - Users subscribe or ‘sign up’ for specific dockets and are notified via email about future correspondence. Users have immediate access to the correspondence or documents in eLibrary.
  • eFiling - Allows users to electronically submit qualified documents to FERC in lieu of paper filings.
  • eRegistration - eRegistration provides the FERC customer an easy-to-use entry point to do business with all FERC Online applications. Think of eRegistration as a form of membership. By registering, the user will receive a single user id and password that allows them to transact all of their business with FERC. eRegistration is valuable to any person who transacts business with the FERC on behalf of themselves or another organization (e.g. companies or corporations). It provides authentication support to the FERC Online applications that ensures safe and secure transactions, thereby protecting the integrity of your data.
  • Electric Quarterly Reports (EQR) - Allows all public utilities and power marketers to file EQRs for the most recent calendar quarter. The filings must summarize contractual terms and conditions for: Market-based power sales, Cost-based power sales, and Transmission service.
National Archives and Records Administration (NARA)

2.2 Data

  • FederalRegister.gov - Integration of the Regulations.gov API into FederalRegister.gov and its API. This integration would provide greater access to public comments and supporting documents in Regulations.gov, and improve process for submitting public comments from FederalRegister.gov into Regulations.gov.
  • FederalRegister.gov API - Expand the FederalRegister.gov API to include the "Public Inspection Desk."
  • Code of Federal Regulations; Federal Register - Develop an API for FDsys through the Office of Federal Register-Government Printing Office Partnership
  • National Archives Catalog on Wikipedia - Make additional National Archives records available through Wikipedia, which is accessible through the MediaWiki API
  • National Archives Catalog on Flickr - Make additional National Archives records available on Flickr, which is accessible through the Flickr API
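As a sketch of what machine access to the FederalRegister.gov API described above might look like, the snippet below builds a document-search URL. The endpoint path and parameter names (`conditions[term]`, `per_page`) are assumptions for illustration based on the API's documented conventions, not guarantees from this document.

```python
from urllib.parse import urlencode

# Assumed endpoint: FederalRegister.gov serves document metadata as JSON.
BASE = "https://www.federalregister.gov/api/v1/documents.json"

def build_search_url(term, per_page=20):
    """Build a document-search URL for the FederalRegister.gov API.

    `conditions[term]` (full-text search) and `per_page` are assumed
    parameter names used here for illustration only.
    """
    query = urlencode({
        "conditions[term]": term,  # full-text search term
        "per_page": per_page,      # results per page
    })
    return f"{BASE}?{query}"

print(build_search_url("public inspection"))
```

Fetching the resulting URL with any HTTP client would return JSON metadata suitable for the kind of integration with Regulations.gov described above.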

7.2. Mobile

  • FederalRegister.gov - Mobile optimize FederalRegister.gov
  • Daily Compilation of Presidential Documents - Develop a mobile application based on the Daily Compilation of Presidential Documents
  • Archives.gov - Mobile optimize Archives.gov
  • National Archives Catalog on Wikipedia - Make additional National Archives records available through Wikipedia, which is mobile optimized and available through mobile apps
  • National Archives Catalog on Flickr - Make additional National Archives records available through Flickr, which is mobile optimized and available through mobile apps
Social Security Administration (SSA)

2.2 Data

  • SSA State Agency Monthly Workload Data - Monthly information from October 2000 onwards concerning claims for disability benefits that were referred for a disability determination to one of the 54 state agencies. The data may be used to examine disability application filing trends by time and by state, state agency workloads, and disability claims outcomes.
  • Average Wait Time Until Hearing Held Report - A presentation of the average time (in months) from the hearing request date until a hearing is held for claims pending in the Office of Disability.
  • Hearing Office Workload Data - A monthly presentation of four key workload indicators (pending, receipts, dispositions and average processing time) for each hearing office in the Office of Disability Adjudication and Review (ODAR).
  • Hearing Office Average Processing Time Ranking Report - A monthly ranking of the 165 ODAR hearing offices (including 3 satellite offices) by the average number of days until final disposition of the hearing request. The average shown will be a combined average for all cases completed in that hearing office. The public will be able to determine where a particular hearing office stands among the total with respect to this workload category.
  • SSA/Department of State Identity Verification Web Service - SSA is providing the DOS with information to verify the identity of passport applicants via a web service. This is being done based on a reimbursable agreement with DOS.

7.2. Mobile

  • Mobile Supplemental Security Income (SSI) Wage Reporting Application - SSI recipients will use this application to report their monthly wage amounts. This enables them to meet their monthly reporting obligations.
  • Mobile Optimized Frequently Asked Questions (FAQs) - People seeking information from SSA will use this mobile optimized FAQ website to obtain information about numerous topics.
  • Mobile Contact - People seeking to do business with SSA will use this application to get needed information and directions to their local SSA office. In addition they will have access to information about services available by phone and online.
  • Mobile Optimized Life Expectancy Calculator - The Life Expectancy Calculator is a valuable financial planning tool which we encourage the public to use to help decide when to retire and begin collecting Social Security benefits. The Calculator uses the gender and birth date entered by the user to provide the average number of additional years a person of the same gender and age can expect to live when he reaches a specific age. Providing a mobile optimized version of the Calculator also provides links to other helpful retirement planning tools and helps to promote other online services.
Department of Agriculture (USDA)

2.2 Data

  • National Farmers Market Directory - AMS-produced directory containing information about U.S. farmers market locations, directions, operating times, product offerings, and accepted forms of payment. Supports local and regional food systems, as well as development of local economies.
  • Office Information Profile System - USDA Service Centers are designed to be a single location where customers can access the services provided by the Farm Service Agency, Natural Resources Conservation Service, and the Rural Development agencies. This tool provides the address of a USDA Service Center and other Agency offices serving your area along with information on how to contact them.
  • ERS Charts of Note - Highlights from current and past ERS research, provided in daily Charts of Note and available via web APIs (Application Programming Interfaces, a system of machine-to-machine interactions over a network) and an embeddable widget.
  • Digital Asset Management - A DAM system (DAMS) is a combination of software, hardware and professional services that provides a central location for storing, managing and accessing digital assets - both the files and the accompanying metadata. Digital assets include images, graphics, logos, animations, audio/video clips, presentations, pages, documents and a number of other digital file formats.

7.2. Mobile

  • USDA Newsroom - The USDA Newsroom holds official news releases, statements, transcripts and speeches released by the Department.
  • USDA Blog - The Blog features content from all USDA agencies and features the latest news, events and features. The Blog also provides the public an opportunity to ask questions or share their thoughts about the latest issues.
  • AmberWaves eZine - Fully mobile-optimized online magazine, and magazine app for offline reading on tablets and other mobile devices
  • LDP/PCP Rates - Mobile Version of the Loan Deficiency Payment and Posted County Price rate system
  • Meat and Poultry Inspection Directory - The Meat, Poultry and Egg Product Inspection Directory is a listing of establishments that produce meat, poultry, and/or egg products regulated by USDA's Food Safety and Inspection Service (FSIS) pursuant to the Federal Meat Inspection Act, the Poultry Products Inspection Act, and the Egg Products Inspection Act.
Department of Education (ED)

2.2 Data

  • EDFacts - The purpose of EDFacts is to collect and report K-12 education performance data for use by policymakers and Department of Education program offices. With relevant, actionable data supplied by EDFacts, decision-makers can identify which programs are working, have insight into education progress at the state and district levels, identify gaps and best practices, and make sound education policy and budgetary decisions to improve education outcomes. http://www2.ed.gov/about/inits/ed/edfacts/index.html
  • State Education Data Profiles - Web-based search tool for statewide information in elementary/secondary education, postsecondary education and selected demographics for all states in the U.S. using a variety of NCES data sources including NAEP, IPEDS, and CCD data. http://nces.ed.gov/programs/stateprofiles/
  • College Navigator - A web-based tool for searching all colleges and universities in the United States. College Navigator consists primarily of the latest data from the Integrated Postsecondary Education Data System (IPEDS), the core postsecondary education data collection program for the National Center for Education Statistics. http://nces.ed.gov/collegenavigator/
  • Program Information Publication System (Part of Program Information on the Web) - One of ED's core purposes is to inform the public about funding opportunities (programs). Currently program information is available on the Web in HTML web pages. ED is developing a central database repository (Program Information Publication System or PIPS) with a web API. ED.gov will call on the PIPS web API to publish program information on the website and provide an online search tool. Customers will be able to find more consistent, up-to-date program information quickly and easily. Information for this initiative is limited to programs authorized and funded under federal law as well as other related efforts, and encompasses several ED offices.

7.2. Mobile

  • G5 Grants Management System - G5 supports the Agency's grant making business process and is a full lifecycle, end-to-end grants management system (from intake of applications, peer review, award, payment, and performance monitoring to final closeout of the grant award). https://www.g5.gov
  • StudentAid.gov - StudentAid.gov is the first step in a multi-phase project to provide consumers with a one-stop website where they can access federal student aid information, apply for federal aid, repay student loans and navigate the college decision-making process. Whether you're a student, a parent, a borrower in repayment, an educator or a professional engaged in influencing and informing students and borrowers, StudentAid.gov has useful information for you. The site, available in English and Spanish, combines content and interactive tools from several U.S. Department of Education websites and makes it easy for you to find the information you need. It also features videos and infographics to help answer the most frequently asked questions about financial aid accessible via smartphones and tablets. http://www.studentaid.gov
  • Homeroom Blog - Homeroom Blog is the official blog of the Department. The purpose of this blog is to facilitate an ongoing dialogue on education issues.
  • College Navigator - Web-based tool for searching all colleges and universities in the United States. College Navigator consists primarily of the latest data from the Integrated Postsecondary Education Data System (IPEDS), the core postsecondary education data collection program for the National Center for Education Statistics. http://nces.ed.gov/collegenavigator/
  • ED.gov - ED.gov is the Department's primary Internet portal and website. As such, it performs the following functions to ensure efficient and expanded public access and communication between the government and citizens: convey the Department's brand and key messages, provide relevant and timely information, hosts sites for ED offices and programs, and provide a unified entry point to other ED resources. http://www.ed.gov/
Department of Justice (DOJ)

2.2 Data

  • ATF Trace Data Report - A key component of the Bureau of Alcohol, Tobacco, Firearms and Explosives’ (ATF) enforcement mission is the tracing of firearms on behalf of thousands of Federal, State, local and foreign law enforcement agencies. Firearms trace data is critically important information developed by ATF. ATF has prepared the following state-by-state reports utilizing trace data which is intended to provide the public with insight into firearms recoveries.
  • National Sex Offender Website and Database - The Dru Sjodin National Sex Offender Public Website (NSOPW), coordinated by the U.S. Department of Justice, is a cooperative effort between jurisdictions hosting public sex offender registries (“Jurisdictions”) and the federal government and is offered free of charge to the public. These Jurisdictions include the 50 states, U.S. Territories, the District of Columbia, and participating tribes. The Website provides an advanced search tool that allows a user to submit a single national query to obtain information about sex offenders; a listing of public registry Web sites by state, territory, and tribe; and information on sexual abuse education and prevention.
  • FOIA.gov - FOIA.gov houses annual Freedom of Information Act (FOIA) data from all agencies subject to the FOIA. This data is collected annually and available to the public as PDF reports or in various machine-readable formats on FOIA.gov.
  • Uniform Crime Report - The Uniform Crime Reporting (UCR) Program was conceived in 1929 by the International Association of Chiefs of Police to meet a need for reliable, uniform crime statistics for the nation. In 1930, the FBI was tasked with collecting, publishing, and archiving those statistics. Today, several annual statistical publications are produced from data provided by nearly 17,000 law enforcement agencies across the United States.
  • National Crime Victimization Survey - NCVS is the Nation's primary source of information on criminal victimization. Each year, data are obtained from a nationally representative sample of about 40,000 households comprising nearly 75,000 persons on the frequency, characteristics and consequences of criminal victimization in the United States. Each household is interviewed twice during the year. The survey enables BJS to estimate the likelihood of victimization by rape, sexual assault, robbery, assault, theft, household burglary, and motor vehicle theft for the population as a whole as well as for segments of the population such as women, the elderly, members of various racial groups, city dwellers, or other groups. The NCVS provides the largest national forum for victims to describe the impact of crime and characteristics of violent offenders.

7.2. Mobile

  • Justice.gov - The main website of the Department of Justice
  • StopFraud.gov - The website of the President's Financial Fraud Task Force hosting information about the work of the task force as well as resources to about fraud - including prevention tips and where to report crimes if they occur.
  • Civil Rights Division Report a Violation Web Resources - The Civil Rights Division enforces civil rights laws in a wide variety of contexts. This resource directs individuals on how to submit a complaint or report of a potential civil rights violation.
  • Office on Violence Against Women Resource Map - A comprehensive list of national, state, local and tribal resources for victims of violence against women.
Department of Energy (DOE)

2.2 Data

  • ALTERNATIVE FUELING STATION LOCATION DATA - API - The underlying data used to populate the Alternative Fueling Station Locator tool is available via a Web service. This data set is considered the most trusted industry resource for location-based alternative fueling station data because of the long-standing collection process fostered by relationships with industry and fuel providers and its continuous and rigorous vetting process. Using this API, developers can access the data to build their own mobile apps, widgets, or tools. Developers may choose to mash up the station data set with numerous other data sets available on the Web to create useful products to fill industry-specific needs. http://developer.nrel.gov/doc/transportation
  • DOE GREEN ENERGY - This API data service provides green energy results from research and development conducted throughout the Department and by DOE-funded awards at universities. The service allows extraction of two data sets: green energy technical (approximately 40,000) reports and green energy patents (over 2,000). http://www.osti.gov/GreenEnergyXMLManual.pdf
  • Electricity API - The U.S. Energy Information Administration (EIA) collects, analyzes, and disseminates independent and impartial energy information to promote sound policymaking, efficient markets, and public understanding of energy and its interaction with the economy and the environment. The Electricity API project will make this dataset publicly available. Beta site: http://www.eia.gov/beta/api/
  • OpenEI - OpenEI continues to strive to be a global leader in open data for energy information, specifically analyses of renewable energy and energy efficiency. The platform is a wiki, similar to Wikipedia's, which many users are already familiar with, and includes an API. Users can view, edit, add, and download data for free. https://www.openei.org
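The alternative fueling station API listed above is the kind of web service a developer would query by location and fuel type. The sketch below assembles such a query; the path (`/api/alt-fuel-stations/v1.json`) and parameter names (`api_key`, `fuel_type`, `state`) are assumptions modeled on the developer.nrel.gov pattern, used here purely for illustration.

```python
from urllib.parse import urlencode

# Assumed endpoint for the NREL alternative fueling station data service.
NREL_BASE = "https://developer.nrel.gov/api/alt-fuel-stations/v1.json"

def station_query(api_key, fuel_type="ELEC", state=None):
    """Return a query URL for stations filtered by fuel type and state.

    Parameter names are illustrative assumptions; consult the
    developer.nrel.gov documentation for the authoritative interface.
    """
    params = {"api_key": api_key, "fuel_type": fuel_type}
    if state:
        params["state"] = state
    return f"{NREL_BASE}?{urlencode(params)}"

# Example: E85 stations in Colorado (hypothetical demo key).
print(station_query("DEMO_KEY", fuel_type="E85", state="CO"))
```

This is the mashup pattern the entry describes: developers combine the station dataset with other web data sources to build their own apps, widgets, or tools.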

7.2. Mobile

  • MOBILE SITE FOR JOBS AT THE ENERGY DEPARTMENT - The site features a searchable data pull from USAJobs of the currently available Energy Department job opportunities. It is updated daily, allowing prospective employees to search for the latest opportunities wherever and whenever they want. In addition to the job search capabilities, there are resources available to the public concerning student, veteran, and other office programs/opportunities.
  • ENERGY.GOV - The Energy Department’s main site, Energy.gov, and its contents are now available on the go. This allows users to access the Energy Department’s resources on a variety of mobile devices such as smartphones and tablets. The American public is becoming increasingly mobile, and the Energy Department is responding to this demand.
  • ENERGY CAREER GAME - This is a fun, fast-paced puzzle/strategy game intended to generate interest and recruit the next generation of individuals and entrepreneurs in the energy industry. In the game, the player is tasked with managing resource networks between cities, power plants, homes, and businesses to provide power to these buildings. The game will educate players about the unique challenges and economic opportunities faced in meeting America’s energy needs and transitioning to a clean energy future. It will give players a better understanding of the career opportunities in the energy sector as well as the Energy Department. The game is being developed for the Department free of charge by college students who are working with the Department’s Office of Human Capital. The game will also incorporate a plug-in that will allow players to post their scores to Facebook. Expected to be ready for beta release by October 2012.
National Aeronautics and Space Administration (NASA)

2.2 Data

  • NASA Data API - The data.nasa.gov API allows a machine-readable interface to return metadata from the site organized by category, tag, date, or search term. We’re hoping this allows new and creative visualizations of the data resources NASA provides to the public. Additionally, it is a learning experience for us as we work to expand transparency, participation, and collaboration at NASA through new uses of technology.
  • ISS Live API - Thousands of data points are downloaded every minute from the Station, and ISS Live! makes a broad set of that data open and accessible. Furthermore, ISS Live! will make an application programming interface (API) available as a web service for external developers to take ISS data and put it into their own websites and mobile applications.
  • ExoAPI - ExoAPI is an ongoing project that extends the accessibility of exoplanetary data by providing an easy-to-use RESTful API. ExoAPI was created during the NASA Space Apps Challenge by a team of three amazing geniuses who knew nothing about space before they started this...and still don't really. Currently the data is provided by http://exoplanetology.blogspot.com/, which in turn feeds the data from http://exoplanet.eu/. The ExoAPI team plans on extending the API to encompass a wider array of data sources and more interesting space data to reach as many programmers as possible and encourage an explosion of space-data-based mashups.
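To make concrete the kind of metadata the data.nasa.gov API returns (records organized by category, tag, date, or search term), here is a minimal sketch of filtering such records by tag. The sample record below is fabricated for illustration and is not an actual API response; field names (`posts`, `title`, `tags`) are assumptions.

```python
import json

# Fabricated sample of the metadata shape described for data.nasa.gov;
# real responses may use different field names.
sample = json.loads("""
{
  "posts": [
    {"title": "ISS Telemetry Snapshot", "categories": ["spacestation"],
     "tags": ["iss", "telemetry"], "date": "2012-06-01"},
    {"title": "Lunar Imagery Index", "categories": ["moon"],
     "tags": ["imagery"], "date": "2012-05-14"}
  ]
}
""")

def titles_tagged(records, tag):
    """Collect titles of metadata records carrying a given tag."""
    return [p["title"] for p in records["posts"] if tag in p.get("tags", [])]

print(titles_tagged(sample, "iss"))  # → ['ISS Telemetry Snapshot']
```

The same filter-by-tag approach applies whether the records come from the data.nasa.gov API, the ISS Live! feed, or ExoAPI, which is what makes creative visualizations and mashups of this metadata straightforward to build.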

7.2. Mobile

  • NASA Apps Store - Provide employees and contractors with access to agency data and systems on the go.
  • WebTADS - WebTADS Mobile is a lighter version of the desktop-based WebTADS developed to provide NASA Civil Servants with the convenience of recording time when they're not in the office or connected via VPN.
  • Visualization Explorer - NASA Visualization Explorer, the coolest way to get stories about advanced space-based research delivered right to your iPad. A direct connection to NASA’s extraordinary fleet of research spacecraft, this app presents cutting edge research stories in an engaging and exciting format.
  • people.nasa.gov - NASA Enterprise Directory (NED) Search
Nuclear Regulatory Commission (NRC)

2.2 Data

  • Reactor Operating Status Reports - A web-based system that allows the NRC to report on the operating status and power output of commercial power reactors located within the continental United States
  • Power Reactor Daily Event Reports - A web-based system that allows the NRC to report on the daily events and activities occurring at commercial power reactors located within the continental United States
  • Operating Reactor Inspection Reports - A web-based system that allows the NRC to report on observations and findings of inspections occurring at commercial power reactors located within the continental United States
  •   - A web-based system that allows the NRC to report on licensed facilities, activities, or basic components that fail to comply with NRC regulations
  • Part 21 (Component Defect Reports) - A web-based system that allows the NRC to communicate with the public daily on our regulatory activities and events

7.2. Mobile

  • Public Meeting Feedback System - A web-based system that allows the NRC to collect public meeting feedback from meeting attendees
  • Top 5 Most-Accessed Agency Web Pages - A web-based system that allows the NRC to communicate with the public on our regulatory activities and events
United States Agency for International Development (USAID)

2.2 Data

  • USAID Portfolio Map - Map depicts the locations of USAID-funded projects to better monitor development results, improve aid effectiveness and coordination, and enhance transparency and social accountability. The map depicts the total number of unique projects at the administrative one boundary level (i.e. region, state, province). For example, a Global Health project may operate in several provinces within a country, in which case it will be given a count of one in each of those provinces. A project count will not be given to a province if that project does not operate there. National projects are depicted across all administrative one boundaries to show the impact of where the work is taking place.
  • Famine Early Warning System Network (FEWS NET) - A USAID-funded food security and famine early warning system covering more than 30 of the most food insecure countries in the world
  • U.S. Overseas Loans and Grants (Greenbook) - These data are U.S economic and military assistance by country from 1946 to 2010. This is the authoritative data set of U.S. foreign assistance. The data set is used to report U.S foreign assistance to Congress as required by the Foreign Assistance Act, Section 634.

7.2. Mobile

  • Development Experience Clearinghouse - USAID's Development Experience Clearinghouse (DEC) is the largest online resource for USAID funded technical and program documentation, with over 141,700 documents available for electronic download. Search USAID's online database of agency-funded technical and program-related documents to download USAID documents in PDF format for free.
  • USAID Portfolio Map - The Portfolio Map is a mobile app for accessing information about the development work USAID is performing every day. The app will give mobile device users the ability to browse our portfolio for a subset of the countries in which USAID is working. The app will provide general country overviews at a glance and also will allow users to access more detailed information as needed.
Department of Defense (DOD)

2.2 Data

  • United States Army Corps of Engineers (USACE) Lockage Data - The Corps Locks website contains lock and vessel specific information derived from the United States Army Corps of Engineers Lock Performance Monitoring System (LPMS).
  • United States Army Corps of Engineers (USACE) Commodity Data - The USACE Commodity system provides commodities information to the Department of Agriculture on a weekly basis.

7.2. Mobile

  • TRICARE Website - Provides TRICARE Coverage Plan information to active and retired Uniformed Services and their families
  • Defense Commissary Agency (DeCA) Commissaries Locator - Provides users with the closest store's hours, directions, contact information, floor plans, etc.
  • Arlington National Cemetery Grave Site Locator - Enables veterans, family members and the public to locate gravesites; generate front and back photos of a headstone or monument; and receive directions to those locations.
Department of the Treasury

2.2 Data

  • Wholesale Securities Services (WSS) - Provides information on recent auction results (issue date, discount rate, investment rate, offer amount), upcoming auctions (auction date, issue date, offer amount), and savings bond rates.
  • Summary Debt Accounting Services - Gives users the ability to find the total public debt outstanding for a specific day or date range. The data components will be useful to feed the Federal Reserve System's FRED.
  • OCC Derivatives and Capital Markets Quarterly Reports - Each quarter, based on information from the Reports of Condition and Income (call reports) filed by all insured U.S. commercial banks and trust companies, as well as other published financial data, the Office of the Comptroller of the Currency prepares a report describing what the call report information discloses about banks' derivative activities

7.2. Mobile

  • IRS2Go (English) - A smartphone application that lets you interact with the IRS using your mobile device. IRS continues innovation with its award-winning mobile application, IRS2Go. The new version will have improved access to IRS’s online services and may include additional features such as “Pay your tax bill”, “Learn about your notice”, “Find a form”, “Find an eFile provider”, “Find a practitioner”, and “Find a VITA Site nearest you”.
Department of Commerce

2.2 Data

  • Export.gov API - An API that will allow developers access to Export.gov's events data
  • Census API - Additional economic and demographic data

7.2. Mobile

  • Commerce.gov - Ensuring mobile-optimization of Commerce.gov
  • Census mobile apps - Census will develop 2 additional mobile apps
Department of Health and Human Services (HHS)

2.2 Data

  • HealthCare Finder API (http://healthdata.gov/data/dataset/healthcare-finder-api) - All of the data used on the Finder.HealthCare.gov web application is available through the API. There are multiple collections of data available through the API including Public Options Data, Individual and Family Health Insurance Options Data, and Small Group Insurance Options Data. Visit http://www.hhs.gov/digitalstrategy/open-data/index.html to learn more about this API.
  • HealthData.gov Catalog API (http://healthdata.gov/catalog-api) - The HealthData.gov API is used to provide software developers with programmatic access to the contents of our data catalog. The API can be used to find recently added datasets, to search the catalog, to download the contents of the catalog for analysis, or to build a new data catalog tool. Visit http://www.healthdata.gov/catalog-api to learn more about this API.

7.2. Mobile

  • Medicare.gov (http://www.medicare.gov/) - Medicare.gov is the consumer website for Medicare beneficiaries, caregivers, and advocates. The implementation of responsive design on this site allows us to support traditional desktop PCs, tablets, and smartphones all from one URL and code base. Learn more about the redesign process here: http://www.hhs.gov/digitalstrategy/mobile/medicare-responsive-design.html We have made the responsive design code available to the public here: http://www.hhs.gov/digitalstrategy/blog/2012/10/medicare-assets.html
  • HHS Digital Strategy Site (http://www.hhs.gov/digitalstrategy) - HHS uses this website to engage the public, report progress on the implementation of the Digital Government Strategy, showcase digital strategy best practices, and test new technology and tools. Visit http://www.hhs.gov/digitalstrategy/ to learn more.
Department of Transportation (DOT)

2.2 Data

  • Federal Railroad Administration 10 Year Accident Reports - The 10-Year Accident/Incident Report, one of the most requested reports on the Federal Railroad Administration’s website, provides historical statistics on rail-related train accidents, injuries and fatalities, highway-rail crossing collisions and operational information. We will create an API to render all this information in a user-interactive dashboard program.
  • SaferCar.gov Recall and Complaint Data - The National Highway Transportation Safety Administration will make vehicle safety data available in API format. Specifically, we will create an API with safety-related complaints about motor vehicles and motor vehicle equipment, including vehicle make, model and year. We will also create an API with recall data for vehicle products that have safety-related defects or do not comply with federal motor vehicle safety standards.

7.2. Mobile

  • SaferCar App - DOT will create a user-friendly iOS smartphone product to streamline access to information on SaferCar.gov, including vehicle 5-star safety ratings; vehicle defects; and the Child Safety Seat Locator. The app will also allow consumers to search and submit vehicle complaints
  • Grade Crossing App - The Federal Railroad Administration is developing an application that will provide users with mobile access to Grade Crossing Information. Users will be able to visualize the rail grade crossings on a map interface, find their own location on the map, and view crossing specific information such as Accident Data and Inventory Reports from FRA’s safety data website.
Federal Communications Commission (FCC)

2.2 Data

  • Broadcast Public Inspection File - For decades, the public file for each station has been kept at the station’s main studio in paper form (or more recently in electronic form at some stations) and made available during normal business hours. Relying on the advantages of current technology, however, the Commission has now changed its rules to require almost all of this public file information for television stations to be posted online at this site.
  • License View - Spectrum is a national resource. License View provides information on over 3 million FCC issued licenses for use of the nation's airwaves and other purposes.

7.2. Mobile

  • fcc.gov - The entire fcc.gov experience will be a mobile ready application this fall.
  • National Broadband map - The National Broadband Map (NBM) is a searchable and interactive website that allows users to view broadband availability across every neighborhood in the United States. The NBM was created by the National Telecommunications and Information Administration (NTIA), in collaboration with the Federal Communications Commission (FCC), and in partnership with 50 states, five territories and the District of Columbia. The NBM is part of NTIA's State Broadband Initiative. The NBM is updated approximately every six months and was first published on February 17, 2011.
Office of Personnel Management (OPM)

2.2 Data

  • USAJOBS - Website to post and find federal jobs. By providing an API for USAJOBS, we will make it easier for third parties to (1) provide customized access to job postings (e.g., only those that would be of interest to members of a particular professional organization) or (2) combine job posting with other services or applications (e.g., discussion boards or other social media around federal jobs).
  • Federal Government Operating Status - Provides information on the operating status of federal agencies. Not only do federal employees need to know the operating status of their offices, but also many businesses and other organizations base their operating status decisions on ours. The API for the operating status will make it easier for those organizations and even other federal agencies to make full use of operating status data.

7.2. Mobile

  • USAJOBS - Website to post and find federal jobs. By keeping our popular iOS mobile app up-to-date and creating an Android version, we will better serve more people seeking federal employment.
  • Federal Operating Status - Provides information on the operating status of federal agencies. Not only do federal employees use this information whenever there is bad weather or an unexpected major event such as an earthquake, but local businesses and other organizations often make their own operating status decisions based on the federal government’s decision. Although OPM has never created an application for the operating status, one exists for iOS devices and is sold by a private entity for $0.99. We will make this freely available information also more freely accessible with a mobile application for both iOS and Android devices.


146 Planned APIs from 19 Federal Depts and Agencies as Part of their Digital Strategy

Three months after the White House CIO mandated that all federal departments and agencies have a digital strategy, we are getting closer to having access to some high-value APIs across almost twenty participating departments and agencies.

Up to now, it has been a lot of talk about strategy, without much detail on what will actually be deployed. Now that 20 digital strategies have been published, we can start to see some of the APIs the departments and agencies will deploy in the coming months.

Here is a list of "systems" that each participating department or agency will be deploying:

Department of Agriculture (USDA)
  • World Agricultural Supply and Demand Estimates
  • Meat, Poultry and Egg Product Inspection Directory
  • USDA Newsroom - Provides USDA's comprehensive forecasts of supply and demand for major U.S. and global crops and U.S. livestock. The report gathers information from a number of statistical reports published by USDA and other government agencies, and provides a frame
  • National Farmers Market Directory
  • USDA Blog - Agricultural Marketing Service-produced directory containing information about U.S. farmers market locations, directions, operating times, product offerings, and accepted forms of payment. Supports local and regional food systems, as well as developm
  • List of Disaster Counties
  • AmberWaves eZine - The Blog features content from all USDA agencies and features the latest news, events and features. The Blog also provides the public an opportunity to ask questions or share their thoughts about the latest issues.
  • Office Information Profile System
  • SNAP Retailer Locator information - As the agency's flagship publication, Amber Waves provides a window into ERS research through highly readable articles geared to educated but non-specialized audiences. Amber Waves covers important issues on U.S. markets & trade, diet & health, resou
  • Office Information Profile System
Department of Commerce
  • Export.gov API
  • Commerce.gov
  • Census API - Ensuring mobile-optimization of Commerce.gov
  • Census mobile apps
Department of Defense (DOD)
  • United States Army Corps of Engineers (USACE) Corps Locks Website
  • TRICARE Website
  • Defense Commissary Agency (DeCA) Commissaries Locator - Provides lock and vessel specific information derived from the USACE Lock Performance Monitoring System (LPMS). The information contained here represents hourly and daily snapshots of Freedom of Information Act (FOIA) data on U.S. flag vessels and fo
  • United States Army Corps of Engineers (USACE) Commodity Data
  • United States Army Corps of Engineers (USACE) Lockage Data - Provides users with the closest store's hours, directions, contact information, floor plans, etc.
  • Defense Finance and Accounting Service (DFAS) myPay system
Department of Education (ED)
  • EDFacts
  • G5 Grants Management System
  • StudentAid.gov - The purpose of EDFacts is to collect and report K-12 education performance data for use by policymakers and Department of Education program offices. With relevant, actionable data supplied by EDFacts, decision-makers can identify which programs are
  • State Education Data Profiles
  • College Navigator - StudentAid.gov is the first step in a multi-phase project to provide consumers with a one-stop website where they can access federal student aid information, apply for federal aid, repay student loans and navigate the college decision-making process.
  • Homeroom Blog
  • College Navigator - Homeroom Blog is the official blog of the Department. The purpose of this blog is to facilitate an ongoing dialogue on education issues.
  • Program Information Publication System (Part of Program Information on the Web)
  • ED.gov - Web-based tool for searching all colleges and universities in the United States. College Navigator consists primarily of the latest data from the Integrated Postsecondary Education Data System (IPEDS), the core postsecondary education data collection
Department of Housing and Urban Development (HUD)
  • HUD User
  • Housing Counselor
  • File a Fair Housing Complaint - HUD USER provides interested researchers with access to the original data sets generated by PD&R-sponsored data collection efforts, including the American Housing Survey, HUD median family income limits, as well as microdata from research initiatives
  • Low Rent Apartment Search
  • FHEO HUD.gov mobile adaptive web content - The government gives funds directly to apartment owners, who lower the rents they charge low-income tenants. You can find low-rent apartments for senior citizens and people with disabilities, as well as for families and individuals.
  • Fair Market Rents
  • PD&R Edge - The Fair market rents (FMR) and Income limits data app will provide users the ability to easily obtain statistics for FMR and income limits by their present or other locations.
  • Housing Discrimination Investigative Checklist
  • HUDMaps - Provides access to PD&R's on-line magazine "the Edge". The Edge is an online magazine providing news, a message from the Assistant Secretary, and a wide range of information on housing and community development issues, regulations, and research that is
  • Public Housing (PHA) Contact
  • GMP Monitoring Exhibits Handbook - View contact information for Public Housing Agencies in your city and state
  • Enterprise GIS
Department of Justice (DOJ)
  • ATF Trace Data Report
  • Justice.gov
  • StopFraud.gov - The main website of the Department of Justice
  • National Sex Offender Website and Database
  • Civil Rights Division Report a Violation Web Resources - The website of the President's Financial Fraud Task Force hosting information about the work of the task force as well as resources to about fraud - including prevention tips and where to report crimes if they occur.
  • FOIA.gov
  • Uniform Crime Report - The Civil Rights Division enforces civil rights laws in a wide variety of contexts. This resource directs individuals on how to submit a complaint or report of a potential civil rights violation.
  • Office on Violence Against Women Resource Map
  • National Crime Victimization Survey - The Uniform Crime Reporting (UCR) Program was conceived in 1929 by the International Association of Chiefs of Police to meet a need for reliable, uniform crime statistics for the nation. In 1930, the FBI was tasked with collecting, publishing, and ar
Department of Labor (DOL)
  • BLS Employment and Unemployment Data
  • COBRA Continuation Coverage - A portion of the WHD site with a map and tables of minimum wages across the US and its territories.
  • State Minimum Wage Laws
  • Compliance Assistance: Family and Medical Leave Act (FMLA)
  • Current Population Survey (CPS)
Department of State
  • Bibliographical Metadata of the Foreign Relations of the United States Series
  • Bibliographical Metadata of the Foreign Relations of the United States Series
  • aoprals.state.gov - Raw bibliographical metadata for the nearly 500 official historical documentary volumes of U.S. foreign policy in the Foreign Relations of the United States published since 1861
  • aoprals.state.gov
  • Foreign Service Mobile App - Foreign Travel Per Diem rates
  • J1 Visa Exchange Visitor Program website
  • ForeignAssistance.gov - The goal of this app is to serve as a learning tool to educate diverse university students (undergraduate and graduate), alumni and mid-career professionals about the various career opportunities in the Foreign Service and provide the information and
  • Usembassy.gov (450+ websites under the usembassy/usconsulate.gov domain)
  • J1 Visa Exchange Visitor Program website - The goal of the Foreign Assistance Dashboard is to make all U.S. Government foreign assistance investments available in an accessible and easy-to-understand format.
  • Travel.State.gov
  • ForeignAssistance.gov - Provides an informational portal into Consular Affairs pages on international travel, passports, visa, international child abduction, and international law and policy.
  • U.S. Passport Issuance Data
  • U.S. Passport Issuance Data - The goal of the Foreign Assistance Dashboard is to make all U.S. Government foreign assistance investments available in an accessible and easy-to-understand format.
Department of Transportation (DOT)
  • Federal Railroad Administration 10 Year Accident Reports
  • SaferCar App
  • SaferCar.gov Recall and Complaint Data - DOT will create a user-friendly iOS smartphone product to streamline access to information on SaferCar.gov, including vehicle 5-star safety ratings; vehicle defects; and the Child Safety Seat Locator. The app will also allow consumers to search and s
  • Grade Crossing App
Environmental Protection Agency (EPA)
  • Envirofacts
  • How’s My Waterway
  • Integrated Health Alerts - Envirofacts provides access to several EPA databases that provide information about environmental activities that may affect air, water, and land. While an API exists for all of the public data sources in Envirofacts, there is a need to develop dedi
  • Regulations.gov
  • FRS Facility Search Service - EPA is considering the development of creating mobile access to an integrated set of related advisories (UV Index, Air Quality Index, Fish advisories, and beach closures) to provide a more comprehensive indication of environmental conditions that hav
  • EPA.gov
  • FRS Mobile Facility Data Collection - EPA is considering replacing the limited mobile access currently available for home page content with a more fully featured page using responsive design.
Federal Energy Regulatory Commission (FERC)
  • Decisions and Notices
  • Decisions and Notices
  • What’s New - Provides a monthly collection of Delegated Orders, Notices, and Commission Decisions from Commission Meetings or Notational Voting arranged by date.
  • eTariff
  • eSubscription - The “What’s New” RSS feed provides news and information details about the events at FERC.
  • eLibrary
  • eService - Users subscribe or ‘sign up’ for specific dockets and are notified via email about future correspondence. Users have immediate access to the correspondence or documents in eLibrary.
  • eFiling
  • eRegistration - Provides users with official mailing list or service list for a docketed proceeding.
  • Electric Quarterly Reports (EQR)
  • Electric Quarterly Reports (EQR) - eRegistration provides the FERC customer an easy-to-use entry point to do business with all FERC Online applications. Think of eRegistration as a form of membership. By registering, the user will receive a single user id and password that allows them
General Services Administration (GSA)
  • GSA Per Diem
  • USA.gov and GobiernoUSA.gov
  • eBuy - Per Diem Rates
  • Federal Agency Directory
  • GSA.gov - Comprehensive directory of all federal agencies to enable the public to better understand how government is organized and to locate agencies of interest.
  • Federal Data Center Consolidation Initiative (FDCCI)
  • Mobile Apps Gallery - Status of data center closings. Data stored on Data.gov.
  • Fine Arts Database
  • Fine Arts Database - Gallery on USA.gov of government apps that use government data or provide a government service
  • City Pairs
  • GSA Advantage and National Stock Numbers (NSNs) - System to provide users with access to the most efficient travel arrangements for government travel.
  • Energy Usage Analysis System
  • Automated Advanced Acquisition Program (AAAP) - Tracks energy details for various energy sources namely electricity, natural gas, oil, chilled water, steam and renewable energy.
  • GSA Auctions
  • Small Business Dashboard - Offers the general public and commercial businesses the opportunity to electronically offer building space for lease to the Federal Government. This poses some interesting advantages for accessing this information from a mobile device, including geo-
  • National Stock Numbers (NSNs)
  • Computers for Learning - The Small Business dashboard provides contract information specific to small business.
  • City Pairs
  • Automated Advanced Acquisition Program (AAAP) - Federal agencies can report their excess computers and related peripheral equipment to GSA through the GSAXcess® website. Eligible recipients can view and request the available federal excess property at the CFL website.
  • GO.USA.gov
National Aeronautics and Space Administration (NASA)
  • NASA Data API
  • NASA Apps Store
  • WebTADS - The data.nasa.gov API allows a machine-readable interface to return metadata from the site organized by category, tag, date, or search term. We’re hoping this allows new and creative visualizations of the data resources NASA provides to the public.
  • ISS Live API
  • ExoAPI - WebTADS Mobile is a lighter version of the desktop-based WebTADS developed to provide NASA Civil Servants with the convenience of recording time when they're not in the office or connected via VPN.
  • Visualization Explorer
  • people.nasa.gov - NASA Visualization Explorer, the coolest way to get stories about advanced space-based research delivered right to your iPad. A direct connection to NASA’s extraordinary fleet of research spacecraft, this app presents cutting edge research stories
National Archives and Records Administration (NARA)
  • FederalRegister.gov
  • FederalRegister.gov
  • FederalRegister.gov API - Integration of the Regulations.gov API into FederalRegister.gov and its API. This integration would provide greater access to public comments and supporting documents in Regulations.gov, and improve the process for submitting public comments from Federal
  • Daily Compilation of Presidential Documents
  • Archives.gov - Expand the FederalRegister.gov API to include the "Public Inspection Desk."
  • Code of Federal Regulations; Federal Register
  • Online Public Access - Develop an API for FDsys through the Office of Federal Register-Government Printing Office Partnership
  • Online Public Access
  • National Archives Catalog on Wikipedia - Mobile optimize the Online Public Access resource, the online public portal for National Archives records
  • National Archives Catalog on Wikipedia
  • National Archives Catalog on Flickr - Make additional National Archives records available through Wikipedia, which is accessible through the MediaWiki API
  • National Archives Catalog on Flickr
  • Today's Document - Make additional National Archives records available on Flickr, which is accessible through the Flickr API
  • DocsTeach - Make improvements to the Today's Document mobile application
National Science Foundation (NSF)
  • Scientists and Engineers Statistical Data System (SESTAT)
  • News
  • NSF Award Information - The Scientists and Engineers Statistical Data System (SESTAT) provides longitudinal information on the education and employment of the college-educated U.S. science and engineering workforce, collected through three biennial surveys. These surveys ca
  • Funding
  • Discoveries - Information pertaining to NSF awards from 1959 through the present. Data includes principal investigator, awardee institution, NSF program and associated NSF organizations, award amount, award dates, award abstract, and publications and conference pr
  • NSF Research Grant Funding Rates
  • Science & Engineering Indicators - NSF funding rates for competitive research proposals by organizational unit. (Funding rates constitute the number of awards divided by the number of actions for a given year by organizational unit).
  • Awards
  • Project Outcome Reports - Searchable database of NSF awards.
  • Staff Directory
  • Events - Section 7010 of the America COMPETES Act requires that researchers funded in whole or in part by NSF report on the outcomes of the funded research for the general public. Project Outcomes Reports describe the project outcomes or findings, addressing
  • Graduate Research Fellowship Program Awardees and Honorable Mention Recipients
  • Vacancies - Demographic information for recipients of NSF GRFP awards. The GRFP provides three years of graduate education support for individuals who have demonstrated the potential for significant achievements in science and engineering research
  • Directions - Current job vacancies at NSF.
Nuclear Regulatory Commission (NRC)
  • Reactor Operating Status Reports
  • Public Meeting Feedback System
  • Power Reactor Daily Event Reports - A web-based system that allows the NRC to report on the operating status and power output of commercial power reactors located within the continental United States
  • Top 5 Most-Accessed Agency Web Pages
  • Operating Reactor Inspection Reports - A web-based system that allows the NRC to report on the daily events and activities occurring at commercial power reactors located within the continental United States
  • Part 21 (Component Defect Reports) - A web-based system that allows the NRC to report on observations and findings of inspections occurring at commercial power reactors located within the continental United States
  • Public Affairs Daily News Releases - A web-based system that allows the NRC to report on licensed facilities, activities, or basic component that fail to comply with NRC regulations
Social Security Administration (SSA)
  • SSA State Agency Monthly Workload Data
  • Mobile Supplemental Security Income (SSI) Wage Reporting Application
  • Mobile Optimized Frequently Asked Questions (FAQs) - SSI recipients will use this application to report their monthly wage amounts. This enables them to meet their monthly reporting obligations.
  • Hearing Office Average Request Wait Time Report
  • Mobile Contact - People seeking information from SSA will use this mobile optimized FAQ website to obtain information about numerous topics.
  • Hearing Office Workload Data
  • Hearing Office Average Processing Time Ranking Report - People seeking to do business with SSA will use this application to get needed information and directions to their local SSA office. In addition they will have access to information about services available by phone and online.
  • Mobile Optimized Life Expectancy Calculator
  • SSA/Dept of State Identity Verification Web Service - A monthly ranking of the 165 ODAR hearing offices (including 3 satellite offices) by the average number of days until final disposition of the hearing request. The average shown will be a combined average for all cases completed in that hearing offic
United States Agency for International Development (USAID)
  • USAID Portfolio Map
  • Development Experience Clearinghouse
  • Famine Early Warning System Network (FEWS NET) - Map depicts the locations of USAID-funded projects to better monitor development results, improve aid effectiveness and coordination, and enhance transparency and social accountability. The map depicts the total number of unique projects at the admin
  • USAID Portfolio Map
  • U.S. Overseas Loans and Grants (Greenbook) - A USAID-funded food security and famine early warning system covering more than 30 of the most food insecure countries in the world

I don't know about you, but I'm starting to get excited about the potential here. With almost 20 departments and agencies on board and 146 APIs planned, there is a real opportunity for the other 226 departments and agencies to see what is possible and get on board.

This list of systems was generated programmatically from each department or agency's machine-readable digital strategy. I didn't have to compile it by hand, which shows the potential of the digital strategy to become a federal API discovery platform. You can view the real-time version of systems pulled from each department or agency's digital strategy, which is updated every night when I run my monitoring script.
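The nightly aggregation described above can be sketched as a small script that walks each agency's machine-readable digital strategy and pulls out the planned systems. This is an illustrative sketch only: the field names used here (`item`, `system`, `name`, `description`) are assumptions, not the documented schema.

```python
import json

def extract_systems(strategy: dict) -> list:
    """Return (name, description) pairs for every system in a parsed strategy.

    Field names are assumed for illustration; the real digital strategy
    schema may differ.
    """
    systems = []
    for item in strategy.get("item", []):
        for system in item.get("system", []):
            systems.append((system.get("name", ""), system.get("description", "")))
    return systems

# In practice each agency's strategy file would be fetched from its website
# every night; here we parse a small inline sample instead.
sample = json.loads("""
{"item": [{"id": "2.2",
           "system": [{"name": "Census API",
                       "description": "Additional economic and demographic data"}]}]}
""")
print(extract_systems(sample))
```

Because every agency publishes the same machine-readable format, the same few lines of parsing work across all of them, which is what makes the strategy files usable as a discovery platform.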


API Automation Platforms

I’ve been doing lots of research into the future of web APIs lately, and one area that is definitely gaining traction is the ability to automate tasks by defining triggers and actions on top of web APIs.

If you’ve heard about API automation, it’s probably due to the attention If This Then That (IFTTT) and Zapier have been getting. While these are two of the most popular platforms currently, I wanted to dive in and understand the entire landscape.

Currently I’ve found 10 API automation platforms:

Cloudwork - Cloudwork is a service that allows users to automate tasks between Google Apps, Salesforce, Evernote, Zoho, Twitter, Freshbooks, MailChimp, Zendesk, Dropbox, WordPress and others.
Elastic.io - Elastic.io is an API integration and orchestration platform for non-programmers, offering a simple tool for users to create and run data/API mashups directly from the browser and automate simple tasks between API platforms.
FoxWeave - FoxWeave enables enterprises that are building or using Cloud based Services (SaaS, DBaaS etc) to connect those Services to each other and to external Cloud and On-Premise Apps and Databases.
If This Then That (IFTTT) - IFTTT is a service that allows anyone to build connections driven by APIs, using channels made up of triggers and actions, bundled into what IFTTT calls recipes, which are triggered every 15 minutes.
MashableLogic - MashableLogic is a mashup development platform that provides a system for leveraging APIs by turning them into re-usable components that can be combined to compose software solutions.
Tarpipe - Tarpipe provides a platform for automating tasks: creating workflows between apps to automate low-value tasks, generating activity streams from multiple apps in one place, syncing data from one app to another as a background task, and publishing content to multiple API locations.
Wappwolf - Wappwolf is focused on deconstructing the barriers of the Cloud, by connecting your Evernote, Facebook, Flickr, and other web services / apps to Dropbox, allowing users to drag & drop files into a predefined folder on Dropbox and automatically convert and sync to your favorite places.
We-Wired Web - We-Wired Web enables users to define automated tasks using over 50 popular web services using APIs that execute periodically.
Yahoo Pipes - Pipes is a composition tool to aggregate, manipulate, and mashup content from around the web. Like Unix pipes, simple commands can be combined together to create output that meets your needs.
Zapier - Zapier uses what they call a zap to deliver a combination of a trigger and an action using APIs, allowing users to drag and drop to build new zaps and run in background or manually from a dashboard.

API automation platforms provide a new way for developers and non-developers to put API resources to use for business or personal tasks. These automation platforms provide a new opportunity for companies looking to deploy APIs, providing additional channels for distribution and user acquisition.
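The trigger-and-action model these platforms share can be sketched in a few lines; the recipe and action names here are made up, not any platform's actual API:

```python
# A minimal trigger/action "recipe" runner in the IFTTT/Zapier style.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Recipe:
    trigger: Callable[[], bool]   # polled condition, e.g. "new photo posted"
    action: Callable[[], str]     # work to perform when the trigger fires

def run(recipes: list[Recipe]) -> list[str]:
    """One polling pass: fire the action of every recipe whose trigger is true."""
    return [r.action() for r in recipes if r.trigger()]

# Two illustrative recipes; real platforms poll live APIs instead.
recipes = [
    Recipe(trigger=lambda: True,  action=lambda: "saved attachment to Dropbox"),
    Recipe(trigger=lambda: False, action=lambda: "posted tweet"),
]
print(run(recipes))
```

IFTTT's 15-minute recipe cycle is essentially this polling pass run on a schedule.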

If you know of any API automation platforms I missed, let me know.

  • UPDATE 9/5/2012 - I added Foxweave to the list.
  • UPDATE 9/1/2012 - I added Cloudwork to the list.


Overview of 11 Places Data APIs

Since starting as API Evangelist here at CityGrid, I have been asked a couple of times how we stack up against other places APIs. So I went through the 11 other places APIs, gathering info, in an attempt to see what each offered.


CityGrid Places API

Search Overview - Providing a places search that can be searched by longitude/latitude, "where" using cities, neighborhoods, zip codes, metro areas, addresses and intersections. Details for each place are also available.

  • Database Size - 18 Million US Places
  • Store / Cache Data -  No Storage.  Cache up to 15 minutes.
  • Attribution - Include logo and phrase “powered by CityGrid; data from Infogroup ©[YEAR]”
  • Multi-Provider IDs - Yes
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - Yes
  • Revenue Share - Yes
  • Call Limits - 10M / Month
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

Other

  • Places that Pays - A program that monetizes the display of, and interaction with, CityGrid places is called Places that Pay.

URL - http://docs.citygridmedia.com/
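A places search request against an API like this generally boils down to a key, a latitude/longitude pair, and a "where" term in the query string. A minimal sketch, with a placeholder endpoint and parameter names rather than CityGrid's actual ones:

```python
# Composing a key-authenticated places search URL. Endpoint and
# parameter names are placeholders for illustration only.
from urllib.parse import urlencode

def places_search_url(base: str, api_key: str, lat: float,
                      lon: float, where: str) -> str:
    params = {"api_key": api_key, "lat": lat, "lon": lon,
              "where": where, "format": "json"}
    return f"{base}?{urlencode(params)}"

url = places_search_url("https://api.example.com/places/search",
                        "MY_KEY", 34.05, -118.24, "coffee")
print(url)
```

Most of the providers compared below vary this same pattern, differing mainly in auth scheme and response format.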


Facebook Graph API

Search Overview - Providing the ability to search the Facebook Graph objects with a “type” of place, and longitude/latitude, keyword search and area to find places listed as objects within Facebook.

  • Database Size - Not Found
  • Store / Cache Data - No
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - Yes
  • Revenue Share - No
  • Call Limits - One call per second
  • Response Format - XML / JSON
  • Authentication - OAuth
  • Pricing - FREE
  • Check-In - Yes
  • Write - Yes
  • Delete - No

URL - http://developers.facebook.com/docs/reference/api/


Factual

Search Overview - Providing a places search that can be searched by latitude/longitude, and “where” using full text search query string.

  • Database Size - 55 million entities in 47 countries
  • Store / Cache Data - Yes
  • Attribution - Yes
  • Multi-Provider IDs - Yes
  • Meta Data - Yes
  • Rating and Review - No
  • Deals - No
  • Revenue Share - No
  • Call Limits - cross ref = 10,000 per day / crosswalk = 500 per day / read = 10,000 per day  / resolve = 100 per day
  • Response Format - JSON
  • Authentication - Unsigned and signed requests w/ 2-legged OAuth.
  • Pricing - Free
  • Write - Yes
  • Delete - No

Other:

  • Select - What fields to include in the query.
  • Places API - Resolve - Resolve is an entity resolution API that makes partial records complete, matches one entity against another, and assists in de-duping and normalizing datasets.
  • Places API - Crossref - The Crossref API enables you to find the URLs for pages that mention a specific business or point of interest or vice versa.
  • Places API - Restaurants - The U.S. Restaurant table contains Factual's core places attributes in addition to 43 extended attributes on 800,000+ restaurants, bars, and casual eateries including datatypes such as cuisine, ratings, hours of operations, and price.

URL - http://developer.factual.com/


Foursquare Venue API

Search Overview - Providing a places search that can be searched by a hierarchical list of categories, longitude/latitude, “where” using a search term, managed by requesting users, over time range, trending and exploration.

  • Database Size - Could Not Find
  • Store / Cache Data - Okay to keep caches of foursquare data as long as they are refreshed at least every 30 days.
  • Attribution - Yes
  • Multi-Provider - Yes
  • Meta Data - Yes
  • Rating and Review - Yes
  • Deals - No
  • Revenue Share - No
  • Call Limits - 5,000 requests per hour
  • Response Format - JSON
  • Authentication - OAuth 2.0; to make a userless request, specify your consumer key's Client ID and Secret instead of an auth token in the request URL.
  • Pricing - Free
  • Write - Yes
  • Delete - No

Other:

  • Actions - You can edit, flag, mark to do, propose edit for venues in Foursquare database.

URL - https://developer.foursquare.com/overview/venues
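The userless request style noted in the authentication line above can be sketched as follows; the endpoint here is a placeholder, and the `client_id`/`client_secret` parameter names follow that description:

```python
# Building a "userless" request URL: the consumer key's client ID and
# secret ride in the query string in place of a user's OAuth token.
# The endpoint is illustrative, not Foursquare's documented path.
from urllib.parse import urlencode

def userless_request_url(endpoint: str, client_id: str,
                         client_secret: str, **params) -> str:
    query = {"client_id": client_id, "client_secret": client_secret, **params}
    return f"{endpoint}?{urlencode(query)}"

url = userless_request_url("https://api.example.com/v2/venues/search",
                           "CLIENT_ID", "CLIENT_SECRET", ll="40.7,-74.0")
print(url)
```

This style suits read-only calls like venue search, where no particular user's identity is needed.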


Fwix

Search Overview - Providing a places search that can be searched by latitude/longitude, and text search based upon categories, address, city, province, postal code, country, neighborhood and text keyword.

  • Database Size - 23M in US.
  • Store / Cache Data - No storage.  Yes to cache.
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - Yes
  • Rating / Review - No
  • Deals / Offers - No
  • Revenue Share - Yes
  • Call Limits - 5,000 calls per unique user per day
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Write - Yes
  • Delete - Yes

Other:

  • Geotagger Methods - Returns places geotagged to a given web page.
  • Content Methods - Returns geotagged content in or near a location.

URL - http://fwix.com/developer_tools/api


Google Places API

Search Overview - Providing a places search that can be searched by latitude/longitude, keyword matched against all fields, name of place, and type of place, restricted by radius, as well as pulling details for each place.

  • Database Size - Could not find
  • Store / Cache Data -
  • Attribution - "Powered by Google" logo is displayed above or below the data
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - 1,000 requests per 24-hour period
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Check-In - Yes
  • Write - Yes
  • Delete - Yes

URL - http://code.google.com/apis/maps/documentation/places/


InfoChimps

Search Overview - Providing a places search that can be searched by longitude/latitude with radius, address, bounding box or IP address.

  • Database Size - Could not find
  • Store / Cache Data - Yes
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data -  No
  • Rating and Review - No
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - Couldn’t Find
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free w/ Premium Pricing
  • Check-In - No
  • Write -  Yes
  • Delete - Yes

URL - http://www.infochimps.com/datasets/business-places-by-locationary


Nokia

Search Overview - Providing a JavaScript places search that can be searched by search term, with a detail search for display by JS widget.

  • Database Size - Not Found
  • Store / Cache Data - No
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - No
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - Not Found
  • Response Format - JSON
  • Authentication - None
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

URL - http://api.maps.nokia.com/places/index.html


Yahoo GeoPlanet

Search Overview - Providing a places search that can be searched by type, county, state, country, oceans, seas, continents, hierarchy and full text search.  Also returns places detail by ID.

  • Database Size - Could not find
  • Store / Cache Data - No
  • Attribution - Must contain the copyright notice "Copyright © Yahoo! Inc. 2008, All Rights Reserved"
  • Multi-Provider IDs - No
  • Meta Data - Yes
  • Rating / Review - No
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - “Reasonable Request Volume”
  • Response Format - JSON / XML
  • Authentication - Key
  • Pricing - Free
  • Write - Yes
  • Delete - Yes

URL - http://developer.yahoo.com/geo/geoplanet/


Yelp API

Search Overview - You can search location using a geo bounding box, longitude and latitude, neighborhood, address or city, and filter listings by “where” using a list of supported categories, as well as pulling details for each place.

  • Database Size - Could not find
  • Store / Cache Data - No
  • Attribution - Yes with logo
  • Multi-Provider IDs - No
  • Meta Data - Yes
  • Rating / Review - Overall count with 3 review excerpts
  • Deals / Offers - Yes
  • Revenue Share - Yes with Commission Junction
  • Call Limits - 10,000 calls/day
  • Response Format - JSON
  • Authentication - OAuth
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

URL - http://www.yelp.com/developers/documentation/v2/overview


YP

Search Overview - Providing a places search that can be queried by keyword and longitude/latitude, street address, city, postal code, neighborhood, state, points of interest or by phone number with a radius. Place details are also provided.

  • Database Size - Could not find
  • Store / Cache Data - No
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - Yes
  • Revenue Share - Yes
  • Call Limits -  50,000 requests per day.
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

URL - http://developer.yp.com/

If you see anything missing or incorrect, let me know at @citygridapiteam.


February Hackathon Schedule

Here are the hackathons I'm tracking for February. I'm adding new ones as I find them, so make sure to check out the hackathon events calendar for more information.

  • OPF Hackathon - A Practical Approach to Database Archiving - 02/07/2012 - Copenhagen, Denmark
  • Constantine Startup Weekend - 02/09/2012 - Constantine, Algeria
  • University of Miami Ohio Startup Weekend 02/2012 - 02/10/2012 - Oxford, United States
  • Greater Lafayette Startup Weekend 2012 - 02/10/2012 - Lafayette, United States
  • Startup Weekend Jackson - 02/10/2012 - Jackson, United States
  • Startup Weekend Twin Cities 3 - 02/10/2012 - Twin Cities, United States
  • Valencia Startup Weekend 02/2012 - 02/10/2012 - Valencia, Spain
  • Music Hack Day San Francisco - 02/11/2012
  • Hacking at Music Hack Day San Francisco 2012 - 02/11/2012 - San Francisco, United States
  • The Digital Barn - 02/11/2012 - Barnsley, United Kingdom
  • Hack To The Future 11.02.2012 (Kids) - 02/11/2012 - Preston, United Kingdom
  • qMedia HACK-A-THON - 02/15/2012 - London, United Kingdom
  • Startup Weekend Cluj 17-19.02.2012 - 02/17/2012 - Cluj-Napoca, Romania
  • AT&T Mobile App Hackathon - Dallas - 02/17/2012 - Plano, United States
  • Detroit Startup Weekend 4 - 02/17/2012 - Detroit, United States
  • FreshlyHacked - 02/17/2012 - Melbourne, Australia
  • Guadalajara Startup Weekend 02/12 - 02/17/2012 - Lomas del Valle, 3a Sección Zapopan, Mexico
  • Open Hackday Tampere [Free] - 02/18/2012 - Tampere, Finland
  • Christchurch Startup Weekend 2/12 - 02/24/2012 - Christchurch, New Zealand
  • Honolulu Startup Weekend February 24-26, 2012 - 02/24/2012 - Honolulu, United States
  • BeMyApp San Francisco - The 2012 Mobile App Olympics - 02/24/2012 - San Francisco, United States


Government Opened Data via APIs in 2011

One of the most important fronts of API development is government. All of us API and data guys have been screaming for city, county, state and federal government to open up their data via APIs for years now.

In 2011 I would say many government officials listened, opening up almost 100 government APIs, according to ProgrammableWeb.

In 2012 it will be critical for more government agencies to open up, but I also think it's time for us developers to step up and start making sense of this data and how our government operates.

We are getting what we wanted, now how do we deliver on the promise of open government APIs?


2011 APIs as a Tag Cloud

I pulled a list of the 2,023 APIs that were added to the ProgrammableWeb API directory in 2011. I took the description column and used Wordle to generate a tag cloud for 2011. I think tag clouds can provide a 100,000-foot view of where people are focusing their APIs.
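The tally underneath a tag cloud like this is just word frequency across the description column; a minimal sketch with made-up descriptions, which a tool like Wordle then sizes by count:

```python
# Count terms across API descriptions; a tag cloud renders each word
# at a size proportional to its count. Descriptions are illustrative.
from collections import Counter
import re

descriptions = [
    "social photo sharing API",
    "social location check-in API",
    "photo hosting and sharing service",
]

words = Counter()
for text in descriptions:
    words.update(re.findall(r"[a-z-]+", text.lower()))

print(words.most_common(3))
```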


Quick Walk Through the World of Location & Places APIs

I took a walk through what I am calling the locations and places API landscape today. Most of these APIs I’m familiar with, but as the CityGrid API Evangelist, I’m getting an opportunity to immerse myself into this new local, social, mobile world.

As I immerse myself in this semi-new world I want to share my findings with everyone else. If you have any suggestions, make sure to let me know in the comments below.

First I started with CityGrid APIs, which provide several key location and places APIs:

  • The Places API - Provides functionality for information on local businesses, including search, detail, user content submission, and predictive text
  • The Offers API - Provides coupons and special offers from businesses based on geography and category
  • The Reviews API - Provides access to customer reviews for businesses selected by id or by geography or category

Then I wanted to see what Google was doing, and of course started with the Google Maps APIs:

  • Maps JavaScript API - The Google Maps Javascript API lets you embed Google Maps in your own web pages
  • Maps Image API - The Google Maps Image APIs make it easy to embed a static Google Maps image or Street View panorama into your web page, with no need for JavaScript

Along with Google Maps they offer a set of Geo Web Services that contain several location and places based APIs:

  • Directions API - The Google Directions API is a service that calculates directions between locations
  • Distance Matrix API - The Google Distance Matrix API is a service that provides travel distance and time for a matrix of origins and destinations.
  • Elevation API - The Google Elevation API provides you an interface to query locations on the earth for elevation data.
  • Geocoding API - Geocoding is the process of converting addresses into geographic coordinates
  • Places API - The Google Places API is a service that returns information about places, defined as establishments, geographic locations, or prominent points of interest

Already with CityGrid and Google I’m seeing that location and places services really start to get complicated and diverse. With Google Latitude I start separating the location from the place, with what are two location-specific APIs:

  • Current location - Represents the user's most recent known location
  • Location history - Represents the list of all recorded user locations

After Google I have to look at another big player, Yahoo. Yahoo has several location based services:

  • Fire Eagle - Fire Eagle is a service designed to build and use location-aware applications and services
  • GeoPlanet - Yahoo! GeoPlanet is a resource for managing all geo-permanent named places on Earth
  • Local API - Provides a database of information including business address and phone, category, rating, distance, URL, and traffic alerts
  • Maps - Provides interactive maps with driving directions and traffic information
  • PlaceFinder - Converts street addresses or place names into geographic coordinates (and vice versa)
  • Placemaker - Identifies places mentioned in text, disambiguating them and returning unique identifiers

Naturally after taking a look at Yahoo I have to go see what Microsoft is up to in the space:

  • Bing Maps API - The API that powers Bing Maps, an online mapping service that enables users to search, discover, explore, plan, and share information about specific locations
  • Bing Maps Location API - Use the Locations API to get location information (I love this description!)

After looking at what local and mobile offerings the big players Google, Yahoo and Microsoft had I started looking at less search and mapping based services to more carrier based location and place services. I started with Verizon, who has a single location API:

  • LBS Network API - The Verizon LBS API allows you to use the user's location to deliver specific services

Sprint brings three location APIs to the table:

  • Geofence - Provides virtual perimeter services
  • Location - Determines the location of a Sprint CDMA Device
  • Presence - Determines if a device is present on the Sprint CDMA network
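A geofence service like Sprint's reduces to a point-in-circle test on the earth's surface. A plain haversine sketch, independent of any carrier API:

```python
# Is a device's reported position inside a circular virtual perimeter?
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def inside_geofence(lat, lon, center_lat, center_lon, radius_km):
    return haversine_km(lat, lon, center_lat, center_lon) <= radius_km

# One degree of latitude is roughly 111 km, so 0.5 degrees sits
# inside a 60 km fence centered on the equator.
print(inside_geofence(0.5, 0.0, 0.0, 0.0, 60.0))
```

A carrier's value-add is the location fix itself; once you have coordinates, the perimeter check is this simple.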

AT&T has a LBS API:

  • Terminal Location - Set of Location-based Services (LBS)

Deutsche Telekom has one location API:

  • IP Location API - Locate Internet users with their IP addresses

Ericsson Labs provides a developer community around a full suite of APIs:

  • 3D Landscape API - 3D Landscape API for integration of realistic 3D maps
  • Mobile Location API - Allows the use of a mobile phone user's current cell ID to obtain their geographical location
  • Network Probe API - Provides services that measure certain characteristics of network IP connectivity, firewalls and Network Address Translators
  • Web Location API - Provides location data from a mobile phone using the positioning systems of mobile operators
  • Web Maps API - Provides dynamic maps for application integration

France Telecom also has a location API:

  • Location API - Allows applications to get geographic coordinates of a given Orange France mobile phone or a fleet

It makes sense for every carrier to also provide developers with a set of location services, as they don’t want to just be dumb pipes. They want to be an integrated player in their own customers’ handset usage.

Next I start looking to put the social in local, mobile, social. Where else do you start but Facebook, which has two location-based objects as part of the Graph API:

  • Checkin - A checkin represents a single visit by a user to a location
  • Places - A search option before initiating a checkin, returning name and location information from Graph API

I thought I'd consider Twitter next. They have Places and Geo methods, but it really doesn't seem like it's going anywhere, and a really small portion of tweets have geo info recorded. I will revisit it in the future if I see action around it.

In the category of location based social network I was investigating Foursquare and Gowalla, but with the recent Facebook acquisition of Gowalla I think I will only look at Foursquare. Foursquare offers access to four different APIs:

  • Core API - Users, Venues, Venue Groups, Checkins, Tips, Lists, Photos, Specials, Campaigns, Events
  • Real-time API - Notifies venue managers when users check in to their venues, and our user push API notifies developers when their users check in anywhere
  • Merchant Platform - The Merchant Platform allows developers to write applications that help registered venue owners manage their foursquare presence and specials
  • Venues Platform - The Venues Platform allows developers to search for places and access a wealth of information about them, including addresses, popularity, tips, and photos

After Foursquare you leave social, getting into the places data world, with popular player SimpleGeo. Similar to Gowalla, I was going to overlook SimpleGeo with their recent acquisition by Urban Airship, but I think SimpleGeo is still important enough of a player that we should still consider them in the game. SimpleGeo has four distinct web services for location and places:

  • SimpleGeo Storage - Storage of data in SimpleGeo system
  • SimpleGeo Features - Features in SimpleGeo represent real-world places such as businesses, regions, or US states
  • SimpleGeo Context - Provides relevant contextual information such as weather, demographics, or neighborhood data for a specific location
  • SimpleGeo Places - Businesses and points of interest

In the pure places data game I’d put Factual in the same category as SimpleGeo. Factual has seven location and places APIs:

  • Places Category API - Taxonomy to classify entities in the various Factual point-of-interest (POI) datasets
  • Places Crossref API - URLs for pages that mention a specific business or point of interest or vice versa
  • Places Crosswalk API - Maps third-party (Yelp, Foursquare, etc.) identifiers for businesses or points of interest to each other where each ID represents the same place
  • Places Global Database API - 55 million entities in 47 countries
  • Places Global Place Attributes API - The latest schema for the global places dataset
  • Places Resolve API - Makes partial records complete, matches one entity against another, and assists in de-duping and normalizing datasets
  • Places Restaurants API - Core places attributes in addition to 43 extended attributes on 800,000+ restaurants, bars, and casual eateries including datatypes such as cuisine, ratings, hours of operations, and price

Tied with SimpleGeo and Factual is InfoChimps. InfoChimps is a data marketplace player with some very strong location and places services:

  • Wikipedia Articles - Correlate Wikipedia articles with geographic locations
  • Business Places by Locationary - The Business Places by Locationary API delivers quality business information based on your geographically defined query
  • Foursquare Places - The Foursquare Places API delivers uniquely rich information about venues, worldwide.
  • Geonames Places - The Geonames Places API locates all places within a specified area. Places are any geographic points that can be named
  • NCDC Weather - The NCDC Weather API provides detailed weather data based on your geographically defined query. Weather data points for your query may include dew point, precipitation, snow depth, temperature, visibility, and wind speed details
  • American Community Survey (Topline) - The 2009 American Community Survey (ACS) Topline API provides basic demographic data based on your geographically defined query
  • American Community Survey (Drilldown) - The 2009 American Community Survey (ACS) Drilldown API provides detailed demographic data based on your geographically defined query
  • Core Geographic Regions - The Core Geographic Regions API delivers detailed geodata for any geographically defined query, worldwide
  • Zillow Neighborhoods - Zillow Neighborhoods retrieves geo data pertaining to neighborhoods within defined geometric parameters
  • Digital Element IP Intelligence Demographics - A geolocation API for all your demographics needs. Search by IP address to return data about a geographical area, including number of households, gender, age groups and language
  • Digital Element IP Intelligence Domains - A reverse IP lookup API with 5 fields of search results, all customized to your IP query. Search by IP address to return data about the domain, company, ISP, NAICS industry code and proxy type for an IP
  • Digital Element IP Intelligence Geolocation - A geolocation API with 20 fields of search results, all customized to your IP query. Search by IP address to return data about a geographical area, including country, region, city, internet connection speed
  • Geocoding API - The Geocoding API is a powerful and useful tool that provides location information for any given address in the United States. Geocoding is a process that assigns geographic data (ie, latitude and longitude) to an address
  • Latitude Longitude and Zip Code Conversions - This API returns approximated latitude/longitude centroids for a given zip code, along with the relative city, state, and county

Then moving out of pure data players Yelp has always been centered around reviews, and more recently, with version 2.0 of their API moved to be centered around the businesses. Yelp has two places APIs:

  • Search API - Searches for Businesses
  • Business API - Returns full details of businesses

Another player in the space is Fwix. Fwix has a different approach to places, trying to geotag the web. Fwix offers five places and location APIs:

  • Geotagger API - Returns places geotagged to a given web page
  • Content API - Returns geotagged content in or near a location
  • Categories API - Returns the list of canonical place categories
  • Location API - Returns geographic data for a latitude/longitude point
  • Places API - Return, Submit and Delete a list of businesses for a given location

After Fwix I found a couple of other mapping, location and places data services:

  • PushPin - The Pushpin Identify Service is a REST service that takes geographic coordinates (latitude and longitude) and resolves them to named locations on the earth
  • 43 Places - Allows users build 43 Places by adding places, asking questions, giving travel advice, uploading pictures of their favorite places and writing stories about the places they've been and want to go.
  • MaxMind GeoIP® City Database - Determine country, state/region, city, US postal code, US area code, metro code, latitude, and longitude information for IP addresses worldwide.
  • Compass - Allows access to a database of 16 million business establishments in the USA.

These providers either didn’t have clear market share, or started deviating into parallel universes of content and services beyond location and places, so I'm going to stop here.

These 17 places and location API providers are a lot to process.  I want to spend some time getting a handle on the types of services they offer, before I dive into the peripheral services as well as the players that have less market share. But in my style, I'll keep posting my findings as I pull them together.


New Amazon South America Region

Amazon Web Services just announced a new South American Region located in Sao Paulo, Brazil. South America-based businesses and global companies with customers in South America can run their applications and workloads in the new Sao Paulo Region, significantly reducing latency to end-users in South America, and allowing those who need their data to reside in Brazil for legal reasons to easily do so.

The new Sao Paulo Region is launched with a complete set of Amazon Web Services, including: Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3), Amazon Elastic Block Store (Amazon EBS), Amazon SimpleDB, Amazon Relational Database Service (Amazon RDS), Amazon Simple Queue Service (Amazon SQS), Amazon Simple Notification Service (Amazon SNS), Amazon Virtual Private Cloud (Amazon VPC), Elastic Load Balancing, Amazon Elastic MapReduce, Auto Scaling, AWS CloudFormation, and Amazon CloudWatch.

That is seven regions that AWS is operating in now:
  • US West - Northern California
  • US West - Oregon
  • US East - Virginia
  • EU - Ireland
  • Asia Pacific - Singapore
  • Asia Pacific - Tokyo
  • South America - Sao Paulo
It is eight regions if you count the Government cloud they have in Oregon. Amazon sure has come a long way to become a truly global cloud computing provider. All they need is an African and Middle Eastern presence and they'll have the world wired.


Sorry SalesForce.com, but Database.com is not Open

I’m working on reviewing various APIs as part of my database industry overview, where I’m trying to understand the different approaches to delivering database platforms using web APIs as the primary interface.

One of these database platforms is Database.com. I really like what SalesForce.com is doing with Database.com, but I take issue with their section declaring Database.com as an open platform. SalesForce.com claims because you can access Database.com from any language, any platform, and any device through standards-based APIs, it is "open".

I think many folks in the enterprise space truly believe this makes something open. All it means is, if you have the money and approved credentials, you can access it. If I launch a database on Database.com it is open for "me" to access, and those I give access to. And yes, since it uses REST I am "open" to access it via any language or platform I choose. But I'm sorry, that does not make it "open".

I recommend SalesForce.com take a look at Freebase. All the data contained in Freebase is licensed under a Creative Commons Attribution License, which means that it's free for you to browse, query, copy, and even use the data in your own systems or software, even for commercial use. All you have to do is attribute it back to Freebase.

Anyone has access to use and contribute to Freebase, and can use any programming language or platform to access the data available on the platform. Access to data is only one aspect of "open"; developers must also be free to use and contribute to it, as the community sees fit.

As much as I love and evangelize the power of the API, an API alone does not make systems or data open.


API Technology - OData

There is an ever-growing amount of data available today, and much of it is being collected and stored across a wide variety of systems, locked into specific applications or formats, making it difficult to access, integrate and share.

The Open Data Protocol (OData) is a Web protocol for querying and updating data that provides a way to unlock data and free it from silos that exist in many applications.

OData frees data by applying and building upon existing web technologies like HTTP, Atom Publishing Protocol (AtomPub) and JSON to provide access to information from a variety of applications, services, and stores.

OData emerged from experiences implementing AtomPub clients and servers in a variety of products over the past several years. OData is currently used to expose and access data from relational databases, file systems, content management systems and custom websites.

OData works to be consistent with the way the World Wide Web works, adhering to HTTP standards just like RESTful APIs. This reuse of core Web principles allows OData to provide data integration and interoperability across a broad range of clients, servers, services, and tools.
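OData queries ride on ordinary HTTP GETs, expressed through system query options such as `$filter`, `$orderby`, and `$top`. A sketch of composing one against a hypothetical service root:

```python
# Compose an OData query URL from keyword arguments, mapping each
# option onto its $-prefixed system query option. The service root
# and entity set are placeholders.
from urllib.parse import quote

def odata_query(service_root: str, entity_set: str, **options) -> str:
    parts = [f"${k}={quote(str(v))}" for k, v in options.items()]
    return f"{service_root}/{entity_set}?" + "&".join(parts)

url = odata_query("https://example.com/odata", "Products",
                  filter="Price lt 20", orderby="Name", top=5)
print(url)
```

Because the query lives entirely in the URL, any HTTP client, from a browser to a spreadsheet plugin, can consume an OData feed.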

OData is released under the Open Specification Promise and allows anyone to be involved with OData implementations.


RESTful Cloud Database with JSON

Jasondb is a cloud-based, RESTful JSON database designed to replace MySQL as the database in traditional web development. Jasondb persists data in the same JSON format used by many AJAX-driven sites.

Jasondb provides indexing, metadata indexing, tagging and searching with JSON documents.

All of the features and functions of Jasondb are accessible via the RESTful API.

Jasondb appears to scale automatically with the help of Amazon Web Services (AWS).
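The RESTful surface of a JSON document store maps the HTTP verbs onto CRUD over documents. An in-memory sketch of that mapping; the method-to-route pairing is illustrative, not Jasondb's actual API:

```python
# A toy JSON document store whose methods mirror RESTful routes.
import json

class JsonStore:
    def __init__(self):
        self.docs = {}

    def put(self, doc_id, body):      # PUT /docs/{id} - create or replace
        self.docs[doc_id] = json.dumps(body)

    def get(self, doc_id):            # GET /docs/{id} - read
        raw = self.docs.get(doc_id)
        return json.loads(raw) if raw else None

    def delete(self, doc_id):         # DELETE /docs/{id} - remove
        self.docs.pop(doc_id, None)

store = JsonStore()
store.put("user-1", {"name": "Ada"})
print(store.get("user-1"))
```

A real service would add indexing, tagging, and search on top of this core, as Jasondb describes.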


Data APIs

A common reason for deploying an API is to share data with users outside of your organization.

You need to make information accessible to partners or the general public, but in a way that's self-service, so you can avoid emailing a spreadsheet every time someone asks for something. You do this with a data API. Data APIs can:
  • Provide a list of names and addresses for your upcoming event.
  • Deliver news to web sites and mobile phones in real-time.
  • Deliver products such as books from Amazon.com to other smaller, specialty web sites focusing just on cookbooks.
  • Deliver census data from the federal government about who lives in your state to your local unemployment office.
A data API is proving to be the easiest way to exchange information -- whether it's between businesses, between the government and the public, or between your Facebook profile and your mobile phone.
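A minimal read-only data API of the kind described above takes just a few lines with Python's standard library; the dataset and route here are made up for illustration:

```python
# One endpoint that serves a dataset as JSON over HTTP, so nobody
# has to email a spreadsheet.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

ATTENDEES = [{"name": "Ada Lovelace", "city": "London"},
             {"name": "Grace Hopper", "city": "Arlington"}]

class DataAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/attendees":
            body = json.dumps(ATTENDEES).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Spin the server up on a free port and fetch the data back.
server = HTTPServer(("127.0.0.1", 0), DataAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
data = json.load(urlopen(f"http://127.0.0.1:{port}/attendees"))
server.shutdown()
print(data[0]["name"])
```

Swap the in-memory list for a database query and add key auth, and you have the self-service alternative to the spreadsheet.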

Web, desktop, and mobile applications are all being developed to depend on data from various private and public APIs. These data APIs are launched to satisfy our growing appetite for real-time information and updates on a daily basis.


Internet Service Provider (ISP) at Amazon Web Services (AWS)

I was just refining a wiki page of the various building blocks I use at Amazon Web Services. I noticed it would make a great Internet Service Provider (ISP) package for someone who wanted to start an ISP, or serve as a model for an existing ISP looking to migrate to cloud computing.

These are a few of the components I have on my list:
  • Web Server on Amazon EC2
    • Linux / Windows
    • EBS Volumes for Storage
    • Machine Images
  • Amazon S3 Central File Storage + Jungle Disk
    • Server Backup
    • Central File Storage
    • Client Cloud Storage as a Service
    • FTP Access
  • Database
    • SQL Server 2008
    • MySQL
    • Amazon RDS
    • Other (Amazon SimpleDB, Cassandra, CouchDB, etc)
  • Email
    • POP Server
    • SMTP Server
  • DNS Server
  • FTP Server
    • Web File Access
    • Central File Storage Access
  • SVN Server
    • Version Repositories
    • Client Checkout
These are just a few of the building blocks I have on my list. There are many other possibilities and configurations. You slap on an ISP Cloud Management tool like Plesk from Parallels and you can manage your network without much headache.

You are going to see many Internet Service Providers (ISPs) make the jump to the cloud because of the cost savings and ease of deployment. There are just too many benefits to ignore the cloud.


Real-Time, Aware Web Apps with XMPP and HTML5

I am spending time each week learning something new about HTML5 and brainstorming ways it can be applied in my every day business world.

I also just finished reading How to Create an XMPP-Driven Application in JavaScript. Using HTML5 Web Sockets, you can interface with any XMPP server and create a full-duplex communications channel within the web browser.
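Whatever transport carries it — WebSockets included — an XMPP client ultimately pushes XML stanzas down the wire. A minimal sketch of building a chat message stanza (the JIDs below are placeholders, not real accounts):

```python
import xml.etree.ElementTree as ET

def message_stanza(sender, recipient, body):
    """Build the XML <message> stanza an XMPP client sends for a chat
    message; the sender/recipient JIDs are placeholders."""
    msg = ET.Element("message",
                     {"from": sender, "to": recipient, "type": "chat"})
    ET.SubElement(msg, "body").text = body
    return ET.tostring(msg, encoding="unicode")

stanza = message_stanza("alice@example.com", "bob@example.com",
                        "Hello over XMPP")
```

Once the WebSocket is open, the browser simply sends strings like this and parses the stanzas that come back, giving you real-time messaging without polling.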

With all these improvements, HTML5 seems like a logical choice for web and mobile application development.

Update: Also found this web socket extension for Apache Web Server.


OSCON: Database Explosion Part 2

After spending some time last weekend reviewing the exhibitors at the upcoming OSCON - Open Source Convention I noticed a pattern:

There is an explosion in database innovation right now. I listed seven separate database platforms and tools that are exhibiting at the Open Source Convention, but I neglected a very notable database platform appliance company:

Schooner Information Technology. They provide turnkey MySQL, NoSQL, and memcached appliances built for data-intensive web and social applications. I'm reviewing their two products now:

Looks like they have the solution for anyone looking to deploy MySQL or NoSQL key-value stores in their data center. I could see this being used internally or offered up as a cloud database service.


Open Source Database Explosion

I am spending some time researching various open source and cloud computing service providers using the exhibitors list for OSCON 2010. Audrey needs some tech research done before she attends the event, and she wants to be prepared, so I'm going through each exhibitor, learning and documenting as much as I can.

One thing I'm seeing is an explosion of open source database offerings; I'm counting seven systems (and no, I didn't include Oracle):
  • BlackRay - BlackRay is an innovative, fully relational in-memory database system designed to offer performance features commonly associated with search engines. It offers fulltext search, function-based indices, JDBC/ODBC via Postgres drivers, as well as object-oriented APIs. BlackRay is available under the GPLv2.
  • DB Relay - DB Relay is an open source project built around the NGiNX web server platform, providing an HTTP/JSON interface to a variety of database servers. It enables database access without drivers and web application development without middleware. Designed for operational efficiency and ease of maintenance. Hosted at http://www.dbrelay.com/
  • HTSQL by Prometheus Research - HTSQL by Prometheus Research is the easiest way to connect SQL databases to the web, reducing development time and minimizing mistakes. HTSQL uses simple URL queries, works with open source databases, and is a painless way to create data-driven reports, dashboards, and web applications fast and at low cost.
  • Infobright, Inc - Infobright's high performance columnar database, based on MySQL, is the preferred choice for analytic applications and data marts, delivering fast query performance against large data volumes. Easy to implement, with unmatched operational simplicity, Infobright is being used by enterprises, SaaS and software companies to provide rapid access to critical business data. Download Infobright Community Edition and join the Community at infobright.org
  • MariaDB - MariaDB is a database server that offers drop-in replacement functionality for MySQL. MariaDB is built by some of the original authors of MySQL, with assistance from the broader community of Free and open source software developers. In addition to the core functionality of MySQL, MariaDB offers a rich set of feature enhancements including alternate storage engines, server optimizations, and patches.
  • MongoDB - 10gen sponsors the open source project MongoDB, and provides commercial support, consulting, and training for Mongo. MongoDB (from "humongous") is a scalable, high-performance, document-oriented database. MongoDB bridges the gap between key-value stores (which are fast and highly scalable) and traditional RDBMS systems (which provide rich queries and deep functionality).
  • PostgreSQL - PostgreSQL is the world's most advanced and rapidly evolving open source database, with more than 20 years of development by a community of hundreds of database developers from dozens of countries. Our long time support of an enterprise level feature set including transactions, SQL standards compliance, aggregation and subqueries makes PostgreSQL suitable for large enterprises and governments. Easy extensibility through stored procedures and custom data objects makes PostgreSQL exciting for developers. Try PostgreSQL for your next database application!
I'm seeing a big shift away from the database world I've known for the last 15 years: Microsoft SQL Server, Access, Oracle, and MySQL. I'm seeing a big shift toward cloud database providers like Amazon SimpleDB and Google Fusion Tables. And I'm seeing an all-out mutiny with the NoSQL movement. Seeing all this energy and change with databases after 20 years in the business is really exciting.


Encrypted Connections with Amazon Relational Database Service (RDS)

Amazon announced today that the Relational Database Service (RDS) now supports SSL encrypted connections. You can now generate an SSL certificate for each database instance.

Here are a few of the details about Relational Database Service (RDS) SSL connections:
  • SSL encrypts the data transferred "over the wire" between your DB Instance and your application. It does not protect data "at rest." If you want to do this, you'll need to encrypt and decrypt the data on your own.
  • SSL encryption and decryption is a compute-intensive task and as such it will increase the load on your DB Instance. You should monitor your database performance using the CloudWatch metrics in the AWS Management Console, and scale up to a more powerful instance type if necessary.
  • The SSL support is provided for encryption purposes and should not be relied upon to authenticate the DB Instance itself.
  • You can configure your database to accept only SSL connections by using the GRANT command with the REQUIRE SSL option. You can do this on a per-user basis so you could, for example, require SSL requests only from users connecting from a non-EC2 host.
I have just started using Amazon Relational Database Service (RDS) for a couple of blogs and a real-time data harvesting tool I am working on for fun, so I probably won't get a chance to try out the SSL encrypted connections for a while. I'm sure it will help make people concerned about cloud security feel a little better.


HTML 5 - Web SQL Database


Crowdsourcing Database Administration

A while back Google released Fusion Tables allowing you to aggregate data from many data sources. Today they released an API for Google Fusion Tables.

Fusion Tables is a free service for sharing and visualizing data online. It allows you to upload data, share and mark up your data with collaborators, merge data from multiple tables, and create visualizations like charts and maps.

I have worked with many organizations assessing their internal database usage. It is extremely common for different groups and departments to use various local databases and spreadsheets to meet their data storage and sharing needs.

With Google Docs you can now work on spreadsheets online and collaborate with others, which has turned Google Spreadsheets into more of a data store.

Now with Google Fusion Tables you can aggregate data from uploaded sources or Google Spreadsheets.

You can allow different individuals and groups to own and maintain small or large data sets that are aggregated for different business goals.

The data can then be published and visualized in real-time using Google Maps and Google Visualizations on your web site or other location.


If there is an API database related story you'd like me to know about, you can submit it as a GitHub issue for this research project and I will consider adding it as part of my research.