The Shippable Blog

REST API Best Practice: OAuth for Token Authentication and Authorization

A big challenge with an API-based microservices architecture is handling authentication (authN) and authorization (authZ). If you are like most companies today, you are probably using an OAuth identity provider such as OpenID, Google, or GitHub. This takes care of both identity and authentication, but it does not address authorization (authZ).
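
To see that gap concretely, here is a deliberately simplified sketch (not from the original post) of a Node.js/TypeScript API using Express and jsonwebtoken. The token check only establishes who the caller is; names such as TOKEN_SECRET and the /projects/:id route are illustrative, and a real deployment would validate tokens against the identity provider's keys rather than a shared secret.

```typescript
import express from "express";
import jwt from "jsonwebtoken";

const app = express();

// AuthN: verify the bearer token obtained after the OAuth sign-in.
// This tells us *who* the caller is, nothing more.
app.use((req, res, next) => {
  const token = (req.headers.authorization || "").replace(/^Bearer /, "");
  try {
    const claims = jwt.verify(token, process.env.TOKEN_SECRET as string);
    (req as any).caller = claims; // identity established
    next();
  } catch {
    res.status(401).send("invalid or missing token");
  }
});

// AuthZ: the application still has to decide *what* this caller may do.
// Nothing in the token says whether the caller can read project :id.
app.get("/projects/:id", (req, res) => {
  res.send("project details would go here");
});

app.listen(3000);
```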

In our previous blog posts, we discussed two REST API best practices: making one database call per API route, and assembling complex objects that need to be displayed in the UI. In response, one of our readers asked a great question: if the design pattern is to always make one DB call per API route and then handle joins in the UI to create complex objects, how do we manage authorization/permissions? With a finished API, you can abstract authorization across the lower-level APIs.

This blog describes the pros and cons of two options we considered for handling authZ, and why we chose the approach we did. Our two possible approaches were:

- Create a user on the DB for every single user who interacts with our service, and manage all permissions at the DB level

- Create a superuser DB account that has “data modification access” and no “data definition access,” and use that account to access data

We were initially hesitant to go with option 2 since it meant accessing all data with superuser credentials, which felt like we weren't enforcing permissions at the lowest level we could. 
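
For concreteness, here is a minimal sketch (ours, not Shippable's production code) of what option 2 can look like in a Node.js/TypeScript API using node-postgres. The account name, table names, and the shape of the caller's permissions are all assumptions; the point is only that the database sees a single data-modification account while the permission check lives in the API layer.

```typescript
import { Pool } from "pg";

// The caller's identity and rights, derived from the OAuth token by the authN layer.
// Representing permissions as a list of project IDs is an illustrative simplification.
interface Caller {
  userId: string;
  projectIds: string[];
}

// One pool for the whole service, connected as the shared account. The account can
// read and modify rows but holds no data-definition (CREATE/ALTER/DROP) privileges.
const pool = new Pool({
  user: "app_data_rw",               // hypothetical shared account
  database: "app_db",                // hypothetical database name
  password: process.env.DB_PASSWORD,
});

// AuthZ happens here, in application code, before the route's single DB call.
export async function getProject(caller: Caller, projectId: string) {
  if (!caller.projectIds.includes(projectId)) {
    throw new Error("403: caller is not authorized for this project");
  }
  // One DB call per API route, as in the earlier best-practice posts.
  const result = await pool.query("SELECT * FROM projects WHERE id = $1", [projectId]);
  return result.rows[0];
}
```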

Let's look at both options in greater detail.

Declarative Continuous Deployment Pipelines with Docker

Everyone agrees that continuous deployment helps accelerate innovation. However, Continuous Deployment (CD) today is synonymous with fragile homegrown solutions made of disjointed tools cobbled together with thousands of lines of imperative scripts. Avi Cavale walks you through the CD maturity model and demos an end-to-end continuous deployment with declarative pipelines for Docker applications.

This video was recorded at the Docker Seattle Meetup, hosted by Tune.

Deploy your first Continuous Deployment Pipeline

As you know, we released our new implementation of continuous deployment pipelines last month. While our basic documentation is up to date, we believe that learning the new pipelines is best done with quick tutorials that demonstrate the power of CD and how easy it is to get started.

We have created a sample project and sample configuration to deploy the project to a test environment, create a release with semantic versioning, and deploy the project to production. The entire end-to-end scenario should take less than 30 minutes to try out, and while you won't learn every little trick, it will definitely make you comfortable with the configuration and how to set things up.

So read on and try it out!

How to deploy to Elastic Beanstalk (Part 2)

In the previous part, we went over the steps of deploying source code to AWS Elastic Beanstalk using a simple Node.js app. We deployed the source code natively at first, then compared that with deploying it through Shippable. The latter approach showed actions in the workflow being executed automatically for you by Shippable's unified CI/CD platform.

I'll take a similar approach for this part, where we'll go through deploying a Docker container of a Node.js app to AWS Elastic Beanstalk. To fully understand this tutorial, complete the previous source code deployment to AWS Elastic Beanstalk first.
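
For reference, the kind of app being deployed can be as small as the sketch below (a stand-in written in TypeScript against Node's built-in http module, not the tutorial's actual sample code). It assumes the listening port arrives via the PORT environment variable, with a fixed fallback for the containerized case.

```typescript
import * as http from "http";

// Read the port from the environment (Elastic Beanstalk's Node.js platform sets PORT);
// fall back to a fixed port that the Dockerfile would expose in the container case.
const port = Number(process.env.PORT) || 8080;

const server = http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("Hello from Elastic Beanstalk!\n");
});

server.listen(port, () => {
  console.log(`Listening on port ${port}`);
});
```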

Shippable Launches Industrialized Continuous Deployment Platform

SEATTLE, WA (Aug 25, 2016) - Shippable has announced the next generation of its continuous deployment platform. The enhanced platform adds key features like release management, multi-cloud capabilities, a declarative pipeline language, and a unified view across all application pipelines. These features help software-powered organizations further streamline the process of shipping software and accelerate innovation.

Today, most organizations find it challenging to innovate fast enough to satisfy consumers. DevOps is a set of principles that tries to solve this problem. However, the workflow required to get applications from source code to running in production is complicated and riddled with fragmented technology solutions. The only way to achieve rapid, iterative innovation is to cobble these fragments together into one continuous pipeline. Unfortunately, these custom, homegrown pipelines are rigid, inflexible, and hard to maintain. The do-it-yourself approach is a distraction and takes valuable cycles away from product engineering.

Shippable’s integrated platform is built from the ground up to defragment and streamline the process of shipping applications, so that software-powered organizations can accelerate innovation.