
This is not an introduction to JSON Web Tokens. There are plenty of those on the internet already. I’m trying to outline and compile my thoughts and research about when to implement JWT and how to do it safely.
TLDR: It’s easy to shoot yourself in the foot with JWT, but it’s very widely used and has some clear benefits. Be sure to do your research before diving in, and use a solid library.
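For a sense of what “use a solid library” can look like in practice, here is a minimal sketch using the widely used `jsonwebtoken` package; the secret source, claims, and expiry below are placeholder assumptions for illustration, not a recommendation from this post:

```javascript
const jwt = require('jsonwebtoken');

// Placeholder: load the signing secret from configuration, never hard-code it.
const SECRET = process.env.JWT_SECRET;

// Issue a token with a short expiry so a leaked token has limited value.
function issueToken(user) {
  return jwt.sign({ sub: user.id }, SECRET, { expiresIn: '1h' });
}

// jwt.verify throws if the signature is invalid or the token has expired,
// so callers can treat any thrown error as "not authenticated".
function verifyToken(token) {
  return jwt.verify(token, SECRET);
}
```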
This is just a personal reference.
Generate the key for the user. This should be done on the local machine:
```bash
ssh-keygen -t rsa -b 4096 -o -a 100
```
Create a new user on the remote machine. The `-m` flag adds a default home directory for the user.
```bash
useradd -m new-user
```

This is mostly just a quick reference for me to use when I need to whip up a docker server. This is on Ubuntu 16.04.

Redis is a simple key-value store that is highly optimized for fast reads and writes. I found myself in a situation where I wanted to offload some app task logging from our document store (MongoDB) to Redis.
There are a few important things to consider when making this kind of change.
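Before getting into those, here is a minimal sketch of what the Redis side might look like. It uses the `ioredis` client and an invented `task-log` list key purely for illustration; the actual app may use a different client or data structure:

```javascript
const Redis = require('ioredis');

// Connects to localhost:6379 by default; pass a URL or options for a remote server.
const redis = new Redis();

// Append a log entry to a Redis list and cap its length so it can't grow forever.
async function logTask(taskId, message) {
  const entry = JSON.stringify({ taskId, message, at: Date.now() });
  await redis.lpush('task-log', entry);    // newest entries at the head of the list
  await redis.ltrim('task-log', 0, 9999);  // keep only the most recent 10,000 entries
}
```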

Multi-tenant apps are apps where multiple users share the same database but their data is isolated from one another. This can describe almost any app with multiple users; for example, users can often only see and change their own data.
Personally, though, I define multi-tenant apps as having a layer of data isolation above the level of the user. For example, you could have a data model called an organization,
and a user can only see and interact with the data related to that organization.
This really isn’t a complicated problem, but I want to document it for later.
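As a sketch of the isolation I mean, assume a Mongoose/MongoDB setup with a hypothetical `Project` model that stores an `organization` reference; every query then gets scoped to the current user's organization:

```javascript
// Hypothetical Mongoose model; the organization field is what provides tenant isolation.
const Project = require('../models/project');

// Every read is scoped to the requesting user's organization,
// so one tenant can never see another tenant's projects.
async function listProjects(currentUser) {
  return Project.find({ organization: currentUser.organizationId });
}

// Writes stamp the organization onto the document so later reads stay scoped.
async function createProject(currentUser, attrs) {
  return Project.create({ ...attrs, organization: currentUser.organizationId });
}
```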
It’s hard to find a good title for this. Usually you will never use a Lambda function to upload to S3. For user-submitted files, the right way to upload to S3 is to generate a temporary signed upload URL so the user can submit directly to S3 without sending the file through the serverless function.
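A minimal sketch of that flow with the AWS SDK for Node might look like the following; the bucket name and request shape are example assumptions, and the client then PUTs the file straight to the returned URL:

```javascript
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

// Handler that returns a short-lived URL the browser can PUT the file to directly.
module.exports.getUploadUrl = async (event) => {
  const { fileName, contentType } = JSON.parse(event.body);

  const url = s3.getSignedUrl('putObject', {
    Bucket: 'my-upload-bucket',  // example bucket name
    Key: fileName,
    ContentType: contentType,
    Expires: 300,                // URL is only valid for five minutes
  });

  return { statusCode: 200, body: JSON.stringify({ url }) };
};
```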

Many web apps rely on executable binaries to function. For example, if you want to do any kind of image processing, then in addition to the libraries you are using you usually need a program like `imagemagick` to make it actually work.
So, if you ever want to build a sophisticated web app with the Serverless Framework, you need to be able to upload and use executable binaries. And it’s best if you can upload them in such a way that the library knows how to find them without any extra configuration.
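One common way to get there (sketched here with assumed paths, not taken from this post) is to ship the binary in a `bin/` folder inside the deployment package and prepend that folder to `PATH` before the wrapper library runs, so tools like `imagemagick` are found automatically:

```javascript
const path = require('path');

// On AWS Lambda, LAMBDA_TASK_ROOT points at the unpacked deployment package.
// Adding our bundled bin/ directory to PATH lets libraries that shell out
// (for example, imagemagick wrappers calling `convert`) find the binary.
const binDir = path.join(process.env.LAMBDA_TASK_ROOT || __dirname, 'bin');
process.env.PATH = `${binDir}:${process.env.PATH}`;

module.exports.handler = async (event) => {
  // ... image processing that spawns the bundled binary goes here ...
  return { statusCode: 200, body: 'ok' };
};
```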

As a Rails developer turned JavaScript zealot, I sometimes miss the structure and opinions of the Ruby on Rails world. It’s amazing how bare-bones many JavaScript libraries are. They are so modular and self-contained (good things) that, unless you take the time to add some structure and organization to your code, it’s easy for your project to feel chaotic and unorganized. So I’m always looking for good patterns and structure to follow in my JavaScript projects.
The Serverless Framework is a good example of this. It’s so minimal in its setup that it may be difficult to know where to start to give it some structure. So here I’ll share with you one possible way to structure a serverless API project.
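One pattern I find useful, sketched below with invented file names and not necessarily the full structure this post lands on, is to keep the exported handlers thin and push the real logic into plain modules that are easy to test on their own:

```javascript
// handlers/users.js -- thin wrapper that only deals with the event/response shape
const users = require('../lib/users');

module.exports.create = async (event) => {
  const user = await users.create(JSON.parse(event.body));
  return { statusCode: 201, body: JSON.stringify(user) };
};

// lib/users.js -- plain business logic, no Lambda-specific types, easy to unit test
module.exports.create = async (attrs) => {
  // ... validation and persistence would live here ...
  return { id: 1, ...attrs };
};
```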

The industry trend of decoupling backends and frontends has lots of advantages. You could argue that it’s just good software design. Plus, it makes it much easier to have multiple front-end clients using the same backend. And since mobile apps don’t use cookies, it makes sense to convert the entire authentication system to some kind of token-based solution.
But the next question is how you can safely and conveniently store and manage these tokens in your React+Redux app.
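One answer, sketched here with invented action names, is to keep the token in the Redux store so components and API helpers read it from a single place; where you persist it across reloads (localStorage, a cookie, etc.) is the part worth researching:

```javascript
// A tiny reducer that owns the auth token.
const initialState = { token: null };

function authReducer(state = initialState, action) {
  switch (action.type) {
    case 'AUTH/SET_TOKEN':   // invented action type for illustration
      return { ...state, token: action.token };
    case 'AUTH/CLEAR_TOKEN':
      return { ...state, token: null };
    default:
      return state;
  }
}

// API helpers read the token from the store and attach it to each request.
function authHeaders(store) {
  const { token } = store.getState().auth;
  return token ? { Authorization: `Bearer ${token}` } : {};
}

module.exports = { authReducer, authHeaders };
```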
The library that everyone uses to manage environment variables in Node is dotenv. I don’t think I’ve ever had so much trouble with such a popular module.
What I want is to have my development environment run with one set of environment variables and my tests run with a different set.
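One way to get that, sketched below with assumed file names `.env` and `.env.test`, is to pick the dotenv file based on `NODE_ENV` using dotenv’s `path` option:

```javascript
// config/env.js -- load a different dotenv file depending on the environment.
const dotenv = require('dotenv');

// Assumed convention: .env for development, .env.test for the test suite.
const envFile = process.env.NODE_ENV === 'test' ? '.env.test' : '.env';

dotenv.config({ path: envFile });
```

Requiring this once at the very top of the app entry point (and of the test setup file) keeps the two sets of variables from bleeding into each other.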