Introduction to Beego

I tried Go for the first time a little less than two years ago, and it didn’t leave a very good first impression. One of the things I disliked the most was that, back then, the community seemed overly focused on performance, even when that meant writing unreadable code.

Another thing that bothered me was that the community was against opinionated web frameworks. Because of this lack of standardization, I had to relearn how things were done every time I looked at a new web project written in Go.

Today I decided to look again at the state of web frameworks, and there seem to be a few promising options out there. I saw a good number of people recommending Beego, so I decided to give it a try.

Read More

Dependency management with golang/dep

It has been more than a year since I wrote an article about dependency management with Glide. It seems like things have changed a little since then. The community has started work on an official package manager, dep, which will hopefully make things easier for developers.

Install

For Go applications, more than for any other language (because of the GOPATH requirement), I highly recommend using Docker. This is a minimal Dockerfile I’m using that includes both Go and dep:
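
Something along these lines (the Go version and the way dep gets installed here are just an example, not necessarily the exact file):

# A sketch of a minimal Go + dep image
FROM golang:1.9

# Install dep with go get (its documented installation method)
RUN go get -u github.com/golang/dep/cmd/dep

# The code has to live inside GOPATH (/go in the official image)
WORKDIR /go/src/app
COPY . .

RUN dep ensure
RUN go build -o main .

CMD ["./main"]

With the dependencies vendored by dep ensure, the build needs nothing installed on the host besides Docker.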

Read More

Introduction to GitLab

In a previous post I explored using Bitbucket Pipelines to generate and publish Docker images for my projects. I was worried I would reach the 50-minute limit pretty quickly, but even before I reached it I ran into other issues that made me look for other options. Namely, you can’t use docker-compose or the docker run command in Pipelines.

While looking for other options I found GitLab. I had heard a lot about GitLab in the past, especially about its very advanced deployment pipeline capabilities. One thing that I didn’t know, and that makes me very happy, is that they have a free tier that includes 2,000 minutes of CI per month (around an hour per day). This should be more than enough for my personal project needs.
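
A minimal .gitlab-ci.yml that builds a Docker image looks roughly like this (the job name and image tag are placeholders, just a sketch of the idea):

# Use the docker image plus the docker:dind service so docker commands work
image: docker:latest

services:
  - docker:dind

build:
  stage: build
  script:
    - docker build -t my-project:latest .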

Read More

Introduction to Bitbucket Pipelines

I have a few projects that I host on Bitbucket (mostly because I can have private repos for free). As I was working on some of these projects last week, I realized that there are a lot of manual steps I have to execute in order to verify that a project is in good health and to publish or deploy it.

Today I’m going to explore using Bitbucket Pipelines to generate a Docker image out of one of my projects and publish it to Canister.
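
As a rough sketch of where this is going (the registry address, image name and credential variables below are assumptions, not the actual values), a bitbucket-pipelines.yml for this looks something like:

# Enable the Docker service for the step so docker build and push are available
pipelines:
  default:
    - step:
        services:
          - docker
        script:
          - docker build -t cloud.canister.io:5000/myuser/my-project .
          - docker login -u $CANISTER_USER -p $CANISTER_PASSWORD cloud.canister.io:5000
          - docker push cloud.canister.io:5000/myuser/my-project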

Read More

Getting Rails to run in an Alpine container

I’m trying to get a little Rails application ready for production, and just for fun I want to make the image a little slimmer than it currently is (860 MB).

There is an official Docker image of Ruby with Alpine (ruby:alpine), but because of the gems that Rails uses (some with native extensions), it is a little more challenging than just referencing that image at the top of the Dockerfile.

Solving the issues

I added FROM ruby:2.4.1-alpine at the top of my Dockerfile and tried to build the image. The first problem I faced was with mysql2. For the mysql2 gem to work on Alpine, you need a compiler (build-base) and the MySQL development libraries (mariadb-dev). I added this to my Dockerfile:
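
Something like the following apk line (the exact flags may differ):

# build-base provides a C compiler; mariadb-dev provides the MySQL client
# headers the mysql2 gem needs to compile its native extension
RUN apk add --no-cache build-base mariadb-dev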

Read More

The Rabin-Karp algorithm

I was doing a little studying on algorithms when I stumbled onto what looked to me like a pretty simple question: “Find the first occurrence of a string inside another string”. This can be solved with two nested loops, with a worst-case performance of O(nm), or O(n^2) if m’s size is proportional to n.

Here is an implementation that uses two loops:

function findString(s1, s2) {
  // We only need to loop until s2 doesn't fit anymore
  var loopLimit = s1.length - s2.length;
  for (var i = 0; i <= loopLimit; i++) {
    for (var j = 0; j < s2.length; j++) {
      // As soon as we find a mismatch we abort the loop
      if (s1[i + j] !== s2[j]) {
        break;
      }
    }

    // If j reached the size of s2, it means all letters in s2 matched
    // and we have succeeded (j is still visible here because var-declared
    // variables are scoped to the function, not the inner loop)
    if (j === s2.length) {
      return true;
    }
  }

  return false;
}

findString('aaaaaaaaaa', 'aaaab'); // false
findString('aaaaaaaaab', 'aaaab'); // true
findString('aaaaaaaaab', 'ab'); // true
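
For comparison, the rolling-hash idea behind Rabin-Karp looks roughly like this (a sketch, not necessarily the implementation from the full post; the base and modulus are arbitrary choices):

// Rabin-Karp sketch: compare rolling hashes and only compare characters
// when the hashes match
function rabinKarp(s1, s2) {
  var base = 256;
  var mod = 101;
  var n = s1.length;
  var m = s2.length;
  if (m > n) return false;

  // base^(m-1) % mod, used to remove the leading character when rolling
  var high = 1;
  for (var i = 0; i < m - 1; i++) {
    high = (high * base) % mod;
  }

  // Initial hashes of the pattern and the first window of the text
  var patternHash = 0;
  var windowHash = 0;
  for (var i = 0; i < m; i++) {
    patternHash = (patternHash * base + s2.charCodeAt(i)) % mod;
    windowHash = (windowHash * base + s1.charCodeAt(i)) % mod;
  }

  for (var i = 0; i <= n - m; i++) {
    // Only do a character-by-character check when the hashes match
    if (patternHash === windowHash && s1.substr(i, m) === s2) {
      return true;
    }

    // Roll the hash: drop s1[i], add s1[i + m]
    if (i < n - m) {
      windowHash = ((windowHash - s1.charCodeAt(i) * high) * base + s1.charCodeAt(i + m)) % mod;
      if (windowHash < 0) {
        windowHash += mod;
      }
    }
  }

  return false;
}

rabinKarp('aaaaaaaaab', 'aaaab'); // true
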
Read More

Load testing a Rails app with Vegeta

I’m building a very simple app using Rails. While looking for guidance on preparing it for production, I found a lot of articles suggesting putting Nginx in front of it. After talking to some people, I learned a few of the reasons why this is suggested:

  • Nginx can serve static assets – This appears to be the biggest and clearest advantage. You can configure Nginx to serve static assets directly without hitting Rails at all, which matters because every request that reaches Rails blocks all other requests, since Ruby is single threaded (see the sketch after this list)
  • Nginx can do caching for you – Nginx can cache some of the static assets, which would give them a performance boost
  • Nginx is multithreaded – Nginx can serve multiple static assets at the same time that Rails is serving requests
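
The first point translates into an Nginx configuration roughly along these lines (paths, ports and the upstream address are placeholders):

# Serve files under /assets straight from disk, proxy everything else to Rails
server {
  listen 80;

  location /assets/ {
    root /srv/myapp/public;
    expires max;
  }

  location / {
    proxy_pass http://127.0.0.1:3000;
    proxy_set_header Host $host;
  }
}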

These are definitely advantages (especially the first one), but having Nginx in front of my server also adds complexity to my deployment. To figure out if the added complexity is worth it, I decided to run some load tests. Here I will explain how I did it and what the results were.
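
Vegeta drives the load from the command line; a run along these lines (the URL, rate and duration are placeholders) prints a latency report at the end:

# Hit the app with 50 requests per second for 30 seconds
echo "GET http://localhost:3000/" | vegeta attack -rate=50 -duration=30s | vegeta report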

Read More

Update Let’s Encrypt certificate without restarting your server

I started using HTTPS on my blog a few months ago, and today the time came to renew my certificate. I thought I had automated the process correctly, but it turns out that for my configuration I have to take some extra steps.

In my previous post I suggested using this command:

21 7,19 * * * /home/user/certbot-auto renew --quiet --no-self-upgrade

But it tries to spin up a server on port 80, and I’m already using port 80 for my blog, so that server fails to start.

There is another approach that allows you to renew your certificate without having to free port 80. It works by writing a file to a folder in your webroot and having the Let’s Encrypt server read that file. This sounds pretty straightforward, but it was actually a little tricky for me, since I’m using Docker.

My blog runs WordPress inside a Docker container. Inside the container the webroot is /var/www/html, and this folder contains all the WordPress files. I can’t write directly to this folder because it is inside the container, so I had to use a volume. I also can’t mount the whole /var/www/html folder because there are already files in that location inside the container. To make it work I had to mount to /var/www/html/.well-known, which is the folder certbot-auto creates.
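
Put together, the setup looks roughly like this (a sketch of the idea; host paths and container details are placeholders):

# Mount a host folder over .well-known inside the WordPress container
docker run -d --name blog -p 80:80 \
  -v /home/user/webroot/.well-known:/var/www/html/.well-known \
  wordpress

# The webroot plugin writes challenges to <webroot>/.well-known/acme-challenge,
# so pointing -w at the parent folder makes them show up inside the container
/home/user/certbot-auto renew --quiet --no-self-upgrade \
  --webroot -w /home/user/webroot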

Read More

Securing your network with iptables

There comes a time in every system administrator’s life when they need to start being a little more conscious about security. That time has finally come for me.

I have a couple of servers on DigitalOcean where I run various sites and services. Some of these need to communicate with each other to do their jobs; for example, this blog runs on a server with Apache and PHP and communicates with another server that runs a MySQL database.

This is all good, but one of the most important rules of security is to allow access to resources only on a per-need basis. From a security standpoint, nobody should be able to access a resource unless explicitly allowed. This rule applies to almost all scenarios that require some kind of access control, and it is a good idea to follow it whenever possible.
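
On the database server, that principle translates into a rule set roughly like this (the web server’s address is a placeholder, and this is a sketch rather than the final configuration):

# Always accept traffic on the loopback interface
iptables -A INPUT -i lo -j ACCEPT

# Allow replies to connections this server initiated
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Keep SSH open so we don't lock ourselves out
iptables -A INPUT -p tcp --dport 22 -j ACCEPT

# Allow MySQL, but only from the web server
iptables -A INPUT -p tcp -s 203.0.113.10 --dport 3306 -j ACCEPT

# Refuse everything else by default
iptables -P INPUT DROP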

Read More

Simple strategy for MySQL backups

I now have a good amount of data in my blog that I would be very sad to lose. As a precautionary measure, I decided to build a little system that backs up my data regularly, so I’m prepared in case of a disaster.

The strategy

The strategy is going to be very simple. I’m going to create a user in my database that has read permissions on the tables I want to back up. From a different machine, this user will run mysqldump and save the backups there. A cron job will do this once a day.
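
In rough terms (the user, host addresses, database name and paths below are placeholders):

-- On the database server: a user with just enough privileges for mysqldump
CREATE USER 'backup'@'203.0.113.20' IDENTIFIED BY 'a-strong-password';
GRANT SELECT, LOCK TABLES ON blog.* TO 'backup'@'203.0.113.20';

# On the backup machine: crontab entry that runs the dump every day at 3 AM
# (the % in the date format has to be escaped inside a crontab)
0 3 * * * mysqldump -h db.example.com -u backup -p'a-strong-password' blog > /backups/blog-$(date +\%F).sql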

Read More