Mauro Morales

software developer

Author: Mauro

  • It’s Kairos Time

    I’m excited to announce that I joined Spectro Cloud. I’ll be part of the team building Kairos, the immutable Linux meta-distribution for edge Kubernetes.

    Ok, that’s a lot to unpack, and I’m still very new to it, so I have numerous questions myself. But for my own sake, let me break it down:

    • Immutable Linux: some parts of the OS file system are read-only. This means that if you want to add a package or make a configuration change, you need to build a new image of the OS with those changes. This is good for two reasons: it reduces the attack surface, and it makes it easy to roll back to a specific version of the OS.
    • Meta-distribution: you can pick the flavor of the base Linux distribution on which Kairos is built. From what I can tell, openSUSE, Ubuntu and Alpine are already available, and others could follow.
    • Edge computing: systems nowadays tend to be centralized in datacenters. While this can be beneficial in some cases, it can be impractical in others. When a system runs far away from the datacenter, it’s running at the edge of the network. For example, a computer in a parking lot, taking pictures of license plates and calculating the amount drivers need to be charged.
    • Kubernetes: a platform to deploy and operate containerized applications. It was started by Google, became quite popular, and is now part of the CNCF.

    For as long as I can remember, I’ve been a Linux enthusiast, so I’m very much looking forward to this experience.

  • The Maintenance Price Tag

    When adding new features to a software product, we tend to plan them by the value they bring versus the cost of development. However, it’s critical to consider that they also come with a maintenance price tag attached. Ignoring this cost can affect the user experience, cause a system crash, and/or make it harder to change the codebase.

    In an ideal world, we’d pay for the cost of developing a feature, but once it’s out, we’d never have to spend more resources on it. And while there might be a few software products out there that run with minimal maintenance, we should not assume this will be our lucky fate. Yet product and development teams alike seem to think this is the case.

    Because of this misconception of pay once, enjoy forever, we constantly fail to evaluate the maintenance costs of a product before we even start coding it, and only address them when something is broken. We are even willing to cut corners deliberately to ship sooner because we believe that first to market wins the race every single time. And then everyone starts noticing the problems:

    • Customers experience a slow and buggy product.
    • Developers have a hard time understanding and changing code.
    • Product has to cut down features or cancel their development because the team’s velocity is too low.

    Instead of addressing the problems, more corners are cut, and eventually it becomes a habit and the only way you do things. By the time you realize it, you have dug your own grave and, soon enough, start talking about full refactorings. Paying down some tech debt is hardly ever an option. Developers are blamed for building such a monstrosity, but product and business were right there in the co-pilot seat too.

    To avoid this situation, product and developers must work together to make maintenance part of the development process. The product team should, at the very least, plan healthy buffers into units of work so developers can address maintenance, but ideally understand it well enough to be the ones planning for it. And the development team shouldn’t work without thinking of the maintenance impact of their code, and should own their solutions all the way to production.

    The most important thing to understand is that doing maintenance proactively is always cheaper than having to react. And while fully getting rid of emergencies is not possible, reducing their number will give you more time for feature work and will keep the team’s stress levels under control.

  • 10 Years Working Abroad

    Today, I’m celebrating 10 years since I moved to Europe and became a migrant worker. Living abroad helped me grow as a programmer, a professional, and a human being. This is a short summary of the process, challenges, and key learnings from this experience.

    In 2011, I finished my first big web development project, and it left me with the feeling that there had to be better ways to build software. This is how I found out about the Ruby on Rails web framework and its community. They were talking about practices like automated testing, which for me was something new and appealing. There weren’t many local or remote Ruby jobs I could apply to, so I decided to search abroad.

    I reviewed hundreds of job openings (even those that were not sponsoring visas) to learn about the most requested skills and studied them. I also did a couple of pet projects and one freelancing gig. After many months of preparation, I felt confident enough, and started applying.

    As a university dropout, I was afraid no company would sponsor me. Thankfully, I already had six years of experience, so I was able to apply for a skilled-worker visa. With a university degree, this would have been a tad simpler because I’d have been able to request a blue card. In practice, however, I haven’t really noticed the difference besides the length of the validity of the permits.


    While living in Europe, I’ve had one or two really challenging situations. Like the time a company decided to cancel my contract only 2 days after we signed it. But these kinds of issues put you in fight-or-flight mode, and you find a way to overcome them. Yes, it was stressful, but I managed to find something new in less than a month. In my experience, the things that are more likely to break you are the daily challenges:

    • Long distance relationships: My wife and I have lived a third of these years apart. It’s important to be aligned and have a vision because there will be times when you need a reminder to not lose faith.
    • Lacking a support circle: I saw a major difference between the time I lived in Switzerland alone and the years my wife has been there to go through this adventure together. We’ve also made meaningful friendships along the way. These people don’t just make it easier to be abroad; they become so close that you build some of your most significant memories together.
    • Cultural differences: Sometimes locals don’t understand you, and vice versa. I learned not to take any of these differences personally. It’s possible to immerse yourself in a different culture without losing your own in the process.
    • General knowledge: There’s a lot of general knowledge you lack when you’re in new territory. The housing market, taxes and medical system can be tricky to understand at best. Simple things like the business opening hours can be frustrating. Thankfully, there’s a lot of information online, so it’s just a matter of investigating and learning to survive a few embarrassing moments.
    • The feeling of not settling anywhere: The excitement of a new place eventually fades away, and sometimes you can feel anxious about not settling. I’ve seen an improvement since I stopped overthinking about the next stop in our journey and focused on enjoying the current stop.


    Even with all those challenges, the experience of working abroad is very rewarding. Living in Central Europe, you can travel to many countries nearby, learn or practice one or multiple languages, attend many cultural and technological events, and much more. And while Europe might not be the tech mecca that developers see in the US, it is a great place to learn about software development.

    In Switzerland, my boss taught me a lot about Lean principles and system administration. I also learned about configuration management and hardening Apple’s OS. I wasn’t using Ruby on Rails that much, which enabled me to focus on learning Ruby and OOP.

    At SUSE, I learned a great deal about Linux and tooling. This was my first experience with Extreme Programming, and I saw my technical and soft skills grow exponentially. I learned to think more like an engineer than a developer. I also expanded my horizons and played a bit with new technologies, like the Go programming language and Docker.

    I then had two shorter but valuable experiences. One at Babbel, where I learned about Serverless on AWS, and another one at CloudBees, where I dove deep into the topic of CI/CD while working on CodeShip and Jenkins-X.

    To finalize this 10-year journey, I’m now learning how to lead a dev team at Ring Twice. I’m also figuring out how to work with management to make company-wide changes that improve the development process.

    What’s next?

    The last decade has been a fantastic experience. I’m more than satisfied with how things turned out and would recommend that anyone interested in working abroad give it a try. For those interested but not able to move, don’t despair: the world is a very different place now, and it’s a lot easier to find good remote jobs. If you have questions on how to get started, don’t hesitate to get in touch (social links at the bottom of the page).

    Whether we continue on our expat journey or not will depend on the professional opportunities that open up. While this uncertainty can trigger my anxious side, I’ve also learned to accept that much of the success we’ve had was just us riding a wave, and not the result of meticulous planning. Amor fati.

    In terms of career, my goal is to keep growing as a software developer and to become a better leader and mentor. I’ll also make a conscious effort to create my first information product. Stay tuned!

  • Rails Routing Advanced Constraints for User Authentication Without Devise

    We often mount engines and restrict access to admin users via Devise. In this post, I’ll show you how to do the same when using a different authentication mechanism.

    Let’s take for example the Sidekiq engine. According to their wiki, all we need to do is surround the mount using the authenticate method.

    # config/routes.rb
    authenticate :user, ->(user) { user.admin? } do
      mount Sidekiq::Web => '/sidekiq'
    end

    But since this method is a Devise helper method, how can we achieve the same results when we use a different authentication mechanism?

    It turns out it’s actually very simple: we can use a Rails advanced constraint.

    # config/routes.rb
    mount Sidekiq::Web, at: '/sidekiq', constraints: AdminConstraint

    Not too shabby! It looks even better than the Devise helper method IMO. But let’s dive into this constraint.

    For the sake of simplicity, I will assume that our authentication mechanism consists of a JWT token, which gets saved in a cookie, and a service which takes care of verifying that token. This service returns a user when successful, or nil otherwise. Replace this behaviour with whatever mechanism you have instead.
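    To make this concrete, here’s a runnable sketch of what such a verification service could look like. This is not a real JWT implementation, just an HMAC-signed "payload.signature" token; SECRET, User, and VerifyToken are illustrative stand-ins for your app’s actual pieces.

    ```ruby
    require 'openssl'
    require 'json'
    require 'base64'

    SECRET = 'change-me'                 # stand-in for your app's signing secret
    User =, :admin)

    class VerifyToken
      def initialize(token)
        @token = token.to_s
      end

      # Returns a User when the signature checks out, nil otherwise
      def call
        payload, signature = @token.split('.', 2)
        return nil if payload.nil? || signature.nil?
        expected = OpenSSL::HMAC.hexdigest('SHA256', SECRET, payload)
        return nil unless expected == signature # use a constant-time compare in real code
        data = JSON.parse(Base64.strict_decode64(payload))['id'], data['admin'])
      end
    end

    # Issuing a token the same way, just for this demo
    payload = Base64.strict_encode64(JSON.generate('id' => 1, 'admin' => true))
    token = "#{payload}.#{OpenSSL::HMAC.hexdigest('SHA256', SECRET, payload)}"

    puts  # prints true
    ```

    A tampered or malformed token simply yields nil, which is all the constraint below needs.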

    # app/constraints/admin_constraint.rb
    class AdminConstraint
      class << self
        def matches?(request)
          # VerifyToken stands in for your token verification service
          user =['authToken']).call
          user.present? && user.admin?
        end
      end
    end

    Yes, it’s a bit more code, but not that much, and it allows us to keep the routes file cleaner and to have a single place where we define what admin access means.
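    The contract Rails expects here is tiny: any object responding to matches?(request) works as a constraint, and the route only dispatches when it returns true. A minimal plain-Ruby illustration of that contract (FakeRequest, FakeUser, and the cookie lookup are stand-ins for illustration only):

    ```ruby
    # Stand-ins so the contract can be exercised without Rails
    FakeRequest =, keyword_init: true)
    FakeUser =, keyword_init: true)

    class AdminConstraint
      def self.matches?(request)
        # In the real app this is where the auth cookie would be verified;
        # here we pretend the cookie already holds a verified user.
        user = request.cookies['user']
        !user.nil? && user.admin
      end
    end

    admin = FakeRequest.new(cookies: { 'user' => true) })
    guest = FakeRequest.new(cookies: { 'user' => false) })

    puts AdminConstraint.matches?(admin)  # prints true
    puts AdminConstraint.matches?(guest)  # prints false
    ```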

    Let’s finish the job by adding a test. I like RSpec, so I’ll write a request test.

    I’ll also assume that you have a token generation service.

    # spec/constraints/admin_constraint_spec.rb
    require "rails_helper"

    # We don't want to rely on Sidekiq for our test, so we'll create a dummy engine
    module MyEngine
      class Engine < ::Rails::Engine
        isolate_namespace MyEngine
      end

      class LinksController < ::ActionController::Base
        def index
          render plain: 'hit_engine_route'
        end
      end
    end

    MyEngine::Engine.routes.draw do
      resources :links, :only => [:index]
    end

    module MyEngine
      RSpec.describe "Links", :type => :request do
        include Engine.routes.url_helpers

        before do
          Rails.application.routes.draw do
            mount MyEngine::Engine => "/my_engine", constraints: AdminConstraint
          end
          cookies['authToken'] = token
        end

        after do
          # Restore the routes we overrode in the before block
          Rails.application.reload_routes!
        end

        # IssueToken stands in for your token generation service
        let(:token) { }

        context 'with an admin token cookie' do
          let(:user) { create(:user, admin: true) }

          it "is found" do
            get links_url
            expect(response).to have_http_status(:ok)
            expect(response.body).to eq('hit_engine_route')
          end
        end

        context 'with a non-admin user' do
          let(:user) { create(:user, admin: false) }

          it "is not found" do
            expect {
              get links_url
            }.to raise_error(ActionController::RoutingError)
          end
        end
      end
    end

    Et voilà! We’re sure that our constraint behaves as expected.


    All the code in this post was based on the documentation from the following projects:

  • Running MNT Reform OS on an NVME Disk

    Running the MNT Reform 2 from an SD card is not a bad solution; it’s similar to the way a Raspberry Pi runs. However, I wanted to free up the SD card slot. In this post, I describe the whole process, from picking and buying an NVMe SSD to installing and configuring it.

    But before I continue: I cannot take credit for this work, as it’s summarized in the Operating System on NVMe Without SD Card post. I just wanted to give a little more detail about the steps I took and some of the mistakes I made, and add some information related to using an encrypted device.


    • 1 NVMe disk
    • 1 Phillips screwdriver
    • 1 M2x4mm pan head screw (included in the DIY kit)


    I bought the one that MNT puts in the assembled version of the Reform 2, a 1 TB Transcend MTE220S, because I didn’t want to risk it. I bought it online (I tried to look for it in local businesses in Belgium, but I couldn’t find it in the ones I was suggested to check). The price was around 125 euros with shipping included.

    There’s a community page on Confirmed Working NVMe Drives that will hopefully hold more options in the future but so far, the Transcend disk seems like a very good one.


    1. Disconnect the laptop from the power
    2. Discharge yourself by touching a metal surface or using a discharge bracelet
    3. Remove the acrylic bottom
    4. Remove the batteries
    5. Place the NVMe device in the M2 socket
    6. Secure it

    Do not close the laptop just yet. Turn it around, plug in the power, turn it on, and log in with your user. If the installation was successful, you should be able to see the device in the Disks application.


    The next step is to create one or more partitions on the disk. I used GNOME Disks, but it’s limited because you cannot create logical volumes, so you might want to install GParted or follow a CLI tutorial on how to achieve your specific partitioning setup.

    Note: If you are planning to use the whole disk without partitioning, and only format it using the Drive Options menu (the 3 dots at the top right corner), the script mounting your partition, /sbin/reform-init, might have issues because of the name of the device. At least that’s what I experienced the first time I did this process.

    My current setup is one encrypted partition with an ext4 file system for root and one encrypted partition with an ext4 file system for home (I will write about this in a future post). This means that I have to enter two passwords when booting. I’m planning to use a key in the future, but if you don’t want to enter two passwords, read about logical volumes; if you don’t want encryption at all, then you don’t have to worry about this.

    1. Select the NVMe Disk
    2. Click on the + sign to create a new partition
    3. Select the size (needs to be at least the size of the SD card) and continue
    4. Give it a name e.g. “root”
    5. Select ext4 as your file system
    6. Select encryption with LUKS
    7. Press “Next”
    8. Add a pass phrase
    9. Press “Done”


    To copy all your data from the SD card to the NVMe disk, we first need to unlock the disk. The first argument is the path to the device, so it needs to match whatever partition number you used in the previous step. The second argument is the name you want to give it, so choose whatever you prefer.

    # cryptsetup luksOpen /dev/nvme0n1p1 crypt

    The unencrypted partition will be accessible at /dev/mapper/crypt.

    We can use that path to run the reform-migrate script

    # reform-migrate /dev/mapper/crypt

    You can of course use Disks to unlock (the open-lock button) and mount (the play button) the device instead. In that case, you will need to use the following command to move all your data.

    # rsync -axHAWXS --numeric-ids --info=progress2 / /media/USER/NAME

    Make sure to update the last argument to be the path to where you mounted the device.


    Booting from the NVMe disk is a two-step process. We first need to configure the laptop to boot from the eMMC drive, and then configure it to decrypt and mount the NVMe drive and init from it.

    Read more about this topic in Sections 10.2 and 10.3 of the Operator’s Handbook.

    To switch the boot mechanism from the SD card to the internal eMMC module, where the MNT rescue disk resides, we need to flip a DIP switch underneath the heat sink.

    1. Shutdown and disconnect from power
    2. Remove the heat sink (be careful not to lay it flat on a surface, since there’s thermal paste on the bottom)
    3. Flip the dip switch on the bottom right (or top left depending on your perspective)
    4. Place the heat sink back in place

    We can now plug in the power again and start the machine. When prompted for a login, use “root” without a password, since this is a completely different system from the one configured on the SD card.

    Now we need to download a newer version of U-Boot onto the rescue disk.

    # wget

    U-Boot is a mini OS used to boot Linux. From what I understand, this “newer” version is the same version that is on the SD card, so using that one instead of downloading a new one would also be an option.

    To flash the new U-Boot, we need to unlock the boot partition.

    # echo 0 > /sys/class/block/mmcblk0boot0/force_ro

    And flash the binary

    # dd if=flash-rescue-reform-init.bin of=/dev/mmcblk0boot0 bs=1024 seek=33

    Now that we have this U-Boot version in place, we can configure it to boot from the NVMe drive.

    # reform-boot-config nvme

    This creates the file /reform-boot-medium with the word nvme in it. This is important because it’s used by reform-init.

    Note: One important thing to mention is that reform-init will only try to unlock and mount the encrypted partition at /dev/nvme0n1p1. With a different setup, you need to modify this script to point to the right path. I stumbled across this problem on my first attempt, but it was quite simple to debug, and it helped me understand better what’s going on under the hood.

    If everything went well, you should be able to reboot the device, and it will boot from the NVMe drive successfully. To finalize the process:

    1. Shutdown the system and unplug it
    2. Put the batteries back in place (be careful with the polarity)
    3. Place the acrylic bottom
  • MNT Reform 2 DIY Kit Review

    The MNT Reform 2 laptop was made available on Crowd Supply in June 2020. This review is for the DIY kit version, and I’ll focus on the experience of supporting this project and its vendor through crowdfunding, the process of putting the machine together, and my first impressions. I plan to share a second post with my thoughts on the experience of using the device as my personal computer.


    The MNT Reform 2 is an open hardware laptop. It comes with the open source operating system Debian Linux pre-installed. The DIY kit is simply a disassembled version with a set of instructions on how to put it together.

    Nowadays, laptops, and most electronic devices, lose their warranty if you tinker with or repair them yourself. This laptop is one of the few that invites you to open it and make it your own. If you don’t believe me, take a peek through the bottom, made of see-through acrylic.


    As soon as I saw the project on Crowd Supply, I got hooked and decided to support it. If I remember correctly, the project got fully funded reasonably quickly, and by the time the campaign finished, the number of backers tripled.

    The original shipping date was in December 2020, but I only received mine in April 2021. Four months of waiting can sound like a lot, but you need to consider that many producers and shipping companies had delays because of the COVID-19 pandemic. It would have been silly to expect that MNT wouldn’t be affected by this. On top of that, there are always delays in February because of the Chinese New Year. In the end, I think these delays ended up being positive because the MNT team used the time to make improvements to the keyboard and battery life. Lukas, MNT’s CEO, constantly shared progress updates and any delays. It was pretty entertaining to follow.


    In my opinion, most open projects don’t have very appealing branding. MNT is the complete opposite. I’m glad they put the same passion into the packaging as they did into the product.


    The kit comes with a big printout that has the instructions on one side and, on the other, pictures to give you a good idea of what is what. All you need is a Phillips screwdriver.

    While most steps were clear, there was one I couldn’t figure out: the right way to plug in the monitor. Fortunately, all I had to do was open the device again and invert the connector. In total, it took me between 2.5 and 3 hours to get the machine to boot. I swear I hadn’t had this much fun with a device in a very long time.



    The laptop is gorgeous. When the lid is closed, it has this old-school ThinkPad vibe. The aluminum enclosure is pleasant to the touch and hardly picks up any fingerprints. The MNT Reform 2 is quite thick compared to today’s standards, which has its benefits, as you’ll see.

    On one side, there’s an HDMI port and three USB Type-A ports. On the other side, an SD card slot, a headphone jack, an Ethernet port, and the charging port. I’m pleased about this because there’s nothing more annoying (and ugly, if you ask me) than all those dongles coming out of a beautiful laptop.

    If you flip the computer over, you can peek into the electronics thanks to the acrylic bottom. At first, I wasn’t very excited about this feature. It’s not that I don’t like it, but I think that after a while, it will get scratched, and then it won’t look as good. If there had been an option to get it with an aluminum bottom, I would have probably gotten that one. But I’m glad it is this way because it feels like an invitation to open and tinker with the device.

    If you open the lid, the first thing you notice is the trackball. It’s also possible to buy it with a trackpad, but I thought this would be more fun. Plus, I can always replace it if I don’t like it. So far, it’s been quite fun to use, but it will take some time to get used to it.

    Next, you might notice the small display on top of the keyboard. It’s helpful to get additional feedback, like battery percentage or the system’s status. The best part is that I can turn it off whenever I don’t need to look into it and avoid wasting precious energy.

    And of course, here’s where we find the mechanical keyboard. The keycaps feel very natural to the touch. The switches have excellent travel and sound amazing. However, this could be problematic when working with others, just like with any other mechanical keyboard, and there doesn’t seem to be a way to put dampers on these switches. I’m now used to Cherry MX Silent Red switches, and these are louder. Typing is very comfortable, except for the keys I press with my thumbs. The keyboard sits only slightly above the level of the palm rest, and because of the long travel, I feel like I’m constantly pushing the palm rest with the side of my thumb. In my opinion, raising the keyboard a little or making the incline between the palm rest and the keyboard a bit more prominent would help.

    The layout of the keys is quite good. I love having a dedicated row for the function keys, and a split space bar with two Alt keys in between, which makes them more natural to reach. Instead of a Caps Lock key, you get a Control key, which I already configure on every other laptop. Details like these make this laptop feel tailor-made. I don’t particularly appreciate having the up arrow where the Shift key typically sits, but that’s where the open-source part comes in very handy. I plan to flash a new layout onto the keyboard, hopefully one with multiple layers.

    The last tiny issue I’ll mention is that the printing on the quotes key was inverted, but MNT is already aware of this, so they might have it fixed on future machines. None of the issues I just mentioned are deal-breakers. This keyboard is by far the best I’ve used on a laptop.

    On the top panel are the display and two speakers. I like the side and top bezels, but the bottom one is a bit prominent and could use, in my opinion, some design or an MNT logo; I decided to put the sticker with the serial number there. The top panel feels solid: I can move it, and it doesn’t wobble. The hinges feel sturdy, like they could last forever. The display quality is excellent, but the speakers are a bit too quiet.

    Last but not least, I must mention the parts that are not present with the machine. The MNT Reform 2 doesn’t have a webcam or a microphone (I remember reading somewhere that this was by design thinking about privacy first, but I couldn’t find this information on the Crowd Supply or MNT Reform websites). The DIY kit doesn’t come with an SSD or a WiFi card, but these can be bought online or at a local store. Lukas shared the exact models that come with the assembled version.


    The MNT Reform 2 initially boots into text mode, where you first have to follow a few steps to create your user account. The Operator’s Handbook explains every step in detail. Once you have an account, you can start the graphical interface. The three options that come pre-installed are Sway, Gnome 3, and Window Maker. But you can install any other that’s available for Debian.

    Sway is a tiling window manager, and you make heavy use of shortcuts to control it. It takes a little getting used to, but it feels suitable for this machine. It’s also the only one described in detail in the Operator’s Handbook. The device runs smoothly while using Sway. So far, my test consisted of watching a video with MPV, browsing the web, and editing text with Neovim simultaneously. The only case when the machine started struggling was when I tried to improve this text using Grammarly. Their JavaScript app doesn’t crash Firefox or Chromium, but it is painfully slow. While I could blame the machine for not having enough power, I think the problem is we’ve gotten used to web-based technologies built without performance in mind. I find it ridiculous that you need a high-end laptop to run a web application. I tried to use my iPad (A12 chip) for comparison, but Grammarly doesn’t even let you use the web application on Safari for iOS, which proves my point.

    The Gnome 3 version that comes with the MNT Reform 2 has fewer components than the vanilla version. I guess that helps reduce the load, since full DEs are very power-hungry. I tried to do the same experiment as with Sway, but unfortunately, none of the videos played well in MPV, and in general, I felt a bit of lag when using Gnome for some tasks.

    If you cannot live with a tiling window manager, I recommend you go with Window Maker. It doesn’t look very up-to-date, but after trying it for a while, I must say it performed very well. Like with Sway, there was no issue at all having my three designated applications running simultaneously.


    I’ve been using the MNT Reform 2 to write these notes down for the past few days. Hearing the sound of the mechanical keyboard is music to my ears. Not having a network connection allows me to concentrate on what I want to say instead of being annoyed by multiple notifications. From time to time, I plug in an Ethernet cable to search for something online and eventually publish these words on my blog. The whole experience reminded me of times when computing felt a lot more personal, and our lives didn’t need to be online 24/7.

    Not only has it been fun to build and use this machine, but I’m also very excited about the idea of being able to service and extend its life. I love the concept of having a device that evolves according to my needs. Above all, it feels good to own a device not because it’s the latest and greatest but because its ethos resonates with my own. Only time will tell if the MNT Reform 2 will live up to its promises, but I’m certainly rooting for it.

  • Ruby On Rails: Storing JSON Directly in PostgreSQL

    Whenever we save data from one of our Rails models, each attribute is mapped one-to-one to a field in the database. These fields are generally of a simple type, like a string or an integer. However, it’s also possible to save an entire data object in JSON format in a field. Let’s see an example of how to do this from a Ruby on Rails application.

    For this example, let’s assume that I have a Page model where I want to save some stats. To begin, we’re going to generate a new migration that adds the stats field and defines it as type JSON, which by default will save an empty array.

    def change
      add_column :pages, :stats, :json, default: []
    end

    Once migrated, let’s have a deeper look at how our pages table looks.

    \d pages
            Table "public.pages"
     Column | Type | Default
    --------+------+------------
     stats  | json | '[]'::json

    Now that is interesting: unlike more common defaults such as 0 or false, the default value of this field is literally the string '[]' cast to JSON. Let’s play a little with this and cast an array with values.

    SELECT '[1, 2, 3]'::json;
        json
    -----------
     [1, 2, 3]
    (1 row)

    It turns out PostgreSQL also offers a set of functions to handle JSON data. Let’s say, for example, that I wanted to get all pages that have no pre-calculated stats. This can be done using the json_array_length function.

    SELECT *
      FROM pages
     WHERE json_array_length(stats) = 0;

    This is way more performant than fetching the data, deserializing it, and loading it into a Ruby array just to calculate its length.

    Ok, that’s all nice, but what about the cases when I do need to load the data into a Ruby object and then save it back? You’ll be happy to know that you don’t need to do anything else. Rails will do all the heavy lifting of serializing and deserializing for you, and provide getter and setter methods so you can interact with the attribute as you normally would.

    page = Page.find(1)
    => Array
    page.stats = [1, 2, 3]
    => [1, 2, 3]
    => true
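    Under the hood, this round-trip is plain JSON serialization. A rough, runnable sketch of what Rails does for a :json column:

    ```ruby
    require 'json'

    # Serialize the Ruby object to a JSON string when writing,
    # and parse it back into Ruby objects when reading.
    stats = [1, 2, 3]
    stored = JSON.generate(stats)  # the string that lands in the json column
    loaded = JSON.parse(stored)    # what the attribute getter hands back

    puts stored           # prints [1,2,3]
    puts loaded == stats  # prints true
    ```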

    Throughout this example, I used a very simple array, but you can of course store much more complex data objects, like you normally would with JSON. But be careful not to shoot yourself in the foot! Just because you can save a lot of data into a JSON field doesn’t mean that you should. Evaluate first whether what you need is an additional model that relates to the one you’re working with.

    Want to know more? Check out the PostgreSQL documentation on the JSON datatype and the functions you can use.

  • Using a Hackathon to Stress Test Your Development Process

    A hackathon’s value proposition is generally one of innovation. Companies see these events as an investment to come up with new products. However, I recently found out they are also a great way to reveal existing flaws in our software development processes.

    I’ve participated in a handful of coding events before and have come to appreciate hackathons as a good way to boost camaraderie and do some individual learning. That’s why I got very excited when, a few months ago, our CTO announced we were going to do ListMinut’s first hackathon. And so, for 3 days in mid-September, our product development team moved to the Belgian coast to design and code an MVP.

    Let me first give you a bit of context. ListMinut is growing, and while this is good, it’s also challenging. Just like in most other businesses, at first it was possible to add features quickly and easily, but with time the development process became harder and slower, and throwing more manpower at it doesn’t seem to balance the situation. This scenario is very similar to a city whose streets grew organically and were not designed with growth in mind: at some point, the excess in population causes the entire traffic system to collapse. Fortunately for us, software is way more malleable than a city.

    To mitigate these growing pains, there are two major changes we are introducing. On the operational side, we are setting agile processes in place to help us work smarter. And on the technical side, we are improving code quality and optimizing performance, while also evaluating architectural changes which can deliver long-term benefits. This is basically why I joined the team.

    The main motivation behind the hackathon was to address the latter. What I didn’t expect was how good the hackathon would be at surfacing flaws in the way we work. You see, as part of my work during my first four months, I’ve been pushing for (1) shorter-lived feature branches, (2) code reviews, (3) automated testing, and (4) improved code quality. The team has taken all these changes very positively. I’ll even allow myself to say that they’ve been somewhat enthusiastic about it, which has not only made my job much easier but is enabling us to reap the benefits early on.

    During the hackathon, without any request from my side, the back-end team didn’t rush in to code like crazy but instead followed each of these 4 principles, which completely made my day. Because we did so, we can easily incorporate what we built during these 3 days without having to go through a big refactoring or suffer high maintenance costs later on. This is already a considerable win, but I know we can still improve, and the fast-paced rhythm of the hackathon was a great way to put our processes to the test.

    For us, it pointed out small communication problems, which forced us to do some re-writes, and some problems with WIP (work in progress) and bad design, which gave us some ugly merge conflicts. These difficulties aren’t new to development teams; as a matter of fact, they are very common, but they can be hard to see in the busyness of the day-to-day. So teams adapt and cope with them or, worse yet, start to think of them as myths. The good news is that our industry has been solving these issues for a while now, so we will follow some Lean/Agile advice while also improving our OOP design, and iterate until this machine is finely tuned.

    I’m quite excited to be working as part of a team that experiments and isn’t afraid of exposing its issues. I think this is the only way to learn and improve. If you’d also like to expose any issues in your development process, let me recommend that you run a hackathon and follow your existing development process; you might surprise yourself with what you find.

  • ActiveRecord Except

    August 19th was Whyday, and to commemorate it, I decided to write a gem called activerecord-except.


    activerecord-except is a Ruby gem that extends the functionality of ActiveRecord, allowing you to select all the fields from a table, except the ones that you specify. For example, if you have a table users.

    development_db=# \d users
          Table "public.users"
        Column
     ------------
     id
     username
     password
     email
     first_name
     last_name
     phone
     created_at
     updated_at

    And you want to get all the fields except for the password, you’d have to pass each of them in your select clause, like so:

    User.select(:id, :username, :email, :first_name, :last_name, :phone, :created_at, :updated_at)

    Instead, using activerecord-except, you can simplify your statement by naming only the field you don’t want, in this case the password one:

    User.except(:password)
    Under the hood, the except clause makes use of the traditional select clause. So our previous example will produce the following query

    SELECT "users"."id", "users"."username", "users"."email",
           "users"."first_name", "users"."last_name", "users"."phone",
           "users"."created_at", "users"."updated_at"
      FROM "users"

    This is because the SQL language doesn’t provide such functionality out of the box.

    I don’t know the reason for this. I can only speculate that it’s to be more explicit and not be caught by surprise if a field in a table gets added/deleted/changed. However, * is also widely used. In Rails, for example, it is what you get when you don’t specify a select clause in your query.

    The way I managed to make it work is by adding a method to ActiveRecord::Relation which asks the model for all its attributes and rejects those that match the ones passed as arguments

     model._default_attributes.keys
          .reject { |attr| fields.include?(attr) }

    Note: As you can see, I’m using _default_attributes, which starts with an underscore. By Ruby convention, this means the method is not part of the public API and is not intended to be relied upon.
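Stripped of ActiveRecord, the filtering step itself is plain Ruby. A minimal sketch, using a hypothetical column list in place of what the model would report:

```ruby
# Hypothetical column names standing in for the model's attributes
columns = %w[id username password email]

# Fields the caller wants to exclude, normalized to strings
fields = [:password].map(&:to_s)

# Keep every attribute that wasn't asked to be excluded
selected = columns.reject { |attr| fields.include?(attr) }
selected  # => ["id", "username", "email"]
```

The surviving names are then handed to the regular select clause, which is what produces the explicit column list in the generated SQL.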

    Whether or not you want to use it in production, I leave up to you. Where I really see the benefit of activerecord-except is in writing one-off scripts to extract data, because it makes them much easier to read.


    You can install it from rubygems or you can check the source code either on Sourcehut or GitHub.

  • Vortex Core Mechanical Keyboard Review

    I got myself a new keyboard for my birthday, the Vortex Core. I wanted a mechanical keyboard that I could take everywhere with me. Being a 40% keyboard, I expected it to over-deliver on the portability side; what I didn’t expect is that I’d enjoy using this tiny keyboard so much, even for extended periods of time.


    • Four layers, three of which are programmable without having to flash the device
    • Cherry MX switches. I got mine with silent red ones
    • DSA Profile keycaps
    • RGB LEDs (also programmable)
    • ANSI layout (for the most part)
    • Aluminum case with 4 rubber feet
    • Micro USB connector

    The quality of the printing is great, and I really appreciate that the side prints are color-coded by function key. This is necessary to program the other layers, but even if it weren’t, I wish more keyboard manufacturers would do it.


    Vortex Core compared to Das Keyboard 4 Ultimate

    Vortex Core compared to Macbook Pro 13″

    Vortex Core compared to Magic Trackpad


    I really like how this keyboard looks and feels, but there were two changes I made to make it perfect for me:

    1. Programmed layer 2 so I could access all numbers and symbols plus arrow keys via the Fn key or a combination of Fn+Shift. If you want to know more about how to program the Vortex Core, check out this blog post.
    2. Switched the left Space Bar with a Vim keycap I bought from Vimcaps. The Vim green color fits perfectly with the beige and gray of the other keys.


    I configure my OS to turn Caps Lock into another Ctrl for easy access, but as you can see from the layout, the physical Caps Lock is missing. At first I considered reprogramming another key to be Ctrl, because I find the position of Ctrl very inaccessible. However, I noticed that I can easily press the Ctrl key with my palm, and I ended up liking this better. So much so, that I also adopted this while using my Ergodox EZ.


    The Vortex wasn’t really my first option for a 40%. I had my eyes on a Planck EZ because I’m very pleased with the quality of the Ergodox EZ by the same company. I ended up picking the Vortex because (a) I liked the retro look better, (b) it was about 60 EUR cheaper and (c) I could get it from a local shop here in Belgium.


    Overall, I’m very happy with this keyboard. I enjoy using it for my everyday writing, no matter if it’s code or prose. It’s a great addition to my keyboard fleet and I’ll keep using it on a daily basis. The only thing I’d change is the cable that comes with it: everything in this unit has been built to a very high standard, and an average USB cable doesn’t do it justice. But if you don’t mind this so much, or you don’t mind spending extra on a nice cable, then you won’t be disappointed. So, if you’re in the market for a well-built, good-looking, portable and programmable keyboard, you should consider the Vortex Core.

    Having said that, I wouldn’t recommend the Vortex Core to someone who’s looking to buy their first mechanical keyboard. Instead, go with something a bit bigger first, so you can get an idea of what you do and don’t like about mechanical keyboards before buying something as extreme as a 40%. A good option could be a Das Keyboard 4. It isn’t programmable, but the quality is great and I really like the dedicated media keys.

    For those who have already experienced a mechanical keyboard and are considering the Vortex Core, remember that getting used to a new keyboard layout takes time. The great thing about this keyboard is that it’s programmable, so you can make that process less annoying by changing the default layout to something you feel more comfortable with. I think it’s better to use something that feels natural, so you find yourself coming back to your keyboard over and over again, than something you might think is the ultimate layout. Little by little you can introduce minor modifications that you can adapt to easily. I’ve been re-programming my Ergodox EZ for the past 4 years and I’ll probably continue doing so in the years to come.