Function keys in ThinkPads

I usually work with an external ThinkPad keyboard that matches my laptop's layout. Lately, though, I've been using the laptop's keyboard more and more. At some point, Lenovo redesigned the 6th keyboard row to be slimmer and made the media functions (volume, brightness, etc.) the default, demoting the standard F1-F12 keys. This is very inconvenient if your work involves a lot of typing, as editors tend to offer handy shortcuts on the F1-F12 keys.

This is how you change the default configuration: press Fn + Esc, which toggles FnLock.

On JavaScript modules

Next week I'm going to participate in a session about package managers for different languages – we'll present pip (Python), composer (PHP), and npm (JavaScript). Sharing ideas with my colleagues made me realize how many of the struggles of modern JavaScript tooling and workflow are rooted in the fact that the language didn't have a built-in module system until recently, so I've decided to write down my thoughts as preparation for the talk.

A brief history of modularity

For a more in-depth explanation, check this evolution of modularity in JavaScript – it has many details that are hard to find elsewhere.

JavaScript was created in 1995 without modules, and every <script> shared the same global scope. We learned to use functions and namespacing to gain a minimal ability to split our code into pieces, but as JavaScript grew in importance, so did the pressure for better alternatives.

On the server, Node.js became the de facto standard, and its module choice, CommonJS, became the format used by the incipient node package manager, npm. That was 2010, 15 years after JavaScript's release. On the browser side, nobody could afford, or was interested enough, to push a unified module system, so alternatives bloomed and the market became more fragmented. With time, the use of npm skyrocketed, and so did the importance of CommonJS, even for browsers.

How to make CommonJS available in the browser

It may be worth pausing to remember that at that point we still didn't have a module system for browsers. What we had, though, was simple enough to work with and hack on. The core of the CommonJS API is quite simple and has two pieces:

  • require: a function to import some code that's somewhere else.
  • module.exports: a variable to hold the code to be exported.

Let's say that I have an input.js file in CommonJS format:

var constants = require( './constants' );
console.log( constants.HELLO + ' ' + constants.WORLD );

And the corresponding constants.js contains:

module.exports = {
    HELLO: 'HELLO',
    WORLD: 'WORLD',
};

I can't add those files to the browser through the <script> tag and expect them to work: browsers don't define require or module, so the code throws a ReferenceError as soon as it runs.

How do we make it valid JavaScript? Well, something we can do is to copy the modules to the same file, wrap them in a function (so their internal variables don't collide) and expose the necessary keywords through the function arguments:

// Scope the first module
(function( require, module ) {
    var constants = require( './constants' );
    console.log( constants.HELLO + ' ' + constants.WORLD );
});

// Scope the second module
(function( require, module ) {
    module.exports = {
        HELLO: 'HELLO',
        WORLD: 'WORLD',
    };
});

This can be included in the browsers! It won't fail, but it will also do nothing.

OK, next step: let's implement require. It is a function that takes a module identifier and returns that module's module.exports object. We can do that:

// Implement require
var modules = {};
var require = function( moduleId ) {
    var tmpModule = {};
    modules[ moduleId ]( require, tmpModule );
    return tmpModule.exports;
};

// Scope and register the first module
var input = function( require, module ) {
    var constants = require( './constants' );
    console.log( constants.HELLO + ' ' + constants.WORLD );
}
modules[ './input' ] = input;

// Scope and register the second module
var constants = function( require, module ) {
    module.exports = {
        HELLO: 'HELLO',
        WORLD: 'WORLD',
    };
}
modules[ './constants' ] = constants;

It looks a bit better, but it still does nothing, and we've ended up adding a lot of variables to the global scope.

Let's fix this by scoping the code within an IIFE (so it doesn't pollute the global scope) and by executing the main module, the entry point of our program (./input in our example):

(function() {
  // Implement require
  var modules = {};
  var require = function( moduleId ) {
    var tmpModule = {};
    modules[ moduleId ]( require, tmpModule );
    return tmpModule.exports;
  };

  // Scope and register the first module
  var input = function( require, module ) {
    var constants = require( './constants' );
    console.log( constants.HELLO + ' ' + constants.WORLD );
  };
  modules[ './input' ] = input;

  // Scope and register the second module
  var constants = function( require, module ) {
    module.exports = {
      HELLO: 'HELLO',
      WORLD: 'WORLD',
    };
  };
  modules[ './constants' ] = constants;

  // Execute the main module
  var module = {};
  modules[ './input' ]( require, module );
})();

This is it! We've transformed our initial CommonJS modules into something that runs in today's browsers.

This exercise would need quite a bit of work to be production-ready, but the fundamental steps are there, and it's not difficult to see how tools could automate them. This is mostly what Webpack does when it bundles CommonJS modules into an IIFE. Rollup's output is a bit more elegant, but its strategy is similar. If you're curious, check the runnable code they generate.

The transition to ESModules

The success of npm and CommonJS taught the browser ecosystem a valuable lesson: a better workflow was possible. Some attempts were made to replicate the registry-plus-format formula but, eventually, npm plus CommonJS won.

Flash forward to 2015, and the ES6 standard introduces modules into the JavaScript language. Three years after that, browser adoption and Node support are still not universal or complete, but everybody agrees they will be. The whole ecosystem seems to be on board, and execution environments, tools, libraries, and authors are preparing for what that means.

If we believe that, npm will continue to be the central registry to distribute JavaScript for the foreseeable future, but CommonJS will no longer be the default module format. In this moment of transition, not everything is clear or necessarily better. The ashes of the module wars are still warm, but they won't be forever.

Agile according to Basecamp

Running in Circles is Basecamp’s view of agile product management. They acknowledge the value of working in cycles, but add three pieces: having the time to focus, being able to modify the original plan, and tackling the core unknowns of a feature first.

The first two are enablers that management provides to the makers. The last one is how makers make the most of those powers. Together, they form a process that is nicely captured by the uphill / downhill metaphor: uphill, you are discovering the unknowns and making decisions about what goes in; downhill, everything is clear and you are implementing it at warp factor 10:

Software architecture failing

Software architecture failing: tech writing is biased towards what the big players do, which usually doesn’t fit most other contexts – but then, who ever got fired for choosing IBM, right? Although I feel connected to this rant at an emotional level, I do think it’s necessary to elaborate further and make a positive contribution: to help create and spread that alternate history of software development. How do you do that? Hat tip: Fran.

Touch typing in Dvorak

In November 2016 I had a free month between jobs. Apart from some resting, reading, and general preparation for my new adventure, I still had quite a bit of free time to try new things or build good habits. It was while cleaning my office that I found a keyboard I had bought a couple of years back:

Its layout was a beautiful matrix (which is good for your fingers) and it came with Dvorak by default. So it struck me: how about improving my typing during the coming weeks?

As a programmer, typing is an essential skill for me. I had been doing it for more than 15 years in a learn-by-doing way, and I plan to keep typing for years to come. I thought it would be fun to spend a couple of hours a day training in touch-typing and give Dvorak a second try. And so I did.

The experience

Before I switched, I recorded about 15 typing sessions at TypeRacer using the QWERTY layout; the site logs typing speed (words per minute) and accuracy (% of characters typed correctly). I was at 67 wpm and about 95% accuracy at the time.

Progress was very humbling at the beginning; it felt like learning to walk again, and I swear that, sometimes, I could even hear my brain circuits being reconfigured! After a few weeks, though, I was under 40 wpm and, by the end of the month, I was under 50 wpm. I stopped quantifying myself by then: as I started working, I had a lot of typing to do anyway.

During the first months, the only moments I struggled and felt like perhaps the switch wasn’t a good idea after all were during real-time communication: chats, Slack, etc. I don’t know what people thought of me, but my velocity at the time was typing-bound – I was certainly a very slow touch-typist by my own standards.

But time passed and I improved.

Spáñish Dvorak and symbols

Throughout the process I changed my setup quite a bit. I started my journey using the Programmer Dvorak layout with a TypeMatrix keyboard. After a few months, I switched back to my good old ThinkPad keyboard, because having to use a mouse again after years without one was a pain. A few months later, I switched to international Dvorak, because the Programmer Dvorak layout didn’t quite suit me. Then, I tweaked the common symbols I use for programming so they were better positioned. Besides, although the bulk of my typing is in English, I still need to write decent Spáñish, which basically means using tildes on vowels and the ñ. TL;DR: the Spanish Dvorak variant made things more difficult, so I’ve just tweaked international Dvorak to accommodate tildes and ñ as I see fit.

At this point, I believe I can patent my own layout:

All the changes I made to the symbol positions have affected my ability to build muscle memory for them – sometimes I still need to look at the keyboard to find a specific symbol. However, the current version has been unchanged for months, so I only need a bit more time for them to stick.

The numbers

Given that I was a QWERTY user for 15 years, I thought I would give the new layout a year before comparing any statistics. The fair thing to do would be to compare after 15 years, but I’m a bit impatient for that. I went to TypeRacer again and noted down the results of about 20 races. These are the numbers from this totally unscientific experiment:

A few remarks:

  • In terms of speed, it seems that I’m mostly there. My median speed is now 65 wpm, 2 words per minute less than before. I had a higher peak (83 vs 79 wpm) in one of the current typing sessions, but I was under 60 wpm in more sessions this time.
  • In terms of accuracy, I’ve improved a bit. My median accuracy has increased by 1.5 points, and I had only 2 sessions below 95% accuracy this time.

Coda

Overall, I’m very happy with the switch to Dvorak. My accuracy has improved, meaning that I can maintain a typing rhythm for longer. Not having to correct mistakes makes me a faster typist as well, and by learning to touch-type I have also built up more endurance.

This experiment was very humbling but fun. I believe it increased my brain plasticity by an order of magnitude, and I’m hoping to improve my numbers as years pass as well. However that turns out, though, I think of this as a gift to the elder me, a way to prevent typing pain in the future and promote a healthy use of the tools I heavily depend upon.