hello.

This last year has been an exciting time for Node. Mainstream adoption exploded, NPM blew past 100,000 modules, io.js launched to revamp Node’s development principles, and JavaScript continued on its path to world domination.

2014 was also a great year for The Node Way, which started 2014 as a GitHub repo and a few dinky blog posts labeled: “The Node.js Handbook.” By the end of the year, it had shed these humble beginnings and adopted a whole new look, a new website, and even a new name.

I want 2015 to be the most exciting year yet, for both Node.js & The Node Way. To kick it off I’ll be contributing new content to The Node Way weekly, starting with a brand new post today.

New Post: Designing Factories

“Previously, we introduced the Custom Type Module Pattern and how it can be used to create custom objects in your application. These custom types play a major role in JavaScript development, but creating them directly can quickly become a messy business. In this post I’ll introduce a new kind of module that makes working with custom types much easier: The Factory.”

Read it here and let me know what you think!

What Else?

I’ve got a ton more planned for the year, but I won’t spoil it all at once. Stay tuned for more updates, and keep an eye out for more Node.js best practices at thenodeway.io.

- FKS

Node.js has always been easy to grasp, but difficult to master. Design patterns and best practices exist for avoiding common pitfalls and building more stable applications, but these guiding principles have never been well documented. It has always been up to you, the individual developer, to uncover them on your own through painful trial and error.

The Node Way is a new project to discover and document the entire Node.js philosophy. It contains a collection of posts on understanding the platform fundamentals and leveraging that understanding to build great things.

This talk — presented at the awesome NodeConf Oakland 2014 — introduces the project, explains the motivation behind it, and gives an all-too-brief overview of some key philosophy takeaways. If you want to learn more, head over to thenodeway.io and learn the answer to the question, “What is The Node Way?”.

Slides

Video

Boston will always have a special place in my heart. I went to school there, and in a lot of ways grew up there (in that way Millennials say they’re still “growing up” into their late 20s).

So when I heard that Box organizes a Boston recruiting trip every fall, I was desperate to get involved. The only problem? The trip requires interviewing experience, and I had none.

So I started interviewing… a lot. I interviewed college students. I interviewed senior candidates. I interviewed everyone, and slowly began to recognize what made a good interview vs. a bad one. I even began to recognize all the things that I had been doing wrong as an interviewee. What the hell?!? I was actually learning something, when all I originally wanted was a free flight to Boston.

This is a talk about technical interviewing, and everything I know about doing it well. The talk is geared towards college students, but almost all of it will be relevant to anyone interested in the process. It includes the expected tips and tricks, but with a main focus on what happens before and after the interview… and how it matters much, much more than you think.

Slides

Video

Unfortunately, I wasn’t able to get a video of the talk (I really wish I had, because the Q&A afterwards was awesome). But, if you are a college student or professor and want me to come give this talk to your class, reach out. Seriously, I’d love to visit your school and give it. No catch.

Update Dec. 18, 2014: An updated version to this article has been posted to The Node Way, a new series on Node.js design patterns and best practices needed to build beautiful applications. Check it out at thenodeway.io.

There are a million different ways to design a JavaScript module. Standard patterns like the singleton and custom type are widely adopted, and provide a dependable feature-set. Some other patterns, however, push the limits of what a module can (and should) actually be. The first group is often encouraged, while the second is denounced without further thought. This post will attempt to explore that second group.

Before jumping in, I want to explicitly point out that almost all of the concepts explained below should be avoided in production. These patterns have the potential to cause nightmares for you or your team down the road, with hidden bugs and unexpected side-effects. But they exist for a reason, and when used properly (read: very carefully) they can solve real problems that other, safer patterns can’t. But just… you know… with those terrible, dangerous side effects.

Monkey Patches

JavaScript is a dynamic language, which – when paired with its prototype-based nature – gives the developer free rein to modify objects and classes across entire applications. So when you one day find yourself building a pig-latin generator and wishing that JavaScript strings handled this conversion themselves, you can do something like this:

String.prototype.pigLatin = function() { /* ... */ };
'Is this actually a good idea?'.pigLatin()    // 'Is-ay is-thay actually-ay an ood-gay idea-ay?'

Modifying already-existing methods can be a little trickier. You can simply overwrite them, but if you want to leverage the original function you’ll need to save it first. Using a more practical example than the one above, you may want to attach data to every template that gets rendered in an Express application:

// Save the original render function to use later
res._render = res.render;
// Wrap the render function to process args before rendering
res.render = function(view, options, callback) {
  options.global = { /* ... */ };
  this._render(view, options, callback);
};

This practice is called monkey patching, and it is generally considered to be a terrible idea. Monkey patches pollute your application’s shared environment. They can collide with other patches, and be impossible to debug even when working properly. The pattern is a powerful hack, but luckily its adoption and use is generally limited.

But desperate times can call for desperate measures, and sometimes a monkey patch is necessary. If the situation allows it, building your patch as a separate module will help keep the hack quarantined and decoupled from the rest of your application. Organizing your monkey patches in one place can also make it easier to find when/if debugging is needed.

The first thing you’ll want to do is make as many assertions about the environment as possible. Assert that the method you’re adding/modifying hasn’t been added/modified yet. Check that its version is correct. Check that everything exists exactly as you expect. Check all of this first, and throw an error if any of it doesn’t look right. While this might sound over-the-top now, it could save you days of debugging later if you fail to catch some subtle collision.

You should also consider exporting your monkey patch as a singleton, with a single apply() method that executes the code. Applying the patch explicitly (instead of as a side effect of loading it) will make your module’s purpose clearer. It will also allow you to pass arguments to your monkey patch, which might be helpful or even necessary depending on your use case.

// some-monkey-patch/index.js
module.exports = {
  apply: function() {
    /* check environment/arguments & apply patch */
  }
};

// later...
require('some-monkey-patch').apply();

Polyfills

Polyfills are most commonly found on the client-side, where different browsers have different levels of feature support. Instead of forcing your application down to support the lowest common denominator (looking at you, IE) you can use a polyfill to add new features to old browsers and standardize across platforms.

As a server-side developer, you might think that you’re safe from this problem. But with Node’s long v0.12 development cycle, even Node.js developers will find new features that aren’t fully available to them yet. For example, async-listeners were added in v0.11.9, but you’ll have to wait until v0.12.0 before you’ll see them in a stable build.

Or… you could consider using an async-listener polyfill.

// load polyfill if native support is unavailable
if (!process.addAsyncListener) require('async-listener');

The polyfill is still a monkey patch at heart, but it can be much safer to apply in practice. Instead of modifying anything and everything, polyfills are limited to implementing an already-defined feature. The presence of a spec makes polyfills easier to accept, but all the same warnings and guidelines for monkey patching still apply. Understand the code you’re adding, watch out for collisions (specs can always change), and make sure you assert as much as possible about your environment before applying the patch.
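To show the guard pattern in a hand-rolled form, here is a simplified polyfill sketch for `String.prototype.includes` (an ES2015 method). The implementation below deliberately ignores some spec edge cases and is only meant to illustrate the check-before-patching shape:

```javascript
// Only patch when the feature is genuinely missing, never unconditionally
if (typeof String.prototype.includes !== 'function') {
  String.prototype.includes = function(search, start) {
    // Simplified spec behavior: delegate to indexOf
    return this.indexOf(search, start || 0) !== -1;
  };
}

console.log('monkey patch'.includes('key')); // true
```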

JSON Modules

JSON is the data format of choice for Node.js, and native JSON support makes it easy to load and then interact with static data files as if they were actually JavaScript modules. The original http-status-codes-json module, for example, was entirely represented by a static JSON file. And because of Node’s JSON support, the module became an interactive dictionary of HTTP status codes.

// http_status_codes.json
{
  "100": "Continue",
  "200": "OK",
  /* ... */
}

// later...
var httpStatusCodes = require('http-status-codes-json');
console.log(httpStatusCodes[res.statusCode]); // 'Not Found'

This feature can be powerful, but don’t refactor your code just yet. Modules are loaded synchronously, which means nothing else can run while the file is loaded and parsed. And once parsed, the result is saved and persisted in your module cache for the rest of your application’s lifetime. Unless you intend to actually interact with the object as a module, stick to fs.readFile() and/or JSON.parse(), and save yourself the performance hit and added complexity.

Compile-to-JS Modules

Node supports JSON right out of the box, but require() will throw an error if you try loading anything else. However, if you roll up your sleeves and start poking around, you’ll find that Node can be made to support any number of file types, as long as you provide the parsers.

Here’s how it works: Node holds a collection of “file extensions” internally, which are responsible for loading, parsing, and exporting a valid representation of a given file. The native JSON extension, for example, reads the file via fs.readFileSync(), parses the results via JSON.parse(), and then attaches the final object to module.exports. While these parsers are private to Node’s Module type, they are exposed to developers via the require() function.

CoffeeScript is probably the most popular compile-to-js language, but to properly use it with Node you’ll need to compile it down to JavaScript after every change. Using the technique described above, fans could instead build CoffeeScript support right into Node.js, handling this extra step automatically:

// require-coffee/index.js
var fs = require('fs');
var coffeescript = require('coffee-script');

module.exports = {
  apply: function() {
    // Load your new CoffeeScript extension into Node.js
    require.extensions['.coffee'] = function coffeescriptLoader(module, filename) {
      // Read the contents from the '.coffee' file
      var fileContent = fs.readFileSync(filename, 'utf8');
      // Compile it into JavaScript so that V8 can understand it
      var jsContent = coffeescript.compile(fileContent);
      // Pass the contents to be compiled like a normal JavaScript module
      module._compile(jsContent, filename);
    };
  }
};

// Later...
require('require-coffee').apply();

Note: This feature was deprecated once everyone realized that processing your code into JS and JSON before run-time is almost always the better way to go. Parsing directly during runtime can make bugs harder to find, since you can’t see the actual JS/JSON that gets generated.

MP3… Modules?

CoffeeScript was built with JavaScript in mind, so requiring a CoffeeScript module makes a lot of sense. But since Node.js leaves the file representation up to the developer, you can really require any file type you want. In this last section, let’s see how this would work with something completely different, like an MP3.

It would be too easy to just load and return the file contents as an MP3 module, so let’s go one step further. In addition to getting the MP3 file contents, the file extension should also generate song metadata (such as title and artist) via the audio-metadata module.

var fs = require('fs');
var audioMetaData = require('audio-metadata');

// A custom type to represent the mp3 file and its metadata
function MP3(fileContent) {
  // Attach the file contents
  this.content = fileContent;
  // Process and attach the audio id3 tags
  this.metadata = audioMetaData.id3v2(fileContent);
}

// Attach your new MP3 extension
require.extensions['.mp3'] = function mp3Loader(module, filename) {
  // Read the contents from the '.mp3' file
  var fileContent = fs.readFileSync(filename);
  // Export a new MP3 instance to represent the module
  module.exports = new MP3(fileContent);
};

// Later...
var song = require('/media/i-believe-in-a-thing-called-love.mp3');
console.log(song.metadata.artist + ': ' + song.metadata.title); // 'The Darkness: I Believe in a Thing Called Love'

Depending on the use case, this extension could be built to add even more functionality like streaming, playing, and otherwise interacting with the song, all automatically supported at load time.

Conclusion

This post isn’t meant to endorse or approve of any of the above patterns, but it isn’t a blanket denouncement either. What makes these modules dangerous is the same thing that makes them so powerful: they don’t follow the normal rules. Polyfills can update your feature set without actually updating the framework, while File Extensions change the idea of what a Node.js module can actually be. Understanding how any of this is possible will help you make smarter decisions when it comes to module design, and allow you to spot potential problems before they happen.

And one day, when you find yourself in a jam, one of these patterns might just help you out.

Module.js is one of the most important modules in Node.js core, yet its existence remains almost completely undocumented.

In this talk, I expand on my previous blog post by walking through the module and explaining how loading, compiling, and caching all work. If you’re interested in exploring Node.js core but don’t know where to start — or if you’re just curious about its inner workings — then this is the talk for you.

Slides

Video