Writing a Node Library in Rust


Metlo is an open source API security tool you can set up in under 15 minutes that inventories your endpoints, detects bad actors, and blocks malicious traffic in real time. We do this by shipping an agent that integrates with any tech stack and analyzes and blocks API traffic as it comes in.

The issue is that we needed our agent to support a ton of platforms (like Nginx, Node, Kubernetes, Go, AWS Traffic Mirroring, etc.). So we could either rewrite our agent for each one, or we could be smart about it.

In came Rust, with its ability to create a C-compatible library that can then be consumed via FFI libraries in each target language, or through low-latency communication channels like gRPC or pipes.

Here we'll explore how we did this for Node.js, using a Rust library called Neon, and look at the performance improvement Rust gave us on a few different types of tasks.

What is Neon?

As described by Neon itself,

“Neon is a library and toolchain for embedding Rust in your Node.js apps and libraries.”

So, essentially, Neon enables us to run Rust code along with, and inside our Node.js code.

Let's look at a tiny example here:

Setting up the project

npm init neon hello_world -y

This sets up an npm package with some starter code for a Rust library. Here is the generated Cargo manifest:

Cargo.toml
[package]
name = "hello_world"
version = "0.1.0"
license = "ISC"
edition = "2021"
exclude = ["index.node"]

[lib]
crate-type = ["cdylib"]

[dependencies.neon]
version = "0.10"
default-features = false
features = ["napi-6"]

Nothing too interesting here, but take notice that the 'crate-type' is not the standard 'bin' or 'lib', but rather 'cdylib'. This means Rust will create a C-compatible dynamic library. In addition, there's a dependency already declared for Neon, targeting Node-API version 6, so this module will run fine on Node v10.20.0 onwards. The full compatibility matrix is available here: https://nodejs.org/api/n-api.html#node-api-version-matrix

What about the starter code that got generated?

src/lib.rs
use neon::prelude::*;

fn hello(mut cx: FunctionContext) -> JsResult<JsString> {
    Ok(cx.string("hello node"))
}

#[neon::main]
fn main(mut cx: ModuleContext) -> NeonResult<()> {
    cx.export_function("hello", hello)?;
    Ok(())
}
So we have a function 'main' with the '#[neon::main]' attribute. Note that the function carrying this attribute doesn't actually need to be named 'main'. It takes one parameter, 'cx', of type ModuleContext, which essentially represents the equivalent of a Node module being built. Within that module, we can export functions as we would in a Node project. Here, we do that with the function 'hello'.

The 'hello' function itself takes a mutable parameter 'cx' of type FunctionContext and returns a JsResult wrapping a JsString. In the function body, we return an Ok Result containing a JsString, which coerces to the JsResult return type. Also note that the JsString is constructed through the context 'cx', so the string created here is a valid JS string, not a Rust-style string.
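The starter only sends a value from Rust to JS; it's worth seeing the other direction too. Here's a minimal sketch, assuming the same Neon 0.10 setup, of a function that reads JS arguments inside Rust ('add' is my own illustration, not part of the generated template):

```rust
use neon::prelude::*;

// Read two JS number arguments, add them in Rust, and hand back a JS number.
// cx.argument type-checks each argument and throws a JS TypeError on mismatch.
fn add(mut cx: FunctionContext) -> JsResult<JsNumber> {
    let a = cx.argument::<JsNumber>(0)?.value(&mut cx);
    let b = cx.argument::<JsNumber>(1)?.value(&mut cx);
    Ok(cx.number(a + b))
}

#[neon::main]
fn main(mut cx: ModuleContext) -> NeonResult<()> {
    cx.export_function("add", add)?;
    Ok(())
}
```

After another `npm run build`, `require("./index.node").add(1, 2)` should give back a plain JS number.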

Let’s build this project.

npm run build 

This command generates an artifact by the name of 'index.node'. This contains all of the exported Rust code and is what we'll use within Node to communicate with our Rust library.

Now that we have the artifact built, let’s add an index.js file, which we’ll use to test our code.

const hello = require("./index.node").hello;
console.log(hello());

And as we can see in a Node REPL, it outputs:

~/work/neon-hello-world$ node
Welcome to Node.js v18.12.1.
Type ".help" for more information.
> const hello = require("./index.node").hello
> console.log(hello())
hello node

A more complex example

So that was a rather toy example. How about we try to fetch some data from the AnimeChan API?

That's a solid block of code. But what are we doing here? Let's look at the major components:

  • struct AnimeChanResponseObject
    • The struct representing the response object from the API.
    • We have an implementation on it to export the struct in a JS-accessible format.
  • fn runtime
    • Starts up the Tokio runtime (https://tokio.rs/) and stores it as a static object so it can be reused.
  • async fn fetch_api
    • Fetches the response from the AnimeChan API (as of writing, the official AnimeChan API is down, so we're using a mirror: http://animechan.melosh.space).
  • fn get_anime_quote
    • Utilizes the Tokio runtime to run the async code and sends the response back over a Node.js promise.
    • We also pass two parameters to this:
      • name: A required parameter.
      • page: An optional parameter.
  • fn main
    • Exports the module containing our Rust code and corresponding bindings as Node-accessible code.
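The full listing didn't survive this page's formatting, so here is a rough sketch of what those components can look like with Neon 0.10 plus the tokio, reqwest (with its json feature), once_cell, and serde crates. The function and field names follow the bullets above; treat this as an outline under those assumptions, not the exact original code:

```rust
use neon::prelude::*;
use once_cell::sync::OnceCell;
use tokio::runtime::Runtime;

// The response object from the AnimeChan API.
#[derive(serde::Deserialize)]
struct AnimeChanResponseObject {
    anime: String,
    character: String,
    quote: String,
}

impl AnimeChanResponseObject {
    // Export the struct in a JS-accessible format.
    fn to_object<'a>(&self, cx: &mut impl Context<'a>) -> JsResult<'a, JsObject> {
        let obj = cx.empty_object();
        let anime = cx.string(&self.anime);
        obj.set(cx, "anime", anime)?;
        let character = cx.string(&self.character);
        obj.set(cx, "character", character)?;
        let quote = cx.string(&self.quote);
        obj.set(cx, "quote", quote)?;
        Ok(obj)
    }
}

// Start the Tokio runtime once and reuse it across calls.
fn runtime(cx: &mut FunctionContext) -> NeonResult<&'static Runtime> {
    static RUNTIME: OnceCell<Runtime> = OnceCell::new();
    RUNTIME.get_or_try_init(|| Runtime::new().or_else(|err| cx.throw_error(err.to_string())))
}

// Fetch quotes for an anime title from the mirror.
async fn fetch_api(name: &str, page: u64) -> Result<Vec<AnimeChanResponseObject>, reqwest::Error> {
    let url = format!("http://animechan.melosh.space/quotes/anime?title={name}&page={page}");
    reqwest::get(&url).await?.json().await
}

// Run the async fetch on the runtime and resolve a JS promise with the result.
fn get_anime_quote(mut cx: FunctionContext) -> JsResult<JsPromise> {
    let name = cx.argument::<JsString>(0)?.value(&mut cx); // required
    let page = match cx.argument_opt(1) {
        // optional, defaulting to page 1
        Some(v) => v.downcast_or_throw::<JsNumber, _>(&mut cx)?.value(&mut cx) as u64,
        None => 1,
    };

    let rt = runtime(&mut cx)?;
    let channel = cx.channel();
    let (deferred, promise) = cx.promise();

    rt.spawn(async move {
        let result = fetch_api(&name, page).await;
        // Hop back onto the JS thread to settle the promise.
        deferred.settle_with(&channel, move |mut cx| match result {
            Ok(quotes) => {
                let arr = cx.empty_array();
                for (i, q) in quotes.iter().enumerate() {
                    let obj = q.to_object(&mut cx)?;
                    arr.set(&mut cx, i as u32, obj)?;
                }
                Ok(arr)
            }
            Err(e) => cx.throw_error(e.to_string()),
        });
    });

    Ok(promise)
}

#[neon::main]
fn main(mut cx: ModuleContext) -> NeonResult<()> {
    cx.export_function("getAnimeQuote", get_anime_quote)?;
    Ok(())
}
```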

And the corresponding code in node:

And here are the results:

~/work/neon-complex$ node index.js
Diff (rust) : 977
Diff (axios): 1030 

You can see that it's a bit faster, going from 1030 ms down to 977 ms. For an IO-bound task like this, the safety of Rust might make it worth it, but the speed gain alone definitely isn't! The difference is much more visible on CPU-heavy tasks, where Rust's inherently lean nature and its ability to leverage multiple threads shine even more. Let's try a few more examples.
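To make "leverage multiple threads" concrete, here's a small standard-library-only sketch (my own illustration, not one of the article's benchmarks) that splits a CPU-bound computation across OS threads:

```rust
use std::thread;

// Sum of squares over 0..n, split across `workers` OS threads.
fn parallel_sum_squares(n: u64, workers: u64) -> u64 {
    let chunk = n / workers + 1;
    let handles: Vec<_> = (0..workers)
        .map(|w| {
            let start = w * chunk;
            let end = ((w + 1) * chunk).min(n);
            // Each worker sums its own contiguous slice of the range.
            thread::spawn(move || (start..end).map(|i| i * i).sum::<u64>())
        })
        .collect();
    // Join the workers and combine their partial sums.
    handles.into_iter().map(|h| h.join().unwrap()).sum()
}

fn main() {
    let total = parallel_sum_squares(1_000_000, 8);
    println!("sum of squares below 1e6: {total}");
}
```

Node can do similar things with worker_threads, but spawning and coordinating native threads like this is essentially free of serialization overhead.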


Let's create a similar package as we did above, and add this Rust code to the lib.rs file.

Also, add the regex crate

~/work/neon-regex$ cargo add regex

And use this for the JavaScript benchmark:

The regexes were taken from rust-leipzig/regex-performance, and some code was adapted from mariomka/regex-benchmark.
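The measurement itself is simple: run each pattern over the corpus once and record the elapsed time. A minimal sketch of that harness, using only the standard library (the real benchmark runs the regex crate's matcher over the Twain corpus; here a literal substring count stands in for the workload, and 'time_ms' is my own helper):

```rust
use std::time::Instant;

// Run a workload once and report (result, elapsed milliseconds).
fn time_ms<F: FnMut() -> usize>(mut f: F) -> (usize, f64) {
    let start = Instant::now();
    let result = f();
    (result, start.elapsed().as_secs_f64() * 1000.0)
}

fn main() {
    // Stand-in corpus and workload; the real benchmark calls
    // regex::Regex::find_iter over the Mark Twain text instead.
    let haystack = "tom sawyer and huckleberry finn ".repeat(10_000);
    let (count, ms) = time_ms(|| haystack.matches("finn").count());
    println!("matches: {count}, time: {ms:.3} ms");
}
```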

Pattern                                  Node Time    Rust Time    Improvement
Twain                                     1.098237     0.445926      2.46x
(?i)Twain                                 7.273893     1.823619      3.99x
[a-z]shing                                6.114840     0.875260      6.99x
Huck[a-zA-Z]+|Saw[a-zA-Z]+                5.898340     1.204285      4.90x
\b\w+nn\b                                47.206398    88.812226      0.53x
[a-q][^u-z]{13}x                        178.988168  1214.689600      0.15x
Tom|Sawyer|Huckleberry|Finn              15.087842     0.763358     19.77x
(?i)Tom|Sawyer|Huckleberry|Finn          30.284651     1.856011     16.32x
.{0,2}(Tom|Sawyer|Huckleberry|Finn)      38.097949    18.038199      2.11x
.{2,4}(Tom|Sawyer|Huckleberry|Finn)      40.393531    18.038189      2.24x
Tom.{10,25}river|river.{10,25}Tom         8.696396     1.523119      5.71x
[a-zA-Z]+ing                             75.122859     4.810666     15.62x
\s[a-zA-Z]{0,12}ing\s                    56.035154    20.528713      2.73x
([A-Za-z]awyer|[A-Za-z]inn)\s             9.663153    17.703633      0.55x
["'][^"']{0,30}[?!.]["']                 10.881608     4.635093      2.35x
\u221E|\u2713                             5.799360     0.570572     10.16x
\p{Sm}                                  126.242933    17.661804      7.15x

That's a pretty interesting result card. The improvement ratio ranges from 0.15x to about 19.77x, with an average of about 6.1x, and 14 out of the 17 regexes run faster on Rust than on the Node regex engine.

An interesting thing to note is that this is largely an engine-level improvement, since Node's regex engine is itself a highly performant piece of code, written largely in C++.

Pi digits

Let's do something a bit more implementation-agnostic, and try to calculate the first n digits of Pi.

These examples are adapted from benchmarks-game. The algorithm used is a variant of the sequential spigot algorithm.
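As a reference point for what a spigot looks like, here is a minimal sketch using only the standard library. This is the classic fixed-array Rabinowitz-Wagon variant, which streams digits with machine integers alone; the article's actual code instead uses the rug crate for arbitrary precision:

```rust
// Rabinowitz-Wagon "spigot": produce the first n decimal digits of pi
// using an array of mixed-radix remainders, with no bignum library.
fn pi_digits(n: usize) -> String {
    let len = n * 10 / 3 + 2;
    let mut a = vec![2u64; len]; // mixed-radix representation of pi
    let mut digits = String::new();
    let mut predigit: u64 = 0; // digit held back until the next carry is known
    let mut nines = 0; // buffered 9s awaiting a carry decision
    for _ in 0..n {
        // One "multiply by 10 and renormalize" sweep over the array.
        let mut q: u64 = 0;
        for i in (0..len).rev() {
            let x = 10 * a[i] + q * (i as u64 + 1);
            a[i] = x % (2 * i as u64 + 1);
            q = x / (2 * i as u64 + 1);
        }
        a[0] = q % 10;
        q /= 10;
        if q == 9 {
            nines += 1;
        } else if q == 10 {
            // A carry ripples into the held-back digit; buffered 9s become 0s.
            digits.push_str(&(predigit + 1).to_string());
            for _ in 0..nines {
                digits.push('0');
            }
            predigit = 0;
            nines = 0;
        } else {
            digits.push_str(&predigit.to_string());
            for _ in 0..nines {
                digits.push('9');
            }
            predigit = q;
            nines = 0;
        }
    }
    digits.push_str(&predigit.to_string());
    digits[1..].to_string() // drop the spurious leading 0
}

fn main() {
    println!("{}", pi_digits(30));
}
```

The last digit or two of the output can be off since the final carry is never resolved, which is why real implementations (like the rug-based one) track a few guard digits.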

We will be using the rug crate for working with arbitrary length integers.

~/work/neon-pi$ cargo add rug

Here's the Rust code:

And the wrapper for running the rust code in Node:

const x = require(".")
let start = new Date().getTime()
// ... call the pi function exported by index.node here ...
let end = new Date().getTime()
console.log(`Time(ms): ${end - start}`)

And here's the Node code

How do we do here?

~/work/neon-pi$ node test-rust.js
Time(ms): 381
~/work/neon-pi$ node test-node.js
Time(ms): 6909

That's an 18x improvement! Not bad at all for a tiny bit of extra effort.

More docs on Neon can be found on its site, along with some more examples in Neon's GitHub repo.

How we use the bindings at Metlo

These bindings make it possible to have a core library of battle-tested code while still getting native performance. This enabled us to support multiple frameworks more or less out of the box, with minimal configuration required.


import { initExpress as metlo } from "metlo";
const app = express();
app.use(
    metlo({
        key: <YOUR_METLO_API_KEY>,
        host: "https://app.metlo.com:8081",
    })
);


import { initKoa as metlo } from "metlo";
const app = new Koa();
app.use(
    metlo({
        key: <YOUR_METLO_API_KEY>,
        host: "https://app.metlo.com:8081",
    })
);


import { initFastify as metlo } from "metlo";
const fastify = Fastify();
fastify.register(
    metlo({
        key: <YOUR_METLO_API_KEY>,
        host: "https://app.metlo.com:8081",
    })
);

At Metlo, we were able to support all three of these Node frameworks with less than a hundred lines of code dedicated to each, since our core logic now lives in common Rust code. It's a pretty similar story for other languages, frameworks, and technologies: using this common code we've built agents for Go, Python, and Nginx as well! In addition to the code being shared, our Rust agent is memory-safe, non-blocking, and adds latency on the order of hundreds of microseconds at most.

You can see more about how to set up Metlo for these frameworks, among many others, at docs.metlo.com