How'd we get here?

In the 2019 not-so-official State of JavaScript report, I heard the undertones of anguish and grumbling toward the JavaScript ecosystem. But what's new, right? People have been complaining about JavaScript since its three-week incubation and subsequent birth. For good reason.

Example: adding random types together:

> "" + {}
"[object Object]"
> {} + ""
0

(The second one isn't even an addition: at the start of a statement, {} parses as an empty block, leaving the unary expression +"", which is 0.)

But…somehow, through all the pain and chaos, we have come together as a community and have made this language somewhat bearable to work with.

In 2012, TypeScript was born and all of a sudden you could add Types!.to!.Javascript! as ForSortaRealNow.

In 2015 there were whispers of something called webpack and React, and all you had to do was provide a small sacrificial offering, followed by installing the universe of npm modules, to use them. Only then would you be on your merry way.

In 2016, 2017, 2018, and 2019, if you updated a package and something broke, there was usually at least one GitHub issue that would say something nonchalant like "Breaking change, no take backs", and then go on to explain why you would be spending the next three days upgrading all your dependencies and refactoring your whole application. By the way, if this was you, I personally want to say: "You're a hero."

Jokes aside, 2015 to the present has really been a journey of intense growing pains, but I personally believe we are on the brink of a new era of Javasc–cough TypeScript. Yarn V2 just came out, and I am seriously more pumped about the ecosystem than ever. I would put this change on the same level of importance as the move to ECMAScript 6. Yep, I said it. Let me explain why.

Yarn V2


Workspaces were a concept in V1 of Yarn; think of a workspace like a module. You .NET devs can think of one like a project within a solution. They are a good fit for a monorepo, but in V1 a lot was left up to the developer to get all their dev tools (TypeScript, Webpack, VS Code, etc.) on the same page about what was going on with those workspaces. It took me a week or two just to get file resolution sorted out between all of them, and I know for a fact that setup was custom…and not quite right. 🙂

After you have gone through the migration to Yarn berry, you can set up a more realistic and safe monorepo without worrying (too much) about how those workspaces will get resolved. Workspace resolution is treated as a first-class citizen now. It even has its own protocol.

Assume the following directory structure:

 - packages
     - web
         - src
             - data
             - components
     - api
     - shared
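For a layout like this, the root package.json would typically declare the workspaces with a glob. This is a minimal sketch; the repo name is illustrative:

```json
{
  "name": "repo",
  "private": true,
  "workspaces": ["packages/*"]
}
```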

To reference another workspace from the web workspace, use the workspace protocol. Example:

"@repo/shared": "workspace:packages/shared"

Example use:

import SomeSharedThing from "@repo/shared/SomeSharedThing"
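In context, that workspace entry sits in the dependencies of the consuming workspace's package.json (the @repo/* names here are the article's illustrative ones):

```json
{
  "name": "@repo/web",
  "dependencies": {
    "@repo/shared": "workspace:packages/shared"
  }
}
```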

Okay, but why was this a problem in V1 and why does it work better now?

Plug’n’Play, Plugins, and File/Module Resolution

The Problem: Resolvers.

Here is an excerpt from the yarn docs linked above that explains it well:

“Over the years that led to Plug’n’Play being designed and adopted as the main install strategy, various projects came up with their own implementation of the Node Resolution Algorithm - usually to circumvent shortcomings of the require.resolve API. Such projects can be Webpack (enhanced-resolve), Babel (resolve), Jest (jest-resolve), Metro (metro-resolver), …”

This means that if you wanted to combine and use different technologies that use different file/module resolving strategies, you would likely be dealing with configurations to make sure files are being resolved correctly.

But no more. No more tweaking aliases in configs. No more help from random third party resolvers, etc. No more.

The Solution: Plug'n'Play (pnp.js)

Yarn V2 solves the file resolution issue through the use of the pnp.js file and their plugin strategy.

  • They introduced the concept of the pnp.js file (if you have ever used webpack stats, it is similar to that). Basically it's a file that declares explicitly what your dependencies/workspaces depend on, and where those packages live, all the way down the dependency tree. At runtime (think webpacking) it can then intercept calls to modules and provide a virtual link to where Yarn is keeping those modules now, which by default is inside zip archives in its cache folder. For developer tools like ESLint, TypeScript, and VS Code, they wrote a pnpify module that basically fakes the existence of the node_modules folder for the same purpose. This is temporary until those tools support PnP out of the box.
  • Yarn V2 is built primarily on a plugin architecture. Even commands are plugins (e.g. yarn add). Along with commands, they have also built plugins that wrap how files get resolved by the different mainstream resolvers.

Together, the pnp.js file and the plugins let Yarn see exactly what each package depends on (via the dependency tree) and then tell Node exactly where to find that module/package.
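To make the idea concrete, here is a toy sketch (not Yarn's actual implementation) of the core trick behind pnp.js: a static table mapping each package to the exact locations of its declared dependencies, so resolution becomes a lookup instead of a node_modules crawl.

```javascript
// Toy resolution table: package name -> (dependency name -> exact location).
// The cache path below is illustrative of where Yarn keeps zipped packages.
const packageRegistry = new Map([
  ['my-app', new Map([
    ['lodash', '.yarn/cache/lodash-npm-4.17.21.zip/node_modules/lodash/'],
  ])],
  ['lodash', new Map()], // lodash declares no runtime dependencies here
]);

function resolveRequest(issuer, request) {
  const deps = packageRegistry.get(issuer);
  if (!deps || !deps.has(request)) {
    // Unlike node_modules resolution, there is no silent fallback upward:
    // an undeclared dependency fails loudly.
    throw new Error(`${issuer} does not declare a dependency on ${request}`);
  }
  return deps.get(request);
}

console.log(resolveRequest('my-app', 'lodash'));
// A path inside the zipped yarn cache, per the table above
```

Note the side effect: a package can only resolve what it actually declares, which is how PnP surfaces phantom dependencies that node_modules hoisting used to hide.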

Zero Installs

Yarn berry has some great documentation around this topic. Zero installs is not a feature specific to Yarn berry, but it is far more practical because of it. Since node_modules is no longer a thing, and the Yarn cache is instead maintained as compressed .zip files, it is much more feasible to commit those files to your repository. Zero installs is configured purely through .gitignore; you can choose to do so or not. Read more about it here.
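As a sketch, a zero-install .gitignore keeps the cache and releases in the repository while ignoring the machine-specific parts of the .yarn directory (check the Yarn docs for the current recommended list):

```
.yarn/*
!.yarn/cache
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/sdks
!.yarn/versions
```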

I have decided to jump aboard, and have not regretted it one bit. It improves workflow all over the place, specifically around switching branches during the PR process and the build process itself.


Constraints

Yarn 2 has the option to provide constraints. Constraints are rules about your dependencies. Yarn has defined a few constraints of its own, but you can also create your own.

The Yarn maintainers chose a programming language called Prolog to express these constraints. Prolog is a logic programming language built around facts and rules. Think of it as a database of facts that we can query to determine whether something is true or not.

From what I understand, upon running yarn constraints, Yarn berry generates a database of facts from the project's workspaces and dependency types (dependencies, devDependencies, etc.), and in plain English those facts sound something like

“fact: the workspace someWorkspaceName depends on Lodash version 4.4.2 in devDependencies” -yarn docs.

Then it gobbles that list of facts, the predefined rules Yarn has provided, and the source file the end user provides for custom constraints. After that it loops over the query results returned by tau-prolog (a Prolog implementation written in JavaScript) and generates an array of enforced dependencies.
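As a sketch of what a custom constraint can look like, here is a rule matching the fact quoted above. The gen_enforced_dependency and workspace_has_dependency predicate names come from the Yarn constraints docs; the version is the article's example:

```prolog
% Any workspace that depends on lodash must use version 4.4.2,
% under whichever dependency type it currently appears.
gen_enforced_dependency(WorkspaceCwd, 'lodash', '4.4.2', DependencyType) :-
  workspace_has_dependency(WorkspaceCwd, 'lodash', _, DependencyType).
```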

Now constraints like Uncle Bob's dependency rule are much more possible and, more importantly, enforceable.

This rule says that: "Source code dependencies can only point inwards. Nothing in an inner circle can know anything at all about something in an outer circle. In particular, the name of something declared in an outer circle must not be mentioned by the code in an inner circle. That includes functions, classes, variables, or any other named software entity." - Uncle Bob
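Under that rule, you could, for example, forbid an inner workspace from ever depending on an outer one. Per the Yarn constraints docs, a null range means the dependency must not be present; the @repo/* workspace names are the article's illustrative ones:

```prolog
% The inner 'shared' workspace must never depend on the outer 'web' workspace,
% under any dependency type it currently appears in.
gen_enforced_dependency(WorkspaceCwd, '@repo/web', null, DependencyType) :-
  workspace_ident(WorkspaceCwd, '@repo/shared'),
  workspace_has_dependency(WorkspaceCwd, '@repo/web', _, DependencyType).
```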

Constraints are still newborns, but I plan on using them.

To wrap up: I have a lot of hope for Yarn V2. The lead maintainer, @arcanis, is someone I have come to respect, and someone who not only sees the issues of the JS ecosystem but has solved those problems. IMO, Yarn V2 is the future of the JS/TS ecosystem. At least until Deno takes over.