I know it’s a pain to test your front end code, but if you learn to set this up once, you’ll be able to replicate it for your future projects with a single click. In this tutorial, we’ll test our ES6 code with Mocha, measure coverage with Istanbul, and run it all through Karma.

If you’d rather read code than commentary, head straight to the GitHub repo I’ve set up to demonstrate this test environment.

For the rest of us, here’s what we’ll be using.

Update (23/6/16): The tutorial and repo have been updated with new devDependencies thanks to @monte-hayward.

What We Need to Install

  • Mocha: A test framework. It runs our unit tests and reports back when they fail.
  • Chai: An assertion library. I use it mainly for the syntactic sugar it provides for my tests (it makes my TDD code more expressive and readable).
  • Sinon: Enables test spies, stubs and mocks so you can simulate events and the behaviour of third parties in your tests.
  • Karma: A test runner. It runs our client-side JavaScript tests in a real browser from the command line. Useful if you want to check the DOM, for example.
  • Istanbul: Code coverage report generator. Tracks the percentage of your code covered by your Mocha tests.
  • Codecov.io: Integrates your Istanbul code coverage into your workflow. Upload your reports to the cloud, visually include code coverage reports in GitHub pull requests, and award yourself a spiffy badge, among other things.
  • Browserify: Lets us use ‘require’ in the browser like you can in Node. Together with:
  • Babelify: Uses the Babel transpiler to turn our ES6/ES2015 code into ES5-compatible code. Write ES6 in your front end and your tests without fear.
  • Optional: yarn (for package installing and precise repository versioning)
  • Optional: standard (for linting)

Whew, that seems like a daunting list of requirements.

But hold on. There’s more!

We can’t just npm install these and expect them to work together. For maximum package co-operation, you’re going to need specific plugins for Karma. I’ve compiled the full list in my package.json. Recommended: use `yarn` to install, so you get the exact same working versions I do.
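
If you’d rather not copy my package.json wholesale, the devDependencies boil down to roughly this one-liner (a sketch — the repo has the exact, pinned versions):

yarn add --dev mocha chai sinon karma \
  karma-mocha karma-sinon-chai karma-browserify karma-mocha-reporter karma-coverage \
  karma-phantomjs-launcher karma-chrome-launcher karma-firefox-launcher karma-safari-launcher phantomjs-prebuilt \
  browserify watchify babelify babel-preset-es2015 browserify-istanbul codecov.io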

With this command, I presume you want the tests executed in PhantomJS, Chrome, Firefox and Safari. It’s easy to swap in other browsers like IE or Edge instead.

Now, place your code in a src folder.

Place your tests in a test folder. Suggestion: I name my test files codefilename.spec.js as a naming convention.
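
So the project layout ends up looking something like this (the file names are just examples):

.
├── package.json
├── src
│   └── throttle.js
└── test
    └── throttle.spec.js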

Set up Karma

Generate Karma’s Configuration File

./node_modules/.bin/karma init karma.conf.js
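
karma init will walk you through a few questions (testing framework, browsers, file locations and so on). Don’t agonise over the answers — we’re about to override the generated settings with the configuration below.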

karma.conf.js Configuration

These enable Mocha, Sinon and Chai for our tests in Karma. We also process our code with browserify before running our tests.

It also points Karma to our front end code in the src folder and our tests in the test folder. Change these paths if you’ve stored your files elsewhere.

frameworks: ['mocha', 'sinon-chai', 'browserify'],
files: ['src/**/*.js', 'test/**/*.js'],
preprocessors: {
  'src/**/*.js': ['browserify'],
  'test/**/*.spec.js': ['browserify']
},
reporters: ['progress', 'mocha']

We’ve used globbing in our file paths. * is a wildcard character. That means…

‘*.js’ matches any file ending with .js
‘src/**’ matches any file inside the src folder, at any depth. Example: any file inside ‘src/home’ or ‘src/data/components’ or ‘src/this/file/is/buried/d/e/e/p’

Browserify Configuration within karma.conf.js

This sets up our browserify preprocessor to transpile both our front end code and our tests with Babel. The es2015 preset turns our ES6/ES2015 code into ES5.

browserify: {
  debug: true,
  transform: [
    ['babelify', { presets: ['es2015'] }]
  ]
}

Choose Karma browsers

Presumption:

 browsers: ['Chrome', 'Firefox', 'Safari', 'PhantomJS']

PhantomJS is a headless browser. You need it so Karma can run inside the CI of your choice (we’re using TravisCI later), because TravisCI only provides PhantomJS in its virtual machine.
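
Putting those pieces together, a bare-bones karma.conf.js looks roughly like this (a sketch — the file karma init generates will contain extra options like port and autoWatch, which you can leave alone):

module.exports = function (config) {
  config.set({
    frameworks: ['mocha', 'sinon-chai', 'browserify'],
    files: ['src/**/*.js', 'test/**/*.js'],
    preprocessors: {
      'src/**/*.js': ['browserify'],
      'test/**/*.spec.js': ['browserify']
    },
    browserify: {
      debug: true,
      transform: [
        ['babelify', { presets: ['es2015'] }]
      ]
    },
    reporters: ['progress', 'mocha'],
    browsers: ['Chrome', 'Firefox', 'Safari', 'PhantomJS']
  })
}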

Package.json Configuration

"scripts": {
  "test": "./node_modules/karma/bin/karma start karma.conf.js"
}

Mocha Tests with helpers Chai and Sinon

Before either helper, import Mocha’s hooks and test functions.

In *.spec.js

import { before, after, describe, it } from 'mocha'

Chai Expect, Assert, or Should

In *.spec.js

import { before, after, describe, it } from 'mocha'
import chai, { expect, assert } from 'chai'

const should = chai.should() // should() must be invoked to attach the should-style assertions

Sample test:

describe('karma test with Chai', function() {
  it('should expose the Chai assert method', function() {
    assert.ok('everything', 'everything is ok');
  })

  it('should expose the Chai expect method', function() {
    expect('foo').to.not.equal('bar');
  })

  it('should expose the Chai should property', function() {
    should.exist(123);
    (1).should.not.equal(2);
  })

  it('should work with ES6 fat arrow function', () => {
    (1).should.not.equal(2);
  })
})

Sinon

Sinon provides all sorts of useful help, like stubbing the return value of a function or faking XMLHttpRequests. Let’s borrow the Fake Time example from sinonjs.org. This is the function we will test.

function throttle (callback) {
  var timer
  return function () {
    // every call resets the timer, so callback only fires
    // 100ms after the most recent call
    clearTimeout(timer)
    var args = [].slice.call(arguments)
    timer = setTimeout(function () {
      callback.apply(this, args)
    }, 100)
  }
}

This is the sample test from sinonJS that uses sinon spy.

In *.spec.js

import sinon from 'sinon'
import { before, after, describe, it } from 'mocha'
import { expect } from 'chai'
describe('Testing the throttle function', () => {
  var clock

  before(() => { clock = sinon.useFakeTimers() })
  after(() => { clock.restore() })

  it('calls callback after 100ms', () => {
    const callback = sinon.spy()

    throttle(callback)()

    clock.tick(99)
    expect(callback.notCalled).to.be.true

    clock.tick(1)
    expect(callback.calledOnce).to.be.true

    expect(new Date().getTime()).to.equal(100)
  })
})
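
And for the stubbing I mentioned earlier, here’s a minimal sketch (api and fetchUser are hypothetical names, not something from the repo):

import sinon from 'sinon'
import { describe, it } from 'mocha'
import { expect } from 'chai'

describe('stubbing a return value', () => {
  it('returns whatever we tell it to', () => {
    // a hypothetical object whose method we want to control in this test
    const api = { fetchUser: function () { return { name: 'real user' } } }

    const stub = sinon.stub(api, 'fetchUser').returns({ name: 'stubbed user' })

    expect(api.fetchUser().name).to.equal('stubbed user')
    expect(stub.calledOnce).to.be.true

    stub.restore() // put the original method back
  })
})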

 

Run our tests with Karma

Finally!

npm test

Your test results should be in your terminal. Errors? Now is the time to debug them. Don’t move on till you’ve resolved your code issues.

Generate Code Coverage Reports with Istanbul

Set up Istanbul for Karma

Istanbul and Browserify Compatibility

Istanbul needs to run its checks only after Browserify has transpiled our code. Add browserify-istanbul after our babelify transform like this in karma.conf.js:

browserify: {
  debug: true,
  transform: [
    ['babelify', { presets: ['es2015'] }],
    ['browserify-istanbul', {
      instrumenterConfig: { embedSource: true }
    }]
  ]
}

Yo: instrumenterConfig helps fix a bug with the sourcemaps as a result of the transpilation (see Github issue).

Specify formats of Report

coverageReporter: {
  reporters: [
    { type: 'text' },
    { type: 'html', dir: 'coverage' },
    { type: 'lcov' }
  ]
}

lcov is the format Codecov.io reads.
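
The text report prints straight to your terminal, the html report lands in the coverage folder (you’ll find an index.html inside it to open in a browser), and the lcov.info written alongside it is what we’ll ship to Codecov in a moment.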

Add Istanbul to Karma Reporters

reporters: ['progress', 'mocha', 'coverage']

Coverage is provided by karma-coverage, which uses Istanbul.

Test for Istanbul Reports

Time for npm test again!

Istanbul’s code coverage report should show after your Mocha tests run. Debug your issues now before proceeding!

Deploy to TravisCI and Codecov.io

Codecov Setup

In package.json, under "scripts":

"codecov": "cat coverage/*/lcov.info | ./node_modules/codecov.io/bin/codecov.io.js"

Proceed to codecov.io and link your GitHub repo for this project.

Now it’s time to deploy to TravisCI.

TravisCI Setup

It’s easier to use only PhantomJS in your TravisCI setup.

Using process.env.TRAVIS, it’s possible to detect TravisCI and change the browsers used.

var customBrowsers = ['Chrome', 'Safari', 'Firefox', 'PhantomJS']
if (process.env.TRAVIS) {
  customBrowsers = ['PhantomJS']
}
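
Then point the browsers option inside config.set at that variable:

config.set({
  // ...the rest of the configuration from before
  browsers: customBrowsers
})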

Create a .travis.yml file with these contents:

language: node_js
node_js:
- node # for the latest Node.js version
after_script:
- npm run codecov
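
For a node_js project, TravisCI runs npm install and npm test by default, so the Karma run happens automatically; the after_script step then pushes the coverage report to Codecov.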

The end is in sight! Proceed to TravisCI and enable your GitHub repo.

Watch it run its tests.

Hope for the best.

If all goes well, your codecov.io dashboard should show your Istanbul results.

Congratulations, you have now unlocked a spectacular code coverage badge as an achievement for setting up the World’s Most Impressive Test Environment!

That is all.

Mocha + Chai + Sinon + Browserify + Babel(ify) + Istanbul + Karma + Codecov + TravisCI + Standard + Yarn = ❤
