Web Development With NodeTools

The primary aim of NodeTools is to help package makers with back-end packages. NodeTools 2 will build on the groundwork of the first version and facilitate rapid web development via more advanced tools, such as:

  • Splendid, a static-website compiler for hybrid HTML + JSX pages that are written as static HTML, and only include dynamic components where they need to be. There's also great support for Bootstrap:

    bootstrap autocompletion support

  • Idio, a full-scale web server with all essential middleware, built with a compiler, that has only 2 dependencies and provides front-end middleware for serving JSX and node_modules.

We plan to release Irio, the combination of the two above, to become the world's tidiest, most intuitive and professional web framework. But while that work is in progress, let me give a few hints on how to develop client-side JS code and compile it with Closure, based on some packages that I've made.

section break

Front-End Middleware

Traditionally, you would use complex build tools like webpack to package your code and run a development server. Since all current browsers are mature enough to support modern JavaScript, including modules, destructuring, etc., I don't see the need to transpile your code during development. You can serve it as it is, with only 2 exceptions:

  • When importing from node_modules, the imports need to be renamed from import { test } from 'example' to import { test } from '/node_modules/example/src/index.mjs'.
  • JSX is not understood by the browser, therefore it needs to be transformed before being served.

Those are not hard requirements. When a browser requests a file, instead of serving the file from the filesystem, we can modify it by implementing a server middleware that adapts the file to meet the two requirements above. This middleware is available via the @idio/frontend package, and is able to serve node_modules and transpile JSX with RegExp. There are some limitations to JSX until NodeTools 2 is properly released, e.g.,

  • There are no source maps for debugging.
  • There's no hot module reload.
  • JSX does not support block comments, so the following would fail:
    render(<div>
      {/* hello world */}
    </div>)
  • No curly braces are allowed in components' attributes, e.g., render(<div title="hello{world}"/>).
  • No > sign inside components is permitted.

Fair enough, those limitations might deter you from using our method. On the other hand, you can literally get going with only 3 new folders in your node_modules (the Idio web server), so if the simplicity of use outweighs these drawbacks, you can give it a go. The complexity of the project doesn't matter; it's primarily the hot-module reload that might be the missing link, but we'll make it possible ASAP.

After your JS code is ready, you can compile it into a single file with Closure Compiler. If you used JSX, we'll transpile it with ÀLaMode and put the intermediary code in a temp folder. The compiler will also add the necessary polyfills for the target language (default is ES5) and transpile newer constructs such as async/await and { destructuring } so that the code can run in older browsers.
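For instance, here's a conceptual sketch of that transformation (not the compiler's actual output, which relies on $jscomp runtime helpers; getData is a made-up function):

// ES2017 source:
async function load() {
  const { user } = await getData()
  return user
}

// Roughly equivalent ES5 shape after compilation: destructuring
// becomes plain property access, await becomes promise chaining.
function load() {
  return getData().then(function (data) {
    var user = data.user
    return user
  })
}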

section break

Setting Up

There's a more complex mnpjs/idio MNP config that serves as a template for a web server. However, it's not the purpose of this article to show how to create a full-stack application; I'll leave that for another time. I just wanted to show how to develop and compile a front-end.

As I mentioned, you'll need the @idio/idio server (itself a Koa fork, but completely rewritten for Closure). Let's install it with yarn add @idio/idio (or npm i @idio/idio). We'll then want to create a basic configuration to start it:
import idio from '@idio/idio'
import render from '@depack/render'
import api from './routes/api'

const PREACT = 'https://cdnjs.cloudflare.com/ajax/libs/' +
               'preact/8.5.3/preact.umd.js'

;(async () => {
  const PROD = process.env.CLOSURE
  const { app, url, router } = await idio({
    static: {
      root: 'docs',
      use: PROD,
    },
    frontend: {
      directory: 'frontend',
      use: !PROD,
    },
  })
  router.get('/', (ctx) => {
    const script = PROD ?
      <script src="index.js"/> :
      <script type="module" src="frontend/"/>
    ctx.body = render(<div>
      <h1>Hello World</h1>
      <div id="app" />
      <script src={PREACT}/>
      {script}
    </div>, {
      addDoctype: true,
    })
  })
  router.get('/api', api)
  app.use(router.routes())
  console.log(url)
})()

This is a JSX file, meaning that we can render JSX components statically, using the @depack/render package. It is a fork of preact-render-to-string with a few improvements, like the ability to add the doctype automatically. Idio is configured via its first argument, called the MiddlewareConfig. It accepts settings for all supported middleware. We only want 2 middleware here: frontend for development, and static for testing the production bundle of our JS code.

The idio method returns the app, its URL and a router instance, which we use to set up the index path and an api path. The index path will simply display a welcome message and mount our component. Because we're using Preact, we'll want to make sure the library is available on the window scope, by including it from a CDN. We could compile Preact into our bundle itself; however, that would produce a lot of warnings, as the annotations in its source code haven't been adapted to Closure. For development, we'll serve JavaScript as a module from the frontend/index.jsx file, but after we've built the bundle, we'll want to test that it works from the docs folder.

Our API route from routes/api is used for dynamic requests from the front-end application. We'll come back to it in a second.

section break

JSX Application

The application that we create is very simple: it's just going to fetch some data from the server and display it to the user. Since JSX is the most convenient way to do web development, we'll use it for our app. Data bindings and the ability to write HTML within JS give such a massive advantage over plain JS that it's pretty much essential to use it. Luckily, we don't need Babel or anything else to transpile JSX anymore, since ÀLaMode can do it with a simple transform based on regexes. Sure, it might fail occasionally for edge cases, but I haven't had any issues with it.

import { render, Component } from 'preact'
import fetch from 'unfetch'
import './style.css'

class App extends Component {
  constructor() {
    super()
    this.state = {
      data: null,
      /** @type {_nodetools.User} */
      user: null,
    }
  }
  render() {
    const { error, loading, user } = this.state
    if (loading) return (<div>Loading...</div>)
    if (error) return (<div className="Error">Error: {error}</div>)
    return (<div>
      Hello From Preact Component
      {user && ', '}
      {user && <span className="UserName">{user.name}</span>}
    </div>)
  }
  componentDidMount() {
    this.fetch()
  }
  async fetch() {
    this.setState({ loading: true })
    try {
      const res = await fetch('/api')
      const { error, user } =
        /** @type {!_nodetools.ServerResponse} */ (await res.json())
      if (error) return this.setState({ error })
      this.setState({ user })
    } catch(err) {
      let error = err
      if (err instanceof Event) {
        error = { message: 'Network error' }
      }
      this.setState({ error })
    } finally {
      this.setState({ loading: false })
    }
  }
}

render(<App />, window['app'])

/**
 * @suppress {nonStandardJsDocs}
 * @typedef {import('../types').ServerResponse} _nodetools.ServerResponse
 */
/**
 * @suppress {nonStandardJsDocs}
 * @typedef {import('../types').User} _nodetools.User
 */

We'll render the app on the window.app element (it's available on window since we gave the div the app id; we access it as window['app'] with bracket notation so that the compiler doesn't rename the property). I prefer to use the unfetch package, as the fetch API is not supported by older browsers and Closure doesn't provide a polyfill for it.

As soon as the main component mounts, it will attempt to fetch data from the server. It can then be in 3 states:
  • loading, when a request has been sent but the response has not been received yet.
  • error, if there was a server error, or an error during the request. We need to check if the caught error is an Event, because in the fetch library it can be one.
  • data, when the data has been received.

In our render method, we therefore extract the current state and display the appropriate information. Because we used the setState method, the application will be re-rendered automatically whenever the state changes.

We also used an external type, called _nodetools.ServerResponse. We used a cast, since the assignment was made via destructuring, so we couldn't just annotate the variable above the statement.
const { error, user } =
  /** @type {!_nodetools.ServerResponse} */ (await res.json())

This is where NodeTools with its typework comes in handy again. We need to prepare types for a) our experience in VSCode, including auto-completions, and b) Closure externs, so that the compiler doesn't mangle the properties of the received object. The types are defined in the types/index.xml file.

<types ns="_nodetools">
  <record name="ServerResponse">
    <prop type="!_nodetools.User" name="user" opt>
      The user object.
    </prop>
    <prop string name="error">
      The server-side error.
    </prop>
  </record>
  <record name="User">
    <prop string name="name">
      The name of the user.
    </prop>
    <prop string name="login">
      The unique login of the user.
    </prop>
  </record>
</types>

We define a generic ServerResponse type that contains either an error, or some data, according to the route. Doing it this way allows us to implement a fetch function in a single place, and then access the required property, such as user, from within any component.
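For illustration, such a shared function could look like the following sketch (fetchJSON and its location are hypothetical, not part of any package discussed here):

// frontend/lib.js (hypothetical) — a single, typed fetch helper.
import fetch from 'unfetch'

/**
 * Requests a path on the server and returns the parsed response.
 * @param {string} path The path to fetch, e.g., '/api'.
 * @returns {!Promise<!_nodetools.ServerResponse>}
 */
export default async function fetchJSON(path) {
  const res = await fetch(path)
  return /** @type {!_nodetools.ServerResponse} */ (await res.json())
}

/**
 * @suppress {nonStandardJsDocs}
 * @typedef {import('../types').ServerResponse} _nodetools.ServerResponse
 */

A component would then destructure only the property it needs, e.g., const { user, error } = await fetchJSON('/api').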

CSS Loader

Importing styles is also supported via the frontend middleware. It will serve the CSS file as a JavaScript function that adds the style to the DOM dynamically. The import itself is not transpiled at all, but the server will assign the application/javascript MIME type when serving style.css. When Depack comes to generate temp code for the compiler, it will also create this .css.js file and pass its path to Closure.

section break

Universal Types

The types that we're using are applicable both server-side and client-side, therefore we can reuse them by importing them in both places. In code that's going to be compiled with Closure, though, we need to add the @suppress annotation, otherwise the compiler will print a warning. Although the compiler won't discover those types from the source code, we'll pass them to it via externs. For the back-end, we use a simple import. Below is our API route.

export default function API (ctx) {
  /** @type {_nodetools.User} */
  const user = {
    login: 'test',
    name: 'user',
  }
  // if error: ctx.body = { error }
  ctx.body = { user }
}

/**
 * @typedef {import('../types').User} _nodetools.User
 */

The fact that we import types provides us with the desired experience: we can invoke auto-completions on the created object:

name auto-completion

For the front-end, it is the same, as we annotate the state.user object with the user type. This not only prevents the compiler from mangling the destructured property names of the received object, but also allows us to access the hints.

user type hint in front-end
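To see why the externs matter, here's a hypothetical illustration of what advanced compilation can do without them (the minified names are made up):

async function getUser(res) {
  // Source: reading properties of JSON parsed at runtime.
  const { user } = /** @type {!_nodetools.ServerResponse} */ (await res.json())
  return user.name
}

// Possible advanced-mode output WITHOUT the externs: the property
// names get minified, but the server's JSON still uses `user` and
// `name`, so both lookups would yield undefined at runtime.
async function a(b) {
  const { c } = await b.json()
  return c.d
}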

TypeScript is said to be JavaScript that scales, great for large-scale app development, exactly because it provides a convenient way to work with types on the back-end and front-end. But I don't believe that using a new language and a proprietary typings infrastructure is the solution. NodeTools is a stack for pure JS development, and we've achieved the same result with JSDoc annotations. Keeping types in a separate place, then compiling them and importing them where needed, works just as well and allows us to stay independent.

The types for VSCode are generated into the types/index.js file:
export {}

/* typal types/index.xml namespace */
/**
 * @typedef {_nodetools.ServerResponse} ServerResponse `@record`
 * @typedef {Object} _nodetools.ServerResponse `@record`
 * @prop {!_nodetools.User} [user] The user object.
 * @prop {string} error The server-side error.
 * @typedef {_nodetools.User} User `@record`
 * @typedef {Object} _nodetools.User `@record`
 * @prop {string} name The name of the user.
 * @prop {string} login The unique login of the user.
 */

We added export {} at the top to make it possible to import types from other files, which is otherwise impossible. The namespace argument in the typal marker is needed to preserve namespaces, which is a convention in NodeTools, since we need to match the externs' namespace. You can omit the namespace altogether for simple applications; it's just how I do things. The externs, in turn, are generated into the types/externs.js file:

/**
 * @fileoverview
 * @externs
 */

/* typal types/index.xml externs */
/** @const */
var _nodetools = {}
/**
 * @record
 */
_nodetools.ServerResponse
/**
 * The user object.
 * @type {(!_nodetools.User)|undefined}
 */
_nodetools.ServerResponse.prototype.user
/**
 * The server-side error.
 * @type {string}
 */
_nodetools.ServerResponse.prototype.error
/**
 * @record
 */
_nodetools.User
/**
 * The name of the user.
 * @type {string}
 */
_nodetools.User.prototype.name
/**
 * The unique login of the user.
 * @type {string}
 */
_nodetools.User.prototype.login

Any time we update the types/index.xml file, we'll run the yarn d (or npm run d) command to recompile the types and externs. By the time NodeTools 2 is released, we'll have a watch mode as well, but for now we need to run the scripts manually:

{
  "name": "nodetools-web",
  "scripts": {
    "start": "alanode index.jsx",
    "closure": "CLOSURE=1 yarn start",
    "web": "depack frontend -o docs -a -H -E -i --externs types/externs.js",
    "d": "yarn-s d1 externs",
    "d1": "typal types/index.js -u",
    "externs": "typal types/externs.js -e"
  },
  "license": "AGPL-3.0",
  "devDependencies": {
    "@externs/preact": "^1.2.9",
    "depack": "^1.1.1",
    "preact": "8.5.3",
    "unfetch": "^4.1.0",
    "yarn-s": "^1.2.3"
  }
}

Idio's frontend middleware and server-side rendering only work with Preact 8 at the moment, without fragments support, since that requires a small change to the parser. I've not had a reason to upgrade to Preact 10, but when there's time, I will. The server is then started with the yarn start (or npm run start) command.

section break

Accessing Server

We can now access our app from the browser:

preview of the app

The component rendered correctly, and displayed data received from the server. If we check network requests, we can see how the frontend middleware has transformed our index.jsx file:

devtools shows transpilation

The imports have been updated to point to the /node_modules/ path, rather than the simple package-name notation. Moreover, the JSX syntax has been updated to invoke the h pragma function from Preact. Preact is the default choice for the frontend, but if you wanted to use React, you'd have to specify the pragma property in the frontend config. There are no source maps, but line numbers have been preserved, so you can still step through in the debugger.
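As a rough illustration, based on the patterns shown elsewhere in this article (the middleware's exact output may differ), the served frontend/index.jsx looks something like this:

import { h } from '/node_modules/preact/src/preact.js'
import { render, Component } from '/node_modules/preact/src/preact.js'
import fetch from '/node_modules/unfetch/src/index.mjs'
import './style.css'

class App extends Component {
  // ... body as before, with each JSX expression rewritten
  // into a pragma call, e.g.:
  // if (loading) return (h('div',{},`Loading...`))
}

render(h(App,{}), window['app'])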

The style has been updated into a JS file that calls __$styleInject when imported:
function __$styleInject(css = '') {
  const head = document.head
  const style = document.createElement('style')
  style.type = 'text/css'
  if (style.styleSheet){
    style.styleSheet.cssText = css
  } else {
    style.appendChild(document.createTextNode(css))
  }
  head.appendChild(style)
}
const style = `.Error {
  color: red;
}
.UserName {
  color: darkcyan;
}`
__$styleInject(style)

section break

Compilation

After making sure that the app works correctly, we can compile it with Closure. Depack supports JSX apps, and will transpile them into the depack-temp folder prior to compilation.

I'll call the yarn web (or npm run web) script from package.json: depack frontend -o docs -a -H -E -i --externs types/externs.js. The arguments mean:
  • frontend, the path to the entry file, short for frontend/index.jsx;
  • -o docs, the output path (either full file, or just dir, in which case the basename of the input will be used);
  • -a, advanced compilation, one of the best features of the compiler that minifies the source.
  • -H, tells Depack to add Preact's pragma.
  • -E, instructs Depack to use preact from externs, treating it as a library that is already given in the scope. During the process, Depack will monkey-patch node_modules/preact and restore it at the end. This is needed since some 3rd party packages may reference preact and we don't transpile those.
  • -i, produces an IIFE function, to prevent polluting and conflicts with the global scope.
  • --externs types/externs.js, the generated externs that notify the compiler of the type information.

The bundle and its source maps will be written to the docs folder.

compilation process

If there are any problems with types that Closure can pick up, they will be shown in the output. We saved our temp files in depack-temp, so it won't be possible to jump to the error, since the temp folder is deleted by the end of the process (we'll update Depack to point to the real src in the next version). For example, the warning we get here is that the second argument to fetch is compulsory, which is not true: it's a problem with how unfetch was annotated, rather than with our code.

To test whether the bundling was successful, we'll start the server with the yarn closure (or npm run closure) command, which sets the CLOSURE=1 environment variable. It is read by the server, so that our app is served from docs/index.js, rather than from frontend/index.jsx, which was used only for development.

When we access the server again, we are served the compiled JS bundle, which also includes our CSS as a string.

compiled js is served

That's all, folks! We've just created a simple Preact application using the front-end middleware from Idio and compiled it with Closure Compiler. We also typed both the front-end and the back-end with JSDoc, so that we could use pure JavaScript. In addition, the types that we produced could be embedded into the documentation of our package using Documentary from NodeTools; however, I'll skip talking about that here.

section break

Additional Points

There are a few things worth mentioning at the end.

External JSX

Depack doesn't transpile JSX in published packages, therefore package authors need to make sure their published code is transpiled into standard JavaScript with h pragma calls. This can be done with ÀLaMode. For example, the @depack/form package is a library for creating Bootstrap forms that was written in JSX, e.g.,

import { Component } from 'preact'

export default class Form extends Component {
  // ... component code
  render(props) {
    const { children, formRef, onSubmit, onChange, ...prop } =
      /** @type {!_depackForm.FormProps} */ (props)
    return <form ref={formRef} onSubmit={onSubmit} {...prop}>
      {children}
    </form>
  }
}

It is then transpiled into the build folder; however, imports stay as imports and don't become require calls. If we were publishing back-end components, we'd also transpile the imports, but since Closure can understand them, we keep them.

import { h } from 'preact'
import { Component } from 'preact'

export default class Form extends Component {
  // ... component code
  render(props) {
    const { children, formRef, onSubmit, onChange, ...prop } =
      /** @type {!_depackForm.FormProps} */ (props)
    return    h('form',{...prop,'ref':formRef, 'onSubmit':onSubmit},
      children,
    )
  }
}

However, because we want Closure to read the transpiled code, our module field should really point to the build dir (module takes precedence over main in Depack):

{
  "main": "src/index.jsx",
  "module": "build/index.js"
}

And main will point to the source code, which is not really necessary, but I think frontend at the moment still reads the main field and will be able to transpile the source code (I now realise this is redundant, and main should probably become require-transpiled code for server-side rendering 👍).

Another question is how to test front-ends with NodeTools. As I said, I'm only working on the next version right now, but there are a few strategies for testing: server-side rendering of components, and using headless Chrome to render components in the browser, then taking a snapshot of the DOM and sending it back to the server for comparison against stored snapshots.

SSR Testing

For basic testing of components written in Preact, you can take advantage of the JSXContext that can transform an input string into a JSX VNode, which can then be rendered. Using this method, you can declare your components using simple HTML in masks, and compare the rendered result to the expected output.

// test/mask/default.js

import makeTestSuite from '@zoroaster/mask'
import JSXContext from '@depack/context'
import Form, * as Form2 from '../../src'

export default makeTestSuite('test/result/default', {
  /**
   * @param {JSXContext} context
   */
  getResults({ getVNode, render }) {
    const vnode = getVNode(this.input, {
      Form, ...Form2,
    })
    const res = render(vnode, {
      pretty: true,
    })
    return `(${res})`
  }
})

The mask result in JSX format will then use a preamble to import components for syntax highlighting and auto-completions, and define inputs as JSX expressions and outputs as HTML code. I've surrounded each block with () so that there are no syntax errors underlined in red in the file. Mask results are used only to define inputs/outputs, so despite the fact that the format of this file is JSX, it's only so to enable syntax highlighting.

// test/result/default/form-group.jsx

import { TextArea, Select, Form, FormGroup, Input } from '../../src'

// default FormGroup
(<FormGroup label="test" help="ok">
  <Input placeholder="test"/>
</FormGroup>)

/* expected */
(<div class="form-group">
  <label for="i70984">test</label>
  <input placeholder="test"
    class="form-control" type="text"
    aria-describedby="hi70984" id="i70984" />
  <small id="hi70984"
    class="form-text text-muted">ok
  </small>
</div>)
/**/

// details FormGroup
(<FormGroup label="test" help="ok" details>
  <Input placeholder="test"/>
</FormGroup>)

/* expected */
(<details class="form-group">
  <summary>
    <label for="i70984">test</label>
  </summary>
  <input placeholder="test"
    class="form-control" type="text"
    aria-describedby="hi70984" id="i70984" />
  <small id="hi70984"
    class="form-text text-muted">ok
  </small>
</details>)
/**/

Headless Chrome

The second strategy is to spawn a headless browser as a background process and set up a server that returns some HTML with script tags that invoke JavaScript to render certain components on elements. Each test will open a new page with a slightly different setup routine, wait for the JS to execute, and send the element's outer HTML back to the server. We'll use a RemoteChrome context to connect to the browser:

import CDP from 'chrome-remote-interface'
import { c, b } from 'erte'

export default class RemoteChrome {
  async _init() {
    const client = await CDP({
      host: '127.0.0.1',
      port: '9222',
    })
    const { Network, Page, Runtime } = client
    await Network.enable()
    await Page.enable()
    this.client = client
    this._Page = Page
    this._Runtime = Runtime
    this.Network = Network
    console.log('[%s]: %s', c('RemoteChrome', 'red'), b('Page enabled', 'green'))
  }
  static get _timeout() {
    return 10000
  }
  /**
   * The enabled page (TODO: write types for this).
   */
  get Page() {
    return this._Page
  }
  /**
   * The runtime.
   */
  get Runtime() {
    return this._Runtime
  }
  async _destroy() {
    if (this.client) {
      await this.client.close()
    }
  }
}

We'll also need the Idio context that will start a server to return HTML code.

import idio from '@idio/idio'
import frontend from '@idio/frontend'
import render from '@depack/render'
import jsx from '@a-la/jsx'

export default class IdioContext {
  /**
   * The context will start a server and serve the code given
   * @param {import('@idio/idio').MiddlewareConfig} config
   */
  async start(config = {}, port = null) {
    const { input, pre = '', ...conf } = config
    const inv = jsx(input, {
      quoteProps: 'dom',
    })
    const pree = jsx(pre, {
      quoteProps: 'dom',
    })
    const { app, url } = await idio({
      frontend: {
        async middlewareConstructor() {
          const f = await frontend({
            directory: ['src', 'test/context/idio', 'build'],
          })
          return f
        },
        use: true,
      },
      async serveJSX(ctx) {
        ctx.body = render(<html>
          <head>
            <meta charset="utf-8"/>
            <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no"/>

            <link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css" integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T" crossOrigin="anonymous"/>
            <title>Hello, world!</title>
          </head>
          <script type="module" src="test/context/idio/math-random.js" />
          <script type="module" src="test/context/idio/format.js" />
          <script type="module" dangerouslySetInnerHTML={{
            __html: `import { h, render, Component } from '/node_modules/preact/src/preact.js'
/**/ import Form, { FormGroup, Input, TextArea, Select } from '/${process.env.ALAMODE_ENV=='test-build' ? 'build' : 'src'}/'
${pree}
const inv = ${inv}
render(inv, document.body)
`,
          }}>
          </script>
        </html>, { addDoctype: 1, pretty: 1 })
      },
      ...conf,
    }, { port })
    this.app = app
    this.url = url
    return this.url
  }
  async _destroy() {
    if (this.app)
      await this.app.destroy()
  }
}

This context includes some utilities on the page (for example, for formatting), and calls Preact's render method on the invocation passed from the tests. The mask setup is then the following:

import makeTestSuite from '@zoroaster/mask'
import { equal } from '@zoroaster/assert'
import IdioContext from '../context/Idio'
import RemoteChrome from '../context/RemoteChrome'

// Assumed helper (its definition isn't shown in the original):
// throw if Runtime.evaluate reported an exception in the page.
const handleError = ({ exceptionDetails }) => {
  if (exceptionDetails) throw new Error(exceptionDetails.text || 'evaluation failed')
}

export const Chrome = makeTestSuite('test/result/chrome.jsx', {
  /**
   * @param {RemoteChrome} chrome
   * @param {IdioContext} idio
   */
  async getResults({ Page, Runtime, client }, { start }) {
    const { action, pre } = this
    const url = await start({
      pre,
      input: this.input,
    })

    await Page.navigate({ url })
    await Page.loadEventFired()
    let actionValue
    if (action) {
      const s = await Runtime.evaluate({ expression: action,
        awaitPromise: true })
      handleError(s)
      if (s.result) actionValue = s.result.value
    }
    const res = await Runtime.evaluate({ expression: 'window.idio.format(document.querySelector(\'html body\'), 0).innerHTML.trim()' })
    // IF A TEST IS TIMING OUT, DEBUG MANUALLY:
    // console.log('', url)
    // await new Promise((r) => setTimeout(r, 1000000))
    const { result: { value } } = res
    const v = value.replace(/(<input[\s\S]*?)>/g, (m, i) => {
      return `${i} />`
    })
    return { actionValue, actual: `(${v})` }
  },
  mapActual({ actual }) {
    return actual
  },
  assertResults({ actionValue }, { actionValue: av }) {
    if (av) equal(`'${actionValue}'`, av)
  },
  context: IdioContext,
  persistentContext: [RemoteChrome],
})

The action and pre are properties of the mask result, which will be shown next. We start a server by passing these properties, and make the browser navigate to the prepared page, waiting until it's fully loaded. We then evaluate some JS from action, if it was passed, and grab the body element from the DOM, while also formatting it using the format utility. We then return the result of the action (if it was given) and the HTML. The HTML is compared against the expected output using simple string comparison via the mask result, while the action value is asserted on using the standard equal method from the assertion library in the mask itself.

Example mask result:
/* eslint-env browser */
import { Component } from 'preact'
import { TextArea, Select, Form, FormGroup, Input } from '../../src'

// Form
(<Form></Form>)

/* expected */
(<form></form>)
/**/

// Reset
(<Form><App /></Form>)

/* pre */
class App extends Component {
  constructor() {
    super()
    this.state = { value: 'hello-world' }
  }
  render() {
    const { value } = this.state
    return (<Input required type="text" value={value} onClick={() => {
      this.setState({ value: 'new value' })
    }}/>)
  }
}
async function clickInput() {
  const input = document.querySelector('input')
  input.click()
  await new Promise(r => setTimeout(r, 1))
  return input.value
}
window.clickInput = clickInput
/**/

/* action */
clickInput()
/**/
/* actionValue */
'new value'
/**/

/* expected */
(<form>
  <input required="" class="form-control" type="text" />
</form>)
/**/

The first test is a pretty simple rendering of an empty form element. The second test creates a component and makes the clickInput method available on the window scope. When the page is loaded, via the action we click the input, which changes the state of the App component, triggering an update of the input value because of the internal logic. Because this process is async, we had to wait 1ms in the clickInput method, which then grabs the new value of the input and returns it for assertion in the mask. This way, we've tested interaction with the component from a real browser. The rendered code is also compared to the expected output.

There are a few more examples that you can use for reference:
  • The Exif package, which reads metadata of files from a headless browser (the testing API is available via @contexts/chrome now). It will also start the server, prepare test pages from the mask, navigate to them, execute the window.test method and return the result.
  • The exif2css package, which puts an image on a page and uses its EXIF metadata to rotate it into the correct position with CSS. A screenshot of the page is then taken using the same @contexts/chrome context and compared against the gold standard from fixtures using the node-resemble-js package, which highlights the differences. The threshold for passing tests is 99%, so that if the library got broken, the similarity measure would fall below that.

    image difference testing

I hope these examples can provide starting points for implementing your own testing routines. NodeTools 2 will definitely standardise browser testing.
