About this

Apart from the git content repository, which is currently hosted on sourcehut, the wiki system runs entirely on 9front software, on a 9front machine.

This page documents how that is done.

The software

Ori’s git9 is used for pulling content updates from the repository.

kvik’s ugh! site generator renders the content, mainly into HTML pages.

By default, ugh! depends on the discount markdown processor.

cinap’s tcp80, slightly modified, serves the web clients.

kvik’s unionfs serves the 9p clients.

The setup

The rest is glue provided through standard 9front mechanisms.

Starting off

   ; git/clone git:// /usr/wiki
   ; git/clone git:// /usr/wiki/data
   ; mkdir /usr/web


Checking for updates and regenerating pages is done by a custom script, run once per minute by cron.

   0-59 * * * * local /usr/wiki/bin/update
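The update script itself is not reproduced on this page. A minimal sketch in rc, under the assumption that updates are detected by comparing HEAD before and after a pull, and with the generator invocation (here simply ugh) also assumed:

   #!/bin/rc
   # Sketch only; the real /usr/wiki/bin/update is not shown here.
   cd /usr/wiki/data
   old=`{git/query HEAD}	# commit hash before pulling
   git/pull
   new=`{git/query HEAD}	# commit hash after pulling
   # Regenerate pages only when new commits arrived;
   # the generator invocation is an assumption.
   if(! ~ $old $new)
   	ugh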

Serving 9p

9p clients are served by unionfs, spawned by the system network listener.


   # Runs as 'none'.
   exec /bin/unionfs -i /usr/wiki/public
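A 9p client elsewhere on the network could then mount the wiki with something like the following, where the dial string is a placeholder rather than the wiki's actual address:

   ; srv tcp!example.org!564 wiki /n/wiki
   ; ls /n/wiki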

Serving HTTP

Web clients are served by tcp80, spawned by the system network listener.


   exec /bin/tcp80 -n /usr/wiki/cfg/webns
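On 9front the system listener, listen(8), runs service scripts out of /rc/bin/service, named after the protocol and port. The line above would therefore live in a file along these lines (exact contents assumed):

   ; cat /rc/bin/service/tcp80
   #!/bin/rc
   exec /bin/tcp80 -n /usr/wiki/cfg/webns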

The namespace file /usr/wiki/cfg/webns simply binds the public/ directory over the web root:

   bind /usr/wiki/public /usr/web

Categories

Not a priority, considering the currently small amount of content.

The benefit of organizing content in categories seems mostly to be the ability to generate hierarchical sitemaps.

A probable implementation would allow arbitrary hierarchy in the source page directory, with each directory denoting a (sub)category.
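Under that scheme the source tree might look something like this (names purely illustrative), with sys as a category and sys/install as a sub-category:

   pages/
       index.md
       sys/
           index.md
           install/
               raspi.md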