Apart from the git content repository, which is currently hosted on sourcehut, the wiki runs entirely on 9front software, on a 9front machine.
This page intends to document how this is done.
Ori’s git9 is used for pulling content updates from the repository.
kvik’s ugh! site generator renders the content, mainly into HTML pages.
By default, ugh! depends on the discount markdown processor.
cinap’s tcp80, slightly modified, serves the web clients.
kvik’s unionfs serves the 9p clients.
The rest is glue provided by standard 9front mechanisms.
; git/clone git://src.a-b.xyz/ugh /usr/wiki
; git/clone https://git.sr.ht/~kvik/wiki.9front.org /usr/wiki/data
; mkdir /usr/web
Checking for updates and triggering page regeneration is done by a custom script running once per minute through cron.
0-59 * * * * local /usr/wiki/bin/update
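The update script itself is not reproduced here; a minimal sketch of what it might do follows. The use of mk to drive ugh!, and the script's exact contents, are assumptions.

```
#!/bin/rc
# Hypothetical sketch of /usr/wiki/bin/update; the real script may differ.
cd /usr/wiki/data
git/pull		# fetch content updates from the sourcehut repository
cd /usr/wiki
mk			# re-render changed pages (assumed ugh! invocation)
```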
9p clients are served by unionfs, spawned by system network listener.
#!/bin/rc
# Runs as 'none'.
exec /bin/unionfs -i /usr/wiki/public
Web clients are served by tcp80, spawned by system network listener.
#!/bin/rc
exec /bin/tcp80 -n /usr/wiki/cfg/webns
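On 9front, listen(1) spawns such scripts from a service directory, where the filename encodes the protocol and port. Assuming the standard layout and ports (564 for 9p, 80 for HTTP), the two listener scripts above would be installed as something like:

```
; ls /rc/bin/service
...
/rc/bin/service/tcp564		# 9p listener (runs unionfs)
/rc/bin/service/tcp80		# web listener (runs tcp80)
...
```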
The namespace file /usr/wiki/cfg/webns simply binds the public wiki directory to the web root:
bind /usr/wiki/public /usr/web
Not a priority, considering the currently small amount of content.
The main benefit of organizing content into categories seems to be the ability to generate hierarchical sitemaps.
A probable implementation would allow arbitrary hierarchy in the source page directory, with each directory denoting a (sub)category.
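Under such a scheme, the source tree might look like the following; all names here are hypothetical:

```
/usr/wiki/data/
	sys/			# category: system administration
	sys/net/		# subcategory: networking
	sys/net/dp9ik.md
	fqa/			# category: FQA
	fqa/install.md
```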