
Setting up a server farm

The idea is to set up a tiered authentication system. Key ideas:

  • A DokuWiki server farm is responsible for all wiki and plugin upkeep, eliminating duplicate work during wiki maintenance (see the layout sketch after this list).
  • Users have to authenticate against two separate systems, for defence in depth:
    • Dokuwiki authentication for ACL rules
    • Webserver authentication for secondary user identity authentication, decoupled from Dokuwiki
      • This is extremely important, since a sizeable share of web vulnerabilities stems from misconfigured ACLs
    • In theory, these could be merged by integrating with plugin:authsplit
  • Wiki access is transparent to the user
    • Avoids exposing obvious endpoints for attacks
    • This will likely involve having the webserver rewrite farm endpoints based on some authentication mechanism.
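
For reference, a minimal farm layout might look as follows (paths are illustrative assumptions; see the dokuwiki.org farm documentation for the authoritative setup). The farmer install carries the engine and all plugins and is maintained once; each animal holds only its own data and configuration:

/srv/dokuwiki/          # farmer: DokuWiki engine + plugins, maintained once
/srv/farm/internal/     # animal: internal wiki (its own conf/ and data/)
/srv/farm/dmz/          # animal: public-facing wiki (its own conf/ and data/)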

In the meantime, we can run two separate wikis, with less sensitive data pushed from the internal wiki. The changelog can be regenerated retroactively by running the following, per the DokuWiki recommendation (line format notes follow the script):

# Copy page sources (skipping "course" namespaces) and page metadata (skipping
# "photos") into the DMZ copy, preserving attributes and directory structure:
find data/pages -type d -not -path "*course*" | xargs -I {} bash -c "cp -d --preserve=all --parents {}/* /srv/dokuwiki"
find data/meta -type d -not -path "*photos*" | xargs -I {} bash -c "cp -d --preserve=all --parents {}/* /srv/dokuwiki"
...
# Rebuild the aggregate changelogs from the per-page and per-media .changes files
# (excluding the aggregates themselves so they are not folded into the output):
cd /srv/dokuwiki && find data/meta -name '*.changes' ! -name '_dokuwiki.changes' ! -name '_media.changes' -exec cat {} + | sort -u > data/meta/_dokuwiki.changes
cd /srv/dokuwiki && find data/media_meta -name '*.changes' -exec cat {} + | sort -u > data/meta/_media.changes

# Optional cleanup: keep only changelog entries from the last 31 days (2678400 s)
cd /srv/dokuwiki/data/meta && awk -F '\t' -v ts=$(date +%s) '$1 > ts-2678400' _dokuwiki.changes > ...
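
Each line of a .changes file is tab-separated, with the Unix timestamp of the revision in the first field; that is the field the awk filter above compares against. Schematically:

timestamp <TAB> IP <TAB> change type <TAB> page ID <TAB> user <TAB> summary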

# Consider cleaning the destination directories first to purge directories deleted
# at the source. To avoid conflicts with open file handles at the point of purging,
# either stop the webserver, or swap the trees atomically with the renameat2 syscall.

Link to C wrapper for the Linux syscall: renameat2 gist
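
As a sketch of the atomic swap (assuming the gist compiles to a renameat2 binary, and that it exposes the RENAME_EXCHANGE flag as -e; both the binary name and the flag are assumptions, not the gist's documented interface):

# Stage the new tree next to the live one, then exchange the two atomically.
rsync -a dokuwiki/ /srv/dokuwiki.new/
renameat2 -e /srv/dokuwiki.new /srv/dokuwiki   # hypothetical wrapper invoking RENAME_EXCHANGE
rm -rf /srv/dokuwiki.new                       # now holds the previous live tree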

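The relevant parts of the wiki data directory: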
data/
+-- pages/
|   +-- namespace/path/to/file.txt
+-- attic/
|   +-- namespace/path/to/file.txt.gz
+-- media/
|   +-- namespace/path/to/file.png
+-- media_attic/
    +-- namespace/path/to/file.png

Locking down the wiki

Copy the wiki onto the DMZ host with sensitive files removed, either using the sample rsync command below or plain scp as above:

# Note the trailing slash (rather than /*): --delete then applies at the top level
# and dotfiles are included in the transfer.
rsync -av --chown=...:www-data --exclude-from dokuwiki.exclude --delete dokuwiki/ USER@HOST:/dokuwiki/
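
Since --delete is destructive, a dry run first is cheap insurance; rsync's -n (--dry-run) prints what would be transferred or deleted without changing the destination:

rsync -avn --chown=...:www-data --exclude-from dokuwiki.exclude --delete dokuwiki/ USER@HOST:/dokuwiki/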

Exclude the following files from subsequent copies:

conf/local.php       # Wiki configuration
conf/users.auth.php  # User accounts
conf/acl.auth.php    # ACL rules, optional
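
These are the same paths that would populate the dokuwiki.exclude file passed to --exclude-from above (one rsync pattern per line; a sketch, assuming the exclusions are exactly the three files listed):

conf/local.php
conf/users.auth.php
conf/acl.auth.php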

Make the following edits:

conf/local.php
$conf['title'] = '...';
$conf['useacl'] = 1;       /* needed, otherwise any user-retrieval functions will break */
$conf['disableactions'] = 'edit,login,media,...';  /* disable editing, login, and media actions */
$conf['updatecheck'] = 0;  /* updates are pushed from the development server */

and delete all non-admin users from conf/users.auth.php.
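
As a quick smoke test from outside the DMZ (HOST is a placeholder), a disabled action should render DokuWiki's "Action disabled" notice rather than a login form:

curl -s 'https://HOST/doku.php?id=start&do=login' | grep -i 'action disabled'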
