[[!meta copyright="Copyright © 2010 Free Software Foundation, Inc."]]

[[!meta license="""[[!toggle id="license" text="GFDL 1.2+"]][[!toggleable id="license" text="Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the section entitled [[GNU Free Documentation License|/fdl]]."]]"""]]

# Additional Packages

    apache2-mpm-worker
    build-essential
    git-core
    gitweb
    ikiwiki
    inetutils-inetd
    less
    libtext-csv-perl
    netcat
    nullmailer
    perlmagick
    screen
    texinfo

Yet more:

  * libemail-send-perl (for my *sendmail vs. ikiwiki* patch)

## [[open_issues/syslog]]

    $ find /etc/rc*/ | grep syslog | sudo xargs rm

# `~hurd-web/`

    $ mkdir hurd-web && GIT_DIR=hurd-web git init

# `~tschwinge/`

    $ mkdir tmp/backup && chmod 0733 tmp/backup

# `/var/www/robots.txt`

This file used to contain:

    User-agent: *
    Disallow: /gitweb/
    Disallow: /cgi-bin/

... which I've now changed to:

    User-agent: *
    Disallow: /

The goal is that robots index the official pages rather than the staging area on this machine.

# Restore Backup

## `/etc/apache2/mods-enabled/`

`rewrite.load`, `userdir.conf`, `userdir.load`
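
Should these symlinks be missing after restoring the backup, they can presumably be recreated with Debian's `a2enmod` helper rather than copied back by hand; a minimal sketch, assuming the stock Debian `apache2` packaging:

    $ sudo a2enmod rewrite
    $ sudo a2enmod userdir
    $ sudo /etc/init.d/apache2 reload

Enabling `userdir` this way links both `userdir.load` and `userdir.conf` from `mods-available/` into `mods-enabled/`.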