path: root/public_hurd_boxen/installation/darnassus.mdwn
author    Thomas Schwinge <thomas@codesourcery.com>    2016-03-18 10:51:44 +0100
committer Thomas Schwinge <thomas@codesourcery.com>    2016-03-18 10:51:44 +0100
commit    74cea5526f697635a3b7a702c733c27a6258eb8e (patch)
tree      19e1a6deda4d1f70c2dd8fc6ac96ad2070e55392 /public_hurd_boxen/installation/darnassus.mdwn
parent    f4ebd8729a7bd62cda48798cd19f47d2805c5f1d (diff)
parent    895751fb07e382499b4afb8339a5bdd0ee9a2c2b (diff)
Merge commit '895751fb07e382499b4afb8339a5bdd0ee9a2c2b'
Diffstat (limited to 'public_hurd_boxen/installation/darnassus.mdwn')
-rw-r--r--  public_hurd_boxen/installation/darnassus.mdwn | 19
1 file changed, 4 insertions(+), 15 deletions(-)
diff --git a/public_hurd_boxen/installation/darnassus.mdwn b/public_hurd_boxen/installation/darnassus.mdwn
index 4a86f609..620baa0a 100644
--- a/public_hurd_boxen/installation/darnassus.mdwn
+++ b/public_hurd_boxen/installation/darnassus.mdwn
@@ -1,4 +1,4 @@
-[[!meta copyright="Copyright © 2013, 2014, 2015 Free Software Foundation,
+[[!meta copyright="Copyright © 2013, 2014, 2015, 2016 Free Software Foundation,
Inc."]]
[[!meta license="""[[!toggle id="license" text="GFDL 1.2+"]][[!toggleable
@@ -14,9 +14,10 @@ License|/fdl]]."]]"""]]
# Packages
- * apache2
+ * sthttpd (thttpd)
Installation done by Richard.
+ [[!message-id "20151122000109.GA8492@shattrath"]].
* ikiwiki
@@ -92,7 +93,7 @@ May want to clean up `~hurd-web/public_html.workspace/.ikiwiki/` before that.
As a user with appropriate permissions, then install the CGI file:
-    $ sudo mv ~hurd-web/public_html.workspace.cgi /usr/lib/cgi-bin/hurd-web
+    $ sudo mv ~hurd-web/public_html.workspace.cgi /var/www/html/hurd-web.cgi
# `~tschwinge/`
@@ -122,15 +123,6 @@ The goal is that robots rather index the official pages,
<http://darnassus.sceen.net/~hurd-web/>.
-# Restore Backup
-
-/!\ TODO.
-
-## `/etc/apache2/mods-enabled/`
-
-`rewrite.load`, `userdir.conf`, `userdir.load`
-
-
# IRC, freenode, #hurd, 2013-02-09
<tschwinge> We need an httpd (Apache used to work), and ikiwiki and some
@@ -160,9 +152,6 @@ The goal is that robots rather index the official pages,
<tschwinge> OK, I again understand the setup. Last been touched in the
2008/2009 timeframe. ;-)
- <braunr> so you finished your pat i suppose
- <braunr> i'll install apache
- <braunr> part*
<tschwinge> I'll add a hurd-web user.
<tschwinge> So... I actually have to locate a backup of the files from
flubber related to the wiki,