[[!meta copyright="Copyright © 2013, 2014 Free Software Foundation, Inc."]]

[[!meta license="""[[!toggle id="license" text="GFDL 1.2+"]][[!toggleable
id="license" text="Permission is granted to copy, distribute and/or modify this
document under the terms of the GNU Free Documentation License, Version 1.2 or
any later version published by the Free Software Foundation; with no Invariant
Sections, no Front-Cover Texts, and no Back-Cover Texts.  A copy of the license
is included in the section entitled [[GNU Free Documentation
License|/fdl]]."]]"""]]

/!\ Incomplete, but hopefully helpful for future reference.


# Packages

  * apache2-mpm-prefork (used to be apache2-mpm-worker but changed because of some threading issues with CGI)

  * ikiwiki libcgi-session-perl libtext-csv-perl libcgi-formbuilder-perl
    libauthen-passphrase-perl libnet-openid-consumer-perl
    liblwpx-paranoidagent-perl libterm-readline-gnu-perl libgravatar-url-perl
    librpc-xml-perl libtext-wikiformat-perl libhighlight-perl perlmagick
    graphviz texinfo

      * libemail-send-perl (for my *sendmail vs. ikiwiki* patch)

      * libsearch-xapian-perl xapian-omega (for ikiwiki's search plugin)

      * libyaml-perl libyaml-syck-perl (for ikiwiki's YAML field plugins)

  * gitweb highlight (see the `gitweb.conf` sketch after this list)

        sudo ln -s ~hurd-web/hurd-web.git /var/lib/git/

  * git-daemon-sysvinit

    Enable as per `/usr/share/doc/git-daemon-sysvinit/README.Debian`.  Also set
    `GIT_DAEMON_OPTIONS=--export-all`, and `GIT_DAEMON_DIRECTORY='/var/lib/git'`
    (see the `/etc/default/git-daemon` sketch after this list).
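
The `highlight` package provides server-side syntax highlighting for gitweb.
How exactly this was enabled on darnassus is not recorded here; a minimal
sketch, assuming the stock Debian gitweb and the `/var/lib/git` project root
from the symlink above, would be:

    # /etc/gitweb.conf (assumed fragment, not copied from darnassus)
    # Repositories symlinked into this directory show up in gitweb.
    $projectroot = "/var/lib/git";
    # Enable syntax highlighting; requires the `highlight` package.
    $feature{'highlight'}{'default'} = [1];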
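
For `git-daemon-sysvinit`, the settings mentioned in the list above live in
`/etc/default/git-daemon`. A minimal sketch, assuming the stock Debian
defaults file (the actual file on darnassus is not reproduced here):

    # /etc/default/git-daemon (assumed fragment)
    # README.Debian's switch for actually starting the daemon.
    GIT_DAEMON_ENABLE=true
    # Serve repositories below /var/lib/git, even without git-daemon-export-ok.
    GIT_DAEMON_DIRECTORY='/var/lib/git'
    GIT_DAEMON_OPTIONS='--export-all'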


# `~hurd-web/`

    $ mkdir hurd-web.git && GIT_DIR=hurd-web.git git init
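
As per the IRC logs below, this is the repository that contributors push to
and pull from. A usage sketch, assuming SSH access to darnassus:

    $ git clone darnassus.sceen.net:~hurd-web/hurd-web.git
    $ cd hurd-web
    # ... edit the *.mdwn pages ...
    $ git commit -a -m 'Describe the change.'
    $ git push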


# `~tschwinge/`

/!\ TODO.

    $ mkdir tmp/backup && chmod 0733 tmp/backup
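
Mode 0733 turns the directory into a drop box: other users can deposit a file
under a name they know, but cannot list the directory's contents. A
hypothetical usage sketch (the file name is made up):

    $ scp some-file.tar.gz darnassus.sceen.net:~tschwinge/tmp/backup/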


# `/var/www/robots.txt`

/!\ TODO.

This file used to contain:

    User-agent: *
    Disallow: /gitweb/
    Disallow: /cgi-bin/

... which I've now changed to:

    User-agent: *
    Disallow: /

The goal is that robots index the official pages at
<http://www.gnu.org/software/hurd/> rather than the staging area on
<http://darnassus.sceen.net/~hurd-web/>.


# Restore Backup

/!\ TODO.

## `/etc/apache2/mods-enabled/`

`rewrite.load`, `userdir.conf`, `userdir.load`
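
These are the symlinks that Debian's `a2enmod` creates, so restoring them
should amount to re-enabling the modules (a sketch, assuming the standard
Debian Apache layout):

    $ sudo a2enmod rewrite userdir
    $ sudo service apache2 restart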


# IRC, freenode, #hurd, 2013-02-09

    <tschwinge> We need an httpd (Apache used to work), and ikiwiki and some
      such stuff.
    <tschwinge> This has its own git repository.
    <tschwinge> This was on a separate virtual machine.
    <tschwinge> Then there was the Git repository on flubber used for people to
      push to.
    <tschwinge> Ho -- let me actually try to remember the setup.  Has been some
      years...
    <braunr> what machine currently hosts the wiki ?
    <tschwinge> Anyway, there is no requirement for the web server to be on a
      separate machine; your decision.
    <tschwinge> braunr: http://www.gnu.org/software/hurd/public_hurd_boxen.html
    <tschwinge> snubber
    <tschwinge> That was the web server.
    <braunr> isn't it gnu.org ?
    <tschwinge> And flubber had the repository for developers to push to.
    <tschwinge> No, gnu.org is updated manually (by me).
    <braunr> ah
    <tschwinge> It's a snapshot of the wiki so to say.
    <braunr> ok so, is this wiki really meant to be modifiable from a browser ?
    <tschwinge>
      http://www.gnu.org/software/hurd/contributing/web_pages.html#index5h2
    <tschwinge> Yes.
    <braunr> i see
    <tschwinge> I should still be able to access the data from Barry's zenhost
      (including all the VMs it hosted), so I should be able to replicate that
      quite easily.
    <braunr> do you think it could be hosted on darnassus, or would you like a
      separate vm ?
    <tschwinge> The repository for people to push to and pull from (used to be
      on flubber) would be on darnassus now.
    <tschwinge> About the web server, hmm.
    <tschwinge> It's basically a security concern.
    <tschwinge> And it might get hammered by bots from time to time.
    <braunr> it won't need much resources i suppose
    <tschwinge> No.  The web server (snubber) was running with 242 MiB of RAM,
      and had uptimes of several weeks typically.
    <braunr> tschwinge: otherwise, could it use the web server running on the
      host ?
    <tschwinge> The host being darnassus?
    <braunr> no
    <braunr> shattrath, the linux system
    <tschwinge> Ah.
    <tschwinge> Sure.
    <tschwinge> There is no requirement for this to be a Hurd system -- was just
      nice to show to people.
    <braunr> i think it is too
    <braunr> what's the problem with darnassus ?
    <braunr> you mentioned security
    <tschwinge> The web server being a public-facing component which might be
      broken into.
    <braunr> how ?
    <braunr> it's so much easier to just ask for an account .. :)
    <tschwinge> Web server bugs, CGI script bugs, etc.
    <tschwinge> Sure.
    <tschwinge> I just wanted to make you aware of it.  :-)
    <braunr> oh don't worry
    <braunr> ok so darnassus it is
    <pinotree> was it running apache? maybe, if other (lighter?) web servers
      are tested to work on hurd, they could be used
    <tschwinge> pinotree: Yes, and yes.
    <braunr> doesn't ikiwiki need php ?
    <tschwinge> Only requirement (I think) is ability to run CGI scripts.
    <tschwinge> braunr: No.  It's written in perl.
    <braunr> ok
    <braunr> i still think i'll use apache
    <braunr> it's really not that heavy
    <braunr> lighter servers matter when the number of concurrent clients get
      very high
    <tschwinge> Then I'll figure out how exactly the setup was between flubber
      and snubber.
    <braunr> ok
    <braunr> it's good to finally get that going :)
    <tschwinge> braunr: Of course ;-) -- I had some parts of the process
      documented:
      http://www.gnu.org/software/hurd/public_hurd_boxen/installation/snubber.html
    <tschwinge> If both Git repositories are to be on the same machine
      (darnassus) we might not actually need inetutils-inetd and netcat.
    <tschwinge> Still trying to figure out what I had done there...  ;-)
    <tschwinge> OK, I again understand the setup.  Last been touched in the
      2008/2009 timeframe.  ;-)
    <braunr> :)
    <tschwinge> braunr: Please use the following ikiwiki packages: dpkg -i
      ~tschwinge/tmp/ikiwiki_3.20110608_all.deb
    <braunr> what makes this package special ?
    <tschwinge> Some patch that I added to get rendering of our news pages
      correct.
    <braunr> ok
    <tschwinge> I have not updated it ever since (and the patch was not yet in
      a suitable form for upstream).
    <tschwinge> Nothing major.
    <braunr> tschwinge: why is the ikiwiki package status hi ?
    <tschwinge> braunr: I set it to hold.
    <braunr> ah ok
    <braunr> so you finished your pat i suppose
    <braunr> i'll install apache
    <braunr> part*
    <tschwinge> I'll add a hurd-web user.
    <tschwinge> So...  I actually have to locate a backup of the files from
      flubber related to the wiki,
    * tschwinge goes searching his backup devices.
    <braunr> i added userdirs on darnassus' apache
    <tschwinge> braunr: I just noticed when I wanted to add it myself.  ;-)
    <tschwinge> braunr: Do you know about CGI scripts?
    <braunr> yes
    <tschwinge> braunr: snubber had these in /var/www/cgi-bin/; darnassus now
      in /usr/lib/cgi-bin/.
    <tschwinge> ikiwiki needs to install one CGI script.
    <braunr> ok
    <tschwinge> Does this go into /usr/lib/cgi-bin/ then?  Or into ~hurd-web/
      and a symlink somewhere?
    <braunr> ikiwiki should have installed it where it's appropriate
    <braunr> normally in /usr/lib/cgi-bin/
    <tschwinge> It's a CGI script that is generated per ikiwiki instance, so
      specific to hurd-web.
    <braunr> where does it install it by default ?
    <tschwinge> $PWD ;-)
    <braunr> ah
    <braunr> it seems a bit silly to me to generate cgi scripts :/
    <braunr> i don't care much actually, we won't have virtual servers
    <braunr> so anywhere is fine
    <tschwinge> What does the +SymLinksIfOwnerMatch Apache option mean?
    <braunr> apache will normally not follow symlink
    <braunr> unless the owner of the symlink is the same as the target's
    <braunr> (with this option)
    <tschwinge> That's enabled for CGI scripts.  So would it work to have a
      symlink /usr/lib/cgi-bin/hurd-web.cgi -> ~hurd-web/hurd-web.cgi?
    <braunr> the traditional way to access cgi scripts is to explicitly refer
      to them as http://server/cgi-bin/script
    <braunr> using *.cgi may allow too open access to cgis
    <braunr> (although normally, the userdir conf should disable them)
    <braunr> hm not sure it does
    <braunr> so put it in /usr/lib/cgi-bin/
    <tschwinge> So the hurd-web ikiwiki instance just needs to be configured
      accordingly with the URL where the CGI script will be found, and then it
      will render the pages accordingly.
    <tschwinge> OK.
    <braunr> and just named hurd-web
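
To summarize the CGI-related outcome of this discussion: the per-instance CGI
wrapper is installed as `/usr/lib/cgi-bin/hurd-web`, and the rendered pages
are served from the `~hurd-web` userdir. A minimal sketch of the relevant
ikiwiki setup options (written in ikiwiki's YAML setup format; the actual
setup file on darnassus is not reproduced here, and the `srcdir`/`destdir`
paths in particular are assumptions):

    # Assumed fragment of the hurd-web ikiwiki setup file.
    srcdir: /home/hurd-web/hurd-web          # checkout of hurd-web.git (assumed path)
    destdir: /home/hurd-web/public_html      # rendered pages, i.e. ~hurd-web (assumed path)
    url: http://darnassus.sceen.net/~hurd-web/
    # The generated CGI wrapper, named hurd-web, goes into /usr/lib/cgi-bin/:
    cgiurl: http://darnassus.sceen.net/cgi-bin/hurd-web
    cgi_wrapper: /usr/lib/cgi-bin/hurd-web
    cgi_wrappermode: 06755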


## IRC, freenode, #hurd, 2013-02-10

    <tschwinge> http://darnassus.sceen.net/~hurd-web/
    <tschwinge> Have at it!
    <tschwinge> braunr: ^
    <braunr> :)
    <braunr> great
    <tschwinge> And push to/pull from darnassus:~hurd-web/hurd-web.git for Git
      access.
    <tschwinge> Will update the web pages tomorrow, and all that.
    <tschwinge> braunr: And also install gitweb on darnassus, so one can view
      diffs of the wiki pages, etc.  OK?
    <braunr> tschwinge: there are still links towards bddebian
    <braunr> history for example
    <braunr> just fyi, we can look at this tomorrow
    <tschwinge> braunr: Yes, that's what I need gitweb for.
    <tschwinge> braunr: gitweb installed, hurd-web URLs fixed
      (s%bddebian%darnassus), also some more ikiwiki-related Perl packages
      installed (openID login, for example).


## IRC, freenode, #hurd, 2014-02-28

    <gg0> braunr: "no_identity_server: Could not determine ID provider from
      URL." trying to login to wiki with google account
    <braunr> ?
    <braunr> gg0: for the first question: how do you get that error ?
    <gg0> "trying to login to wiki with google account"
    <braunr> why tell me ?
    <gg0> darnassus wiki
    <braunr> tschwinge is the wiki administrator
    <gg0> on darnassus too? oh didn't know
    <braunr> the wiki on darnassus is the real wiki
    <braunr> the gnu.org one is a mirror that merges additions from savannah
    <braunr> it's not exactly that, gnu.org is a static version
    <braunr> trying to edit pages there redirects to darnassus
    <braunr> gg0: don't you want an account on darnassus so that you can
      directly edit source files ?
    <gg0> braunr: i like Preview button, my review-myself process takes hours
      :)
    <braunr> it's a lot quicker to rebuild ikiwiki content and inspect static
      files locally you know ;p
    <gg0> yeah for instance because by "Save page" it doesn't actually save it
    <teythoon> hm, worked for me just the other day
    <gg0> Edit says page knows new changes but they are not online. shouldn't
      they get online right away after Save page?
    <braunr> on darnassus, yes
    <braunr> check http://darnassus.sceen.net/gitweb/hurd-web.git/
    <braunr> doesn't look like anything was committed
    <braunr> so i'd say there is a bug
    <gg0> plus openid login one
    <tschwinge> I'll have a look.  I already fixed a similar situation a few
      weeks ago.
    <tschwinge> No idea what's happening there.
    <tschwinge> gg0: I assume these are your changes:
    <tschwinge> --- a/hurd/running/qemu.mdwn
    <tschwinge> +++ b/hurd/running/qemu.mdwn
    <tschwinge> @@ -366,6 +366,23 @@ Once you have logged in as `root` run the
      `pfinet` translator with values that a
    <tschwinge>  That should do it! Do not forget to edit/update
      `/etc/resolv.conf` to get DNS working.
    <tschwinge> +# QEMU Multiboot
    <tschwinge> [...]
    <tschwinge> gg0: Correct?
    <gg0> tschwinge: they are
    <tschwinge> gg0: Am I right assuming that when you tried to do your commit
      in the web interface, you did not specify a commit message?
    <gg0> tschwinge: correct. i didn't.
    <tschwinge> OK, then I think I know (and fixed) what was going on there.
    <tschwinge> gg0: Can you please retry that now?  Also, try to log in again
      using OpenID.
    <gg0> google button keeps saying "no_identity_server: Could not determine
      ID provider from URL."
    <gg0> Save page works now. it also commits to git
    <cluck> gg0: i don't know what you're doing exactly but you might want to
      double check your dns is working as expected (some anecdotal evidence
      implies some dns servers around the world might be having issues)
    <braunr> since the dns servers is on the sceen.net host machine which i
      administer, and i have been very careful about its configuration, it
      seems unlikely
    <braunr> server*
    <braunr> i don't see anything special in the logs
    <braunr> something at the client side might be involved though
    <cluck> braunr: as i said, i don't know what he's doing
    <cluck> braunr: fwiw the cases that caught my attention are apparently
      isp's dns servers


### IRC, freenode, #hurd, 2014-03-01

    <tschwinge> braunr, gg0: OpenID logins should again be working (no idea
      when exactly it broke).  I needed to uninstall the
      liblwpx-paranoidagent-perl package.