Recent changes to this wiki:

justification for base tag; relative base might be OK in practice
diff --git a/doc/todo/design_for_cross-linking_between_content_and_CGI.mdwn b/doc/todo/design_for_cross-linking_between_content_and_CGI.mdwn
index d00368b..64b3e84 100644
--- a/doc/todo/design_for_cross-linking_between_content_and_CGI.mdwn
+++ b/doc/todo/design_for_cross-linking_between_content_and_CGI.mdwn
@@ -32,8 +32,10 @@ rather than solving one bug at the cost of exacerbating another.
 * URIs in RSS feeds must be absolute, because feed readers do not have
   any consistent semantics for the base of relative links
 
-* If we have a `<base href>` then the HTML spec says it must be
-  absolute
+* If we have a `<base href>` then HTML 4.01 says it must be
+  absolute, although HTML 5 does relax this by defining semantics
+  for a relative `<base href>` - it is interpreted relative to the
+  "fallback base URL" which is the URL of the page being viewed
   ([[bugs/trouble_with_base_in_search]],
   [[bugs/preview_base_url_should_be_absolute]])
 
@@ -62,6 +64,11 @@ rather than solving one bug at the cost of exacerbating another.
   [[forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https]],
   [[forum/Dot_CGI_pointing_to_localhost._What_happened__63__]])
 
+* For relative links in page-previews to work correctly without
+  having to have global state or thread state through every use of
+  `htmllink` etc., `cgitemplate` needs to make links in the page body
+  work as if we were on the page being previewed.
+
 # "Would be nice"
 
 * In general, the more relative the better

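To illustrate the HTML 5 behaviour described above: a relative `<base href>` is resolved against the "fallback base URL" (the URL of the page being viewed), and links in the page body are then resolved against the result. A minimal sketch (not ikiwiki code) using the Perl URI module, with a made-up CGI URL:

    use URI;

    # hypothetical URL of the page being viewed - the "fallback base URL"
    my $page_url = "https://example.com/cgi-bin/ikiwiki.cgi";

    # a relative <base href="../"> resolves against the fallback base first
    my $base = URI->new_abs("../", $page_url);      # https://example.com/

    # links in the page body then resolve against that base
    my $link = URI->new_abs("sandbox/", $base);     # https://example.com/sandbox/

    print "$base\n$link\n";
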
remove suggestion to wrap inline in <table>, that won't work well
diff --git a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
index eb71994..3928c21 100644
--- a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
+++ b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
@@ -9,13 +9,8 @@ but there's a line in `inline.pm` that does:
 And the extra newlines break the table.  Can they be safely removed?
 
 > If you want an HTML table, I would suggest using an HTML table, which
-> should pass through Markdown without being interpreted further:
->
->     <table><tr>
->     \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtd]]
->     </tr></table>
->
-> where tagtd.tmpl is of the form `<td>your markup here</td>`; or even just
+> should pass through Markdown without being interpreted further. To
+> avoid getting the `<div>` inside the `<table>`, you can use:
 >
 >     \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtable]]
 >

add links to upstream reports
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index 63f32a5..377fd10 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -279,6 +279,9 @@ server name for SNI:
 > test odysseys, but here's hoping your travails save others some
 > time and effort. --[[schmonz]]
 
+> Reported upstream as [LWPx-ParanoidAgent#14](https://github.com/csirtgadgets/LWPx-ParanoidAgent/issues/14)
+> _and_ [IO-Socket-SSL#16](https://github.com/noxxi/p5-io-socket-ssl/issues/16). -- Chap
+
 # Success!!
 
 And with that, ladies and gents, I got my first successful OpenID login!

bit of unapologetic fingerpointing
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index a0b251d..63f32a5 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -1,6 +1,6 @@
 **TL;DR**
 
-[[!toc levels=3]]
+[[!toc levels=4]]
 
 # An odyssey through lots of things that have to be right before OpenID works
 
@@ -91,6 +91,26 @@ like mine will blacklist it.
 >>> so now [ikiwiki.info](/) accepts my OpenID. I'm still not sure it wouldn't be
 >>> worthwhile to change the useragent default.... -- Chap
 
+#### culprit was an Atomicorp ModSecurity rule
+
+Further followup: my provider is using [ModSecurity](https://www.modsecurity.org/)
+with a ruleset commercially supplied by [Atomicorp](https://www.atomicorp.com/products/modsecurity.html),
+which seems to be where this rule came from. They've turned the rule off for _my account_.
+I followed up on my ticket with them, suggesting they at least think about turning it off
+more systemwide (without waiting for other customers to have bizarre problems that are
+hard to troubleshoot), or opening a conversation with Atomicorp about whether such a rule
+is really a good idea. Of course, while they were very responsive about turning it off
+_for me_, it's much iffier whether they'll take my advice any farther than that.
+
+So, this may crop up for anybody with a provider that uses Atomicorp ModSecurity rules.
+
+The ruleset produces a log message saying "turn this rule off if you use libwww-perl", which
+just goes to show whoever wrote that message wasn't thinking about what breaks what. It would
+have to be "turn this rule off if any of _your_ customers might ever need to use or depend on
+an app or service _hosted anywhere else_ that _could_ have been implemented using libwww-perl,
+over which you and your customer have no knowledge or control."
+
+Sigh. -- Chap
 
 ## Error: OpenID failure: naive_verify_failed_network: Could not contact ID provider to verify response.
 

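Since the follow-ups above keep coming back to the `libwww-perl` useragent, here is what overriding it looks like at the LWP level; the agent string and URL are made up, and LWPx::ParanoidAgent is an LWP::UserAgent subclass, so the same option should apply there too. A sketch only, not a proposed ikiwiki change:

    use LWP::UserAgent;

    # the default agent string is "libwww-perl/<version>", which is what
    # useragent-blocking rules like the one described above match on
    my $ua = LWP::UserAgent->new(
        agent   => "ikiwiki-openid-test/0.1",   # hypothetical agent string
        timeout => 20,
    );

    my $response = $ua->get("https://id.example.org/");  # hypothetical OpenID endpoint
    print $response->status_line, "\n";
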
add and use a "pkgsrc" shortcut (to pkgsrc.se)
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index 40c3e5c..a0b251d 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -145,7 +145,7 @@ module.
 >
 > Irrelevant to this ikiwiki instance, perhaps relevant to others:
 > I've added these patches to [pkgsrc](http://www.pkgsrc.org)'s
-> `www/p5-LWPx-ParanoidAgent` and they'll be included in the
+> [[!pkgsrc www/p5-LWPx-ParanoidAgent]] and they'll be included in the
 > soon-to-be-cut 2014Q3 branch. --[[schmonz]]
 
 ## Still naive_verify_failed_network, new improved reason
diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn
index ca529c2..72e4c7c 100644
--- a/doc/shortcuts.mdwn
+++ b/doc/shortcuts.mdwn
@@ -63,6 +63,7 @@ This page controls what shortcut links the wiki supports.
 * [[!shortcut name=mozillazinekb url="http://kb.mozillazine.org/%s"]]
 * [[!shortcut name=freebsdwiki url="http://wiki.freebsd.org/%s"]]
 * [[!shortcut name=hackage url="http://hackage.haskell.org/package/%s"]]
+* [[!shortcut name=pkgsrc url="http://pkgsrc.se/%S"]]
 
 To add a new shortcut, use the `shortcut`
 [[ikiwiki/directive]]. In the url, "%s" is replaced with the

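For reference, the new shortcut turns `\[[!pkgsrc www/p5-LWPx-ParanoidAgent]]` into a link to `http://pkgsrc.se/www/p5-LWPx-ParanoidAgent`; `%S` is used rather than `%s`, presumably so that the `/` in package paths survives URL-encoding. A rough sketch of that substitution (not the shortcut plugin's actual code):

    use URI::Escape qw(uri_escape_utf8);

    my $url  = "http://pkgsrc.se/%S";
    my $text = "www/p5-LWPx-ParanoidAgent";   # the text given to the directive

    (my $link = $url) =~ s/\%s/uri_escape_utf8($text)/eg;   # %s: URL-encoded text
    $link =~ s/\%S/$text/g;                                  # %S: raw text

    print "$link\n";   # http://pkgsrc.se/www/p5-LWPx-ParanoidAgent
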
I hope pkgsrc also no longer has these problems
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index a3637a2..40c3e5c 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -142,6 +142,11 @@ module.
 > To be clear, these are patches to [[!cpan LWPx::ParanoidAgent]].
 > Debian's `liblwpx-paranoidagent-perl (>= 1.10-3)` appears to
 > have those two patches. --[[smcv]]
+>
+> Irrelevant to this ikiwiki instance, perhaps relevant to others:
+> I've added these patches to [pkgsrc](http://www.pkgsrc.org)'s
+> `www/p5-LWPx-ParanoidAgent` and they'll be included in the
+> soon-to-be-cut 2014Q3 branch. --[[schmonz]]
 
 ## Still naive_verify_failed_network, new improved reason
 
@@ -219,6 +224,8 @@ yet.
 
 > Also in Debian's `liblwpx-paranoidagent-perl (>= 1.10-3)`, for the record.
 > --[[smcv]]
+>
+> And now in pkgsrc's `www/p5-LWPx-ParanoidAgent`, FWIW. --[[schmonz]]
 
 Only that still doesn't end the story, because that hand didn't know what
 [this hand](https://github.com/noxxi/p5-io-socket-ssl/commit/4f83a3cd85458bd2141f0a9f22f787174d51d587#diff-1)
@@ -247,6 +254,10 @@ server name for SNI:
 > (which is where ikiwiki.info's supporting packages come from).
 > Please report it upstream too, if the Debian maintainer doesn't
 > get there first. --[[smcv]]
+> 
+> Applied in pkgsrc. I haven't attempted to conduct before-and-after
+> test odysseys, but here's hoping your travails save others some
+> time and effort. --[[schmonz]]
 
 # Success!!
 

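For anyone hitting the same wall: the "server name for SNI" part boils down to handing the intended hostname to IO::Socket::SSL when the connection is made, which is roughly what the patches referenced above arrange for inside LWPx::ParanoidAgent. A standalone sketch with made-up host names:

    use IO::Socket::SSL;

    # connect to a specific address, but tell the TLS layer which virtual
    # host we want so the server can present the matching certificate
    my $sock = IO::Socket::SSL->new(
        PeerHost          => "203.0.113.10",    # hypothetical server address
        PeerPort          => 443,
        SSL_hostname      => "id.example.org",  # SNI name sent in the handshake
        SSL_verifycn_name => "id.example.org",  # name the certificate is checked against
    ) or die "TLS connect failed: $SSL_ERROR\n";

    print "connected to ", $sock->peerhost, "\n";
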
review
diff --git a/doc/todo/sortbylastcomment_plugin.mdwn b/doc/todo/sortbylastcomment_plugin.mdwn
index 84cf86e..b4110c0 100644
--- a/doc/todo/sortbylastcomment_plugin.mdwn
+++ b/doc/todo/sortbylastcomment_plugin.mdwn
@@ -11,3 +11,24 @@ You'll find it in this repository, in the 'sortbylastcomment' branch:
 > Reviewed, tested: looks good to me. We need it for the [Tails forum](https://tails.boum.org/forum/). --[[intrigeri]]
 
 >> Hi, is there a chance of seeing this plugin getting included in a release at any point soon? --sajolida
+
+>>> (Reviewing, better late than never...)
+>>>
+>>> It seems really non-obvious to me that the mtime of a page is
+>>> updated as a side-effect of sorting. I think it might also happen too
+>>> late for it to have the desired effect: mtimes should be updated before
+>>> the build phase starts, but sorting happens during the build phase.
+>>>
+>>> If we had a solution for [[!debbug 479371]] - copying
+>>> the mtime from child pages to a parent page - then it would
+>>> be enough to configure the forum threads to inherit the mtime
+>>> of their comments, and then sorting by mtime would do what
+>>> you wanted. The remaining problem would be to have a page pick up the
+>>> most recent mtime from a somewhat configurable set of pages. If the page
+>>> selection is done by pagespec, then by the time those can be matched
+>>> deterministically, it's also too late to be getting the desired
+>>> effect from changing mtimes... so perhaps this is a non-starter.
+>>>
+>>> Alternatively, perhaps just doing the sorting, and updating some
+>>> displayable last-update counter that is not the mtime, would be OK?
+>>> --[[smcv]]

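To make the mtime-inheritance idea above concrete, here is roughly what "copy the newest comment mtime to the thread" would look like in terms of IkiWiki's `%pagemtime` index. The plugin name and function are hypothetical, and as noted above the update would have to happen before the build phase starts for sorting to see it, so treat this as an illustration of the data flow rather than a working plugin:

    package IkiWiki::Plugin::inheritmtime;   # hypothetical plugin name

    use warnings;
    use strict;
    use IkiWiki 3.00;

    # give a thread page the mtime of its newest comment, so that sorting
    # threads by mtime reflects the latest activity on each thread
    sub inherit_newest_mtime {
        my ($thread, @comments) = @_;
        for my $comment (@comments) {
            my $mtime = $IkiWiki::pagemtime{$comment};
            next unless defined $mtime;
            $IkiWiki::pagemtime{$thread} = $mtime
                if $mtime > ($IkiWiki::pagemtime{$thread} || 0);
        }
    }

    1;
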
Added a comment
diff --git a/doc/forum/PO_and_RTL_support/comment_7_a3ac2ad8a5e89efae1bbfdc4306678a7._comment b/doc/forum/PO_and_RTL_support/comment_7_a3ac2ad8a5e89efae1bbfdc4306678a7._comment
new file mode 100644
index 0000000..7ee60f5
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support/comment_7_a3ac2ad8a5e89efae1bbfdc4306678a7._comment
@@ -0,0 +1,29 @@
+[[!comment format=mdwn
+ username="smcv"
+ ip="81.100.115.242"
+ subject="comment 7"
+ date="2014-09-17T15:52:36Z"
+ content="""
+LTR with embedded RTL, or vice versa, sounds like a job for
+the [[tips/Right-to-left___40__RTL__41___page_text]] tip or
+something very similar.
+
+> Maybe the direction setting in the CSS also has other effects
+
+https://html.spec.whatwg.org/#the-dir-attribute suggests that the
+`dir` attribute is meant to be sufficient, but perhaps it's overridden
+by an explicit `text-align: left`?
+
+> most of the time what you actually mean is to reverse the direction:
+> RTL becomes LTR and vice versa
+
+I don't think \"I know I am switching between English and Arabic,
+but I don't know which one I'm currently writing\" is a major use-case :-)
+
+> an option to have different wiki pages/section with different master
+> languages
+
+It sounds as though the po plugin is not really what you want, and
+you'd be better off with being able to write
+`\[[!meta lang=ar dir=rtl]]` or something.
+"""]]

diff --git a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
index f07dacd..46b6199 100644
--- a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
+++ b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
@@ -15,3 +15,5 @@ When using the GeoJSON output of the OSM plugin (osm_format: GeoJSON), the name
 
 >> No, although I believe the only code that parses this is line 112 of
 >> [underlays/osm/ikiwiki/osm.js](http://source.ikiwiki.branchable.com/?p=source.git;a=blob;f=underlays/osm/ikiwiki/osm.js;h=37e588f7b5bba4c1125052f82c358359a3459705;hb=HEAD#l112).
+
+>>> Ah, right, then this may make sense after all... --[[anarcat]]

correct
diff --git a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
index 3de7a37..82bebbb 100644
--- a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
+++ b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
@@ -130,3 +130,5 @@ Here's [[smcv]]'s review from [[todo/osm_plugin_GeoJSON_popup_patch]], annotated
 >>>   per-wiki configuration
 >>>
 >>> --s
+>>>
+>>>> That is correct. --[[anarcat]]

Added a comment
diff --git a/doc/forum/PO_and_RTL_support/comment_6_85012f6ce7050beeca8a70e1ac27eba2._comment b/doc/forum/PO_and_RTL_support/comment_6_85012f6ce7050beeca8a70e1ac27eba2._comment
new file mode 100644
index 0000000..3ae1db6
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support/comment_6_85012f6ce7050beeca8a70e1ac27eba2._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawlcaGfdn9Kye1Gc8aGb67PDVQW4mKbQD7E"
+ nickname="Amitai"
+ subject="comment 6"
+ date="2014-09-17T14:24:38Z"
+ content="""
+smcv wrote:
+
+> As far as I know, none of the IkiWiki committers can read any RTL languages
+
+I read Hebrew well enough to detect chirality errors (e.g., L-Hebrew in an R-Hebrew universe). --[[schmonz]]
+"""]]

Added a comment
diff --git a/doc/forum/PO_and_RTL_support/comment_5_4f4e16afd6012796ef87a14aafe11d79._comment b/doc/forum/PO_and_RTL_support/comment_5_4f4e16afd6012796ef87a14aafe11d79._comment
new file mode 100644
index 0000000..91a2870
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support/comment_5_4f4e16afd6012796ef87a14aafe11d79._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="smcv"
+ ip="81.100.115.242"
+ subject="comment 5"
+ date="2014-09-17T11:35:07Z"
+ content="""
+`<div>` is not specifically preferred, any block-level element will do
+(e.g. `<p>`); but `<div>` is something you can wrap around any block,
+so it's good for a generic `\[[!template]]`.
+
+The difference between the use of a `dir` attribute and the use
+of a `class` attribute is that `dir` has a spec-defined semantic
+meaning in HTML4 and HTML5: search engines can look at
+`<div dir=\"rtl\">` and know that it is definitely right-to-left.
+
+`<div class=\"rtl\">` *might* mean right-to-left, but it could equally
+well mean (for instance) documentation about a run-time library,
+or something; classes have no built-in semantic meaning that generic
+user-agents like browsers and search engines can rely on.
+"""]]

Added a comment
diff --git a/doc/forum/PO_and_RTL_support/comment_4_906ed30ea85cf2910603d3ca94b7e46c._comment b/doc/forum/PO_and_RTL_support/comment_4_906ed30ea85cf2910603d3ca94b7e46c._comment
new file mode 100644
index 0000000..55bbeff
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support/comment_4_906ed30ea85cf2910603d3ca94b7e46c._comment
@@ -0,0 +1,49 @@
+[[!comment format=mdwn
+ username="fr33domlover"
+ ip="46.117.109.179"
+ subject="comment 4"
+ date="2014-09-17T11:22:57Z"
+ content="""
+> Could you test whether your tip works with \<div dir=\"rtl\"> or something, please?
+
+Sure, I will check that soon. I think it does, I just tried here in ikiwiki. Just curious, why is
+div preferred? IIRC I use \"class\" there after looking at some existing templates. But
+I'm not an expert, especially not in CSS. Would that be used as an HTML4 parallel of the dir attribute?
+
+As to that website with the patch, the problem is that the text is aligned to the left. When
+I type Hebrew in an LTR page, it already shows more or less correctly - English words are
+shown in correct letter order thanks to the bidi algorithm. The issue seems to be aligning
+to the right - that is what my tip does. Maybe the direction setting in the CSS also has other
+effects - I just know it works :-)
+
+I'll happily help with the tests. I also have a test page on my wiki which uses many ikiwiki
+features, to demonstrate how they all look in RTL. Test case ideas:
+
+- Page in RTL (e.g. Arabic) with an LTR paragraph (e.g. English)
+- Page in RTL with LTR paragraph in the same language (e.g. fancy way to write a poem)
+- Page in LTR (e.g. English) with an RTL paragraph (e.g. Hebrew)
+- Page in LTR with RTL paragraph in the same language (poem again)
+- Translated page - master language is LTR, slave is RTL
+- Translated page - master language is RTL, slave is LTR
+- Master LTR page has RTL paragraph, all slaves have it RTL too regardless of their global direction
+- Master RTL page has LTR paragraph, all slaves have it LTR too regardless of their global direction
+
+An example for the last 2 tests is an English master page about linguistics which has a paragraph in some
+RTL language that is being studied, and all slave pages must keep that paragraph intact - both the
+text itself and its RTL direction. But the rest of the page can be translated and correctly made RTL when
+translated to RTL languages.
+
+This gives me another idea - most of the time what you actually mean is to reverse the direction: RTL
+becomes LTR and vice versa. When writing some fancy poem, that's what you probably want. But in the
+previous example, the direction should not be reversed - so there should maybe be two kinds of direction
+modifiers:
+
+1. Dynamic (the default) - You write e.g. a master page in LTR and some RTL paragraphs. An RTL translation
+   automatically reverses directions, RTL <=> LTR.
+2. Fixed - this is like my tip, e.g. an RTL paragraph in an LTR page has a fixed direction set, which is kept even in
+    translations for RTL languages - the page in general is reversed, but that paragraph is not.
+
+Another very useful thing (at least to me) would be an option to have different wiki pages/sections with
+different master languages. I have sections in English and sections in Hebrew, which makes the PO
+plugin a problem to use, unless I keep one of these sections untranslated.
+"""]]

start designing by listing constraints/requirements
diff --git a/doc/todo/design_for_cross-linking_between_content_and_CGI.mdwn b/doc/todo/design_for_cross-linking_between_content_and_CGI.mdwn
new file mode 100644
index 0000000..d00368b
--- /dev/null
+++ b/doc/todo/design_for_cross-linking_between_content_and_CGI.mdwn
@@ -0,0 +1,104 @@
+We're accumulating a significant number of bugs related to cross-linking
+between the content and the CGI not being as relative as we would like.
+This is an attempt to design a solution for them all in a unified way,
+rather than solving one bug at the cost of exacerbating another.
+--[[smcv]]
+
+# Terminology
+
+* Absolute: starts with a scheme, like
+  `http://example.com/ikiwiki.cgi`, `https://www.example.org/`
+
+* Protocol-relative: starts with `//` like `//example.com/ikiwiki.cgi`
+
+* Host-relative: starts with `/` like `/ikiwiki.cgi`
+
+* Relative: starts with neither `/` nor a scheme, like `../ikiwiki.cgi`
+
+# What we need
+
+* Static content must be able to link to other static content
+
+* Static content must be able to link to the CGI
+
+* CGI-generated content must be able to link to arbitrary
+  static content (it is sufficient for it to be able to link
+  to the "root" of the `destdir`)
+
+* CGI-generated content must be able to link to the CGI
+
+# Constraints
+
+* URIs in RSS feeds must be absolute, because feed readers do not have
+  any consistent semantics for the base of relative links
+
+* If we have a `<base href>` then the HTML spec says it must be
+  absolute
+  ([[bugs/trouble_with_base_in_search]],
+  [[bugs/preview_base_url_should_be_absolute]])
+
+* It is currently possible for the static content and the CGI
+  to be on different domains, e.g. `www.example.com`
+  vs. `cgi.example.com`; this should be preserved
+
+* It is currently possible to serve static content "mostly over
+  HTTP" (i.e. advertise a http URI to readers, and use a http
+  URI in RSS feeds etc.) but use HTTPS for the CGI
+
+* If the static content is served over HTTPS, it must refer
+  to other static content and the CGI via HTTPS (to avoid
+  mixed content, which is a vulnerability); this may be
+  either absolute, protocol-relative, host-relative or relative
+
+* If the CGI is served over HTTPS, it must refer to static
+  content and the CGI via HTTPS; again, this may be
+  either absolute, protocol-relative, host-relative or relative
+  ([[todo/Protocol_relative_urls_for_stylesheet_linking]])
+
+* Because reverse proxies and `w3mmode` exist, it must be
+  possible to configure ikiwiki to not believe the `HTTPS`, etc.,
+  CGI variables, and force a particular scheme or host
+  ([[bugs/W3MMode_still_uses_http:__47____47__localhost__63__]],
+  [[forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https]],
+  [[forum/Dot_CGI_pointing_to_localhost._What_happened__63__]])
+
+# "Would be nice"
+
+* In general, the more relative the better
+
+* [[schmonz]] wants to direct all CGI pageviews to https
+  even if the visitor comes from http (but this can be done
+  at the webserver level by making http://example.com/ikiwiki.cgi
+  a redirect to https://example.com/ikiwiki.cgi, so is not
+  necessarily mandatory)
+
+* [[smcv]] has some sites that have non-CA-cartel-approved
+  certificates, with a limited number of editors who can be taught
+  to add SSL policy exceptions and log in via https;
+  anonymous/read-only actions like `do=goto` should
+  not go via HTTPS, since random readers would get scary SSL
+  warnings
+  ([[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]],
+  [[forum/CGI_script_and_HTTPS]])
+
+* It would be nice if the CGI did not need to use a `<base>` so that
+  we could use host-relative URI references (`/sandbox/`) or scheme-relative
+  URI references (`//static.example.com/sandbox/`)
+  (see [[bugs/trouble_with_base_in_search]])
+
+As a consequence of the "no mixed content" constraint, I think we can
+make some assumptions:
+
+* if the `cgiurl` is http but the CGI discovers at runtime that it has
+  been reached via https, we can assume that the https equivalent,
+  or a host- or protocol-relative URI reference to itself, would work;
+
+* if the `url` is http but the CGI discovers at runtime that it has been
+  reached via https, we can assume that the https equivalent of the `url`
+  would work
+
+In other words, best-practice would be to list your `url` and `cgiurl`
+in the setup file as http if you intend that they will most commonly
+be accessed via http (e.g. the "my cert is not CA-cartel approved"
+use-case), or as https if you intend to force accesses into
+being via https (the "my wiki is secret" use-case).

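The four kinds of URI reference in the terminology section differ only in how much they inherit from the page they appear on; a small illustration (not ikiwiki code) using the Perl URI module and a hypothetical page URL:

    use URI;

    my $page = "https://example.com/sandbox/page/";   # hypothetical page URL

    for my $ref (
        "http://other.example.org/ikiwiki.cgi",   # absolute: nothing inherited
        "//static.example.com/style.css",         # protocol-relative: inherits https
        "/ikiwiki.cgi",                           # host-relative: inherits scheme and host
        "../../ikiwiki.cgi",                      # relative: resolved against the path too
    ) {
        printf "%-40s => %s\n", $ref, URI->new_abs($ref, $page);
    }
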
probably fixed
diff --git a/doc/bugs/cannot_decode_wide_characters_error_with_utf-8_encoding.mdwn b/doc/bugs/cannot_decode_wide_characters_error_with_utf-8_encoding.mdwn
index 2b02f3b..c665fff 100644
--- a/doc/bugs/cannot_decode_wide_characters_error_with_utf-8_encoding.mdwn
+++ b/doc/bugs/cannot_decode_wide_characters_error_with_utf-8_encoding.mdwn
@@ -5,3 +5,6 @@ Pressing on either `Save Page` or `Preview` button results in
 Editing the wiki page with non-Latin characters using the web interface also fails with the same error.
 Additionally, embedding graphviz graphs with non-Latin characters leads to the same error.
 Observed in ikiwiki versions 3.20130904 and 3.20140102.
+
+> This is probably [[fixed|done]] in 3.20140916. Please provide more
+> information if not. --[[smcv]]

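The failure mode behind this bug is double decoding: the changelog for 3.20140916 notes that with Encode.pm >= 2.53, decoding a string that has already been decoded (and contains characters above U+00FF) dies with exactly this message. A minimal sketch of reproducing it outside ikiwiki:

    use Encode qw(encode_utf8 decode_utf8);

    my $bytes = encode_utf8("\x{5e9}\x{5dc}\x{5d5}\x{5dd}");   # UTF-8 octets for Hebrew "shalom"
    my $text  = decode_utf8($bytes);    # correct: decode the octets once

    # decoding again hands Encode a string that already contains wide
    # characters; with Encode.pm >= 2.53 this dies with
    # "Cannot decode string with wide characters"
    my $oops = decode_utf8($text);
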
review
diff --git a/doc/todo/osm_plugin_icon_patch.mdwn b/doc/todo/osm_plugin_icon_patch.mdwn
index ccb7810..f7d7a2a 100644
--- a/doc/todo/osm_plugin_icon_patch.mdwn
+++ b/doc/todo/osm_plugin_icon_patch.mdwn
@@ -4,3 +4,30 @@
 Currently, the documented icon parameter to the waypoint directive is not used. This patch fixes that, and fixes some related problems in the KML output. 
 
 > That patch looks pretty awesome, thanks for your work on it. I don't have time to test it now, but if it works, I am all for its inclusion. --[[anarcat]]
+
+>     +    my $tag = $params{'tag'};
+>
+> Please check indentation: you're mixing spaces and hard tabs, apparently
+> with the assumption that a tab is worth 4 spaces.
+>
+>     -	my $icon = $config{'osm_default_icon'} || "ikiwiki/images/osm.png"; # sanitized: we trust $config
+>     +	my $icon = $params{'icon'}; # sanitized: we trust $config
+>
+> So there's a comment there that explains why the value of `$icon` can
+> be trusted, but it is no longer true, because it no longer comes from
+> `$config`. This does not fill me with confidence. Maybe it's OK to use
+> a wiki-editor-supplied icon, maybe not. If it is OK, please justify why,
+> and in any case, please do not leave old comments if they are no longer
+> true.
+>
+> In this case I suspect editors may be able to specify an icon whose URL is
+> `javascript:alert("cross-site scripting!")` (or something more malicious)
+> and have it written into the KML as-is. The osm plugin has had cross-site
+> scripting vulnerabilities before, I don't want to add another.
+>
+>     +            	externalGraphic: "${icon}"
+>
+> I don't think Perl variable interpolation is going to work in Javascript?
+> I suspect this should have been inserting something into the GeoJSON instead?
+>
+> --[[smcv]]

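One possible shape for the missing check mentioned above (a sketch only, not the submitted patch, and the helper name is made up): accept the editor-supplied icon only when it looks like a plain http(s) URL or a wiki-relative path, so values such as `javascript:` URLs never reach the KML or JavaScript output.

    use strict;
    use warnings;

    # hypothetical helper: fall back to the trusted default unless the
    # editor-supplied icon is a plain http(s) URL or a wiki-relative path
    sub safe_icon {
        my ($icon, $default) = @_;
        return $default unless defined $icon;
        return $icon if $icon =~ m{^(?:https?://[\w.-]+(?:/[\w.~/-]*)?|[\w][\w.~/-]*)$};
        return $default;
    }

    print safe_icon('javascript:alert("xss")', "ikiwiki/images/osm.png"), "\n";     # falls back
    print safe_icon("https://example.com/pin.png", "ikiwiki/images/osm.png"), "\n"; # kept
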
Added a comment
diff --git a/doc/forum/PO_and_RTL_support/comment_3_5ab0391517ab4b84666ed8b1360e4ad5._comment b/doc/forum/PO_and_RTL_support/comment_3_5ab0391517ab4b84666ed8b1360e4ad5._comment
new file mode 100644
index 0000000..18d6800
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support/comment_3_5ab0391517ab4b84666ed8b1360e4ad5._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="smcv"
+ ip="81.100.115.242"
+ subject="comment 3"
+ date="2014-09-17T08:29:48Z"
+ content="""
+> I saw a recent patch which claims to solve the problem by exposing the language code and direction to the templates
+
+It looks as though you mean [[mhameed]]'s change from
+[[todo/expose_html_language_and_direction]], which exposed them to the
+templates, but did not modify the default `page.tmpl` to make use
+of them. Perhaps you or mhameed could provide a `page.tmpl` patch?
+"""]]

Added a comment
diff --git a/doc/forum/PO_and_RTL_support/comment_2_d302d47b3b3c2d75fa8de353d09cb825._comment b/doc/forum/PO_and_RTL_support/comment_2_d302d47b3b3c2d75fa8de353d09cb825._comment
new file mode 100644
index 0000000..b6ebceb
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support/comment_2_d302d47b3b3c2d75fa8de353d09cb825._comment
@@ -0,0 +1,31 @@
+[[!comment format=mdwn
+ username="smcv"
+ ip="81.100.115.242"
+ subject="comment 2"
+ date="2014-09-17T08:19:38Z"
+ content="""
+If I'm interpreting that Arabic website correctly, it *is* RTL, but
+left-justified (which is a somewhat confusing CSS glitch, but hopefully
+not a barrier to understanding by people who can read Arabic). English
+words embedded in the Arabic are LTR, but my understanding of the bidi
+algorithm is that that's meant to happen.
+
+For instance, in the English version, the last paragraph before the inline says:
+
+> Please feel free to subscribe to the rss or atom feeds to be informed on when new addons or a new version of an addon is made available. The following community supported addons are available:
+
+and in the Arabic version, the last paragraph looks like this in my browser
+(where `*****` represents Arabic that I don't know how to read):
+
+    : ***** (... lots more ....) ***** atom feeds * rss **** ****
+
+So that looks right for RTL: the colon is at the end (left), and the
+mentions of rss feeds and atom feeds are at the beginning (right).
+When I \"view source\", it's the other way round.
+
+Also, the page source says:
+
+    <html xmlns=\"http://www.w3.org/1999/xhtml\" lang=\"ar\" xml:lang=\"ar\" dir=\"rtl\">
+
+which looks right?
+"""]]

Added a comment: next steps
diff --git a/doc/forum/PO_and_RTL_support/comment_1_5506d5878cfc7ad9a34f85c49d523ec3._comment b/doc/forum/PO_and_RTL_support/comment_1_5506d5878cfc7ad9a34f85c49d523ec3._comment
new file mode 100644
index 0000000..9c71889
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support/comment_1_5506d5878cfc7ad9a34f85c49d523ec3._comment
@@ -0,0 +1,44 @@
+[[!comment format=mdwn
+ username="smcv"
+ ip="81.100.115.242"
+ subject="next steps"
+ date="2014-09-17T08:09:50Z"
+ content="""
+HTML5 says:
+
+> Authors are strongly encouraged to use the dir attribute to indicate text direction rather than using CSS, since that way their documents will continue to render correctly even in the absence of CSS (e.g. as interpreted by search engines).
+
+Could you test whether your tip works with `<div dir=\"rtl\">` or something,
+please? If it does, please change the tip, if not, we'll have to look at
+whether the [[plugins/htmlscrubber]] is getting in the way.
+
+After that, I think the next step towards good RTL support would be to
+put together some test-cases for things that are meant to work, in the
+form of:
+
+* self-contained source code and setup file for a very simple wiki
+* the pages in that wiki making it clear what their intended text
+  direction is (e.g. \"this paragraph should be right to left\")
+
+As far as I know, none of the IkiWiki committers can read any RTL
+languages, so if you use Arabic or Hebrew or whatever in those
+test-cases, we'll need a screenshot/image of what it's meant to look 
+like. Using Latin text marked as RTL (so it should come out backwards
+if everything is working correctly) might be easier.
+
+The obvious cases that I can think of are:
+
+* the wiki is \"mostly\" in a RTL language
+* the master language is LTR but the [[plugins/po]] plugin
+  provides a translation into a RTL language
+
+and possibly
+
+* the master language is RTL but the [[plugins/po]] plugin
+  provides a translation into a LTR language
+
+It might be necessary to add support for a per-wiki, per-page or
+(for po) per-translation-language direction override that would set
+the `<html dir>` attribute, but we should find test-cases first, then we
+can work out solutions.
+"""]]

Add comment regarding GeoJSON output
diff --git a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
index 46129f3..f07dacd 100644
--- a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
+++ b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
@@ -12,3 +12,6 @@ When using the GeoJSON output of the OSM plugin (osm_format: GeoJSON), the name
 > --[[smcv]] [[!tag reviewed]]
 
 >> This is especially confusing because this is actually about JSON, not KML. Disregarding that, here's the [geojson homepage](http://geojson.org/) which has a link to the spec. The spec doesn't seem to specify `description`, `desc` or `name` anywhere. --[[anarcat]]
+
+>> No, although I believe the only code that parses this is line 112 of
+>> [underlays/osm/ikiwiki/osm.js](http://source.ikiwiki.branchable.com/?p=source.git;a=blob;f=underlays/osm/ikiwiki/osm.js;h=37e588f7b5bba4c1125052f82c358359a3459705;hb=HEAD#l112).

respond
diff --git a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
index 1b2d40a..3de7a37 100644
--- a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
+++ b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
@@ -25,12 +25,25 @@ Here's [[smcv]]'s review from [[todo/osm_plugin_GeoJSON_popup_patch]], annotated
 
 >> Or `@layers = ( 'OSM' );`. --[[anarcat]]
 
+>>> Yeah, and then `layers => [@layers]` or `layers => \@layers`
+>>> to turn it into a reference when building `%options`. --s
+
 >     +		@layers = [ split(/,/, $params{layers}) ];
 >
 > Is comma-separated the best fit here? Would whitespace, or whitespace and/or
 > commas, work better?
 
->> Why don't we simply keep it an array as it already is? I fail to see the reason behind that change. This is the config I use right now on http://reseaulibre.ca/:
+>> Why don't we simply keep it an array as it already is? I fail to see the reason behind that change.
+>>
+>>> This seems to be at least partially a feature request for \[[!osm]]:
+>>> "allow individual \[[!osm]] maps to override `$config{osm_layers}`".
+>>> Items in `%config` can be a reference to an array, so that's fine.
+>>> However, parameters to a [[ikiwiki/directive]] cannot be an array,
+>>> so for the directive, we need a syntax for taking a scalar parameter
+>>> and splitting it into an array - comma-separated, whitespace-separated,
+>>> whatever. --s
+>>
+>> This is the config I use right now on http://reseaulibre.ca/:
 >> 
 >> ~~~~
 >> osm_layers:
@@ -107,3 +120,13 @@ Here's [[smcv]]'s review from [[todo/osm_plugin_GeoJSON_popup_patch]], annotated
 >> That is an accurate statement.
 >>
 >> This is old code, so my memory may be cold, but I think that the "layers" parameter used to be a hash, not an array, until two years ago (commit 636e04a). The javascript code certainly expects an array right now. --[[anarcat]]
+
+>>> OK, then I think this might be a mixture of a bug and a feature request:
+>>>
+>>> * bug: the configuration suggested by the example (or the default when
+>>>   unconfigured, or something) produces "TypeError: mapProjection is null"
+>>>
+>>> * feature request: per-\[[!osm]] configuration to complement the
+>>>   per-wiki configuration
+>>>
+>>> --s

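Putting the two points above together (keep `$config{osm_layers}` as the default, and split a scalar directive parameter into a list), the assignment could end up looking roughly like this; the whitespace-separated syntax and the sample values are assumptions, not what the branch currently does:

    use strict;
    use warnings;

    # stand-ins for the plugin's wiki-wide %config and the directive's %params
    my %config = (osm_layers => [ 'OSM', 'GoogleHybrid' ]);
    my %params = (layers => 'OSM WTF OMG');
    my %options;

    # a directive-level "layers" parameter overrides the wiki-wide default,
    # and %options always ends up holding an array reference
    my $layers = defined $params{layers}
        ? [ split ' ', $params{layers} ]
        : ($config{osm_layers} || [ 'OSM' ]);
    $options{layers} = $layers;

    print join(", ", @{ $options{layers} }), "\n";   # OSM, WTF, OMG
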
Added a comment
diff --git a/doc/forum/Right-to-left_support/comment_2_d6fc07900fbf9e70ee20609a37264913._comment b/doc/forum/Right-to-left_support/comment_2_d6fc07900fbf9e70ee20609a37264913._comment
new file mode 100644
index 0000000..ba642cb
--- /dev/null
+++ b/doc/forum/Right-to-left_support/comment_2_d6fc07900fbf9e70ee20609a37264913._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="fr33domlover"
+ ip="46.117.109.179"
+ subject="comment 2"
+ date="2014-09-17T06:57:41Z"
+ content="""
+I couldn't figure out how to make a comment on the commandline so I made this:
+
+[[forum/PO_and_RTL_support]]
+
+The Arabic pages on your wiki seem to have the Arabic in LTR, instead of the intended
+RTL. The reason may be that the PO plugin does not generate each slave page from scratch,
+but rather uses the original page, which causes slave pages to have language 'en' and direction
+LTR. I didn't verify this yet. If you do check this, please share results here :)
+
+What I got to work so far is RTL chunks inside LTR pages. It doesn't replace the PO plugin but
+it can be used to make PO+RTL work:
+
+[Right-to-left (RTL) page text](http://ikiwiki.info/tips/Right-to-left___40__RTL__41___page_text)
+"""]]

Write forum page about RTL support of PO plugin
diff --git a/doc/forum/PO_and_RTL_support.mdwn b/doc/forum/PO_and_RTL_support.mdwn
new file mode 100644
index 0000000..849cd71
--- /dev/null
+++ b/doc/forum/PO_and_RTL_support.mdwn
@@ -0,0 +1,33 @@
+A while ago I added RTL text support to my wiki:
+
+<http://ikiwiki.info/tips/Right-to-left___40__RTL__41___page_text>
+
+But this support does not work with PO files. When I write a page in
+English, I need the Hebrew/Arabic translation to have additional text
+(in my case, using the template directive) which causes the direction of the
+text to be RTL.
+
+I saw a recent patch which claims to solve the problem by exposing the
+language code and direction to the templates (which would help a lot), but
+when I go to the original website from which it came, it looks like the Arabic
+text is still aligned LTR just like English:
+
+<http://addons.nvda-project.org/index.ar.html>
+
+Another issue is that I use Debian stable, and I'm not sure it's safe to
+use some unstable ikiwiki (currently I use the version from backports) -
+advice welcome :-)
+
+It's still important to have the ability to change direction inside the page,
+but the default direction specified either in CSS or in the page.tmpl file
+should be dynamic. I didn't check how the PO plugin works, but it may be
+necessary to update there, because if all it does is copy the HTML page and
+switch strings with translations, it must be modified to also edit the
+LTR/RTL directives so that different translations of the same page can have
+different directions.
+
+I hope I'll have some time to look into it myself; I'm just a bit behind now
+with a non-recent ikiwiki version (maybe it's time for me to try sid or from
+source).
+
+--[[fr33domlover]]

move the comments in the right place, add my comments
diff --git a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
index c81ed6a..1b2d40a 100644
--- a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
+++ b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
@@ -12,3 +12,98 @@ I have produced a patch for this issue, but beware, while it appears to fix the
 >> over on the todo page for that branch. Feel free to move my
 >> review comments for it here if you want to split the discussion. --[[smcv]]
 >> [[!tag reviewed]]
+
+Here's [[smcv]]'s review from [[todo/osm_plugin_GeoJSON_popup_patch]], annotated with my comments. --[[anarcat]]
+
+> It would be good if the commit added documentation for the new feature,
+> probably in `doc/ikiwiki/directive/osm.mdwn`.
+>
+>     +	my @layers = [ 'OSM' ];
+>
+> You mean `$layers`. `[]` is a scalar value (a reference to an array);
+> `@something` is an array.
+
+>> Or `@layers = ( 'OSM' );`. --[[anarcat]]
+
+>     +		@layers = [ split(/,/, $params{layers}) ];
+>
+> Is comma-separated the best fit here? Would whitespace, or whitespace and/or
+> commas, work better?
+
+>> Why don't we simply keep it an array as it already is? I fail to see the reason behind that change. This is the config I use right now on http://reseaulibre.ca/:
+>> 
+>> ~~~~
+>> osm_layers:
+>> - http://a.tile.stamen.com/toner/${z}/${x}/${y}.png
+>> - OSM
+>> - GoogleHybrid
+>> ~~~~
+>> 
+>> It works fine. At the very least, we should *default* to the configuration set in the .setup file, so this chunk of the patch should go:
+>> 
+>> ~~~~
+>> -        $options{'layers'} = $config{osm_layers};
+>> ~~~~
+>> 
+>> Maybe the best would be to use `$config{osm_layers};` as a default? --[[anarcat]]
+
+> It's difficult to compare without knowing what the values would look like.
+> What would be valid values? The documentation for `$config{osm_layers}`
+> says "in a syntax acceptable for OpenLayers.Layer.OSM.url parameter" so
+> perhaps:
+>
+>     # expected by current branch
+>     \[[!osm layers="OSM,WTF,OMG"]]
+>     \[[!osm layers="http://example.com/${z}/${x}/${y}.png,http://example.org/tiles/${z}/${x}/${y}.png"]]
+>     # current branch would misbehave with this syntax but it could be
+>     made to work
+>     \[[!osm layers="OSM, WTF, OMG"]]
+>     \[[!osm layers="""http://example.com/${z}/${x}/${y}.png,
+>       http://example.org/tiles/${z}/${x}/${y}.png"""]]
+>     # I would personally suggest whitespace as separator (split(' ', ...))
+>     \[[!osm layers="OSM WTF OMG"]]
+>     \[[!osm layers="""http://example.com/${z}/${x}/${y}.png
+>       http://example.org/tiles/${z}/${x}/${y}.png"""]]
+>
+> If you specify more than one layer, is it like "get tiles from OpenCycleMap
+> server A or B or C as a round-robin", or "draw OpenCycleMap and then overlay
+> county boundaries and then overlay locations of good pubs", or what?
+
+>> Multiple layers support means that the user is shown the first layer by default, but can also choose to flip to another layer. See again http://reseaulibre.ca/ for an example. --[[anarcat]]
+
+>     +		layers => @layers,
+>
+> If @layers didn't have exactly one item, this would mess up argument-parsing;
+> but it has exactly one item (a reference to an array), so it works.
+> Again, if you replace @layers with $layers throughout, that would be better.
+>
+>     -        $options{'layers'} = $config{osm_layers};
+>
+> Shouldn't the default if no `$params{layers}` are given be this, rather
+> than a hard-coded `['OSM']`?
+
+>> Agreed. --[[anarcat]]
+
+> `getsetup()` says `osm_layers` is `safe => 0`, which approximately means
+> "don't put this in the web UI, changing it could lead to a security flaw
+> or an unusable website". Is that wrong? If it is indeed unsafe, then
+> I would expect changing the same thing via \[[!osm]] parameters to be
+> unsafe too.
+
+>> I put that at `safe=>0` as a security precaution, because I didn't
+>> exactly know what that setting did.
+>> 
+>> It is unclear to me whether this could lead to a security flaw. The
+>> osm_layers parameter, in particular, simply decides which tiles get
+>> loaded in OpenLayers, but it is unclear to me if this is safe to change
+>> or not. --[[anarcat]]
+
+> I notice that `example => { 'OSM', 'GoogleSatellite' }` is wrong:
+> it should (probably) be `example => [ 'OSM', 'GoogleSatellite' ]`
+> (a list of two example values, not a map with key 'OSM' corresponding
+> to value 'GoogleSatellite'). That might be why you're having trouble
+> with this.
+
+>> That is an accurate statement.
+>>
+>> This is old code, so my memory may be cold, but I think that the "layers" parameter used to be a hash, not an array, until two years ago (commit 636e04a). The javascript code certainly expects an array right now. --[[anarcat]]

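The `@layers` versus `$layers` point in the review above is easy to trip over, so here it is spelled out as plain Perl declarations (just an illustration of the data types, nothing osm-specific):

    my @layers = ( 'OSM' );   # an array containing one string
    my $layers = [ 'OSM' ];   # a scalar holding a reference to an array
    my @oops   = [ 'OSM' ];   # an array whose single element is an array reference

    # when building an options hash, pass the reference:
    my %options = ( layers => $layers );     # layers => ['OSM']
    # or take a reference to the real array:
    # my %options = ( layers => \@layers );
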
another review
diff --git a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
index 117aefc..46129f3 100644
--- a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
+++ b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
@@ -3,71 +3,12 @@
 
 When using the GeoJSON output of the OSM plugin (osm_format: GeoJSON), the name and description in the popups are missing; this patch fixes the issue.
 
-> "Pass the layers given in the OSM directive through"
->
-> It would be good if the commit added documentation for the new feature,
-> probably in `doc/ikiwiki/directive/osm.mdwn`.
->
->     +	my @layers = [ 'OSM' ];
->
-> You mean `$layers`. `[]` is a scalar value (a reference to an array);
-> `@something` is an array.
->
->     +		@layers = [ split(/,/, $params{layers}) ];
->
-> Is comma-separated the best fit here? Would whitespace, or whitespace and/or
-> commas, work better?
->
-> It's difficult to compare without knowing what the values would look like.
-> What would be valid values? The documentation for `$config{osm_layers}`
-> says "in a syntax acceptable for OpenLayers.Layer.OSM.url parameter" so
-> perhaps:
->
->     # expected by current branch
->     \[[!osm layers="OSM,WTF,OMG"]]
->     \[[!osm layers="http://example.com/${z}/${x}/${y}.png,http://example.org/tiles/${z}/${x}/${y}.png"]]
->     # current branch would misbehave with this syntax but it could be
->     # made to work
->     \[[!osm layers="OSM, WTF, OMG"]]
->     \[[!osm layers="""http://example.com/${z}/${x}/${y}.png,
->       http://example.org/tiles/${z}/${x}/${y}.png"""]]
->     # I would personally suggest whitespace as separator (split(' ', ...))
->     \[[!osm layers="OSM WTF OMG"]]
->     \[[!osm layers="""http://example.com/${z}/${x}/${y}.png
->       http://example.org/tiles/${z}/${x}/${y}.png"""]]
->
-> If you specify more than one layer, is it like "get tiles from OpenCycleMap
-> server A or B or C as a round-robin", or "draw OpenCycleMap and then overlay
-> county boundaries and then overlay locations of good pubs", or what?
->
->     +		layers => @layers,
->
-> If @layers didn't have exactly one item, this would mess up argument-parsing;
-> but it has exactly one item (a reference to an array), so it works.
-> Again, if you replace @layers with $layers throughout, that would be better.
->
->     -        $options{'layers'} = $config{osm_layers};
->
-> Shouldn't the default if no `$params{layers}` are given be this, rather
-> than a hard-coded `['OSM']`?
->
-> `getsetup()` says `osm_layers` is `safe => 0`, which approximately means
-> "don't put this in the web UI, changing it could lead to a security flaw
-> or an unusable website". Is that wrong? If it is indeed unsafe, then
-> I would expect changing the same thing via \[[!osm]] parameters to be
-> unsafe too.
->
-> I notice that `example => { 'OSM', 'GoogleSatellite' }` is wrong:
-> it should (probably) be `example => [ 'OSM', 'GoogleSatellite' ]`
-> (a list of two example values, not a map with key 'OSM' corresponding
-> to value 'GoogleSatellite'). That might be why you're having trouble
-> with this.
->
 > "Fix the title and description of map popups"
 >
 >    +			# Rename desc to description (this matches the kml output)
 >
 > Is there a spec for this anywhere, or a parser with which it needs to be
 > compatible?
->
 > --[[smcv]] [[!tag reviewed]]
+
+>> This is especially confusing because this is actually about JSON, not KML. Disregarding that, here's the [geojson homepage](http://geojson.org/) which has a link to the spec. The spec doesn't seem to specify `description`, `desc` or `name` anywhere. --[[anarcat]]

fix path to my repo
diff --git a/doc/git.mdwn b/doc/git.mdwn
index 5ec0dda..ed02247 100644
--- a/doc/git.mdwn
+++ b/doc/git.mdwn
@@ -73,7 +73,7 @@ think about merging them. This is recommended. :-)
 * [[pelle]] `git://github.com/hemmop/ikiwiki.git`
 * [[chrismgray]] `git://github.com/chrismgray/ikiwiki.git`
 * [[ttw]] `git://github.com/ttw/ikiwiki.git`
-* [[anarcat]] `git://src.anarcat.ath.cx/ikiwiki`
+* [[anarcat]] `git://src.anarc.at/ikiwiki`
 * anderbubble `git://civilfritz.net/ikiwiki.git`
 * frioux `git://github.com/frioux/ikiwiki`
 * llipavsky `git://github.com/llipavsky/ikiwiki`

reviewed elsewhere
diff --git a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
index 42e2edb..c81ed6a 100644
--- a/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
+++ b/doc/bugs/osm_plugin_error_TypeError:_mapProjection_is_null.mdwn
@@ -6,3 +6,9 @@ Using the osm plugin with a simple \[[!osm]] directive does not seem to work, a
 I have produced a patch for this issue, but beware, while it appears to fix the problem for me, I have little understanding of perl and the existing code base.
 
 > It looks sound, but I have yet to test it. --[[anarcat]]
+
+>> I reviewed a version of this (possibly rebased or modified or something)
+>> that was in the [[todo/osm_plugin_GeoJSON_popup_patch]] branch,
+>> over on the todo page for that branch. Feel free to move my
+>> review comments for it here if you want to split the discussion. --[[smcv]]
+>> [[!tag reviewed]]

review
diff --git a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
index 8b0996f..117aefc 100644
--- a/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
+++ b/doc/todo/osm_plugin_GeoJSON_popup_patch.mdwn
@@ -3,4 +3,71 @@
 
 When using the GeoJSON output of the OSM plugin (osm_format: GeoJSON), the name and description in the popups are missing; this patch fixes the issue.
 
-
+> "Pass the layers given in the OSM directive through"
+>
+> It would be good if the commit added documentation for the new feature,
+> probably in `doc/ikiwiki/directive/osm.mdwn`.
+>
+>     +	my @layers = [ 'OSM' ];
+>
+> You mean `$layers`. `[]` is a scalar value (a reference to an array);
+> `@something` is an array.
+>
+>     +		@layers = [ split(/,/, $params{layers}) ];
+>
+> Is comma-separated the best fit here? Would whitespace, or whitespace and/or
+> commas, work better?
+>
+> It's difficult to compare without knowing what the values would look like.
+> What would be valid values? The documentation for `$config{osm_layers}`
+> says "in a syntax acceptable for OpenLayers.Layer.OSM.url parameter" so
+> perhaps:
+>
+>     # expected by current branch
+>     \[[!osm layers="OSM,WTF,OMG"]]
+>     \[[!osm layers="http://example.com/${z}/${x}/${y}.png,http://example.org/tiles/${z}/${x}/${y}.png"]]
+>     # current branch would misbehave with this syntax but it could be
+>     # made to work
+>     \[[!osm layers="OSM, WTF, OMG"]]
+>     \[[!osm layers="""http://example.com/${z}/${x}/${y}.png,
+>       http://example.org/tiles/${z}/${x}/${y}.png"""]]
+>     # I would personally suggest whitespace as separator (split(' ', ...))
+>     \[[!osm layers="OSM WTF OMG"]]
+>     \[[!osm layers="""http://example.com/${z}/${x}/${y}.png
+>       http://example.org/tiles/${z}/${x}/${y}.png"""]]
+>
+> If you specify more than one layer, is it like "get tiles from OpenCycleMap
+> server A or B or C as a round-robin", or "draw OpenCycleMap and then overlay
+> county boundaries and then overlay locations of good pubs", or what?
+>
+>     +		layers => @layers,
+>
+> If @layers didn't have exactly one item, this would mess up argument-parsing;
+> but it has exactly one item (a reference to an array), so it works.
+> Again, if you replace @layers with $layers throughout, that would be better.
+>
+>     -        $options{'layers'} = $config{osm_layers};
+>
+> Shouldn't the default if no `$params{layers}` are given be this, rather
+> than a hard-coded `['OSM']`?
+>
+> `getsetup()` says `osm_layers` is `safe => 0`, which approximately means
+> "don't put this in the web UI, changing it could lead to a security flaw
+> or an unusable website". Is that wrong? If it is indeed unsafe, then
+> I would expect changing the same thing via \[[!osm]] parameters to be
+> unsafe too.
+>
+> I notice that `example => { 'OSM', 'GoogleSatellite' }` is wrong:
+> it should (probably) be `example => [ 'OSM', 'GoogleSatellite' ]`
+> (a list of two example values, not a map with key 'OSM' corresponding
+> to value 'GoogleSatellite'). That might be why you're having trouble
+> with this.
+>
+> "Fix the title and description of map popups"
+>
+>    +			# Rename desc to description (this matches the kml output)
+>
+> Is there a spec for this anywhere, or a parser with which it needs to be
+> compatible?
+>
+> --[[smcv]] [[!tag reviewed]]

fixed
diff --git a/doc/todo/document_dependency_influences_in_code.mdwn b/doc/todo/document_dependency_influences_in_code.mdwn
index 4dfbb14..8b899cd 100644
--- a/doc/todo/document_dependency_influences_in_code.mdwn
+++ b/doc/todo/document_dependency_influences_in_code.mdwn
@@ -26,3 +26,5 @@ written this myself because I'm somewhat stuck on the subtlety of what
 >> what make `pagespec_match_list` more efficient than repeated
 >> `pagespec_match_list`." to give an idea of why it is there in the first
 >> place. --[[chrysn]]
+
+>>> [[done]] in 3.20140916 --s

fix link to the mtl mesh wiki
diff --git a/doc/plugins/osm.mdwn b/doc/plugins/osm.mdwn
index a2455a4..cfeab59 100644
--- a/doc/plugins/osm.mdwn
+++ b/doc/plugins/osm.mdwn
@@ -37,8 +37,7 @@ The plugin was originally written by
 [[the techno-viking|http://techno-viking.com/posts/ikiwiki-maps/]] and fixed up
 by [[anarcat]]. 
 
-See [[the Mtl-mesh
-wiki|http://mesh.openisp.ca/nodes/anarcat]] for a sample of what this
+See [[the Reseaulibre.ca wiki|http://reseaulibre.ca/]] for a sample of what this
 plugin can do
 
 See also [[plugins/contrib/googlemaps]].

followup after asking my provider to fix useragent blocking
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index 0de6fab..a3637a2 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -87,6 +87,11 @@ like mine will blacklist it.
 >> can't have anything but relatively luckier and unluckier choices, maybe
 >> `libwww/perl` is an especially unlucky one?
 
+>>> Yippee! _My_ provider found their offending `mod_security` rule and took it out,
+>>> so now [ikiwiki.info](/) accepts my OpenID. I'm still not sure it wouldn't be
+>>> worthwhile to change the useragent default.... -- Chap
+
+
 ## Error: OpenID failure: naive_verify_failed_network: Could not contact ID provider to verify response.
 
 Again, this could have various causes. It was helpful to bump the debug level

remove webconverger from list of git remotes
This appears to be a website run with ikiwiki, not a set of branches
to fix bugs / add features in the ikiwiki code, so having it appear
in `gitk --all` is just noise.
diff --git a/doc/git.mdwn b/doc/git.mdwn
index 55cc9c1..5ec0dda 100644
--- a/doc/git.mdwn
+++ b/doc/git.mdwn
@@ -36,7 +36,6 @@ think about merging them. This is recommended. :-)
 * [[intrigeri]] `git://gaffer.ptitcanardnoir.org/ikiwiki.git`
 * [[gmcmanus]] `git://github.com/gmcmanus/ikiwiki.git`
 * [[jelmer]] `git://git.samba.org/jelmer/ikiwiki.git`
-* [[hendry]] `git://webconverger.org/git/ikiwiki`
 * [[jon]] `git://github.com/jmtd/ikiwiki.git`
 * [[ikipostal|DavidBremner]] `git://pivot.cs.unb.ca/git/ikipostal.git`
 * [[ikimailbox|DavidBremner]] `git://pivot.cs.unb.ca/git/ikimailbox.git`

close bug
diff --git a/doc/bugs/img_test_failing_under_sbuild.mdwn b/doc/bugs/img_test_failing_under_sbuild.mdwn
index 253a166..bae6c27 100644
--- a/doc/bugs/img_test_failing_under_sbuild.mdwn
+++ b/doc/bugs/img_test_failing_under_sbuild.mdwn
@@ -23,3 +23,5 @@ I haven't been able to diagnose what else is wrong there yet.
 
 If anyone needs to release ikiwiki in a hurry, please delete that test
 and we can put it back later. --[[smcv]]
+
+> [[fixed in 3.20140916|done]] --[[smcv]]

news entry for 3.20140916
diff --git a/doc/news/version_3.20140125.mdwn b/doc/news/version_3.20140125.mdwn
deleted file mode 100644
index 3ef6ab3..0000000
--- a/doc/news/version_3.20140125.mdwn
+++ /dev/null
@@ -1,5 +0,0 @@
-ikiwiki 3.20140125 released with [[!toggle text="these changes"]]
-[[!toggleable text="""
-   * inline: Allow overriding the title of the feed. Closes: #[735123](http://bugs.debian.org/735123)
-     Thanks, Christophe Rhodes
-   * osm: Escape name parameter. Closes: #[731797](http://bugs.debian.org/731797)"""]]
\ No newline at end of file
diff --git a/doc/news/version_3.20140916.mdwn b/doc/news/version_3.20140916.mdwn
new file mode 100644
index 0000000..a2c23e0
--- /dev/null
+++ b/doc/news/version_3.20140916.mdwn
@@ -0,0 +1,33 @@
+ikiwiki 3.20140916 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+   * Don't double-decode CGI submissions with Encode.pm &gt;= 2.53,
+     fixing "Error: Cannot decode string with wide characters".
+     Thanks, [[Antoine Beaupré|anarcat]]
+   * Avoid making trails depend on everything in the wiki by giving them
+     a better way to sort the pages
+   * Don't let users post comments that won't be displayed
+   * Fix encoding of Unicode strings in Python plugins.
+     Thanks, [[chrysn]]
+   * Improve performance and correctness of the \[[!if]] directive
+   * Let \[[!inline rootpage=foo postform=no]] disable the posting form
+   * Switch default \[[!man]] shortcut to manpages.debian.org. Closes: #[700322](http://bugs.debian.org/700322)
+   * Add UUID and TIME variables to edittemplate. Closes: #[752827](http://bugs.debian.org/752827)
+     Thanks, Jonathon Anderson
+   * Display pages in linkmaps as their pagetitle (no underscore escapes).
+     Thanks, [[chrysn]]
+   * Fix aspect ratio when scaling small images, and add support for
+     converting SVG and PDF graphics to PNG.
+     Thanks, [[chrysn]]
+     - suggest ghostscript (required for PDF-to-PNG thumbnailing)
+       and libmagickcore-extra (required for SVG-to-PNG thumbnailing)
+     - build-depend on ghostscript so the test for scalable images can be run
+   * In the CGI wrapper, incorporate $config{ENV} into the environment
+     before executing Perl code, so that PERL5LIB can point to a
+     non-system-wide installation of IkiWiki.
+     Thanks, Lafayette Chamber Singers Webmaster
+   * filecheck: accept MIME types not containing ';'
+   * autoindex: index files in underlays if the resulting pages aren't
+     going to be committed. Closes: #[611068](http://bugs.debian.org/611068)
+   * Add \[[!templatebody]] directive so template pages don't have to be
+     simultaneously a valid template and valid HTML
+   * Add [[smcv]] to Uploaders and release to Debian"""]]

bug report
diff --git a/doc/bugs/img_test_failing_under_sbuild.mdwn b/doc/bugs/img_test_failing_under_sbuild.mdwn
new file mode 100644
index 0000000..253a166
--- /dev/null
+++ b/doc/bugs/img_test_failing_under_sbuild.mdwn
@@ -0,0 +1,25 @@
+The new regression test from [[plugins/img]] fails when I try to build
+ikiwiki in sbuild for a release (with commits from my `img-test` branch
+also included, in an attempt to fix this):
+
+    Use of uninitialized value in numeric eq (==) at IkiWiki/Plugin/img.pm line 93.
+    Use of uninitialized value in numeric lt (<) at IkiWiki/Plugin/img.pm line 110.
+    Use of uninitialized value in numeric eq (==) at IkiWiki/Plugin/img.pm line 93.
+    Use of uninitialized value in numeric lt (<) at IkiWiki/Plugin/img.pm line 110.
+    
+    #   Failed test at t/img.t line 78.
+    #          got: 'no image'
+    #     expected: '12x12'
+    
+    #   Failed test at t/img.t line 79.
+    #          got: 'no image'
+    #     expected: '16x2'
+    # Looks like you failed 2 tests of 18.
+    t/img.t ...................... 
+    Dubious, test returned 2 (wstat 512, 0x200)
+    Failed 2/18 subtests 
+
+I haven't been able to diagnose what else is wrong there yet.
+
+If anyone needs to release ikiwiki in a hurry, please delete that test
+and we can put it back later. --[[smcv]]

email verification is a separate issue, can we please fix the bug here?
diff --git a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
index 011966b..48b046b 100644
--- a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
+++ b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
@@ -103,3 +103,9 @@ Any other ideas? --[[anarcat]]
 >>>>> OpenID (it's not in the userdb, right?) and suggesting a stop on the preferences page, where if the provider
 >>>>> did supply an e-mail address, it could be already filled in as default (maybe still unverified if we don't want
 >>>>> to assume the provider did that). -- Chap
+
+>>>>>> So yay, I want a pony too, a.k.a. I agree that email verification would be nice.
+>>>>>>
+>>>>>> But the problem is that this is a separate feature request, which should be filed as a
+>>>>>> separate [[wishlist]] item. What I am describing above is an actual *bug* that should be fixed regardless of
+>>>>>> the color you want that pony to be. :p -- [[anarcat]]

add gables and turrets to bikeshed
diff --git a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
index c4542c8..011966b 100644
--- a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
+++ b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
@@ -95,3 +95,11 @@ Any other ideas? --[[anarcat]]
 >>>> this doesn't seem to be a very big security issue that would merit implementing a new verification mechanism, especially since we don't verify email addresses on accounts right now. what we could do however is allow password authentication on openid accounts, and allow those users to actually change settings like their email addresses. however, I don't think this should be blocking that functionality right now. --[[anarcat]]
 >>>>
 >>>> besides, the patch I am proposing doesn't make the vulnerability worse at all, it exists right now without the patch. my patch only allows users that **don't** have an email set (likely because their openid provider is more discreet) to set one... --[[anarcat]]
+
+>>>>> Maybe this is too much paint for one bikeshed, but I guess the email-verification idea seems worthwhile to me
+>>>>> and not terribly hard to implement (though I'm not stepping forward at the moment) ... store it with a flag
+>>>>> saying whether it's verified, send a magic cookie to it, let the user supply the cookie to toggle the flag.
+>>>>> I could also see leaving the email field hidden for OpenID login, but perhaps detecting the first use of a new
+>>>>> OpenID (it's not in the userdb, right?) and suggesting a stop on the preferences page, where if the provider
+>>>>> did supply an e-mail address, it could be already filled in as default (maybe still unverified if we don't want
+>>>>> to assume the provider did that). -- Chap

Revert "Use templatebody for the templates in the basewiki and docwiki"
This reverts commit 236c46a3f7e5e62296484dc47b4882f7f4327a06.
We can't apply this bit until the ikiwiki on ikiwiki.info
(i.e. Branchable) supports [[!templatebody]].
diff --git a/doc/templates/gitbranch.mdwn b/doc/templates/gitbranch.mdwn
index 4ea73c9..853da92 100644
--- a/doc/templates/gitbranch.mdwn
+++ b/doc/templates/gitbranch.mdwn
@@ -1,11 +1,9 @@
-[[!templatebody <<ENDBODY
 <div class="infobox">
 Available in a [[!taglink /git]] repository [[!taglink branch|/branches]].<br />
 Branch: <TMPL_IF browse><a href="<TMPL_VAR browse>"></TMPL_IF><TMPL_VAR branch><TMPL_IF browse></a></TMPL_IF><br />
 <TMPL_IF author>Author: <TMPL_VAR author><br /></TMPL_IF>
 </div>
-ENDBODY]]
-
+<TMPL_UNLESS branch>
 This template is used to create an infobox for a git branch. It uses
 these parameters:
 
@@ -15,3 +13,4 @@ these parameters:
   (e.g. github/master)</li>
 <li>author - the author of the branch</li>
 </ul>
+</TMPL_UNLESS>
diff --git a/doc/templates/links.mdwn b/doc/templates/links.mdwn
index 3239a59..4bd1a85 100644
--- a/doc/templates/links.mdwn
+++ b/doc/templates/links.mdwn
@@ -1,4 +1,3 @@
-[[!templatebody <<ENDBODY
 <div class="infobox">
 [[ikiwiki_logo|logo/ikiwiki.png]]  
 <ul>
@@ -15,6 +14,3 @@
 <img src="https://api.flattr.com/button/flattr-badge-large.png"
 alt="Flattr this" title="Flattr this" /></a>
 </div>
-ENDBODY]]
-
-This template contains the navigation links used on the front page.
diff --git a/doc/templates/note.mdwn b/doc/templates/note.mdwn
index 8de7374..9ef5ad9 100644
--- a/doc/templates/note.mdwn
+++ b/doc/templates/note.mdwn
@@ -1,12 +1,11 @@
-[[!templatebody <<ENDBODY
 <div class="notebox">
 <TMPL_VAR text>
 </div>
-ENDBODY]]
-
+<TMPL_UNLESS text>
 Use this template to insert a note into a page. The note will be styled to
 float to the right of other text on the page. This template has one
 parameter:
 <ul>
 <li>`text` - the text to display in the note
 </ul>
+</TMPL_UNLESS>
diff --git a/doc/templates/plugin.mdwn b/doc/templates/plugin.mdwn
index d36dd5f..322c494 100644
--- a/doc/templates/plugin.mdwn
+++ b/doc/templates/plugin.mdwn
@@ -1,4 +1,3 @@
-[[!templatebody <<ENDBODY
 <span class="infobox">
 Plugin: <TMPL_VAR name><br />
 Author: <TMPL_VAR author><br />
@@ -9,8 +8,7 @@ Currently enabled: [[!if test="enabled(<TMPL_VAR name>)" then="yes" else="no"]]<
 </span>
 [[!if test="sourcepage(plugins/contrib/*)" then="""[[!meta title="<TMPL_VAR name> (third party plugin)"]]"""]]
 <TMPL_IF core>[[!tag plugins/type/core]]</TMPL_IF>
-ENDBODY]]
-
+<TMPL_UNLESS name>
 This template is used to create an infobox for an ikiwiki plugin. It uses
 these parameters:
 <ul>
@@ -18,3 +16,4 @@ these parameters:
 <li>author - the author of the plugin
 <li>core - set to a true value if the plugin is enabled by default
 </ul>
+</TMPL_UNLESS>
diff --git a/doc/templates/popup.mdwn b/doc/templates/popup.mdwn
index b721a95..92455eb 100644
--- a/doc/templates/popup.mdwn
+++ b/doc/templates/popup.mdwn
@@ -1,3 +1,4 @@
+<TMPL_UNLESS mouseover>
 Use this template to create a popup window that is displayed when the mouse
 is over part of the page. This template has two parameters:
 <ul>
@@ -9,9 +10,7 @@ large for good usability.
 </ul>
 Note that browsers that do not support the CSS will display the popup
 inline in the page, inside square brackets.
-
-[[templatebody <<ENDBODY
+</TMPL_UNLESS>
 <span class="popup"><TMPL_VAR mouseover>
 <span class="paren">[</span><span class="balloon"><TMPL_VAR popup></span><span class="paren">]</span>
 </span>
-ENDBODY]]

changelog, close bugs
diff --git a/debian/changelog b/debian/changelog
index 4356a7a..ff1e73b 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -25,6 +25,8 @@ ikiwiki (3.20140912) UNRELEASED; urgency=medium
   * filecheck: accept MIME types not containing ';'
   * autoindex: index files in underlays if the resulting pages aren't
     going to be committed. Closes: #611068
+  * Add [[!templatebody]] directive so template pages don't have to be
+    simultaneously a valid template and valid HTML
 
  -- Simon McVittie <smcv@debian.org>  Fri, 12 Sep 2014 21:23:58 +0100
 
diff --git a/doc/bugs/pages_under_templates_are_invalid.mdwn b/doc/bugs/pages_under_templates_are_invalid.mdwn
index c031543..20d711f 100644
--- a/doc/bugs/pages_under_templates_are_invalid.mdwn
+++ b/doc/bugs/pages_under_templates_are_invalid.mdwn
@@ -17,3 +17,5 @@ Maybe just encode all &lt; and &gt; when compiling pages within the templates fol
 
 >> My `templatebody` branch on [[template creation error]] fixes this.
 >> --[[smcv]]
+
+>>> [[Merged|done]] --[[smcv]]
diff --git a/doc/bugs/template_creation_error.mdwn b/doc/bugs/template_creation_error.mdwn
index d1fb788..33a863e 100644
--- a/doc/bugs/template_creation_error.mdwn
+++ b/doc/bugs/template_creation_error.mdwn
@@ -268,3 +268,5 @@ same logic as IkiWiki itself. I don't think that's serious. --[[smcv]]
 >>>> advocate), that should likewise warn if `add_link` actually adds a link in
 >>>> the render phase.  such a warning would have helped spotting the
 >>>> link-related [[template evaluation oddities]] earlier. --[[chrysn]]
+
+>>>>> [[Merged|done]] --[[smcv]]

Merge branch 'ready/templatebody'
changelog, close bug
diff --git a/debian/changelog b/debian/changelog
index 07b4339..4356a7a 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -23,6 +23,8 @@ ikiwiki (3.20140912) UNRELEASED; urgency=medium
     non-system-wide installation of IkiWiki.
     Thanks, Lafayette Chamber Singers Webmaster
   * filecheck: accept MIME types not containing ';'
+  * autoindex: index files in underlays if the resulting pages aren't
+    going to be committed. Closes: #611068
 
  -- Simon McVittie <smcv@debian.org>  Fri, 12 Sep 2014 21:23:58 +0100
 
diff --git a/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
index 0673aa6..a52b31c 100644
--- a/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
+++ b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
@@ -72,3 +72,5 @@ Shouldn't `ikiwiki-tag-test/raw/.ikiwiki/transient/tag.mdwn` and `ikiwiki-tag-te
 >>>> and *something* when there is ambiguity is ok for now; especially, it's
 >>>> not up to the autoindex branch to come up with a solution to the general
 >>>> problem. --[[chrysn]]
+
+>>>>> [[Merged|done]] --[[smcv]]

poll vote (Accept both)
diff --git a/doc/news/openid.mdwn b/doc/news/openid.mdwn
index c158ec3..03fca55 100644
--- a/doc/news/openid.mdwn
+++ b/doc/news/openid.mdwn
@@ -10,4 +10,4 @@ log back in, try out the OpenID signup process if you don't already have an
 OpenID, and see how OpenID works for you. And let me know your feelings about
 making such a switch. --[[Joey]]
 
-[[!poll 76 "Accept only OpenID for logins" 21 "Accept only password logins" 49 "Accept both"]]
+[[!poll 76 "Accept only OpenID for logins" 21 "Accept only password logins" 50 "Accept both"]]

this patch doesn't make the situation worse, actually
diff --git a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
index 91aeda4..c4542c8 100644
--- a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
+++ b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
@@ -93,3 +93,5 @@ Any other ideas? --[[anarcat]]
 >>>> hmm... true, that is a problem, especially for hostile wikis. but then any hostile site could send you such garbage - they would be spammers then. otherwise, you could ask the site manager to disable that account...
 >>>>
 >>>> this doesn't seem to be a very big security issue that would merit implementing a new verification mechanism, especially since we don't verify email addresses on accounts right now. what we could do however is allow password authentication on openid accounts, and allow those users to actually change settings like their email addresses. however, I don't think this should be blocking that functionality right now. --[[anarcat]]
+>>>>
+>>>> besides, the patch I am proposing doesn't make the vulnerability worse at all, it exists right now without the patch. my patch only allows users that **don't** have an email set (likely because their openid provider is more discreet) to set one... --[[anarcat]]

first answer
diff --git a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
index dd50166..91aeda4 100644
--- a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
+++ b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
@@ -89,3 +89,7 @@ Any other ideas? --[[anarcat]]
 >>> willing to send notifications to a verified address?
 >>>
 >>> --[[smcv]]
+>>>
+>>>> hmm... true, that is a problem, especially for hostile wikis. but then any hostile site could send you such garbage - they would be spammers then. otherwise, you could ask the site manager to disable that account...
+>>>>
+>>>> this doesn't seem to be a very big security issue that would merit implementing a new verification mechanism, especially since we don't verify email addresses on accounts right now. what we could do however is allow password authentication on openid accounts, and allow those users to actually change settings like their email addresses. however, I don't think this should be blocking that functionality right now. --[[anarcat]]

changelog, close bug
diff --git a/debian/changelog b/debian/changelog
index b50b515..07b4339 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -22,6 +22,7 @@ ikiwiki (3.20140912) UNRELEASED; urgency=medium
     before executing Perl code, so that PERL5LIB can point to a
     non-system-wide installation of IkiWiki.
     Thanks, Lafayette Chamber Singers Webmaster
+  * filecheck: accept MIME types not containing ';'
 
  -- Simon McVittie <smcv@debian.org>  Fri, 12 Sep 2014 21:23:58 +0100
 
diff --git a/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn b/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
index 627b2c8..e179f09 100644
--- a/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
+++ b/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
@@ -89,3 +89,7 @@ Weird... --[[anarcat]]
 > > > > > > I've turned the version I suggested above into a proper branch.
 > > > > > > Review by someone who can commit to ikiwiki.git would be appreciated.
 > > > > > > --[[smcv]]
+
+> > > > > > > Turns out "someone who can commit" includes me.
+> > > > > > > [[Merged|done]] this version, we can revert or alter it if
+> > > > > > > Joey remembers a reason to require `;` --[[smcv]]

fine-tuning of halfheartedness
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index 20e7a90..0de6fab 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -78,7 +78,7 @@ like mine will blacklist it.
 >> One reason they still have my business is that their customer service has
 >> been notably good; I always get a response from a human on the first try,
 >> and on the first or second try from a human who understands what I'm saying
->> and is able to fix it. I've dealt with organizations not like that....
+>> and is able to fix it, with a few exceptions over the years. I've dealt with organizations not like that....
 >>
 >> But I included the note here because I'm sure if _they're_ doing it, there's
 >> probably some nonzero number of other hosting providers where it's also

responses in halfhearted defense of provider in questions
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index c80d645..20e7a90 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -74,6 +74,19 @@ like mine will blacklist it.
 > but malicious script authors will have no such qualms, so I would
 > argue that your provider's strategy is already doomed... --[[smcv]]
 
+>> I agree, and I'll ask them to fix it (and probably refer them to this page).
+>> One reason they still have my business is that their customer service has
+>> been notably good; I always get a response from a human on the first try,
+>> and on the first or second try from a human who understands what I'm saying
+>> and is able to fix it. I've dealt with organizations not like that....
+>>
+>> But I included the note here because I'm sure if _they're_ doing it, there's
+>> probably some nonzero number of other hosting providers where it's also
+>> happening, so a person setting up OpenID and being baffled by this failure
+>> needs to know to check for it. Also, while the world of user-agent strings
+>> can't have anything but relatively luckier and unluckier choices, maybe
+>> `libwww/perl` is an especially unlucky one?
+
 ## Error: OpenID failure: naive_verify_failed_network: Could not contact ID provider to verify response.
 
 Again, this could have various causes. It was helpful to bump the debug level
@@ -165,6 +178,12 @@ Then a recent `Net::SSLeay` perl module needs to be built and linked against it.
 > but equally it might be as bad as it seems at first glance.
 > "Let the buyer beware", I think... --[[smcv]]
 
+>> As far as I can tell, this particular provider _is_ on Red Hat (EL 5).
+>> I can't conclusively tell because I'm in what appears to be a CloudLinux container when I'm logged in,
+>> and certain parts of the environment (like `rpm`) I can't see. But everything
+>> I _can_ see is like several RHEL5 boxen I know and love.
+
+
 ### Local OpenSSL installation will need certs to trust
 
 Bear in mind that the OpenSSL distribution doesn't come with a collection

respond
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
index c59f734..c80d645 100644
--- a/doc/plugins/openid/troubleshooting.mdwn
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -56,6 +56,24 @@ unlikely-to-be-blacklisted value is; if there is one, it's probably the
 next one all the rude bots will be using anyway, and some goofy provider
 like mine will blacklist it.
 
+> If your shared hosting provider is going to randomly break functionality,
+> I would suggest "voting with your wallet" and taking your business to
+> one that does not.
+>
+> In principle we could set the default UA (if `$config{useragent}` is
+> unspecified) to `IkiWiki/3.20140915`, or `IkiWiki/3.20140915 libwww-perl/6.03`
+> (which would be the "most correct" option AIUI), or some such.
+> That might work, or might get randomly blacklisted too, depending on the
+> whims of shared hosting providers. If you can't trust your provider to
+> behave helpfully then there isn't much we can do about it.
+>
+> Blocking requests according to UA seems fundamentally flawed, since
+> I'm fairly sure no hosting provider can afford to blacklist UAs that
+> claim to be, for instance, Firefox or Chrome. I wouldn't want
+> to patch IkiWiki to claim to be an interactive browser by default,
+> but malicious script authors will have no such qualms, so I would
+> argue that your provider's strategy is already doomed... --[[smcv]]
+
 ## Error: OpenID failure: naive_verify_failed_network: Could not contact ID provider to verify response.
 
 Again, this could have various causes. It was helpful to bump the debug level
@@ -103,6 +121,10 @@ Unfortunately, there isn't a release in CPAN yet that includes those two
 commits, but they are only a few lines to edit into your own locally-installed
 module.
 
+> To be clear, these are patches to [[!cpan LWPx::ParanoidAgent]].
+> Debian's `liblwpx-paranoidagent-perl (>= 1.10-3)` appears to
+> have those two patches. --[[smcv]]
+
 ## Still naive_verify_failed_network, new improved reason
 
     500 Can't connect to indieauth.com:443 (SSL connect attempt failed
@@ -136,6 +158,13 @@ not be used by `IO::Socket::SSL` unless it is
 
 Then a recent `Net::SSLeay` perl module needs to be built and linked against it.
 
+> I would tend to be somewhat concerned about the update status and security
+> of a shared hosting platform that is still on an OpenSSL major version from
+> pre-2010 - it might be fine, because it might be RHEL or some similarly
+> change-averse distribution backporting security fixes to ye olde branch,
+> but equally it might be as bad as it seems at first glance.
+> "Let the buyer beware", I think... --[[smcv]]
+
 ### Local OpenSSL installation will need certs to trust
 
 Bear in mind that the OpenSSL distribution doesn't come with a collection
@@ -164,6 +193,9 @@ That was fixed in `LWPx::ParanoidAgent` with
 which needs to be backported by hand if it hasn't made it into a CPAN release
 yet.
 
+> Also in Debian's `liblwpx-paranoidagent-perl (>= 1.10-3)`, for the record.
+> --[[smcv]]
+
 Only that still doesn't end the story, because that hand didn't know what
 [this hand](https://github.com/noxxi/p5-io-socket-ssl/commit/4f83a3cd85458bd2141f0a9f22f787174d51d587#diff-1)
 was doing. What good is passing the name in
@@ -187,6 +219,11 @@ server name for SNI:
 
 ... not submitted upstream yet, so needs to be applied by hand.
 
+> I've [reported this to Debian](https://bugs.debian.org/761635)
+> (which is where ikiwiki.info's supporting packages come from).
+> Please report it upstream too, if the Debian maintainer doesn't
+> get there first. --[[smcv]]
+
 # Success!!
 
 And with that, ladies and gents, I got my first successful OpenID login!
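
As an aside on the `useragent` discussion above: a minimal, standalone sketch
(mine, not ikiwiki's own code) of how an overridden agent string ends up on an
LWP request, assuming stock LWP::UserAgent, network access, and the
hypothetical default string quoted in the reply:

    #!/usr/bin/perl
    # Sketch only: set a custom User-Agent the way a "useragent" setting would,
    # then show the header that was actually sent and the response status.
    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua = LWP::UserAgent->new(
        agent => 'IkiWiki/3.20140915 libwww-perl/6.03',  # hypothetical default
    );
    my $res = $ua->get('https://indieauth.com/');
    print $res->request->header('User-Agent'), "\n";
    print $res->status_line, "\n";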

write changelog, close bug
diff --git a/debian/changelog b/debian/changelog
index 611e518..b50b515 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -18,6 +18,10 @@ ikiwiki (3.20140912) UNRELEASED; urgency=medium
   * Fix aspect ratio when scaling small images, and add support for
     converting SVG and PDF graphics to PNG.
     Thanks, chrysn
+  * In the CGI wrapper, incorporate $config{ENV} into the environment
+    before executing Perl code, so that PERL5LIB can point to a
+    non-system-wide installation of IkiWiki.
+    Thanks, Lafayette Chamber Singers Webmaster
 
  -- Simon McVittie <smcv@debian.org>  Fri, 12 Sep 2014 21:23:58 +0100
 
diff --git a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
index 59ca754..140b487 100644
--- a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
+++ b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
@@ -55,6 +55,8 @@ As I am not sure that remembering `PERL5LIB` is a good idea, I think that a pret
 
 Happy to make the escaping change, thanks for the sharp eye.
 
+> [[Merged|done]] with that change. --[[smcv]]
+
 My thinking on `delete` is once it's handled, it's handled. The C code
 is going to put this straight into the real environment and then do
 a simple `exec` ... is there any way this hasn't been handled?

made 'this change' link a comparison of the branch rather than a specific commit.
diff --git a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
index b94b79c..59ca754 100644
--- a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
+++ b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
@@ -28,7 +28,7 @@ As I am not sure that remembering `PERL5LIB` is a good idea, I think that a pret
 -- Bruno
 
 **Update:** I had not seen this bug earlier, but I ran into the same issue and made a more general solution. You can already add stuff to `%config{ENV}` in the setup file, but it was being processed too late for `PERL5LIB` to do any good.
-[This change](https://github.com/jcflack/ikiwiki/commit/bc4721da0441a30822225c51b250be4cc5f8af24) moves the `%config{ENV}` handling earlier in the wrapper, so anything specified there is placed back in the actual environment before Perl gets control. Problem solved!
+[This change](https://github.com/jcflack/ikiwiki/compare/early-env) moves the `%config{ENV}` handling earlier in the wrapper, so anything specified there is placed back in the actual environment before Perl gets control. Problem solved!
 
 -- Chap
 

point taken
diff --git a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
index a60fe13..b94b79c 100644
--- a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
+++ b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
@@ -63,3 +63,7 @@ It just takes up space twice in the generated wrapper otherwise.
 Admittedly it's not much space, but seems to be even less point ... ?
 
 -- Chap
+
+> That makes sense, as long as nothing else is going to read
+> `$config{ENV}` for purposes other than copying it into the actual
+> environment. --[[smcv]]

diff --git a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
index 9da2515..a60fe13 100644
--- a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
+++ b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
@@ -52,3 +52,14 @@ As I am not sure that remembering `PERL5LIB` is a good idea, I think that a pret
 > in having it in the storable too?
 >
 > --[[smcv]]
+
+Happy to make the escaping change, thanks for the sharp eye.
+
+My thinking on `delete` is once it's handled, it's handled. The C code
+is going to put this straight into the real environment and then do
+a simple `exec` ... is there any way this hasn't been handled?
+
+It just takes up space twice in the generated wrapper otherwise.
+Admittedly it's not much space, but seems to be even less point ... ?
+
+-- Chap

diff --git a/doc/news/openid/discussion.mdwn b/doc/news/openid/discussion.mdwn
index d8c83f0..e1a7ef0 100644
--- a/doc/news/openid/discussion.mdwn
+++ b/doc/news/openid/discussion.mdwn
@@ -121,3 +121,8 @@ I'm worried, at least until the issue is cleared.
 
 This poll is now 8 years old. Do we have enough data to make a decision?
 Can we consider adding `open=no` to the poll? -- [[Jon]]
+
+----
+
+I vote against disabling password logins until my OpenID will work on [ikiwiki.info](/)!
+See [[/plugins/openid/troubleshooting]]. -- Chap

rename plugins/plugins/openid/troubleshooting.mdwn to plugins/openid/troubleshooting.mdwn
diff --git a/doc/plugins/openid/troubleshooting.mdwn b/doc/plugins/openid/troubleshooting.mdwn
new file mode 100644
index 0000000..c59f734
--- /dev/null
+++ b/doc/plugins/openid/troubleshooting.mdwn
@@ -0,0 +1,197 @@
+**TL;DR**
+
+[[!toc levels=3]]
+
+# An odyssey through lots of things that have to be right before OpenID works
+
+Having just (at last) made an ikiwiki installation accept my
+OpenID, I have learned many of the things that may have to be checked
+when getting the [[plugins/openid]] plugin to work. (These are probably
+the reasons why [ikiwiki.info](/) itself won't accept my OpenID!)
+
+Just to describe my OpenID setup a bit (and why it makes a good stress-test
+for the OpenID plugin :).
+
+I'm using my personal home page URL as my OpenID. My page lives at
+a shared-hosting service I have hired. It contains links that delegate
+my OpenID processing to [indieauth.com](https://indieauth.com).
+
+IndieAuth, in turn, uses
+[rel-me authentication](http://microformats.org/wiki/RelMeAuth) to find
+an [OAuth](http://microformats.org/wiki/OAuth) provider that can authenticate
+me. (At present, I am using [github](http://github.com) for that, which
+is an OAuth provider but not an OpenID provider, so the gatewaying provided
+by IndieAuth solves that problem.) As far as ikiwiki is concerned,
+IndieAuth is my OpenID provider; the details beyond that are transparent.
+
+So, what were the various issues I had to sort out before my first successful
+login with the [[plugins/openid]] plugin?
+
+## no_identity_server: Could not determine ID provider from URL.
+
+This is the message [ikiwiki.info](/) shows as soon as I enter my home URL
+as an OpenID. It is also the first one I got on my own ikiwiki installation.
+
+### various possible causes ...
+
+There could be lots of causes. Maybe:
+
+* the offered OpenID is an `https:` URL and there is an issue in checking
+    the certificate, so the page can't be retrieved?
+* the page can be retrieved, but it isn't well-formed HTML and the library
+    can't parse it for the needed OpenID links?
+* ...?
+
+### make a luckier setting of useragent ?!
+
+In my case, it was none of the above. It turns out my shared-hosting provider
+has a rule that refuses requests with `User-Agent: libwww-perl/6.03` (!!).
+This is the sort of problem that's really hard to anticipate or plan around.
+I could fix it (_for this case!_) by changing `useragent:` in `ikiwiki.setup`
+to a different string that my goofy provider lets through.
+
+__Recommendation:__ set `useragent:` in `ikiwiki.setup` to some
+unlikely-to-be-blacklisted value. I can't guess what the best
+unlikely-to-be-blacklisted value is; if there is one, it's probably the
+next one all the rude bots will be using anyway, and some goofy provider
+like mine will blacklist it.
+
+## Error: OpenID failure: naive_verify_failed_network: Could not contact ID provider to verify response.
+
+Again, this could have various causes. It was helpful to bump the debug level
+and get some logging, to see:
+
+    500 Can't connect to indieauth.com:443 (Net::SSL from Crypt-SSLeay can't
+    verify hostnames; either install IO::Socket::SSL or turn off verification
+    by setting the PERL_LWP_SSL_VERIFY_HOSTNAME environment variable to 0)
+
+I don't belong to the camp that solves every verification problem by turning
+verification off, so this meant finding out how to get verification to be done.
+It turns out there are two different Perl modules that can be used for SSL:
+
+* `IO::Socket::SSL` (verifies hostnames)
+* `Net::SSL` (_does not_ verify hostnames)
+
+Both were installed on my hosted server. How was Perl deciding which one
+to use?
+
+### set `PERL_NET_HTTPS_SSL_SOCKET_CLASS` appropriately
+
+It turns out
+[there's an environment variable](https://rt.cpan.org/Public/Bug/Display.html?id=71599).
+So just set `PERL_NET_HTTPS_SSL_SOCKET_CLASS` to `IO::Socket::SSL` and the
+right module gets used, right?
+
+[Wrong](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/fed6f7d7df8619df0754e8883cfad2ac15703a38#diff-2).
+That change was made to `ParanoidAgent.pm` back in November 2013 because of an
+unrelated [bug](https://github.com/csirtgadgets/LWPx-ParanoidAgent/issues/4)
+in `IO::Socket::SSL`. Essentially, _hmm, something goes wrong in
+`IO::Socket::SSL` when reading certain large documents, so we'll fix it by
+forcing the use of `Net::SSL` instead (the one that never verifies hostnames!),
+no matter what the admin has set `PERL_NET_HTTPS_SSL_SOCKET_CLASS` to!_
+
+### undo change that broke `PERL_NET_HTTPS_SSL_SOCKET_CLASS`
+
+Plenty of [comments](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=738493)
+quickly appeared about how good an idea that wasn't, and it was corrected in
+June 2014 with [one commit](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/a92ed8f45834a6167ff62d3e7330bb066b307a35)
+to fix the original reading-long-documents issue in `IO::Socket::SSL` and
+[another commit](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/815c691ad5554a219769a90ca5f4001ae22a4019)
+that reverts the forcing of `Net::SSL` no matter how the environment is set.
+
+Unfortunately, there isn't a release in CPAN yet that includes those two
+commits, but they are only a few lines to edit into your own locally-installed
+module.
+
+## Still naive_verify_failed_network, new improved reason
+
+    500 Can't connect to indieauth.com:443 (SSL connect attempt failed
+    with unknown error error:14090086:SSL
+    routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed)
+
+Yay, at least it's trying to verify! Now why can't it verify IndieAuth's
+certificate?
+
+[Here's why](https://tools.ietf.org/html/rfc6066#section-3). As it turns out,
+[indieauth.com](https://indieauth.com/) is itself a virtual host on a shared
+server. If you naively try
+
+    openssl s_client -connect indieauth.com:443
+
+you get back a certificate for [indieweb.org](https://indieweb.org/)
+instead, so the hostname won't verify. If you explicitly indicate what server
+name you're connecting to:
+
+    openssl s_client -connect indieauth.com:443 -servername indieauth.com
+
+then, magically, the correct certificate comes back.
+
+### ensure `OpenSSL`, `Net::SSLeay`, `IO::Socket::SSL` new enough for SNI
+
+If your `openssl` doesn't recognize the `-servername` option, it is too old
+to do SNI, and a newer version needs to be built and installed. In fact,
+even though SNI support was reportedly backported into OpenSSL 0.9.8f, it will
+not be used by `IO::Socket::SSL` unless it is
+[1.0 or higher](http://search.cpan.org/~sullr/IO-Socket-SSL-1.998/lib/IO/Socket/SSL.pod#SNI_Support).
+
+Then a recent `Net::SSLeay` perl module needs to be built and linked against it.
+
+### Local OpenSSL installation will need certs to trust
+
+Bear in mind that the OpenSSL distribution doesn't come with a collection
+of trusted issuer certs. If a newer version is built and installed locally
+(say, on a shared server where the system locations can't be written), it will
+need to be given a directory of trusted issuer certs, say by linking to the
+system-provided ones. However, a change to the certificate hash algorithm used
+for the symlinks in that directory was [reportedly](http://www.cilogon.org/openssl1)
+made with OpenSSL 1.0.0. So if the system-provided trusted certificate directory
+was set up for an earlier OpenSSL version, all the certificates in it will be
+fine but the hash symlinks will be wrong. That can be fixed by linking only the
+named certificate files from the system directory into the newly-installed one,
+and then running the new version of `c_rehash` there.
+
+## Still certificate verify failed
+
+Using [SNI](https://tools.ietf.org/html/rfc6066#section-3)-supporting versions
+of `IO::Socket::SSL`, `Net::SSLeay`, and `OpenSSL` doesn't do any good if an
+upper layer hasn't passed down the name of the host being connected to so the
+SSL layer can SNI for it.
+
+### ensure that `LWPx::ParanoidAgent` passes server name to SSL layer for SNI
+
+That was fixed in `LWPx::ParanoidAgent` with
+[this commit](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/df6df19ccdeeb717c709cccb011af35d3713f546),
+which needs to be backported by hand if it hasn't made it into a CPAN release
+yet.
+
+Only that still doesn't end the story, because that hand didn't know what
+[this hand](https://github.com/noxxi/p5-io-socket-ssl/commit/4f83a3cd85458bd2141f0a9f22f787174d51d587#diff-1)
+was doing. What good is passing the name in
+`PeerHost` if the SSL code looks in `PeerAddr` first ... and then, if that
+doesn't match a regex for a hostname, decides you didn't supply one at all,
+without even looking at `PeerHost`?
+
+Happily, it is possible to assign a key that _explicitly_ supplies the
+server name for SNI:
+
+    --- LWPx/Protocol/http_paranoid.pm    2014-09-08 03:33:00.000000000 -0400
+    +++ LWPx/Protocol/http_paranoid.pm    2014-09-08 03:33:27.000000000 -0400
+    @@ -73,6 +73,7 @@
+            close($el);
+             $sock = $self->socket_class->new(PeerAddr => $addr,
+                                              PeerHost => $host,
+    +                                         SSL_hostname => $host,
+                                              PeerPort => $port,
+                                              Proto    => 'tcp',
+                                              Timeout  => $conn_timeout,
+
+... not submitted upstream yet, so needs to be applied by hand.
+
+# Success!!
+
+And with that, ladies and gents, I got my first successful OpenID login!
+I'm pretty sure that if the same fixes can be applied to
+[ikiwiki.info](/) itself, a wider range of OpenID logins (like mine, for

(Diff truncated)
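
A side note on the SNI checks described in the page above: the
`openssl s_client -servername` test can also be done from Perl. This is a
sketch of mine (not taken from the page), assuming IO::Socket::SSL 1.x or
newer, a matching Net::SSLeay/OpenSSL, and system CA certificates able to
validate indieauth.com:

    #!/usr/bin/perl
    # Sketch: connect with an explicit SNI name and verify the certificate,
    # mirroring "openssl s_client -connect indieauth.com:443 -servername indieauth.com".
    use strict;
    use warnings;
    use IO::Socket::SSL;    # exports SSL_VERIFY_PEER and $SSL_ERROR by default

    my $sock = IO::Socket::SSL->new(
        PeerHost        => 'indieauth.com',
        PeerPort        => 443,
        SSL_hostname    => 'indieauth.com',   # explicit SNI name, as in the patch above
        SSL_verify_mode => SSL_VERIFY_PEER,
    ) or die "connect/verify failed: $SSL_ERROR\n";

    print "verified, certificate CN: ", $sock->peer_certificate('commonName'), "\n";
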
diff --git a/doc/plugins/openid.mdwn b/doc/plugins/openid.mdwn
index d56d1a3..82c23fc 100644
--- a/doc/plugins/openid.mdwn
+++ b/doc/plugins/openid.mdwn
@@ -30,3 +30,8 @@ certain setups.
   to be used when doing openid authentication. The `openid_cgiurl` must
   point to an ikiwiki [[CGI]], and it will need to match the `openid_realm`
   to work.
+
+## troubleshooting
+
+See [[plugins/openid/troubleshooting]] for a number of issues that may
+need to be addressed when setting up ikiwiki to accept OpenID logins reliably.

diff --git a/doc/plugins/plugins/openid/troubleshooting.mdwn b/doc/plugins/plugins/openid/troubleshooting.mdwn
new file mode 100644
index 0000000..c59f734
--- /dev/null
+++ b/doc/plugins/plugins/openid/troubleshooting.mdwn
@@ -0,0 +1,197 @@
+**TL;DR**
+
+[[!toc levels=3]]
+
+# An odyssey through lots of things that have to be right before OpenID works
+
+Having just (at last) made an ikiwiki installation accept my
+OpenID, I have learned many of the things that may have to be checked
+when getting the [[plugins/openid]] plugin to work. (These are probably
+the reasons why [ikiwiki.info](/) itself won't accept my OpenID!)
+
+Just to describe my OpenID setup a bit (and why it makes a good stress-test
+for the OpenID plugin :).
+
+I'm using my personal home page URL as my OpenID. My page lives at
+a shared-hosting service I have hired. It contains links that delegate
+my OpenID processing to [indieauth.com](https://indieauth.com).
+
+IndieAuth, in turn, uses
+[rel-me authentication](http://microformats.org/wiki/RelMeAuth) to find
+an [OAuth](http://microformats.org/wiki/OAuth) provider that can authenticate
+me. (At present, I am using [github](http://github.com) for that, which
+is an OAuth provider but not an OpenID provider, so the gatewaying provided
+by IndieAuth solves that problem.) As far as ikiwiki is concerned,
+IndieAuth is my OpenID provider; the details beyond that are transparent.
+
+So, what were the various issues I had to sort out before my first successful
+login with the [[plugins/openid]] plugin?
+
+## no_identity_server: Could not determine ID provider from URL.
+
+This is the message [ikiwiki.info](/) shows as soon as I enter my home URL
+as an OpenID. It is also the first one I got on my own ikiwiki installation.
+
+### various possible causes ...
+
+There could be lots of causes. Maybe:
+
+* the offered OpenID is an `https:` URL and there is an issue in checking
+    the certificate, so the page can't be retrieved?
+* the page can be retrieved, but it isn't well-formed HTML and the library
+    can't parse it for the needed OpenID links?
+* ...?
+
+### make a luckier setting of useragent ?!
+
+In my case, it was none of the above. It turns out my shared-hosting provider
+has a rule that refuses requests with `User-Agent: libwww-perl/6.03` (!!).
+This is the sort of problem that's really hard to anticipate or plan around.
+I could fix it (_for this case!_) by changing `useragent:` in `ikiwiki.setup`
+to a different string that my goofy provider lets through.
+
+__Recommendation:__ set `useragent:` in `ikiwiki.setup` to some
+unlikely-to-be-blacklisted value. I can't guess what the best
+unlikely-to-be-blacklisted value is; if there is one, it's probably the
+next one all the rude bots will be using anyway, and some goofy provider
+like mine will blacklist it.
+
+## Error: OpenID failure: naive_verify_failed_network: Could not contact ID provider to verify response.
+
+Again, this could have various causes. It was helpful to bump the debug level
+and get some logging, to see:
+
+    500 Can't connect to indieauth.com:443 (Net::SSL from Crypt-SSLeay can't
+    verify hostnames; either install IO::Socket::SSL or turn off verification
+    by setting the PERL_LWP_SSL_VERIFY_HOSTNAME environment variable to 0)
+
+I don't belong to the camp that solves every verification problem by turning
+verification off, so this meant finding out how to get verification to be done.
+It turns out there are two different Perl modules that can be used for SSL:
+
+* `IO::Socket::SSL` (verifies hostnames)
+* `Net::SSL` (_does not_ verify hostnames)
+
+Both were installed on my hosted server. How was Perl deciding which one
+to use?
+
+### set `PERL_NET_HTTPS_SSL_SOCKET_CLASS` appropriately
+
+It turns out
+[there's an environment variable](https://rt.cpan.org/Public/Bug/Display.html?id=71599).
+So just set `PERL_NET_HTTPS_SSL_SOCKET_CLASS` to `IO::Socket::SSL` and the
+right module gets used, right?
+
+[Wrong](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/fed6f7d7df8619df0754e8883cfad2ac15703a38#diff-2).
+That change was made to `ParanoidAgent.pm` back in November 2013 because of an
+unrelated [bug](https://github.com/csirtgadgets/LWPx-ParanoidAgent/issues/4)
+in `IO::Socket::SSL`. Essentially, _hmm, something goes wrong in
+`IO::Socket::SSL` when reading certain large documents, so we'll fix it by
+forcing the use of `Net::SSL` instead (the one that never verifies hostnames!),
+no matter what the admin has set `PERL_NET_HTTPS_SSL_SOCKET_CLASS` to!_
+
+### undo change that broke `PERL_NET_HTTPS_SSL_SOCKET_CLASS`
+
+Plenty of [comments](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=738493)
+quickly appeared about how good an idea that wasn't, and it was corrected in
+June 2014 with [one commit](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/a92ed8f45834a6167ff62d3e7330bb066b307a35)
+to fix the original reading-long-documents issue in `IO::Socket::SSL` and
+[another commit](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/815c691ad5554a219769a90ca5f4001ae22a4019)
+that reverts the forcing of `Net::SSL` no matter how the environment is set.
+
+Unfortunately, there isn't a release in CPAN yet that includes those two
+commits, but they are only a few lines to edit into your own locally-installed
+module.
+
+## Still naive_verify_failed_network, new improved reason
+
+    500 Can't connect to indieauth.com:443 (SSL connect attempt failed
+    with unknown error error:14090086:SSL
+    routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed)
+
+Yay, at least it's trying to verify! Now why can't it verify IndieAuth's
+certificate?
+
+[Here's why](https://tools.ietf.org/html/rfc6066#section-3). As it turns out,
+[indieauth.com](https://indieauth.com/) is itself a virtual host on a shared
+server. If you naively try
+
+    openssl s_client -connect indieauth.com:443
+
+you get back a certificate for [indieweb.org](https://indieweb.org/)
+instead, so the hostname won't verify. If you explicitly indicate what server
+name you're connecting to:
+
+    openssl s_client -connect indieauth.com:443 -servername indieauth.com
+
+then, magically, the correct certificate comes back.
+
+### ensure `OpenSSL`, `Net::SSLeay`, `IO::Socket::SSL` new enough for SNI
+
+If your `openssl` doesn't recognize the `-servername` option, it is too old
+to do SNI, and a newer version needs to be built and installed. In fact,
+even though SNI support was reportedly backported into OpenSSL 0.9.8f, it will
+not be used by `IO::Socket::SSL` unless it is
+[1.0 or higher](http://search.cpan.org/~sullr/IO-Socket-SSL-1.998/lib/IO/Socket/SSL.pod#SNI_Support).
+
+Then a recent `Net::SSLeay` perl module needs to be built and linked against it.
+
+### Local OpenSSL installation will need certs to trust
+
+Bear in mind that the OpenSSL distribution doesn't come with a collection
+of trusted issuer certs. If a newer version is built and installed locally
+(say, on a shared server where the system locations can't be written), it will
+need to be given a directory of trusted issuer certs, say by linking to the
+system-provided ones. However, a change to the certificate hash algorithm used
+for the symlinks in that directory was [reportedly](http://www.cilogon.org/openssl1)
+made with OpenSSL 1.0.0. So if the system-provided trusted certificate directory
+was set up for an earlier OpenSSL version, all the certificates in it will be
+fine but the hash symlinks will be wrong. That can be fixed by linking only the
+named certificate files from the system directory into the newly-installed one,
+and then running the new version of `c_rehash` there.
+
+## Still certificate verify failed
+
+Using [SNI](https://tools.ietf.org/html/rfc6066#section-3)-supporting versions
+of `IO::Socket::SSL`, `Net::SSLeay`, and `OpenSSL` doesn't do any good if an
+upper layer hasn't passed down the name of the host being connected to so the
+SSL layer can SNI for it.
+
+### ensure that `LWPx::ParanoidAgent` passes server name to SSL layer for SNI
+
+That was fixed in `LWPx::ParanoidAgent` with
+[this commit](https://github.com/csirtgadgets/LWPx-ParanoidAgent/commit/df6df19ccdeeb717c709cccb011af35d3713f546),
+which needs to be backported by hand if it hasn't made it into a CPAN release
+yet.
+
+Only that still doesn't end the story, because that hand didn't know what
+[this hand](https://github.com/noxxi/p5-io-socket-ssl/commit/4f83a3cd85458bd2141f0a9f22f787174d51d587#diff-1)
+was doing. What good is passing the name in
+`PeerHost` if the SSL code looks in `PeerAddr` first ... and then, if that
+doesn't match a regex for a hostname, decides you didn't supply one at all,
+without even looking at `PeerHost`?
+
+Happily, it is possible to assign a key that _explicitly_ supplies the
+server name for SNI:
+
+    --- LWPx/Protocol/http_paranoid.pm    2014-09-08 03:33:00.000000000 -0400
+    +++ LWPx/Protocol/http_paranoid.pm    2014-09-08 03:33:27.000000000 -0400
+    @@ -73,6 +73,7 @@
+            close($el);
+             $sock = $self->socket_class->new(PeerAddr => $addr,
+                                              PeerHost => $host,
+    +                                         SSL_hostname => $host,
+                                              PeerPort => $port,
+                                              Proto    => 'tcp',
+                                              Timeout  => $conn_timeout,
+
+... not submitted upstream yet, so needs to be applied by hand.
+
+# Success!!
+
+And with that, ladies and gents, I got my first successful OpenID login!
+I'm pretty sure that if the same fixes can be applied to
+[ikiwiki.info](/) itself, a wider range of OpenID logins (like mine, for

(Diff truncated)
review
diff --git a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
index 7655d40..9da2515 100644
--- a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
+++ b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
@@ -31,3 +31,24 @@ As I am not sure that remembering `PERL5LIB` is a good idea, I think that a pret
 [This change](https://github.com/jcflack/ikiwiki/commit/bc4721da0441a30822225c51b250be4cc5f8af24) moves the `%config{ENV}` handling earlier in the wrapper, so anything specified there is placed back in the actual environment before Perl gets control. Problem solved!
 
 -- Chap
+
+> Thanks, this looks like a nicer solution than the above. Some review:
+>
+>     + $val =~ s/([\\"])/\\$1/g;
+>
+> This is *probably* OK, because the configuration is unlikely to include
+> non-ASCII, but I'd prefer something that covers all possibilities,
+> like this:
+>
+>     my $tmp = $val;
+>     utf8::encode($tmp) if utf8::is_utf8($tmp);
+>     $tmp =~ s/([^A-Za-z0-9])/sprintf "\\x%02x", ord($1)/ge;
+>
+> and then passing $tmp to addenv.
+>
+>     + delete $config{ENV};
+>
+> I don't think this is particularly necessary: there doesn't seem any harm
+> in having it in the storable too?
+>
+> --[[smcv]]
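
A tiny self-contained demonstration of the escaping suggested above (a sketch
with a hypothetical PERL5LIB value; the real code belongs in the wrapper
generator, and the result would be embedded in the C string handed to
`addenv`):

    #!/usr/bin/perl
    # Sketch: hex-escape every non-alphanumeric byte so the value is safe to
    # place inside a C string literal in the generated wrapper.
    use strict;
    use warnings;

    my $val = "/home/user/perl5/lib/perl5";   # hypothetical PERL5LIB value
    my $tmp = $val;
    utf8::encode($tmp) if utf8::is_utf8($tmp);
    $tmp =~ s/([^A-Za-z0-9])/sprintf "\\x%02x", ord($1)/ge;
    print "$tmp\n";   # prints \x2fhome\x2fuser\x2fperl5\x2flib\x2fperl5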

make gitremotes work
diff --git a/doc/git.mdwn b/doc/git.mdwn
index 6dbfedb..55cc9c1 100644
--- a/doc/git.mdwn
+++ b/doc/git.mdwn
@@ -81,7 +81,7 @@ think about merging them. This is recommended. :-)
 * [[cbaines]] `git://git.cbaines.net/ikiwiki`
 * [[mhameed]] `git://github.com/mhameed/ikiwiki.git`
 * [[spalax]] `git://github.com/paternal/ikiwiki.git` ([[browse|https://github.com/paternal/ikiwiki]])
-* [[users/jcflack]] `git://github.com/jcflack/ikiwiki.git`
+* [[jcflack]] `git://github.com/jcflack/ikiwiki.git`
 
 ## branches
 

diff --git a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
index 81a5abf..7655d40 100644
--- a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
+++ b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
@@ -26,3 +26,8 @@ This brutal patch implement your solution as a temporary fix.
 As I am not sure that remembering `PERL5LIB` is a good idea, I think that a prettier solution will be to add a config variable (let's say `cgi_wrapper_perllib`) which, if fixed, contains the `PERL5LIB` value to include in the wrapper, or another (let's say `cgi_wrapper_remember_libdir`), which, if fixed, remember the current `PERL5LIB`.
 
 -- Bruno
+
+**Update:** I had not seen this bug earlier, but I ran into the same issue and made a more general solution. You can already add stuff to `%config{ENV}` in the setup file, but it was being processed too late for `PERL5LIB` to do any good.
+[This change](https://github.com/jcflack/ikiwiki/commit/bc4721da0441a30822225c51b250be4cc5f8af24) moves the `%config{ENV}` handling earlier in the wrapper, so anything specified there is placed back in the actual environment before Perl gets control. Problem solved!
+
+-- Chap

diff --git a/doc/git.mdwn b/doc/git.mdwn
index e71fa57..6dbfedb 100644
--- a/doc/git.mdwn
+++ b/doc/git.mdwn
@@ -81,6 +81,7 @@ think about merging them. This is recommended. :-)
 * [[cbaines]] `git://git.cbaines.net/ikiwiki`
 * [[mhameed]] `git://github.com/mhameed/ikiwiki.git`
 * [[spalax]] `git://github.com/paternal/ikiwiki.git` ([[browse|https://github.com/paternal/ikiwiki]])
+* [[users/jcflack]] `git://github.com/jcflack/ikiwiki.git`
 
 ## branches
 

propose a branch which uses non-numeric `show` for this purpose
diff --git a/doc/todo/Option_linktext_for_pagestats_directive.mdwn b/doc/todo/Option_linktext_for_pagestats_directive.mdwn
index 8bbb7c2..ab5eb22 100644
--- a/doc/todo/Option_linktext_for_pagestats_directive.mdwn
+++ b/doc/todo/Option_linktext_for_pagestats_directive.mdwn
@@ -194,3 +194,10 @@ Regards,
 > like `limit` (by analogy with SQL) or `max` as the canonical name for the
 > "number of things to match" parameter, at which point a non-numeric
 > `show` could mean this thing. --[[smcv]]
+
+>> [[!template id=gitbranch branch=smcv/pagestats-show
+author="[[Louis|spalax]], [[smcv]]"
+browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/pagestats-show]]
+>> Here's a branch. It depends on my `ready/limit` branch
+>> from [[todo/pick a new canonical name for equivalent of SQL limit]].
+>> --[[smcv]]

extend rst test to cover a fixed bug
diff --git a/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn b/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn
index 99e46aa..57e0cf6 100644
--- a/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn
+++ b/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn
@@ -27,3 +27,5 @@ throwing code..):
 
 > On second thought, this was a bug in ikiwiki, it should be transmitting
 > that as a string. Fixed in external.pm --[[Joey]] 
+
+>> [[done]] a while ago, then. I've added a regression test now. --[[smcv]]
diff --git a/t/rst.t b/t/rst.t
index 4e0c4b7..a72c468 100755
--- a/t/rst.t
+++ b/t/rst.t
@@ -8,7 +8,7 @@ BEGIN {
 	}
 }
 
-use Test::More tests => 2;
+use Test::More tests => 3;
 
 BEGIN { use_ok("IkiWiki"); }
 
@@ -19,4 +19,8 @@ $config{add_plugins}=[qw(rst)];
 IkiWiki::loadplugins();
 IkiWiki::checkconfig();
 
-ok(IkiWiki::htmlize("foo", "foo", "rst", "foo\n") =~ m{\s*<p>foo</p>\s*});
+like(IkiWiki::htmlize("foo", "foo", "rst", "foo\n"), qr{\s*<p>foo</p>\s*});
+# regression test for [[bugs/rst fails on file containing only a number]]
+my $html = IkiWiki::htmlize("foo", "foo", "rst", "11");
+$html =~ s/<[^>]*>//g;
+like($html, qr{\s*11\s*});

rename bugs/redirect.mdwn to todo/redirect.mdwn
diff --git a/doc/bugs/redirect.mdwn b/doc/bugs/redirect.mdwn
deleted file mode 100644
index 87f6a67..0000000
--- a/doc/bugs/redirect.mdwn
+++ /dev/null
@@ -1,53 +0,0 @@
-I suppose this isn't technically a bug, but whatever.
-
-I want symbolic links to be rendered as HTTP redirects. For example,
-if we do this,
-
-    touch foo.mkdwn
-    ln -s foo.mkdwn bar.mkdwn
-    git push baz.branchable.com
-
-then the following command should print 302
-
-    curl -o /dev/null -s -w "%{http_code}" http://baz.thomaslevine.com/bar/
-
-> An interesting idea, but it conflicts somewhat with wanting symlinks to be
-> treated as the referenced file when it's safe to do so, which would be
-> great for [[todo/git-annex support]], and also good to avoid duplication
-> for files in system-wide underlays.
->
-> Also, I don't think this is possible without help from the web server
-> configuration: for instance, under Apache, I believe the only way to get
-> an HTTP 302 redirect is via Apache-specific `.htaccess` files or
-> system-level Apache configuration.
->
-> In current ikiwiki, you can get a broadly similar effect by either
-> using \[[!meta redir=foo]] (which does a HTML `<meta>` redirect)
-> or reconfiguring the web server. --[[smcv]]
-
->> The CGI spec (http://www.ietf.org/rfc/rfc3875) says that a CGI can cause a redirect by returning a Location: header.
->> So it's possible; whether it's desirable (given your point about conflicting with git-annex support) is a different matter.
-
->>> One of the major things that separates ikiwiki from other wiki software
->>> is that ikiwiki is a wiki compiler: ordinary page-views are purely
->>> static HTML, and the CGI only gets involved when you do something
->>> that really has to be dynamic (like an edit).
->>>
->>> However, there is no server-independent static content that ikiwiki
->>> could write out to the destdir that would result in that redirect.
->>>
->>> If you're OK with requiring the [[plugins/404]] plugin (and a
->>> web server where it works, which I think still means Apache) then
->>> it would be possible to write a plugin that detected symlinks,
->>> stored them in the `%wikistate`, and used them to make the
->>> [[plugins/404]] plugin (or its own hook similar to the one
->>> in that plugin) do a 302 redirect instead of a 404.
->>> Similarly, a plugin that assumed a suitable Apache
->>> configuration with fairly broad `AllowOverrides`,
->>> and wrote out `.htaccess` files, would be a feasible thing
->>> for someone to write.
->>>
->>> I don't think this is a bug; I think it's a request for a
->>> feature that not everyone will want. The solution to those
->>> is for someone who wants the feature to
->>> [[write a plugin|plugins/write]]. --[[smcv]]
diff --git a/doc/todo/redirect.mdwn b/doc/todo/redirect.mdwn
new file mode 100644
index 0000000..87f6a67
--- /dev/null
+++ b/doc/todo/redirect.mdwn
@@ -0,0 +1,53 @@
+I suppose this isn't technically a bug, but whatever.
+
+I want symbolic links to be rendered as HTTP redirects. For example,
+if we do this,
+
+    touch foo.mkdwn
+    ln -s foo.mkdwn bar.mkdwn
+    git push baz.branchable.com
+
+then the following command should print 302
+
+    curl -o /dev/null -s -w "%{http_code}" http://baz.thomaslevine.com/bar/
+
+> An interesting idea, but it conflicts somewhat with wanting symlinks to be
+> treated as the referenced file when it's safe to do so, which would be
+> great for [[todo/git-annex support]], and also good to avoid duplication
+> for files in system-wide underlays.
+>
+> Also, I don't think this is possible without help from the web server
+> configuration: for instance, under Apache, I believe the only way to get
+> an HTTP 302 redirect is via Apache-specific `.htaccess` files or
+> system-level Apache configuration.
+>
+> In current ikiwiki, you can get a broadly similar effect by either
+> using \[[!meta redir=foo]] (which does a HTML `<meta>` redirect)
+> or reconfiguring the web server. --[[smcv]]
+
+>> The CGI spec (http://www.ietf.org/rfc/rfc3875) says that a CGI can cause a redirect by returning a Location: header.
+>> So it's possible; whether it's desirable (given your point about conflicting with git-annex support) is a different matter.
+
+>>> One of the major things that separates ikiwiki from other wiki software
+>>> is that ikiwiki is a wiki compiler: ordinary page-views are purely
+>>> static HTML, and the CGI only gets involved when you do something
+>>> that really has to be dynamic (like an edit).
+>>>
+>>> However, there is no server-independent static content that ikiwiki
+>>> could write out to the destdir that would result in that redirect.
+>>>
+>>> If you're OK with requiring the [[plugins/404]] plugin (and a
+>>> web server where it works, which I think still means Apache) then
+>>> it would be possible to write a plugin that detected symlinks,
+>>> stored them in the `%wikistate`, and used them to make the
+>>> [[plugins/404]] plugin (or its own hook similar to the one
+>>> in that plugin) do a 302 redirect instead of a 404.
+>>> Similarly, a plugin that assumed a suitable Apache
+>>> configuration with fairly broad `AllowOverrides`,
+>>> and wrote out `.htaccess` files, would be a feasible thing
+>>> for someone to write.
+>>>
+>>> I don't think this is a bug; I think it's a request for a
+>>> feature that not everyone will want. The solution to those
+>>> is for someone who wants the feature to
+>>> [[write a plugin|plugins/write]]. --[[smcv]]
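
To make the plugin idea above a bit more concrete, here is a minimal, untested Perl
sketch. The plugin name `symlinkredir`, the `%wikistate` key and the reuse of the 404
plugin's `cgi_page_from_404()` helper are assumptions for illustration only; it also
assumes relative symlinks within the srcdir and glosses over the fact that ikiwiki
normally skips symlinks when scanning the srcdir, which a real implementation would
have to deal with.

    #!/usr/bin/perl
    # Hypothetical IkiWiki::Plugin::symlinkredir -- an untested sketch only.
    package IkiWiki::Plugin::symlinkredir;

    use warnings;
    use strict;
    use IkiWiki 3.00;
    use File::Find;
    use File::Spec;

    sub import {
        hook(type => "refresh", id => "symlinkredir", call => \&refresh);
        hook(type => "cgi", id => "symlinkredir", call => \&cgi, first => 1);
    }

    sub refresh {
        # Remember where each symlinked source file points.
        my %target;
        find({ no_chdir => 1, wanted => sub {
            return unless -l $File::Find::name;
            my $file = File::Spec->abs2rel($File::Find::name, $config{srcdir});
            my $dest = readlink($File::Find::name);
            $target{pagename($file)} = pagename($dest)
                if defined pagetype($file) && defined pagetype($dest);
        }}, $config{srcdir});
        $wikistate{symlinkredir}{target} = \%target;
    }

    sub cgi {
        my $cgi = shift;
        # Only act when the web server's 404 handler invoked the CGI,
        # the same trigger the 404 plugin uses.
        return unless ($ENV{REDIRECT_STATUS} // '') eq '404';
        IkiWiki::loadindex();    # make %wikistate from the last refresh available
        my $page = IkiWiki::Plugin::404::cgi_page_from_404(
            $ENV{REDIRECT_URL}, $config{url}, $config{usedirs});
        my $dest = $wikistate{symlinkredir}{target}{$page};
        if (defined $dest) {
            # 302 to the symlink's target instead of serving a 404.
            print $cgi->redirect(-uri => urlto($dest, undef, 1), -status => 302);
            exit 0;
        }
    }

    1;

It would be enabled like any other plugin (e.g. `add_plugins => [qw{404 symlinkredir}]`
in the setup file) and relies on the same Apache `ErrorDocument` configuration as
[[plugins/404]].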

close bug
diff --git a/doc/bugs/rst_plugin_hangs_when_used_with_Python_3.mdwn b/doc/bugs/rst_plugin_hangs_when_used_with_Python_3.mdwn
index ca0738a..001d990 100644
--- a/doc/bugs/rst_plugin_hangs_when_used_with_Python_3.mdwn
+++ b/doc/bugs/rst_plugin_hangs_when_used_with_Python_3.mdwn
@@ -33,3 +33,5 @@ without decode('utf8') is working
 > [[this related bug|bugs/pythonproxy-utf8_again]]. [[!tag patch]] --smcv
 
 tested and fixed with patch [http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/38bd51bc1bab0cabd97dfe3cb598220a2c02550a](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/38bd51bc1bab0cabd97dfe3cb598220a2c02550a) and patch [http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/81506fae8a6d5360f6d830b0e07190e60a7efd1c](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/81506fae8a6d5360f6d830b0e07190e60a7efd1c)
+
+> [[done]], pending release --[[smcv]]

branch for comment, not merging just yet
diff --git a/doc/todo/pick_a_new_canonical_name_for_equivalent_of_SQL_limit.mdwn b/doc/todo/pick_a_new_canonical_name_for_equivalent_of_SQL_limit.mdwn
index daa520d..4e70f81 100644
--- a/doc/todo/pick_a_new_canonical_name_for_equivalent_of_SQL_limit.mdwn
+++ b/doc/todo/pick_a_new_canonical_name_for_equivalent_of_SQL_limit.mdwn
@@ -35,4 +35,10 @@ Which of those do Joey/other contributors prefer?
 Or if keeping `show=10` is preferred, what should be the conventional name
 for functionality like `\[[!map show=title]]`?
 
-I personally like the idea of `\[[!inline limit=10]]`. --[[smcv]]
+> [[!template id=gitbranch branch=smcv/ready/limit
+author="[[Simon McVittie|smcv]]"
+browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/limit]]
+> [[!tag patch users/smcv/ready]]
+
+I personally prefer `\[[!inline limit=10]]` so I have put that in a branch.
+Agreement/objections/better ideas welcome. --[[smcv]]

fix link
diff --git a/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
index abe5d56..bae331f 100644
--- a/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
+++ b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
@@ -134,7 +134,7 @@ The [[ikiwiki/directive/listdirectives]]` directive doesn't register a link betw
 >>>>>>> would all become bidirectional; and as I noted previously, if pagespecs
 >>>>>>> can match by linkedness (which we want) and plugins can generate lists
 >>>>>>> of links according to pagespecs (which we also want), then links in the
->>>>>>> compiled output can certainly get into [[!wp Russell's paradox]]-like
+>>>>>>> compiled output can certainly get into [[!wikipedia Russell's paradox]]-like
 >>>>>>> situations, such as the page that links to every page to which it
 >>>>>>> does not link.
 >>>>>>>

more thoughts on this
diff --git a/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
index ad52d78..abe5d56 100644
--- a/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
+++ b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
@@ -112,3 +112,34 @@ The [[ikiwiki/directive/listdirectives]]` directive doesn't register a link betw
 >>>>> it doesn't inline. That's never going to end well :-) --[[smcv]]
 >>>>>> We have to differentiate between what users of ikiwiki consider first class links and what internally is happening. For the user any link contributing to the structured access tree is first class. The code on the other hand has to differentiate between the static links, then generated links, then orphan links. Three "passes"; even your proposed solution could be seen as adding another pass, since the orphan plugin has to run after all the plugins generating (first class user) links.   -- [[holger]]
 
+>>>>>>> I think the difference between your point of view, and what ikiwiki
+>>>>>>> currently implements / what its design is geared towards, is this:
+>>>>>>> ikiwiki says A links to B if the *source code* of A contains an
+>>>>>>> explicit link to B. You say A links to B if the *compiled HTML*
+>>>>>>> of A contains a link to B.
+>>>>>>>
+>>>>>>> Would you agree with that characterization?
+>>>>>>>
+>>>>>>> I suspect that "link in the source code" may be the more useful concept
+>>>>>>> when using links for backlinks (I think the original implementation is
+>>>>>>> <http://c2.com/cgi/wiki?BackLink>) and as pseudo-tags
+>>>>>>> (<http://c2.com/cgi/wiki?WikiCategories>). The fact that this is what
+>>>>>>> `link()` and `backlink()` mean could be better-documented: it's
+>>>>>>> entirely possible that the author of their documentation (Joey?)
+>>>>>>> thought it was obvious that that's what they mean, because they
+>>>>>>> were coming from a compiler/source-code mindset.
+>>>>>>>
+>>>>>>> Also, backlinks become rather all-engulfing if their presence in
+>>>>>>> the compiled output counts as a link, since after a render pass, they
+>>>>>>> would all become bidirectional; and as I noted previously, if pagespecs
+>>>>>>> can match by linkedness (which we want) and plugins can generate lists
+>>>>>>> of links according to pagespecs (which we also want), then links in the
+>>>>>>> compiled output can certainly get into [[!wp Russell's paradox]]-like
+>>>>>>> situations, such as the page that links to every page to which it
+>>>>>>> does not link.
+>>>>>>>
+>>>>>>> For the special case of deciding what is orphaned, sure, it's the
+>>>>>>> compiled HTML that is the more relevant thing;
+>>>>>>> that's why I talked about "reachability" rather than "links".
+>>>>>>>
+>>>>>>> --[[smcv]]

more bug-closing
diff --git a/debian/changelog b/debian/changelog
index e2b5ac5..611e518 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -11,6 +11,13 @@ ikiwiki (3.20140912) UNRELEASED; urgency=medium
   * Improve performance and correctness of the [[!if]] directive
   * Let [[!inline rootpage=foo postform=no]] disable the posting form
   * Switch default [[!man]] shortcut to manpages.debian.org. Closes: #700322
+  * Add UUID and TIME variables to edittemplate. Closes: #752827
+    Thanks, Jonathon Anderson
+  * Display pages in linkmaps as their pagetitle (no underscore escapes).
+    Thanks, chrysn
+  * Fix aspect ratio when scaling small images, and add support for
+    converting SVG and PDF graphics to PNG.
+    Thanks, chrysn
 
  -- Simon McVittie <smcv@debian.org>  Fri, 12 Sep 2014 21:23:58 +0100
 
diff --git a/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn b/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn
index 6425c1e..9ce091e 100644
--- a/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn
+++ b/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn
@@ -47,3 +47,5 @@ If you use the rescaling feature of the directive [[ikiwiki/directive/img/]] wit
 >>> in my copy of the branch.
 >>>
 >>> --[[smcv]]
+
+>>>> [[merged|done]] --[[smcv]]
diff --git a/doc/bugs/linkmap_displays_underscore_escapes.mdwn b/doc/bugs/linkmap_displays_underscore_escapes.mdwn
index 14164d0..1608035 100644
--- a/doc/bugs/linkmap_displays_underscore_escapes.mdwn
+++ b/doc/bugs/linkmap_displays_underscore_escapes.mdwn
@@ -33,3 +33,5 @@ the patch is stored in [[the patch.pl]] as created by git-format-patch, and can
 be pulled from the abovementioned branch.
 
 > update 2014-06-29: branch still merges cleanly and works. --[[chrysn]]
+
+>> [[merged|done]] --[[smcv]]
diff --git a/doc/bugs/svg_and_pdf_conversion_fails.mdwn b/doc/bugs/svg_and_pdf_conversion_fails.mdwn
index ac18fe8..9910959 100644
--- a/doc/bugs/svg_and_pdf_conversion_fails.mdwn
+++ b/doc/bugs/svg_and_pdf_conversion_fails.mdwn
@@ -56,3 +56,5 @@ should be safe for inclusion.
 >>> which works, so my biggest fear about the all-to-png change is unwarranted.
 >>> i'll have a look at that some time, but i think as things are, this is
 >>> ready now, please review again. --[[chrysn]]
+
+>>>> [[merged|done]] --[[smcv]]
diff --git a/doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn b/doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn
index e5a8d0a..6d702fe 100644
--- a/doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn
+++ b/doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn
@@ -83,3 +83,5 @@ Changes to the structure of `$pagestate{$registering_page}{edittemplate}{$pagesp
 >>>>> almost 5 years ago. Branch replaced by `smcv/ready/edittemplate2`
 >>>>> which drops `formatted_time` and `html_time`, and adds a suggestion
 >>>>> to use `\[[!date]]`. --[[smcv]]
+
+>>>>>> [[merged|done]] --[[smcv]]

more changelog and bug-closing
diff --git a/debian/changelog b/debian/changelog
index 59a9e6b..e2b5ac5 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -3,6 +3,14 @@ ikiwiki (3.20140912) UNRELEASED; urgency=medium
   * Don't double-decode CGI submissions with Encode.pm >= 2.53,
     fixing "Error: Cannot decode string with wide characters".
     Thanks, Antoine Beaupré
+  * Avoid making trails depend on everything in the wiki by giving them
+    a better way to sort the pages
+  * Don't let users post comments that won't be displayed
+  * Fix encoding of Unicode strings in Python plugins.
+    Thanks, chrysn
+  * Improve performance and correctness of the [[!if]] directive
+  * Let [[!inline rootpage=foo postform=no]] disable the posting form
+  * Switch default [[!man]] shortcut to manpages.debian.org. Closes: #700322
 
  -- Simon McVittie <smcv@debian.org>  Fri, 12 Sep 2014 21:23:58 +0100
 
diff --git a/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn b/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn
index 7e75486..7b97b40 100644
--- a/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn
+++ b/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn
@@ -21,3 +21,4 @@ not the actual inlining of pages, but it's a start.
 --[[smcv]]
 
 >> this looks simple, straightforward and good to me --[[chrysn]]
+>>> [[merged|done]] --[[smcv]]
diff --git a/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn b/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn
index c7d0ffb..22733e6 100644
--- a/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn
+++ b/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn
@@ -63,3 +63,5 @@ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/
 > `bestlink` is still the single most expensive function in this refresh
 > at ~ 9.5s, with `match_glob` at ~ 5.2s as the runner-up.
 > --[[smcv]]
+
+>> [[merged|done]] --[[smcv]]
diff --git a/doc/bugs/possible_to_post_comments_that_will_not_be_displayed.mdwn b/doc/bugs/possible_to_post_comments_that_will_not_be_displayed.mdwn
index bb6cd17..83d662c 100644
--- a/doc/bugs/possible_to_post_comments_that_will_not_be_displayed.mdwn
+++ b/doc/bugs/possible_to_post_comments_that_will_not_be_displayed.mdwn
@@ -30,3 +30,5 @@ to
     comments_pagespec && !comments_closed_pagespec && check_canedit
 
 --[[smcv]]
+
+> [[merged|done]] --[[smcv]]
diff --git a/doc/bugs/pythonproxy-utf8_again.mdwn b/doc/bugs/pythonproxy-utf8_again.mdwn
index cc6d11d..f068782 100644
--- a/doc/bugs/pythonproxy-utf8_again.mdwn
+++ b/doc/bugs/pythonproxy-utf8_again.mdwn
@@ -66,3 +66,5 @@ patch.
 >>>> Joey, I think this is [[ready for merge|users/smcv/ready]] even if it
 >>>> doesn't fix chrysn's bug - it does fix Python 3 support
 >>>> in general. --[[smcv]]
+
+>>>>> [[merged|done]] --[[smcv]]
diff --git a/doc/bugs/trails_depend_on_everything.mdwn b/doc/bugs/trails_depend_on_everything.mdwn
index babb1e3..8e9edcf 100644
--- a/doc/bugs/trails_depend_on_everything.mdwn
+++ b/doc/bugs/trails_depend_on_everything.mdwn
@@ -12,3 +12,5 @@ list of pages.
 They should just sort the pages instead; they'll already have all the
 dependencies they need. My branch adds `IkiWiki::sort_pages` but does not
 make it plugin API just yet. --[[smcv]]
+
+> [[merged|done]] --[[smcv]]
diff --git a/doc/todo/upload__95__figure.mdwn b/doc/todo/upload__95__figure.mdwn
index d8dd659..a63e183 100644
--- a/doc/todo/upload__95__figure.mdwn
+++ b/doc/todo/upload__95__figure.mdwn
@@ -18,3 +18,5 @@ Unfortunately, Github shows [[raw code|https://github.com/paternal/ikiwiki/blob/
 >
 > This particular SVG [[looks good to me|users/smcv/ready]] and I've
 > mirrored it in my own git repo. --[[smcv]]
+
+>> [[merged|done]] --[[smcv]]

Merge remote-tracking branch 'spalax/paternal/upload-svg'
changelog/close bugs
diff --git a/debian/changelog b/debian/changelog
index 022fc7d..59a9e6b 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,3 +1,11 @@
+ikiwiki (3.20140912) UNRELEASED; urgency=medium
+
+  * Don't double-decode CGI submissions with Encode.pm >= 2.53,
+    fixing "Error: Cannot decode string with wide characters".
+    Thanks, Antoine Beaupré
+
+ -- Simon McVittie <smcv@debian.org>  Fri, 12 Sep 2014 21:23:58 +0100
+
 ikiwiki (3.20140831) unstable; urgency=medium
 
   * Make --no-gettime work in initial build. Closes: #755075
diff --git a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
index 74d8e46..657b86b 100644
--- a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
+++ b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
@@ -122,3 +122,5 @@ so this would explain the error on cancel, but doesn't explain the weird encodin
 
 > [[Looks good to me|users/smcv/ready]] although I'm not sure how valuable
 > the `$] < 5.02 || ` test is - I'd be tempted to just call `is_utf8`. --[[smcv]]
+
+>> [[merged|done]] --[[smcv]]
diff --git a/doc/forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron.mdwn b/doc/forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron.mdwn
index 8f92259..c5a91be 100644
--- a/doc/forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron.mdwn
+++ b/doc/forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron.mdwn
@@ -18,3 +18,6 @@ Can anyone shed any light on this problem and guide me what I need to do to fix
 Regards,
 
 -- [Shlomi Fish](http://www.shlomifish.org/)
+
+> [[Merged anarcat's fix for
+this|bugs/garbled non-ascii characters in body in web interface]] --[[smcv]]
diff --git a/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
index 2ca8620..5a55fcc 100644
--- a/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
+++ b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
@@ -92,5 +92,9 @@ I hope it's a bug, not a feature and you fix it soon :) --[[Paweł|ptecza]]
 >>> [[bugs/garbled_non-ascii_characters_in_body_in_web_interface]] fixes this.
 >>> --[[smcv]]
 
+>>>> Merged that patch. Not marking this page as done, because the todo
+>>>> about using a standard encoding still stands (although I'm not at
+>>>> all sure there's an encoding that would be better). --[[smcv]]
+
 [[wishlist]]
 [1]: https://packages.debian.org/search?suite=all&section=all&arch=any&searchon=names&keywords=libencode-imaputf7-perl

Don't URL-encode links to Debian wiki pages
diff --git a/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn b/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn
index c068c4a..f83f960 100644
--- a/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn
+++ b/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn
@@ -1,3 +1,5 @@
 E.g. [[!debwiki Derivatives/Guidelines]].
 
 Maybe we should use `%S` instead of `%s` in the shortcut definition?
+
+> seems reasonable, [[done]] --[[smcv]]
diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn
index 753bdb9..ca529c2 100644
--- a/doc/shortcuts.mdwn
+++ b/doc/shortcuts.mdwn
@@ -27,7 +27,7 @@ This page controls what shortcut links the wiki supports.
 * [[!shortcut name=debrt url="https://rt.debian.org/Ticket/Display.html?id=%s"]]
 * [[!shortcut name=debss url="http://snapshot.debian.org/package/%s/"]]
   * Usage: `\[[!debss package]]` or `\[[!debss package/version]]`.  See <http://snapshot.debian.org/> for details.
-* [[!shortcut name=debwiki url="https://wiki.debian.org/%s"]]
+* [[!shortcut name=debwiki url="https://wiki.debian.org/%S"]]
 * [[!shortcut name=fdobug url="https://bugs.freedesktop.org/show_bug.cgi?id=%s" desc="freedesktop.org bug #%s"]]
 * [[!shortcut name=fdolist url="http://lists.freedesktop.org/mailman/listinfo/%s" desc="%s@lists.freedesktop.org"]]
 * [[!shortcut name=gnomebug url="https://bugzilla.gnome.org/show_bug.cgi?id=%s" desc="GNOME bug #%s"]]
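
For context on what the one-character change does: in a shortcut definition, `%s` is
substituted after URL-encoding the text, while `%S` substitutes the raw text, so the
`/` in a subpage name survives. A stand-alone Perl illustration (the `uri_escape_utf8`
call approximates the encoding applied for `%s`; it is not the shortcut plugin's exact
code):

    use strict;
    use warnings;
    use URI::Escape qw(uri_escape_utf8);

    my $page = "Derivatives/Guidelines";

    # Roughly what the old %s substitution produced: the "/" gets escaped too,
    # which is the buggy subpage URL from the report above.
    print "https://wiki.debian.org/", uri_escape_utf8($page), "\n";
    # https://wiki.debian.org/Derivatives%2FGuidelines

    # What %S produces: the raw text, so the subpage URL stays intact.
    print "https://wiki.debian.org/$page\n";
    # https://wiki.debian.org/Derivatives/Guidelines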

yes, duplicate
diff --git a/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
index cd4533f..2ca8620 100644
--- a/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
+++ b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
@@ -88,11 +88,8 @@ I hope it's a bug, not a feature and you fix it soon :) --[[Paweł|ptecza]]
 >> Solving this sort of bug usually requires having a clear picture of
 >> which "strings" are bytestrings, and which "strings" are Unicode. --[[smcv]]
 
->>> As mhameed noted on IRC, this might be the same issue as
->>> [[bugs/garbled_non-ascii_characters_in_body_in_web_interface]] and/or
->>> [[forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron]].
->>> Please try [[anarcat]]'s patch which you can find at
->>> <http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/ready/anarcat/safe_unicode>.
+>>> mhameed confirmed on IRC that anarcat's [[patch]] from
+>>> [[bugs/garbled_non-ascii_characters_in_body_in_web_interface]] fixes this.
 >>> --[[smcv]]
 
 [[wishlist]]

possible dup
diff --git a/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
index 19b9b59..cd4533f 100644
--- a/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
+++ b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
@@ -88,5 +88,12 @@ I hope it's a bug, not a feature and you fix it soon :) --[[Paweł|ptecza]]
 >> Solving this sort of bug usually requires having a clear picture of
 >> which "strings" are bytestrings, and which "strings" are Unicode. --[[smcv]]
 
+>>> As mhameed noted on IRC, this might be the same issue as
+>>> [[bugs/garbled_non-ascii_characters_in_body_in_web_interface]] and/or
+>>> [[forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron]].
+>>> Please try [[anarcat]]'s patch which you can find at
+>>> <http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/ready/anarcat/safe_unicode>.
+>>> --[[smcv]]
+
 [[wishlist]]
 [1]: https://packages.debian.org/search?suite=all&section=all&arch=any&searchon=names&keywords=libencode-imaputf7-perl

clarify
diff --git a/doc/todo/do_not_make_links_backwards.mdwn b/doc/todo/do_not_make_links_backwards.mdwn
index 4dd85ee..50720fe 100644
--- a/doc/todo/do_not_make_links_backwards.mdwn
+++ b/doc/todo/do_not_make_links_backwards.mdwn
@@ -35,15 +35,25 @@ Discussion
 > > > "text first" vs. "link first", so, say that.
 > > >
 > > > As far as I understand it, RTL languages like Arabic typically write
-> > > text files "in logical order" (first letter is first in the bytestream)
-> > > and only apply RTL rendering on display, and IkiWiki will parse files
+> > > text files "in logical order" (i.e. reading/writing order - first
+> > > letter is first in the bytestream) and only apply RTL rendering on
+> > > display. IkiWiki is UTF-8-only, and Unicode specifies that all
+> > > Unicode text should be in logical order. The opposite of logical
+> > > order is is "display order", which is how you would have to mangle
+> > > the file for it to appear correctly on a naive terminal that expects
+> > > LTR; that can only work correctly for hard-wrapped text, I think.
+> > >
+> > > IkiWiki will parse files
 > > > in logical order too; so if a link's text and destination are both
-> > > written in Arabic, in your proposed order (text before link), an
+> > > written in Arabic, in text-before-link order in the source code, an
 > > > Arabic reader starting from the right would still see the text
-> > > before the link. So I don't think it would make sense to suggest that
+> > > before the link. Similarly, in your proposed link-before-text
+> > > order, an Arabic reader would still see the link before the text
+> > > (which in their case means further to the right). So I don't think
+> > > it would make sense to suggest that
 > > > one order was more appropriate for RTL languages than the other: if
-> > > it's "right" (for whatever opinion of "right") in English, then it's
-> > > "right" in Arabic too.
+> > > it's "more correct" (for whatever opinion of "correct") in English, then
+> > > it's "more correct" in Arabic too.
 > > >
 > > > (If the destination is written in Latin then it gets
 > > > more complicated, because the destination will be rendered LTR within an

I still don't think 'rtl' is a suitable name for this switch
diff --git a/doc/todo/do_not_make_links_backwards.mdwn b/doc/todo/do_not_make_links_backwards.mdwn
index 4059d8e..4dd85ee 100644
--- a/doc/todo/do_not_make_links_backwards.mdwn
+++ b/doc/todo/do_not_make_links_backwards.mdwn
@@ -30,6 +30,25 @@ Discussion
 > > 
 > > Originally, I named that parameter `backwards_links`, but then it wouldn't make sense in the long term, and isn't exactly neutral: it assume the current way is backwards! Your suggestion is interesting however, but I don't think the rtl/ltr nomenclature is problematic, with proper documentation of course... --[[anarcat]]
 
+> > > I still don't think `rtl`/`ltr` is the right terminology here. I think
+> > > the "API" should say what you mean: the distinction being made is
+> > > "text first" vs. "link first", so, say that.
+> > >
+> > > As far as I understand it, RTL languages like Arabic typically write
+> > > text files "in logical order" (first letter is first in the bytestream)
+> > > and only apply RTL rendering on display, and IkiWiki will parse files
+> > > in logical order too; so if a link's text and destination are both
+> > > written in Arabic, in your proposed order (text before link), an
+> > > Arabic reader starting from the right would still see the text
+> > > before the link. So I don't think it would make sense to suggest that
+> > > one order was more appropriate for RTL languages than the other: if
+> > > it's "right" (for whatever opinion of "right") in English, then it's
+> > > "right" in Arabic too.
+> > >
+> > > (If the destination is written in Latin then it gets
+> > > more complicated, because the destination will be rendered LTR within an
+> > > otherwise RTL document. I think the order still works though.) --[[smcv]]
+
 There's a caveat: we can't have a per-wiki backwards_links option, because of the underlay, common to all wikis, which needs to be converted. So the option doesn't make much sense. Not sure how to deal with this... Maybe this needs to be at the package level? --[[anarcat]]
 
 > I've thought about adding a direction-neutral `\[[!link]]` directive -
@@ -80,6 +99,7 @@ I think we can approach this rationnally:
 
  1. left to right (text then link) can be considered more natural, and should therefore be supported
  2. it is supported in markdown using regular markdown links. in the proposed branch, the underlay wikilinks are converted to use regular markdown links
+    > Joey explicitly rejected this for a valid reason (it breaks inlining). See above. --[[smcv]]
  3. ikiwiki links break other markup plugins, like mediawiki and creole, as those work right to left.
  4. those are recognized "standards" used by other popular sites, like Wikipedia, or any wiki supporting the Creole markup, which is [most wikis](http://www.wikicreole.org/wiki/Engines)
 

Added a comment: Same Trick in Apache
diff --git a/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_5_674f56100c0682eba36cc5327fbdae4a._comment b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_5_674f56100c0682eba36cc5327fbdae4a._comment
new file mode 100644
index 0000000..1546c67
--- /dev/null
+++ b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_5_674f56100c0682eba36cc5327fbdae4a._comment
@@ -0,0 +1,61 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk6z7Jsfi_XWfzFJNZIjYUcjgrthg4aPUU"
+ nickname="Alejandro"
+ subject="Same Trick in Apache"
+ date="2014-09-10T18:58:24Z"
+ content="""
+I got it working with Apache 2.4 and Virtual Hosts on both HTTP 1.1 and HTTPS (SNI). The procedure is somewhat analogous to the nginx procedure above. So here is my set-up, in the hope it will help others avoid this pain.
+
+## Set-up
+
+    CLIENT <---- HTTPS ----> REVERSE PROXY <---- HTTP ----> IKIWIKI
+
+
+## The HTTP to HTTPS Redirect
+
+To ensure that all your HTTP requests are redirected to HTTPS, I chose to use mod_rewrite, because a simple Redirect does not pass query parameters. You will want an HTTP VHost that redirects with something like the one below (notice the subtle ? before the query string). **Note: this will NOT rewrite ikiwiki's http:// URLs (base tag, etc.)**. For that I use a content filter, as you will see below. This HTTP-to-HTTPS redirect is still required, though, both for security and for the /foo/?updated URI form in this set-up.
+
+<pre>
+
+&lt;VirtualHost *:80&gt;
+    ServerName imass.name
+    RewriteEngine On
+    RewriteCond %{HTTPS} off
+    RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI}?%{QUERY_STRING}
+    ErrorLog /var/log/imass.name-error.log
+    LogLevel warn
+    CustomLog /var/log/imass.name-access.log combined
+&lt;/VirtualHost&gt;
+
+</pre>
+
+## The SSL Virtual Host
+
+This part is a bit trickier. First, I am using SNI, as I don't care about non-SNI user agents. Second, you need to use a filter that replaces http:// with https:// before the response is sent. Note that this alone won't deal with ?updated, so you will need the HTTP-to-HTTPS set-up above anyway. Third, I use HTTP Auth, so I don't know whether this will work with your particular auth set-up (although it should, IMHO); YMMV:
+
+<pre>
+
+&lt;VirtualHost *:443&gt;
+    ServerName imass.name
+    ProxyHTMLEnable On
+    ProxyHTMLExtended On
+    SSLEngine on
+    SSLCertificateFile XXX
+    SSLCertificateKeyFile XXX
+    SSLCertificateChainFile XXX
+    SSLOptions +StdEnvVars
+    ProxyPreserveHost On
+    ProxyHTMLURLMap http:// https://
+    ProxyPass / http://192.168.101.101/
+    ProxyPassReverse / http://192.168.101.101/
+    LogLevel warn
+    ErrorLog /var/log/imass.name-ssl-error.log
+    TransferLog \"/var/log/imass.name-ssl-access.log\"
+    CustomLog \"/var/log/imass.name-ssl-request.log\" \"%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \\"%r\\" %b\"
+&lt;/VirtualHost&gt;
+
+</pre>
+
+
+
+"""]]

switch man shortcut to manpages.debian.org (Closes: #700322)
diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn
index b4f6d8e..753bdb9 100644
--- a/doc/shortcuts.mdwn
+++ b/doc/shortcuts.mdwn
@@ -55,7 +55,7 @@ This page controls what shortcut links the wiki supports.
 * [[!shortcut name=whois url="http://reports.internic.net/cgi/whois?whois_nic=%s&type=domain"]]
 * [[!shortcut name=cve url="https://cve.mitre.org/cgi-bin/cvename.cgi?name=%s"]]
 * [[!shortcut name=flickr url="https://secure.flickr.com/photos/%s"]]
-* [[!shortcut name=man url="http://linux.die.net/man/%s"]]
+* [[!shortcut name=man url="http://manpages.debian.org/%s"]]
 * [[!shortcut name=ohloh url="https://www.ohloh.net/p/%s"]]
 * [[!shortcut name=cpanrt url="https://rt.cpan.org/Ticket/Display.html?id=%s" desc="CPAN RT#%s"]]
 * [[!shortcut name=novellbug url="https://bugzilla.novell.com/show_bug.cgi?id=%s" desc="bug %s"]]

clarify further
diff --git a/doc/todo/calendar_autocreate.mdwn b/doc/todo/calendar_autocreate.mdwn
index 8e6a1a0..2a7350b 100644
--- a/doc/todo/calendar_autocreate.mdwn
+++ b/doc/todo/calendar_autocreate.mdwn
@@ -218,8 +218,8 @@ sub gencalendaryear {
 >
 > However, that whole `if` block can be omitted, and you can just use
 > `$changed{$params{year}}{$params{month}} = 1;`, because Perl will automatically
-> create `$changed{$params{year}}` as a reference to an empty hash, in order to
-> put the pair `$params{month} => 1` in it (the term to look
+> create `$changed{$params{year}}` as a reference to an empty hash if necessary,
+> in order to put the pair `$params{month} => 1` in it (the term to look
 > up if you're curious is "autovivification").
 >
 > --[[smcv]]
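
A tiny stand-alone example of the autovivification behaviour described above, for
anyone following along (plain Perl, nothing calendar-specific):

    use strict;
    use warnings;

    my %changed;

    # No need to pre-create $changed{2014}: assigning through the nested key
    # autovivifies $changed{2014} as a hash reference on the way.
    $changed{2014}{9} = 1;

    print ref($changed{2014}), "\n";    # prints: HASH
    print $changed{2014}{9}, "\n";      # prints: 1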

clarify
diff --git a/doc/todo/calendar_autocreate.mdwn b/doc/todo/calendar_autocreate.mdwn
index e25c45c..8e6a1a0 100644
--- a/doc/todo/calendar_autocreate.mdwn
+++ b/doc/todo/calendar_autocreate.mdwn
@@ -212,12 +212,14 @@ sub gencalendaryear {
 >     +  }
 >     +  $changed{$params{year}}{$params{month}} = 1;
 >
-> $changed{$params{year}} is a scalar but `()` is a list. I think you want `{}`
+> `$changed{$params{year}}` is a scalar (you can tell because it starts with the
+> `$` sigil) but `()` is a list. I think you want `{}`
 > (a scalar that is a reference to an empty anonymous hash).
 >
 > However, that whole `if` block can be omitted, and you can just use
 > `$changed{$params{year}}{$params{month}} = 1;`, because Perl will automatically
-> create $changed{$params{year}} as a reference to a hash (the term to look
+> create `$changed{$params{year}}` as a reference to an empty hash, in order to
+> put the pair `$params{month} => 1` in it (the term to look
 > up if you're curious is "autovivification").
 >
 > --[[smcv]]

clarify
diff --git a/doc/todo/calendar_autocreate.mdwn b/doc/todo/calendar_autocreate.mdwn
index 46cfea3..e25c45c 100644
--- a/doc/todo/calendar_autocreate.mdwn
+++ b/doc/todo/calendar_autocreate.mdwn
@@ -189,7 +189,7 @@ sub gencalendaryear {
 >
 >     +    0 0 * * * ikiwiki ~/ikiwiki.setup --refresh
 >
-> I think that should be `ikiwiki --setup ~/ikiwiki.setup`.
+> I think that should be `ikiwiki --setup ~/ikiwiki.setup --refresh`
 >
 > The indentation of some of the new code in `IkiWiki/Plugin/calendar.pm`
 > is weird. Please use one hard tab (U+0009) per indent step: you seem

re-review
diff --git a/doc/todo/calendar_autocreate.mdwn b/doc/todo/calendar_autocreate.mdwn
index 02659d0..46cfea3 100644
--- a/doc/todo/calendar_autocreate.mdwn
+++ b/doc/todo/calendar_autocreate.mdwn
@@ -181,3 +181,43 @@ sub gencalendaryear {
 ---
 
 [[smcv]], can you please go on reviewing this?
+
+> I don't think I'm really the reviewer you want, since I don't have commit
+> access (as you might be able to tell from the number of pending branches
+> I have)... but nobody with commit access seems to be available to do
+> reviews at the moment, so I'm probably the best you're going to get.
+>
+>     +    0 0 * * * ikiwiki ~/ikiwiki.setup --refresh
+>
+> I think that should be `ikiwiki --setup ~/ikiwiki.setup`.
+>
+> The indentation of some of the new code in `IkiWiki/Plugin/calendar.pm`
+> is weird. Please use one hard tab (U+0009) per indent step: you seem
+> to have used a mixture of one hard tab per indent or two spaces
+> per indent, which looks bizarre for anyone whose tab size is not
+> 2 spaces.
+>
+>     +	return unless $config{calendar_autocreate};
+>
+> This is checked in `gencalendaryear` but not in `gencalendarmonth`.
+> Shouldn't `gencalendarmonth` do it too? Alternatively, do the check
+> in `scan`, which calls `gencalendarmonth` directly.
+>
+>     +		my $year  = $date[5] + 1900;
+>
+> You calculate this, but you don't seem to do anything with it?
+>
+>     +  if (not exists $changed{$params{year}}) {
+>     +    $changed{$params{year}} = ();
+>     +  }
+>     +  $changed{$params{year}}{$params{month}} = 1;
+>
+> $changed{$params{year}} is a scalar but `()` is a list. I think you want `{}`
+> (a scalar that is a reference to an empty anonymous hash).
+>
+> However, that whole `if` block can be omitted, and you can just use
+> `$changed{$params{year}}{$params{month}} = 1;`, because Perl will automatically
+> create $changed{$params{year}} as a reference to a hash (the term to look
+> up if you're curious is "autovivification").
+>
+> --[[smcv]]

add alternative (IMO better) branch
diff --git a/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn b/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
index f4bc340..627b2c8 100644
--- a/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
+++ b/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
@@ -56,16 +56,22 @@ Weird... --[[anarcat]]
 > > 
 > > --[[anarcat]]
 
+> > > [[!template  id=gitbranch branch=ready/more-magic author="[[smcv]]" browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/ready/more-magic]]
 > > > If the regex match isn't necessary and it's just about deleting the
-> > > parameters, I think I'd prefer something like
+> > > parameters, I think I'd prefer
 > > >
 > > >     if (! defined $mimetype) {
 > > >         ...
 > > >     }
 > > >     $mimetype =~ s/;.*//;
 > > >
-> > > but I'd be hesitant to do that without knowing why Joey implemented it
-> > > the way it is. If it's about catching a result from file(1) that
+> > > as done in my `ready/more-magic` branch.
+> > >
+> > > I'm a little hesitant to do that without knowing why Joey implemented it
+> > > the way it is, but as far as I can tell it's just an oversight.
+> > >
+> > > Or, if the result of the s/// is checked for a reason, and it's
+> > > about catching a result from file(1) that
 > > > is not, in fact, a MIME type at all (empty string or error message
 > > > or something), maybe something more like this?
 > > >
@@ -76,3 +82,10 @@ Weird... --[[anarcat]]
 > > > > I don't mind either way, but i feel this should be fixed for the next release, as I need to reapply this patch at every upgrade now. -- [[anarcat]]
 
 > > > > > This is still a problem in 3.20140831. -- [[anarcat]]
+
+> > > > > > I still don't think appending a semicolon is the right answer:
+> > > > > > at best it's equivalent to what I suggested, and at worst it's
+> > > > > > disabling a check that does have some reason behind it.
+> > > > > > I've turned the version I suggested above into a proper branch.
+> > > > > > Review by someone who can commit to ikiwiki.git would be appreciated.
+> > > > > > --[[smcv]]
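
To make the two alternatives discussed above concrete, here is a stand-alone sketch;
the `"image/png; charset=binary"` value is an invented example of the kind of string
file(1) can return, and this is not the attachment plugin's actual code:

    use strict;
    use warnings;

    my $mimetype = "image/png; charset=binary";

    # Alternative 1: simply drop any parameters after the type/subtype.
    (my $stripped = $mimetype) =~ s/;.*//;

    # Alternative 2: additionally insist the result looks like type/subtype,
    # so an error message from file(1) is never mistaken for a MIME type.
    my ($checked) = $mimetype =~ m{^\s*([\w.+-]+/[\w.+-]+)};
    defined $checked or die "file(1) did not return a MIME type\n";

    print "$stripped\n$checked\n";    # both lines print: image/png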

branch looks good
diff --git a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
index 2f90f3e..74d8e46 100644
--- a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
+++ b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
@@ -119,3 +119,6 @@ so this would explain the error on cancel, but doesn't explain the weird encodin
 ... and that leads me to this crazy patch which fixes all the above issue, by avoiding double-decoding... go figure that shit out...
 
 [[!template  id=gitbranch branch=anarcat/dev/safe_unicode author="[[anarcat]]"]] 
+
+> [[Looks good to me|users/smcv/ready]] although I'm not sure how valuable
+> the `$] < 5.02 || ` test is - I'd be tempted to just call `is_utf8`. --[[smcv]]

patch
diff --git a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
index c92f407..2f90f3e 100644
--- a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
+++ b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
@@ -115,3 +115,7 @@ called at /usr/bin/ikiwiki line 231
 ~~~~
 
 so this would explain the error on cancel, but doesn't explain the weird encoding i get when editing the page... <sigh>...
+
+... and that leads me to this crazy patch which fixes all the above issue, by avoiding double-decoding... go figure that shit out...
+
+[[!template  id=gitbranch branch=anarcat/dev/safe_unicode author="[[anarcat]]"]] 

Added a comment
diff --git a/doc/forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron/comment_1_abf7ec7c378ab0908685d72d159e9fd2._comment b/doc/forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron/comment_1_abf7ec7c378ab0908685d72d159e9fd2._comment
new file mode 100644
index 0000000..8b066b3
--- /dev/null
+++ b/doc/forum/__34__Error:_cannot_decode_string_with_wide_characters__34___on_Mageia_Linux_x86-64_Cauldron/comment_1_abf7ec7c378ab0908685d72d159e9fd2._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://id.koumbit.net/anarcat"
+ ip="72.0.72.144"
+ subject="comment 1"
+ date="2014-09-10T03:00:22Z"
+ content="""
+i had a similar issue, reported in [[bugs/garbled_non-ascii_characters_in_body_in_web_interface]]. 
+"""]]

more info, dupe?
diff --git a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
index 1c6ffc4..c92f407 100644
--- a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
+++ b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
@@ -99,3 +99,19 @@ http://paste.debian.net/plain/119944
 This is a major bug which should probably be fixed before jessie, yet i can't seem to find a severity statement in reportbug that would justify blocking the release based on this - unless we consider non-english speakers as "most" users (i don't know the demographics well enough). It certainly makes ikiwiki completely unusable for my users that operate on the web interface in french... --[[anarcat]]
 
 Note that on this one page, i can't even get the textarea to display and i immediately get `Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215`: http://anarc.at/ikiwiki.cgi?do=edit&page=hardware%2Fserver%2Fmarcos.
+
+Also note that this is the same as [[forum/"Error: cannot decode string with wide characters" on Mageia Linux x86-64 Cauldron]], I believe. The backtrace I get here is:
+
+~~~~
+Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215. Encode::decode_utf8("**Menu**\x{d}\x{a}\x{d}\x{a} * [[\x{fffd} propos|index]]\x{d}\x{a} * [[Logiciels|software]]"...)
+called at /usr/share/perl5/IkiWiki/CGI.pm line 117 IkiWiki::decode_form_utf8(CGI::FormBuilder=HASH(0x2ad63b8))
+called at /usr/share/perl5/IkiWiki/Plugin/editpage.pm line 90 IkiWiki::cgi_editpage(CGI=HASH(0xd514f8), CGI::Session=HASH(0x27797e0))
+called at /usr/share/perl5/IkiWiki/CGI.pm line 443 IkiWiki::__ANON__(CODE(0xfaa460))
+called at /usr/share/perl5/IkiWiki.pm line 2101 IkiWiki::run_hooks("sessioncgi", CODE(0x2520138))
+called at /usr/share/perl5/IkiWiki/CGI.pm line 443 IkiWiki::cgi()
+called at /usr/bin/ikiwiki line 192 eval {...}
+called at /usr/bin/ikiwiki line 192 IkiWiki::main()
+called at /usr/bin/ikiwiki line 231
+~~~~
+
+so this would explain the error on cancel, but doesn't explain the weird encoding i get when editing the page... <sigh>...

diff --git a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
index e80c52b..1c6ffc4 100644
--- a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
+++ b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
@@ -70,6 +70,19 @@ Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/pe
 >     some_bytes.decode('utf-8').decode('utf-8')
 >
 > --[[smcv]]
+> > 
+> > I couldn't figure out where to set that Carp thing - it doesn't work simply by setting it in /usr/bin/ikiwiki - so i am not sure how to use this. However, with some debugging code in Encode.pm, i was able to find a case of double-encoding - in the left menu, for example, which is the source of the Encode.pm crash.
+> > 
+> > It seems that some Unicode semantics changed in Perl 5.20, or more precisely, in Encode.pm 2.53, according to [this](https://code.activestate.com/lists/perl-unicode/3314/). 5.20 does have significant Unicode changes, but I am not sure they are related (see [perldelta](https://metacpan.org/pod/distribution/perl/pod/perldelta.pod)). Doing more archeology, it seems that Encode.pm is indeed where the problem started, all the way back in [commit 8005a82](https://github.com/dankogai/p5-encode/commit/8005a82d8aa83024d72b14e66d9eb97d82029eeb#diff-f3330aa405ffb7e3fec2395c1fc953ac) (August 2013), taken from [pull request #11](https://github.com/dankogai/p5-encode/pull/11), which expressly forbids double-decoding, in effect failing like Python does in the above example you gave (Perl used to silently succeed instead, a rather big change if you ask me).
+> > 
+> > So stepping back, it seems that this would be a bug in Ikiwiki. It could be in any of those places:
+> > 
+> > ~~~~
+> > anarcat@marcos:ikiwiki$ grep -r decode_utf8 IkiWiki* | wc -l
+> > 31
+> > ~~~~
+> > 
+> > Now the fun part is to determine which one should be turned off... or should we duplicate the logic that was removed in decode_utf8, or make a safe_decode_utf8 for ourselves? --[[anarcat]]
 
 The apache logs yield:
 
@@ -84,3 +97,5 @@ I had put ikiwiki on hold during the last upgrade, so it was upgraded separately
 http://paste.debian.net/plain/119944
 
 This is a major bug which should probably be fixed before jessie, yet i can't seem to find a severity statement in reportbug that would justify blocking the release based on this - unless we consider non-english speakers as "most" users (i don't know the demographics well enough). It certainly makes ikiwiki completely unusable for my users that operate on the web interface in french... --[[anarcat]]
+
+Note that on this one page, i can't even get the textarea to display and i immediately get `Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215`: http://anarc.at/ikiwiki.cgi?do=edit&page=hardware%2Fserver%2Fmarcos.
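
A minimal sketch of the `safe_decode_utf8` idea floated above: only decode when the
string is still a byte string, so passing an already-decoded string through it a
second time is a no-op rather than the fatal error newer Encode versions raise. This
is just an illustration of the idea, not necessarily the patch that ended up being
merged.

    use strict;
    use warnings;
    use Encode qw(decode_utf8);

    sub safe_decode_utf8 {
        my $str = shift;
        # Skip decoding if the string already has the UTF-8 flag set.
        return Encode::is_utf8($str) ? $str : decode_utf8($str);
    }

    my $bytes   = "caf\xc3\xa9";               # UTF-8 bytes for "café"
    my $decoded = safe_decode_utf8($bytes);    # now a character string
    my $again   = safe_decode_utf8($decoded);  # unchanged, no wide-character error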

fixed upstream!
diff --git a/doc/bugs/openid_login_fails_wirth_Could_not_determine_ID_provider_from_URL.mdwn b/doc/bugs/openid_login_fails_wirth_Could_not_determine_ID_provider_from_URL.mdwn
index 5c667b5..073c10d 100644
--- a/doc/bugs/openid_login_fails_wirth_Could_not_determine_ID_provider_from_URL.mdwn
+++ b/doc/bugs/openid_login_fails_wirth_Could_not_determine_ID_provider_from_URL.mdwn
@@ -30,6 +30,8 @@ On some ikiwikis that I run, I get the following error on OpenID logins:
 > > > > by "upstream", i did mean `liblwpx-paranoidagent-perl`. so yeah, maybe this should be punted back into that package's court again. :( --[[anarcat]]
 > > > > 
 > > > > done, by bumping the severity of [[!debbug 744404]] to release-critical. --[[anarcat]]
+> > > > 
+> > > > > ooh cool, the bug was fixed already with an upload, so this should probably be considered [[done]] at this point, even without the patch below! great! -- [[anarcat]]
 
 [[!template  id=gitbranch branch=anarcat/dev/ssl_ca_path author="[[anarcat]]"]] 
 

another possibility
diff --git a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
index 47027c3..eb71994 100644
--- a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
+++ b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
@@ -15,10 +15,24 @@ And the extra newlines break the table.  Can they be safely removed?
 >     \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtd]]
 >     </tr></table>
 >
-> where tagtd.tmpl is of the form `<td>your markup here</td>`.
+> where tagtd.tmpl is of the form `<td>your markup here</td>`; or even just
+>
+>     \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtable]]
+>
+> where tagtable.tmpl looks like
+>
+>     <TMPL_IF FIRST>
+>     <table><tr>
+>     </TMPL_IF>
+>
+>     <td>your tag here</td>
+>
+>     <TMPL_IF LAST>
+>     </tr></table>
+>     </TMPL_IF>
 >
 > I don't think you're deriving much benefit from Markdown's table syntax
-> here, if you have to mix it with HTML::Template and ikiwiki directives,
+> if you have to mix it with HTML::Template and ikiwiki directives,
 > and be pathologically careful with whitespace. "Right tool for the job"
 > and all that :-)
 >

turn this into mdwn
diff --git a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
index b40d5d0..47027c3 100644
--- a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
+++ b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
@@ -1,13 +1,10 @@
 I'm trying to put a list of tags in a table, so I carefully make a newline-free taglist.tmpl and then do:
 
-<pre>
-| [ [!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=taglist] ] |
-</pre>
+    | \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=taglist]] |
 
-but there's a line in <pre>inline.pm</pre> that does:
-<pre>
-        return "&lt;div class=\"inline\" id=\"$#inline\"&gt;&lt;/div&gt;\n\n";
-</pre>
+but there's a line in `inline.pm` that does:
+
+    return "&lt;div class=\"inline\" id=\"$#inline\"&gt;&lt;/div&gt;\n\n";
 
 And the extra newlines break the table.  Can they be safely removed?
 
@@ -23,4 +20,10 @@ And the extra newlines break the table.  Can they be safely removed?
 > I don't think you're deriving much benefit from Markdown's table syntax
 > here, if you have to mix it with HTML::Template and ikiwiki directives,
 > and be pathologically careful with whitespace. "Right tool for the job"
-> and all that :-) --[[smcv]]
+> and all that :-)
+>
+> When I edited this page I was amused to find that you used HTML,
+> not Markdown, as its format. It seems oddly appropriate to my answer, but
+> I've converted it to Markdown and adjusted the formatting, for easier
+> commenting.
+> --[[smcv]]

rename bugs/Inlining_adds_newlines_which_can_break_markdown.html to bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
diff --git a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html
deleted file mode 100644
index b40d5d0..0000000
--- a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html
+++ /dev/null
@@ -1,26 +0,0 @@
-I'm trying to put a list of tags in a table, so I carefully make a newline-free taglist.tmpl and then do:
-
-<pre>
-| [ [!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=taglist] ] |
-</pre>
-
-but there's a line in <pre>inline.pm</pre> that does:
-<pre>
-        return "&lt;div class=\"inline\" id=\"$#inline\"&gt;&lt;/div&gt;\n\n";
-</pre>
-
-And the extra newlines break the table.  Can they be safely removed?
-
-> If you want an HTML table, I would suggest using an HTML table, which
-> should pass through Markdown without being interpreted further:
->
->     <table><tr>
->     \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtd]]
->     </tr></table>
->
-> where tagtd.tmpl is of the form `<td>your markup here</td>`.
->
-> I don't think you're deriving much benefit from Markdown's table syntax
-> here, if you have to mix it with HTML::Template and ikiwiki directives,
-> and be pathologically careful with whitespace. "Right tool for the job"
-> and all that :-) --[[smcv]]
diff --git a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
new file mode 100644
index 0000000..b40d5d0
--- /dev/null
+++ b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.mdwn
@@ -0,0 +1,26 @@
+I'm trying to put a list of tags in a table, so I carefully make a newline-free taglist.tmpl and then do:
+
+<pre>
+| [ [!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=taglist] ] |
+</pre>
+
+but there's a line in <pre>inline.pm</pre> that does:
+<pre>
+        return "&lt;div class=\"inline\" id=\"$#inline\"&gt;&lt;/div&gt;\n\n";
+</pre>
+
+And the extra newlines break the table.  Can they be safely removed?
+
+> If you want an HTML table, I would suggest using an HTML table, which
+> should pass through Markdown without being interpreted further:
+>
+>     <table><tr>
+>     \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtd]]
+>     </tr></table>
+>
+> where tagtd.tmpl is of the form `<td>your markup here</td>`.
+>
+> I don't think you're deriving much benefit from Markdown's table syntax
+> here, if you have to mix it with HTML::Template and ikiwiki directives,
+> and be pathologically careful with whitespace. "Right tool for the job"
+> and all that :-) --[[smcv]]

I would recommend using HTML here
diff --git a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html
index 3a7741f..b40d5d0 100644
--- a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html
+++ b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html
@@ -11,3 +11,16 @@ but there's a line in <pre>inline.pm</pre> that does:
 
 And the extra newlines break the table.  Can they be safely removed?
 
+> If you want an HTML table, I would suggest using an HTML table, which
+> should pass through Markdown without being interpreted further:
+>
+>     <table><tr>
+>     \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtd]]
+>     </tr></table>
+>
+> where tagtd.tmpl is of the form `<td>your markup here</td>`.
+>
+> I don't think you're deriving much benefit from Markdown's table syntax
+> here, if you have to mix it with HTML::Template and ikiwiki directives,
+> and be pathologically careful with whitespace. "Right tool for the job"
+> and all that :-) --[[smcv]]

diff --git a/doc/bugs/redirect.mdwn b/doc/bugs/redirect.mdwn
index 40d1086..87f6a67 100644
--- a/doc/bugs/redirect.mdwn
+++ b/doc/bugs/redirect.mdwn
@@ -27,3 +27,27 @@ then the following command should print 302
 
 >> The CGI spec (http://www.ietf.org/rfc/rfc3875) says that a CGI can cause a redirect by returning a Location: header.
 >> So it's possible; desirable (due to your point about conflicting with git-annex support) is a different matter.
+
+>>> One of the major things that separates ikiwiki from other wiki software
+>>> is that ikiwiki is a wiki compiler: ordinary page-views are purely
+>>> static HTML, and the CGI only gets involved when you do something
+>>> that really has to be dynamic (like an edit).
+>>>
+>>> However, there is no server-independent static content that ikiwiki
+>>> could write out to the destdir that would result in that redirect.
+>>>
+>>> If you're OK with requiring the [[plugins/404]] plugin (and a
+>>> web server where it works, which I think still means Apache) then
+>>> it would be possible to write a plugin that detected symlinks,
+>>> stored them in the `%wikistate`, and used them to make the
+>>> [[plugins/404]] plugin (or its own hook similar to the one
+>>> in that plugin) do a 302 redirect instead of a 404.
+>>> Similarly, a plugin that assumed a suitable Apache
+>>> configuration with fairly broad `AllowOverrides`,
+>>> and wrote out `.htaccess` files, would be a feasible thing
+>>> for someone to write.
+>>>
+>>> I don't think this is a bug; I think it's a request for a
+>>> feature that not everyone will want. The solution to those
+>>> is for someone who wants the feature to
+>>> [[write a plugin|plugins/write]]. --[[smcv]]

diff --git a/doc/bugs/redirect.mdwn b/doc/bugs/redirect.mdwn
index 6296c3d..40d1086 100644
--- a/doc/bugs/redirect.mdwn
+++ b/doc/bugs/redirect.mdwn
@@ -24,3 +24,6 @@ then the following command should print 302
 > In current ikiwiki, you can get a broadly similar effect by either
 > using \[[!meta redir=foo]] (which does a HTML `<meta>` redirect)
 > or reconfiguring the web server. --[[smcv]]
+
+>> The CGI spec (http://www.ietf.org/rfc/rfc3875) says that a CGI can cause a redirect by returning a Location: header.
+>> So it's possible; desirable (due to your point about conflicting with git-annex support) is a different matter.

diff --git a/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html
new file mode 100644
index 0000000..3a7741f
--- /dev/null
+++ b/doc/bugs/Inlining_adds_newlines_which_can_break_markdown.html
@@ -0,0 +1,13 @@
+I'm trying to put a list of tags in a table, so I carefully make a newline-free taglist.tmpl and then do:
+
+<pre>
+| [ [!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=taglist] ] |
+</pre>
+
+but there's a line in <pre>inline.pm</pre> that does:
+<pre>
+        return "&lt;div class=\"inline\" id=\"$#inline\"&gt;&lt;/div&gt;\n\n";
+</pre>
+
+And the extra newlines break the table.  Can they be safely removed?
+

Bugs++
diff --git a/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn b/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn
new file mode 100644
index 0000000..c068c4a
--- /dev/null
+++ b/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn
@@ -0,0 +1,3 @@
+E.g. [[!debwiki Derivatives/Guidelines]].
+
+Maybe we should use `%S` instead of `%s` in the shortcut definition?