Maybe this is obvious, but the `%config` variable lives in the `IkiWiki` module, and one probably wants to call `defaultconfig` first for most applications:
    use IkiWiki;
    use IkiWiki::Setup;
    %IkiWiki::config = IkiWiki::defaultconfig();
    IkiWiki::Setup::load($config_file);    # $config_file: path to the setup file
    print join(",", keys %IkiWiki::config);

[[DavidBremner]]
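(Once the setup file is loaded, individual settings can be read from the hash as usual, e.g. `$IkiWiki::config{srcdir}` or `$IkiWiki::config{destdir}`.)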
I'm a little concerned about one aspect of the `%wikistate` variable that was just introduced. I think global state for each plugin is a fine idea, but I worry about making it persist across rebuilds. (And by rebuild, I assume we're talking about the `--rebuild` option.) My reasoning is that a rebuild should be similar to checking out a new copy of the wiki and building it; another way of saying this is that all permanent state should be in the RCS. It is great that there is temporary state stored in other places - I think of it as indexing and caching. I'm worried that with the persistence, plugin writers will start putting data there that isn't backed by the RCS, and that will break IkiWiki's great abilities as a distributed wiki. [[Will]]

> Well, if you look at state that already persists across rebuilds, we have
> `pagectime`, which can be extracted from the RCS only very slowly in many
> cases. There's also the separate state stored by the aggregate plugin,
> which is indeed independent of the RCS, and can in some cases not be
> replicated by rebuilding a different checkout (if the data is gone from
> the feeds). Then there's the session cookie database, and the user
> database, which started out with a lot of local state and has been
> whittled down by removing admin prefs and subscriptions, but still holds
> important state, including password hashes.
>
> So while I take your point about the potential for abuse, there are
> certainly legitimate reasons to need to store data across rebuilds. And
> plugins have always been able to drop their own files in wikistatedir, as
> aggregate does, and have them persist, so the abuse potential has always
> been there; the barrier has been lowered only slightly.
>
> OTOH, if something can be added to the documentation that encourages
> good behavior, that'd be a good thing ... --[[Joey]]

---

Since there's no mailing list, I'll post my request for help here :-)

I would like to use ikiwiki to build a static site which needs some transformations to be made on binary assets. A simple example is to translate a .odp presentation to .pdf using (e.g.) unoconv. I'd probably make a plugin with a config option which maps extensions to shell commands. But what's the right place to hook in to do this? I can see that binary assets are normally hardlinked or copied verbatim.

The logic of `sub render` in `IkiWiki/Render.pm` is:

* If the private hash `$rendered{$file}` is already set, skip the file.
* If the extension is known to `pagetype()`, i.e. it has been registered for the htmlize hook, send the content through the full cycle of `genpage(htmlize(linkify(preprocess(filter(readfile)))))`...
* ...except for extensions which start with an underscore, in which case the processing is aborted before the write.
* Any file whose extension is unknown to `pagetype()` is either hardlinked or copied verbatim to the target directory.

Options I can see are:

* Register .odp as an htmlize extension; in the scan() hook, write out the file and alter the page name so that its extension starts with an underscore (xxx.odp -> xxx._odp).
* Use the scan() hook, write out the file, and directly manipulate the private `%rendered` hash to stop `sub render` from handling it.
* Use needsbuild to convert the file as a side effect and at the same time remove it from the list of pages to be built (sketched below).
* Some other way?

It's not clear to me which of these is the right way to go, taking into account all the existing logic for rebuilding pages on demand. (For example: if I `git add` and push a new .odp to the repository, I want the .pdf to be generated automatically in the output site through the post-commit hook.) [[BrianCandler]]
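As a rough illustration of the third option above, here is a minimal, untested sketch. The plugin name `odp2pdf` and all of its internals are invented for illustration; it assumes unoconv accepts `-f pdf -o <directory>`, hardcodes the .odp -> .pdf mapping instead of the suggested extension-to-command config map, and glosses over the rebuild-on-demand questions:

    #!/usr/bin/perl
    # Hypothetical sketch: convert .odp sources to .pdf in the destdir,
    # using the needsbuild hook to claim the files before the normal
    # render logic hardlinks or copies them.
    package IkiWiki::Plugin::odp2pdf;

    use warnings;
    use strict;
    use IkiWiki 3.00;
    use File::Temp qw(tempdir);

    sub import {
        hook(type => "needsbuild", id => "odp2pdf", call => \&needsbuild);
    }

    sub needsbuild {
        my $files = shift;
        my @keep;
        foreach my $file (@$files) {
            if ($file =~ /^(.*)\.odp$/) {
                my $dest = "$1.pdf";
                (my $base = $1) =~ s!.*/!!;    # basename, sans extension
                # record that $file renders $dest, so the .pdf is
                # cleaned up when the .odp goes away
                will_render($file, $dest);
                # unoconv writes $base.pdf into the -o directory
                my $tmp = tempdir(CLEANUP => 1);
                system("unoconv", "-f", "pdf", "-o", $tmp,
                    srcfile($file)) == 0
                    or error("unoconv failed on $file");
                writefile($dest, $config{destdir},
                    readfile("$tmp/$base.pdf", 1), 1);
            }
            else {
                # let ikiwiki handle everything else as usual
                push @keep, $file;
            }
        }
        return \@keep;
    }

    1

Since the converted file never reaches the normal render path, none of the underscore or `%rendered` tricks from the other options are needed; whether this interacts correctly with every rebuild path is exactly the open question above.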
---

I would find this page clearer split up into sub-pages. Does anyone agree/disagree? -- [[users/Jon]]