X-Git-Url: https://sipb.mit.edu/gitweb.cgi/ikiwiki.git/blobdiff_plain/3ebb012e3fbb456a322ff546b9ec222050d5cf49..d7e749572c9e6c731d9e90f95bfedb200d98ddca:/doc/plugins/aggregate.mdwn

diff --git a/doc/plugins/aggregate.mdwn b/doc/plugins/aggregate.mdwn
index 21a8105d0..75123d923 100644
--- a/doc/plugins/aggregate.mdwn
+++ b/doc/plugins/aggregate.mdwn
@@ -1,28 +1,17 @@
-[[template id=plugin name=aggregate author="[[Joey]]"]]
-[[tag type/useful]]
+[[!template id=plugin name=aggregate author="[[Joey]]"]]
+[[!tag type/special-purpose]]
 
-This plugin allows content from other feeds to be aggregated into the wiki.
-Aggregate a feed as follows:
+This plugin allows content from other feeds to be aggregated into the
+wiki. To specify feeds to aggregate, use the
+[[ikiwiki/directive/aggregate]] [[ikiwiki/directive]].
 
-	\[[aggregate name="example blog" dir="example"
-	feedurl="http://example.com/index.rss"
-	url="http://example.com/" updateinterval="15"]]
+## requirements
 
-That example aggregates posts from the specified RSS feed, updating no
-more frequently than once every 15 minutes, and puts a page per post under
-the example/ directory in the wiki.
+The [[meta]] and [[tag]] plugins are also recommended to be used with this
+one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
+feeds can easily contain html problems, some of which these plugins can fix.
 
-You can then use ikiwiki's [[ikiwiki/blog]] support to create a blog of one or
-more aggregated feeds. For example:
-
-	\[[inline pages="internal(example/*)"]]
-
-## setup
-
-Make sure that you have the [[html]] plugin enabled, as the created pages are
-in html format. The [[meta]] and [[tag]] plugins are also recommended. The
-[[htmltidy]] plugin is suggested, since feeds can easily contain html
-problems, some of which tidy can fix.
+## triggering aggregation
 
 You will need to run ikiwiki periodically from a cron job, passing it the
 --aggregate parameter, to make it check for new posts. Here's an example
@@ -30,6 +19,11 @@ crontab entry:
 
 	*/15 * * * * ikiwiki --setup my.wiki --aggregate --refresh
 
+The plugin updates a file `.ikiwiki/aggregatetime` with the unix time stamp
+when the next aggregation run could occur. (The file may be empty, if no
+aggregation is required.) This can be integrated into more complex cron
+jobs or systems to trigger aggregation only when needed.
+
 Alternatively, you can allow `ikiwiki.cgi` to trigger the aggregation. You
 should only need this if for some reason you cannot use cron, and instead
 want to use a service such as [WebCron](http://webcron.org). To enable
@@ -38,63 +32,26 @@ visit is `http://whatever/ikiwiki.cgi?do=aggregate_webtrigger`. Anyone
 can visit the url to trigger an aggregation run, but it will only check
 each feed if its `updateinterval` has passed.
 
-## usage
-
-Here are descriptions of all the supported parameters to the `aggregate`
-directive:
-
-* `name` - A name for the feed. Each feed must have a unique name.
-  Required.
-* `url` - The url to the web page for the feed that's being aggregated.
-  Required.
-* `dir` - The directory in the wiki where pages should be saved. Optional,
-  if not specified, the directory is based on the name of the feed.
-* `feedurl` - The url to the feed. Optional, if it's not specified ikiwiki
-  will look for feeds on the `url`. RSS and atom feeds are supported.
-* `updateinterval` - How often to check for new posts, in minutes. Default
-  is 15 minutes.
-* `expireage` - Expire old items from this feed if they are older than
-  a specified number of days. Default is to never expire on age.
-* `expirecount` - Expire old items from this feed if there are more than
-  the specified number total. Oldest items will be expired first. Default
-  is to never expire on count.
-* `tag` - A tag to tag each post from the feed with. A good tag to use is
-  the name of the feed. Can be repeated multiple times. The [[tag]] plugin
-  must be enabled for this to work.
-* `template` - Template to use for creating the aggregated pages. Defaults to
-  aggregatepost.
-
-Note that even if you are using subversion or another revision control
-system, pages created by aggregation will *not* be checked into revision
-control.
-
-## internal pages
+## aggregated pages
 
 This plugin creates a page for each aggregated item.
 
-Currently, by default, these pages have the ".html" extension, and are
-first-class wiki pages -- which allows them to be inlined into blogs
-and even edited.
-
-That turns out to not be ideal for aggregated content, because publishing
-files for each of those pages is a waste of disk space and CPU, and you probably
-don't want to allow them to be edited. So, there is an alternate method
-that can be used, turned on by the `aggregateinternal` option in the setup
-file.
-
-If `aggregateinternal` is enabled, aggregated pages are stored in the source
-directory with a "._aggregate" extension. These pages cannot be edited by
-web users, and do not generate first-class wiki pages. They can only be
-inlined into a blog.
-
-If you are already using aggregate and want to enable `aggregateinternal`,
-you should follow this process:
-
-1. Update all [[PageSpecs|ikiwiki/PageSpec]] that refer to the aggregated
-   pages -- such as those in inlines. Put "internal()" around globs
-   in those PageSpecs. For example, if the PageSpec was "foo/*", it should
-   be changed to "internal(foo/*)". This has to be done because internal
-   pages are not matched by regular globs.
-2. Use [[ikiwiki-transition]] to move all existing aggregated `.html`
-   files. The command to run is `ikiwiki-transition aggregateinternal $srcdir`
-3. Turn on `aggregateinternal` in the setup file and rebuild the wiki.
+If the `aggregateinternal` option is enabled in the setup file (which is
+the default), aggregated pages are stored in the source directory with a
+"._aggregated" extension. These pages cannot be edited by web users, and
+do not generate first-class wiki pages. They can still be inlined into a
+blog, but you have to use `internal` in [[PageSpecs|IkiWiki/PageSpec]],
+like `internal(blog/*)`.
+
+If `aggregateinternal` is disabled, you will need to enable the [[html]]
+plugin as well as aggregate itself, since feed entries will be stored as
+HTML, and as first-class wiki pages -- each one generates
+a separate HTML page in the output, and they can even be edited. This
+option is provided only for backwards compatibility.
+
+## cookies
+
+The `cookiejar` option can be used to configure how [[!cpan LWP::UserAgent]]
+handles cookies. The default is to read them from a file
+`~/.ikiwiki/cookies`, which can be populated using standard perl cookie
+tools like [[!cpan HTTP::Cookies]].
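
The `.ikiwiki/aggregatetime` stamp introduced by this patch can drive a
smarter cron job than the fixed `*/15` example. Here is a minimal sketch
in Perl, assuming the wiki source is `~/wiki` and the setup file is
`~/wiki.setup` (both paths are illustrative, not taken from the patch):

	#!/usr/bin/perl
	# Run aggregation only when .ikiwiki/aggregatetime says a run is due.
	# Adjust $srcdir and $setup for your wiki; these are assumptions.
	use strict;
	use warnings;

	my $srcdir = "$ENV{HOME}/wiki";
	my $setup  = "$ENV{HOME}/wiki.setup";

	# An empty or missing file means no aggregation is currently required.
	open(my $fh, '<', "$srcdir/.ikiwiki/aggregatetime") or exit 0;
	my $stamp = <$fh>;
	close $fh;
	exit 0 unless defined $stamp && $stamp =~ /^(\d+)/;

	# Only aggregate once the scheduled unix time stamp has been reached.
	exit 0 if time < $1;
	exec 'ikiwiki', '--setup', $setup, '--aggregate', '--refresh';

Scheduled frequently from cron (say, once a minute), this exits almost
immediately unless some feed's `updateinterval` has actually elapsed.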
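In a Perl-style setup file, the `cookiejar` option from the new cookies
section might be written as `cookiejar => { file =>
"$ENV{HOME}/.ikiwiki/cookies" }` (a guess at spelling out the documented
default). The cookie file itself can be seeded with [[!cpan
HTTP::Cookies]]; a sketch, with a made-up cookie name, value, and domain:

	#!/usr/bin/perl
	# Seed the cookie file that aggregate reads by default. The cookie
	# shown is a placeholder; real values depend on the feed's site.
	use strict;
	use warnings;
	use HTTP::Cookies;

	my $jar = HTTP::Cookies->new(
		file     => "$ENV{HOME}/.ikiwiki/cookies",
		autosave => 1,
	);

	# set_cookie(version, key, value, path, domain, port, path_spec,
	# secure, maxage, discard): a nonzero maxage makes the cookie
	# persistent, so it survives into the saved file.
	$jar->set_cookie(0, 'sessionid', 'example-value', '/', '.example.com',
		undef, undef, undef, 30 * 24 * 60 * 60, 0);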