PWP wiki processor

Installation


 -Quick install
 -System requirements
 -Access rights and security considerations
 -Data handling
 -Directory structure

+Don't panic.+ For a first look at PWP you only need to read the first section; you will be up and running in a few minutes.

Quick install

  1. Just unzip everything into the root directory (or any subdirectory) of a PHP enabled web server. You do not have to adapt path names etc., just get started. If PWP resides in the root directory, you may use / or /wiki to start PWP; /static calls the static directory index file, and /help contains this manual.
  2. Unix, Mac OS X: Adjust the directory rights if needed. PWP must be able to write within the directory /wiki and its subdirectories, and of course in /static (see the check script after this list).
  3. /wiki/conf/Config.inc and the other config files can be edited later to change HTML keywords, InterWiki link targets, etc.
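
If you are not sure whether the rights are correct, a few lines of PHP can check them for you. This is only a minimal sketch - the file name check.php and the directory list are my own choice, not part of PWP:

 <?php
 // check.php - put it into the PWP root directory and open it in the browser
 $dirs = array('wiki', 'wiki/data', 'wiki/upload', 'wiki/history',
               'wiki/trash', 'wiki/cache', 'static');
 foreach ($dirs as $dir) {
     // is_writable() tests the rights of the web server user (PHP 4 built-in)
     echo $dir . ': ' . (is_writable($dir) ? 'writable' : 'NOT writable') . "<br>\n";
 }
 ?>

Delete the script again once everything reports 'writable'.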

System requirements

PHP 4.2 is required - that's all.

register_globals=off is supported; all variables are fetched from $_REQUEST, etc. PWP uses the built-in Perl style regular expression functions (preg_xxx) of PHP 4.
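
As an illustration only (this is not a quote from the PWP sources), fetching a request variable with register_globals=off and checking it with a preg_xxx function looks like this:

 <?php
 // works with register_globals=off: the value is taken from $_REQUEST
 $page = isset($_REQUEST['page']) ? $_REQUEST['page'] : 'StartPage';

 // preg_match() is one of the Perl style preg_xxx functions of PHP 4
 if (!preg_match('/^[A-Za-z0-9]+$/', $page)) {
     $page = 'StartPage'; // fall back on anything but a plain page name
 }
 ?>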

The application itself requires less than 0.33 MB; in addition you will need sufficient web space for your Wiki pages and especially for the uploaded files.

Access rights and security considerations

PWP basically supports the two scenarios described in the following paragraphs. The first is to run a completely open Wiki where everyone is allowed to do everything. The second is to grant everyone access to the static pages but restrict access to the Wiki itself to a closed group of editors.

What PWP does not support is ownership-based rights management, e.g. that only the 'owner' of a page can erase it. Such a feature is beyond the concept of PWP. If you rely on ownership-based rights, you should consider using a 'real' content management system (CMS); some can be found starting from the Links page.

Open Wiki

An open Wiki supports the original idea: collaboration and information exchange in a group or community. Everyone can join and amend data. It is recommended to configure the MaxAge of files in the history and in the trash to be at least 7 days. This means a file has to sit in the trash or in the history for 7 days before it can be physically erased. That protects an open Wiki against a delete attack: you can undo most changes even after a few days.
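
What such a setting might look like in /wiki/conf/Config.inc - the variable names below are only illustrative, check your config file for the real keys:

 // hypothetical option names - see /wiki/conf/Config.inc for the real ones
 $MaxAgeHistory = 7; // days a revision must stay in /wiki/history
 $MaxAgeTrash   = 7; // days a deleted file must stay in /wiki/trash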

Another danger for open Wikis is search engine spiders, also known as search robots. Such a robot stupidly follows every link on a page - even the 'erase' links. You have two choices for handling this issue: install a 'robots.txt' file as shown below, or restrict directory access as described under 'Closed group of editors'.

Here is a sample 'robots.txt' file:

User-agent: *
Disallow: /my_path/wiki/

The path after Disallow must match the part of the URL that follows the server name, i.e. the first part of an absolute link. There is no explicit 'Allow' directive; a robot simply follows all links starting from a URL or web page that was registered with the search engine.

You should consider disallowing HTML input, which is actually the default setting in your config, or restricting HTML input to a few safe tags. Otherwise visitors might inject JavaScript calls via DHTML attributes like 'onclick' or 'onfocus'.
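
If you do allow a few safe tags, make sure such event attributes cannot slip through. A minimal sketch of the idea - this is not the filter PWP actually uses:

 <?php
 // $input holds the visitor's raw HTML; keep only harmless formatting tags
 $html = strip_tags($input, '<b><i><em><strong><p>');
 // strip_tags() leaves attributes on the allowed tags untouched, so remove
 // DHTML event handlers such as onclick or onfocus explicitly
 $html = preg_replace('/\son\w+\s*=\s*("[^"]*"|\'[^\']*\'|[^\s>]+)/i', '', $html);
 ?>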

The last recommendation is to disable the web configuration of variables. An open Wiki should be able to work without variables.

Closed group of editors

PWP itself does not implement any user rights management. The reason is simple: your web server already does - I didn't want to re-implement such a feature for the nth time. There would be the danger of coding new bugs and, further, PWP would force the user to memorize one more password. Using the web server's access management, a user can work with several applications while having one centrally managed password.

The disadvantage of this concept is that there are no action-based access restrictions. Every editor is allowed to execute every action: unerase pages, create static pages, etc. You have to form a trustworthy group of editors.

The principle for a closed group of editors is simple: you have to restrict access to the /wiki/ directory and all its subdirectories. You may decide to restrict access to the /static/ directory as well.

The access restriction to directories also solves the "search robots problem" described above under 'Open Wiki'.

Apache manages access rights per directory using a '.htaccess' file. Maybe your web provider has already installed a web-based tool for managing '.htaccess' files. Otherwise you can follow this short guide:

Go to {webdir}/wiki and edit the existing .htaccess file. Protect the directory and all subdirectories by defining "require valid-user":

 AuthUserFile {file_system_path_to_.htpasswd}
 AuthGroupFile {file_system_path_to_.htgroups}
 AuthName {MyAuthRealm}
 AuthType Basic

 require valid-user

At {file_system_path_to_.htpasswd} there must be a file '.htpasswd' which contains one line of text for each user. Each line consists of the user name and an encrypted password. This file is generated by a tool: on Unix the 'htpasswd' utility ships with Apache; for Windows just google for 'htpasswd.exe'.
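
On a Unix server without shell access you can also create such a line with PHP itself, because Apache accepts crypt() encrypted passwords there (this does not work for Apache on Windows). User name and password below are placeholders, of course:

 <?php
 // run once in the browser, then copy the output line into .htpasswd
 echo 'editor:' . crypt('secret'); // user 'editor' with password 'secret'
 ?>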

Another recommendation for (local) Windows environments is the Sambar web server »http://www.sambar.com. There are a free and a professional version: the free one has a nice web-based user management function, and the professional one also supports user management based on an NT domain controller.

Data handling

Creating a backup of your Wiki pages

All your Wiki pages reside in the directory /wiki/data. Every Wiki page is in its own file, and the file name matches the Wiki page name. The uploaded files are stored in the directory /wiki/upload. It is sufficient to back up these two directories.

The older revisions of your pages are in the directory /wiki/history. Every file name corresponds to the name of a Wiki page or an uploaded file; all files in this directory have a unique numeric file name extension. Decide for yourself whether it is worth keeping older revisions in your backup. It is advisable on an open Wiki, where you might otherwise realise too late that someone edited an important page and saved it empty.

The directory /wiki/trash holds the deleted files, both Wiki pages and uploaded files. Make a backup of it if you are running an open Wiki.

In any case: ignore the cache in /wiki/cache.

Getting a copy of the static pages

Simply generate static web pages using the menu item 'Extra', then copy the /static directory and all its subdirectories for a backup.

Static pages do not contain Wiki mark-up; they consist of 100% static HTML. The static pages can be copied to CD-ROM or any other medium. All path entries inside the static pages are relative (unless YOU linked images using an absolute path, see FAQ).

File time and FTP backup

PWP features like RecentChanges, etc. are based on the file attributes 'created' and 'last modified'. An FTP based file transfer exchanges only the file contents itself - not the file attributes. If you back up your Wiki data using FTP, all time based file attributes will be set to the current time, i.e. all files appear to have been created and modified just now.

A workaround is to store your data files in a zip archive, which preserves the file times, and transport the zip file via FTP.
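
A small helper script for this - assuming the zip utility is installed on the server, since PHP 4 has no built-in zip writer; script and archive name are my own choice:

 <?php
 // backup.php - pack the data directories; zip records the file times,
 // so they survive the FTP transfer
 system('zip -r backup.zip wiki/data wiki/upload wiki/history wiki/trash');
 echo 'done - now download backup.zip';
 ?>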

Directory structure

For reference, these are the directories mentioned throughout this manual:

 /wiki           the application itself
 /wiki/conf      configuration files such as Config.inc
 /wiki/data      the Wiki pages, one file per page
 /wiki/upload    uploaded files
 /wiki/history   older revisions of pages and uploads
 /wiki/trash     deleted files
 /wiki/cache     the page cache (no backup needed)
 /static         the generated static pages
 /help           this manual

Do you want to proceed to Download?

   (Powered by PWP Version 1-4-3)