When a page is opened for editing, a lock entry is created. If the edit is not completed by saving the page (even unchanged), the lock remains and has to be removed manually by following the link offered on the next edit attempt. This behavior is implied by the lock concept and is not a bug, but it may come as a surprise.
There is a bug in PWP which may cause static pages to contain dynamic links instead of links to HTML pages. The workaround is to clear the cache before creating static pages. This bug will be resolved in a future release.
Depending on the PHP version and/or the OS, newly created Wiki pages may appear empty after saving them. A reload makes the page appear completely. This error does not seem to occur with PHP 4.2 on Win2K or SuSE 7.1. The cause is a missing call to the function 'clearstatcache()'. PHP is lazy about reading file information from a directory; the information remains cached even if a file changes its size. 'clearstatcache()' forces a reload of all cached information. All user input gets correctly stored in a data file; there is no data loss.
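A minimal sketch of the fix; the function and file names are examples, not PWP's actual read routine:

```php
<?php
// Hypothetical page-read routine illustrating the fix.
function readPageFile($fileName)
{
    // Without this call, PHP may report a stale (cached) file size,
    // so fread() would return old or empty content for a page
    // that was just written.
    clearstatcache();
    $fp = fopen($fileName, 'rb');
    $content = fread($fp, filesize($fileName));
    fclose($fp);
    return $content;
}
?>
```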
Under OS X and other Unix-based operating systems there may be "hidden" system files visible in the data, upload and trash directories. Their file names start with a dot. As long as you do not touch these files, everything is OK. Further PWP releases will pay attention to this OS-specific behaviour and hide all files starting with a dot.
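Until then, a directory listing can simply skip such entries; a sketch, assuming a plain readdir() loop:

```php
<?php
// List a directory, skipping every entry that starts with a dot:
// '.', '..', and Unix hidden files such as '.DS_Store'.
function listVisibleFiles($dir)
{
    $files = array();
    $handle = opendir($dir);
    while (($entry = readdir($handle)) !== false) {
        if ($entry[0] !== '.') {
            $files[] = $entry;
        }
    }
    closedir($handle);
    return $files;
}
?>
```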
PWP uses the PHP function "htmlentities()", which seems to cause problems with multibyte languages. As a workaround, configure your PWP installation to allow HTML tags and restrict the tag set to a single harmless tag only:
$this->mProp['AllowedTags'] = '<br>';
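If you prefer to patch the source instead, htmlentities() accepts an explicit character set as its third parameter (available since PHP 4.1); without it, PHP assumes ISO-8859-1 and can garble multibyte input. 'UTF-8' here is an example and must match your pages' encoding:

```php
// Naming the encoding keeps multibyte characters intact.
$safe = htmlentities($text, ENT_QUOTES, 'UTF-8');
```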
Spiders, crawlers, search robots and whatever else is out there indexing web sites may 'accidentally' delete your Wiki pages into the trash bin and mix up your revision history. The robots simply follow every link on a page - and there is an 'erase' link on every page; the Wiki page index contains a whole bunch of them! This scenario affects open Wikis only. Installations protected by .htaccess or a similar mechanism are on the safe side.
Solution: Guide the search engines to your static pages and deny access to the dynamic part by providing a robots.txt in your web server's root directory. For more information about this file, just search the web...
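A sketch of such a file; the path is an example and must match your own installation layout:

```
# Keep robots out of the dynamic Wiki script; static pages
# elsewhere on the server remain indexable.
User-agent: *
Disallow: /wiki/
```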
A future release will turn the critical links into JavaScript links, which search engines generally do not follow, preventing them from erasing pages.
(Powered by PWP Version 1.4.0)