Administrative scripts#
blockpageschecker script#
A bot to remove stale protection templates from pages that are no longer protected.
Very often sysops protect pages for a set time but then forget to remove the protection template afterwards. This script removes those stale templates left on such pages.
These command line parameters can be used to specify which pages to work on:
This script supports use of pagegenerators arguments.
Furthermore, the following command line parameters are supported:
- -protectedpages
Check all protected pages; useful when you have no categories or when you have problems with them. Add a namespace number after “:” to restrict the check; by default all protected pages are checked.
- -moveprotected
Same as -protectedpages, for moveprotected pages
This script is a ConfigParserBot. The following options can be set within a settings file, which is scripts.ini by default:
- -always
Don’t ask each time whether the bot should make the change; always do it.
- -show
When the bot can’t delete the template from the page (wrong regex or something like that), it will ask whether it should show the page in your browser.
Attention
Pages included may give false positives!
- -move
The bot will also check whether the page is protected against moving, not only against editing.
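As a ConfigParserBot, the script can read the options above from the scripts.ini settings file. A minimal sketch of such a file (the section name follows the script name; the option values here are purely illustrative) might look like:

```ini
[blockpageschecker]
; apply changes without asking each time
always = True
; offer to open problem pages in the browser
show = True
```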
Examples:
python pwb.py blockpageschecker -always
python pwb.py blockpageschecker -cat:Geography -always
python pwb.py blockpageschecker -show -protectedpages:4
delete script#
This script can be used to delete and undelete pages en masse.
Of course, you will need an admin account on the relevant wiki.
These command line parameters can be used to specify which pages to work on:
This script supports use of pagegenerators arguments.
Furthermore, the following command line parameters are supported:
- -always
Don’t prompt to delete pages, just do it.
- -summary:XYZ
Set the summary message text for the edit to XYZ.
- -undelete
Actually undelete pages instead of deleting. Obviously makes sense only with -page and -file.
- -isorphan
Alert if there are pages that link to the page to be deleted (check ‘What links here’). By default it is active and only the summary per namespace is given. If given as -isorphan:n, n pages per namespace will be shown. If given as -isorphan:0, only the summary per namespace will be shown. If given as -isorphan:n with n < 0, the option is disabled. This option is disregarded if -always is set.
- -orphansonly:
Use only the specified namespaces. Separate multiple namespace numbers or names with commas. Examples:
-orphansonly:0,2,4 -orphansonly:Help,MediaWiki
Note that the Main namespace can be indicated either with a 0 or by leaving the item before the comma empty:
-orphansonly:0,1 -orphansonly:,Talk
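The comma rules above can be sketched with a small parser. This is an illustration only, and parse_orphansonly is a hypothetical helper, not the script’s actual code:

```python
def parse_orphansonly(value: str) -> list:
    """Illustrative parser for a comma-separated namespace list.

    An empty item (as in ',Talk') stands for the Main namespace,
    mirroring the note above; numeric ids and namespace names are
    both accepted.
    """
    namespaces = []
    for item in value.split(','):
        item = item.strip()
        if item in ('', '0'):
            namespaces.append(0)          # Main namespace
        elif item.isdigit():
            namespaces.append(int(item))  # numeric namespace id
        else:
            namespaces.append(item)       # namespace name, e.g. 'Talk'
    return namespaces
```

For example, parse_orphansonly(',Talk') yields [0, 'Talk'], treating the leading empty item as the Main namespace.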
Usage:
python pwb.py delete [-category categoryName]
Examples
Delete everything in the category “To delete” without prompting:
python pwb.py delete -cat:"To delete" -always
patrol script#
This bot marks edits as patrolled based on information obtained from a whitelist.
It obtains a list of recent changes and new pages and marks the edits as patrolled if they match that whitelist.
Whitelist Format#
The whitelist is formatted as a number of list entries. Any links outside of lists are ignored and can be used for documentation. In a list the first link must be to the username which should be white listed and any other link following is adding that page to the white list of that username. If the user edited a page on their white list it gets patrolled. It will also patrol pages which start with the mentioned link (e.g. [[foo]] will also patrol [[foobar]]).
To avoid redlinks it’s possible to use Special:PrefixIndex as a prefix so that it will list all pages which will be patrolled. The page after the slash will be used then.
On Wikisource, it’ll also check if the page is on the author namespace in which case it’ll also patrol pages which are linked from that page.
An example can be found at https://en.wikisource.org/wiki/User:Wikisource-bot/patrol_whitelist
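The prefix rule described above can be sketched in a few lines. is_whitelisted here is a hypothetical helper for illustration, not part of the script:

```python
def is_whitelisted(title: str, whitelisted_pages: list) -> bool:
    """Illustrative check of the prefix rule: an entry 'foo'
    covers 'foo' itself and any title starting with it, such
    as 'foobar'.
    """
    return any(title.startswith(prefix) for prefix in whitelisted_pages)
```

So with ['foo'] as a user’s whitelist, an edit to 'foobar' would be patrolled, while an edit to 'bar' would not.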
Commandline parameters:
- -namespace
Filter the page generator to only yield pages in specified namespaces
- -ask
If True, confirm each patrol action
- -whitelist
page title for whitelist (optional)
- -autopatroluserns
With the user’s consent, automatically patrol their edits within their own user namespace
- -versionchecktime
Interval, in seconds, at which to re-check the whitelist page for a newer version
- -repeat
Repeat run after 60 seconds
- -newpages
Run on unpatrolled new pages (default for Wikipedia Projects)
- -recentchanges
Run on complete unpatrolled recentchanges (default for any project except Wikipedia Projects)
- -usercontribs
Filter generators above to the given user
protect script#
This script can be used to protect and unprotect pages en masse.
Of course, you will need an admin account on the relevant wiki. These command line parameters can be used to specify which pages to work on:
This script supports use of pagegenerators arguments.
Furthermore, the following command line parameters are supported:
- -unprotect
Acts like “-default:all”
- -default:
Sets the default protection level (default ‘sysop’). If given without a level, unspecified levels are left unchanged.
-[type]:[level] Set [type] protection level to [level]
Usual values for [level] are: sysop, autoconfirmed, all; further levels may be provided by some wikis.
For all protection types (edit, move, etc.) it chooses the default protection level. This is “sysop” or “all” if -unprotect was selected. If multiple parameters -unprotect or -default are used, only the last occurrence is applied.
This script is a ConfigParserBot. The following options can be set within a settings file, which is scripts.ini by default:
- -always
Don’t prompt to protect pages, just do it.
- -summary:
Supply a custom edit summary. The script tries to generate a summary from the page selector; if no summary is supplied and one can’t be determined from the selector, it will ask for one.
- -expiry:
Supply a custom protection expiry, which defaults to indefinite. Any string understandable by MediaWiki, including relative and absolute, is acceptable. See: API:Protect#Parameters
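The “last occurrence wins” behaviour of -unprotect and -default described above can be sketched as follows. resolve_default_level is a hypothetical helper for illustration, not the script’s actual option parser:

```python
def resolve_default_level(args: list) -> str:
    """Illustrative resolution of -default/-unprotect: scan the
    arguments in order, so that when both appear the last
    occurrence determines the default protection level.
    """
    level = 'sysop'  # documented default
    for arg in args:
        if arg == '-unprotect':
            level = 'all'          # -unprotect acts like -default:all
        elif arg == '-default':
            level = 'sysop'        # bare -default restores the default
        elif arg.startswith('-default:'):
            level = arg[len('-default:'):]
    return level
```

For example, with ['-unprotect', '-default:autoconfirmed'] the resulting level is 'autoconfirmed', while reversing the order yields 'all'.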
Usage:
python pwb.py protect <OPTIONS>
Examples
Protect everything in the category ‘To protect’ prompting:
python pwb.py protect -cat:"To protect"
Unprotect all pages listed in text file ‘unprotect.txt’ without prompting:
python pwb.py protect -file:unprotect.txt -unprotect -always
revertbot script#
This script can be used for reverting certain edits.
The following command line parameters are supported:
- -username
Revert edits made by the given user. Default is the bot’s username (site.username()).
- -rollback
Rollback edits instead of reverting them.
Note
No diff will be shown in this mode.
- -limit:num
(int) Use the last num contributions to be checked for revert. Default is 500.
Users who want to customize the behaviour should subclass the BaseRevertBot and override its callback method. Here is a sample:
import re

import pywikibot
from scripts.revertbot import BaseRevertBot


class myRevertBot(BaseRevertBot):

    """Example revert bot."""

    def callback(self, item) -> bool:
        """Sample callback function for 'private' revert bot.

        :param item: an item from user contributions
        :type item: dict
        """
        # Only consider the user's top (most recent) revisions.
        if 'top' in item:
            page = pywikibot.Page(self.site, item['title'])
            text = page.get(get_redirect=True)
            # Revert only if the page still contains a namespaced
            # link with a dotted suffix, e.g. [[File:Example.jpg]].
            pattern = re.compile(r'\[\[.+?:.+?\..+?\]\]')
            return bool(pattern.search(text))
        return False
speedy_delete script#
Help sysops to quickly check and/or delete pages listed for speedy deletion.
This bot trawls through candidates for speedy deletion in a fast and semi-automated fashion. It displays the contents of each page one at a time and provides a prompt for the user to skip or delete the page. Of course, this will require a sysop account.
Planned options include the ability to untag a page as not eligible for speedy deletion, as well as the option to commute its sentence to Proposed Deletion (see [[en:WP:PROD]] for more details). Also, if the article text is long, it might be a good idea to truncate it to the first few thousand bytes to avoid flooding the terminal.
Warning
This tool shows the contents of the top revision only. It is possible that a vandal has replaced a perfectly good article with nonsense, which has subsequently been tagged by someone who didn’t realize it was previously a good article. The onus is on you to avoid making these mistakes.
Note
This script currently only works for the Wikipedia project.