tests.site_detect_tests module#

Test for site detection.

class tests.site_detect_tests.MediaWikiSiteTestCase(*args, **kwargs)[source]#

Bases: DisableSiteMixin, SiteDetectionTestCase

Test detection of MediaWiki sites.

abstract_class = False#
failing_sites = [('http://wikisophia.org/index.php?title=$1', '/index.php?title=$1 reports 404, however a wiki exists there, but the API is also hidden'), ('http://wiki.animutationportal.com/index.php/$1', 'SSL certificate verification fails'), ('http://wiki.opensprints.org/index.php?title=$1', 'offline'), ('http://musicbrainz.org/doc/$1', 'Possible false positive caused by the existence of a page called http://musicbrainz.org/doc/api.php.')]#
net = True#
no_sites = ('http://www.imdb.com/name/nm$1/', 'http://www.ecyrd.com/JSPWiki/Wiki.jsp?page=$1', 'http://www.tvtropes.org/pmwiki/pmwiki.php/Main/$1', 'http://c2.com/cgi/wiki?$1', 'https://phabricator.wikimedia.org/$1', 'http://www.merriam-webster.com/cgi-bin/dictionary?book=Dictionary&va=$1', 'http://arxiv.org/abs/$1', 'https://wikimediafoundation.org/$1')#
non_standard_version_sites = ('https://wiki.gentoo.org/wiki/$1', 'https://www.arabeyes.org/$1')#
old_version_sites = ('http://tfwiki.net/wiki/$1', 'http://www.hrwiki.org/index.php/$1', 'http://www.thelemapedia.org/index.php/$1', 'http://www.werelate.org/wiki/$1', 'http://www.otterstedt.de/wiki/index.php/$1', 'https://en.wikifur.com/wiki/$1')#
standard_version_sites = ('http://www.ck-wissen.de/ckwiki/index.php?title=$1', 'http://en.citizendium.org/wiki/$1', 'http://www.wikichristian.org/index.php?title=$1', 'http://kb.mozillazine.org/$1')#
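Each entry above is a URL template in which $1 stands for the article title, the same placeholder pywikibot family files use. As a hedged, illustrative sketch (not part of the test suite; the lang and version attribute names follow the tests in this module and may vary between pywikibot releases), one of the standard_version_sites templates could be probed by hand like this:

    from pywikibot.site_detect import MWSite

    # MWSite fetches the page behind the template, locates api.php and
    # reads the site info; construction raises if no MediaWiki
    # installation can be detected at the URL.
    site = MWSite('http://kb.mozillazine.org/$1')
    print(site.lang, site.version)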
test_failing_sites()[source]#

Test detection of failing MediaWiki sites.

test_no_sites()[source]#

Test detection of non-MediaWiki sites.

test_non_standard_version_sites()[source]#

Test detection of non-standard MediaWiki sites.

test_old_version_sites()[source]#

Test detection of old MediaWiki sites.

test_proofreadwiki()[source]#

Test detection of proofwiki.org site.

test_standard_version_sites()[source]#

Test detection of standard MediaWiki sites.

class tests.site_detect_tests.PrivateWikiTestCase(*args, **kwargs)[source]#

Bases: DisableSiteMixin, PatchingTestCase

Test generate_family_file works for private wikis.

APIPATH = '/w/api.php'#
LANG = 'ike-cans'#
NETLOC = 'privatewiki.example.com'#
SCHEME = 'https'#
SCRIPTPATH = '/w'#
Site(code=None, fam=None, user=None, *args, **kwargs)[source]#

Patched version of pywikibot.Site.

USERNAME = 'Private Wiki User'#
VERSION = '1.33.0'#
WEBPATH = '/wiki/'#
abstract_class = False#
fetch(url, *args, **kwargs)[source]#

Patched version of pywikibot.site_detect.fetch.
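On a real private wiki an anonymous api.php request is refused, so detection cannot rely on the usual siteinfo reply; the patched fetch stands in for that behaviour without any network traffic. A hedged sketch of the kind of response body such a wiki returns (the exact payload served by the patched fetch is an assumption here):

    import json

    # Typical MediaWiki reply to an unauthenticated API request on a
    # private wiki; site detection has to fall back to the HTML pages
    # when it receives this instead of siteinfo.
    READAPIDENIED = json.dumps({
        'error': {
            'code': 'readapidenied',
            'info': 'You need read permission to use this module.',
        },
    })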

input(question, *args, **kwargs)[source]#

Patched version of pywikibot.input.

test_T235768_failure()[source]#

Test generate_family_file works for private wikis.

generate_family_file.FamilyFileGenerator.run() does:

    w = self.Wiki(self.base_url)
    self.wikis[w.lang] = w

where self.Wiki is pywikibot.site_detect.MWSite.__init__. That constructor calls MWSite._parse_post_117(), which sets lang, but the call is wrapped so that exceptions are only logged before execution continues. In T235768 the code path that handles private wikis raises an exception that is consumed in exactly that way, so the MWSite handed back to FamilyFileGenerator.run() never gets lang set and generate_family_file fails.
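A hedged sketch of what the regression check amounts to under this class's patches (Site, fetch and input are all replaced, so nothing touches the network; the assertions are illustrative rather than the test's literal body):

    from pywikibot.site_detect import MWSite

    # Under the patched fetch() this URL resolves to the simulated
    # private wiki defined by the class constants above.  Before the
    # T235768 fix the private-wiki code path raised inside
    # MWSite.__init__, the exception was swallowed and lang stayed
    # unset; after the fix detection still yields a site object that
    # FamilyFileGenerator.run() can index by language.
    site = MWSite('https://privatewiki.example.com/wiki/$1')
    assert site.lang == 'ike-cans'      # LANG constant above
    assert site.version == '1.33.0'     # VERSION constant above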

class tests.site_detect_tests.SiteDetectionTestCase(*args, **kwargs)[source]#

Bases: TestCase

Testcase for MediaWiki detection and site object creation.

abstract_class = True#
assertNoSite(url)[source]#

Assert a url is not a MediaWiki site.

Parameters:

url (str) – Url of tested site

Raises:

AssertionError – Site under url is MediaWiki powered

assertSite(url)[source]#

Assert a MediaWiki site can be loaded from the url.

Parameters:

url (str) – Url of tested site

Raises:

AssertionError – Site under url is not MediaWiki powered
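Both helpers take only the URL of the site under test. A hedged usage sketch for a hypothetical subclass (the URLs are illustrative; the second one comes from MediaWikiSiteTestCase.no_sites above):

    class ExampleDetectionTests(SiteDetectionTestCase):

        """Illustrative subclass, not part of the shipped test suite."""

        def test_detection(self):
            # A MediaWiki installation must be detected ...
            self.assertSite('https://en.wikipedia.org/wiki/$1')
            # ... while a non-MediaWiki page must not be.
            self.assertNoSite('http://www.imdb.com/name/nm$1/')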

net = True#