MediaWiki REL1_31
Job to purge the cache for all pages that link to or use another page or file. More...
Public Member Functions
  __construct (Title $title, array $params)
  getDeduplicationInfo ()
    Subclasses may need to override this to make duplication detection work.
  run ()
    Run the job.
  workItemCount ()

Public Member Functions inherited from Job
  __construct ($command, $title, $params=false)
  allowRetries ()
  getLastError ()
  getParams ()
  getQueuedTimestamp ()
  getReadyTimestamp ()
  getReleaseTimestamp ()
  getRequestId ()
  getRootJobParams ()
  getTitle ()
  getType ()
  hasExecutionFlag ($flag)
  hasRootJobParams ()
  ignoreDuplicates ()
    Whether the queue should reject insertion of this job if a duplicate exists.
  insert ()
    Insert a single job into the queue.
  isRootJob ()
  teardown ($status)
    Do any final cleanup after run(), deferred updates, and all DB commits happen.
  toString ()

Static Public Member Functions
  static newForBacklinks (Title $title, $table, $params=[])

Static Public Member Functions inherited from Job
  static batchInsert ($jobs)
    Batch-insert a group of jobs into the queue.
  static factory ($command, Title $title, $params=[])
    Create the appropriate object to handle a specific job.
  static newRootJobParams ($key)
    Get "root job" parameters for a task.

Protected Member Functions
  invalidateTitles (array $pages)

Protected Member Functions inherited from Job
  addTeardownCallback ($callback)
  setLastError ($error)

Additional Inherited Members

Public Attributes inherited from Job
  string $command
  array $metadata = []
    Additional queue metadata.
  array $params
    Array of job parameters.

Protected Attributes inherited from Job
  string $error
    Text for error that occurred last.
  int $executionFlags = 0
    Bitfield of JOB_* class constants.
  bool $removeDuplicates
    Expensive jobs may set this to true.
  callable[] $teardownCallbacks = []
  Title $title
Job to purge the cache for all pages that link to or use another page or file.
This job comes in a few variants: recursive jobs that purge every page linking to or using another page or file (see newForBacklinks()), and jobs that purge an explicit set of pages passed via the 'pages' parameter.
Definition at line 38 of file HTMLCacheUpdateJob.php.
HTMLCacheUpdateJob::__construct ( Title $title, array $params )
Definition at line 39 of file HTMLCacheUpdateJob.php.
References Job\$params.
HTMLCacheUpdateJob::getDeduplicationInfo ( )
Subclasses may need to override this to make duplication detection work.
The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.
Reimplemented from Job.
Definition at line 179 of file HTMLCacheUpdateJob.php.
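As an illustration (not the REL1_31 implementation itself), a subclass might override getDeduplicationInfo() along these lines, stripping parameters that vary between otherwise-identical jobs so the queue can recognise duplicates; the class name and the 'requestTimestamp' parameter are hypothetical:

```php
class ExampleCacheJob extends Job {
	public function getDeduplicationInfo() {
		// Start from the base map of (type, namespace, title, params).
		$info = parent::getDeduplicationInfo();
		if ( is_array( $info['params'] ) ) {
			// Hypothetical volatile parameter: it differs on every insert,
			// so excluding it lets two logically identical jobs deduplicate.
			unset( $info['params']['requestTimestamp'] );
		}
		return $info;
	}
}
```

Note this map is only consulted when ignoreDuplicates() returns true for the job type.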
HTMLCacheUpdateJob::invalidateTitles ( array $pages ) [protected]

Parameters
  array  $pages  Map of (page ID => (namespace, DB key)) entries
Definition at line 112 of file HTMLCacheUpdateJob.php.
References $batch, $urls, $wgPageLanguageUseDB, $wgUpdateRowsPerQuery, $wgUseFileCache, HTMLFileCache\clearFileCache(), DB_MASTER, TitleArray\newFromResult(), CdnCacheUpdate\purge(), wfGetDB(), and wfTimestampNow().
Referenced by run().
static HTMLCacheUpdateJob::newForBacklinks ( Title $title, $table, $params = [] ) [static]

Parameters
  Title   $title   Title to purge backlink pages from
  string  $table   Backlink table name
  array   $params  Additional job parameters
Definition at line 59 of file HTMLCacheUpdateJob.php.
References Job\$params, Job\$title, and Job\newRootJobParams().
Referenced by HTMLCacheUpdate\doUpdate().
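Since HTMLCacheUpdate\doUpdate() is the documented caller, a minimal sketch of enqueueing such a job yourself (assuming the REL1_31 JobQueueGroup API; the choice of the 'templatelinks' backlink table is illustrative) might look like:

```php
// Purge the caches of every page whose templatelinks entry points at
// $title, i.e. every page that transcludes it.
$job = HTMLCacheUpdateJob::newForBacklinks(
	$title,           // Title to purge backlink pages from
	'templatelinks'   // backlink table name
);
// lazyPush defers the queue insert until the end of the request.
JobQueueGroup::singleton()->lazyPush( $job );
```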
HTMLCacheUpdateJob::run ( )
Run the job.
Reimplemented from Job.
Definition at line 71 of file HTMLCacheUpdateJob.php.
References $t, Job\$title, $wgUpdateRowsPerJob, $wgUpdateRowsPerQuery, Job\getRootJobParams(), invalidateTitles(), BacklinkJobUtils\partitionBacklinkJob(), and JobQueueGroup\singleton().
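For the non-recursive variant, a job can instead carry an explicit page set; a hedged sketch (the page ID and DB key are made up, and the exact 'pages' shape is inferred from the $pages map documented for invalidateTitles() above):

```php
// Queue a purge for one explicitly listed page.
// For per-pages jobs the job title itself is not what gets purged.
$job = new HTMLCacheUpdateJob(
	Title::newMainPage(),
	[ 'pages' => [ 123 => [ 0, 'Some_page' ] ] ] // page ID => (namespace, DB key)
);
JobQueueGroup::singleton()->push( $job );
```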
HTMLCacheUpdateJob::workItemCount ( )
Reimplemented from Job.
Definition at line 193 of file HTMLCacheUpdateJob.php.