MediaWiki REL1_39
Job to purge the HTML/file cache for all pages that link to or use another page or file. More...
Public Member Functions
  __construct (Title $title, array $params)
  getDeduplicationInfo ()
      Subclasses may need to override this to make duplication detection work.
  run ()
      Run the job.
  workItemCount ()
Public Member Functions inherited from Job
  __construct ( $command, $params=null)
  allowRetries ()
  getLastError ()
  getMetadata ( $field=null)
  getParams ()
  getQueuedTimestamp ()
  getReadyTimestamp ()
  getReleaseTimestamp ()
  getRequestId ()
  getRootJobParams ()
  getTitle ()
  getType ()
  hasExecutionFlag ( $flag)
  hasRootJobParams ()
  ignoreDuplicates ()
      Whether the queue should reject insertion of this job if a duplicate exists.
  isRootJob ()
  setMetadata ( $field, $value)
  tearDown ( $status)
  toString ()
Public Member Functions inherited from RunnableJob
  tearDown ( $status)
      Do any final cleanup after run(), deferred updates, and all DB commits happen.
Static Public Member Functions
  static newForBacklinks (PageReference $page, $table, $params=[])

Static Public Member Functions inherited from Job
  static factory ( $command, $params=[])
      Create the appropriate object to handle a specific job.
  static newRootJobParams ( $key)
      Get "root job" parameters for a task.
Protected Member Functions
  invalidateTitles (array $pages)

Protected Member Functions inherited from Job
  addTeardownCallback ( $callback)
  setLastError ( $error)
Additional Inherited Members

Public Attributes inherited from Job
  string $command
  array $metadata = []
      Additional queue metadata.
  array $params
      Array of job parameters.

Protected Attributes inherited from Job
  string $error
      Text for error that occurred last.
  int $executionFlags = 0
      Bitfield of JOB_* class constants.
  bool $removeDuplicates = false
      Expensive jobs may set this to true.
  callable[] $teardownCallbacks = []
  Title $title
Detailed Description

Job to purge the HTML/file cache for all pages that link to or use another page or file.

This job comes in a few variants:
Definition at line 37 of file HTMLCacheUpdateJob.php.
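For context, a hedged sketch of how such a job is typically enqueued in a MediaWiki REL1_39 environment. The template title, the choice of the templatelinks backlink table, and the causeAction parameter are illustrative assumptions, not part of this page:

```php
use MediaWiki\MediaWikiServices;

// Illustrative: purge the HTML/file caches of every page that transcludes
// a given template by queueing an HTMLCacheUpdateJob against the
// templatelinks backlink table.
$title = Title::newFromText( 'Template:Infobox' );
$job = HTMLCacheUpdateJob::newForBacklinks(
	$title,           // PageReference whose backlink pages should be purged
	'templatelinks',  // backlink table to walk
	[ 'causeAction' => 'edit' ] // additional job parameters (assumption)
);
MediaWikiServices::getInstance()->getJobQueueGroup()->push( $job );
```

The job runner then picks the job up asynchronously; the actual purging happens in run().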
HTMLCacheUpdateJob::__construct (Title $title, array $params)
Definition at line 41 of file HTMLCacheUpdateJob.php.
References $title.
HTMLCacheUpdateJob::getDeduplicationInfo ()
Subclasses may need to override this to make duplication detection work.
The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.
Reimplemented from Job.
Definition at line 181 of file HTMLCacheUpdateJob.php.
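To illustrate the override described above, here is a minimal sketch (not the actual HTMLCacheUpdateJob code) of a Job subclass that strips a volatile parameter from the deduplication map, so two jobs differing only in that field are treated as duplicates. The class name and the requestTime parameter are hypothetical:

```php
class ExamplePurgeJob extends Job {
	public function __construct( Title $title, array $params ) {
		parent::__construct( 'examplePurge', $params );
		// Opt in to duplicate detection (see Job::$removeDuplicates above).
		$this->removeDuplicates = true;
	}

	public function getDeduplicationInfo() {
		$info = parent::getDeduplicationInfo();
		// Drop a hypothetical volatile field so logically identical
		// jobs hash to the same deduplication key.
		unset( $info['params']['requestTime'] );
		return $info;
	}

	public function run() {
		// ... do the purge work ...
		return true;
	}
}
```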
protected HTMLCacheUpdateJob::invalidateTitles (array $pages)

Parameters:
  $pages  Map of (page ID => (namespace, DB key)) entries
Definition at line 119 of file HTMLCacheUpdateJob.php.
References DB_PRIMARY, TitleArray\newFromResult(), and wfTimestampOrNull().
Referenced by run().
static HTMLCacheUpdateJob::newForBacklinks (PageReference $page, $table, $params=[])

Parameters:
  $page    Page to purge backlink pages from
  $table   Backlink table name
  $params  Additional job parameters
Definition at line 62 of file HTMLCacheUpdateJob.php.
References Job\$params, $title, and Job\newRootJobParams().
HTMLCacheUpdateJob::run ()
Run the job.
Implements RunnableJob.
Definition at line 76 of file HTMLCacheUpdateJob.php.
References $t, $title, Job\getRootJobParams(), invalidateTitles(), and BacklinkJobUtils\partitionBacklinkJob().
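As the references above suggest, run() hands oversized backlink ranges to BacklinkJobUtils::partitionBacklinkJob(), which splits the work into smaller sub-jobs before invalidateTitles() purges each batch. A standalone sketch of the batching idea only (not the real partitioning code, which also tracks title ranges and root-job parameters):

```php
// Illustrative: split a large list of page IDs into fixed-size work items,
// one per sub-job, the way a backlink purge is fanned out.
function partitionPageIds( array $pageIds, int $batchSize ): array {
	return array_chunk( $pageIds, $batchSize );
}

$batches = partitionPageIds( range( 1, 250 ), 100 );
// yields 3 batches of 100, 100, and 50 page IDs
```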
HTMLCacheUpdateJob::workItemCount ()
Reimplemented from Job.
Definition at line 195 of file HTMLCacheUpdateJob.php.