MediaWiki master
Job to purge the HTML/file cache for all pages that link to or use another page or file.
Inherits MediaWiki\JobQueue\Job.

Public Member Functions

__construct (Title $title, array $params)

getDeduplicationInfo ()
    Subclasses may need to override this to make duplication detection work. The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.

run ()
    Run the job. If this method returns false or completes exceptionally, the job runner will retry executing this job unless the number of retries has exceeded its configured retry limit. Retries are allowed by default, unless allowRetries() is overridden to disable retries. See the architecture doc for more information.

workItemCount ()
Public Member Functions inherited from MediaWiki\JobQueue\Job

__construct ($command, $params=null)
allowRetries ()
    Whether to retry execution of this job if run() returned false or threw an exception.
getLastError ()
getMetadata ($field=null)
getParams ()
getQueuedTimestamp ()
getReadyTimestamp ()
getReleaseTimestamp ()
getRequestId ()
getRootJobParams ()
getTitle ()
getType ()
hasExecutionFlag ($flag)
hasRootJobParams ()
ignoreDuplicates ()
    Whether the queue should reject insertion of this job if a duplicate exists.
isRootJob ()
setMetadata ($field, $value)
teardown ($status)
toString ()

Public Member Functions inherited from MediaWiki\JobQueue\RunnableJob

tearDown ($status)
    Do any final cleanup after run(), deferred updates, and all DB commits happen.
Static Public Member Functions

static newForBacklinks (PageReference $page, $table, $params=[])

Static Public Member Functions inherited from MediaWiki\JobQueue\Job

static factory ($command, $params=[])
    Create the appropriate object to handle a specific job.
static newRootJobParams ($key)
    Get "root job" parameters for a task.
Protected Member Functions

invalidateTitles (array $pages)

Protected Member Functions inherited from MediaWiki\JobQueue\Job

addTeardownCallback ($callback)
setLastError ($error)
Additional Inherited Members

Public Attributes inherited from MediaWiki\JobQueue\Job

string $command
array $metadata = []
    Additional queue metadata.
array $params
    Array of job parameters.

Protected Attributes inherited from MediaWiki\JobQueue\Job

string $error
    Text for error that occurred last.
int $executionFlags = 0
    Bitfield of JOB_* class constants.
bool $removeDuplicates = false
    Expensive jobs may set this to true.
callable[] $teardownCallbacks = []
Title $title
Job to purge the HTML/file cache for all pages that link to or use another page or file.
This job comes in a few variants: recursive jobs that purge the caches of all backlink pages of a given title (see newForBacklinks()), and jobs that purge the caches of an explicit set of pages (see invalidateTitles()).
Definition at line 29 of file HTMLCacheUpdateJob.php.
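As a rough, standalone illustration of the page-set variant: invalidateTitles() below documents its $pages argument as a map of (page ID => (namespace, DB key)) entries. The sketch models that shape and a plausible reading of workItemCount() (one work item per page); the 'pages' key name and the helper are illustrative assumptions, not the documented API.

```php
<?php
// Standalone sketch (no MediaWiki required): the "pages" map shape that
// invalidateTitles() documents — page ID => [namespace, DB key].
// The 'pages' key name is an illustrative assumption.
$params = [
    'pages' => [
        42 => [ 0, 'Main_Page' ],       // namespace 0 (main)
        7  => [ 10, 'Infobox_person' ], // namespace 10 (template)
    ],
];

// A helper mirroring a plausible workItemCount(): one item per page entry.
function sketchWorkItemCount( array $params ): int {
    return isset( $params['pages'] ) ? count( $params['pages'] ) : 1;
}

echo sketchWorkItemCount( $params ), "\n"; // 2
```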
MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob::__construct (Title $title, array $params)
Definition at line 33 of file HTMLCacheUpdateJob.php.
References MediaWiki\JobQueue\Job\$params, and MediaWiki\JobQueue\Job\$title.
MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob::getDeduplicationInfo ()
Subclasses may need to override this to make duplication detection work. The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.
Reimplemented from MediaWiki\JobQueue\Job.
Definition at line 173 of file HTMLCacheUpdateJob.php.
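The idea behind deduplication info can be sketched in plain PHP, outside MediaWiki: build a map from the job type, title, and parameters, but drop parameters that do not affect what work is done, so two logically identical jobs compare equal. The volatile key names stripped below are illustrative assumptions, not the documented parameter set.

```php
<?php
// Standalone sketch of deduplication info: two jobs count as duplicates
// when their identifying fields match after volatile parameters are
// removed. The volatile key names here are illustrative assumptions.
function sketchDeduplicationInfo( string $type, string $title, array $params ): array {
    // Timestamps vary per enqueue but do not change the work performed.
    unset( $params['rootJobTimestamp'], $params['jobReleaseTimestamp'] );
    ksort( $params ); // make the comparison order-insensitive
    return [ 'type' => $type, 'title' => $title, 'params' => $params ];
}

$a = sketchDeduplicationInfo( 'htmlCacheUpdate', 'Main_Page',
    [ 'table' => 'templatelinks', 'rootJobTimestamp' => '20240101000000' ] );
$b = sketchDeduplicationInfo( 'htmlCacheUpdate', 'Main_Page',
    [ 'rootJobTimestamp' => '20240202000000', 'table' => 'templatelinks' ] );

var_dump( $a === $b ); // bool(true): duplicates despite different timestamps
```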
MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob::invalidateTitles (array $pages) [protected]

Parameters
    array $pages    Map of (page ID => (namespace, DB key)) entries
Definition at line 111 of file HTMLCacheUpdateJob.php.
References MediaWiki\MediaWikiServices\getInstance(), MediaWiki\MainConfigNames\PageLanguageUseDB, MediaWiki\MainConfigNames\UpdateRowsPerQuery, wfTimestamp(), and wfTimestampOrNull().
Referenced by MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob\run().
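invalidateTitles() references UpdateRowsPerQuery, which caps how many rows a single purge query may touch. The batching pattern that implies can be sketched standalone; the chunking helper and update callback below are hypothetical stand-ins, not the real implementation.

```php
<?php
// Standalone sketch: split a page-ID map into batches so each database
// update touches at most $rowsPerQuery rows, as UpdateRowsPerQuery suggests.
function sketchBatchedInvalidate( array $pages, int $rowsPerQuery, callable $updateBatch ): int {
    $batches = array_chunk( array_keys( $pages ), $rowsPerQuery );
    foreach ( $batches as $idBatch ) {
        // A real implementation would run one UPDATE per batch here,
        // e.g. UPDATE page SET page_touched = ... WHERE page_id IN (...).
        $updateBatch( $idBatch );
    }
    return count( $batches );
}

$pages = array_fill_keys( range( 1, 250 ), [ 0, 'X' ] );
$queries = sketchBatchedInvalidate( $pages, 100, function ( array $ids ) {
    // placeholder for the per-batch database update
} );
echo $queries, "\n"; // 3 batches: 100 + 100 + 50
```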
MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob::newForBacklinks (PageReference $page, $table, $params=[]) [static]

Parameters
    PageReference $page    Page to purge backlink pages from
    string $table    Backlink table name
    array $params    Additional job parameters
Definition at line 54 of file HTMLCacheUpdateJob.php.
References MediaWiki\JobQueue\Job\$params, MediaWiki\JobQueue\Job\$title, and MediaWiki\JobQueue\Job\newRootJobParams().
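newForBacklinks() references Job::newRootJobParams(), which stamps a job with "root job" parameters so that later duplicate root jobs for the same task can be discarded. A standalone sketch of that shape (the rootJob* key names and the signature scheme are illustrative assumptions):

```php
<?php
// Standalone sketch of "root job" parameters: a stable signature derived
// from a task key, plus the enqueue timestamp. Key names are illustrative.
function sketchRootJobParams( string $key ): array {
    return [
        'rootJobSignature' => sha1( $key ),      // same key => same signature
        'rootJobTimestamp' => gmdate( 'YmdHis' ), // orders competing root jobs
    ];
}

$params = sketchRootJobParams( 'htmlCacheUpdate:templatelinks:Main_Page' );
$again  = sketchRootJobParams( 'htmlCacheUpdate:templatelinks:Main_Page' );

// Two root jobs for the same task share a signature; their timestamps
// let the queue keep only the newest one.
var_dump( $params['rootJobSignature'] === $again['rootJobSignature'] );
```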
MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob::run ()
Run the job. If this method returns false or completes exceptionally, the job runner will retry executing this job unless the number of retries has exceeded its configured retry limit. Retries are allowed by default, unless allowRetries() is overridden to disable retries. See the architecture doc for more information.
Returns
    false to instruct the job runner to retry a failed job. Otherwise return true to indicate that the job completed (i.e. succeeded, or failed in a way that's deterministic or redundant).

Implements MediaWiki\JobQueue\RunnableJob.
Definition at line 68 of file HTMLCacheUpdateJob.php.
References MediaWiki\JobQueue\Job\$title, MediaWiki\MediaWikiServices\getInstance(), MediaWiki\JobQueue\Job\getRootJobParams(), MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob\invalidateTitles(), MediaWiki\MainConfigNames\UpdateRowsPerJob, and MediaWiki\MainConfigNames\UpdateRowsPerQuery.
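The retry contract described above — false (or an exception) means retry, up to a configured limit — can be modelled standalone. The loop below is a hypothetical stand-in for the real JobRunner, just to make the contract concrete:

```php
<?php
// Standalone sketch of the retry contract: keep retrying while run()
// returns false, up to $maxTries attempts. Not the real JobRunner.
function sketchRunWithRetries( callable $run, int $maxTries ): array {
    for ( $attempt = 1; $attempt <= $maxTries; $attempt++ ) {
        if ( $run() === true ) {
            return [ 'ok' => true, 'attempts' => $attempt ];
        }
    }
    return [ 'ok' => false, 'attempts' => $maxTries ];
}

$failuresLeft = 2;
$result = sketchRunWithRetries( function () use ( &$failuresLeft ) {
    return --$failuresLeft < 0; // fail twice, then succeed
}, 5 );

print_r( $result ); // succeeds on the 3rd attempt
```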
MediaWiki\JobQueue\Jobs\HTMLCacheUpdateJob::workItemCount ()
Reimplemented from MediaWiki\JobQueue\Job.
Definition at line 188 of file HTMLCacheUpdateJob.php.