MediaWiki master
RefreshLinksJob Class Reference

Job to update link tables for rerendered wiki pages.

Inherits Job.
Public Member Functions
__construct (PageIdentity $page, array $params)
getDeduplicationInfo ()
 Subclasses may need to override this to make duplication detection work.
run ()
 Run the job.
workItemCount ()
Public Member Functions inherited from Job
__construct ( $command, $params=null)
allowRetries ()
 Whether to retry execution of this job if run() returned false or threw an exception.
getLastError ()
getMetadata ( $field=null)
getParams ()
getQueuedTimestamp ()
getReadyTimestamp ()
getReleaseTimestamp ()
getRequestId ()
getRootJobParams ()
getTitle ()
getType ()
hasExecutionFlag ( $flag)
hasRootJobParams ()
ignoreDuplicates ()
 Whether the queue should reject insertion of this job if a duplicate exists.
isRootJob ()
setMetadata ( $field, $value)
teardown ( $status)
toString ()
Public Member Functions inherited from RunnableJob
tearDown ( $status) | ||||
Do any final cleanup after run(), deferred updates, and all DB commits happen. | ||||
Static Public Member Functions
static newDynamic (PageIdentity $page, array $params)
static newPrioritized (PageIdentity $page, array $params)
Static Public Member Functions inherited from Job
static factory ( $command, $params=[])
 Create the appropriate object to handle a specific job.
static newRootJobParams ( $key)
 Get "root job" parameters for a task.
Protected Member Functions
runForTitle (PageIdentity $pageIdentity)
Protected Member Functions inherited from Job
addTeardownCallback ( $callback)
setLastError ( $error)
Additional Inherited Members
Public Attributes inherited from Job
string $command
array $metadata = []
 Additional queue metadata.
array $params
 Array of job parameters.
Protected Attributes inherited from Job
string $error
 Text for error that occurred last.
int $executionFlags = 0
 Bitfield of JOB_* class constants.
bool $removeDuplicates = false
 Expensive jobs may set this to true.
callable[] $teardownCallbacks = []
Title $title
Job to update link tables for rerendered wiki pages.
This job comes in a few variants (a usage sketch follows the list):
- a) Recursive jobs to update links for the backlink pages of a given title. These jobs have recursive:true and a table name set.
- b) Jobs to update links for a set of pages (the job title is ignored). These jobs have a pages map of page IDs to (namespace, title) pairs set.
- c) Jobs to update links for a single page (the job title). These jobs need no extra fields set.
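A minimal sketch of queuing a single-page refresh (variant c), assuming a caller with access to the service container; the page name and the empty parameter list are illustrative:

    use MediaWiki\MediaWikiServices;
    use MediaWiki\Title\Title;

    // Illustrative target page; Title implements PageIdentity.
    $page = Title::newFromText( 'Project:Sandbox' );

    // Variant c: refresh the link tables of this one page.
    $job = new RefreshLinksJob( $page, [] );

    // lazyPush() defers the queue insert until the end of the request.
    MediaWikiServices::getInstance()->getJobQueueGroup()->lazyPush( $job );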
Job parameters for all jobs:
Metrics:
- refreshlinks_superseded_updates_total: The number of times the job was cancelled because the target page had already been refreshed by a different edit or job. The job is considered to have succeeded in this case.
- refreshlinks_warnings_total: The number of times the job failed due to a recoverable issue. Possible reason label values include:
  - lag_wait_failed: The job timed out while waiting for replication.
- refreshlinks_failures_total: The number of times the job failed. The reason label may be:
  - page_not_found: The target page did not exist.
  - rev_not_current: The target revision was no longer the latest revision for the target page.
  - rev_not_found: The target revision was not found.
  - lock_failure: The job failed to acquire an exclusive lock to refresh the target page.
- refreshlinks_parsercache_operations_total: The number of times the job attempted to fetch parser output from the parser cache. Possible status label values include:
  - cache_hit: The parser output was found in the cache.
  - cache_miss: The parser output was not found in the cache.

Definition at line 110 of file RefreshLinksJob.php.
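For context, counters of this kind are emitted through MediaWiki's stats library; the following is a rough sketch of the pattern, not the job's actual instrumentation code:

    use MediaWiki\MediaWikiServices;

    $statsFactory = MediaWikiServices::getInstance()->getStatsFactory();

    // Record a recoverable warning, e.g. a replication-wait timeout.
    $statsFactory->getCounter( 'refreshlinks_warnings_total' )
        ->setLabel( 'reason', 'lag_wait_failed' )
        ->increment();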
RefreshLinksJob::__construct ( PageIdentity $page, array $params )
Definition at line 116 of file RefreshLinksJob.php.
References $params, and MediaWiki\Page\PageIdentity\canExist().
RefreshLinksJob::getDeduplicationInfo ()
Subclasses may need to override this to make duplication detection work.
The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.
Reimplemented from Job.
Definition at line 615 of file RefreshLinksJob.php.
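To illustrate the contract, a hypothetical subclass (ExampleLinksJob and its causeTimestamp parameter are assumptions, not core code) might exclude a volatile parameter so otherwise-identical jobs are detected as duplicates:

    class ExampleLinksJob extends Job {
        /** @var bool Opt in to duplicate detection (makes ignoreDuplicates() return true). */
        protected $removeDuplicates = true;

        public function getDeduplicationInfo() {
            $info = parent::getDeduplicationInfo();
            if ( is_array( $info['params'] ) ) {
                // A timestamp-like parameter would make every job unique and
                // defeat duplicate detection, so drop it from the map.
                unset( $info['params']['causeTimestamp'] );
            }
            return $info;
        }
    }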
static RefreshLinksJob::newDynamic ( PageIdentity $page, array $params )
Definition at line 157 of file RefreshLinksJob.php.
static RefreshLinksJob::newPrioritized ( PageIdentity $page, array $params )
Definition at line 145 of file RefreshLinksJob.php.
Referenced by MediaWiki\Deferred\LinksUpdate\LinksUpdate\queueRecursiveJobs().
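As a hedged sketch of such a call site, a prioritized recursive refresh for the backlinks of a template might look like the following (the parameter values are illustrative and the exact core call may differ):

    use MediaWiki\MediaWikiServices;
    use MediaWiki\Title\Title;

    // Variant a: refresh the pages that embed this template via templatelinks.
    $template = Title::newFromText( 'Template:Example_infobox' );
    $job = RefreshLinksJob::newPrioritized(
        $template,
        [ 'recursive' => true, 'table' => 'templatelinks' ]
    );
    MediaWikiServices::getInstance()->getJobQueueGroup()->push( $job );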
RefreshLinksJob::run ()
Run the job.
If this method returns false or completes exceptionally, the job runner will retry executing this job unless the number of retries has exceeded its configured retry limit. Retries are allowed by default, unless allowRetries() is overridden to disable retries.

See the architecture doc for more information.

Returns false to instruct the job runner to retry a failed job. Otherwise returns true to indicate that the job completed (i.e. succeeded, or failed in a way that's deterministic or redundant).

Implements RunnableJob.
Definition at line 164 of file RefreshLinksJob.php.
References Job\$title, MediaWiki\Title\Title\canExist(), Job\getRootJobParams(), BacklinkJobUtils\partitionBacklinkJob(), runForTitle(), and setLastError().
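As a sketch of this contract in a hypothetical job (ExampleJob and its helpers are assumptions, not core code):

    class ExampleJob extends Job {
        public function run() {
            $item = $this->fetchWorkItem(); // hypothetical helper
            if ( $item === null ) {
                // Deterministic outcome: nothing to do, retrying cannot help.
                return true;
            }
            if ( !$this->process( $item ) ) { // hypothetical helper
                // Transient failure: record the error and request a retry.
                $this->setLastError( 'processing failed; will retry' );
                return false;
            }
            return true;
        }
    }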
protected RefreshLinksJob::runForTitle ( PageIdentity $pageIdentity )
Definition at line 227 of file RefreshLinksJob.php.
References setLastError().
Referenced by run().
RefreshLinksJob::workItemCount ()
Reimplemented from Job.
Definition at line 631 of file RefreshLinksJob.php.