MediaWiki master
RefreshLinksJob Class Reference

Job to update link tables for rerendered wiki pages. More...

Inherits Job.

Public Member Functions

 __construct (PageIdentity $page, array $params)
 
 getDeduplicationInfo ()
 Subclasses may need to override this to make duplication detection work.
 
 run ()
 Run the job.
 
 workItemCount ()
 
- Public Member Functions inherited from Job
 __construct ( $command, $params=null)
 
 allowRetries ()
 
Returns
bool Whether this job can be retried on failure by job runners
Since
1.21

 
 getLastError ()
 
Returns
string

 
 getMetadata ( $field=null)
 
 getParams ()
 
Returns
array Parameters that specify sources, targets, and options for execution

 
 getQueuedTimestamp ()
 
 getReadyTimestamp ()
 
Returns
int|null UNIX timestamp of when the job was runnable, or null
Since
1.26

 
 getReleaseTimestamp ()
 
 getRequestId ()
 
Returns
string|null Id of the request that created this job. Propagated recursively, so that when jobs insert jobs which insert other jobs, the id of the request that started the chain can still be tracked.
Since
1.27

 
 getRootJobParams ()
 
 getTitle ()
 
 getType ()
 
Returns
string Job type that defines what sort of changes this job makes

 
 hasExecutionFlag ( $flag)
 
Parameters
int $flag JOB_* class constant
Returns
bool
Since
1.31

 
 hasRootJobParams ()
 
 ignoreDuplicates ()
 Whether the queue should reject insertion of this job if a duplicate exists.
 
 isRootJob ()
 
 setMetadata ( $field, $value)
 
 teardown ( $status)
 
 toString ()
 
Returns
string Debugging string describing the job

 
- Public Member Functions inherited from RunnableJob
 tearDown ( $status)
 Do any final cleanup after run(), deferred updates, and all DB commits happen.
 

Static Public Member Functions

static newDynamic (PageIdentity $page, array $params)
 
static newPrioritized (PageIdentity $page, array $params)
 
- Static Public Member Functions inherited from Job
static factory ( $command, $params=[])
 Create the appropriate object to handle a specific job.
 
static newRootJobParams ( $key)
 Get "root job" parameters for a task.
 

Protected Member Functions

 runForTitle (PageIdentity $pageIdentity)
 
- Protected Member Functions inherited from Job
 addTeardownCallback ( $callback)
 
 setLastError ( $error)
 

Additional Inherited Members

- Public Attributes inherited from Job
string $command
 
array $metadata = []
 Additional queue metadata.
 
array $params
 Array of job parameters.
 
- Protected Attributes inherited from Job
string $error
 Text for error that occurred last.
 
int $executionFlags = 0
 Bitfield of JOB_* class constants.
 
bool $removeDuplicates = false
 Expensive jobs may set this to true.
 
callable[] $teardownCallbacks = []
 
Title $title
 

Detailed Description

Job to update link tables for rerendered wiki pages.

This job comes in a few variants:

  • a) Recursive jobs to update links for backlink pages of a given title. Scheduled by LinksUpdate::queueRecursiveJobsForTable(); used to refresh pages which link to or transclude a given title. These jobs have ( recursive: true, table: <table name> ) set. They just look up which pages link to the job title and schedule them as a set of non-recursive RefreshLinksJob jobs (and possibly one new recursive job as a way of continuation).
  • b) Jobs to update links for a set of pages (the job title is ignored). These jobs have ( pages: [ <page ID> => [ <namespace>, <title> ], ... ] ) set.
  • c) Jobs to update links for a single page (the job title). These jobs need no extra fields set.
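The three variants above can be sketched as plain parameter arrays. This is an illustration only: the table name, page IDs, namespaces, and titles below are invented examples, not values from this class.

```php
<?php
// Illustrative parameter shapes for the three RefreshLinksJob variants.

// a) Recursive job: fans out to pages linking to / transcluding the job title.
$recursiveParams = [
	'recursive' => true,
	'table'     => 'templatelinks',
];

// b) Explicit page-set job: the job title is ignored.
$pageSetParams = [
	'pages' => [
		42 => [ 0, 'Some_page' ],       // <page ID> => [ <namespace>, <dbkey> ]
		43 => [ 10, 'Some_template' ],
	],
];

// c) Single-page job: no extra fields needed; the job title is used.
$singlePageParams = [];
```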

Job parameters for all jobs:

  • recursive (bool): When false, updates the current page. When true, updates the pages which link/transclude the current page.
  • triggeringRevisionId (int): The revision of the edit which caused the link refresh. For manually triggered updates, the last revision of the page (at the time of scheduling).
  • triggeringUser (array): The user who triggered the refresh, in the form of a [ 'userId' => int, 'userName' => string ] array. This is not necessarily the user who created the revision.
  • triggeredRecursive (bool): Set on all jobs which were partitioned from another, recursive job. For debugging.
  • Standard deduplication params (see JobQueue::deduplicateRootJob()).

Job parameters for recursive jobs:

  • table (string): Which table to use (imagelinks or templatelinks) when searching for affected pages.
  • range (array): Used for recursive jobs when some pages have already been partitioned into separate jobs. Contains the list of ranges that still need to be partitioned. See BacklinkJobUtils::partitionBacklinkJob().
  • division: Number of times the job was partitioned already (for debugging).

Job parameters for non-recursive jobs:
  • pages (array): Associative array of [ <page ID> => [ <namespace>, <dbkey> ] ]. Might be omitted, then the job title will be used.
  • isOpportunistic (bool): Set for opportunistic single-page updates. These are "free" updates that are queued when most of the work needed to be performed anyway for non-linkrefresh-related reasons, and can be more easily discarded if they don't seem useful. See WikiPage::triggerOpportunisticLinksUpdate().
  • useRecursiveLinksUpdate (bool): When true, triggers recursive jobs for each page.
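As a rough usage sketch of these parameters: assuming a modern MediaWiki where the job queue group is obtained from the service container (the variables `$page`, `$revId`, `$userId`, and `$userName` are placeholders you would have in scope), a prioritized single-page refresh might be queued like this. The exact wiring can differ by MediaWiki version.

```php
<?php
use MediaWiki\MediaWikiServices;

// Sketch: queue a prioritized link refresh for $page (a PageIdentity),
// recording which revision and which user triggered it.
$job = RefreshLinksJob::newPrioritized( $page, [
	'triggeringRevisionId' => $revId,
	'triggeringUser'       => [ 'userId' => $userId, 'userName' => $userName ],
] );
MediaWikiServices::getInstance()->getJobQueueGroup()->push( $job );
```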

Metrics:

  • refreshlinks_warning.<warning>: A recoverable issue. The job will continue as normal.
  • refreshlinks_outcome.<reason>: Incremented exactly once if the job ends with an unusual outcome. If the reason starts with bad_, a failure is logged and the job may be retried later. If the reason starts with good_, the job was cancelled and counted as a success, i.e. it was superseded.
See also
RefreshSecondaryDataUpdate
WikiPage::doSecondaryDataUpdates()

Definition at line 94 of file RefreshLinksJob.php.

Constructor & Destructor Documentation

◆ __construct()

RefreshLinksJob::__construct ( PageIdentity $page, array $params )

Definition at line 100 of file RefreshLinksJob.php.

References MediaWiki\Page\PageIdentity\canExist().

Member Function Documentation

◆ getDeduplicationInfo()

RefreshLinksJob::getDeduplicationInfo ( )

Subclasses may need to override this to make duplication detection work.

The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.

Stability: stable
to override
Returns
array Map of key/values
Since
1.21

Reimplemented from Job.

Definition at line 510 of file RefreshLinksJob.php.
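The mechanism can be illustrated with a self-contained toy model (not MediaWiki's actual implementation): two jobs count as duplicates when their deduplication maps are identical, which is typically achieved by combining the job type with its parameters, dropping volatile fields, and sorting keys so that parameter order does not defeat the comparison. The function name `dedupInfo` below is invented for the sketch.

```php
<?php
// Toy model of job duplicate detection: a job's identity is its type
// plus its parameters, with volatile fields removed and keys sorted
// into a canonical order before comparison.
function dedupInfo( string $type, array $params ): array {
	unset( $params['jobReleaseTimestamp'] ); // volatile; not part of identity
	ksort( $params );
	return [ 'type' => $type, 'params' => $params ];
}

$a = dedupInfo( 'refreshLinks', [ 'recursive' => true, 'table' => 'templatelinks' ] );
$b = dedupInfo( 'refreshLinks', [ 'table' => 'templatelinks', 'recursive' => true ] );
var_dump( $a === $b ); // prints bool(true): the two jobs are duplicates
```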

◆ newDynamic()

static RefreshLinksJob::newDynamic ( PageIdentity $page, array $params )
Parameters
PageIdentity $page
array $params
Returns
RefreshLinksJob

Definition at line 141 of file RefreshLinksJob.php.

References $job.

◆ newPrioritized()

static RefreshLinksJob::newPrioritized ( PageIdentity $page, array $params )
Parameters
PageIdentity $page
array $params
Returns
RefreshLinksJob

Definition at line 129 of file RefreshLinksJob.php.

References $job.

Referenced by MediaWiki\Deferred\LinksUpdate\LinksUpdate\queueRecursiveJobs().

◆ run()

RefreshLinksJob::run ( )

Run the job.

◆ runForTitle()

RefreshLinksJob::runForTitle ( PageIdentity $pageIdentity )
protected
Parameters
PageIdentity $pageIdentity
Returns
bool

Definition at line 208 of file RefreshLinksJob.php.

References Job\setLastError().

Referenced by run().

◆ workItemCount()

RefreshLinksJob::workItemCount ( )
Stability: stable
to override
Returns
int

Reimplemented from Job.

Definition at line 526 of file RefreshLinksJob.php.


The documentation for this class was generated from the following file:

  • RefreshLinksJob.php