MediaWiki REL1_29
Job to update link tables for pages.
Public Member Functions

__construct (Title $title, array $params)
getDeduplicationInfo ()
 Subclasses may need to override this to make duplication detection work.
run ()
 Run the job.
workItemCount ()
Public Member Functions inherited from Job
__construct ( $command, $title, $params=false)
allowRetries ()
getLastError ()
getParams ()
getQueuedTimestamp ()
getReadyTimestamp ()
getReleaseTimestamp ()
getRequestId ()
getRootJobParams ()
getTitle ()
getType ()
hasRootJobParams ()
ignoreDuplicates ()
 Whether the queue should reject insertion of this job if a duplicate exists.
insert ()
 Insert a single job into the queue.
isRootJob ()
teardown ( $status)
 Do any final cleanup after run(), deferred updates, and all DB commits happen.
toString ()
Static Public Member Functions

static newDynamic (Title $title, array $params)
static newPrioritized (Title $title, array $params)
Static Public Member Functions inherited from Job

static batchInsert ( $jobs)
 Batch-insert a group of jobs into the queue.
static factory ( $command, Title $title, $params=[])
 Create the appropriate object to handle a specific job.
static newRootJobParams ( $key)
 Get "root job" parameters for a task.
Protected Member Functions

runForTitle (Title $title)

Protected Member Functions inherited from Job

addTeardownCallback ( $callback)
setLastError ( $error)
Additional Inherited Members

Public Attributes inherited from Job

string $command
array $metadata = []
 Additional queue metadata.
array $params
 Array of job parameters.

Protected Attributes inherited from Job

string $error
 Text for error that occurred last.
bool $removeDuplicates
 Expensive jobs may set this to true.
callable[] $teardownCallbacks = []
Title $title
Job to update link tables for pages.
This job comes in a few variants:
- Recursive jobs that update links for the backlink pages of a given title. These jobs have recursive:true and a table parameter set.
- Jobs that update links for a set of pages given in a pages parameter (the job title is ignored).
- Jobs that update links for the single page named by the job title.
Definition at line 39 of file RefreshLinksJob.php.
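As a usage sketch (assuming a bootstrapped MediaWiki 1.29 environment; the page name is made up for the example), a refreshLinks job for a single page can be queued like this:

```php
// Hypothetical sketch: enqueue a links refresh for one page.
// Requires a configured MediaWiki environment; 'Example_page' is invented.
$title = Title::newFromText( 'Example_page' );
$job = new RefreshLinksJob( $title, [] );
JobQueueGroup::singleton()->push( $job );
```

The queued job is later executed by a job runner, which calls run() on it.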
RefreshLinksJob::__construct ( Title $title, array $params )

Definition at line 47 of file RefreshLinksJob.php.

References Job\$params, and Job\$title.
RefreshLinksJob::getDeduplicationInfo ( )
Subclasses may need to override this to make duplication detection work.
The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.
Reimplemented from Job.
Definition at line 285 of file RefreshLinksJob.php.
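To illustrate the idea in plain PHP (this is not MediaWiki code; isDuplicate() and the example fields are invented for the sketch), a queue can fingerprint the deduplication-info map and skip jobs whose fingerprint was already seen:

```php
<?php
// Illustrative only: how a queue might use a deduplication-info map.
// isDuplicate() and the example fields are hypothetical, not MediaWiki APIs.
function isDuplicate( array $info, array &$seen ) {
    ksort( $info );                    // make the comparison order-insensitive
    $key = sha1( serialize( $info ) ); // stable fingerprint of the map
    if ( isset( $seen[$key] ) ) {
        return true;                   // an equivalent job was already queued
    }
    $seen[$key] = true;
    return false;
}

$seen = [];
$a = [ 'type' => 'refreshLinks', 'namespace' => 0, 'title' => 'Example' ];
$b = [ 'title' => 'Example', 'type' => 'refreshLinks', 'namespace' => 0 ];
var_dump( isDuplicate( $a, $seen ) ); // bool(false): first sighting
var_dump( isDuplicate( $b, $seen ) ); // bool(true): same map, reordered keys
```

Sorting the keys first means two jobs with the same parameters compare equal regardless of the order the parameters were assembled in.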
static RefreshLinksJob::newDynamic ( Title $title, array $params )
Definition at line 75 of file RefreshLinksJob.php.
References $job, Job\$params, and Job\$title.
Referenced by WikiPage\triggerOpportunisticLinksUpdate().
static RefreshLinksJob::newPrioritized ( Title $title, array $params )
Definition at line 63 of file RefreshLinksJob.php.
References $job, Job\$params, and Job\$title.
Referenced by LinksUpdate\queueRecursiveJobs(), and WikiPage\triggerOpportunisticLinksUpdate().
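The two static factories construct the same job but tag it so the queue can route it differently. A hedged sketch (assumes a MediaWiki environment; the page name is invented):

```php
// Hypothetical sketch of the two factory variants.
$title = Title::newFromText( 'Example_page' );

// Higher-priority variant, used e.g. when queueing recursive link updates:
$job = RefreshLinksJob::newPrioritized( $title, [] );

// Variant for opportunistic updates that may be deferred or de-prioritized:
$job = RefreshLinksJob::newDynamic( $title, [] );

JobQueueGroup::singleton()->lazyPush( $job );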
RefreshLinksJob::run ( )
Run the job.
Reimplemented from Job.
Definition at line 82 of file RefreshLinksJob.php.
References $e, $lbFactory, $wgUpdateRowsPerJob, Job\getRootJobParams(), BacklinkJobUtils\partitionBacklinkJob(), runForTitle(), JobQueueGroup\singleton(), and wfWikiID().
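For recursive jobs, run() splits the backlink set into smaller jobs via BacklinkJobUtils::partitionBacklinkJob() and re-queues the pieces. A hedged sketch of how such a recursive job might be seeded (the root-job key format shown is illustrative, not necessarily MediaWiki's exact string; assumes a MediaWiki environment):

```php
// Sketch only: seed a recursive refreshLinks job with "root job" parameters,
// so newer root jobs for the same key can supersede older duplicates.
// The key format below is invented for illustration.
$params = [ 'recursive' => true, 'table' => 'templatelinks' ];
$params += Job::newRootJobParams( 'refreshlinks:templatelinks:Example_page' );
JobQueueGroup::singleton()->push( new RefreshLinksJob( $title, $params ) );
```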
RefreshLinksJob::runForTitle ( Title $title ) [protected]
Definition at line 132 of file RefreshLinksJob.php.
References $content, $lbFactory, $page, $parserOutput, $services, Job\$title, $user, LinksUpdate\acquirePageLock(), DB_MASTER, WikiPage\factory(), Title\getLatestRevID(), InfoAction\invalidateCache(), Revision\newFromId(), Revision\newFromTitle(), Revision\RAW, Job\setLastError(), ParserCache\singleton(), and wfTimestamp().
Referenced by run().
RefreshLinksJob::workItemCount ( )
Reimplemented from Job.
Definition at line 299 of file RefreshLinksJob.php.