MediaWiki  1.23.8
HTMLCacheUpdateJob.php
<?php
/**
 * Job to purge the cache for all pages that link to or use another page or file.
 */
class HTMLCacheUpdateJob extends Job {
	function __construct( $title, $params = '' ) {
		parent::__construct( 'htmlCacheUpdate', $title, $params );
		// Base backlink purge jobs can be de-duplicated
		$this->removeDuplicates = ( !isset( $params['range'] ) && !isset( $params['pages'] ) );
	}

	function run() {
		global $wgUpdateRowsPerJob, $wgUpdateRowsPerQuery;

		static $expected = array( 'recursive', 'pages' ); // new jobs have one of these

		$oldRangeJob = false;
		if ( !array_intersect( array_keys( $this->params ), $expected ) ) {
			// B/C for older job params formats that lack these fields:
			// a) base jobs with just ("table") and b) range jobs with ("table","start","end")
			if ( isset( $this->params['start'] ) && isset( $this->params['end'] ) ) {
				$oldRangeJob = true;
			} else {
				$this->params['recursive'] = true; // base job
			}
		}

		// Job to purge all (or a range of) backlink pages for a page
		if ( !empty( $this->params['recursive'] ) ) {
			// Convert this into no more than $wgUpdateRowsPerJob HTMLCacheUpdateJob per-title
			// jobs and possibly a recursive HTMLCacheUpdateJob job for the rest of the backlinks
			$jobs = BacklinkJobUtils::partitionBacklinkJob(
				$this,
				$wgUpdateRowsPerJob,
				$wgUpdateRowsPerQuery, // jobs-per-title
				// Carry over information for de-duplication
				array( 'params' => $this->getRootJobParams() )
			);
			JobQueueGroup::singleton()->push( $jobs );
		// Job to purge pages for a set of titles
		} elseif ( isset( $this->params['pages'] ) ) {
			$this->invalidateTitles( $this->params['pages'] );
		// B/C for job to purge a range of backlink pages for a given page
		} elseif ( $oldRangeJob ) {
			$titleArray = $this->title->getBacklinkCache()->getLinks(
				$this->params['table'], $this->params['start'], $this->params['end'] );

			$pages = array(); // same format BacklinkJobUtils uses
			foreach ( $titleArray as $tl ) {
				$pages[$tl->getArticleId()] = array( $tl->getNamespace(), $tl->getDbKey() );
			}

			$jobs = array();
			foreach ( array_chunk( $pages, $wgUpdateRowsPerJob ) as $pageChunk ) {
				$jobs[] = new HTMLCacheUpdateJob( $this->title,
					array(
						'table' => $this->params['table'],
						'pages' => $pageChunk
					) + $this->getRootJobParams() // carry over information for de-duplication
				);
			}
			JobQueueGroup::singleton()->push( $jobs );
		}

		return true;
	}

	/**
	 * @param array $pages Map of (page ID => (namespace, DB key)) entries
	 */
	protected function invalidateTitles( array $pages ) {
		global $wgUpdateRowsPerQuery, $wgUseFileCache, $wgUseSquid;

		// Get all page IDs in this query into an array
		$pageIds = array_keys( $pages );
		if ( !$pageIds ) {
			return;
		}

		$dbw = wfGetDB( DB_MASTER );

		// The page_touched field will need to be bumped for these pages.
		// Only bump it to the present time if no "rootJobTimestamp" was known.
		// If it is known, it can be used instead, which avoids invalidating output
		// that was in fact generated *after* the relevant dependency change time
		// (e.g. template edit). This is particularly useful since refreshLinks jobs
		// save back parser output and usually run alongside htmlCacheUpdate jobs;
		// their saved output would be invalidated by using the current timestamp.
		if ( isset( $this->params['rootJobTimestamp'] ) ) {
			$touchTimestamp = $this->params['rootJobTimestamp'];
		} else {
			$touchTimestamp = wfTimestampNow();
		}

		// Update page_touched (skipping pages already touched since the root job).
		// Check $wgUpdateRowsPerQuery for sanity; batch jobs are sized by that already.
		foreach ( array_chunk( $pageIds, $wgUpdateRowsPerQuery ) as $batch ) {
			$dbw->update( 'page',
				array( 'page_touched' => $dbw->timestamp( $touchTimestamp ) ),
				array( 'page_id' => $batch,
					// don't invalidate pages that were already invalidated
					"page_touched < " . $dbw->addQuotes( $dbw->timestamp( $touchTimestamp ) )
				),
				__METHOD__
			);
		}
		// Get the list of affected pages (races only mean something else did the purge)
		$titleArray = TitleArray::newFromResult( $dbw->select(
			'page',
			array( 'page_namespace', 'page_title' ),
			array( 'page_id' => $pageIds, 'page_touched' => $dbw->timestamp( $touchTimestamp ) ),
			__METHOD__
		) );

		// Update squid
		if ( $wgUseSquid ) {
			$u = SquidUpdate::newFromTitles( $titleArray );
			$u->doUpdate();
		}

		// Update file cache
		if ( $wgUseFileCache ) {
			foreach ( $titleArray as $title ) {
				HTMLFileCache::clearFileCache( $title );
			}
		}
	}

	public function workItemCount() {
		return isset( $this->params['pages'] ) ? count( $this->params['pages'] ) : 1;
	}
}
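The recursive branch of run() hands the whole backlink set to BacklinkJobUtils::partitionBacklinkJob, which splits it into fixed-size per-title "leaf" jobs (plus a remainder partition job for any backlinks beyond the first batch). The core batching step is the same array_chunk() pattern used for the old-style range jobs above. A minimal, language-neutral sketch in Python (names are illustrative, not part of MediaWiki):

```python
def partition_backlinks(page_ids, rows_per_job):
    """Split a list of backlink page IDs into fixed-size batches,
    mirroring the array_chunk( $pages, $wgUpdateRowsPerJob ) step.
    Each batch would become one HTMLCacheUpdateJob with a 'pages' param."""
    return [page_ids[i:i + rows_per_job]
            for i in range(0, len(page_ids), rows_per_job)]

# 10 backlinks with $wgUpdateRowsPerJob-style batch size 4
# yields batches of 4, 4, and 2 pages.
batches = partition_backlinks(list(range(10)), 4)
```

Keeping each job's page list bounded this way is what lets the root job be de-duplicated while the per-title work stays small enough for a single queue run.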
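The key subtlety in invalidateTitles() is the `page_touched < X` guard: a page is only bumped if its current timestamp predates the chosen touch timestamp, so output rendered *after* the root job's dependency change (e.g. by a refreshLinks job) is left alone. A hedged Python sketch of that guard, using MediaWiki-style TS_MW strings (which compare correctly as plain strings; the function name is illustrative):

```python
def touch_pages(page_touched, touch_ts):
    """Return an updated (page ID -> page_touched) map, bumping only rows
    whose timestamp is older than touch_ts -- the same effect as the
    "page_touched < X" condition in the UPDATE query."""
    return {pid: (touch_ts if ts < touch_ts else ts)
            for pid, ts in page_touched.items()}

pages = {1: "20140101000000", 2: "20150101000000"}
# With touch_ts = "20141231235959", only page 1 is bumped;
# page 2 was already touched after the root job and keeps its newer value.
```

The follow-up SELECT then fetches only rows whose page_touched equals the touch timestamp, so pages skipped by the guard (already purged by something newer) are also skipped for squid and file-cache purges.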