MediaWiki 1.28.1
HTMLCacheUpdateJob.php
<?php
/**
 * Job to purge the cache for all pages that link to or use another page or file.
 */
class HTMLCacheUpdateJob extends Job {
	function __construct( Title $title, array $params ) {
		parent::__construct( 'htmlCacheUpdate', $title, $params );
		// Base backlink purge jobs can be de-duplicated
		$this->removeDuplicates = ( !isset( $params['range'] ) && !isset( $params['pages'] ) );
	}
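	// Illustrative note, not part of the original file: inferred from the constructor
	// above and run() below, the job is used with three parameter shapes, and only the
	// first (the base job) is de-duplicated:
	//
	//   [ 'table' => <backlink table>, 'recursive' => true ]       base backlink job
	//   [ 'recursive' => true, 'range' => [ ... ], ... ]           remnant partition job
	//   [ 'pages' => [ <page ID> => [ <namespace>, <DB key> ] ] ]  leaf purge job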

	/**
	 * @param Title $title Title to purge backlink pages from
	 * @param string $table Backlink table name
	 * @return HTMLCacheUpdateJob
	 */
	public static function newForBacklinks( Title $title, $table ) {
		return new self(
			$title,
			[
				'table' => $table,
				'recursive' => true
			] + Job::newRootJobParams( // "overall" refresh links job info
				"htmlCacheUpdate:{$table}:{$title->getPrefixedText()}"
			)
		);
	}
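	// Illustrative usage, not from this file: callers typically enqueue the base job by
	// combining the factory above with the job queue, e.g. after a template edit:
	//
	//     JobQueueGroup::singleton()->push(
	//         HTMLCacheUpdateJob::newForBacklinks( $title, 'templatelinks' )
	//     );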

	function run() {
		global $wgUpdateRowsPerJob, $wgUpdateRowsPerQuery;

		if ( isset( $this->params['table'] ) && !isset( $this->params['pages'] ) ) {
			$this->params['recursive'] = true; // b/c; base job
		}

		// Job to purge all (or a range of) backlink pages for a page
		if ( !empty( $this->params['recursive'] ) ) {
			// Convert this into no more than $wgUpdateRowsPerJob HTMLCacheUpdateJob per-title
			// jobs and possibly a recursive HTMLCacheUpdateJob job for the rest of the backlinks
			$jobs = BacklinkJobUtils::partitionBacklinkJob(
				$this,
				$wgUpdateRowsPerJob,
				$wgUpdateRowsPerQuery, // titles per leaf job
				// Carry over information for de-duplication
				[ 'params' => $this->getRootJobParams() ]
			);
			JobQueueGroup::singleton()->push( $jobs );
		// Job to purge pages for a set of titles
		} elseif ( isset( $this->params['pages'] ) ) {
			$this->invalidateTitles( $this->params['pages'] );
		// Job to update a single title
		} else {
			$t = $this->title;
			$this->invalidateTitles( [
				$t->getArticleID() => [ $t->getNamespace(), $t->getDBkey() ]
			] );
		}

		return true;
	}
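	// Illustrative sketch with assumed configuration values (not from this file): with
	// $wgUpdateRowsPerJob = 300 and $wgUpdateRowsPerQuery = 100, a base job whose title
	// has ~1000 backlinks is broken down by BacklinkJobUtils::partitionBacklinkJob()
	// into roughly three leaf jobs of the form
	//
	//     new HTMLCacheUpdateJob( $title, [
	//         'pages' => [ /* up to ~100 entries of page ID => [ namespace, DB key ] */ ],
	//     ] );
	//
	// plus a single remnant job carrying 'recursive' => true and a 'range' for the
	// remaining backlinks, which partitions itself the same way when it runs.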

	/**
	 * @param array $pages Map of (page ID => (namespace, DB key)) entries
	 */
	protected function invalidateTitles( array $pages ) {
		global $wgUpdateRowsPerQuery, $wgUseFileCache;

		// Get all page IDs in this query into an array
		$pageIds = array_keys( $pages );
		if ( !$pageIds ) {
			return;
		}

		// Bump page_touched to the current timestamp. This used to use the root job timestamp
		// (e.g. template/file edit time), which was a bit more efficient when template edits are
		// rare and don't affect the same pages much. However, this way allows for better
		// de-duplication, which is much more useful for wikis with high edit rates. Note that
		// RefreshLinksJob, which is enqueued alongside HTMLCacheUpdateJob, saves the parser output
		// since it has to parse anyway. We assume that the vast majority of the cache jobs finish
		// before the link jobs, so using the current timestamp instead of the root timestamp is
		// not expected to invalidate these cache entries too often.
		$touchTimestamp = wfTimestampNow();

		$dbw = wfGetDB( DB_MASTER );
		$factory = wfGetLBFactory();
		$ticket = $factory->getEmptyTransactionTicket( __METHOD__ );
		// Update page_touched (skipping pages already touched since the root job).
		// Check $wgUpdateRowsPerQuery for sanity; batch jobs are sized by that already.
		foreach ( array_chunk( $pageIds, $wgUpdateRowsPerQuery ) as $batch ) {
			$factory->commitAndWaitForReplication( __METHOD__, $ticket );

			$dbw->update( 'page',
				[ 'page_touched' => $dbw->timestamp( $touchTimestamp ) ],
				[ 'page_id' => $batch,
					// don't invalidate pages that were already invalidated
					"page_touched < " . $dbw->addQuotes( $dbw->timestamp( $touchTimestamp ) )
				],
				__METHOD__
			);
		}
		// Get the list of affected pages (races only mean something else did the purge)
		$titleArray = TitleArray::newFromResult( $dbw->select(
			'page',
			[ 'page_namespace', 'page_title' ],
			[ 'page_id' => $pageIds, 'page_touched' => $dbw->timestamp( $touchTimestamp ) ],
			__METHOD__
		) );

		// Update CDN
		$u = CdnCacheUpdate::newFromTitles( $titleArray );
		$u->doUpdate();

		// Update file cache
		if ( $wgUseFileCache ) {
			foreach ( $titleArray as $title ) {
				HTMLFileCache::clearFileCache( $title );
			}
		}
	}
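	// Illustrative only (derived from the update above, not part of the original file):
	// each batch issues an UPDATE roughly equivalent to
	//
	//     UPDATE page
	//        SET page_touched = <now>
	//      WHERE page_id IN ( <up to $wgUpdateRowsPerQuery IDs> )
	//        AND page_touched < <now>
	//
	// so rows already touched at or after this job's timestamp are skipped, and the
	// follow-up SELECT only purges CDN and file cache entries for rows this pass bumped.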

	public function workItemCount() {
		return isset( $this->params['pages'] ) ? count( $this->params['pages'] ) : 1;
	}
}