MediaWiki REL1_31: BacklinkJobUtils.php
<?php
	public static function partitionBacklinkJob( Job $job, $bSize, $cSize, $opts = [] ) {
		$class = get_class( $job );
		$title = $job->getTitle();
		$params = $job->getParams();

		if ( isset( $params['pages'] ) || empty( $params['recursive'] ) ) {
			$ranges = []; // sanity; this is a leaf node
			$realBSize = 0;
			wfWarn( __METHOD__ . " called on {$job->getType()} leaf job (explosive recursion)." );
		} elseif ( isset( $params['range'] ) ) {
			// This is a range job to trigger the insertion of partitioned/title jobs...
			$ranges = $params['range']['subranges'];
			$realBSize = $params['range']['batchSize'];
		} else {
			// This is a base job to trigger the insertion of partitioned jobs...
			$ranges = $title->getBacklinkCache()->partition( $params['table'], $bSize );
			$realBSize = $bSize;
		}

		$extraParams = isset( $opts['params'] ) ? $opts['params'] : [];

		$jobs = [];
		// Combine the first range (of size $bSize) of backlinks into leaf jobs
		if ( isset( $ranges[0] ) ) {
			list( $start, $end ) = $ranges[0];
			$iter = $title->getBacklinkCache()->getLinks( $params['table'], $start, $end );
			$titles = iterator_to_array( $iter );
			foreach ( array_chunk( $titles, $cSize ) as $titleBatch ) {
				$pages = [];
				foreach ( $titleBatch as $tl ) {
					$pages[$tl->getArticleID()] = [ $tl->getNamespace(), $tl->getDBkey() ];
				}
				$jobs[] = new $class(
					$title, // maintain parent job title
					[ 'pages' => $pages ] + $extraParams
				);
			}
		}
		// Take all of the remaining ranges and build a partition job from them
		if ( isset( $ranges[1] ) ) {
			$jobs[] = new $class(
				$title, // maintain parent job title
				[
					'recursive' => true,
					'table' => $params['table'],
					'range' => [
						'start' => $ranges[1][0],
						'end' => $ranges[count( $ranges ) - 1][1],
						'batchSize' => $realBSize,
						'subranges' => array_slice( $ranges, 1 )
					],
					// Track how many times the base job divided for debugging
					'division' => isset( $params['division'] )
						? ( $params['division'] + 1 )
						: 1
				] + $extraParams
			);
		}

		return $jobs;
	}
}
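The control flow above partitions the work in two steps: the first backlink range is expanded into "leaf" jobs of at most $cSize pages each, while all remaining ranges are folded into a single recursive "range" job that will be re-partitioned the same way when it runs. The following is a minimal, language-neutral sketch of that strategy in Python; it is an illustration only (the function and parameter names are hypothetical, not MediaWiki APIs, and the real code works with title ranges from BacklinkCache rather than a flat list):

```python
def partition_backlinks(backlinks, b_size, c_size):
    """Sketch of the partitionBacklinkJob strategy on a flat list."""
    # Split the full backlink list into ranges of b_size items each,
    # loosely mirroring what BacklinkCache::partition() produces.
    ranges = [backlinks[i:i + b_size] for i in range(0, len(backlinks), b_size)]

    leaf_jobs = []
    range_job = None

    if ranges:
        # First range -> leaf jobs of at most c_size pages each.
        first = ranges[0]
        leaf_jobs = [
            {'pages': first[i:i + c_size]}
            for i in range(0, len(first), c_size)
        ]

    if len(ranges) > 1:
        # Remaining ranges -> one recursive job carrying the subranges,
        # to be partitioned again when it executes.
        range_job = {'recursive': True, 'subranges': ranges[1:]}

    return leaf_jobs, range_job
```

With 10 backlinks, $bSize = 4 and $cSize = 2, this yields two leaf jobs (covering the first range of 4 pages, 2 per job) plus one recursive job holding the two remaining ranges, which keeps any single job's working set bounded regardless of how many pages link to the title.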