MediaWiki  1.30.0
MediaWikiPageNameNormalizer.php
<?php

namespace MediaWiki\Site;

use FormatJson;
use Http;
use UtfNormal\Validator;

/**
 * Service for normalizing a page name using a MediaWiki API.
 */
class MediaWikiPageNameNormalizer {

	/**
	 * @var Http
	 */
	private $http;

	/**
	 * @param Http|null $http
	 */
	public function __construct( Http $http = null ) {
		if ( !$http ) {
			$http = new Http();
		}

		$this->http = $http;
	}

	/**
	 * Returns the normalized form of the given page title, using the
	 * normalization rules of the given site.
	 *
	 * @param string $pageName
	 * @param string $apiUrl
	 *
	 * @return string|bool The normalized page title, or false on failure.
	 * @throws \MWException If $pageName is not a string.
	 */
	public function normalizePageName( $pageName, $apiUrl ) {
		// Check that the page name argument is a string.
		if ( !is_string( $pageName ) ) {
			throw new \MWException( '$pageName must be a string' );
		}

		// Make sure the string is normalized into NFC (due to T42017),
		// but leave the whitespace alone; that should work appropriately.
		// @see https://phabricator.wikimedia.org/T42017
		$pageName = Validator::cleanUp( $pageName );

		// Build the args for the API call.
		$args = [
			'action' => 'query',
			'prop' => 'info',
			'redirects' => true,
			'converttitles' => true,
			'format' => 'json',
			'titles' => $pageName,
			// @todo options for maxlag and maxage
			// Note that maxlag will lead to a long delay before a reply is made,
			// but maxage can avoid that extreme delay. On the other hand, maxage
			// could be nice to use anyhow, as it stops unnecessary requests.
			// Also consider smaxage if maxage is used.
		];

		$url = wfAppendQuery( $apiUrl, $args );
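		// Illustrative example (not taken from the source): with $pageName 'foo bar'
		// and an endpoint such as https://en.wikipedia.org/w/api.php, $url would look
		// roughly like
		// https://en.wikipedia.org/w/api.php?action=query&prop=info&redirects=1&converttitles=1&format=json&titles=foo+bar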

		// Go on and call the external site.
		// @todo we need a good way to specify a timeout here.
		$ret = $this->http->get( $url, [], __METHOD__ );

		if ( $ret === false ) {
			wfDebugLog( "MediaWikiSite", "call to external site failed: $url" );
			return false;
		}

		$data = FormatJson::decode( $ret, true );

		if ( !is_array( $data ) ) {
			wfDebugLog( "MediaWikiSite", "call to <$url> returned bad json: " . $ret );
			return false;
		}

		$page = static::extractPageRecord( $data, $pageName );

		if ( isset( $page['missing'] ) ) {
			wfDebugLog( "MediaWikiSite", "call to <$url> returned a marker for a missing page title! "
				. $ret );
			return false;
		}

		if ( isset( $page['invalid'] ) ) {
			wfDebugLog( "MediaWikiSite", "call to <$url> returned a marker for an invalid page title! "
				. $ret );
			return false;
		}

		if ( !isset( $page['title'] ) ) {
			wfDebugLog( "MediaWikiSite", "call to <$url> did not return a page title! " . $ret );
			return false;
		}

		return $page['title'];
	}
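
	/*
	 * Illustrative sketch (not taken from the source) of the decoded API response
	 * that extractPageRecord() below consumes. The exact shape depends on the wiki
	 * and on which normalization steps applied to the requested title:
	 *
	 * [
	 *     'query' => [
	 *         'normalized' => [ [ 'from' => 'foo bar', 'to' => 'Foo bar' ] ],
	 *         'redirects' => [ [ 'from' => 'Foo bar', 'to' => 'Foo Bar' ] ],
	 *         'pages' => [
	 *             '123' => [ 'pageid' => 123, 'ns' => 0, 'title' => 'Foo Bar' ],
	 *         ],
	 *     ],
	 * ]
	 */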

	/**
	 * Get normalization record for a given page title from an API response.
	 *
	 * @param array $externalData Decoded JSON reply from the remote API.
	 * @param string $pageTitle Page title to look up in the reply.
	 *
	 * @return array|bool A single page record from the reply, or false on failure.
	 */
	private static function extractPageRecord( $externalData, $pageTitle ) {
		// Special case: if only one page is returned, we can cheat and just
		// return the single page in the "pages" substructure.
		if ( isset( $externalData['query']['pages'] ) ) {
			$pages = array_values( $externalData['query']['pages'] );
			if ( count( $pages ) === 1 ) {
				return $pages[0];
			}
		}
		// This is only used during internal testing, as it is assumed to be a
		// more optimal (and loss-free) storage.
		// Make the initial checks and return if the prerequisites are not met.
		if ( !is_array( $externalData ) || !isset( $externalData['query'] ) ) {
			return false;
		}
		// Loop over the differently named structures, which are otherwise similar.
		$structs = [
			'normalized' => 'from',
			'converted' => 'from',
			'redirects' => 'from',
			'pages' => 'title'
		];
		foreach ( $structs as $listId => $fieldId ) {
			// Check if the substructure exists at all.
			if ( !isset( $externalData['query'][$listId] ) ) {
				continue;
			}
			// Filter the substructure down to what we are actually using.
			$collectedHits = array_filter(
				array_values( $externalData['query'][$listId] ),
				function ( $a ) use ( $fieldId, $pageTitle ) {
					return $a[$fieldId] === $pageTitle;
				}
			);
			// While still looping over normalization, conversion or redirects,
			// keep the new page title for later rounds.
			if ( $fieldId === 'from' && is_array( $collectedHits ) ) {
				switch ( count( $collectedHits ) ) {
					case 0:
						break;
					case 1:
						$pageTitle = $collectedHits[0]['to'];
						break;
					default:
						return false;
				}
			} elseif ( $fieldId === 'title' && is_array( $collectedHits ) ) {
				// Once on the pages structure, prepare to return.
				switch ( count( $collectedHits ) ) {
					case 0:
						return false;
					case 1:
						return array_shift( $collectedHits );
					default:
						return false;
				}
			}
		}
		// Should never get here.
		return false;
	}

}
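
A minimal usage sketch, not part of the file above. It assumes a bootstrapped MediaWiki environment (where Http, wfAppendQuery() and FormatJson are available), and the API URL shown is purely an illustrative example.

use MediaWiki\Site\MediaWikiPageNameNormalizer;

$normalizer = new MediaWikiPageNameNormalizer();

// Ask the remote wiki for its canonical form of the title.
$title = $normalizer->normalizePageName( 'foo bar', 'https://en.wikipedia.org/w/api.php' );

if ( $title === false ) {
	// The HTTP call failed, or the title is missing or invalid on the remote wiki.
	wfDebugLog( 'example', 'Normalization of "foo bar" failed' );
} else {
	// Typically something like "Foo bar", after the remote wiki's normalization rules.
	echo $title;
}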