Wikidata:Contact the development team/Archive/2020/08

This page is an archive. Please do not modify it. Use the current page, even to continue an old discussion.

Call to undefined method GeoData\CoordinatesOutput::hasPrimary()

Hello all, I have this issue:

 php maintenance/importEntities.php --all-properties
 [2020-08-03 11:28:54]: Importing Batch: P10, P1000, P1001, P1002, P1003, P1004, P1005, P1006, P1007, P101
 [2020-08-03 11:28:55]: Creating P10
 [a005ade11535540a59260e84] [no req] Error from line 103 of /var/www/html/extensions/Wikibase/repo/includes/ParserOutput/GeoDataDataUpdater.php: Call to undefined method GeoData\CoordinatesOutput::hasPrimary()
 Backtrace:
 #0 /var/www/html/extensions/Wikibase/repo/includes/ParserOutput/CompositeStatementDataUpdater.php(32): Wikibase\Repo\ParserOutput\GeoDataDataUpdater->updateParserOutput()
 #1 /var/www/html/extensions/Wikibase/repo/includes/ParserOutput/PropertyParserOutputUpdater.php(31): Wikibase\Repo\ParserOutput\CompositeStatementDataUpdater->updateParserOutput()
 #2 /var/www/html/extensions/Wikibase/repo/includes/ParserOutput/PropertyParserOutputUpdater.php(22): Wikibase\Repo\ParserOutput\PropertyParserOutputUpdater->updateParserOutputForProperty()
 #3 /var/www/html/extensions/Wikibase/repo/includes/ParserOutput/EntityParserOutputDataUpdaterCollection.php(44): Wikibase\Repo\ParserOutput\PropertyParserOutputUpdater->updateParserOutput()
 #4 /var/www/html/extensions/Wikibase/repo/includes/ParserOutput/FullEntityParserOutputGenerator.php(138): Wikibase\Repo\ParserOutput\EntityParserOutputDataUpdaterCollection->updateParserOutput()
 #5 /var/www/html/extensions/Wikibase/repo/includes/ParserOutput/StatsdTimeRecordingEntityParserOutputGenerator.php(48): Wikibase\Repo\ParserOutput\FullEntityParserOutputGenerator->getParserOutput()
 #6 /var/www/html/extensions/Wikibase/repo/includes/Content/EntityContent.php(238): Wikibase\Repo\ParserOutput\StatsdTimeRecordingEntityParserOutputGenerator->getParserOutput()
 #7 /var/www/html/extensions/Wikibase/repo/includes/Content/PropertyContent.php(138): Wikibase\Repo\Content\EntityContent->getParserOutputFromEntityView()
 #8 /var/www/html/extensions/Wikibase/repo/includes/Content/EntityContent.php(177): Wikibase\Repo\Content\PropertyContent->getParserOutputFromEntityView()
 #9 /var/www/html/includes/Revision/RenderedRevision.php(263): Wikibase\Repo\Content\EntityContent->getParserOutput()
 #10 /var/www/html/includes/Revision/RenderedRevision.php(235): MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached()
 #11 /var/www/html/includes/Revision/RevisionRenderer.php(215): MediaWiki\Revision\RenderedRevision->getSlotParserOutput()
 #12 /var/www/html/includes/Revision/RevisionRenderer.php(152): MediaWiki\Revision\RevisionRenderer->combineSlotOutput()
 #13 [internal function]: MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}()
 #14 /var/www/html/includes/Revision/RenderedRevision.php(197): call_user_func()
 #15 /var/www/html/includes/Storage/DerivedPageDataUpdater.php(1309): MediaWiki\Revision\RenderedRevision->getRevisionParserOutput()
 #16 /var/www/html/includes/Storage/PageUpdater.php(749): MediaWiki\Storage\DerivedPageDataUpdater->getCanonicalParserOutput()
 #17 /var/www/html/extensions/Wikibase/repo/includes/Store/Sql/WikiPageEntityStore.php(374): MediaWiki\Storage\PageUpdater->saveRevision()
 #18 /var/www/html/extensions/Wikibase/repo/includes/Store/Sql/WikiPageEntityStore.php(234): Wikibase\Repo\Store\Sql\WikiPageEntityStore->saveEntityContent()
 #19 /var/www/html/extensions/Wikibase/lib/includes/Store/TypeDispatchingEntityStore.php(85): Wikibase\Repo\Store\Sql\WikiPageEntityStore->saveEntity()
 #20 /var/www/html/extensions/WikibaseImport/src/EntityImporter.php(145): Wikibase\Lib\Store\TypeDispatchingEntityStore->saveEntity()
 #21 /var/www/html/extensions/WikibaseImport/src/EntityImporter.php(115): Wikibase\Import\EntityImporter->createEntity()
 #22 /var/www/html/extensions/WikibaseImport/src/EntityImporter.php(74): Wikibase\Import\EntityImporter->importBatch()
 #23 /var/www/html/extensions/WikibaseImport/maintenance/importEntities.php(78): Wikibase\Import\EntityImporter->importEntities()
 #24 /var/www/html/maintenance/doMaintenance.php(105): Wikibase\Import\Maintenance\ImportEntities->execute()
 #25 /var/www/html/extensions/WikibaseImport/maintenance/importEntities.php(133): require_once(string)
 #26 {main}

thank you  – The preceding unsigned comment was added by 2001:8f8:1e23:1dca:60ba:229a:9b00:88dc (talk • contribs) at 08:05, 3 August 2020 (UTC).

This looks like you’re using incompatible versions of Wikibase and GeoData (most likely the master branch of Wikibase but the 1.34 release of GeoData). Make sure you’re using the same version across your MediaWiki install – either use the master branch everywhere, or the same release (branch), e.g. REL1_34. --Lucas Werkmeister (WMDE) (talk) 09:41, 3 August 2020 (UTC)
Thank you very much, switching the GeoData extension to REL1_35 solved the issue. 2001:8F8:1E23:1DCA:6453:4B8D:EA2C:7377 13:39, 3 August 2020 (UTC)
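
For reference, a quick way to spot such mismatches is to print the checked-out git branch of MediaWiki core and of every extension. A minimal Python sketch (the /var/www/html path is taken from the backtrace above):

 # Print the checked-out git branch of MediaWiki core and each extension,
 # to spot mismatches like master vs. REL1_34.
 import pathlib
 import subprocess
 root = pathlib.Path("/var/www/html")  # MediaWiki install root, per the backtrace
 repos = [root] + sorted(p for p in (root / "extensions").iterdir() if (p / ".git").exists())
 for repo in repos:
     result = subprocess.run(
         ["git", "-C", str(repo), "rev-parse", "--abbrev-ref", "HEAD"],
         capture_output=True, text=True,
     )
     print(f"{repo.name}: {result.stdout.strip()}")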

Wikipedia sitelink / "page not found" error

I’ve been trying to add the sitelink for Zayn Africa’s Wikipedia article, and it keeps returning “Page not found”.

The Wikidata link is: https://wikidata.org/wiki/Q97959357 and the Wikipedia link is: https://en.wikipedia.org/wiki/Zayn_Africa

Please help!

Themajidi (talk) 02:56, 1 August 2020 (UTC)

Themajidi, seems like you were able to get this done?  Hazard SJ  03:55, 2 August 2020 (UTC)
 Hazard SJ  Thank you so much, I got it done.
Themajidi (talk) 19:46, 3 August 2020 (UTC)

City Categories

Hello,

Right now I'm trying to run a few queries to get a specific set of cities. When I filtered cities for a population greater than 100k, I noticed that some cities didn't show up in the results.

Query:

https://query.wikidata.org/#SELECT%20DISTINCT%20%3FcityLabel%20%3Fpopulation%20%3Fcoord%20%3FcountryLabel%20%3FshortCountry%20%3Fcity%20%20WHERE%20%7B%0A%20%20%3Fcity%20%28wdt%3AP31%2F%28wdt%3AP279%2a%29%29%20wd%3AQ515%3B%0A%20%20%20%20wdt%3AP1082%20%3Fpopulation%3B%0A%20%20%20%20wdt%3AP625%20%3Fcoord.%0A%20%20FILTER%28%3Fpopulation%20%3E%20100000%20%29%0A%20%20%3Fcity%20wdt%3AP17%20%3Fcountry.%0A%20%20%3Fcountry%20wdt%3AP298%20%3FshortCountry.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22.%20%7D%0A%7D%0AORDER%20BY%20ASC%20%28%3FshortCountry%29
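
For readability, the linked query decodes to the following (URL-decoding only, nothing else changed):

 SELECT DISTINCT ?cityLabel ?population ?coord ?countryLabel ?shortCountry ?city WHERE {
   ?city (wdt:P31/(wdt:P279*)) wd:Q515;
     wdt:P1082 ?population;
     wdt:P625 ?coord.
   FILTER(?population > 100000)
   ?city wdt:P17 ?country.
   ?country wdt:P298 ?shortCountry.
   SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
 }
 ORDER BY ASC(?shortCountry)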

Now, for example, Berlin didn't show up.

So I dug into the Wikidata model of Berlin (https://www.wikidata.org/wiki/Q64) and found out that it's not an instance of city (Q515); it is, for example, a capital (Q5119). Unlike Berlin, Paris can be found under Q515. After that I tried finding Berlin via the capital class:

https://query.wikidata.org/#SELECT%20DISTINCT%20%3FcityLabel%20%3Fpopulation%20%3Fcoord%20%3Fcity%20%20WHERE%20%7B%0A%20%20%3Fcity%20%28wdt%3AP31%2a%29%20wd%3AQ5119%3B%0A%20%20%20%20wdt%3AP1082%20%3Fpopulation%3B%0A%20%20%20%20wdt%3AP625%20%3Fcoord.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22.%20%7D%0A%7D%0AORDER%20BY%20ASC%20%28%3FshortCountry%29
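
Decoded, this second query reads:

 SELECT DISTINCT ?cityLabel ?population ?coord ?city WHERE {
   ?city (wdt:P31*) wd:Q5119;
     wdt:P1082 ?population;
     wdt:P625 ?coord.
   SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
 }
 ORDER BY ASC(?shortCountry)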

Now here "East Berlin" is shown, but again no "Berlin".

I'm a little confused about whether the subclasses are actually working as intended in the queries.

I would really appreciate it if you could provide a solution to my problem.

Best regards, Lukas

Hi Lukas. Berlin (Q64) doesn't show up in your query results because you search for best rank results only by using the wdt: prefix. Since this edit by User:Tomastvivlaren, the only best rank statement with P31 for Berlin is Berlin (Q64) instance of (P31) federated state of Germany (Q1221156). You need to either undo that edit or search for all statement ranks using p:P31 / ps:P31. See more about statement types at mw:Wikibase/Indexing/RDF_Dump_Format#Statement types. --Dipsacus fullonum (talk) 13:37, 7 August 2020 (UTC)
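
To illustrate the fix, here is a minimal sketch (the script and User-Agent are placeholders, not part of the answer above) that lists all of Berlin's instance of (P31) values regardless of rank, by using p:P31/ps:P31 instead of the best-rank-only wdt:P31:

 # List every P31 value of Berlin (Q64) across all statement ranks.
 import requests
 QUERY = """
 SELECT ?class ?classLabel WHERE {
   wd:Q64 p:P31/ps:P31 ?class.  # all ranks, not just best rank
   SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
 }
 """
 response = requests.get(
     "https://query.wikidata.org/sparql",
     params={"query": QUERY, "format": "json"},
     headers={"User-Agent": "example-tool/0.1 (user@example.org)"},  # placeholder
 )
 for row in response.json()["results"]["bindings"]:
     print(row["class"]["value"], row["classLabel"]["value"])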

Creating /wiki pages with a REST API?

Hello,

I need to create 116 Listeria pages to fit in with the Template:Every_Politician/Legislature_table_row template. Is it possible to automate this task? For instance, can I create pages with a REST API similar to https://www.mediawiki.org/wiki/API:REST_API/Reference#Create_page? I tried with https://www.wikidata.org/w/rest.php/v1/page/wiki/Wikidata:WikiProject_every_politician/United_States_of_America/data/Senate/116th_United_States_Congress, but it didn't seem to work. I received a "did not match any known handler" error.

Thanks, Gettinwikiwidit (talk) 08:26, 4 August 2020 (UTC)

@Gettinwikiwidit: The right base URL for that page seems to be https://www.wikidata.org/w/rest.php/v1/page/Wikidata:WikiProject_every_politician%2FUnited_States_of_America%2Fdata%2FSenate%2F116th_United_States_Congress (the /wiki is not part of the page title, and the title also needs to be URL-encoded). I’m not sure if it’s also possible to create new pages through the REST API – it’s fairly new, and I haven’t used it myself yet. You could also use the action API (specifically edit), you might find more documentation with that API. --Lucas Werkmeister (WMDE) (talk) 09:25, 12 August 2020 (UTC)
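
In case the REST API route stays a dead end, here is a minimal sketch of creating such a page through the action API (action=edit), as suggested above. The User-Agent, summary, and page text are placeholders, and logging in (needed to edit under your account) is omitted:

 import requests
 API = "https://www.wikidata.org/w/api.php"
 session = requests.Session()
 session.headers["User-Agent"] = "example-tool/0.1 (user@example.org)"  # placeholder
 # Fetch a CSRF token first. Log in beforehand to edit under your account;
 # anonymously this returns the dummy token "+\".
 token = session.get(API, params={
     "action": "query", "meta": "tokens", "type": "csrf", "format": "json",
 }).json()["query"]["tokens"]["csrftoken"]
 # action=edit creates the page if it does not already exist.
 result = session.post(API, data={
     "action": "edit",
     "title": "Wikidata:WikiProject_every_politician/United_States_of_America/data/Senate/116th_United_States_Congress",
     "text": "...",  # the page wikitext goes here
     "summary": "Creating Every Politician data page",  # placeholder
     "token": token,
     "format": "json",
 })
 print(result.json())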

Missing Endpoint Class using SPARQL-JAVA jar from BorderCloud

Hello,

While making a few tests with Java and SPARQL, I used an example from the Wikidata Query Service. After executing a query, I copied the code from the Java example and downloaded the SPARQL-JAVA library, but I couldn't find how to import com.borderland.sparql.Endpoint anywhere I searched. Where can I find this class? Is there any other way to make it work? --NairdaTorage (talk) 12:59, 7 August 2020 (UTC)

@NairdaTorage: It looks like the SPARQL-JAVA library has undergone a complete rewrite in the meantime :/ maybe you can find an old version to use? I’ve also filed T259900 to update the example code we generate. --Lucas Werkmeister (WMDE) (talk) 16:41, 7 August 2020 (UTC)

Hello,

We are developing a new application in Java that could include some databases, and we would like to include the Wikidata database. However, after some research, and after finding APIs like WikiData Tool or Apache Jena, we would like to know if you have any recommendations for an API to run SPARQL queries from our project. Thank you in advance. --NairdaTorage (talk) 16:52, 30 July 2020 (UTC)

Hi @NairdaTorage:, on the Query Service page, after running a query you can click on "code" in the menu bar above the results, then choose "Java", and you will get a code snippet to embed the result dynamically in your Java code. Would that solve your problem? If not, could you give me more details about what you're trying to do? Thanks, Lea Lacroix (WMDE) (talk) 16:35, 11 August 2020 (UTC)
Hi @Lea Lacroix (WMDE):, thank you very much; it was indeed the best that I could find, even if the library is a bit outdated. However, I think this issue has already been reported, and I managed to adapt the code so that our application works with it. I wish I had found this button before, but it was still really helpful. Thank you again for your answer! --77.204.196.151 11:22, 20 August 2020 (UTC)

Wikidata Query Service dramatic performance drop (August 18)

I've written some code to do disambiguation of a table column using a SPARQL query (similar to OpenRefine but more efficient).

I ran this code today for the first time in a month, and the time per query seems to have gone up dramatically, from ~1 second to ~70 seconds. My code respects the Retry-After header, so I'm hoping this isn't because my IP has been blacklisted.

Is there a reason for this performance drop, and can you recommend any way I could speed things up? At the moment I'm considering running the API calls in parallel. - Kdutia (talk) 15:56, 18 August 2020 (UTC)

@Kdutia: Hm, I’m not sure why that would be. Looking at sparql.py, I would recommend a better User-Agent header, and I also notice that if there is a Retry-After header but it’s not an int (according to MDN, it may also be a date), the code won’t sleep at all; however, I don’t think either of these are likely to cause the slowdowns you get.
However, for the code in reconciler.py, you might not need the query service at all? The query only looks up the Wikidata item of this property (P1629) statements of one particular entity – you could also get those from EntityData or wbgetentities. --Lucas Werkmeister (WMDE) (talk) 16:53, 18 August 2020 (UTC)
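
For illustration, a sketch of Retry-After handling that accepts both forms of the header (a delay in seconds or an HTTP date), together with a descriptive User-Agent; the function name and User-Agent string are made up, not taken from sparql.py:

 import time
 from datetime import datetime, timezone
 from email.utils import parsedate_to_datetime
 import requests
 def sleep_for_retry_after(response):
     """Sleep as instructed by a Retry-After header, if one is present."""
     value = response.headers.get("Retry-After")
     if value is None:
         return
     try:
         seconds = int(value)  # e.g. "Retry-After: 120"
     except ValueError:
         # e.g. "Retry-After: Wed, 19 Aug 2020 09:00:00 GMT"
         seconds = (parsedate_to_datetime(value)
                    - datetime.now(timezone.utc)).total_seconds()
     time.sleep(max(seconds, 0))
 response = requests.get(
     "https://query.wikidata.org/sparql",
     params={"query": "SELECT * WHERE { ?s ?p ?o } LIMIT 1", "format": "json"},
     # A descriptive User-Agent with contact details helps avoid throttling:
     headers={"User-Agent": "example-reconciler/0.1 (user@example.org)"},
 )
 if response.status_code == 429:
     sleep_for_retry_after(response)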
@Lucas Werkmeister: Thanks. Good idea re using wbgetentities, but that search is only called once per column; I'm actually more concerned about disambiguation.search.wikidata_text_search, as it looks up the instance of/(class of)* tree, and as far as I'm aware this has to be done in SPARQL. Changing my User-Agent header solved the problem, and thanks for catching that bug about it not sleeping.  – The preceding unsigned comment was added by Kdutia (talk • contribs) at 09:04, 19 August 2020‎ (UTC).

Viewing deleted pages does not show labels

When I view a deleted revision (e.g. [1]), I see the claims as expected, but where the labels/descriptions/aliases box ought to be there is just a string like "$?UNIQ868cbb21f8405a0c#2$". This makes it harder to review deleted items. Does this happen for others, or is it just me? Bovlb (talk) 15:27, 28 August 2020 (UTC)

@Bovlb: Yes, for me too the labels are not visible for deleted items. I go through the history to view the details. -❙❚❚❙❙ GnOeee ❚❙❚❙❙ 15:38, 28 August 2020 (UTC)