Wikidata:Bot requests/Archive/2015/12

From Wikidata

Help with list of cities?

I am hoping to use de:Liste der Millionenstädte (which seems better for this purpose than any list in other languages) to completely revise the list of cities la:Maximae urbes orbis terrarum on Vicipaedia. Is it possible for a bot to help with the work? Specifically, for each city entry in de:Liste der Millionenstädte --

  1. List the la:pagename about that city, or, if there is none, list the de:pagename
  2. List the item number on Wikidata for the city page (Q...)
  3. List the la:pagename of the country the city is in
  4. List, directly from de:wiki, the continent the city is in

My long term aim is to improve the pages for major cities on la:wiki, starting from the revised list. Thanks in advance if anyone can help. Andrew Dalby (talk) 15:32, 1 December 2015 (UTC)

@Andrew Dalby: Does this help?  Hazard SJ  03:35, 3 December 2015 (UTC)
You might want to try Template:Wikidata list. --- Jura 12:06, 3 December 2015 (UTC)
Thank you to both. I will study the template linked by Jura -- it will be useful to me in the future -- but meanwhile Hazard-SJ has given me exactly the information I needed! Andrew Dalby (talk) 13:04, 3 December 2015 (UTC)
This section was archived on a request by: Andrew Dalby (talk) 13:05, 3 December 2015 (UTC)

Populate P27/nationality from Q21619956 (kowiki)

Field "국적" seems to include nationality information (with a template such as ko:Template:KOR). --- Jura 12:06, 3 December 2015 (UTC)

Bad bot behaviour

The bot reCh [1.0] abuses (IMHO) the property "said to be the same as (P460)" by adding Armenian country names. The bot should probably add a label with the Armenian country name instead (or add Wikipedia links).

Example https://www.wikidata.org/w/index.php?title=Q37&diff=282224600&oldid=282196639

I don't have a clue how to contact the author. Could anybody help out and contact him?

 – The preceding unsigned comment was added by Givegivetake (talk • contribs).

This section was archived on a request by: I answered on my talk page. Please refrain from labeling my edits in such a way. --- Jura 07:13, 11 December 2015 (UTC)
I updated my explanations on your talk page on why the problem is not the bot itself but the data source it uses. Really sorry for my wording.

 – The preceding unsigned comment was added by Givegivetake (talk • contribs).

TV series endings

  • Most TV series endings categories need links to Swedish. For example, "Category:1966 television series endings" should get "Kategori:TV-serieavslutningar 1966", "Category:1967 television series endings" should get "Kategori:TV-serieavslutningar 1967", and so on. J 1982 (talk) 23:18, 11 December 2015 (UTC)
  • Forget it, it has been fixed now. J 1982 (talk) 09:42, 12 December 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 12:46, 21 December 2015 (UTC)

cs description and p27

For this selection (2700 items), P27 could be defined and the description changed to use caps for the adjective. --- Jura 10:42, 4 December 2015 (UTC)

done --Pasleim (talk) 22:46, 30 December 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 22:46, 30 December 2015 (UTC)

P106 and Q15117302

If P641 is Q1734 then P106 should be at least Q15117302 but for hundreds of players, the field is empty (https://tools.wmflabs.org/autolist/index.php?language=fr&project=wikipedia&category=Joueuse%20de%20volley-ball&depth=12&wdq=claim%5B31%2C5%5D%20AND%20%20noclaim%5B106%5D%20AND%20claim%5B641%2C1734%5D&pagepile=&statementlist=&run=Run&find_label=1&mode_manual=or&mode_cat=or&mode_wdq=and&mode_find=or&chunk_size=10000 ) Dacoucou (talk) 09:59, 15 December 2015 (UTC)

OK, it is done, thanks a lot. Same issue with: If P641 is Q8418 then P106 should be at least Q13365117. Two links to verify: handballeur and handballeuse. Dacoucou (talk) 16:55, 15 December 2015 (UTC)
done --Pasleim (talk) 14:08, 21 December 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 18:10, 30 December 2015 (UTC)

Person:

Some stuff (about 150) that slipped through. --- Jura 19:06, 28 December 2015 (UTC)

done --Pasleim (talk) 20:58, 30 December 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 20:58, 30 December 2015 (UTC)

Add P569 (possibly P570 as well)

Based on descriptions here (about 100 items). --- Jura 19:16, 28 December 2015 (UTC)

done --Pasleim (talk) 20:26, 30 December 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 20:26, 30 December 2015 (UTC)

More Coordinates!

It would be nice if this constraint became empty! -- Innocent bystander (talk) 08:33, 30 December 2015 (UTC)

Maybe @Yger: wants to give it a try with QuickStatements (Q20084080). --- Jura 19:15, 30 December 2015 (UTC)
done --Pasleim (talk) 20:58, 30 December 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 20:58, 30 December 2015 (UTC)

Guidelines for automatical creation of entries

I am working on the back-end of the interface for the user-assisted migration of enwiki Persondata to Wikidata. Apart from the thousands of challenges already available there, there is data associated with Wikipedia articles that do not have Wikidata entries. I would like to create those entries automatically using my bot. What guidelines do I have to follow? I would assume the following procedure:

  1. checking whether the article is already connected to an entry → exit
  2. checking whether the article has inline language links → exit
  3. creating an entry with a sitelink to the article, an English label based on the article's name (disambiguation part removed), plus an instance of (P31)  human (Q5) statement

I did some test edits. See for instance Q21897838. Can I go on with this procedure? What else should I take into consideration? Thank you, -- T.seppelt (talk) 10:16, 31 December 2015 (UTC)
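The three numbered steps above could be sketched offline like this. This is a hedged sketch only: the `page` dict and the function names are hypothetical stand-ins for real pywikibot page objects, and only the decision logic is shown, not the live API calls.

```python
# Offline sketch of the proposed three-step procedure (not a live bot).
import re

def strip_disambiguator(title):
    """Step 3 helper: drop a trailing " (…)" disambiguation part."""
    return re.sub(r"\s*\([^()]*\)$", "", title)

def plan_item_creation(page):
    """Return the entity payload to create, or None if we must exit.

    page = {"title": str, "has_item": bool, "langlinks": list}
    """
    if page["has_item"]:    # 1. already connected to an entry -> exit
        return None
    if page["langlinks"]:   # 2. has inline language links -> exit
        return None
    # 3. sitelink + English label (disambiguation removed) + P31:Q5
    return {
        "labels": {"en": strip_disambiguator(page["title"])},
        "sitelinks": {"enwiki": page["title"]},
        "claims": [("P31", "Q5")],
    }
```

With pywikibot, step 1 would be `ItemPage.fromPage` raising on a missing item and step 2 a check of the article's language links.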

The English name can be used for more Latin-script languages (DE/FR/NL/SV/HR/ES/PT/IT/etc.). Maybe you can pull more data from an existing infobox, like birth date/date of death/etc. Edoderoo (talk) 22:07, 31 December 2015 (UTC)
Thank you. -- T.seppelt (talk) 13:40, 2 January 2016 (UTC)
This section was archived on a request by: T.seppelt (talk) 13:41, 2 January 2016 (UTC)

Add a distinction (P166) to the members of a category

Hello,

The category en:Category:Recipients of the Knight's Cross of the Iron Cross with Oak Leaves and Swords contains all the people that were decorated with Knight's Cross of the Iron Cross with Oak Leaves and Swords (Q3003477) (and lower levels).

I think a bot should run to add Knight's Cross of the Iron Cross with Oak Leaves and Swords (Q3003477) (and remove Iron Cross (Q154554), Knight's Cross of the Iron Cross (Q165558) and Knight's Cross of the Iron Cross with Oak Leaves (Q3003469) if present, as these are lower levels of this award).

I already did the job for Knight's Cross of the Iron Cross with Oak Leaves, Swords and Diamonds (Q3003470) and Knight's Cross with Golden Oak Leaves, Swords and Diamonds (Q3003471) (the highest levels).

After that the same job could be done with Knight's Cross of the Iron Cross with Oak Leaves (Q3003469) on en:Category:Recipients of the Knight's Cross of the Iron Cross with Oak Leaves.

Regards

--Hercule (talk) 15:35, 22 December 2015 (UTC)

Hello,
Is there a problem with this request?
Regards
--Hercule (talk) 14:50, 4 January 2016 (UTC)
You could try to do it with Autolist (Q21640555). If a person received several awards of the same category, we would list each one of them.
--- Jura 14:53, 4 January 2016 (UTC)
Could you help me to make a request on Autolist ? --Hercule (talk) 15:28, 4 January 2016 (UTC)
Ok, I understand how it works. Thanks. --Hercule (talk) 15:38, 4 January 2016 (UTC)
This section was archived on a request by:
--- Jura 12:42, 10 January 2016 (UTC)

Recurrent arwiki articles

Hey, is it possible to run a bot to add those instances to items mainly existing on the Arabic Wikipedia?

--Helmoony (talk) 18:56, 27 December 2015 (UTC)

You could try to do them with Autolist (Q21640555).
Sample for Disambiguation (click "Run", it then takes about 5-10 minutes to load/prepare, finds 7 items only). --- Jura 11:22, 28 December 2015 (UTC)
Actually, I'm not sure if it works, as your sample isn't included. Instead of searching that string, maybe using the disambiguation category gives better results. Sample (loads in 1-2 minutes). --- Jura 11:29, 28 December 2015 (UTC)
@Helmoony: I'm currently doing everything except the first one. The problem that could arise with Wikimedia disambiguation page (Q4167410) is that something on arwiki may be a disambig while it is a normal article on another Wikipedia. I will later check which pages in disambig categories exist only on arwiki and mark them as Wikimedia disambiguation page (Q4167410). --Edgars2007 (talk) 17:23, 10 January 2016 (UTC)
@Edgars2007: Are you going to do all that manually or by bot? It's gonna take you a lot of time manually. @Jura1: I'm gonna do them manually to avoid conflicts. At least if you or Edgars2007 can do by bot those that exist only on arwiki, it's gonna help me a lot. --Helmoony (talk) 18:00, 10 January 2016 (UTC)
@Helmoony: the page title is bot requests :) Yes, I'm doing it automatically. --Edgars2007 (talk) 18:04, 10 January 2016 (UTC)
@Edgars2007: It's because I've seen the edits in your contributions, not in a bot's contributions. Maybe you are using a special tool. I hope I can give other articles like those, because on our local wiki we usually use that method of naming. At least it's useful for Wikidata. --Helmoony (talk) 18:09, 10 January 2016 (UTC)
@Helmoony: I currently have a flooder flag (a temporary bot flag), but it looks like it's only appearing in my watchlist. I'm using Autolist (Q21640555) with MS Excel and some other tools. OK, yes, I can add some more statements, but it would be good to know what and where I should add within approximately the next 36 hours, because I plan to finally remove the flooder flag after some 2 days. --Edgars2007 (talk) 18:28, 10 January 2016 (UTC)
@Edgars2007: Articles ending with (رواية), which means (novel) → add P31 book and genre (P136) novel (Q8261). Articles ending with (مجلة), which means (magazine) → add P31 magazine (Q41298). Articles ending with (جريدة) or (صحيفة), which means (newspaper) → add P31 newspaper (Q11032). --Helmoony (talk) 18:55, 10 January 2016 (UTC)
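The suffix-to-statement mapping described above could be sketched like this. A hedged sketch: the QIDs for novel, magazine and newspaper are the ones given in the request, while Q571 (book) is my assumption for the unstated "Book" item.

```python
# Map arwiki title suffixes to the statements to add (sketch only).
SUFFIX_CLAIMS = {
    "(رواية)": [("P31", "Q571"), ("P136", "Q8261")],  # novel: book + genre novel
    "(مجلة)": [("P31", "Q41298")],                     # magazine
    "(جريدة)": [("P31", "Q11032")],                    # newspaper
    "(صحيفة)": [("P31", "Q11032")],                    # newspaper
}

def claims_for_title(title):
    """Statements to add for an arwiki title, judged by its suffix."""
    for suffix, claims in SUFFIX_CLAIMS.items():
        if title.strip().endswith(suffix):
            return claims
    return []
```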
@Helmoony: should be ✓ Done. --Edgars2007 (talk) 14:19, 12 January 2016 (UTC)
This section was archived on a request by: --Edgars2007 (talk) 11:08, 16 January 2016 (UTC)

mark Hidden categories as Administrative

https://www.wikidata.org/wiki/Wikidata:Project_chat/Archive/2015/04#Marking_adiministrative_categories says that all Hidden categories are Administrative. I tried to do this with http://tools.wmflabs.org/autolist/index.php by specifying Category=Hidden categories, depth=1 (all admin cats are direct subcats of "Hidden categories"):

  • set P31:Q15647814 Wikimedia administration category page
  • delete P31:Q4167836 Wikimedia category

But there are 15,859 categories in total (see https://en.wikipedia.org/wiki/Category:Hidden_categories), so the tool takes forever.


Note: this was dismissed in 2015/05 without much discussion, but I'd like to try one more time --Vladimir Alexiev (talk) 13:30, 29 December 2015 (UTC)

@Sjoerddebruin: commented in 2015/05:

  • "Don't forget that other projects could do something different. :)"
    • Do you mean non-English language Wikipedias? Can you please give an example of a Mediawiki project where Hidden categories are not Administrative?
  • "Seems controversial then, no response of requester"
    • Sorry Sjoerd that I have not watched the page. I'll watch this page now, and ping is a useful macro. Cheers!


FYI: on rowiki all stubs categories are hidden. --XXN, 14:30, 28 March 2016 (UTC)
This section was archived on a request by: Jura 08:14, 8 April 2016 (UTC)

Transfer of asteroids labels

Help needed to copy label content from Russian to Armenian in items that link to Q3863. Please include only labels whose titles consist of a combination of letters and figures, like Q8604, but not names like Q2356051, if possible. Transfer from English is preferred if brackets () are added to the asteroid number; no limits on contents in that case. Please don't touch labels that already exist in Armenian. Thank you - Kareyac (talk) 07:37, 11 October 2015 (UTC)

This could be done with Add Names as labels (Q21640602) --- Jura 12:06, 4 December 2015 (UTC)


Connecting pages

On ceb.wiki and war.wiki there are several thousand homonymous categories not connected on Wikidata. It's necessary to check cross-wiki whether a pagename exists on ceb.wiki + war.wiki, and also on sv.wiki, and then connect these pages into one item; in 99% of cases they are about taxa. --XXN, 12:34, 22 October 2015 (UTC)

Example of pages to be interconnected

--XXN, 00:44, 3 November 2015 (UTC)

This could be done with Item Creator (Q21640495) (to create items) and QuickStatements (Q20084080) to connect additional languages/sitelinks. Ideally the cebwiki/warwiki/svwiki data would be maintained directly in Wikidata. --- Jura 12:06, 4 December 2015 (UTC)

Add place of burial (P119) based on image of grave (P1442)

Commons might have the information to add the property for the 2573 items currently listed at Wikidata:Database_reports/Constraint_violations/P1442#.22Item_place_of_burial_.28P119.29.22_violations. --- Jura 05:58, 6 July 2015 (UTC)

Can you point me to an example in which Commons stores this information? --Pasleim (talk) 12:56, 26 July 2015 (UTC)
Normally, there should be a category on the image that would indicate the location, but Commons isn't that well structured. For the first five on today's constraint report:
  1. Q882#P1442 > File:Charlie_Chaplin_grave.jpg > Commons:Category:Charles Chaplin Grave > Commons:Category:Cemetery of Corsier-sur-Vevey > no QID linked, item doesn't exist yet
  2. Q2767#P1442 > File:Grave of Daniel Morgan, Winchester, Virginia - Stierch.jpg > Commons:Category:Mount Hebron Cemetery (Winchester, Virginia) > no QID linked, Q16895462 exists.
  3. Q5977#P1442 > File:HollyGrave850909.JPG > Commons:Category:Graves in the United States > no QID, Q30 might be too general
  4. Q7122#P1442 > File:Tombe Michel Audiard, Cimetière de Montrouge.jpg > Commons:Category:Cimetière de Montrouge > no QID linked, Q2972544 exists.
  5. Q24078#P1442 > File:La Jana - Waldfriedhof Dahlem.jpg > Commons:Category:Waldfriedhof Dahlem > no QID linked, Q875626 exists
Numbers 2,4,5 could work if the category was linked to the Wikidata item for the cemetery.
The easiest approach might be to query the images for categories and then attempt to search the corresponding QID for the more frequent ones and add those. --- Jura 13:15, 26 July 2015 (UTC)
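The frequency step of that approach could be sketched as follows. This is a hedged sketch: the per-image category lists are placeholders for what a real Commons API query would return.

```python
# Tally Commons categories across the grave images so that the most
# frequent ones (likely cemeteries) can be matched to QIDs first.
from collections import Counter

def frequent_categories(image_categories, min_count=2):
    """image_categories: one list of category names per image."""
    counts = Counter(c for cats in image_categories for c in cats)
    return [c for c, n in counts.most_common() if n >= min_count]
```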

Wikidata:Database reports/top missing properties by number of sitelinks/P119 gives a list of some that might be worth doing manually. --- Jura 22:39, 16 December 2015 (UTC)

Import P569/P570 dates from ruwiki (text)

Wiki: ruwiki
Items without P569 (as of Oct 12): 37361 (12 % of all)
overview


Some ruwiki articles still have dates in the article text:

  • Sample: Q20742501, ru.wikipedia.org: Худалов, Харитон Алексеевич
  • Sample text: Харитон Алексеевич Худалов (9 января 1905, с. Махческ, Терская область - 7 июля 2000, Владикавказ)
  • Format: (D MMM YYYY, ... - D MMM YYYY, ...)

To avoid calendar issues, skip pre-1919 dates. --- Jura 08:53, 11 October 2015 (UTC)
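A sketch of the proposed extraction: find the "D MMM YYYY" pairs in the lead sentence and drop dates before the 1919 cutoff to sidestep the Julian/Gregorian calendar issue, as described above. A hedged sketch only; a real import would need sourcing, precision handling and many more lead-sentence variants.

```python
# Extract birth/death dates from a ruwiki lead sentence of the form
# "(D MMM YYYY, … - D MMM YYYY, …)", skipping pre-cutoff dates.
import re

RU_MONTHS = {m: i + 1 for i, m in enumerate(
    ["января", "февраля", "марта", "апреля", "мая", "июня",
     "июля", "августа", "сентября", "октября", "ноября", "декабря"])}

DATE_RE = re.compile(r"(\d{1,2})\s+(" + "|".join(RU_MONTHS) + r")\s+(\d{4})")

def extract_dates(lead, cutoff=1919):
    """Return (birth, death) as (year, month, day) tuples; None where
    a date is missing or earlier than the calendar cutoff."""
    found = DATE_RE.findall(lead)
    def ok(m):
        day, month, year = int(m[0]), RU_MONTHS[m[1]], int(m[2])
        return (year, month, day) if year >= cutoff else None
    birth = ok(found[0]) if len(found) > 0 else None
    death = ok(found[1]) if len(found) > 1 else None
    return birth, death
```

On the sample above, the 1905 birth date is skipped and only the death date (7 July 2000) would be imported.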

  • The proposed import does not take Gregorian/Julian calendar issue into account. -- Vlsergey (talk) 05:17, 28 December 2015 (UTC)

USA population

I would like US population data for US communes. Source: tools.wmflabs.org.

Thank You! --B.Zsolt (talk) 10:46, 10 November 2015 (UTC)

@B.Zsolt: Please provide the official source of the data and, if possible, provide the data in a text or CSV format. If we want to add data to Wikidata we should extract it from the official data set instead of using an intermediate data set. Snipre (talk) 13:40, 4 December 2015 (UTC)
Looks like the US Census is the source. --Pyfisch (talk) 20:22, 27 December 2015 (UTC)
There are CSV files provided by the Census Bureau under https://www.census.gov/popest/data/index.html --Pyfisch (talk) 20:40, 27 December 2015 (UTC)

VIAF Identifiers for Dutch streets

There are about 900 items like Q19302580 for individual streets in the Netherlands. They are all instance of (P31)  street (Q79007), country (P17)  Netherlands (Q55), have a Dutch label "straat in ...", no sitelinks at all, and additionally the properties P276, P969, P625 and P281.

VIAF has taken notice of these items and wrongly assigned them to (mostly geographic) entities. Recently these assignments were imported here, resulting in P214 with reference stated in (P248)  Virtual International Authority File (Q54919) and retrieved (P813) some time this year. To remedy the situation on both sides, P214 should be deleted on these items and then set to "novalue" (VIAF will take this as a hint to disassociate the WD item from its clusters). -- Gymel (talk) 19:31, 16 October 2015 (UTC)
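For the second step, a "novalue" statement would be created via the Wikibase API rather than with Autolist. A hedged sketch of the claim JSON a bot would send (shape per the Wikibase data model; not a live API call):

```python
# Build the claim JSON for a P214 "novalue" statement, signalling that
# the item is known to have no VIAF identifier.
def novalue_claim(prop="P214"):
    return {
        "mainsnak": {"snaktype": "novalue", "property": prop},
        "type": "statement",
        "rank": "normal",
    }
```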

This could be done with Autolist (Q21640555). --- Jura 12:06, 4 December 2015 (UTC)
Neither deleting values with specific qualifiers nor setting to novalue is possible with Autolist, at least when I last checked some weeks ago. -- Gymel (talk) 22:22, 29 December 2015 (UTC)
Removal works, but not setting novalue. --- Jura 19:16, 30 December 2015 (UTC)

Correct a massive redundancy problem in references related to a document

The url http://www.scb.se/Statistik/MI/MI0810/2010A01Z/05_Tatorter2010_befolkning_1960_2010.xls is massively used to reference some statements, but its title is inlined into the references and uses the old string datatype. There is now an item for this document, Population in urban areas 1960-2010 (Q21855710), and substituting the set of claims used in those references would save a lot of work when using https://tools.wmflabs.org/pltools/addlingue/ as it keeps popping up for every claim, while solving a massive redundancy problem. Thanks :) author  TomT0m / talk page 17:29, 28 December 2015 (UTC)

  • Same for "List of FIPS region codes" and its parts, "List of FIPS region codes (D-F)" for example: about 230 results on Wikidata seen by Google: https://www.google.fr/?gws_rd=ssl#q=site:wikidata.org+%22List+of+FIPS+region+codes+%28D%E2%80%93F%29%22&start=10 . A little more difficult. author  TomT0m / talk page 13:09, 29 December 2015 (UTC)
    • I'm not convinced. It seems that some people prefer this approach (non-contributing users of Wikidata and talk-show hosts). --- Jura 13:53, 29 December 2015 (UTC)
      • I'm not asking for a policy, just for a bot. Here it clearly makes maintenance hard to hardcode the string into every reference, as to make a correction we would need to correct every single instance of this string, and the same for the next correction that might occur. Although we should clearly highlight the need to create an item in those cases in our good-practices list. author  TomT0m / talk page 16:04, 29 December 2015 (UTC)
        To be more explicit: this string is monolingual and needs to be "multilingualised". This is tedious and boring (kind of fast to do manually, but still a lot of work). Having a bot do the job means all subsequent corrections only need to be made on the item. author  TomT0m / talk page 16:13, 29 December 2015 (UTC)
        Last thing: once this work is done, the reference is just a click away for consultation (or no click away, with a popup) or for correction. So I would not understand what the drawbacks would be, even if some contributors would prefer to write the information again and again (an unwise thing to do, in my mind, if you use the same reference in many statements ...) author  TomT0m / talk page 16:24, 29 December 2015 (UTC)