Wikidata:Bot requests

Bot requests
If you have a bot request, add a new section using the button and tell exactly what you want. To reduce the process time, first discuss the legitimacy of your request with the community in the Project chat or in the Wikiprojects's talk page. Please refer to previous discussions justifying the task in your request.

For botflag requests, see Wikidata:Requests for permissions.

Tools available to all users which can be used to accomplish the work without the need for a bot:

  1. PetScan for creating items from Wikimedia pages and/or adding same statements to items
  2. QuickStatements for creating items and/or adding different statements to items
  3. Harvest Templates for importing statements from Wikimedia projects
  4. Descriptioner for adding descriptions to many items
On this page, old discussions are archived. An overview of all archives can be found at this page's archive index. The current archive is located at 2018/01.
SpBot archives all sections tagged with {{Section resolved|1=~~~~}} after 2 days.
You may find these related resources helpful:

Data Import Hub
Why import data into Wikidata
Learn how to import data
Bot requests
Ask a data import question
Data Import Archive

Taxon labels[edit]

For items where instance of (P31) = taxon (Q16521), and where there is already a label in one or more languages which is the same as the value of taxon name (P225), the label should be copied to all other empty western-alphabet labels. For example, this edit. Please can someone attend to this? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:11, 10 March 2016 (UTC)

Do you mean label or alias? I would support the latter where there is already a label and that label is not already the taxon name. --Izno (talk) 17:03, 10 March 2016 (UTC)
No, I mean label; as per the example edit I gave. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:16, 10 March 2016 (UTC)
See your last request: Wikidata:Bot_requests/Archive/2015/08#Taxon_names. --Succu (talk) 18:57, 10 March 2016 (UTC)
Which was archived unresolved. We still have many thousands of missing labels. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:16, 10 March 2016 (UTC)
Nope. There is no consensus doing this. Reach one. --Succu (talk) 20:22, 10 March 2016 (UTC)
You saying "there is no consensus" does not mean that there is none. Do you have a reasoned objection to the proposal? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:56, 10 March 2016 (UTC)
Go back and read the linked discussions. In the nursery of wikidata some communities had strong objections. If they changed their mind my bot can easily execute this job. --Succu (talk) 21:19, 10 March 2016 (UTC)
So that's a "no" to my question, then. I read the linked discussions, and mostly I see people not discussing the proposal, and you claiming "there is no consensus", to which another poster responded "What I found, is a discussion of exactly one year old, and just one person that is not supporting because of 'the gadgets then need to load more data'. Is that the same 'no consensus' as you meant?". There are no reasoned objections there, either. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:24, 10 March 2016 (UTC)
For the lazy ones:
--Succu (talk) 21:53, 10 March 2016 (UTC)
I already did this for Italian labels in the past. Here are two other proposals: May 2014 and March 2015. --ValterVB (talk) 09:54, 11 March 2016 (UTC)
@ValterVB: Thank you. Can you help across any other, or all, western-alphabet languages, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:18, 16 March 2016 (UTC)
Yes, I can do it, but before modifying 2,098,749 items I think it is necessary to have a strong consensus. --ValterVB (talk) 18:14, 16 March 2016 (UTC)
@ValterVB: Thank you. Could you do a small batch, say 100, as an example, so we can then ask on, say, Project Chat? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:03, 18 March 2016 (UTC)
Simply ask with the example given by you. --Succu (talk) 15:16, 18 March 2016 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── @Pigsonthewing:

  • Test edit: Q14945671, Q21444273, Q2508347, Q25247.
  • Language: "en","de","fr","it","es","af","an","ast","bar","br","ca","co","cs","cy","da","de-at","de-ch","en-ca","en-gb","eo","et","eu","fi","frp","fur","ga","gd","gl","gsw","hr","ia","id","ie","is","io","kg","lb","li","lij","mg","min","ms","nap","nb","nds","nds-nl","nl","nn","nrm","oc","pcd","pl","pms","pt","pt-br","rm","ro","sc","scn","sco","sk","sl","sr-el","sv","sw","vec","vi","vls","vo","wa","wo","zu"
  • Rule:

Very important: it is necessary to verify that the list of languages is complete. It is the same list that I use for disambiguation items. --ValterVB (talk) 09:42, 19 March 2016 (UTC)
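The rule above can be sketched as follows. This is a hypothetical helper with made-up data shapes, not ValterVB's actual bot code; it only illustrates the proposed behaviour: copy the taxon name into every listed western-alphabet language that has no label yet, never touching existing labels.

```python
# Sketch of the proposed rule (hypothetical data shapes, not actual bot code).

TAXON = "Q16521"
# Excerpt of the language list above; a real run would use the full list.
LANGS = ["en", "de", "fr", "it", "es", "nl", "pl", "pt", "sv"]

def labels_to_add(claims, labels, langs=LANGS):
    """Return {lang: label} pairs a bot would add to one item."""
    if TAXON not in claims.get("P31", []):
        return {}                       # not a taxon item
    names = claims.get("P225", [])
    if len(names) != 1:
        return {}                       # zero or conflicting taxon names: skip
    return {lang: names[0] for lang in langs if lang not in labels}
```

For example, an item with only an English label "Dayus" would get the same string proposed for every other listed language that is still empty.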

    • I really don't like the idea of this. The label, according to Help:Label, should be the most common name. I doubt that most people are familiar with the latin names. Inserting the latin name everywhere prevents language fallback from working and stops people from being shown the common name in another language they speak. A very simple example, Special:Diff/313676163 added latin names for the de-at and de-ch labels which now stops the common name from the de label from being shown. - Nikki (talk) 10:29, 19 March 2016 (UTC)
      • @Nikki: The vast majority of taxons have no common name; and certainly no common name in every language. And of course edits can subsequently be overwritten if a common name does exist. As for fallback, we could limit this to "top level" languages. Would that satisfy? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:02, 19 March 2016 (UTC)
        • As far as I'm aware most tools rely on the absence of certain information. Adding #10,000 csv file of Latin / Welsh (cy) species of birds would then have to be done by hand. --Succu (talk) 23:11, 19 March 2016 (UTC)
          • Perhaps this issue could be resolved by excluding certain groups? Or the script used in your example could overwrite the label if it matches the taxon name? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:14, 23 March 2016 (UTC)
        • It may be the case that most taxon items won't have a common name in any language, but I don't see anything here which is only trying to target the taxon items which have no common names. Adding the same string to lots of labels isn't adding any new information and as Succu pointed out, doing that can get in the way (e.g. it makes it more difficult to find items with missing labels, it can get in the way when merging (moving common names to the aliases because the target already has the latin name as a label) and IIRC the bot which adds labels for items where a sitelink has been recently added will only do so if there is no existing label). To me, these requests seem like people are trying to fill in gaps in other languages for the sake of filling in the gaps with something (despite that being the aim of the language fallback support), not because the speakers of those languages think it would be useful for them and want it to happen (if I understand this correctly, @Innocent bystander: is objecting to it for their language). - Nikki (talk) 22:40, 22 March 2016 (UTC)
          • Yes, the tolerance against bot-mistakes is limited on svwiki. Mistakes initiated by errors in the source is no big issue, but mistakes initiated by "guesses" done by a bot is not tolerated at all. The modules we have on svwiki have no problem handling items without Swedish labels. We have a fallback-system which can use any label in any language. -- Innocent bystander (talk) 06:39, 23 March 2016 (UTC)
            • @Innocent bystander: This would not involve any "guesses". Your Wikipedia's modules may handle items without labels, but what about third-party reusers? Have you identified any issues with the test edits provided above? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:14, 23 March 2016 (UTC)
              • No, I have not found any issue in the examples. But this is not my subject; I would not see an issue even if it was directly under my nose. Adding correct statements for scientific names and common names looks more important for third-party users here than labels, which cannot be sourced. NB: the work of Lsjbot means that Swedish and Cebuano probably have more labels than any other language in the taxon set. You will not miss much by excluding 'sv' in this bot run. -- Innocent bystander (talk) 07:00, 24 March 2016 (UTC)
                • If a taxon name can be sourced, then by definition so can the label. If you have identified no errors, then your reference to "guesses" is not substantiated. True, adding statements for scientific names and common names is important, but the two tasks are not mutually exclusive, and their relative importance is subjective. To pick one example at random, from the many possible: Dayus (Q18107066) currently has no label in Swedish, and so would benefit from the suggested bot run. Indeed, it currently has only 7 labels, all the same, and all using the scientific name. What are the various European languages' common names for this mainly Chinese genus? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:34, 25 March 2016 (UTC)
          • No, this is not "trying to fill in gaps in other languages for the sake of filling in the gaps". Nor are most of the languages affected served by fallback. If this task is completed, then "find items with missing labels" will not be an issue for the items concerned, because they will have valid labels. Meanwhile, what is the likelihood of these labels being provided manually? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:14, 23 March 2016 (UTC)
            • If this is not trying to fill in the gaps for the sake of filling in the gaps, what problem is it solving, and why does language fallback not help? (I'm sure the development team would like to know that language fallback is not working properly.) The taxonomic names are not the preferred labels, and valid is not the same as useful (adding "human" as the description for humans with no description was valid, yet users found it annoying and useless and they were all removed again). The labels for a specific language in that language are still missing even if we make it seem like they're not by filling in all the gaps with taxonomic names; it's just masking the problem. I can't predict the future, so I don't see any point in speculating how likely it is that someone will come along and add common names. They might, they might not. - Nikki (talk) 23:02, 24 March 2016 (UTC)
              • It solves the problem of an external user making a query (say, for "all species in genus X") being returned Q items with no labels in their language. This could break third-party applications, too. In some cases there is currently no label in any language; how does language fallback work then? How does it work if the external user's language is Indonesian, and there is only an English label saying, say, "Lesser Spotted Woodpecker"? And, again, taxonomic names are the preferred labels for the many thousands of species - the vast majority - with no common name, or with no common name in a given language. The "human" example compares apples with pears: this is a proposal to add specific labels, not vague descriptions (the equivalent would be adding "taxon" as a description). Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:26, 25 March 2016 (UTC)
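A hypothetical query of the kind described, shown here only as generated text (the property paths and genus Q-id are illustrative, not taken from an actual third-party application):

```python
# Builds (does not run) a SPARQL query for all species under a genus with
# labels requested in one language. Items with no such label come back as
# bare Q-ids, which is the problem described above.

def species_label_query(genus_qid, lang):
    return (
        "SELECT ?item ?label WHERE {\n"
        f"  ?item wdt:P171* wd:{genus_qid} ;\n"
        "        wdt:P31 wd:Q16521 .\n"
        f'  OPTIONAL {{ ?item rdfs:label ?label FILTER(LANG(?label) = "{lang}") }}\n'
        "}\n"
    )

print(species_label_query("Q18107066", "id"))
```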
                • Why should an external user query a Wikidata internal field called label, and not rely on a query of taxon name (P225)? --Succu (talk) 22:04, 25 March 2016 (UTC)
                  • For any of a number of reasons; not least that they may be querying things which are not all taxons. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:32, 26 March 2016 (UTC)
                    • Grand answer. Maybe they are searching the labels for aliens, gods, fairy tales or something else? A better solution would be if Wikibase could be configured to take certain properties, such as taxon name (P225) or title (P1476), as a default language-independent label. --Succu (talk) 21:09, 27 March 2016 (UTC)
                      • Maybe it could. But it is not. That was suggested a year or two ago, in the discussions you cited above, and I see no move to make it so, nor any significant support for doing so. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:19, 27 March 2016 (UTC)
                        • So what? Did you reach an agreement with svwiki, cebwiki, warwiki, viwiki or nlwiki that we should go along your proposed way? --Succu (talk) 21:43, 27 March 2016 (UTC)
    • @ValterVB: Thank you. I think your rules are correct. I converted the Ps & Qs in your comment to templates, for clarity. Hope that's OK. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:02, 19 March 2016 (UTC)
  • Oppose That the majority of taxons do not have a common name does not mean that all western languages should automatically use the scientific name as the label. Matěj Suchánek (talk) 13:23, 16 April 2016 (UTC)
    • Nobody is saying "all western languages should automatically use the scientific name as label"; if the items already have a label, it won't be changed. If a scientific name is added as a label where none existed previously, and that label is later changed to some other valid string, the latter will not be overwritten. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:31, 20 April 2016 (UTC)

We seem to have reached a stalemate, with the most recent objections being straw men, or based on historic and inconclusive discussions. How may we move forward? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:28, 16 May 2016 (UTC)

That's simple: drop your request. --Succu (talk) 18:33, 16 May 2016 (UTC)
Were there a cogent reason to, I would. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:57, 17 May 2016 (UTC)
Anyone? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:04, 10 September 2016 (UTC)
@Pigsonthewing: I'll support the proposal if it is limited to major languages that don't have other fallbacks. For most taxons, the scientific name is the only name, and even for taxons with a common name, having the scientific name as the label is better than having no label at all. I'm reluctant to enact this for a huge number of languages though, as it might make merges (which are commonly needed for taxons) a pain to complete. Kaldari (talk) 23:02, 28 September 2016 (UTC)
@Kaldari: Thank you. Please can you be more specific as to what you mean by "major languages that don't have other fallbacks"? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:35, 29 September 2016 (UTC)
@Pigsonthewing: Maybe just the biggest Latin languages: English, German, Spanish, French, Portuguese, Italian, Polish, Dutch. Kaldari (talk) 18:29, 29 September 2016 (UTC)
I'm not sure why we'd limit ourselves to them, but if we can agree they should be done, let's do so. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:40, 29 September 2016 (UTC)
@Kaldari: Did you see my reply? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:48, 10 October 2016 (UTC)

Strong oppose As said before... --Succu (talk) 22:02, 10 October 2016 (UTC)

What you actually said was "There is no consensus doing this. Reach one.". My reply was "You saying 'there is no consensus' does not mean that there is none. Do you have a reasoned objection to the proposal?", and you provided none then, nor since. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:11, 11 October 2016 (UTC)

It appears that Succu now agrees that adding scientific names of taxons as labels is a good thing. Shall we now proceed with a bot? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:53, 22 February 2017 (UTC)

Nonsense. My bot is adding labels in a few languages where there is agreement to do so. --Succu (talk) 17:07, 22 February 2017 (UTC)
Your bot is - as can be seen in the diff I provided, by anyone who chooses to look at it - adding scientific names of taxons as labels, which is exactly what I proposed we use a bot to do; and what you claimed to have opposed. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:38, 23 February 2017 (UTC)
...where there is agreement to do so. --Succu (talk) 14:02, 23 February 2017 (UTC)
Agreement from whom? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:09, 23 February 2017 (UTC)
Simply follow the discussions. --Succu (talk) 14:16, 23 February 2017 (UTC)
Please answer the question. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:37, 23 February 2017 (UTC)
I did often enough. So it's meanwhile a little bit boring. --Succu (talk) 15:50, 23 February 2017 (UTC)
I ask you a third time: Agreement from whom? Please answer, succinctly and unambiguously, with links, otherwise people may reasonably conclude that there are no such agreements. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:00, 23 February 2017 (UTC)
Again: follow the links in this thread. There is no need to repeat it again and again. --Succu (talk) 16:06, 23 February 2017 (UTC)
Succu has finally given an answer at Wikidata:Project chat#Agreement to add scientific names of taxons as labels, where discussion continues. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:53, 24 February 2017 (UTC)
I'm not willing to answer in a thread full of insults. It doesn't help to move on.
Over there I stated „At earlier discussions that RfC was accepted as a starting point.“, referring to Wikidata:Requests for comment/Automatic labelling. To find the follow-up discussions, please scroll to the top of this thread.
Here I stated „...where there is agreement to do so“. This refers to the current common practice implemented by different bot owners, which stretches the scope of the aforementioned RfC. I'm not aware that the Wikidata community disagreed with these additions. My bot activity regarding adding ruwiki labels based on P225 was scrutinized at User:Succu/Archive/2015#Latest_SuccuBot_activities. There are some more pros and cons as to why not to do this.
Mr. Mabbett, could you please summarize your arguments for why we should mass-add labels written in the en:Western alphabet based on taxon name (P225)? BTW: this query returns more than 240 language codes, including some nonexistent ones like lat-vul.
--Succu (talk) 21:56, 24 February 2017 (UTC)
The "Western alphabet" is inescapable, no matter of the language of the project, as both major Codes of nomenclature prescribe it. - Brya (talk) 05:51, 25 February 2017 (UTC)
Mr. Mabbett says: „In the light of comments made here I propose shortly to get a bot ...” --Succu (talk) 20:57, 28 February 2017 (UTC)
@billinghurst, Jarekt, Tagishsimon: Sry. --Succu (talk) 21:00, 28 February 2017 (UTC)
Indeed I do. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:29, 19 March 2017 (UTC)
Anyway, from time to time editors appear who clone related labels for several items, or even for small batches of items, but this is a drop in the ocean. Keeping around a million items without labels in many of the most important languages isn't the best choice. XXN, 21:35, 30 June 2017 (UTC)
  • I'd Support this request, but with a clause: do not do this for *absolutely all taxon items*. Begin with taxons which have only a few sitelinks (especially in "botopedias") or no sitelinks at all; these should be taxons which are exotic, uncommon or almost unknown, and which most probably do not yet have a common name in most languages. The percentage of potential "unwanted additions" in such a case is very low. XXN, 21:35, 30 June 2017 (UTC)
At least one of your botopedias said no. --Succu (talk) 21:39, 30 June 2017 (UTC)
So adding the scientific name as the default Russian and Bulgarian label is OK, but not for other languages? Maybe this was discussed somewhere else; I don't know. What, then, would be the opposing arguments for doing the same for Romanian, for example?
I think this discussion needs wider attention and involvement from the community, and perhaps it should take place either at Project Chat or an RFC. XXN, 19:01, 2 July 2017 (UTC)
@XXN: Please see Wikidata:Project chat/Archive/2017/03#Agreement to add scientific names of taxons as labels. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:17, 3 July 2017 (UTC)

OpenStreetMap objects[edit]

(Pinging participants in the deletion discussion for OpenStreetMap Relation identifier (P402): Yurik, Jura1, MaxSem, Kolossos, Susanna Ånäs, Abbe98, Andy Mabbett, d1gggg, Jklamo, Denny, Nikki, Sabas88, Thierry Caro, Glglgl, Frankieroberto, VIGNERON, and Kozuch.)

This is (for now) a draft proposal, but I'd rather not put this in the deletion debate where fewer people with bot experience would see this. I also did not realize this page existed on Wikidata and put a bot suggestion in the community portal a few weeks ago for some reason (no responses), which is why this is rather late (the deletion discussion began in November). Feel free to modify this, because I don't really know how this would work but wanted to make a request anyway because apparently no one's done it yet. Jc86035 (talk) 11:51, 17 January 2017 (UTC)

Could we have a bot which

  1. automatically pulls Wikidata item links from OSM objects' wikidata tags (but, if necessary for data quality, only if the item matches the Wikipedia article in the object's wikipedia tag), from the whole database initially and from new changesets thereafter;
  2. updates Wikidata items' OpenStreetMap Relation identifier (P402) (as well as properties for way and node tags, if created) from the initial dump and afterwards (human-approved if there's more than one object linked to the same Wikipedia article/Wikidata item ID);
  3. deletes the property value(s) from Wikidata items whenever a Wikidata ID is removed from an OSM object and not readded to another object (but, if necessary for data quality, only if the wikipedia tag is also removed without replacement; and only if manually approved for OSM users removing more than 10 of them within 24 hours); and
  4. makes a list of one-way links in the Wikidata → OSM direction and a list of OSM objects/Wikidata items where the links between items don't match each other?
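Step 4 amounts to a set comparison between the two link directions. A minimal sketch, with hypothetical input shapes (the real bot would first have fetched both mappings from dumps or APIs):

```python
# Sketch of step 4: report one-way and mismatched links between the projects.
# wd_to_osm: P402 values on Wikidata items (qid -> relation id).
# osm_to_wd: wikidata tags on OSM relations (relation id -> qid).

def link_report(wd_to_osm, osm_to_wd):
    one_way, mismatched = [], []
    for qid, rel in wd_to_osm.items():
        back = osm_to_wd.get(rel)
        if back is None:
            one_way.append((qid, rel))           # no wikidata tag on the OSM side
        elif back != qid:
            mismatched.append((qid, rel, back))  # the tags point at different items
    return one_way, mismatched
```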

In addition, would it be possible to use the same bot to automate this on OSM in the other direction as well? (I haven't notified the wiki or the OSM mailing lists or anything.)

Many thanks, Jc86035 (talk) 11:51, 17 January 2017 (UTC)

  • Oppose 1-3, for the reasons given in the deletion discussion; and ad nauseam. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:23, 17 January 2017 (UTC)
  • Oppose Because OSM ids are not stable and can change unexpectedly and without notice, they should not really be stored. This is also why it was suggested that the OSM relation property should be deleted. What would the use case for having a connection from Wikidata to OSM be, in your case? I tried to address the discoverability issue by creating a userscript that displays a link to OSM on Wikidata... --Abbe98 (talk) 14:25, 17 January 2017 (UTC)
    • I don't have any sort of use case for this, but some people in the deletion discussion think this is a good idea and I thought I might as well make this. Jc86035 (talk) 14:50, 19 January 2017 (UTC)
  • Support The idea is good, but it needs some discussion and improvements (the import should respect the constraints of OpenStreetMap Relation identifier (P402), for example). @Abbe98: OSM ids are pretty stable, much as stable as Wikipedia article names, and when they change it is not really « unexpectedly and without notice » (I vaguely remember a tool, like an API, for querying the changesets). For the use case, I can see hundreds of thousands of cases which benefit from having all information in one single place; the first one being no need to use two different tools (using just Wikidata Query is - by definition - better than using Wikidata Query and Overpass Turbo). Plus, as there is some discussion against adding Wikidata IDs on OSM (more exactly, IIRC, reverting mass additions until a consensus is reached), and as the rules may be different on the two projects, it's more secure to store the data on our side too (with our rules, as we do for other databases). PS: Your tool seems interesting but I can't make it work (or more probably I did something wrong, or I'm not looking for what should be expected); what is it supposed to do? (I'd like to test it on cases like [1] linking to Wilhelm Trute (Q15987301), or [2] and [3] both linking to Assen (Q798); examples taken from OpenStreetMap Relation identifier (P402) constraint violations.) PPS: @Jc86035: for the first point, you don't really need a bot; you could do it yourself with an Overpass request and QuickStatements (and some wit to check for inconsistencies). Cdlt, VIGNERON (talk) 15:12, 17 January 2017 (UTC)
    @VIGNERON: I do disagree with you on the stability of OSM ids; some are stable (per the examples given during the deletion discussion) but most are not, and just because you can query the changes, it's not a streamlined process. Most of the "hundreds of thousands" of use cases must have been forgotten during the deletion discussion. I can see many use cases too, but none where they are better than the alternatives. On the subject of my tool: it should add a link to the OSM element in the sidebar to the left (under tools). Please drop me a note on GitHub or on my talk page if the issue persists. --Abbe98 (talk) 15:29, 17 January 2017 (UTC)
    @Abbe98: You can of course disagree, but do you have any figures to support your view? Out of 96 French départements (Overpass request), 88 have had the same number since their creation, so ~91 % are stable. That is pretty stable to me (especially since départements are highly edited objects, so relations on more common objects are less likely to be unstable); in the same order as wiki sitelinks are stable (but more than Wikidata items, though they're not 100 % stable either).
    Oh, my mistake; indeed, I wasn't looking at the right place. I will look into it (right now, I can spot one thing: when several OSM objects link to Wikidata, your tool only shows one; see the Assen (Q798) example I gave earlier). It's a great tool for readers and visualisation, but it doesn't help editors, re-users, external manipulation or querying. Why not turn this tool into one that « compares WD and OSM (via OpenStreetMap Relation identifier (P402)), gives a warning if there are inconsistencies/problems, and suggests adding/correcting the relation »?
    Cdlt, VIGNERON (talk) 16:04, 17 January 2017 (UTC)
    @VIGNERON: Your Overpass query did not return any data for me (should it display anything when output as CSV?). I'm not sure départements are a good example, as they are not merged and deleted as often as most OSM elements (AFAIK). No, I have not put any effort into obtaining figures, as it's easy to just analyze a set of non-diverse elements. Please see the deletion discussion here at Wikidata and T145284.
    Yes, it's a known limitation that it only links to one element (I'm not sure how to solve it UI-wise). I wonder how Kartographer deals with multiple Wikidata tags. The reason for not creating such a tool is that I believe OpenStreetMap Relation identifier (P402) should be deleted, and even if OSM identifiers were stable, OpenStreetMap Relation identifier (P402) would be very limited (it's only for relations). --Abbe98 (talk) 19:03, 17 January 2017 (UTC)
    @Abbe98: Strange; the Overpass query should give a CSV with French département relation IDs. It was just an example; if you have a better one, or even better something more general, feel free to share it. I've seen the Phabricator tickets and the two deletion requests (and even participated), and I'm still waiting to be convinced of a valid reason for deletion, and to see numbers about the instability (either OSM instability or Wikidata instability). Cdlt, VIGNERON (talk) 19:55, 17 January 2017 (UTC)
    @Abbe98: Maybe you might need to press the magnifying glass button in the map sidebar? If there isn't any data you should see a grey bar at the top of the map something like "blank dataset received". Jc86035 (talk) 14:44, 19 January 2017 (UTC)
    I had my Overpass Turbo set to another Overpass instance which returned broken data. --Abbe98 (talk) 14:12, 21 January 2017 (UTC)
    Ah, okay then. Jc86035 (talk) 15:54, 21 January 2017 (UTC)
  • Oppose It would need properties for nodes and ways (and, in future, areas?) to describe the range of geometries one Wikidata object could be represented with. --Sabas88 (talk) 20:48, 17 January 2017 (UTC)
  • Support. I don't believe there is any serious instability problem with OSM that does not exist with almost any external database. In any case, a bot dedicating time to maintaining the property is not going to hurt anyone, so there is certainly no grounded reason to oppose. Thierry Caro (talk) 10:11, 19 January 2017 (UTC)
    • @Thierry Caro: I guess one downside could be that since OSM data can (I think) be directly pulled from WMF projects through GeoJSON, and Wikimedia editors can use their accounts to add data to OSM, there's not really much point in bothering to maintain two separate and redundant databases with a bot. Jc86035 (talk) 14:44, 19 January 2017 (UTC)
  • Support It's more user-friendly when we have the property. I don't think a little bit of data instability will lead to problems that a bot can't remedy. ChristianKl (talk) 13:14, 29 May 2017 (UTC)
  • Question Is it OK to pull an ODbL database into a CC0 one? "automatically pulls Wikidata item links from OSM objects' wikidata tags" sounds like a violation of the ODbL license of OSM - see (but copyright is tricky and I failed to find Wikidata equivalent of ) Mateusz Konieczny (talk) 21:33, 1 October 2017 (UTC)
  • "In addition, would it be possible to use the same bot to automate this on OSM in the other direction as well?" - it would require a completely separate discussion on OSM mailing lists/or forum (depending on affected regions) Mateusz Konieczny (talk) 21:37, 1 October 2017 (UTC)
    • There are already automated and semi-automated tools doing this, in line with OSM's rather stringent rules on automated edits. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:39, 1 October 2017 (UTC)

items for segments[edit]

For many instances of anthology film (Q336144), it can be worth creating an item for each segment (sample: Q16672466#P527). Such items can include details of director/cast/etc. as applicable (sample: Q26156116).

The list of anthology films includes already existing items.

This task is similar to #duos_without_parts above.
--- Jura 10:14, 22 April 2017 (UTC)

What source could the bot use? Matěj Suchánek (talk) 07:25, 25 August 2017 (UTC)
Good question. I checked the first 12 films on the list above, and about half had a WP article detailing the episodes.
For automated import, the structure of these articles might not be sufficiently standard:
  • section header per episode (ru), (pl)
  • table with names of episodes (de),
  • section with more details for each episode (es), (ru), (ca)
Maybe a user would need to gather the list for each film and a tool would create segment items. I guess I could try that on a spreadsheet.
--- Jura 08:49, 25 August 2017 (UTC)

Cleanup sv/cebwiki imports ?[edit]

I just noticed that Q24695115 duplicates Q953452 and that I had imported the area 0,36 km² from svwiki for Q24695115, but Q953452 already had 36 acres (15 ha) (as enwiki). I wonder if there are more such issues with area imported from svwiki by others. If yes, some cleanup is needed. --
--- Jura 05:27, 26 May 2017 (UTC)

Yes, there are thousands, maybe hundreds of thousands of such duplicates Mateusz Konieczny (talk) 14:47, 27 December 2017 (UTC)

Merge of language of work or name (P407) with original language of work (P364)[edit]

Request date: 5 June 2017, by: Snipre (talkcontribslogs)

Link to discussions justifying the request

Please see Wikidata:Properties_for_deletion#language_of_work_or_name_.28P407.29_and_original_language_of_work_.28P364.29 and especially the last subsections Wikidata:Properties_for_deletion#Migration.

Task description

Following the discussion, the merge of property P407 with P364 was finally decided, in order to have a uniform use in WD. We are looking for a bot operator who can do this merge according to the description. The merge should be announced first, so we are currently looking for a bot operator who will be available to perform that task. The date of the merge has to be discussed.

Licence of data to import (if relevant)



@Matěj Suchánek, Pasleim, ValterVB:

  • Symbol oppose vote.svg Oppose for movies until a solution is found. Please proceed for books and other works; it's messy anyway.
    --- Jura 12:12, 5 June 2017 (UTC)
    @Jura1: It's time to wake up. Sorry, but the topic has been discussed on the Wikidata:Properties for deletion page for 3 weeks, and when the question about applying the merge was raised, nobody was there to mention a problem. I suggest you have a look at the discussion there and provide your feedback. Meanwhile we can already find a bot operator and work out the merge procedure with him; then we will inform the data users, so we still have 2-3 weeks before the merge. Snipre (talk) 13:57, 5 June 2017 (UTC)
    No solution was proposed. I understand that you are still interested in the theoretical appeal, but is there actually a practical problem you want to solve?
    --- Jura 14:07, 5 June 2017 (UTC)
    @Jura1: Please just look at the discussion on Wikidata:Properties_for_deletion#language_of_work_or_name_.28P407.29_and_original_language_of_work_.28P364.29, read the subsection Wikidata:Properties_for_deletion#Movies where the topic of the movies was discussed, and try to add a useful comment there: you were the first to comment on the announcement of the merge (see here), but you never deigned to come and discuss the open points, although a call for discussion was made in the announcement. So the question is whether you really want to discuss? Snipre (talk) 15:38, 5 June 2017 (UTC)
  • I can do it within the next few days. --Pasleim (talk) 13:41, 7 July 2017 (UTC)
Pictogram voting comment.svg Comment migration is ongoing by using QuickStatements, so we are losing all references and qualifiers. --Pasleim (talk) 12:31, 9 October 2017 (UTC)
Request process

Replace DOI citations with items[edit]

Do we have a bot doing this, and if not, should we?

  1. Find a citation that is DOI (P356) based
  2. Look up that DOI to see if it is used by an item in Wikidata
  3. if so, replace the citation using stated in (P248)
  4. [optional] if not, fetch metadata from the webpage that the DOI resolves to, create an item, then replace the citation

Obviously, any replacements should preserve other qualifiers, like quotations, page number, etc. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:06, 11 July 2017 (UTC)

Note: For sparqling the value of DOI (P356) please use the ASCII uppercase version of the DOI to avoid the creation of duplicates. -Succu (talk) 20:10, 13 July 2017 (UTC)
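Succu's note about querying with the ASCII-uppercase form of the DOI could be sketched as a small normalisation helper (an illustration only; the function name is invented):

```python
def ascii_uppercase_doi(doi: str) -> str:
    """Return the DOI with ASCII letters a-z uppercased and every other
    character (digits, punctuation, any non-ASCII) left untouched.
    DOI names are case-insensitive, so normalising before a SPARQL
    lookup of DOI (P356) avoids missing an existing item and creating
    a duplicate."""
    return "".join(
        chr(ord(c) - 32) if "a" <= c <= "z" else c
        for c in doi.strip()
    )
```

A bot implementing step 2 above would compare this normalised form against the stored P356 values.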
The proposal looks quite uncontroversial to me, at least without the optional part. However, I have recently understood that we have to be careful about citations based on stated in (P248). The issue with them is that they make it slightly harder to access the source: there is one more hop from the statement to the source. You need to click first on the target of stated in (P248), then scroll down to the identifiers section, and pick the one that (you think) will lead you to the actual source. If you don't understand how Wikidata works, this is non-trivial! This was pointed out in this discussion on enWP, although the issue there is a bit different: the DOI points to a dump of the database that constitutes the source, and the user actually needed a GRID ID (P2427) to read the source record.
So, I think your proposal could be amended slightly: instead of replacing the citation, you could just add stated in (P248) in it (if it is not there already). This would also make sure that all citation metadata is preserved (for instance, page numbers which might not be on the bibliographic item). − Pintoch (talk) 09:58, 2 October 2017 (UTC)
I suppose the DOI (P356) does not necessarily needs to be removed? The stated in (P248) can just be added. — Finn Årup Nielsen (fnielsen) (talk) 11:54, 2 October 2017 (UTC)
If kept, the DOI would be a qualifier of "stated in" (otherwise, we'd have two apparent references for one source), but that seems redundant, when the DOI is on the target item. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:19, 2 October 2017 (UTC)
I too have some concerns about removing a DOI with an intermediate bibliographic layer–I believe this kind of intermediation is something Crossref folks like Joe Wass were worried about–while I obviously do support the idea of augmenting the citation and cross-referencing a bibliographic item, when available. It seems to me the citation model for Wikidata statements needs some additional love/discussion. We obviously don't have the luxury of cramming all relevant identifiers when adding a source as a reference so some discussion on best practices seems in order--DarTar (talk) 21:42, 2 October 2017 (UTC)
For citing an online database in general, I prefer both “stated in” and “ref URL”, along with the database ID and/or entry name as qualifiers. So here I’d prefer adding the “stated in” and retaining the DOI. - PKM (talk) 23:42, 2 October 2017 (UTC)
We should perhaps continue this discussion on a different page. If there are alternative models being proposed, I'm sure we'd all find it useful to see some examples of what people consider best practice. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:13, 3 October 2017 (UTC)

Adding items to Wikivoyage listings[edit]

Please see voy:Wikivoyage:Travellers'_pub#adding_.22wikidata.3D.22_to_listings.

  • The task would consist of adding the item to uses of voy:Template:Listing, based on "wikipedia=" in that template. "wikipedia=" is the title of the enwiki article.
  • The bot should probably check that the item isn't a disambiguation item before adding it in "wikidata=".
  • Sample edit [4].
  • A page at Wikivoyage can include hundreds of uses of the template. Many entries don't include a Wikipedia article.
  • There are a few templates that are based on Listing, e.g. voy:Template:See, etc. The same should be done for these.

An operator would obviously need to seek approval at Wikivoyage before proceeding.
--- Jura 08:47, 23 July 2017 (UTC)
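The lookup described above, including the disambiguation check, might look like this in outline (the sitelink table, the "Springfield" title, and Q123456 are made up for illustration; a real bot would query the Wikidata API):

```python
# Hypothetical enwiki-title -> (item, is_disambiguation) index;
# in practice this would come from wbgetentities or a SPARQL query.
SITELINKS = {
    "Eiffel Tower": ("Q243", False),
    "Springfield": ("Q123456", True),  # pretend disambiguation item
}

def qid_for_listing(wikipedia_title):
    """Return the value to write into |wikidata= for a listing with the
    given |wikipedia= title, or None if there is no matching item or the
    item is a disambiguation item (which should not be linked)."""
    entry = SITELINKS.get(wikipedia_title)
    if entry is None:
        return None
    qid, is_disambig = entry
    return None if is_disambig else qid
```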

Import birth and death dates from RKDartists ID (P650)[edit]

Request date: 27 July 2017, by: Jarekt (talkcontribslogs)

Link to discussions justifying the request
Task description

Netherlands Institute for Art History (Q758610) database seems to have day-precision birth and death dates for many items which only have year-precision dates here. Those should be imported.

The logic could be as follows. For people items with RKDartists ID (P650) (RKD):

  • look up birth and death dates and compare to the current date
  • if the current date does not have a reference to a source outside Wikimedia projects, then delete it
  • if the current date matches the RKD date but there is no reference to RKD, then add a reference
  • if there is no current date, then import the RKD date and add a reference
  • if the current date is a year and the RKD date is more precise, then import the RKD date and add a reference, then set the RKD date to "preferred" rank
  • otherwise just import the RKD date and add a reference
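The proposed logic could be sketched roughly as follows (a rendering of the bullet points, not Multichill's actual bot; the field names and action strings are invented, and the match check is pulled first so we never delete a date we are about to reference):

```python
YEAR, DAY = 9, 11  # Wikibase time precisions

def plan_rkd_import(current, rkd_value):
    """Return a list of actions for one item's date statement.
    `current` is None or a dict like
    {"value": "1871", "precision": 9, "externally_referenced": False};
    `rkd_value` is the RKD date string in the same formatting."""
    if current is None:
        return ["import RKD date with reference"]
    if current["value"] == rkd_value:
        return ["add RKD reference to existing date"]
    actions = []
    if not current["externally_referenced"]:
        actions.append("delete unreferenced current date")
    if current["precision"] == YEAR:
        actions.append("import RKD date with reference, rank preferred")
    else:
        actions.append("import RKD date with reference")
    return actions
```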
I'm not in favour of deleting dates of birth. Also, it's RKD, not KDR.
fixed --Jarekt (talk) 19:35, 27 July 2017 (UTC)
I'm already importing this data (date of birth/death, gender, occupation, etc.) from the RKD. Haven't really implemented the updating/improving of existing statements yet. Multichill (talk) 17:59, 27 July 2017 (UTC)
Multichill, I am mostly interested in improving existing statements in items with Commons Creator page (P1472). What is happening is that as I compare and merge creator template data with Wikidata items, I found ~600 items where Commons had a day-precision date while Wikidata had a year-precision one. I was trying to reconcile such cases by hand, but in 90% of cases that meant adding the day-precision date to Wikidata, verifying it in RKD, adding a reference, and either deleting the old year date (if it had no references) or bumping the rank of the new date to "preferred". All those steps take a lot of time when done manually, and the whole process could be done more easily if a bot would improve existing statements. An example: the dob in c:Creator:Camillo Innocenti is "1871-06-14" without a reference, the dob in Q3651523 is "1871" with a solid reference, and the dob in RKD is "1871-06-14". If we import the RKD date and set its rank to "preferred", it will match Commons. --Jarekt (talk) 19:35, 27 July 2017 (UTC)
Did you already turn this into a query? It's items that have Commons Creator page (P1472), RKDartists ID (P650) and date of birth (P569) with precision year and no (decent) source and no other decent statement. Something like that? Multichill (talk) 20:16, 27 July 2017 (UTC)
  SELECT ?item ?birth WHERE {
    ?item wdt:P1472 [] .
    ?item wdt:P650 ?rdkid .
    ?item p:P569 ?dobstatement .
    ?dobstatement psv:P569 [
                  wikibase:timePrecision "9"^^xsd:integer ;
                  wikibase:timeValue ?birth ;
                ] .
    MINUS { ?dobstatement prov:wasDerivedFrom ?provenance .
            MINUS { ?provenance pr:P143 [] } }
  }
Try it!
Nevermind, this is what I made. Just clicking around I find many cases like you describe. Multichill (talk) 20:35, 27 July 2017 (UTC)
It is not just dates of birth but also dates of death. I did not create a query but c:Category:Creator templates with Wikidata link: quick statements is mostly filled with creator templates that fit that pattern. --Jarekt (talk) 02:04, 28 July 2017 (UTC)
Ok, I modified the RKDartists bot to do two extra things:
  • Source date of birth/date of death (replacing imported from if it's the only reference)
  • Replace year of birth/death with a more precise date.
That should cover most of it. Multichill (talk) 20:43, 22 August 2017 (UTC)

Multichill, I was just looking at Taco Mesdag (Q2452500): your bot added a reference to the existing day-precision birth date, which is great. However, the death date had year precision on Wikidata and day precision in RKD, and it was not corrected (I corrected it now). Is your bot still running, or did it miss it somehow? Same thing with Sano di Pietro (Q1379714), where both dates need to be imported from RKD and saved with "preferred" rank. --Jarekt (talk) 16:24, 6 September 2017 (UTC)

@Jarekt: The bot doesn't touch statements which are sourced. See the sparql query, only imported from (P143) is replaced. Multichill (talk) 16:50, 6 September 2017 (UTC)
Multichill, that makes sense. Can I propose a slightly different functionality: if there is a sourced date, we leave it alone but add a new statement with the RKD date (if different). Also, if we have multiple non-conflicting dates of different precision, we add "preferred" rank to the highest-precision date. That way we can add dates to items like Taco Mesdag (Q2452500) or Sano di Pietro (Q1379714) (and I and others do not have to do it by hand). Thanks again for adding all those references - that cleared a lot of items that needed to be fixed. --Jarekt (talk) 17:36, 6 September 2017 (UTC)
That would be possible I guess. Not sure when I might work on this. These are probably the interesting items:
  SELECT ?item ?birth WHERE {
    ?item wdt:P1472 [] .
    ?item wdt:P650 ?rdkid .
    ?item p:P569 ?dobstatement .
    ?dobstatement psv:P569 [
                  wikibase:timePrecision "9"^^xsd:integer ;
                  wikibase:timeValue ?birth ;
                ] .
    ?dobstatement prov:wasDerivedFrom ?provenance .
    ?provenance pr:P248 wd:Q36578 .
    MINUS { ?item p:P569 ?dobstatement2 . ?dobstatement2 prov:wasDerivedFrom ?provenance2 . ?provenance2 pr:P248 wd:Q17299517 }
  }
Try it!
Multichill (talk) 18:31, 6 September 2017 (UTC)
That would definitely be a huge help! (And not only the ~1k items with only year precision and Integrated Authority File (Q36578) as source, also the ~2.5 with other sources would profit!) --Marsupium (talk) 22:40, 10 September 2017 (UTC)
PS: And in my eyes RKDartists (Q17299517) sourced day precision dates could by default be set to preferred rank indeed. For all the dates I've checked so far there were almost no cases where RKDartists (Q17299517) (with a day precision date) wasn't the best source I could find online (Allgemeines Künstlerlexikon (Q15791802) included).
Request process

Import lighthouses from enwiki[edit]

Per discussion at Wikidata_talk:WikiProject_Lighthouses#enwiki_bot_import.3F, please import the remaining lighthouses (from w:Category:Pages using infobox Lighthouse needing Wikidata item)

by creating new items and adding the new qid to the enwiki template. There is a mapping of properties at Wikidata:WikiProject_Lighthouses/tools#Mapping_of_infobox_properties_for_lighthouses. Some fields may not be suitable for bot import and could be skipped.

All these lighthouses are in more general articles about the region/island/place. Articles may include another infobox about that.
--- Jura 06:17, 30 July 2017 (UTC)

Import ISNI from VIAF page[edit]

Request date: 8 August 2017, by: Sporti (talkcontribslogs)

Hi. A lot of items have VIAF ID (P214), but no ISNI (P213). VIAF page often has a link to ISNI ([5] for Tomo Virk (Q7820078)) so it could be easily imported. --Sporti (talk) 10:07, 8 August 2017 (UTC)

  • Making it part of the primary sources tool is silly. We actively collaborate with the OCLC. We know their ISNI database is better than 96% accurate. VIAF has always been the more important of the two ISNI-maintained sources. The Primary Sources tool will not make it better, it will only slow things down. When we decide that VIAF is leading, we can compare their dumps and update ISNI from them as is suggested. Thanks, GerardM (talk) 05:37, 2 October 2017 (UTC)
I agree, this should be imported directly. Human attention is precious and should be saved for other imports that actually need reviewing. − Pintoch (talk) 11:16, 2 October 2017 (UTC)
Request process

Resolving Wikinews categories from "wikimedia category"[edit]

Request date: 12 August 2017, by: Billinghurst (talkcontribslogs)

Link to discussions justifying the request
Task description

A number of Wikinews items which are categories have been labelled with instance of (P31) -> Wikimedia category (Q4167836), which I am told is incorrect. It would be worthwhile running a query for where this exists, removing these statements (and the corresponding labels) where they exist in interwikilink isolation on an item, and generating a list where they exist among a bundle of sister interwikilinks. This will enable an easier merge of those independent items into their corresponding existing items, and a tidy-up of items that are confused.


Lymantria (talkcontribslogs) and I were at cross-purposes as items with that statement were being merged to items. We had a confluence of rule issues, merging of wikimedia category items to items, versus the merging of Wikinews categories to items. Now better understood.

Request process

Redirects after archival[edit]

Request date: 11 September 2017, by: Jsamwrites (talkcontribslogs)

Link to discussions justifying the request
  • Following a comment on the request made on Wikidata:Contact_the_development_team, I am posting it here. As described in the previous link, currently the links on Project Chat, Request For Query and even Contact the development team do not get redirected to the corresponding archive page once archived. For example, Wikidata:Contact_the_development_team#Redirects_after_archive may not work after a while. It was however suggested that bots could be created to resolve this issue.
Task description

Retain the links to the original discussion section on the discussion pages, even after archival by allowing redirection.

Licence of data to import (if relevant)

Request process

Missing "preferred" on 10,000+ population numbers[edit]

Request date: 9 October 2017, by: Yurik (talkcontribslogs)

Link to discussions justifying the request
Task description

Currently, there are over 10,000 entries with multiple data points for population data. Apparently the community has settled on using the "preferred" rank to identify the most recent value, but it seems this is not being maintained. Please help. Thanks! --Yurik (talk) 23:42, 9 October 2017 (UTC)

User:PreferentialBot should already do this. Matěj Suchánek (talk) 12:04, 10 October 2017 (UTC)
Request process
seems to be done by PreferentialBot --Pasleim (talk) 12:36, 18 November 2017 (UTC)

{{Section resolved|--Pasleim (talk) 12:36, 18 November 2017 (UTC)}}

I don't think it does that.
To some extent, PreferentialBot creates (manual) work:
  • when one adds a second value, it comes along and sets it to preferred.
  • When one adds another more recent value, nothing happens and one has to unset preferred oneself.
    --- Jura 15:20, 18 November 2017 (UTC)
@Jura1:,@Yurik: Right now the bot only sets a preference if there is no other preference. I could easily switch it to always set preference for the latest date and move it from another date, if there's consensus that this is what is wanted. Also, the bot skips the entries where not all statements have dates (since it has no idea which one is the latest). But if anything else does not work, please ping me (preferably with examples of items that failed). Laboramus (talk) 07:14, 29 December 2017 (UTC)
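The "always prefer the latest dated value" variant Laboramus mentions could look roughly like this (an illustrative sketch only; statements are plain dicts rather than the real data model, and dates are compared as ISO strings):

```python
def pick_preferred(statements):
    """Set exactly the latest-dated statement to preferred rank and all
    others to normal. If any statement lacks a date, return the input
    unchanged (the bot skips such items, since it cannot tell which
    value is the latest). Dates are ISO "YYYY-MM-DD" strings, which
    sort correctly as plain strings."""
    if any(s["date"] is None for s in statements):
        return statements
    latest = max(statements, key=lambda s: s["date"])
    for s in statements:
        s["rank"] = "preferred" if s is latest else "normal"
    return statements
```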

Problem of a position of a link in heritage designation (P1435)[edit]

Request date: 5 November 2017, by: Fralambert (talkcontribslogs)

Link to discussions justifying the request
Task description

Move the URL currently stored as a qualifier of heritage designation (P1435) (described at URL (P973) --> "url") to a reference on heritage designation (P1435) (described at URL (P973) or reference URL (P854) --> "url").

Thanks. --Fralambert (talk) 16:41, 5 November 2017 (UTC)

Licence of data to import (if relevant)
@Fralambert: I took a random example: no label (Q3424616). The statement already has another reference URL (P854) as a reference. So how should this item be handled? Matěj Suchánek (talk) 08:54, 11 November 2017 (UTC)
@Matěj Suchánek: One is the URL taken from the monument database, and the link put in described at URL (P973) is the bylaw of the designation. I looked in the constraints of P973, and you can't put this property as a qualifier or a reference. So I imagine the best will be as another source with reference URL (P854)? --Fralambert (talk) 20:11, 11 November 2017 (UTC)
Request process

Ping creators and editors of an item when the item gets nominated for deletion[edit]

Request date: 12 November 2017, by: ChristianKl (talkcontribslogs)

I just went through a few items on Pasleim's list of items with questionable notability. In many cases it seems to me like it would be easy for the creator of the item to improve it to fit our standards. From a workflow perspective it would be good if I could nominate the item for deletion and write into the box what the item needs. Currently, when I do this, the people who are responsible for the item don't get pinged, so they don't know that they should work on the item unless I manually ping them.

I think it would be great if we could have a bot that automatically pings everybody who created or edited an item that's nominated for deletion.  – The preceding unsigned comment was added by ChristianKl (talk • contribs) at 02:45, 12 November 2017‎ (UTC).

  • Yeah, everyone who just edits one item you mean?
    --- Jura 05:56, 12 November 2017 (UTC)
  • Yes. If Alice created "super item" and Bob added the German translation for it, then I would like both Alice and Bob to get pinged. ChristianKl () 15:25, 12 November 2017 (UTC)
  • Yes, we really need this bot. Otherwise it creates the impression that a hidden war is being waged against the creators and editors of the item, with work being done quietly, which only reduces respect for the deletion nominator. --Fractaler (talk) 15:34, 12 November 2017 (UTC)
How do you plan to handle QuickStatements running in the background? --ValterVB (talk) 17:04, 12 November 2017 (UTC)
Sorry, I have not worked with "QuickStatements", so I do not know the answer. Fractaler (talk) 11:24, 22 November 2017 (UTC)
Symbol support vote.svg Support The creators and editors of an item must be reported/pinged after an item is marked for removal. John Samuel 18:51, 12 November 2017 (UTC)
Pictogram voting comment.svg Comment We have DeltaBot, which checks WD:RfD every ten minutes and does necessary work there. We could easily modify it to send a message to the items' creator. But there are tens of new requests to be handled every day, which means tens of messages to creators. If one user created many items, they would get many messages (same with "pings"). Of course, we could limit it to one message per day, for example, but then they wouldn't have complete list of items in question... So we need to find some boundaries for this first. Matěj Suchánek (talk) 19:22, 12 November 2017 (UTC)
Since DeltaBot adds the name of the administrator who deleted an item, it was put on the echo blacklist to prevent too many pings for administrators. Thus, it can't do this job. --Pasleim (talk) 19:28, 12 November 2017 (UTC)
I don't have a problem with creating a lot of pings for a user who creates items that then get nominated for deletion. It's good when the user gets constantly reminded by pings that the items aren't up to our standards.
We could however go for a maximum of one ping per person per section '== ==' ChristianKl () 15:10, 13 November 2017 (UTC)
Correct. The earlier a beginner learns about the current Wikidata data format and the current rules, the sooner they will understand what they need to do, and the better for everyone. Fractaler (talk) 11:24, 22 November 2017 (UTC)
You want to nominate items for deletion, that meet our notability criteria? Why would you do that? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:02, 10 December 2017 (UTC)
I don't see where you get that idea that I want to delete items that clearly meet our notability criteria. ChristianKl () 00:48, 13 December 2017 (UTC)
From "it would be easy for the creator of the item to improve it to fit our standards... it would be good if I could nominate the item for deletion ". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:41, 17 December 2017 (UTC)
I think also that notifications aren't mandatory, so it is not obvious that a user has notifications active --ValterVB (talk) 18:55, 10 December 2017 (UTC)
Notifications are on by default. While not every user has them enabled, they are useful to give people a chance to defend their items when there's a deletion discussion. ChristianKl () 00:48, 13 December 2017 (UTC)
This is very honest and noble. --Fractaler (talk) 11:49, 14 December 2017 (UTC)
Request process

Dates: August–December 2017[edit]

Merge all date-links of August–December 2017 between at least 1 August–10 December 2017 in the languages Czech, English, French, Portuguese and Spanish. Like this Q42123488 (already fixed). Items usually exist in Spanish, so there's no need to create new items. I've already fixed some. This is just ordinary language-linking, and needs no support from controversial discussions.

J 1982 (talk) 10:47, 12 November 2017 (UTC)

@J 1982: most items can't be merged because frwikinews has two sitelinks for each date, e.g. 10 September 2017 (Q37787943) and no label (Q37802406) --Pasleim (talk) 10:05, 17 December 2017 (UTC)
  • I made a list for 2018. Maybe a bot could attempt to connect periodically (daily, hourly?) new pages to (some of) the items. --- Jura 21:21, 17 December 2017 (UTC)
    • It seems that 20% of the pages already existed (notably pt/fr, some ja/zh). I linked these. All items now have sitelinks. Periodically checking and adding new pages could save us the countless duplicates that day items generate --- Jura 23:52, 19 December 2017 (UTC)
    • Some wikis appear to create them day-by-day, others for a quarter or an entire year. Some languages have a lot of content on every page, others have empty ones for every day of a month.
      --- Jura 13:13, 21 December 2017 (UTC)
  • I merged a few more items for 2017 and added facet of (P1269) to the item for the second French link. This increased the number of sitelinks on the main items for that year from 1600 to 2900. --- Jura 14:19, 19 December 2017 (UTC)
    • I added more sitelinks for 2017. We now have about 4500. While these items probably provide useful sitelinks to Wikinews, a good use for the items within Wikidata still needs to be found.
      --- Jura 13:13, 21 December 2017 (UTC)
✓ Done by Jura1 --Pasleim (talk) 10:05, 9 January 2018 (UTC)
  • As pages keep coming up (e.g. today, maybe more so for 2018 than for Q3/Q4 2017), it still needs to be re-done periodically.
    --- Jura 10:11, 9 January 2018 (UTC)

Copy taxon common name to alias[edit]

We need a bot, please, to copy values from taxon common name (P1843), to aliases in the appropriate languages, like this edit. A periodic update would be good, too. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:00, 15 November 2017 (UTC)

At least two steps should happen before this:
  1. copy to labels where missing
  2. update labels/aliases with different capitalization (upper to lowercase)
At the moment, there are around 300,000 (!) aliases to be added. Matěj Suchánek (talk) 18:14, 14 December 2017 (UTC)
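Step 2 above (reconciling capitalisation variants) could be sketched like this (a hypothetical helper, assuming the taxon common name (P1843) spelling should win over an alias that differs only in capitalisation):

```python
def merge_alias(aliases, common_name):
    """Return the alias list with `common_name` added, replacing any
    existing alias that differs from it only by letter case. Aliases
    that are unrelated to the common name are kept as-is."""
    kept = [a for a in aliases if a.lower() != common_name.lower()]
    if common_name not in kept:
        kept.append(common_name)
    return kept
```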
This is not a good idea at all. Please don't. - Brya (talk) 17:35, 27 December 2017 (UTC)
Yet another vague, negative comment, with no justification offered. Meanwhile, here's a quote from Help:Aliases (emboldening added): "The label on a Wikidata entry is the most common name that the entity would be known by to readers. All of the other common names that an entry might go by, including alternative names; acronyms and abbreviations; and alternative translations and transliterations, should be recorded as aliases". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:13, 27 December 2017 (UTC)
This Help:Aliases was written a very long time ago, at a time when Wikidata was envisioned as just a depository of sitelinks. - Brya (talk) 18:23, 27 December 2017 (UTC)
Indeed it was written a long time ago - and it remains current. But Wikidata was never "envisioned as just a depository of sitelinks". Still no justification offered. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:21, 27 December 2017 (UTC)
Plenty of users still envision Wikidata as just a depository of sitelinks, even today. And clearly Help:Aliases was written from that perspective. - Brya (talk) 05:08, 28 December 2017 (UTC)
"Plenty of users..." While you appear to be making things up, or at the very least making claims without substantiating them, there can be no useful dialogue with you. And please stop double-indenting your comments; as explained to you previously, it breaks the underlying HTML markup. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:44, 28 December 2017 (UTC)
To anybody sane, HTML is a tool to be used towards a purpose. Regarding HTML as a purpose unto itself goes beyond mere idiosyncrasy. - Brya (talk) 17:53, 28 December 2017 (UTC)
Sounds like a good idea to me. I think it's important to link the common names to the proper species name. I am not sure about the specific implementation. For example, I don't have a strong preference to have the taxon name or the common name as label, but the other should be an alias for sure. That makes it a lot easier to recognize the proper entry. --Egon Willighagen (talk) 14:34, 29 December 2017 (UTC)
Just to cite a current problem involving the "interpretation" of a common name: Wikidata:Project_chat#Xantus's_Murrelet. As far as I know, we had a major import from Wikispecies without any references, and some referenced additions via IOC, IUCN or Wörterbuch der Säugetiernamen - Dictionary of Mammal Names (Q27310853) done by my bot. I object to adding any unreferenced common names as aliases. I have no problem with the latter ones, if there is an agreement here to do so. --Succu (talk) 21:33, 29 December 2017 (UTC)
Yes, there are at least two problem sets of common names
  • those moved from Wikispecies (by Andy Mabbett). After this had been done, at Wikispecies they did not want common names from Wikidata imported into Wikispecies because of the dubiousness of this content.
  • those from ARKive; proposed by Andy Mabbett as a property as being a "repository of high-quality, high-value media" but used by some user to "import common names" from. Many of these are not just dubious, but outright bogus. - Brya (talk) 05:39, 30 December 2017 (UTC)
Your indentation fixed, again. The idea that Wikispecies did not want to use content from, er, Wikispecies is, of course, laughable. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 09:39, 30 December 2017 (UTC)
Laughable? I doubt you can show us a discussion where people at Wikispecies are eager to drop their content related to taxon common name (P1843) and reimport it from here. --Succu (talk) 20:15, 30 December 2017 (UTC)

Importing IRFU player numbers into P3848 property[edit]

Request date: 19 November 2017, by: Blackcat (talkcontribslogs)

Link to discussions justifying the request
Task description

Irish Rugby Football Union ID (P3848) has recently been created. I created here a table with the collected data, containing player names and player numbers. I wanted to know whether a bot can import the player IDs into the respective Wikidata items.

Licence of data to import (if relevant)

Hi Blackcat (talkcontribslogs),

I think this task can be achieved using a bot. There are several questions in my mind:

--Tozibb (talk) 09:51, 22 November 2017 (UTC)

Hello Tozibb, the source is here. .. Blackcat (talk) 13:31, 22 November 2017 (UTC)
@Blackcat: Do you have a mapping of the player names and Q-IDs on Wikidata? --Pasleim (talk) 08:27, 17 December 2017 (UTC)
Hello @Pasleim:, I am not sure I have understood your question; could you please be so kind as to show me an example? -- Blackcat (talk) 13:13, 17 December 2017 (UTC)
On the page you provided we get, for each player name, his irishrugby ID. The problem is that there could be multiple persons with the same name. So to add these irishrugby IDs to Wikidata, we need to know in addition which ID they have here on Wikidata. --Pasleim (talk) 15:00, 18 December 2017 (UTC)
It sounds like this might be better done in Mix'n'match. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 09:33, 30 December 2017 (UTC)
Request process

sync changes on commons with WD[edit]

Request date: 26 November 2017, by: Herzi Pinki (talkcontribslogs)

Link to discussions justifying the request
Task description

Is there a bot that takes care of consistency regarding changed or deleted Commons objects like categories / galleries / etc.? The reason I ask is that Commons:Category:Sankt Georgen an der Gusen was moved to Commons:Category:St. Georgen an der Gusen and the original category was deleted thereafter. But in Sankt Georgen an der Gusen (Q127604), Commons category (P373) is still set to the old Commons category, now dangling. Is there already a bot doing this work for Commons? If so, this might be a case that slipped through: how can we deal with these cases? Please leave the error condition on Sankt Georgen an der Gusen (Q127604) as it is for the moment. Calling @Krd: who does something similar for de:WP.

In the previous request, a bot operator stated his bot was doing this every day. Matěj Suchánek (talk) 08:26, 27 November 2017 (UTC)
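For reference, detecting a dangling Commons category (P373) value boils down to one API lookup per category. A hedged sketch (the helper names are mine; the "missing" key is how the MediaWiki API marks pages that do not exist):

```python
import urllib.parse

def category_info_url(p373_value):
    # Ask the Commons API about the page "Category:<P373 value>".
    params = {
        "action": "query",
        "titles": "Category:" + p373_value,
        "format": "json",
    }
    return "https://commons.wikimedia.org/w/api.php?" + urllib.parse.urlencode(params)

def is_dangling(api_response):
    # Deleted or never-existing pages carry a "missing" key in the result.
    pages = api_response["query"]["pages"]
    return any("missing" in page for page in pages.values())
```

A bot would then either follow the category's move log to the new name or flag the item for manual repair.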
Request process

Fetch coordinates from OSM[edit]

We have many items with an OpenStreetMap Relation identifier (P402), and no coordinates:

SELECT ?item ?itemLabel WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
  ?item wdt:P402 ?OpenStreetMap_Relation_identifier.
  MINUS { ?item wdt:P625 ?coordinate_location. }
}
LIMIT 1000

Try it!

Can someone please fetch the coordinates from OSM? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:59, 27 November 2017 (UTC)

an OSM relation id sometimes denotes a rather big area (a municipality, a nature reserve, etc.). What coordinates to take? --Herzi Pinki (talk) 19:19, 27 November 2017 (UTC)
The centroid. If that's not possible, the centre of the bounding box. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:39, 27 November 2017 (UTC)
@Pigsonthewing: It's a bit late to comment and I'm sure you're aware of this, but this won't work for public transport and route relations because they're either lines or made up of other relations, and won't work for objects with multiple locations like Toys "R" Us (Q696334). It's probably not a good idea to add coordinates to those with an automated process unless the resulting coordinates accurately represent a point that is actually on the line, or represent all points of a multiple-node relation (and neither of these may be desirable). Jc86035 (talk) 13:06, 27 December 2017 (UTC)
Sets of objects with multiple locations should not be in an OSM relation. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:38, 27 December 2017 (UTC)
What about ODBL license of OSM data? Mateusz Konieczny (talk) 14:45, 27 December 2017 (UTC)
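The bounding-box fallback suggested above is simple to compute once the box is known; a sketch, with a placeholder relation id in the Overpass query (not one taken from this request):

```python
def bbox_centre(minlat, minlon, maxlat, maxlon):
    # Centre of the bounding box: a crude fallback when no proper
    # centroid is available. Relations crossing the antimeridian
    # would need special handling, which this sketch omits.
    return ((minlat + maxlat) / 2, (minlon + maxlon) / 2)

# An Overpass QL query returning bounding-box data for one relation
# (the id 62422 is only a placeholder example):
OVERPASS_BBOX_QUERY = """
[out:json];
relation(62422);
out bb;
"""
```

As noted in the thread, neither the centroid nor the box centre is meaningful for line-shaped route relations, so any bot run would need to filter by relation type first, and the ODbL licensing question would need an answer before import.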

French communes and their liberation date[edit]

Request date: 8 December 2017, by: Matěj Suchánek (talkcontribslogs)

Link to discussions justifying the request
Task description
Import the table from w:fr:Chronologie de la Libération en France. (If need be, I can do it myself; adding it here in case I forget.) Matěj Suchánek (talk) 08:48, 8 December 2017 (UTC)
  • Is this going to include the sources? Also, what will the format be? Country start date? Significant event? --Yair rand (talk) 21:27, 19 December 2017 (UTC)
Request process

Remove commas from places[edit]

Request date: 17 December 2017, by: Lucywood (talkcontribslogs)

Link to discussions justifying the request
Task description

Replace labels of the form "Placename, County", "Placename, State" and "Placename, Country" with just "Placename", moving the full form to an alias. For example, Thruxton, Hampshire (Q2298226) should be labelled just "Thruxton". User:Jheald pointed out places like Q4810933 that have a comma as part of the name, so this time I am only asking for counties, states and countries.

Some place names have actual commas inside them, not as disambiguators, IIRC. Is this going to check whether the part after the comma matches a parent division's name before making the edit? --Yair rand (talk) 21:25, 19 December 2017 (UTC)
As far as I'm aware there aren't any places whose name includes the country or state after the comma. There appears to be a very low risk of error even if there are. Lucywood (talk) 16:03, 20 December 2017 (UTC)
How much effort have you spent researching whether such places exist? Just because you don't know about them doesn't mean they don't exist. There are cases where the day after September 2 is September 14. In general it's bad to assume that exceptions don't exist just because you don't know about them. ChristianKl❫ 09:36, 22 December 2017 (UTC)
Symbol oppose vote.svg Oppose We don't have a general policy indicating that naming should always omit the state. Decisions about how items about places are named should be made on a case-by-case basis. ChristianKl❫ 09:32, 22 December 2017 (UTC)
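The safety check Yair rand asks for, stripping the comma part only when it matches a parent division's label, could look like this minimal sketch (the function names are mine; the parent labels would in practice come from the item's P131 chain):

```python
def split_disambiguator(label):
    # Split "Placename, Qualifier" into (place, qualifier);
    # labels without a comma come back as (label, None).
    head, sep, tail = label.partition(", ")
    return (head, tail) if sep else (label, None)

def safe_to_strip(label, parent_labels):
    # Only strip the comma part when it exactly matches a label of an
    # administrative parent; anything else is left for manual review.
    place, qualifier = split_disambiguator(label)
    return qualifier is not None and qualifier in parent_labels
```

This would leave genuine comma-bearing names untouched, since their suffix would not match any parent division, though it does not resolve the policy objection raised above.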
Request process

Fix spaces in music ID (P3192)[edit]

Request date: 29 December 2017, by: Rotpunkt (talkcontribslogs)

Task description

Sometimes music ID (P3192) uses "+" for spaces (example music ID (P3192) of Pink Floyd (Q2306)), sometimes not (example music ID (P3192) of Mark Hoppus (Q4271)). The request is to replace spaces with "+" (example), to have uniformity among all items. --Rotpunkt (talk) 12:54, 29 December 2017 (UTC)

  • Symbol support vote.svg Support. The "+" seems to be canonical at Last.fm, in the results from their search feature. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 09:30, 30 December 2017 (UTC)
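The normalisation itself is a one-line string replacement; a sketch (the function name is mine):

```python
def normalise_p3192(value):
    # Last.fm IDs on Wikidata sometimes use spaces, sometimes "+";
    # convert to the "+" form the request asks for. Values already
    # using "+" pass through unchanged.
    return value.replace(" ", "+")
```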
Request process

Automatic creation of labels in Arabic dialects for Wikidata entities from existing labels in Modern Standard Arabic for these items[edit]

Request date: 29 December 2017, by: Csisc (talkcontribslogs)

Link to discussions justifying the request
Task description

There is no label in most of the Arabic dialects for entities in Wikidata. However, there are several simple rules that can be easily applied using Pywikibot to add labels in Arabic dialects to entities in Wikidata:

  • The label of a proper entity (Person, Place, Trademark...) in Modern Standard Arabic is the same as the one of such an entity in the following Arabic dialects: South Levantine Arabic (ajp), Gulf Arabic (afb), Hejazi Arabic (acw), Najdi Arabic (ars), Hadhrami Arabic (ayh), Sanaani Arabic (ayn), Ta'izzi-Adeni Arabic (acq), and Mesopotamian Arabic (acm).
  • Labels of places and people (if they do not hold another citizenship) from Palestine, Jordan, Syria, Iraq, Kuwait, Yemen, Oman, Bahrain, Qatar, UAE, Saudi Arabia, Sudan, Djibouti, Comoros, Somalia, and Mauritania are the same in all Arabic dialects as in Modern Standard Arabic.
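The first copying rule can be sketched as a pure function over an item's label dictionary (the function name is mine; this does not attempt the place/citizenship filtering from the second rule, which needs statement data):

```python
# Dialect codes listed in the request as sharing labels with Modern
# Standard Arabic ("ar") for proper entities.
DIALECTS = ["ajp", "afb", "acw", "ars", "ayh", "ayn", "acq", "acm"]

def new_dialect_labels(labels):
    # Given an item's label dict (language code -> label), return the
    # dialect labels to add: the MSA label, copied into every dialect
    # code that does not yet have a label of its own.
    msa = labels.get("ar")
    if not msa:
        return {}
    return {code: msa for code in DIALECTS if code not in labels}
```

Pywikibot's `editLabels` could then apply the returned dict in a single edit per item.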

Request process

OSM tag/ key URLs[edit]

Until such time (if ever) that OpenStreetMap tag or key (P1282) is converted to an 'external ID' datatype, please add URL qualifiers to all of its values, like in this example edit. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:49, 3 January 2018 (UTC)

‎Convert P1343->P248 qualifier to be P1343-> P805 qualifier[edit]

Request date: 13 January 2018, by: Billinghurst (talkcontribslogs)

Link to discussions justifying the request
Task description
Licence of data to import (if relevant)



The current use of P1343 -> P248 is generating a constraint violation, as P248 is identified as to be used only in references. The above linked conversation stated that P805 was the more appropriate property to use. There will be tens of thousands of these at a bare minimum, as it was the recommended means to link at Wikidata:WikiProject Books  — billinghurst sDrewth 10:51, 13 January 2018 (UTC)
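On a simplified claim model, the conversion is a dictionary rename; a sketch (the function name is mine, and a real bot would operate on pywikibot claim objects rather than plain dicts):

```python
def fix_qualifier(qualifiers):
    # Simplified model: qualifiers as a dict of property id -> values.
    # Move the P248 values to P805, leaving other qualifiers untouched
    # and refusing to clobber an existing P805 qualifier.
    if "P248" in qualifiers and "P805" not in qualifiers:
        qualifiers["P805"] = qualifiers.pop("P248")
    return qualifiers
```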

Request process

Some labels in French[edit]

Request date: 17 January 2018, by: Moumou82 (talkcontribslogs)

Link to discussions justifying the request


Task description

I would like to ask for support to add labels in French in the following instances:

  • "Tunisian handball player": add "handballeur tunisien" if P21=Q6581097 or "handballeuse tunisienne" if P21=Q6581072
  • "Tunisian volleyball player": add "volleyeur tunisien" if P21=Q6581097 or "volleyeuse tunisienne" if P21=Q6581072

I also would like to harmonize existing labels in French to match the most frequent naming:

  • "footballeur tunisien": change to "joueur de football tunisien"
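The rules above reduce to a lookup table keyed by English label and P21 value; a sketch (the names are mine): Q6581097 is male and Q6581072 is female, as stated in the request.

```python
# Gendered French labels, keyed by English label then by the value of
# sex or gender (P21).
FRENCH_LABELS = {
    "Tunisian handball player": {
        "Q6581097": "handballeur tunisien",
        "Q6581072": "handballeuse tunisienne",
    },
    "Tunisian volleyball player": {
        "Q6581097": "volleyeur tunisien",
        "Q6581072": "volleyeuse tunisienne",
    },
}

# Harmonisation of existing French labels, from the request above:
RENAMES = {"footballeur tunisien": "joueur de football tunisien"}

def french_label(en_label, p21):
    # Return the French label to add, or None when no rule applies.
    return FRENCH_LABELS.get(en_label, {}).get(p21)
```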
Licence of data to import (if relevant)
Request process

UK station codes[edit]

Please can someone migrate codes for railway stations in the United Kingdom from station code (P296) to the newly-created UK railway station code (P4755)? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:05, 18 January 2018 (UTC)