Wikidata:Bot requests

If you have a bot request, create a new section here and explain exactly what you want. You should discuss your request first and wait for the community's decision. Please refer to previous discussions. If you want to request sitelink moves, see the list of delinkers. For bot flag requests, see Wikidata:Requests for permissions.
On this page, old discussions are archived after 2 days if they are marked with {{Section resolved}}. An overview of all archives can be found at this page's archive index. The current archive is located at 2015/07.

Intel processors[edit]

Hello. Can some bot please read Intel's site http://ark.intel.com and gather information from that database for Wikidata? The bot would need to capture the name of each processor, create a corresponding item, and populate it with the properties sockets supported (P1041), instruction set (P1068), manufacturer (P176) and number of processor cores (P1141).--MisterSanderson (talk) 15:58, 17 May 2014 (UTC)

Why don't you write them to ask that they release that data via a dump/machine readable API under a free license (or rather CC-0)? Even better, they could add it themselves here on Wikidata, to save us some work. --Nemo 17:16, 17 May 2014 (UTC)
I could not find an appropriate e-mail address at http://www.intel.com/content/www/us/en/company-overview/contact-us.html, so there is no way to contact them.--MisterSanderson (talk) 18:45, 17 May 2014 (UTC)
Try any of the first four in "Intel PR Departments" [1] (calling yourself an analyst) and [2], you'll be fine. --Nemo 15:51, 23 May 2014 (UTC)
Ok, I sent them a message.--MisterSanderson (talk) 11:37, 25 May 2014 (UTC)
The contact was closed without response.--MisterSanderson (talk) 16:50, 29 May 2014 (UTC)
So the creation of the items needs to be made by Wikidata robots...--MisterSanderson (talk) 15:32, 4 July 2014 (UTC)

Today I found the link "Export Full Specifications", which generates an XML file with the data. I think this will make it easy to gather the information with bots.--MisterSanderson (talk) 15:06, 2 October 2014 (UTC)
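As a sketch of what a bot could do with such an export: the snippet below parses a hypothetical ARK XML file with Python's standard library. All element names here (Products, Product, ProcessorNumber, CoreCount, SocketsSupported, InstructionSet) are assumptions for illustration; the real export format would need to be inspected first.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of an "Export Full Specifications" file;
# the real field names on ark.intel.com may differ.
SAMPLE = """
<Products>
  <Product>
    <ProcessorNumber>E5300</ProcessorNumber>
    <CoreCount>2</CoreCount>
    <SocketsSupported>LGA775</SocketsSupported>
    <InstructionSet>64-bit</InstructionSet>
  </Product>
</Products>
"""

def parse_ark_export(xml_text):
    """Extract the fields needed for P1141/P1041/P1068 from an ARK export."""
    root = ET.fromstring(xml_text)
    rows = []
    for product in root.iter("Product"):
        rows.append({
            "name": product.findtext("ProcessorNumber"),
            "cores": int(product.findtext("CoreCount")),
            "socket": product.findtext("SocketsSupported"),
            "instruction_set": product.findtext("InstructionSet"),
        })
    return rows
```

The resulting rows could then be fed to an item-creation tool rather than edited by hand.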

Here, I even extracted the data manually myself and created a table: http://hypervolution.wordpress.com/2014/10/01/soquete-lga771/. I think there is now no excuse not to include this information on Wikidata.--MisterSanderson (talk) 18:52, 3 October 2014 (UTC)

The table looks good. However, we can't yet add values with a dimension (e.g. Hz, MB, nm), so the only information we can extract for now is the number of cores (number of processor cores (P1141)). Are there already items on Wikidata about Intel processors, or should a new item be created for every row in the table? --Pasleim (talk) 19:15, 3 October 2014 (UTC)
Not only number of processor cores (P1141); there are other properties too: sockets supported (P1041), instruction set (P1068) and manufacturer (P176). I think there may be a "release date" property too, but I could not find it. And there is subclass of (P279): all Celeron models are a subclass of the Celeron family. Some processors already have an item, but on Wikipedia it is more common to create articles about a family of processors rather than about individual models, so I think each row must become an item.--MisterSanderson (talk) 22:39, 3 October 2014 (UTC)

New table: https://hypervolution.wordpress.com/2014/11/01/soquete-lga775/ .--MisterSanderson (talk) 23:20, 6 December 2014 (UTC)

New table: https://hypervolution.wordpress.com/2015/01/01/soquete-fclga1366/ .--MisterSanderson (talk) 17:02, 1 February 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/02/01/soquete-fclga-1567/ .--MisterSanderson (talk) 22:41, 1 February 2015 (UTC)

You can add the data yourself using http://tools.wmflabs.org/wikidata-todo/quick_statements.php . However, it is still not possible to add columns 3-6 to Wikidata, as there is no support for quantities with units. Adding sockets supported (P1041) and instruction set (P1068) could be interesting, but I cannot find these data on your page. --Pasleim (talk) 13:54, 3 February 2015 (UTC)
This tool only adds statements to already existing items, but there are not yet items for all these processors. That's why I need a robot to create them; I don't want to create them manually, as I already did my part of the job by creating the tables from ARK. sockets supported (P1041) is the title of each post, and instruction set (P1068) is not available for all the processors. There is too little information for processors released before 2006. --MisterSanderson (talk) 18:11, 2 March 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/03/01/soquete-lga-1156/ .--MisterSanderson (talk) 18:11, 2 March 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/04/01/soquete-lga-1155/ .--MisterSanderson (talk) 21:14, 2 April 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/05/01/soquete-pga-478/.--MisterSanderson (talk) 00:35, 11 May 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/06/01/ppga-423/.--MisterSanderson (talk) 03:20, 15 June 2015 (UTC)

What's the point of saying that every month? Sjoerd de Bruin (talk) 07:29, 15 June 2015 (UTC)
I'm not saying the same thing each month; I'm announcing a new table every time.--MisterSanderson (talk) 00:19, 22 June 2015 (UTC)

Importing Commons Categories (P373)[edit]

Dutch Wikipedia has nl:Categorie:Wikipedia:Commonscat zonder link op Wikidata ("Commons category without an equivalent link on Wikidata"). From there the P373 statement could be filled. -- Pütz M. (talk) 23:13, 13 June 2014 (UTC)

en:Category:Commons category without a link on Wikidata contains even more. -- Pütz M. (talk) 23:28, 13 June 2014 (UTC)
 Writing something up for this, I'll work on it as much as I can. George Edward CTalkContributions 19:25, 15 January 2015 (UTC)
@George.Edward.C: any progress in this or do you know how to do that? We would like to import commonscat links from fi-wiki to Wikidata. --Stryn (talk) 17:59, 21 January 2015 (UTC)
Doing the parser right now; I will work on it more at the weekend. If I can figure out the parser, the rest should be easy to complete. Once it's written and working, I will run it on both NL and FI. George Edward CTalkContributions 18:05, 21 January 2015 (UTC)
Please don't run a bot on nl. There are a lot of conflicts in them, also a lot of pages with double templates. Sjoerd de Bruin (talk) 18:12, 21 January 2015 (UTC)
Noted. :-) The bot will only run on EnWiki and FiWiki (as long as there are no problems with either of them). (Edit: I will probably need a category similar to those mentioned in the request.) George Edward CTalkContributions 18:23, 21 January 2015 (UTC)
Been a while, but I've finally finished the code, and tested it with 4 edits (2 didn't work as planned, so I'm going to work on that, as it happens when Commonscat defaults to the pagename when no value is specified). Expect a request at RFP soon. --George (Talk · Contribs · CentralAuth · Log) 08:47, 20 February 2015 (UTC)
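The pagename-default case is exactly the kind of thing the extraction step has to handle. A minimal sketch of that step, assuming the usual {{Commons cat}} syntax (the function name is illustrative; a production bot would also need to handle redirects to the template and multiple templates on one page):

```python
import re

def commonscat_value(wikitext, pagename):
    """Return the Commons category for P373, or None if no template.
    When {{Commonscat}} has no parameter it defaults to the page name,
    which is the case the test edits tripped over."""
    m = re.search(r"\{\{\s*[Cc]ommons\s?cat(?:\s*\|\s*([^|}]*))?", wikitext)
    if not m:
        return None
    value = (m.group(1) or "").strip()
    return value if value else pagename
```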

Add instance of (P31) from MusicBrainz Artist Type via MusicBrainz artist ID (P434)[edit]

We can leverage the existing links between Wikidata and MusicBrainz to populate instance of (P31) for the people and groups linked with MusicBrainz artist ID (P434). Specifically, the bot would add human (Q5) if the MusicBrainz Artist Type was "Person", and band (Q215380) if the Type was "Group". The reference would be imported from (P143) MusicBrainz (Q14005), as with the MusicBrainz identifiers. If User:Mineo is interested, this could be an additional task for User:MineoBot -- or someone else could take it on. JesseW (talk) 06:21, 8 July 2014 (UTC)

What about duos, orchestras, and non-performing personnel (composers, producers, etc.)? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:18, 14 September 2014 (UTC)
That shouldn't be a problem for the first case (adding human (Q5)). It's a good point about automatically adding "Group" entries, though. MusicBrainz now categorizes some items in more detail (with orchestras, choirs, etc.) so those might be useful. JesseW (talk) 06:40, 5 December 2014 (UTC)
We should not describe, say, Simon & Garfunkel (Q484918) as an instance of (a) human (Q5). Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:04, 5 December 2014 (UTC)
I agree -- and they wouldn't fall under the first case, as they are identified as a Group on Musicbrainz (as can be seen here), not a Person. Do you see any problem with the first case (adding human (Q5) to items linked to "Person"-type MusicBrainz entries)? JesseW (talk) 06:53, 13 December 2014 (UTC)
I think "Group" in MusicBrainz is actually closer to instance of (P31) musical ensemble (Q2088357). According to http://tools.wmflabs.org/wikidata-todo/tree.html?lang=en&q=Q2088357&rp=279, orchestra (Q42998) is not currently a subtype of band (Q215380), whereas in MusicBrainz "Orchestra" is a subtype of "Group". -- Nikki (talk) 03:17, 5 January 2015 (UTC)
I agree, musical ensemble (Q2088357) is a better choice. Thanks for finding it. Now we just need to find someone interested in coding and running it... JesseW (talk) 06:01, 15 January 2015 (UTC)
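A sketch of the resulting type mapping in Python. The three QIDs follow the discussion above (human, musical ensemble, orchestra); the remaining MusicBrainz types are deliberately left unmapped so that a bot skips them for human review.

```python
# P31 targets per MusicBrainz artist type, per the discussion above.
P31_FOR_MB_TYPE = {
    "Person": "Q5",         # human
    "Group": "Q2088357",    # musical ensemble, not band (Q215380)
    "Orchestra": "Q42998",  # orchestra
}

def p31_for_artist(mb_type):
    """Return the P31 target for a MusicBrainz artist type, or None.
    Unmapped types (Choir, Character, Other, ...) are skipped so a
    human can decide what to do with them."""
    return P31_FOR_MB_TYPE.get(mb_type)
```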
I can take this from 24 January. --JulesWinnfield-hu (talk) 11:55, 15 January 2015 (UTC)

I've made ten test edits. May I proceed? What about types other than “Person” and “Group”? --JulesWinnfield-hu (talk) 15:34, 24 January 2015 (UTC)

Should Anonymous (Q10920) be an instance of a musical ensemble? Does anyone know how many groups that aren't really about music exist on MusicBrainz? If there are only a few, then I guess it's better to mark everything up and fix the few strange ones afterwards; however, if there are many, I think we need to make sure each group actually is about music. --Pajn (talk) 09:28, 25 January 2015 (UTC)
Thank you SO MUCH for coding this up and making some test edits! Considering the Anonymous issue, it seems like running it fully automatically would produce too many false positives. I see two possible paths forward:
  1. Generate list pages for manual operation (with humans checking to make sure the entity is actually music-related)
  2. Cross-check against Wikipedia categories and only do ones that already have a music-related category on their associated Wikipedia entry.
In either case, it would be great to get a sense of how many items we are talking about -- could you generate such counts? Thanks again (and sorry for the delayed response) JesseW (talk) 08:29, 23 February 2015 (UTC)
Also, of the Musicbrainz artist types, I think "Group" is probably the most ambiguous. You might have better luck working from "Person" (adding human (Q5), not anything more music-specific), or one of the new categories like Orchestra or Choir. JesseW (talk) 08:33, 23 February 2015 (UTC)
Type       Count   Total MB Artists   MB Artists with Wikidata links
Group      4,225   226,496            32,022
Person       578   455,522            74,148
Orchestra     54     1,343               458
Choir         50       933               129
Character     49     2,447               209
Other         45     1,320               125
N/A          146   241,961             1,441

--JulesWinnfield-hu (talk) 22:42, 23 February 2015 (UTC)

I'm surprised at how few Person items there are (considering that there are nearly 500,000 MusicBrainz entities of that type). I'm also slightly confused by what this is a count of. Is it:
  1. A count of Wikidata items with links to MusicBrainz Artist entries of the specified type
  2. A count of Wikidata items with links to MusicBrainz Artist entries of the specified type AND without instance of (P31).
  3. A count of MusicBrainz Artist entries of the specified type with a link to a Wikidata item
  4. Something else?
These may not be the same because while User:MineoBot does try to keep them synced, I'm not sure if it is fully bi-directional. Looking at the numbers, I think doing the Orchestra and Choir ones would be useful (and small enough to manually fix after the fact (we'll need a way to mark false positives so they don't get re-done)). Regarding Person -- could you make a count of the Wikidata items that are linked to MusicBrainz Artist Person entities and lack instance of (P31) = human (Q5)? Also, maybe we should move this to your talk page, or some smaller page. Thanks for your work. JesseW (talk) 18:17, 28 February 2015 (UTC)
The counts are of option 2. --JulesWinnfield-hu (talk) 19:32, 28 February 2015 (UTC)
Neat -- could you find the counts for option 1 as well? JesseW (talk) 19:36, 28 February 2015 (UTC)
It would take too long to run, because of the MusicBrainz rate limit. This run already took two and a half hours. --JulesWinnfield-hu (talk) 19:45, 28 February 2015 (UTC)
You might be able to use http://reports.mbsandbox.org/ . (Actually, I will look into doing so.) JesseW (talk) 20:24, 28 February 2015 (UTC)
OK, here's a report that provides option 3: http://reports.mbsandbox.org/report/295/view . I've added the results to the table.

Lists of listed buildings[edit]

As can be seen from https://tools.wmflabs.org/wikidata-todo/autolist.html?q=claim[1216]%20and%20noclaim[625], a number of "list of listed buildings in [UK place]" articles (for example, Listed buildings in Brindley (Q15979098)) have multiple National Heritage List for England number (P1216) values. These should be removed, and replaced by the equivalent Q value, as has part (P527). Where they have multiple coordinate values these should also be removed. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:14, 20 September 2014 (UTC)

Can anyone help with this? Perhaps User:Magnus Manske has an idea? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 23:13, 26 October 2014 (UTC)
Note that many of these National Heritage List for England number (P1216) values are for Grade II buildings, many of which do not yet have a Q item. The WLM drive only called for Grade I and Grade II* buildings to get items. If there is will to do this, I recommend quick_statements. To create a new item for each value, linking back to the list (no leading spaces; separate by tabs):
CREATE
LAST P361 Qlist
LAST P1216 "value"

plus some "grade II" instance of, if the grade is known. (Note that I am using part of (P361) here; has part (P527) could then be created as "counter-link"?) One should probably clear out double values for National Heritage List for England number (P1216) first. --Magnus Manske (talk) 23:42, 26 October 2014 (UTC)
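Generating those quick_statements lines from a cleaned-up table is easy to script. A minimal sketch; the function and its parameters are illustrative, and the grade item is passed in rather than hard-coded because the correct QID depends on how listing grades are modelled:

```python
def quickstatements_for_building(list_qid, nhle_number, grade_qid=None):
    """Emit the CREATE/LAST lines shown above for one building row.
    Statements are tab-separated, as quick_statements expects."""
    lines = [
        "CREATE",
        "LAST\tP361\t" + list_qid,             # part of: the list item
        'LAST\tP1216\t"' + nhle_number + '"',  # NHLE number, as a string
    ]
    if grade_qid is not None:
        lines.append("LAST\tP31\t" + grade_qid)  # e.g. a "grade II" item
    return "\n".join(lines)
```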

@pigsonthewing:@Magnus Manske: I'll take a look at the double values; most of the ones I've spot-checked so far are the list & the building, but not all. - PKM (talk) 18:02, 23 November 2014 (UTC)
Okay, I have cleared out the list of doubles for P1216. There were two cases where two items have the same English Heritage ID because there are separate ENwiki articles for different tenants of the same building. Otherwise, National Heritage List for England number (P1216) should only be duplicated between a building's item and the list-of-listed-buildings item.
I have not added new, separate items for every boathouse, pumphouse, wall, and gate that is part of an estate, so a few items have multiple EH IDs. Does that need to be done? - PKM (talk) 04:20, 24 November 2014 (UTC)

@Pigsonthewing: Should this job still be done? On one hand there are still 74 lists of listed buildings with National Heritage List for England number (P1216) claims but on the other hand, no single item out of 436 lists of listed buildings has a has part (P527) claim. So it doesn't look to me that there is a consensus to use has part (P527) on those lists.--Pasleim (talk) 12:54, 28 February 2015 (UTC)

Adding SOC job codes[edit]

http://www.bls.gov/soc/#materials

I've noticed that SOC Occupation Code (2010) (P919) didn't have much data (https://tools.wmflabs.org/wikidata-todo/translate_items_with_property.php?prop=919), while a canonical list from the Bureau of Labor Statistics (thus PD and freely reusable, cf http://www.bls.gov/bls/linksite.htm) does exist (we're talking 7K items).

The data is tabulated as follows:

SOC Codes and Job Titles
2010 SOC Code   2010 SOC Title                    2010 SOC Direct Match Title
11-1011         Chief Executives                  CEO
11-1011         Chief Executives                  Chief Executive Officer
11-1011         Chief Executives                  Chief Operating Officer
11-1011         Chief Executives                  Commissioner of Internal Revenue
11-1021         General and Operations Managers   Department Store General Manager

It is available at http://www.bls.gov/soc/soc_2010_direct_match_title_file.xls (with more related files at http://www.bls.gov/soc/#materials).

What I think could be nice is adding the SOC codes based on the 2010 SOC Direct Match Title, since so many outside services and pages use the SOC codes (which seem to be a requirement for a lot of job offers in the US).

I've already added the French equivalent of the SOC code to Wikidata manually, so being able to match national codes and job titles through Wikidata would be cool. Teolemon (talk) 18:59, 29 November 2014 (UTC)

SOC is based on a UN standard, ISCO. Each country's bureau of national statistics has its own translation/adaptation of ISCO. There is a standard for historical occupations too, HISCO. In Denmark the national adaptation of this code system is called DISCO, in Norway STYRK, and so on. Adding all these codes will be messy, since they may be non-overlapping. But it may also become a source for statisticians to get translations from one code to another. H@r@ld (talk) 23:12, 19 January 2015 (UTC)

NLI identifier (P949)[edit]

Can a bot please check the NLI (Israel) identifier (P949) values? Some of the IDs are wrong; see Property talk:P949#Unstable numbers. I don't know whether it is a small or a large percentage. --Kolja21 (talk) 03:24, 27 December 2014 (UTC)

Add data of birth / death to French people[edit]

I noticed that a lot of items about people (instance of (P31) -> human (Q5)) don't have date of birth (P569) or date of death (P570), but do have an article on the French Wikipedia (a list of about 78,000 items). I already imported a lot of dates from the Dutch Wikipedia, but that was much more difficult. It looks like the French Wikipedia uses the templates Date de naissance (date of birth) and Modèle:Date de décès (date of death). Could someone please harvest these templates? Probably best suited to a French-speaking bot operator. Multichill (talk) 21:31, 28 December 2014 (UTC)
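A sketch of the harvesting step, assuming the positional-parameter form such as {{Date de naissance|4|mai|1722}}; the real templates also take named parameters and display options that a production bot would have to handle:

```python
import re

def harvest_date(wikitext, template):
    """Pull (day, month, year) strings from a date template such as
    {{Date de naissance|4|mai|1722}}. Returns None if the template is
    absent or doesn't carry three positional parameters."""
    pattern = r"\{\{\s*%s\s*\|([^}]*)\}\}" % re.escape(template)
    m = re.search(pattern, wikitext)
    if not m:
        return None
    parts = [p.strip() for p in m.group(1).split("|")]
    if len(parts) < 3:
        return None
    return parts[0], parts[1], parts[2]
```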

Mayor election results in Hungary[edit]

Good day!

Last year, mayoral elections were held in Hungary. The information about the mayors of Hungarian settlements on the French Wikipedia needs to be updated. The data are available on valasztas.hu: [3] --นายกเทศมนตรี (talk) 12:32, 6 January 2015 (UTC)

I will have a look at that during the weekend. Orlodrim (talk) 11:52, 16 January 2015 (UTC)
I updated the data templates on the French Wikipedia.
I don't know if it's possible to put the mayors on Wikidata without creating an item for each mayor, but if someone is interested, I have put the list in CSV format on this page.
Orlodrim (talk) 21:56, 16 January 2015 (UTC)

Dates[edit]

I have set more than 8,000 dates via "Wikidata - The Game", and because of that I can safely say that a bot could also do this with an efficiency of 95-99%. The bot should only look at articles whose first line contains two brackets with date(s) within them. If the person's date of death is missing, the bot should enter the date after the "-", the "†" or the word "died" as the property value. If the person's date of birth is missing, the bot should enter the date before the "-", after the "*" or after the word "born". This could also be done in other languages simply by translating the words "born" and "died". --Impériale (talk) 00:43, 11 January 2015 (UTC)

I have set more than 100,000 dates via "Wikidata - The Game", and I highly disagree that this is something that can be done by bot -- there are just too many irregularities. In the case where dates are added with templates it can be done very reliably, but not in cases where the dates are stored in plain text. Jon Harald Søby (talk) 19:26, 15 January 2015 (UTC)
Haha, good work! What irregularities do you mean? I'll try to describe the bot better (I have a bit of experience with programming):

    if the first line contains numbers and parentheses:
        if a "*" or "born" is between the parentheses:
            if there is exactly one date between the "*"/"born" and the "†"/"died":
                if the person's date of birth is missing in Wikidata:
                    transcribe it to Wikidata
                else:
                    go to the next article
            else:
                go to the next article
        elif a "†" or "died" is between the parentheses:
            if there is exactly one date between the "†"/"died" and the closing ")":
                if the person's date of death is missing in Wikidata:
                    transcribe it to Wikidata
            else:
                go to the next article
        elif a "-" is between the parentheses:
            if there is exactly one date between the "(" and the "-":
                if the person's date of birth is missing:
                    transcribe it to Wikidata
            else:
                go to the next article
            if there is exactly one date between the "-" and the ")":
                if the person's date of death is missing:
                    transcribe it to Wikidata
            else:
                go to the next article
        else:
            go to the next article
    else:
        go to the next article
I know there are probably some mistakes, and this bot could not set all dates (there would still be a lot of work for us), but I think it should work nearly without mistakes and help us a lot. --Impériale (talk) 17:04, 16 January 2015 (UTC)
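For what it's worth, the bracket heuristic can be sketched compactly in Python. The function is illustrative only, and the DATE pattern below matches just one format ("4 May 1722"), which illustrates the irregularity objection: every additional date format and language multiplies the edge cases.

```python
import re

DATE = r"(\d{1,2} \w+ \d{4})"  # matches e.g. "4 May 1722" only

def extract_dates(first_line):
    """Apply the bracket heuristic to an article's first line.
    Returns (birth, death); either may be None."""
    m = re.search(r"\(([^)]*)\)", first_line)
    if not m:
        return None, None
    inside = m.group(1)
    born = re.search(r"(?:\*|\bborn\b)\s*" + DATE, inside)
    died = re.search(r"(?:†|\bdied\b)\s*" + DATE, inside)
    birth = born.group(1) if born else None
    death = died.group(1) if died else None
    if birth is None and death is None:
        # plain "birth - death" range between the parentheses
        parts = re.split(r"\s[–-]\s", inside, maxsplit=1)
        if len(parts) == 2:
            left = re.search(DATE, parts[0])
            right = re.search(DATE, parts[1])
            birth = left.group(1) if left else None
            death = right.group(1) if right else None
    return birth, death
```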

I have a suggestion: go over all 8000 additions and make sure you know whether each and every date was in the Gregorian or Julian calendar. This has been discussed before in this forum.
For example, this edit claims that David Leslie Melville, 6th Earl of Leven, was born 4 May 1722, Gregorian calendar. But this source says that he was born 4 May 1722 in Leven, Fife, UK. At that date and place the Julian calendar was in force, so the date 4 May 1722 appears to be a Julian calendar date, not a Gregorian calendar date. Jc3s5h (talk) 02:14, 17 January 2015 (UTC)
Do you happen to know approximately how many articles (in percentage terms) are affected by this? --Impériale (talk) 00:19, 19 January 2015 (UTC)
If you mean how many in all of wikidata, I would say all dates before 1583 are suspect. Also, dates of people from the British Isles, Canada, and American colonies before 1752. Russia before 1918. Greece before 1923. I don't know what the coverage is for persons from various areas and time periods, so I couldn't guess what the percentage is. Jc3s5h (talk) 03:15, 20 January 2015 (UTC)
So after these dates the bot probably wouldn't make any mistakes, right? --Impériale (talk) 16:50, 27 January 2015 (UTC)
Since the bot will usually not know in what region a dated event occurred, it would have to assume the latest date. The latest adoption date for the Gregorian calendar (where the previous calendar was the Julian calendar) is documented at https://secure.ssa.gov/apps10/poms.nsf/lnx/0200307523
That date is March 1, 1923. Jc3s5h (talk) 18:05, 27 January 2015 (UTC)
I think even that would affect a few thousand entries. Is there anyone who could realise this? --Impériale (talk) 17:04, 28 January 2015 (UTC)

────────────────────────────────────────────────────────────────────────────────────────────────────If you mean, can anyone make the corrections with a bot, I don't know how myself, but I guess the procedure might be something like this:

  1. Obtain a list of all the edits made by the bot that didn't know about Julian/Gregorian.
  2. See if the date is greater than or equal to 1 March 1923; if so, do nothing.
  3. See if the current date matches the date inserted by the bot; if the dates do not match, do nothing.
  4. See if any references have been added since the bot edit; if so, put the article on a list for manual inspection.
  5. If no references have been added since the bot edit, delete the date.

Jc3s5h (talk) 17:21, 28 January 2015 (UTC)
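Steps 2-5 above amount to a pure decision function, which could be sketched like this (the names are illustrative; step 1, obtaining the bot's edit list, happens outside):

```python
from datetime import date

# Latest Julian-to-Gregorian switch date discussed above.
GREGORIAN_SAFE = date(1923, 3, 1)

def action_for(claim_date, bot_value, current_value, references_added):
    """Decide what to do with one bot-imported date, per steps 2-5.
    Returns 'keep', 'manual', or 'delete'."""
    if claim_date >= GREGORIAN_SAFE:
        return "keep"      # step 2: calendar is unambiguous
    if current_value != bot_value:
        return "keep"      # step 3: a human already changed the value
    if references_added:
        return "manual"    # step 4: a human vetted it; inspect by hand
    return "delete"        # step 5: unsourced and calendar-suspect
```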

{{OK}} Your list looks great, but I think the last point is not necessary, because no references are required for manual additions either. --Impériale (talk) 21:25, 28 January 2015 (UTC)
The point about references is that if the references are the same as when the bot imported the date, there is no new information to show if the date is right or wrong. But if a reference was added, that means a human editor probably looked at the date, decided it was right, and added a reference. A bot should not override a decision by a human editor.
At some point in the process, there must be a step where the suspicious date is deleted. Otherwise the process never does anything and there is no point in the procedure. Jc3s5h (talk) 21:42, 28 January 2015 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── "In the case where dates are added with templates it can be done very reliably" - In en.Wikipedia infoboxes, many (sadly not all) dates use en:Template:Birth date, en:Template:Birth date and age, en:Template:Death date, en:Template:Death date and age, en:Template:Start date, en:Template:Death date, en:Template:Film date, or suchlike. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:04, 13 March 2015 (UTC)

It is fortunate that some infoboxes do not use the templates listed by User:Pigsonthewing because all those emit metadata which is specified to use ISO 8601 and the Gregorian calendar. But since it is contrary to normal writing customs in the English language to use the Julian calendar before 15 October 1582, or for any event that occurred at a time and place where the Julian calendar was in force, any such date placed in one of the listed templates would be false information, either to the human reader, or to the metadata user. Jc3s5h (talk) 23:51, 13 March 2015 (UTC)

Move all template ICD9 and ICD10 references to wikidata[edit]

I would be grateful if a knowledgeable editor could create a bot to go through every template in en:Category:Medicine templates and its subcategories and move the ICD-9 and ICD-10 references contained in the title field to Wikidata. Relevant details:

Thanks! --LT910001 (talk) 21:59, 6 February 2015 (UTC)

LT910001, resolving the constraint violations of P493 and P494 first could help. --Succu (talk) 22:11, 6 February 2015 (UTC)
Thanks for your reply; I apologise for not having responded more promptly. Whether or not there are existing constraint violations (by which I assume you mean duplicates and poor formatting?), this won't affect the data that's already present in the infoboxes. It may even be more helpful to have that data on board here so that these violations can be addressed in a more comprehensive manner. Moving this from the English WP to WD would certainly help the English WP. I will respond more promptly in the future! --LT910001 (talk) 21:30, 14 February 2015 (UTC)

London transport adjacent stations from English Wikipedia navbox tables[edit]

Hello,

Most Tube, DLR and London Overground stations have nice tables with adjacent stations, qualified by line (Piccadilly line, Victoria line, DLR, etc.). It would be nice to somehow scrape that data and include it in the Wikidata item for each station. Holek (talk) 12:15, 3 March 2015 (UTC)

How will these be qualified (north, south, inbound/ outbound, name of line, etc)? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:32, 17 March 2015 (UTC)
I'd say by name of the line and direction, much like some German and Polish stations. (Sorry for the late response.) Holek (talk) 23:46, 5 April 2015 (UTC)

Importing sitelinks from en:Template:German Type VII submarines[edit]

Could someone have a Bot compare (a) the articles linked to at en:Template:German Type VII submarines and sub-templates with (b) pt:Predefinição:Tipo VIIA, pt:Predefinição:Tipo VIIB, pt:Predefinição:Tipo VIIC, pt:Predefinição:Tipo VIIC/41, pt:Predefinição:Tipo VIID, pt:Predefinição:Tipo VIIF; and, when both en.wp and pt.wp articles exist, but aren't linked on Wikidata, merge the Wikidata items? I have done two ([4], [5]) by hand as proof of concept; it would be quite tedious to manually go through the entire list. Thanks! It Is Me Here t / c 20:26, 30 March 2015 (UTC)

radio stations, replace located in the administrative territorial entity (P131) with headquarters location (P159)[edit]

There are 13314 radio stations with located in the administrative territorial entity (P131) which should all actually be headquarters location (P159). Here is a list of them. Could someone please change the property on all these items? Popcorndude (talk) 18:17, 18 April 2015 (UTC)

For instance of (P31)  radio station (Q14350) or instance of (P31)  television station (Q1616075) where country (P17)  Canada (Q16) or country (P17)  United States of America (Q30): It's likely that the value of the ambiguous located in the administrative territorial entity (P131) is neither the headquarters location (P159) (offices/studio) nor the transmitter (Q190157) location, but actually licensed to broadcast to (P1408) (the "city of license" that the station is mandated to target but tries to ignore), or occasionally a marketing claim of the main city of a media market (Q2270014), which there is no Wikidata property for. Sometimes headquarters location (P159) and licensed to broadcast to (P1408) and the media market (Q2270014) are the same place, but very often they are not. You can check licensed to broadcast to (P1408) against the official U.S. database by running a query on https://www.fcc.gov/encyclopedia/fm-query-broadcast-station-search (or the other search tabs on that page) however. Reliable sourcing for the studio or headquarters location (P159) would require more complex sources. --Closeapple (talk) 18:41, 21 April 2015 (UTC)
It would seem that 693 of them have located in the administrative territorial entity (P131)  Texas (Q1439), which leads me to suspect that many of the others are equally vague. What property should then be used? Popcorndude (talk) 22:50, 21 April 2015 (UTC)
I don't think there's any good destination for the data that is in located in the administrative territorial entity (P131) without qualifiers. I think the accurate thing to do would be:
  1. Remove located in the administrative territorial entity (P131) if there is no qualifier and no source except imported from (P143), since there's no evidence that the value is real. That's especially true near any border, and a lot of major metropolitan areas are near one.
  2. Import licensed to broadcast to (P1408) by matching the city and state from the Federal Communications Commission (Q128831) database or AMQ/FMQ/TVQ dumps; both city and state are in there. It's authoritative for country (P17)  United States of America (Q30). (You have to make sure the name matches the right item if the state has multiple settlements with the same name, of course.) It also has data for other countries in the Americas (Q828) if you're desperate, but it's not as accurate: for country (P17)  Canada (Q16) or country (P17)  Mexico (Q96) it will be the licensed to broadcast to (P1408) "notified" to the U.S. for treaty purposes, which may be different from the current station and its domestic license; you can get the real domestic information from those countries' websites, though. For other countries, sometimes it's even the name of a nationwide station instead of a location; it's not accurate beyond the country level.
  3. You can grab the transmitter coordinates in the same way, but processing that is tricky: If you're going to actually use them for coordinate location (P625), they have to be converted from NAD27 or NAD83 to World Geodetic System 1984 (Q215848) used by Wikidata.
  4. Even unconverted, the transmitter datum would be within a few hundred meters, so if you're just trying to find out whether the studio is in a specific administrative territorial entity (Q56061), you can probably assume that a station that has a full-power license, is not a Non-commercial educational (Q17145784), and has a transmitter at least 50 km away from any border has to have a studio in the same administrative territorial entity (Q56061), since U.S. law requires such stations to have a nearby "main studio". You could do the same by getting coordinates for licensed to broadcast to (P1408) and guessing that the studio wouldn't be more than 50 km from the licensed to broadcast to (P1408) without a Non-commercial educational (Q17145784) license or a "main studio waiver". That's the good news. The bad news is that this requires comparing the coordinates to geographic borders. And it may be a leap to even assume that the legal studio is a headquarters location (P159): most stations are owned by out-of-town media conglomerates now, and though most still have some form of local management, they sometimes don't have much real local decision-making power left. --Closeapple (talk) 08:33, 22 April 2015 (UTC)
I should have pointed out that the above advice is still for country (P17)  Canada (Q16) or country (P17)  United States of America (Q30). The concept of licensed to broadcast to (P1408) based on a locality (Q3257686) may not exist in other places: it might be per media market (Q2270014) or administrative territorial entity (Q56061), or there might be no official licensed to broadcast to (P1408) at all. --Closeapple (talk) 02:10, 23 April 2015 (UTC)
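The 50 km transmitter test described above reduces to a great-circle distance check against sampled border points; the NAD27/NAD83 to WGS 84 datum conversion itself is best left to a dedicated library such as pyproj. A minimal sketch of the distance step, where `border_points` is a hypothetical pre-sampled list of border coordinates:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS 84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km: mean Earth radius

def transmitter_clearly_inland(tx_lat, tx_lon, border_points, margin_km=50):
    """True if the transmitter is farther than margin_km from every sampled border point."""
    return all(haversine_km(tx_lat, tx_lon, lat, lon) > margin_km
               for lat, lon in border_points)
```

Since the unconverted datum is only off by a few hundred meters, a 50 km margin leaves plenty of slack for this kind of coarse test.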

Patrol page moves and deletes[edit]

Two kinds of edits pollute the list of recent unpatrolled changes that are trivial to patrol:

  • “Page moved from [Xwiki:A] to [Xwiki:B]”: Mark as patrolled if Xwiki:A is now a redirect of Xwiki:B.
  • “Page on [Xwiki] deleted: A”: Mark as patrolled if Xwiki:A is deleted.

Even if the page move or deletion is incorrect, that’s something that Xwiki has to handle, not Wikidata.

It would be great if someone could write a bot to automatically patrol these two kinds of changes, so that human patrollers can focus on more important changes. —DSGalaktos (talk) 11:42, 22 April 2015 (UTC)
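The first check ("Xwiki:A is now a redirect of Xwiki:B") maps directly onto the MediaWiki API: querying the old title with `action=query&redirects=1` makes the API report the resolved redirect. A minimal sketch of the decision step, operating on the already-parsed JSON response (fetching it, and the actual patrol action, are left out):

```python
def move_is_trivial(api_result, old_title, new_title):
    """Given the parsed JSON of action=query&titles=<old>&redirects=1&format=json,
    return True if the old title now redirects to the new title."""
    redirects = api_result.get("query", {}).get("redirects", [])
    return any(r.get("from") == old_title and r.get("to") == new_title
               for r in redirects)
```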

I agree, these edits should be patrolled automatically. But in case you don't know: User:pasleim recently created a great tool that puts unpatrolled edits into groups (like "page moves") and allows patrolling them from a list view, and even patrolling multiple edits (of one user) with one click. I think this helps a lot already. --YMS (talk) 12:02, 22 April 2015 (UTC)
I saw the announcement, but wasn’t aware of those features… thanks! —DSGalaktos (talk) 15:39, 22 April 2015 (UTC)
For fun, I started writing the bot myself (bash, curl, jq, sed, and coreutils). You can see it on GitHub. “Page moved” patrolling works. “Page deleted” isn’t yet implemented, mainly because it’s harder to work on: there are fewer edits of this type that I can use for testing :D —DSGalaktos (talk) 15:05, 27 April 2015 (UTC)
@DSGalaktos: Even if the page move or deletion is incorrect, that’s something that Xwiki has to handle, not Wikidata. I'm not sure I agree. A confused item/article subject pair can very well mean a wrong label later, and incorrect statements thereafter, as speakers of that language will read the label. It is definitely a problem for resolving future interwiki conflicts and redirects. And I don't really like "this is not our problem"; it means potential future misunderstandings and throwing the ball at each other. I don't think that Wikidata and the other wikis are different projects, as they are highly connected and dependent on each other. For example, I think that infoboxes on Wikipedia should be edited on Wikidata. Changes on Wikidata can affect infoboxes. So it is our duty to take care of each other. TomT0m (talk) 16:04, 27 April 2015 (UTC)
@TomT0m: I don’t think most Wikidata users are equipped to properly assess page moves, simply because of language problems. From my personal experience, I’d say I could actually understand about 10% of the page moves I’ve patrolled – the rest was in Hebrew, or Chinese, or Japanese, or Kannada, or some other language that I don’t speak. (That number is probably biased, because I’m likely to be awake and active at the same time as other European users, whose changes I’m more likely to understand than Asian users’.) In those cases, all I can do is check that both pages have the same content, and that the old page has something right below the title that looks like a “redirected from” note.
And I really don’t think that’s a problem. What I’m proposing to auto-patrol is only the most basic kind of change: the old page redirects to the new one. In my experience, this usually happens because the spelling of the title changed, e. g.: an English title or term being translated; or transliteration of a name added, improved, or removed in favor of original spelling (example); or simply case changes (example). None of these are problematic in any way – in fact, it wouldn’t change much if the Wikidata link wasn’t updated at all (users would just see the redirect). (Similarly, if a page on another wiki was deleted, there’s no point in keeping the sitelink, because it doesn’t go anywhere.) The other frequent kind of page move, where the old page becomes a disambiguation (for example, for multiple people with the same name), I’m happy to leave alone. —DSGalaktos (talk) 19:58, 27 April 2015 (UTC)
As for the 2nd point: I've seen a bunch of unpatrolled edits with "article was deleted" where the article was only temporarily deleted (for moving, for merging histories, for filtering vandal edits) and immediately restored. But the sitelink isn't restored in the item itself! So such edits should be manually checked (and reverted). --Infovarius (talk) 13:25, 29 April 2015 (UTC)
I wasn’t aware of such edits – should I limit the bot to page moves only? (Which conveniently is exactly what’s already implemented.) —DSGalaktos (talk) 13:26, 30 April 2015 (UTC)
I’ve written another version, again only for page moves (deletions are much rarer and consequently less important), but this time as a user script: User:DSGalaktos/checkMoveDiff.js. If the new page has a redirect from the old page, it adds a green ✔ before the move comment, freeing you of the necessity to verify the move manually, while still leaving the responsibility of the actual patrolling to you. (I recommend using User:DSGalaktos/hidePreviewOnDiff.js as well if you’re using the Preview gadget.) Perhaps User:TomT0m will be happier with this behavior? —DSGalaktos (talk) 15:19, 30 April 2015 (UTC)

Move last group of ship classes[edit]

Now that a consensus was finally reached to retain vessel class (P289), I would appreciate a bot to port the last 800 or so statements that still indicate the ship class [ ship class (Q559026) ], improperly using property instance of (P31).

Out of 5175 ships using vessel class (P289), these 800 or so : [Query: claim[31:(claim[31:(tree[559026][][279])])]] are duplicating the information in P31. Rather than blindly deleting these P31 statements (since they are redundant), I would prefer to port references and qualifiers from P31 to P289.

Thanks ---- LaddΩ chat ;) 00:10, 24 April 2015 (UTC)

I'm sad to learn that. By the way, you did not comment on Wikidata:Requests_for_comment/Adopt_Help:Classification_as_an_official_help_page, which takes ship classes as an example and is way broader in scope. I put a lot of energy into this, so I'd like at least some comments. TomT0m (talk) 06:25, 24 April 2015 (UTC)
Hi TomT0m, I responded on your talk page. There is nothing "sad" about fixing the remaining inconsistencies to comply with the community decision, rather the opposite. -- LaddΩ chat ;) 14:20, 24 April 2015 (UTC)
I'm just saying I would have liked the decision to be the other option :) TomT0m (talk) 08:12, 25 April 2015 (UTC)
  • Oppose - The general concept is to have P31 always to be as precise as possible. FreightXPress (talk) 12:10, 28 April 2015 (UTC)
    • Comment User:Laddo made me aware of this discussion. So, there might be reasons to have "vessel class", e.g. display in infoboxes. But that does not mean that for ships, submarines, and spacecraft the P31 should be degraded. All P279 of coastal defence ship (Q2304194) are ship classes [6]. So all ships belonging to one of these ship classes can be given the more specific value as P31. If there were a subclass "blue coastal defence ship", a ship could end up in two subclasses. But that is not the case. Each ship (that is a coastal defence ship [clarification thanks to User:TomT0m]) is an instance of exactly one subclass of coastal defence ship (Q2304194) or is a direct instance of coastal defence ship (Q2304194). It can then inherit anything from that class. That is what subclassing is for, isn't it? FreightXPress (talk) 15:25, 29 April 2015 (UTC)
      • @FreightXPress: Did you read Help:Classification? If you did, you know you're wrong ;) No, not every subclass of ship is a ship class in the sense of the en:ship class article. But every subclass of ship that is a ship class has the statement
        < the ship class > instance of (P31) < ship class >
        , which allows us to query for or find the correct class. But I'm not sure you do not know that. TomT0m (talk) 15:32, 29 April 2015 (UTC)
        • @TomT0m: "No, every subclass of ship en:ship class is not a ship class in the sense of the article." - I know that. Laddo made that pretty clear. Not every ship type class is a ship class. Please help me understand which of my statements was wrong. FreightXPress (talk) 15:44, 29 April 2015 (UTC)
          • @FreightXPress: I barely understood what you mean, every ship is obviously not a coastal defense ship. TomT0m (talk) 15:48, 29 April 2015 (UTC)
            • @TomT0m: Thanks a lot. Now I see how my text might be misleading if not read as a whole. Starting with "All P279 of coastal defence ship (Q2304194)" I was only talking about coastal defence ships. FreightXPress (talk) 15:52, 29 April 2015 (UTC)
              • Oh, OK, I think we agree then. Of course, as I said before, for practical reasons it could be convenient to have as real statements the instance of (P31) claims about the ship type and the ship class to ease infobox coding. No change in meaning, just a small amount of chosen redundancy. TomT0m (talk) 16:38, 29 April 2015 (UTC)

Import DOI Prefixes[edit]

DOI Prefix (P1662): This Crossref list could be imported. It's not complete but contains a basic selection of DOI prefixes. --81.171.85.254 13:42, 30 April 2015 (UTC)

ISBN conversions[edit]

We have, at the time of writing, 8,287 items with ISBN-13 (P212) and 10,713 items with ISBN-10 (P957). Since it is possible to generate the former from the latter, as described on Wikipedia, can someone please do so? The task might usefully be repeated on, say, a monthly basis. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:15, 11 May 2015 (UTC)
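The conversion is mechanical: strip the ISBN-10 check digit, prefix the remaining nine digits with 978, and recompute the EAN-13 check digit with alternating weights 1 and 3. A minimal sketch:

```python
def isbn10_to_isbn13(isbn10):
    """Convert an ISBN-10 to its ISBN-13 form: prefix 978, drop the old
    check digit, and recompute the EAN-13 check digit (weights 1,3,1,3,...)."""
    core = "978" + isbn10.replace("-", "").replace(" ", "")[:9]
    total = sum((3 if i % 2 else 1) * int(d) for i, d in enumerate(core))
    return core + str((10 - total % 10) % 10)
```

For example, `isbn10_to_isbn13("0-306-40615-2")` yields `"9780306406157"`. A bot would still want to validate the ISBN-10 check digit first and skip malformed values.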

Changing P143 with P248 only on certain items[edit]

If it's possible, I'd like to replace imported from (P143) with stated in (P248) only on those items that use Anagrafe delle biblioteche italiane (Q19753501) as a source, since I've read the latter is to be preferred. There should be just around 60 items to fix. I'd like to do it myself, but I don't know how to use pywikibot for this. Thanks. --Sannita (ICCU) (talk) 13:17, 11 May 2015 (UTC)
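Since P143/P248 live in the references of each claim, the core of the task is rewriting one reference's snaks. A minimal sketch of that pure transformation step, assuming the standard Wikibase reference JSON shape (actually applying it, e.g. via pywikibot or the wbsetreference API, is left out):

```python
def retarget_reference(ref, old="P143", new="P248"):
    """Rewrite one Wikibase reference dict, moving all snaks from the old
    property key to the new one and updating each snak's "property" field.
    Assumes the shape {"snaks": {...}, "snaks-order": [...]}."""
    snaks = dict(ref.get("snaks", {}))
    if old not in snaks:
        return ref  # nothing to do
    moved = []
    for snak in snaks.pop(old):
        snak = dict(snak)
        snak["property"] = new
        moved.append(snak)
    snaks[new] = moved
    order = [new if p == old else p for p in ref.get("snaks-order", [])]
    return {**ref, "snaks": snaks, "snaks-order": order}
```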

Adding missing labels for all biological species, genus, families, etc[edit]

In a lot of languages the labels are missing, even though a label could be added. This is especially easy for items with instance of (P31)  taxon (Q16521) + taxon name (P225): the scientific name added with P225 can be added as the label for all languages using the Latin alphabet. If a label has already been set in a language and differs from the Latin name, the taxon name (P225) should be added as an alias (if it is not already there). This should make finding the right items a lot easier.

This bot run should be done for all languages with Latin alphabet, and those include: en, de, fr, it, es, af, an, ast, bar, br, ca, co, cs, cy, da, de-at, de-ch, en-ca, en-gb, eo, et, eu, fi, frp, fur, fy, ga, gd, gl, gsw, hr, hu, ia, id, ie, is, io, kg, lb, li, lij, mg, min, ms, nap, nb, nds, nds-nl, nl, nn, nrm, oc, pcd, pl, pms, pt, pt-br, rm, ro, sc, scn, sco, sk, sl, sr-el, sv, sw, vec, vi, vls, vo, wa, wo, zu. Are there any languages missing where this should be done as well?

Thanks! Romaine (talk) 17:25, 13 May 2015 (UTC)
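The per-language decision described in the request can be sketched as a pure planning step. This is only a sketch: the language list is truncated (the full list is above), and writing the labels and aliases back, e.g. via wbeditentity or pywikibot, is left out:

```python
LATIN_LANGS = ["en", "de", "fr", "it", "es", "nl", "sv"]  # extend with the full list above

def plan_taxon_labels(taxon_name, labels, aliases, langs=LATIN_LANGS):
    """Per language: add the scientific name as label if the label is
    missing, or as an alias if a different label already exists."""
    new_labels, new_aliases = {}, {}
    for lang in langs:
        existing = labels.get(lang)
        if existing is None:
            new_labels[lang] = taxon_name
        elif existing != taxon_name and taxon_name not in aliases.get(lang, []):
            new_aliases[lang] = taxon_name
    return new_labels, new_aliases
```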

There is no consensus for this (see latest discussion). --Succu (talk) 17:48, 13 May 2015 (UTC)
What I found is a discussion from exactly one year ago, with just one person not supporting because "the gadgets then need to load more data". Is that the same "no consensus" you meant? Edoderoo (talk) 20:13, 13 May 2015 (UTC)
(after edit conflict) The discussion you link to points to another discussion in which only one person objects, because it would increase the size of items and some tools can't handle that. This was and is silly reasoning: the adding will happen anyway, it will only take a bit longer. Now we have the problem that a lot of items can't be found because of missing labels. Is Wikidata intended to serve bots only, or can we humans get better service by having a label in our own language? Romaine (talk) 20:16, 13 May 2015 (UTC)
Almost all taxa have an English label, so searching for them should give you a hit. Adding these labels is not a big deal; my bot can easily do this. Simply get consensus for it. --Succu (talk) 20:33, 13 May 2015 (UTC)
In the search box on top of every page, I only get to see the items with a label in the language I have set in my preferences. So it happens a lot that I can't easily search and select. Secondly, to be able to link to an item from other items, I need a label set in my language. This makes it frustrating for users to work with Wikidata. Romaine (talk) 20:45, 13 May 2015 (UTC)
Romaine: An example would be helpful. --Succu (talk) 21:10, 13 May 2015 (UTC)
A random example is Kahayan (Q12488462); searching for it fails. This is not a taxon, but it illustrates that we need more labels in more languages on many more items, including taxa. Romaine (talk) 22:01, 13 May 2015 (UTC)
  • I haven't changed my position: it would be best if the software could read P225 as if it were the label (this would save a whole lot of redundant content). And otherwise, Succu is the user that has the closest knowledge of the situation here. - Brya (talk) 11:06, 14 May 2015 (UTC)
Fully agree with Brya here. The Lua-modules used on WP and in other places should often prefer to use the name-properties when they exist. Introducing "wrong label" by bot, especially to this kind of items, would cause a lot of wrath on my home-wp. The search-engine-problem can be solved otherwise. Earlier, I could search for correct items by the content in the properties. I have not seen that for some time here now. -- Innocent bystander (talk) 11:40, 14 May 2015 (UTC)

mark Hidden categories as Administrative[edit]

https://www.wikidata.org/wiki/Wikidata:Project_chat/Archive/2015/04#Marking_adiministrative_categories.

I tried to do this with http://tools.wmflabs.org/autolist/index.php by specifying Category=Hidden categories, depth=1 (all admin cats are direct subcats of "Hidden categories").

But there are 15,859 total (see https://en.wikipedia.org/wiki/Category:Hidden_categories), so the tool takes forever. Could someone do it with a bot? Please:

  • set P31:Q15647814 Wikimedia administration category page
  • delete P31:Q4167836 Wikimedia category

--Vladimir Alexiev (talk) 06:52, 21 May 2015 (UTC)
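The swap itself is a simple per-item plan; a sketch of the decision step with the two item ids from the request (applying the resulting edits, e.g. with pywikibot, is left out):

```python
ADMIN_CAT = "Q15647814"  # Wikimedia administration category page
PLAIN_CAT = "Q4167836"   # Wikimedia category

def plan_p31_swap(p31_targets):
    """Given the current P31 target ids of a hidden-category item,
    return (ids to add, ids to remove) for the cleanup."""
    add = [ADMIN_CAT] if ADMIN_CAT not in p31_targets else []
    remove = [q for q in p31_targets if q == PLAIN_CAT]
    return add, remove
```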

Don't forget that other projects could do something different. :) Sjoerd de Bruin (talk) 12:06, 25 May 2015 (UTC)

Elo ratings for chess players[edit]

The FIDE database is the official source of present and past Elo ratings (e. g. Magnus Carlsen: Elo history where 1503014 is Fide ID (P1440)). Column RTNG is the Elo rating. It would be nice to have those values in the items with Fide-ID - the present value is the most interesting one, but the full history could be useful as well. That would add Elo ratings to thousands of chess players. The request was originally posted by an IP at the German bot requests page. --mfb (talk) 13:47, 7 June 2015 (UTC)

✓ Done should I repeat it every month? --Pasleim (talk) 19:53, 12 June 2015 (UTC)
Thanks, but you only added the Elo ratings of the current month. My wish is that every Elo rating a player had in the past is added. The FIDE database contains the ratings back to 2000. Is it possible for your bot to add all these ratings? Note that until 2009 there were only four Elo ratings a year, from 2009 until 2012 there were six ratings a year, and since 2012 every month has its own new rating. 85.212.22.136 07:59, 13 June 2015 (UTC)
Thanks. A monthly bot run to get the most recent numbers would be great. --mfb (talk) 20:53, 13 June 2015 (UTC)
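For the historical ratings, each value would become a claim qualified with point in time (P585) at month precision. A sketch of the statement JSON a bot could build; this assumes the Elo rating is stored as a quantity under the Elo rating property (P1087), and the exact datavalue shapes should be double-checked against the Wikibase data model before use:

```python
def elo_statement(rating, year, month):
    """Build the JSON skeleton of an Elo rating claim qualified with
    point in time (P585), as passed to wbcreateclaim/wbsetqualifier."""
    return {
        "type": "statement",
        "rank": "normal",
        "mainsnak": {
            "snaktype": "value", "property": "P1087",
            "datavalue": {"type": "quantity",
                          "value": {"amount": "+%d" % rating, "unit": "1"}},
        },
        "qualifiers": {
            "P585": [{
                "snaktype": "value", "property": "P585",
                "datavalue": {"type": "time",
                              "value": {"time": "+%04d-%02d-01T00:00:00Z" % (year, month),
                                        "precision": 10,  # 10 = month precision
                                        "calendarmodel": "http://www.wikidata.org/entity/Q1985727"}},
            }],
        },
    }
```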

Chemical substances from dewiki[edit]

Could someone let a bot walk de:Kategorie:Chemische Verbindung and give all articles that don't have an item yet (because they don't have interwikis) a Q number? For statistical purposes this would be useful. At least 1000 substances from the German Wikipedia are not on Wikidata at the moment.--Kopiersperre (talk) 19:59, 7 June 2015 (UTC)

With [7] I created 60 items. I can't find more articles in that category without an item. --Pasleim (talk) 20:48, 7 June 2015 (UTC)
@Pasleim: Have you walked all subcategories? File:Treffen der Redaktion Chemie - Statistik.pdf says that there were 11297 articles about chemical substances in 2013. Wikidata Query only counts 7771. Is my query wrong?--Kopiersperre (talk) 17:46, 8 June 2015 (UTC)
@Kopiersperre: Probably some items of chemical substances don't have instance of (P31)=chemical compound (Q11173). You can add them via Autolist.--GZWDer (talk) 04:53, 10 June 2015 (UTC)
@GZWDer: I've just got a CSV from CatScan. Is it now possible to fill instance of (P31)=chemical compound (Q11173) by bot?--Kopiersperre (talk) 14:34, 11 June 2015 (UTC)
@Kopiersperre: If there are only Q-ids, you may fill them into the "Manual item list" box of [8]. If it is page names, add [[ and ]] before and after every page name, then use [9].--GZWDer (talk) 16:26, 11 June 2015 (UTC)
@Kopiersperre: Just a detail: don't import large data sets about chemicals from deWP. I am curating the current constraint lists for chemicals, so I don't want a bot coming after me and adding wrong data. Item creation is OK, but for data imports please leave a comment at Wikidata:WikiProject Chemistry. Thanks Snipre (talk) 14:07, 17 June 2015 (UTC)
@Snipre: Because I couldn't exclude some proteins from the organic chemicals, I'm adding properties by hand. Most of the newly created substances on dewp didn't have P31=Q11173. I think a bot is missing that adds new chemicals.--Kopiersperre (talk) 14:36, 17 June 2015 (UTC)
@Kopiersperre: We can't analyze the creation of new articles in each WP. The best approach is to inform the WPs that, in case of creation of new articles about chemicals, they can leave a message at Wikidata:WikiProject Chemistry. Snipre (talk) 15:09, 17 June 2015 (UTC)

US-American => P27:Q30 (currently 83579 items)[edit]

It might be worth doing this addition: Search descriptions for "American". --- Jura 16:29, 9 June 2015 (UTC)

I suspect that might not always be accurate, for various reasons. For example, there are 294 items on that list that have their date of death listed as before the United States was founded. --Yair rand (talk) 07:11, 10 June 2015 (UTC)
I'm afraid that, after tens of thousands of P27 statements have been (semi)robotically added based on much weaker guesses than this one, it doesn't really matter. The P27 data can't be trusted anymore; surely not the statements without references.--Shlomo (talk) 11:00, 10 June 2015 (UTC)
We can probably clean it up, given enough time. That is, assuming nobody goes around continuing to make things worse. @Jura1: Should I assume you're verifying your hundreds of additions on this manually? --Yair rand (talk) 16:42, 10 June 2015 (UTC)
If I placed it here, it wasn't to do it myself. The sample could be limited by P570. --- Jura 16:46, 10 June 2015 (UTC)
It would need to be limited quite a bit further than that, I think. For example, the items with labels beginning with "American Samoa[n]", many of whom do not have American citizenship. I don't recommend doing this, even if a reasonable number of extra conditions can be figured out. Better to find some actual sources, and add those. --Yair rand (talk) 16:58, 10 June 2015 (UTC)
An alternate route is to use fairly reliable categories from Wikipedia. --- Jura 17:17, 10 June 2015 (UTC)

Change qualifier type of P1343[edit]

According to the Wikidata:Project chat#Change described by source (P1343) qualificator for Wikisource articles discussion, please change all stated in (P248) qualifiers of the described by source (P1343) property to subject of (P805). Documentation and Lua modules will be updated after the bot work is complete. -- Vlsergey (talk) 20:54, 21 June 2015 (UTC)

Import coordinates from Wikipedia[edit]

Even though more and more articles directly grab coordinates from Wikidata, many people still add coordinates in Wikipedia articles, and these should be imported to Wikidata. All primary coordinates in :fr:Catégorie:Page sans coordonnées Wikidata, and probably the sitelinked categories as well, should be imported to Wikidata. Possible exceptions would be subclasses of organization (Q43229), which may not need coordinates in Wikidata. But as they appear to be a small minority, importing coordinates for all pages appears to be much better than importing nothing at all.

@Multichill: I think you have done this one before ?

Ideally, this task would be repeated periodically.--Zolo (talk) 06:11, 30 June 2015 (UTC)
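Much of the work in such an import is parsing the coordinate templates in the wikitext: the degrees/minutes/seconds groups of a typical {{coord}} call have to be converted to the decimal degrees Wikidata stores. A minimal sketch of that conversion step (template extraction itself is left out):

```python
def dms_to_decimal(deg, minute, sec, hemi):
    """Convert one degrees/minutes/seconds group from a {{coord}} template
    to decimal degrees; hemi is one of "N", "S", "E", "W"."""
    value = deg + minute / 60.0 + sec / 3600.0
    return -value if hemi in ("S", "W") else value
```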

@Zolo: I guess the subject should actually be to import coordinates to wikidata? I've imported coordinates in the past, for this I wrote coordinate import.py. Anyone with Pywikibot installed can run it. It will skip organization items if you set the headquarters location qualified with the coordinates. Anyone else feels like having a shot at this? Multichill (talk) 16:37, 1 July 2015 (UTC)
@Multichill:, actually I meant import from Wikipedia. Thanks, I have not installed Pywikibot and my todo list is already long, but if nobody has done it in a few weeks, I might give it a try. --Zolo (talk) 07:57, 2 July 2015 (UTC)
@Zolo: it's running. Multichill (talk) 19:45, 2 July 2015 (UTC)

Import titled player from FIDE database[edit]

Is it possible for a bot to import e.g. all players with the title "International Master" from the FIDE database? At the moment these are 3434 players, and only a few of them are already in Wikidata. Per the notability criteria in, e.g., the English and German Wikipedias, all International Masters are notable. 85.212.22.136 22:35, 30 June 2015 (UTC)

Update category labels[edit]

Could a bot adjust the labels of category items to their respective sitelinks? There are a lot of old labels out there. Sjoerd de Bruin (talk) 14:09, 1 July 2015 (UTC)

I work on categories periodically; do you have an example? --ValterVB (talk) 18:53, 1 July 2015 (UTC)
Someone left a message on my talk page about Category:Compositions by Hubert Parry (Q8406409). He didn't understand the situation on Wikidata because the label of the item was still "Categorie:Compositie van Parry" instead of "Categorie:Compositie van Hubert Parry". I think category labels should always be the same as the sitelinks. Sjoerd de Bruin (talk) 19:07, 1 July 2015 (UTC)

Connecting it.wiki's categories[edit]

There are a few hundred categories on it.wiki without a Wikidata item that could be linked to a category on fr.wiki (and possibly other wikis). I wonder if a bot could:

  • connect each "Categoria:Film giapponesi del <year>" to frwiki's "Catégorie:Film japonais sorti en <year>"
  • connect each "Categoria:Film portoghesi del <year>" to frwiki's "Catégorie:Film portugais sorti en <year>"
  • connect each "Categoria:Film brasiliani del <year>" to frwiki's "Catégorie:Film brésilien sorti en <year>"
  • connect each "Categoria:Film polacchi del <year>" to frwiki's "Catégorie:Film polonais sorti en <year>"
  • connect each "Categoria:Film sovietici del <year>" to frwiki's "Catégorie:Film soviétique sorti en <year>"
  • connect each "Categoria:Film svizzeri del <year>" to frwiki's "Catégorie:Film suisse sorti en <year>"
  • connect each "Categoria:Film hongkonghesi del <year>" to frwiki's "Catégorie:Film hongkongais sorti en <year>"
  • connect each "Categoria:Film messicani del <year>" to frwiki's "Catégorie:Film mexicain sorti en <year>"
  • connect each "Categoria:Film senegalesi del <year>" to frwiki's "Catégorie:Film sénégalais sorti en <year>"
  • connect each "Categoria:Film turchi del <year>" to frwiki's "Catégorie:Film turc sorti en <year>"
  • connect each "Categoria:Film israeliani del <year>" to frwiki's "Catégorie:Film israélien sorti en <year>"
  • connect each "Categoria:Film tunisini del <year>" to frwiki's "Catégorie:Film tunisien sorti en <year>"

obviously also adding the proper label. There may be some cases where the category on itwiki is already linked, but I believe that if it is not then the category has no Wikidata item (or frwiki's category does not exist; in both cases, there should be no need for merges). Thanks in advance--Dr Zimbu (talk) 11:50, 2 July 2015 (UTC)
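Once the patterns are tabulated, generating the title pairs to connect is mechanical; a sketch (only the first two patterns shown, the rest follow the same shape):

```python
PATTERNS = [
    ("Categoria:Film giapponesi del {y}", "Catégorie:Film japonais sorti en {y}"),
    ("Categoria:Film portoghesi del {y}", "Catégorie:Film portugais sorti en {y}"),
    # ... the remaining pairs from the list above
]

def category_pairs(years, patterns=PATTERNS):
    """Yield (itwiki title, frwiki title) pairs for each year."""
    for y in years:
        for it_pat, fr_pat in patterns:
            yield it_pat.format(y=y), fr_pat.format(y=y)
```

The bot would then skip any pair where the itwiki category is already sitelinked or the frwiki category does not exist, as noted above.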

Remove "(disambiguation)" from English labels[edit]

Per autolist (slow) there are over 255k items whose label contains "(disambiguation)". However, that suffix should not be part of the label, so I'm asking someone to remove those disambiguators. --Stryn (talk) 15:09, 2 July 2015 (UTC)
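A safe removal would only strip the exact trailing suffix, leaving parentheses that belong to the title itself untouched; a minimal sketch:

```python
import re

# Match only a literal " (disambiguation)" at the very end of the label.
SUFFIX = re.compile(r"\s*\(disambiguation\)$")

def strip_disambig(label):
    """Remove a trailing ' (disambiguation)'; other parentheses stay."""
    return SUFFIX.sub("", label)
```

Cross-checking against an instance of (P31) = disambiguation page statement before editing would make this safer still.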

There are also quite a lot of items about people that have the disambiguator imported from Wikipedia articles. These should be fixed as well. --- Jura 16:24, 2 July 2015 (UTC)
Yeah, but that's not that easy. Often, the part in parentheses belongs to the title (e.g. song titles, places, ...), so it would be wrong to remove them all (some bots did not import them in the first place, which is why many labels are now actually missing a part in parentheses). For specific labels like "(disambiguation)", especially if they can be cross-checked with an "is a disambiguation page" statement, automatic removal would be a good thing, though. --YMS (talk) 16:30, 2 July 2015 (UTC)
I can't follow: What's the link between "items about people" and song titles? --- Jura 16:35, 2 July 2015 (UTC)
Perhaps what makes it obvious is an example or two which are false positives for a simple "has parentheses with a phrase inside" or a more complex of "has such a phrase at the end of the title", as in en:Benzo(a)pyrene and en:V (The Final Battle). --Izno (talk) 17:02, 2 July 2015 (UTC)
Neither should have P31:Q5. If you have some others, maybe yes. --- Jura 17:04, 2 July 2015 (UTC)