Wikidata:Requests for permissions/Bot

To request a bot flag, or approval for a new task, in accordance with the bot approval process, please create a request for your bot below, appending the task number if your bot is already approved for other tasks.

Old requests go to the archive.

Once consensus is obtained in favor of granting the bot flag, please post the request at the bureaucrats' noticeboard.

Bot Name Request created Last editor Last edited
Jeblad (bot) 2015-08-03, 01:08:06 Jeblad (bot) 2015-08-03, 01:09:09
Xaris333Bot 2015-07-29, 17:45:22 Xaris333 2015-07-29, 17:45:22
ProteinBoxBot 4 2015-07-28, 09:18:37 Andrew Su 2015-07-29, 16:04:35
ProteinBoxBot 3 2015-07-26, 07:28:08 Andrew Su 2015-07-29, 16:05:18
Dexbot 11 2015-04-07, 18:15:00 Wylve 2015-05-16, 21:59:15
SaschaBot 2 2015-03-26, 16:28:44 Jura1 2015-05-26, 11:46:03
Revibot 3 2014-12-07, 12:16:54 Pasleim 2015-03-30, 21:34:41
Shyde 2014-11-29, 15:09:36 Gallaecio 2015-06-20, 11:19:26
JhealdBot 2014-09-07, 23:30:46 Jheald 2014-11-22, 22:39:39
BthBasketbot 2014-06-10, 08:17:14 Bthfan 2014-08-11, 14:27:02
Fatemibot 2014-04-25, 08:59:32 Ymblanter 2014-09-19, 12:38:16
ValterVBot 12 2014-04-11, 19:12:34 Nemo bis 2015-07-15, 16:09:59
Structor 2014-04-09, 15:50:38 Ymblanter 2014-10-15, 06:44:08
Global Economic Map Bot 2014-01-26, 21:42:37 Ladsgroup 2014-06-17, 14:02:29
KunMilanoRobot 2014-01-21, 19:27:44 Filceolaire 2015-07-25, 00:55:22

Jeblad (bot)

Jeblad (bot) (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Jeblad (talk | contribs | logs)

Task/s: Mostly just language specific updates for the Norwegian (bokmål) community. Updates of illustrations, setting labels, adding claims specific for Norway, etc. Maintenance stuff.

Code: Pywikibot-core (with some additional tweaks).

Function details: Not much of interest here. --Jeblad (bot) (talk) 01:07, 3 August 2015 (UTC)


Xaris333Bot

Xaris333Bot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Xaris333 (talk | contribs | logs)

Task/s: Harvesting infoboxes and categories from Wikipedias (mostly to add claims to items).

Code: Pywikibot-core

Function details: I will be using the two pywikibot scripts available for Wikidata to add claims from infobox parameters (using harvest) and from categories (using claimit). I am most active on the Greek Wikipedia and I mostly intend to edit items related to Greece and Cyprus. For a recent trial run, I added follows (P155) and followed by (P156) from el:Πρότυπο:Κουτί πληροφοριών περιόδου πετοσφαιρικού τουρνουά to items linked to articles in the category el:Κύπελλο Κύπρου (πετοσφαίριση γυναικών). I also have many other similar contributions from last year (see contribs). --Xaris333 (talk) 17:45, 29 July 2015 (UTC)
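The harvesting step described above boils down to mapping infobox parameters onto Wikidata properties before any claims are written. A minimal sketch of that mapping, where the parameter names are illustrative assumptions (the actual template parameters may differ):

```python
# Hypothetical mapping from infobox parameters to Wikidata property IDs.
# P155 (follows) and P156 (followed by) are the properties named in the
# request; the parameter names "previous"/"next" are assumptions.
PARAM_TO_PROPERTY = {
    "previous": "P155",  # follows
    "next": "P156",      # followed by
}

def claims_from_infobox(params):
    """Turn parsed infobox parameters into (property, target) pairs."""
    claims = []
    for name, value in params.items():
        prop = PARAM_TO_PROPERTY.get(name)
        if prop and value and value.strip():
            claims.append((prop, value.strip()))
    return claims
```

In the real bot, pywikibot's harvest script performs this lookup and resolves each target string to an item before saving.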

ProteinBoxBot 4

ProteinBoxBot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Sebastian Burgstaller (talk)

Task/s: ProteinBoxBot 4 will add new items and improve/maintain existing items on pharmaceutical drugs and drug-like compounds and substances (small molecules and biologics).

Code: The code is divided into two parts. A core class which handles all Wikidata API related tasks [1] and a pharmaceutical drug data specific part [2].

Function details: The core class handles all Wikidata API related tasks and specifically takes care to select the correct item for writing data to. This module is used by all subtasks the ProteinBoxBot performs and is called by the resource specific part of the bot. The latter takes care of aggregating pharmaceutical drug data information from a set of well established scientific databases, e.g. DrugBank, PubChem, ChEMBL. A full list of properties maintained by the bot and the data sources used is available at the bot page [3]. The bot first aggregates the data from the listed resources and then writes the aggregated data to Wikidata. As a starting set, the bot will work on all FDA/EMA approved or withdrawn drugs listed in Drugbank. For each value, the data source is added as a reference and therefore very reliably maintains provenance. The bot will run at least weekly and will therefore maintain a very up-to-date list of pharmaceutical drugs and the corresponding data. --Sebotic (talk) 09:18, 28 July 2015 (UTC)
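The aggregate-then-write pattern described above keeps each value paired with the database it came from, so a reference can be attached to every statement. A minimal sketch of that bookkeeping; the record shape and function name are assumptions, not the bot's actual code:

```python
def aggregate(records):
    """Merge (source, property, value) records from several databases.

    Returns {property: [(value, source), ...]}, keeping the first source
    seen for each distinct value so provenance survives deduplication.
    """
    merged = {}
    for source, prop, value in records:
        entries = merged.setdefault(prop, [])
        if all(v != value for v, _ in entries):
            entries.append((value, source))
    return merged
```

The write phase would then emit one statement per (value, source) pair, adding the source as a reference, as the request describes.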

  • Support. Would be awesome to have the bot working on adding pharmaceutical drugs, next to the genes and diseases. Andrawaag (talk) 15:21, 28 July 2015 (UTC)
  • Support. --I9606 (talk) 16:31, 28 July 2015 (UTC)
  • Support. I see a huge need of CC0 data to support pharmaceutical research of various kinds, and I really think getting more of this data into Wikidata will hugely benefit the field. Moving this data into Wikidata will create an open platform for linking related data, open ways for open community curation, create a bigger Linked Open Data network (e.g. link to PubChemRDF), etc. I really hope to see this move forward. Egon Willighagen (talk) 19:58, 28 July 2015 (UTC)
  • Support. Andrew Su (talk) 16:04, 29 July 2015 (UTC)

ProteinBoxBot 3

ProteinBoxBot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Andra Waagmeester (talk)

Task/s: Add disease information from the disease ontology to Wikidata and subsequently enrich these items with relevant additions from authoritative resources.

Code: The ProteinBoxBot contains a core module which takes care of the generic handling of adding and updating items on Wikidata for all current and future tasks. Each task requires a resource-specific module. The module for the Disease Ontology: code on Bitbucket.

Function details: The ProteinBoxBot is core to our efforts to enrich Wikidata with genes, proteins, diseases, drugs, and the relationships between them. We currently have code in place to enrich Wikidata with content from Entrez Gene and the Disease Ontology. With the previous bot credentials this code has added human and mouse genes as well as the diseases in the Disease Ontology, and the items have also been updated regularly. We now understand that part of these tasks was performed outside the scope of the initial bot approval. We would like to request approval for the task of adding content from the Disease Ontology. The implemented bot takes disease classes from the Disease Ontology and adds/updates them as Wikidata items. In the near future the bot will also add PubMed references and maintain links to the English Wikipedia.

--Andrawaag (talk) 07:27, 26 July 2015 (UTC)

  • Support. Original issues were fixed. Bot is ready to run again. Emitraka (talk) 10:44, 26 July 2015 (UTC)
  • Support. The bot is functional and waiting to add useful content. See test edits. Please approve. I9606 (talk) 16:21, 27 July 2015 (UTC)
  • Support. I fully support adding this information to Wikidata and can easily imagine many projects taking advantage of it. Egon Willighagen (talk) 22:16, 27 July 2015 (UTC)
  • Support. Yes. Multichill (talk) 16:40, 28 July 2015 (UTC)
  • Support. Andrew Su (talk) 16:05, 29 July 2015 (UTC)

Addbot 5

Addbot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Addshore (talk | contribs | logs)

Task/s: Mark dates that need checking for calendar model correctness

Code: PHP, not written yet; it will use the addwiki framework and will be published on GitHub.

Function details:
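The function details are not spelled out above, so the following is only a speculative sketch of one plausible check behind "dates that need checking for calendar model correctness": a date stored with the proleptic Gregorian calendar model but falling before the 1582 Gregorian switch may deserve review. The threshold and the use of the calendar-model URI as the discriminator are assumptions, not the bot's actual rule:

```python
# Wikidata's item for the proleptic Gregorian calendar.
GREGORIAN = "http://www.wikidata.org/entity/Q1985727"

def needs_calendar_check(year, calendarmodel):
    """Flag pre-1582 dates recorded as Gregorian for human review."""
    return calendarmodel == GREGORIAN and year < 1582
```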

--·addshore· talk to me! 09:35, 22 July 2015 (UTC)

Support, sounds good to me! Sjoerd de Bruin (talk) 12:41, 23 July 2015 (UTC)
I have already suggested it, but now ask: Will the dates be marked as deprecated as well? Matěj Suchánek (talk) 14:02, 23 July 2015 (UTC)
No. Or at least that is not in the 'current plan'. If everyone feels they should be I am happy to also mark them as deprecated. ·addshore· talk to me! 14:48, 23 July 2015 (UTC)
As I have suggested, when a client wiki uses some data, these are not usually deprecated data (well, there aren't many such statements), so marking them as deprecated can "hide" them. Of course, checking for this qualifier pair is possible as well. Matěj Suchánek (talk) 14:59, 23 July 2015 (UTC)
Leaning oppose using qualifiers to track meta-information. --Ricordisamoa 23:33, 23 July 2015 (UTC)


Dexbot 11

Dexbot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Ladsgroup (talk | contribs | logs)

Task/s: Auto-transliterating for names of humans

Code: Based on pywikibot (pwb); I will probably publish it soon.

Function details: The code analyses dumps of Wikidata and can build an auto-transliteration system for any given pair of languages based on them. I started with Persian and Hebrew (some test edits: [4] [5]) --Amir (talk) 18:14, 7 April 2015 (UTC)
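The dump-based approach described above can be illustrated with a deliberately toy sketch: from items that already carry labels in both languages, learn a character-level mapping, then apply it to names that lack one of the labels. Real scripts are rarely one-to-one, so treat this as an illustration of the idea rather than the bot's actual algorithm:

```python
from collections import Counter, defaultdict

def learn_char_map(label_pairs):
    """label_pairs: [(source_label, target_label)] taken from items that
    have both labels. Returns the most frequent target character for each
    source character (only equal-length pairs are used, a toy restriction)."""
    votes = defaultdict(Counter)
    for src, tgt in label_pairs:
        if len(src) == len(tgt):
            for s, t in zip(src, tgt):
                votes[s][t] += 1
    return {s: c.most_common(1)[0][0] for s, c in votes.items()}

def transliterate(name, char_map):
    """Map each character; leave unknown characters unchanged."""
    return "".join(char_map.get(ch, ch) for ch in name)
```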

  • Comment: please let me know when you try your system on some Cyrillic language. I'd like to see it myself. --Infovarius (talk) 14:10, 8 April 2015 (UTC)
@Infovarius: I work on pairs of languages like fa and he (the bot adds the Persian transliteration based on the Hebrew one and vice versa). Which pair of languages do you suggest? en and ru? Amir (talk) 11:54, 9 April 2015 (UTC)
Probably you should have stated this in your request. Your phrase "I started with" encouraged me :) No, I don't suggest Russian, as I understand the complexity of the task. --Infovarius (talk) 13:16, 10 April 2015 (UTC)
@Infovarius: I don't think Russian is too complicated to abandon. I took care of lots of different issues, including country of citizenship, etc., so it's not hard for this bot. I asked which language you think is the best pair for Russian *to start with*. Amir (talk) 21:11, 10 April 2015 (UTC)
Will the bot be able to detect delicate labels as in King An of Han (Q387311)? --Pasleim (talk) 19:24, 13 April 2015 (UTC)
It probably skips them or makes a correct transliteration (depends on the language), but I can't say for sure. Let me test. Amir (talk) 13:33, 15 April 2015 (UTC)
Are we ready for approval here?--Ymblanter (talk) 16:08, 15 April 2015 (UTC)
  • Just a caveat when dealing with Chinese languages: Chinese to Latin script (and vice versa) transliterations are rarely standardized. For example, Alan Turing's given name might be transliterated into 艾伦 or 阿兰 (as in the case of Alan Moore (Q205739)) or 亚伦 (as in the case of Alan Arkin (Q108283)). These Chinese characters roughly resemble "Alan" when pronounced, but due to regional differences (i.e. mainland China, Taiwan, Hong Kong, etc.), they result in different transliterations. Even when two people's names are transliterated in the same region, they can be different. There is simply no standardization on this matter. —Wylve (talk) 14:53, 23 April 2015 (UTC)
    hmm, User:Wylve: Just a question: Is it wrong to put "亚伦" for Alan in Alan Turing? Amir (talk) 12:36, 25 April 2015 (UTC)
    It's not wrong, but it might not be the only way people call Alan Turing in Chinese. The lead sentence of Turing's article on zhwiki mentions that "Alan" is also transliterated as 阿兰. —Wylve (talk) 20:48, 25 April 2015 (UTC)
    @Wylve: I made 50 auto-transliterations [6], please check and say if anything is wrong or unusual. Thanks Amir (talk) 20:05, 16 May 2015 (UTC)
    I can't verify every name, since some of those people aren't mentioned in Chinese news sources. My standard of what is "wrong" or "unusual" is whether the transliterations you've produced are used predominantly in reliable and reputable sources. It is hard to judge sometimes, as there is a variety of transliterations used. For instance:
  • Jonathan Ross is transliterated as 强纳·森罗斯 and also 喬納森·羅斯
  • Leonard B. Jordan is also transliterated as 萊昂納德·B·喬丹
  • Jimmy Bennett is also transliterated as 吉米·本内特, 吉米班奈, 吉米班奈特.
  • Jason Lee is also named 杰森·李.
  • "Scott" from A. O. Scoot is also transliterated as 史考特.
  • All of your edits should be fine if read in Chinese, as they all sound like their English names. Also, I have found this page ([7]), which documents Xinhua News Agency (Q204839)'s official transliterations of names. These transliterations are considered official only in mainland China. —Wylve (talk) 21:58, 16 May 2015 (UTC)

SaschaBot 2

SaschaBot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Sascha (talk | contribs | logs)

Task/s: Add missing common names to Wikidata.

Function details: To mine for missing common names, I went over all humans in Wikidata, extracted the first part of their English label (e.g., Simone de Beauvoir → Simone), and matched this against a list of all common names in Wikidata. Here is the result: List of common names that seem to be missing from Wikidata.

It should be easy to mine the gender of these common names. I think this would be best done in a later, separate pass, since this could then also check the gender of existing common names in Wikidata. After that step, another bot run would create descriptions (such as Female common name) for all common names in Wikidata that don't have descriptions yet.

Impact: If the bot gets permission to run, it would create 108636 items. If we insert missing common names only if there's at least 2 people with that name, the bot would create 30893 items. If we restrict to names with at least 3 people, it would create 18361 items.

Caveats: If you look at the list, you will see a couple of bogus entries. Some are not in the Latin script, or contain funny characters like :. I will make sure that these do not get inserted, but I wanted to start the discussion now. However, there are also some entries that would not be detectable by a script, such as Empress. What should we do about those? Is there a good tool so that others could help review the list? (I've made the spreadsheet world-editable on Google Docs.)
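The mining and filtering described above can be sketched as a single pass: take the first token of each human's English label as a candidate name, drop candidates with non-Latin characters or stray punctuation, and keep those borne by at least a minimum number of people (the 2- and 3-person thresholds from the impact note). The regex filter is an assumption of how the script-based cleanup might look:

```python
import re
from collections import Counter

# Rough Latin-script filter: letters (incl. common accented ones),
# apostrophes and hyphens only. An assumption, not the bot's actual rule.
LATIN_NAME = re.compile(r"^[A-Za-zÀ-ÿ'-]+$")

def candidate_names(labels, min_people=2):
    """First token of each label, filtered and thresholded."""
    counts = Counter(label.split()[0] for label in labels if label.strip())
    return sorted(name for name, n in counts.items()
                  if n >= min_people and LATIN_NAME.match(name))
```

Entries like "Empress" would still slip through, which is exactly the human-review problem raised above.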

--Sascha (talk) 16:28, 26 March 2015 (UTC)

Hey Sascha. The list is at the moment a mix of first names, last names (e.g. Li), pseudonyms (e.g. Seven) and other entries (e.g. Saint, K). To set proper descriptions and for later usage it is however important that the type of name is known. Do you see a way to figure out the type automatically or should all entries be reviewed by a human? --Pasleim (talk) 17:52, 26 March 2015 (UTC)
Notify User:Jura1 – the name expert in Wikidata --Pasleim (talk) 18:03, 26 March 2015 (UTC)
  • Good idea. I had thought about doing that at some point as well, but I'm glad it's being taken up.
    How about checking the names against some of the lists at Wikipedia? Special:Search/list of given names helps find some.
    WikiProject Names describes how to structure the items.
    To avoid problems, I usually leave out given names that are not first names (Chinese, Korean, Japanese, Hungarian).
    This list provides most existing first names. --- Jura 20:34, 26 March 2015 (UTC)
    BTW, I couldn't resist and created Phil (Q19685923). --- Jura 05:47, 27 March 2015 (UTC)
@Sascha: Finally, do you plan to use the list or may I use it to create some of the missing names? --- Jura 16:36, 13 April 2015 (UTC)
Apologies for the delay, I was traveling and just came back today. Sure, feel free to use the list. Sascha (talk) 11:36, 26 May 2015 (UTC)
Thanks, but in the meantime I made one on quarry and outlined a "top-down" approach on WikiProject Names. --- Jura 11:44, 26 May 2015 (UTC)

Revibot 3

Revibot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: -revi (talk | contribs | logs)

Task/s: Fix double redirect

Code: mw:Manual:Pywikibot/

Function details: It's simple: the bot will retrieve the list of double redirects and try to fix them, unless a redirect is circular. (I am running an initial test run now.) — Revi 12:16, 7 December 2014 (UTC)
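The fix described above, including the circular case mentioned just below, amounts to following a redirect chain to its final target and bailing out when the chain loops. A minimal sketch on an in-memory redirect map (the real bot works through pywikibot's page objects):

```python
def resolve_double_redirect(page, redirects):
    """redirects: {page_title: target_title}.

    Returns the final target a double redirect should point at,
    or None if the chain is circular and must be left for a human.
    """
    seen = {page}
    target = redirects.get(page)
    while target in redirects:
        if target in seen:
            return None  # circular redirect: skip
        seen.add(target)
        target = redirects[target]
    return target
```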

Note: Moe Epsilon's userpages are a circular redirect, which means the bot cannot solve it. — Revi 12:23, 7 December 2014 (UTC)
On hold: phab:T77971 is blocking this task. — Revi 15:19, 9 December 2014 (UTC)
@-revi: I've been doing this task for a couple of weeks. If you want to take it over, I've published the code here. --Pasleim (talk) 21:33, 30 March 2015 (UTC)


Shyde

Shyde (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Gallaecio (talk | contribs | logs)


  • Update of the latest stable version of free software. My initial plan is to support video games from the Chakra repositories that already exist in Wikidata. For games that do not exist in Wikidata, I may create an entry for them if a Wikipedia article exists about them. Later, I plan to extend the software list to other types of Chakra software that are also present in Wikidata or the English Wikipedia.


Function details:

  • For each piece of software that the script supports (current list):
    • Add the latest stable version to the version property of a software entity if such version is not present.
    • Add a release date qualifier to the latest stable version property value of a software entity if such version lacks a release date qualifier.
    • Add a URL reference to the latest stable version property value of a software entity if such version lacks a reference.
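The three bullet points above can be read as a decision function: given what an item already records about its latest stable version, report which edits are still needed. The field names in this sketch are illustrative assumptions about how the script might represent an item, not Wikidata's data model:

```python
def edits_needed(item, latest_version):
    """item: {'version': str|None, 'release_date': str|None, 'reference': str|None}.

    Returns the list of edits the bot should make, per the task list above.
    """
    edits = []
    if item.get("version") != latest_version:
        edits.append("add version statement")
    else:
        # The latest version is present; check its qualifier and reference.
        if not item.get("release_date"):
            edits.append("add release date qualifier")
        if not item.get("reference"):
            edits.append("add URL reference")
    return edits
```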


  • I was writing a script to detect which games (and possibly regular applications in the future) are out of date in the Chakra repositories. I realized that such information, if published in Wikidata, could benefit a wide audience. Since I have some experience with Pywikibot, I thought that writing a new script to update this information in Wikidata would be fun, and it would help me get to know Wikidata better.

--Gallaecio (talk) 15:09, 29 November 2014 (UTC)

It would be good if you could set the rank of the latest version to "preferred" and all other ranks to "normal". --Pasleim (talk) 18:17, 6 December 2014 (UTC)
Any progress here?--Ymblanter (talk) 16:40, 11 February 2015 (UTC)
@Gallaecio:--GZWDer (talk) 04:19, 12 February 2015 (UTC)
I was planning to answer as soon as I had it done, but I am busy working on something else and it will take me at least a couple of months. In that time I won’t run the bot. That said, my plan is to implement the rank thingy before I run the bot again. Many thanks for your feedback. Gallaecio (talk) 05:01, 25 May 2015 (UTC)
@GZWDer: Done, the bot can now make the latest version the "preferred" one and any other version "normal". I have a daily test that will let me know when any of the data that the bot can update needs updating, and I will run the bot manually when required. Thanks! Gallaecio (talk) 11:19, 20 June 2015 (UTC)


JhealdBot

JhealdBot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Jheald (talk | contribs | logs)

Task/s: To add about 35,000 new topic's main category (P910) / category's main topic (P301) property pairs based on matches on Commons category (P373)

Code: not yet written

Function details: About 35,000 potential new topic's main category (P910) / category's main topic (P301) pairs have been identified, based on a unique Commons category (P373) value shared between one category-like item and one non-category-like item, where the article-like item does not currently have any topic's main category (P910) set.
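The matching described above can be sketched as pairing category items and article items that share a Commons category (P373) value, keeping only values that are unique on each side. The data shapes and function names here are assumptions about a preprocessing step, not the bot's published code:

```python
def match_pairs(category_items, article_items):
    """Both arguments: {item_id: commons_category_value}.

    Returns sorted [(article_item, category_item)] pairs where the
    Commons category value is unique among both the category-like
    and the article-like items.
    """
    def unique_index(items):
        index, dupes = {}, set()
        for item, cat in items.items():
            if cat in index or cat in dupes:
                index.pop(cat, None)  # value reused: not a unique match
                dupes.add(cat)
            else:
                index[cat] = item
        return index

    cats = unique_index(category_items)
    arts = unique_index(article_items)
    return sorted((arts[c], cats[c]) for c in cats.keys() & arts.keys())
```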

A preliminary sample, for Commons cats starting with the letter 'D', can be found at User:Jheald/sandbox.

Still to do, before starting editing, would be to remove "List of" articles, as these should not be the category's main topic (P301) of a category; and also to check the cats for any existing category's main topic (P301) and category combines topics (P971) properties set. -- Jheald (talk) 23:30, 7 September 2014 (UTC)

Would you please make several dozen trial contributions?--Ymblanter (talk) 15:54, 24 September 2014 (UTC)
@Jheald: is this request still current? Or can it be closed? Multichill (talk) 17:13, 22 November 2014 (UTC)
@Multichill: It's not near the top of my list. I probably will get back to it eventually, but I don't see topic's main category (P910) / category's main topic (P301) pairs as so important, if we will have items on Commons for Commons categories. And I'd probably use QuickStatements, at least for the test phase. So the request can be put into hibernation for the moment. Jheald (talk) 22:39, 22 November 2014 (UTC)


BthBasketbot

BthBasketbot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Bthfan (talk | contribs | logs)

Task/s: Import basketball players team history from Template:Infobox basketball biography from English Wikipedia

Code: User:Bthfan/

Function details: This bot uses pywikibot and is based on an existing harvesting script, which I modified a lot to import the team history of basketball players from Template:Infobox basketball biography on the English Wikipedia. That template has years1, team1, years2, team2, ... to specify the teams a player has played for and in which years. The bot will combine each years* parameter with its matching team* parameter to create the following Wikidata claim:
member of sports team (P54): team name
with qualifiers: start time (P580) (start year extracted from years*; the bot looks for a four-digit number) and end time (P582) (end year extracted from years*; if there is only one number, the bot assumes start year = end year)
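The years parsing just described is small enough to sketch directly: pull the four-digit numbers out of a yearsN value, and treat a single number as a stint starting and ending in the same year.

```python
import re

def parse_years(years_field):
    """'2003–2005' -> (2003, 2005); '2007' -> (2007, 2007); else None."""
    numbers = re.findall(r"\b(\d{4})\b", years_field)
    if not numbers:
        return None
    start = int(numbers[0])
    end = int(numbers[1]) if len(numbers) > 1 else start
    return (start, end)
```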

The bot re-uses existing member of sports team (P54) entries if no qualifiers are attached to the claim yet. This reuse is needed because some basketball players already have a few member of sports team (P54) claims, as other bots imported, for example, categories (every player in such a category got a member of sports team (P54) entry). But those entries have no start and no end date, and thus lack some information that is included in the infobox.

The bot code is not completely finished yet; it needs a pending patch to use the editEntity and JSON feature, as far as I can see. The problem is that some players have played for two different teams in one year. Then the template entry in Wikipedia looks like this:
team1=Team A
team2=Team B

So I may need to reorder existing member of sports team (P54) claims so that their order corresponds to the order in the template/infobox. This is not yet possible with the existing pywikibot code (I would need access to the API function wbsetclaim, which allows setting the index of a claim). --Bthfan (talk) 08:17, 10 June 2014 (UTC)

BTW: basically I'm waiting for the patch to be finished; then I could edit the whole entity and reorder existing claims that way. --Bthfan (talk) 22:08, 28 June 2014 (UTC)
Still blocked; that patch is buggy in some way. I guess it will take a while until the bot code is finished so that it can do a test run :/ --Bthfan (talk) 07:01, 9 July 2014 (UTC)

@Bthfan: gerrit:125575 can manage qualifiers and sorting as well. Even if it's not close to being merged, you can test it locally (SamoaBot is running on Tool Labs with that change right now). However, your code does not appear to take full advantage of the new system. See an example of the correct usage:

claim = pywikibot.Claim(site, pid)
qual = pywikibot.Claim(site, pid, isQualifier=True)  # a qualifier is also a Claim
if qual.getID() not in claim.qualifiers:
    claim.qualifiers[qual.getID()] = []
claim.qualifiers[qual.getID()].append(qual)  # attach the qualifier to the claim

--Ricordisamoa 19:37, 23 July 2014 (UTC)

Ok, thanks for the example. I did test that patch; it broke the addQualifier function in pywikibot (I left a comment on Gerrit). So your example code is the new way to add a qualifier, and one should no longer use addQualifier? --Bthfan (talk) 20:17, 23 July 2014 (UTC)
That is the only fully working way. I plan to support addQualifier (in the same or in another patch), but editEntity() would have to be called to apply the changes. --Ricordisamoa 18:09, 24 July 2014 (UTC)
Ah, I see how it should work then :). Ok, I'll try that. --Bthfan (talk) 11:56, 25 July 2014 (UTC)
Just an update on this: the bot is currently blocked by a bug in the Wikidata API (at least it looks like a bug to me); see "wbeditentity ignores changed claim order" when using that API function. --Bthfan (talk) 08:19, 6 August 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:52, 11 August 2014 (UTC)
My understanding is that we are not yet ready for approval and waiting for a bug to be resolved.--Ymblanter (talk) 13:56, 11 August 2014 (UTC)
That's correct. I could try to use another API function for reordering the claims (there is one: wbsetclaim with the index parameter), but this means modifying pywikibot quite a bit, as that API function is not used/implemented yet in pywikibot. I currently don't have enough time for that :) --Bthfan (talk) 14:25, 11 August 2014 (UTC)


Fatemibot

Fatemibot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Fatemi127 (talk | contribs | logs)

Task/s: Updating sitelinks on items when categories, articles, etc. are moved, in all namespaces on the Persian Wikipedia (fawiki).

Code: Pywiki (This code)

Function details: Moving a category requires updating its item on Wikidata. My bot can move categories and needs a bot flag on Wikidata for this work. --H.Fatemi 08:59, 25 April 2014 (UTC)

Precautionary oppose since the code is not very well-written and seems to use hard-coded configuration specific to another bot. I could change opinion when the code is cleaned up. --Ricordisamoa 14:56, 25 April 2014 (UTC)
@Ricordisamoa: Hi, this code runs very well. Please see this history for وپ:دار. User:Rezabot and User:MahdiBot work with this code. :) Also, my bot has a flag on the Persian Wikipedia (fawiki), and this month (reference) Fatemibot has made more than 15,000 edits, which means I am able to do this properly. Please give me a chance for a test case to prove it to you, thanks. H.Fatemi 15:30, 25 April 2014 (UTC)
Nothing is preventing you from making a short test run (50-250 edits) :-) --Ricordisamoa 15:36, 25 April 2014 (UTC)
thanks H.Fatemi 21:15, 25 April 2014 (UTC)
BTW, very simple changes like these should really be made in a single edit. --Ricordisamoa 21:01, 25 April 2014 (UTC)
@Ricordisamoa: See Special:Contributions/ :-| Because of my typo, these edits were made from an IP, but I have corrected my user name in the code, and for testing I submitted a job to Labs. Please wait until other users request moves at w:fa:وپ:دار; it is empty right now! I will come back soon. Thanks a lot, and forgive my weak English. H.Fatemi 21:15, 25 April 2014 (UTC)
@Ricordisamoa: ✓ Done, about 110 edits :) H.Fatemi 06:42, 26 April 2014 (UTC)
@Fatemi127: my first comment still applies, so you'd have to wait for a bureaucrat. --Ricordisamoa 20:32, 8 May 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:52, 11 August 2014 (UTC)
@Fatemi127, Bene*, Vogone, Legoktm, Ymblanter: Is it ready to be approved?--GZWDer (talk) 12:31, 19 September 2014 (UTC)
We clearly have a problem here, and I do not see how it was resolved.--Ymblanter (talk) 12:38, 19 September 2014 (UTC)

ValterVBot 12

ValterVBot (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: ValterVB (talk | contribs | logs)

Task/s: Delete all population (P1082) claims that I added to Italian municipality items.


Function details: After this discussion and this one, it is necessary to delete population (P1082) from all Italian municipality items, because the source is Istituto Nazionale di Statistica (Q214195) and they use a "CC-BY" license (legal notes). With this license it is impossible to use the data outside of Wikidata, because we use "CC0". --ValterVB (talk) 19:12, 11 April 2014 (UTC)
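The cleanup described above only targets statements the bot itself sourced to ISTAT, which suggests selecting by property and by the "stated in" reference. A minimal sketch over a simplified statement representation (the flat dict shape is an assumption; real Wikidata statements nest references more deeply):

```python
ISTAT = "Q214195"  # Istituto Nazionale di Statistica

def statements_to_delete(statements):
    """statements: [{'property': 'P1082', 'stated_in': 'Q214195', ...}, ...].

    Keep for deletion only population claims referenced to ISTAT,
    leaving population figures from other sources untouched.
    """
    return [s for s in statements
            if s.get("property") == "P1082" and s.get("stated_in") == ISTAT]
```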

I don't think such data can be copyrightable :/ --Ricordisamoa 19:41, 11 April 2014 (UTC)
@ValterVB: If it's a really copyvio, should we revdel related version?--GZWDer (talk) 07:24, 12 April 2014 (UTC)
I think that technically it isn't a copyvio, because I gave appropriate credit inside Wikidata. But if someone uses this data outside of Wikidata without credit, there is a problem. So it's probably sufficient to delete the data. --ValterVB (talk) 08:29, 12 April 2014 (UTC)
I don't see any indication that this is a real problem. The CC-BY 3.0 license of ISTAT doesn't waive database rights (only CC-0 does; and for our purposes CC-BY 4.0 as well), but the data is being used nonetheless on Wikipedia. Wikimedia projects are already ignoring ISTAT's database rights and m:Wikilegal/Database Rights doesn't say it's a problem to store (uncopyrightable) data which was originally extracted against database rights. --Nemo 08:48, 14 April 2014 (UTC)
Maybe ask WMF Legal to weight in, or ask ISTAT with the Email template to be developed if they are OK with our usage and republication. If either takes an unreasonable time to go through, or comes back negatively, then I agree with the proposal. Content that has been imported directly from a database with an incompatible license should be removed. Given that the template gets developed, does anyone have connections to ISTAT to ask? --Denny (talk) 20:18, 14 April 2014 (UTC)
The license of the database is not incompatible, the data itself is PD-ineligible in Italy (a. not innovative, b. official document of the state). The problem, as usual, are database rights, but see above.[8] Someone from WMIT will probably ask them to adopt a clearer license in that regard but I wouldn't worry too much. --Nemo 15:54, 15 April 2014 (UTC)
@Nemo, sorry, but I don't understand you when you say there is no problem: CC0 is not CC-BY. 1) Wikipedia respects the CC-BY license, and 2) the authors selected a license for their work, so database rights are no longer applicable. Wikidata doesn't respect the CC-BY, and that's the main problem. Database rights are a problem for databases which are not free: you can always use their data under the short-citation right, but it becomes difficult when a lot of short citations are made in the same document. Snipre (talk) 15:10, 15 May 2014 (UTC)
This is like the PD-Art discussion. IANAL, but as far as I know the US doesn't have database rights; see also en:Sui generis database right. So Wikidata as a site shouldn't have any problems. If User:ValterVB is in a country that does have these laws (Italy), he might be liable as a person. Sucks, man, but we're not going to delete it because of that. Be more careful in the future. This request has already been open for quite some time; it should probably be closed as denied. Multichill (talk) 17:00, 22 November 2014 (UTC)

I pointed Federico Morando to this discussion. --Nemo 16:09, 15 July 2015 (UTC)


Structor

Structor (talk | contribs | SUL | Block log | User rights log | User rights management)
Operator: Infovarius (talk | contribs | logs)

Task/s: Setting claims, labels and descriptions. Some small (<1000) batches of edits which would be tedious to do by hand. Particular task: to provide structural information for species items; genera under consideration.

Code: Uses API functions through the URLFetch function of Wolfram Mathematica, which is not open source. Mathematica code of the main edit function; Mathematica code of the task.

Function details: Just an example: the bot gets a list of potential items, like this one for the genus Eutreptia. I view the list and extract a working list (here I removed the genus itself). Then the bot goes and does such edits. The item for the genus I choose or create myself if necessary. --Infovarius (talk) 15:50, 9 April 2014 (UTC)
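The list handling described above, together with the genus derivation mentioned later in this thread (the genus is the first word of a species' Latin binomial), can be sketched as a simple filter. Function and argument names are illustrative assumptions:

```python
def working_list(labels, genus):
    """From a list of item labels, keep only species of `genus`,
    dropping the bare genus entry itself (as done by hand above)."""
    return [label for label in labels
            if label != genus and label.split()[0] == genus]
```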

Making strange, unsourced edits to universe (Q1) like these? It's completely unclear what your strangely named bot will do. A common name for your first bot would be InfovariusBot. Regards --Succu (talk) 18:52, 9 April 2014 (UTC)
universe (Q1) was a first-try error. As its name suggests, Structor will build structures from different properties. I don't know if it needs a bot flag for small tasks. I hoped that the flag would help me make quicker edits through the API, but maybe that's not the case. --Infovarius (talk) 11:19, 10 April 2014 (UTC)
I don't want to rename the bot, as it has a SUL account with ~0.5 million edits. --Infovarius (talk) 11:21, 10 April 2014 (UTC)
@Infovarius: Your bot added a lot of duplicate P31 and P171 claims. Why?--GZWDer (talk) 12:57, 10 April 2014 (UTC)
A lot? I know of one duplicate P31, which I've corrected already. --Infovarius (talk) 16:49, 10 April 2014 (UTC)
@Infovarius: Also, please do not use confusing edit summaries [9]--GZWDer (talk) 04:56, 11 April 2014 (UTC)
I want to do it as precisely as I can. What variant do you propose? Infovarius (talk) 05:00, 11 April 2014 (UTC)
@GZWDer:, please tell me what summary should be? --Infovarius (talk) 15:54, 25 April 2014 (UTC)
@Infovarius: If you use wbeditentity, you can set the summary to "import Pxx/Pxx/Pxx"; you can use no summary if you use wbcreateclaim.--GZWDer (talk) 09:00, 26 April 2014 (UTC)
Don't forget to add stated in (P248) as a reference.--GZWDer (talk) 09:02, 26 April 2014 (UTC)
Hm, there's a problem. I am deriving the genus from the species' Latin names. What should I note as the source? Infovarius (talk) 12:03, 26 April 2014 (UTC)
@Infovarius: You can use no source if the claim is obvious.--GZWDer (talk) 12:12, 26 April 2014 (UTC)
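The summary convention suggested above can be sketched as follows. This is an illustrative Python sketch, not the bot's actual Mathematica code; the item ID, claims, and token values are placeholders, as is the helper name.

```python
import json

def build_wbeditentity_params(item_id, claims, token, imported_props):
    """Assemble POST parameters for a MediaWiki action=wbeditentity call,
    using the 'import Pxx/Pxx/Pxx' summary style suggested in the thread."""
    return {
        "action": "wbeditentity",
        "id": item_id,
        "data": json.dumps({"claims": claims}),
        "summary": "import " + "/".join(imported_props),
        "token": token,
        "format": "json",
    }

params = build_wbeditentity_params(
    "Q12345",            # placeholder item ID
    [],                  # claims omitted for brevity
    "csrf-token-here",   # placeholder CSRF token
    ["P225", "P171", "P105"],
)
print(params["summary"])  # -> import P225/P171/P105
```

The parameters would then be sent as a POST request to the wiki's api.php; the summary lists exactly the properties being imported, which is the kind of clear, limited summary asked for above.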
AFAIU, only the Mathematica engine is non-free, while you can redistribute programs based on it. --Ricordisamoa 00:10, 12 April 2014 (UTC)
@Infovarius: do you haz teh codez? :P --Ricordisamoa 23:26, 23 April 2014 (UTC)
@Ricordisamoa, Succu: I've updated the request with the codes. Infovarius (talk) 11:46, 1 July 2014 (UTC)

@Infovarius: For your tests you should use our test repo, not Wikidata (see the history of Hydrangea candida). --Succu (talk) 12:09, 16 April 2014 (UTC)

There is no reaction and an obvious lack of experience, so I decided to oppose. --Succu (talk) 21:57, 21 April 2014 (UTC)
@Succu: You should ping @Infovarius: so your message reaches him.--GZWDer (talk) 10:24, 23 April 2014 (UTC)
@Infovarius: Do more test edits please!--GZWDer (talk) 10:25, 23 April 2014 (UTC)
I've learned API:wbeditentity, so I can now do such edits. Infovarius (talk) 21:26, 23 April 2014 (UTC)
@Succu: Structor is being tested at testwikidata:Special:Contributions/Structor.--GZWDer (talk) 14:53, 25 April 2014 (UTC)
@GZWDer: I know. I see some test edits, but I don't see the reasonable summary you demanded. @Infovarius: You should choose a clear and limited area of operation for your first bot run. It would be nice if you could define it and run some further test edits. --Succu (talk) 15:17, 25 April 2014 (UTC)
I think that my task will be: "To provide structural information for species items." There are so many empty (without properties) species items that I keep running into. --Infovarius (talk) 12:03, 26 April 2014 (UTC)
It's not true that most taxa have no properties. Two questions:
  1. There are some hundred genera with the same name. How do you identify the correct one for parent taxon (P171)?
  2. Based on which assumptions will you use taxon (Q16521) / monotypic taxon (Q310890) for instance of (P31)?
--Succu (talk) 06:45, 30 April 2014 (UTC)
1. I am trying to skip homonymous genera at first. 2. I always use taxon (Q16521). While monotypic taxon (Q310890) would, I suppose, always be OK for species too, the superset is also correct. --Infovarius (talk) 21:24, 3 May 2014 (UTC)
  1. To „skip homonymous genera” you have to be aware of them, so I have to repeat my question: How do you identify them?
  2. You suppose? Species are never monotypic.
Dear Infovarius, would you mind informing WikiProject Taxonomy about your bot plans? --Succu (talk) 21:47, 3 May 2014 (UTC)
Thank you for the recommendation, ✓ Done. --Infovarius (talk) 11:46, 1 July 2014 (UTC)
I added some remarks over there. --Succu (talk) 17:05, 2 July 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter, The Anonymouse: Any 'crat to comment?--GZWDer (talk) 04:56, 30 April 2014 (UTC)

@Infovarius: Could you explain this change, please. Thx. --Succu (talk) 19:35, 14 July 2014 (UTC)

I think this is a page logging all actions by the bot. @Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:51, 11 August 2014 (UTC)
  • Let us return here. What is the current situation?--Ymblanter (talk) 07:46, 16 September 2014 (UTC)
The discussions stopped here. --Succu (talk) 07:59, 16 September 2014 (UTC)
  • Times change. The task of genus-species linking has now almost expired, because Succu has done most of it. I can still perform the task of labelling species by their scientific names. I support en, de, fr, ru labels and descriptions but can gather information for as many languages as possible. But I won't do it without approval of the flag, as I'm afraid it would be in vain because of the vast bureaucracy... Infovarius (talk) 19:01, 14 October 2014 (UTC)
There are around 300,000 items left. :) --Succu (talk) 20:47, 14 October 2014 (UTC)
Why don't the two of you agree on some well-defined task, and then I could flag the bot.--Ymblanter (talk) 06:44, 15 October 2014 (UTC)

Global Economic Map Bot[edit]

Global Economic Map Bot (talkcontribsSULBlock logUser rights logUser rights management)
Operator: Alex and Amir

Task/s: The Global Economic Map Bot will be the primary bot to update the Global Economic Map project. It will retrieve data from a variety of economic databases.

Code: Python

Function details: The Global Economic Map Bot will be the primary bot to update the Global Economic Map project. It will retrieve data from World Bank Indicators, UN Statistics, International Labor Organization, Bureau of Economic Analysis, Gapminder World, OpenCorporates and OpenSpending. The data retrieved will automatically update Wikidata with economic statistics and it will also update the Global Economic Map project. --Mcnabber091 (talk) 21:42, 26 January 2014 (UTC)
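As a hedged illustration of one retrieval step, here is a minimal Python sketch that assembles a query URL for the public World Bank Indicators v2 API (one of the sources listed above). This is not the bot's actual code; the country code, indicator code, and year are placeholder examples.

```python
from urllib.parse import urlencode

def worldbank_url(country, indicator, year):
    """Build a query URL against the public World Bank Indicators v2 API."""
    base = f"https://api.worldbank.org/v2/country/{country}/indicator/{indicator}"
    return base + "?" + urlencode({"date": year, "format": "json"})

# Example: nominal GDP (current US$) for the United States in 2013.
url = worldbank_url("US", "NY.GDP.MKTP.CD", 2013)
print(url)
```

Fetching that URL returns a JSON array whose entries could then be mapped onto Wikidata statements once the quantity datatype is available, which is presumably why Amir asks for more time below.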

I'm helping for the harvesting and adding these data Amir (talk) 21:47, 26 January 2014 (UTC)
@Mcnabber091, Ladsgroup: Is this request still needed? Vogone talk 13:18, 30 March 2014 (UTC)
yes Amir (talk) 13:39, 30 March 2014 (UTC)
Could you create the bot account and run some test edits? The Anonymouse [talk] 17:09, 7 May 2014 (UTC)
@Ladsgroup:--GZWDer (talk) 05:06, 11 June 2014 (UTC)
Can you please give us several months in order to get the datatype implemented? Amir (talk) 14:02, 17 June 2014 (UTC)


KunMilanoRobot (talkcontribsSULBlock logUser rights logUser rights management)
Operator: Kvardek du (talkcontribslogs)


Task/s:
  • Add French 'intercommunalités' to French commune items (example)
  • Add French commune populations
  • Correct INSEE codes of French communes


Function details: Takes the name of the 'communauté de communes' from the INSEE base and adds it, if necessary, to the item, with point in time and source. Uses pywikipedia. --Kvardek du (talk) 19:27, 21 January 2014 (UTC)

Imo the point in time qualifier isn't valid here, as the property isn't time-specific. -- Bene* talk 15:10, 22 January 2014 (UTC)
point in time (P585) says "time and date something took place, existed or a statement was true", and we only know the data was true on January 1st, due to the numerous changes in French territorial organization. Kvardek du (talk) 12:18, 24 January 2014 (UTC)
Interesting, some comments:
  • Not sure that "intercommunalités" are really administrative divisions (they are built from the bottom rather than from the top). part of (P361) might be more appropriate than located in the administrative territorial entity (P131)
  • Populations are clearly needed, but I think we should try to do it well from the start, and that is not easy. That seems to require a separate discussion.
  • INSEE code correction seems to be fine.
  • Ideally, the date qualifiers to be used for intercommunalité membership would be start time (P580) and end time (P582) but I can't find any usable file providing this for the whole country. --Zolo (talk) 06:37, 2 February 2014 (UTC)
Kvardek du: can you add 'canton' and 'pays' too? (Canton is a bit complicated, since some cantons contain only fractions of communes.)
Regards, VIGNERON (talk) 14:01, 4 February 2014 (UTC)
Wikipedia is not very precise about administrative divisions (w:fr:Administration territoriale). Where are the limits between part of (P361), located on terrain feature (P706) and located in the administrative territorial entity (P131)?
Where is the appropriate place for a discussion about population?
VIGNERON: I corrected the INSEE codes, except for the islands: the same problem exists on around 50 articles, due (I think) to confusion between articles and communes on some Wikipedias.
Kvardek du (talk) 22:26, 7 February 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter, The Anonymouse: Any 'crat to comment?--GZWDer (talk) 14:37, 25 February 2014 (UTC)
I'm still not comfortable with the "point in time" qualifier. What about "start date", since you mentioned the system changed at the beginning of this year? Otherwise it might be understood as "this is only true/happened on" some date. -- Bene* talk 21:04, 25 February 2014 (UTC)
Property retrieved (P813) is for the date the information was accessed, and is used as part of a source reference. point in time (P585) is for something that happened at one instant. It is not appropriate for these entities, which endure over a period of time. Use start time (P580) and end time (P582) if you know the start and end dates. Filceolaire (talk) 21:19, 25 March 2014 (UTC)
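The qualifier structure recommended here can be illustrated with a minimal Python sketch following the Wikibase JSON data model. The property IDs come from the discussion; the dates and the helper name are placeholders.

```python
def time_snak(prop, iso_date):
    """Build a Wikibase time-value snak for a qualifier, at day precision (11),
    per the Wikibase JSON data model."""
    return {
        "snaktype": "value",
        "property": prop,
        "datavalue": {
            "type": "time",
            "value": {
                "time": f"+{iso_date}T00:00:00Z",
                "precision": 11,      # 11 = day precision
                "timezone": 0,
                "before": 0,
                "after": 0,
                "calendarmodel": "http://www.wikidata.org/entity/Q1985727",
            },
        },
    }

# Qualify a membership statement with start and end dates rather than
# a single 'point in time' (placeholder dates).
qualifiers = {
    "P580": [time_snak("P580", "2014-01-01")],  # start time
    "P582": [time_snak("P582", "2014-12-31")],  # end time
}
print(sorted(qualifiers))  # -> ['P580', 'P582']
```

The same snak shape, attached under a reference's "snaks" key with property P813, would express the 'retrieved' date instead.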

Support, if the bot uses start time (P580) and end time (P582) instead of point in time (P585) --Pasleim (talk) 16:48, 28 September 2014 (UTC)

@Kvardek du: Do you still plan to run the bot? If so, could you please again do some test edits using start time (P580) and end time (P582) instead of point in time (P585)? --Pasleim (talk) 07:52, 24 May 2015 (UTC)
@Pasleim: it's planned, but not for the moment... The problem I have with the French data is that you only have the membership at a moment t, not with a start time (P580). Kvardek du (talk) 13:20, 25 May 2015 (UTC)
Kvardek du then use retrieved (P813) in the reference and leave out start time (P580) and point in time (P585). Joe Filceolaire (talk) 08:33, 23 July 2015 (UTC)
Filceolaire: yeah, but I have a retrieved (P813) at t2, which is different from my point in time (P585)... Kvardek du (talk) 15:47, 24 July 2015 (UTC)
If you don't know the 'start time' then leave it out. If you want, you can create a separate item for the document that the data comes from, add the point in time statement to that item, and then reference the item for that document in the references for the 'located in ... entity' statements. Look at it this way: the 'point in time' date relates to the info in the document (true on that date).
Note that population figures should have a 'point in time' qualifier to say when that population figure applies since the population figure is not true for a period; it is only true for the day it was measured. Joe Filceolaire (talk) 00:55, 25 July 2015 (UTC)
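The population pattern described above, a 'point in time' qualifier on the figure plus a 'retrieved' date in the reference, can be sketched against the Wikibase JSON model as follows; all concrete values here are placeholders.

```python
def day(iso_date):
    """Wikibase time value at day precision (11), per the JSON data model."""
    return {
        "time": f"+{iso_date}T00:00:00Z",
        "precision": 11, "timezone": 0, "before": 0, "after": 0,
        "calendarmodel": "http://www.wikidata.org/entity/Q1985727",
    }

# A population (P1082) statement: the qualifier says when the figure was
# measured; the reference says when the source was consulted.
population_claim = {
    "mainsnak": {
        "snaktype": "value",
        "property": "P1082",  # population
        "datavalue": {"type": "quantity",
                      "value": {"amount": "+2243833", "unit": "1"}},
    },
    "type": "statement",
    "qualifiers": {
        "P585": [{"snaktype": "value", "property": "P585",  # point in time
                  "datavalue": {"type": "time", "value": day("2012-01-01")}}],
    },
    "references": [{
        "snaks": {
            "P813": [{"snaktype": "value", "property": "P813",  # retrieved
                      "datavalue": {"type": "time", "value": day("2014-01-21")}}],
        },
    }],
}
print(sorted(population_claim["qualifiers"]))  # -> ['P585']
```

The key point of the thread is visible in the structure: P585 sits in "qualifiers" (a fact about the statement), while P813 sits inside a reference (a fact about the sourcing).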