Wikidata:Requests for permissions/Bot

Shortcuts: WD:RFBOT, WD:BRFA, WD:RFP/BOT
To request a bot flag, or approval for a new task, please enter your bot's name into the box below, followed by the task number if your bot is already approved for other tasks.


Old requests go to the archive.

Once consensus is obtained in favor of granting the bot flag, please post requests at the bureaucrats' noticeboard.

SamoaBot 39

SamoaBot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Ricordisamoa (talk | contribs | logs)

Task/s: importing interwiki links for Wikimedia Commons

Function details: it will follow the outcome of WD:Requests for comment/Commons links; see also commons:Commons:Bots/Requests/SamoaBot 5 and [1]. --Ricordisamoa 16:01, 2 November 2013 (UTC)

It would also create new items, if the RFC establishes that. --Ricordisamoa 16:03, 3 November 2013 (UTC)

I suggest that this request be put On hold until that RfC is closed, which could be some time. The Anonymouse (talk) 22:40, 6 November 2013 (UTC)

@The Anonymouse: it's ok for me. --Ricordisamoa 02:21, 11 November 2013 (UTC)
@Ricordisamoa: Wikidata:Requests for comment/Commons links is closed. Please sitelink Commons categories to category items and article items to Commons galleries, and use category's main topic (P301) and topic's main category (P910) (NOT Commons category (P373)) to connect them. And please DO NOT create any items with only a sitelink to a category page in Wikimedia Commons unless it has links to Wikipedia. See Wikidata_talk:Requests_for_comment/Commons_links#Review_of_closure.--GZWDer (talk) 10:47, 26 November 2013 (UTC)
@GZWDer: the bot will follow the RFC's outcome. At first, it won't create new items nor add any statements. I'll make some tests ASAP. --Ricordisamoa 17:06, 27 December 2013 (UTC)
Did you already make some tests? Some links would be helpful. -- Bene* talk 12:51, 3 January 2014 (UTC)
@Ricordisamoa:--GZWDer (talk) 13:53, 3 January 2014 (UTC)
@GZWDer, Bene*: some old ones --Ricordisamoa 22:13, 13 January 2014 (UTC)
Is this change really wanted by the guideline? I see no point to link pages to a template. -- Bene* talk 05:54, 14 January 2014 (UTC)
@Ricordisamoa: any updates?  Hazard SJ  16:28, 6 March 2014 (UTC)
@Bene*, Hazard-SJ: I'm rewriting my script from scratch, and I've already included a namespace filter to avoid such mistakes. --Ricordisamoa 02:22, 18 March 2014 (UTC)

@Ricordisamoa: One month later... status? I find the import of links very important, in particular for Commons, where it has been possible for 7 months and the number of connected pages is the same as cswiki or arwiki (source). Support Matěj Suchánek (talk) 19:27, 18 April 2014 (UTC)

The script is now well-tested, but the peculiar linking rules for Commons (especially between galleries, creators and instance of (P31)  human (Q5)) make this anything but an ordinary task. --Ricordisamoa 14:48, 25 April 2014 (UTC)

Five months later... status? --Pasleim (talk) 16:41, 28 September 2014 (UTC)

KunMilanoRobot

KunMilanoRobot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Kvardek du (talk | contribs | logs)

Task/s:

  • Add French 'intercommunalités' to French commune items (example)
  • Add French commune populations
  • Correct INSEE codes of French communes

Code:

Function details: Takes the name of the 'communauté de communes' from the INSEE database and adds it to the item if necessary, with point in time and source. Uses pywikipedia. --Kvardek du (talk) 19:27, 21 January 2014 (UTC)

IMO the point in time qualifier isn't valid here, as the property isn't time-specific. -- Bene* talk 15:10, 22 January 2014 (UTC)
Property:P585 says "time and date something took place, existed or a statement was true", and we only know the data was true on January 1st, due to numerous changes in French organization. Kvardek du (talk) 12:18, 24 January 2014 (UTC)
Interesting, some comments:
  • Not sure that "intercommunalités" are really administrative divisions (they are built from the bottom rather than from the top). part of (P361) might be more appropriate than is in the administrative territorial entity (P131)
  • Populations are clearly needed, but I think we should try to do it well from the start, and that is not easy. That seems to require a separate discussion.
  • INSEE code correction seems to be fine.
  • Ideally, the date qualifiers to be used for intercommunalité membership would be start date (P580) and end date (P582) but I can't find any usable file providing this for the whole country. --Zolo (talk) 06:37, 2 February 2014 (UTC)
Kvardek du: can you add « canton » and « pays » too? (canton is a bit complicated since some cantons contain only fractions of communes)
Cdlt, VIGNERON (talk) 14:01, 4 February 2014 (UTC)
Wikipedia is not very precise about administrative divisions (w:fr:Administration territoriale). Where are the limits between part of (P361), located on terrain feature (P706) and is in the administrative territorial entity (P131) ?
Where is the appropriate place for a discussion about population ?
VIGNERON: I corrected INSEE codes, except for the islands: the same problem exists on around 50 articles due to confusion between articles and communes on some Wikipedias (I think).
Kvardek du (talk) 22:26, 7 February 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter, The Anonymouse: Any 'crat to comment?--GZWDer (talk) 14:37, 25 February 2014 (UTC)
I'm still not familiar with the "point in time" qualifier. What about "start date" because you mentioned the system has changed to the beginning of this year? Otherwise it might be understood as "this is only true/happened on" some date. -- Bene* talk 21:04, 25 February 2014 (UTC)
Property date retrieved (P813) is for the date the information was accessed and is used as part of a source reference. point in time (P585) is for something that happened at one instance. It is not appropriate for these entities which endure over a period of time. Use start date (P580) and end date (P582) if you know the start and end dates. Filceolaire (talk) 21:19, 25 March 2014 (UTC)

Support if the bot user uses start date (P580) and end date (P582) instead of point in time (P585) --Pasleim (talk) 16:48, 28 September 2014 (UTC)

Global Economic Map Bot

Global Economic Map Bot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Alex and Amir

Task/s: The Global Economic Map Bot will be the primary bot to update the Global Economic Map project. It will retrieve data from a variety of economic databases.

Code: Python

Function details: The Global Economic Map Bot will be the primary bot to update the Global Economic Map project. It will retrieve data from World Bank Indicators, UN Statistics, International Labor Organization, Bureau of Economic Analysis, Gapminder World, OpenCorporates and OpenSpending. The data retrieved will automatically update Wikidata with economic statistics and it will also update the Global Economic Map project. --Mcnabber091 (talk) 21:42, 26 January 2014 (UTC)

I'm helping for the harvesting and adding these data Amir (talk) 21:47, 26 January 2014 (UTC)
@Mcnabber091, Ladsgroup: Is this request still needed? Vogone talk 13:18, 30 March 2014 (UTC)
yes Amir (talk) 13:39, 30 March 2014 (UTC)
Could you create the bot account and run some test edits? The Anonymouse [talk] 17:09, 7 May 2014 (UTC)
@Ladsgroup:--GZWDer (talk) 05:06, 11 June 2014 (UTC)
Can you please give us several months, in order to get the datatype implemented? Amir (talk) 14:02, 17 June 2014 (UTC)

Structor

Structor (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Infovarius (talk | contribs | logs)

Task/s: Setting claims, labels and descriptions. Some small (<1000) batches of edits which would be tedious to do by hand. Particular task: to provide structural information for species items. Genera under consideration.

Code: Uses API functions through the URLFetch function of Wolfram Mathematica, which is not open-source. Mathematica code of the main edit function, Mathematica code of the task.

Function details: Just an example. The bot gets a list of potential items, like for genus Eutreptia. I view the list and extract a working list (here I remove the genus itself). Then the bot goes and does such edits. The item for the genus I choose or create myself if necessary. --Infovarius (talk) 15:50, 9 April 2014 (UTC)

Making strange, unsourced edits to universe (Q1) like these? It's completely unclear what your strangely named bot will do. A common name for your first bot would be InfovariusBot. Regards --Succu (talk) 18:52, 9 April 2014 (UTC)
Q1 was a first-try error. As its name suggests, Structor will make structures using different properties. I don't know if it needs a bot flag for small tasks. I hoped that the flag would help me do quicker edits through the API, but maybe that's not the case. --Infovarius (talk) 11:19, 10 April 2014 (UTC)
I don't want to rename the bot as it has SUL with ~0.5 mln edits. --Infovarius (talk) 11:21, 10 April 2014 (UTC)
@Infovarius: Your bot added a lot of duplicate P31 and P171 claims. Why?--GZWDer (talk) 12:57, 10 April 2014 (UTC)
A lot? I know about 1 duplicate P31 which I've corrected already. --Infovarius (talk) 16:49, 10 April 2014 (UTC)
@Infovarius: And please do not use confusing edit summaries [2]--GZWDer (talk) 04:56, 11 April 2014 (UTC)
I want to do it as precisely as I can. What variant do you propose? Infovarius (talk) 05:00, 11 April 2014 (UTC)
@GZWDer:, please tell me what summary should be? --Infovarius (talk) 15:54, 25 April 2014 (UTC)
@Infovarius: If you use wbeditentity, you can set summary as "import Pxx/Pxx/Pxx"; You can use no summary if you use wbcreateclaim.--GZWDer (talk) 09:00, 26 April 2014 (UTC)
Don't forget to add stated in (P248) as a reference.--GZWDer (talk) 09:02, 26 April 2014 (UTC)
Hm, there's a problem. I am deriving the genus from the species' Latin names. What should I note as the source? Infovarius (talk) 12:03, 26 April 2014 (UTC)
@Infovarius: You can use no source if the claim is obvious.--GZWDer (talk) 12:12, 26 April 2014 (UTC)
AFAIU, only the Mathematica engine is non-free, while you can redistribute programs based on it. --Ricordisamoa 00:10, 12 April 2014 (UTC)
@Infovarius: do you haz teh codez? :P --Ricordisamoa 23:26, 23 April 2014 (UTC)
@Ricordisamoa, Succu: I've updated the request with the codes. Infovarius (talk) 11:46, 1 July 2014 (UTC)

@Infovarius: For your tests you should use our test repo test.wikidata.org, not Wikidata (see the history of Hydrangea candida). --Succu (talk) 12:09, 16 April 2014 (UTC)

There is no reaction and an obvious lack of experience. So I decided to oppose. --Succu (talk) 21:57, 21 April 2014 (UTC)
@Succu: You should ping @Infovarius: so they get the message.--GZWDer (talk) 10:24, 23 April 2014 (UTC)
@Infovarius: Do more test edits please!--GZWDer (talk) 10:25, 23 April 2014 (UTC)
I've learned API:wbeditentity, so I can now do such edits. Infovarius (talk) 21:26, 23 April 2014 (UTC)
@Succu: Structor is being tested at testwikidata:Special:Contributions/Structor.--GZWDer (talk) 14:53, 25 April 2014 (UTC)
@GZWDer: I know. I see some test edits. But I don't see a reasonable summary, which you demanded. @Infovarius: You should choose a clear and limited area of operation for your first bot run. It would be nice if you could define it and run some further test edits. --Succu (talk) 15:17, 25 April 2014 (UTC)
I think that my task will be: "To provide structural information for species items." There are so many empty (without properties) species items which I keep running into. --Infovarius (talk) 12:03, 26 April 2014 (UTC)
It's not true that most taxa have no properties. Two questions:
  1. There are some hundred genera with the same name. How do you identify the correct one for parent taxon (P171)?
  2. Based on which assumptions will you use taxon (Q16521) / monotypic taxon (Q310890) for instance of (P31)?
--Succu (talk) 06:45, 30 April 2014 (UTC)
1. I am trying to skip homonymous genera at first. 2. I always use taxon (Q16521). While for species monotypic taxon (Q310890) is always OK too, I suppose, the superset is also correct. --Infovarius (talk) 21:24, 3 May 2014 (UTC)
  1. To „skip homonymous genera” you have to be aware of them, so I have to repeat my question: How do you identify them?
  2. You suppose? Species are never monotypic.
Dear Infovarius, mind to inform WikiProject Taxonomy about your bot plans? --Succu (talk) 21:47, 3 May 2014 (UTC)
Thank you for recommendation, ✓ Done. --Infovarius (talk) 11:46, 1 July 2014 (UTC)
I added some remarks over there. --Succu (talk) 17:05, 2 July 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter, The Anonymouse: Any 'crat to comment?--GZWDer (talk) 04:56, 30 April 2014 (UTC)


@Infovarius: Could you explain this change, please. Thx. --Succu (talk) 19:35, 14 July 2014 (UTC)

I think this is a page logging all actions by the bot. @Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:51, 11 August 2014 (UTC)
  • Let us return here. What is the current situation?--Ymblanter (talk) 07:46, 16 September 2014 (UTC)
The discussions stopped here. --Succu (talk) 07:59, 16 September 2014 (UTC)
  • Times are changing. The task of genus-species linking has now almost expired, because Succu has done most of it. I can still perform the task of labelling species with their scientific names. I support en, de, fr, ru labels and descriptions but can gather information about as many languages as possible. But I won't do it without approval of the flag, as I'm afraid it would be in vain because of the vast bureaucracy... Infovarius (talk) 19:01, 14 October 2014 (UTC)
There are around 300.000 items left. :) --Succu (talk) 20:47, 14 October 2014 (UTC)
Why don't the two of you agree on some well-defined task? Then I could flag the bot.--Ymblanter (talk) 06:44, 15 October 2014 (UTC)

ValterVBot 12

ValterVBot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: ValterVB (talk | contribs | logs)

Task/s: Delete all population (P1082) claims that I have added to Italian municipality items.

Code:

Function details: After this discussion and this, it is necessary to delete population (P1082) from all Italian municipality items, because the source is Istituto Nazionale di Statistica (Q214195) and they use a "CC-BY" license (Legal notes). With this license it is impossible to use the data outside of Wikidata, because we use "CC0". --ValterVB (talk) 19:12, 11 April 2014 (UTC)

I don't think such data can be copyrightable :/ --Ricordisamoa 19:41, 11 April 2014 (UTC)
@ValterVB: If it's really a copyvio, should we revdel the related versions?--GZWDer (talk) 07:24, 12 April 2014 (UTC)
I think that technically it isn't a copyvio, because I gave appropriate credit inside Wikidata. But if someone uses this data outside of Wikidata without credit, there is a problem. So it's probably sufficient to delete the data. --ValterVB (talk) 08:29, 12 April 2014 (UTC)
I don't see any indication that this is a real problem. The CC-BY 3.0 license of ISTAT doesn't waive database rights (only CC-0 does; and for our purposes CC-BY 4.0 as well), but the data is being used nonetheless on Wikipedia. Wikimedia projects are already ignoring ISTAT's database rights and m:Wikilegal/Database Rights doesn't say it's a problem to store (uncopyrightable) data which was originally extracted against database rights. --Nemo 08:48, 14 April 2014 (UTC)
Maybe ask WMF Legal to weigh in, or ask ISTAT, with the Email template to be developed, if they are OK with our usage and republication. If either takes an unreasonable time to go through, or comes back negative, then I agree with the proposal. Content that has been imported directly from a database with an incompatible license should be removed. Given that the template gets developed, does anyone have connections to ISTAT to ask? --Denny (talk) 20:18, 14 April 2014 (UTC)
The license of the database is not incompatible, the data itself is PD-ineligible in Italy (a. not innovative, b. official document of the state). The problem, as usual, are database rights, but see above.[3] Someone from WMIT will probably ask them to adopt a clearer license in that regard but I wouldn't worry too much. --Nemo 15:54, 15 April 2014 (UTC)
@Nemo, sorry, but I don't understand you when you say there is no problem: CC-0 is not CC-BY. 1) Wikipedia respects the CC-BY license, and 2) the authors selected a license for their work, so database rights are no longer applicable. WD doesn't respect the CC-BY, so that's the main problem. Database rights are a problem for databases which are not free: you can always use their data according to the short citation right, but it becomes difficult when a lot of short citations are made in the same document. Snipre (talk) 15:10, 15 May 2014 (UTC)

SamoaBot 44

SamoaBot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Ricordisamoa (talk | contribs | logs)

Task/s: removing obsolete main type (GND) (P107) claims...

Code: coming soon

Function details: ...and replacing them with instance of (P31) sourced from RDF data (CC-0) by Integrated Authority File (Q36578). Test edit --Ricordisamoa 05:22, 15 April 2014 (UTC)

Fixed mistake with timezone. --Ricordisamoa 05:27, 15 April 2014 (UTC)
But date retrieved (P813) is shown as invalid. Bug? --Ricordisamoa 05:48, 15 April 2014 (UTC)
It's a known problem. The UI does not support hour, minute and second. You have to work with precision=11. (see here) --Succu (talk) 09:33, 15 April 2014 (UTC)
Although 11 is a sufficient precision in most cases, I think I will not run the bot until precision=14 is available. --Ricordisamoa 14:33, 25 April 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:52, 11 August 2014 (UTC)
@Ricordisamoa, Bene*, Vogone, Legoktm, Ymblanter: Is it ready to be approved?--GZWDer (talk) 12:31, 19 September 2014 (UTC)
My understanding is that we are waiting for developers here.--Ymblanter (talk) 12:45, 19 September 2014 (UTC)

MuBot

MuBot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Mushroom (talk | contribs | logs)

Task/s: Add IMDb identifier (P345) to items having occupation (P106): actor (Q33999), or film actor (Q10800557), or voice actor (Q2405480).

Function details: The bot will:

The bot made a dry run on the first 100 items with no errors. Mushroom (talk) 18:39, 24 April 2014 (UTC)

Symbol support vote.svg Support but:
  • please check as many Wikipedia articles as possible, since most IMDb-type templates are not very reliable: my bot made lots of mistakes some time ago;
  • make sure that the item is actually about a human (Q5), even if the item does not have instance of (P31) yet: this can be easily checked on Wikipedia or on external databases;
  • strive to add the most reliable and free sources you can find;
  • make as few edits as possible on Wikidata (also adding several statements at once), either by calling the wbeditentity API directly, or by using an advanced framework.
--Ricordisamoa 14:29, 25 April 2014 (UTC)
@Ricordisamoa: Thanks for the support and suggestions. I have modified the bot to always look at multiple Wikipedias and added a few checks to make sure the item is about a human (Q5). The number of edits shouldn't be a problem since I'm using item.editEntity() from pywikibot core. What about the sources though? Right now the bot just adds imported from (P143): [language] Wikipedia. Is there a reliable external database I can query and use as source? Mushroom (talk) 14:12, 26 April 2014 (UTC)
The Echo extension requires the ping to be inserted in the same edit as the signature in order to work :-) pywikibot/core's editEntity allows setting arbitrary data (even sources), but you would have to build the JSON data manually, so I won't make it a requirement for now; data in the RDF/XML (Q48940) format by Integrated Authority File (Q36578) under CC0 (Q6938433) are probably a good source for instance of (P31)  human (Q5). --Ricordisamoa 01:58, 28 April 2014 (UTC)
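As an illustration of what "build the JSON data manually" means here: a statement carrying an imported from (P143) reference has roughly the following shape in the Wikibase JSON data model (the IMDb ID value "nm0000123" is made up, and this sketch follows the general data model rather than any bot's actual code):

```python
import json

# One IMDb identifier (P345) statement with an "imported from" (P143)
# reference pointing at English Wikipedia (Q328).
claim = {
    "mainsnak": {
        "snaktype": "value",
        "property": "P345",
        "datavalue": {"value": "nm0000123", "type": "string"},
    },
    "type": "statement",
    "rank": "normal",
    "references": [{
        "snaks": {
            "P143": [{
                "snaktype": "value",
                "property": "P143",
                "datavalue": {
                    "value": {"entity-type": "item", "numeric-id": 328},
                    "type": "wikibase-entityid",
                },
            }],
        },
        "snaks-order": ["P143"],
    }],
}

# A structure of this kind is what gets serialized into the "data"
# parameter of a wbeditentity / editEntity call.
data = {"claims": {"P345": [claim]}}
serialized = json.dumps(data)
```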
@Ricordisamoa: Oh I see, that's why it wasn't working :D Ok I will use the GND for instance of (P31)  human (Q5). What I was wondering, though, is: should I find a proper source for statement IMDb identifier (P345)  ID or is imported from (P143) enough? Because unfortunately I can't find any authority control source linking to IMDb (except for Freebase, but I don't think it's reliable). Mushroom (talk) 10:40, 4 May 2014 (UTC)
I found many articles in fr:Catégorie:Actrice suisse that had fr:Modèle:Imdb nom, but no IMDb identifier (P345) on Wikidata. Rather than checking as many languages as possible, focusing on categories and templates of a given language might improve results. --- Jura 06:55, 17 May 2014 (UTC)
Thanks Jura, unfortunately I have been very busy so I had to stop working on the bot. I think I will follow your suggestion and limit its scope for now, so it will be easier to manage. Mushroom (talk) 17:06, 18 May 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:52, 11 August 2014 (UTC)
What is the situation with the bot right now?--Ymblanter (talk) 07:08, 14 August 2014 (UTC)
Unfortunately I have not had time to update and operate the bot because I am too busy in real life, so please consider this request suspended for now. Hopefully I will have more time in a few months. Mushroom (talk) 09:39, 16 August 2014 (UTC)

Fatemibot

Fatemibot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Fatemi127 (talk | contribs | logs)

Task/s: Updating sitelinks of items whose categories, articles, etc. have been moved, in any namespace on the Persian Wikipedia (fawiki)

Code: Pywiki (This code)

Function details: Moving a category requires updating its item on Wikidata. My bot can move categories and needs a bot permission on Wikidata for this work. --H.Fatemi 08:59, 25 April 2014 (UTC)

Precautionary oppose since the code is not very well-written and seems to use hard-coded configuration specific to another bot. I could change my opinion when the code is cleaned up. --Ricordisamoa 14:56, 25 April 2014 (UTC)
@Ricordisamoa: Hi, this code runs very well. Please see this history for وپ:دار. User:Rezabot and User:MahdiBot work with this code. :) Also my bot has a flag on the Persian Wikipedia (fawiki), and this month (reference) Fatemibot has made more than 15,000 edits, which means I am able to do this properly. Please give me a chance for a test case to prove it to you, thanks. H.Fatemi 15:30, 25 April 2014 (UTC)
Nothing is preventing you from making a short test run (50-250 edits) :-) --Ricordisamoa 15:36, 25 April 2014 (UTC)
thanks H.Fatemi 21:15, 25 April 2014 (UTC)
BTW, very simple changes like these should really be made in a single edit. --Ricordisamoa 21:01, 25 April 2014 (UTC)
@Ricordisamoa: See Special:Contributions/10.68.16.37 :-| Because of a typing mistake on my part these edits were made under an IP, but I have fixed my user name in the code, and for testing I submitted a page as a job to Labs. Please wait for other users to request changes for my bot at w:fa:وپ:دار; it is empty right now! I will come back soon. Thanks a lot, and forgive my weak English. H.Fatemi 21:15, 25 April 2014 (UTC)
@Ricordisamoa: ✓ Done about 110 edits :) H.Fatemi 06:42, 26 April 2014 (UTC)
@Fatemi127: my first comment still applies, so you'd have to wait for a bureaucrat. --Ricordisamoa 20:32, 8 May 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:52, 11 August 2014 (UTC)
@Fatemi127, Bene*, Vogone, Legoktm, Ymblanter: Is it ready to be approved?--GZWDer (talk) 12:31, 19 September 2014 (UTC)
We clearly have a problem here, and I do not see how it was resolved.--Ymblanter (talk) 12:38, 19 September 2014 (UTC)

AkkakkBot 8

AkkakkBot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Akkakk (talk | contribs | logs)

Task/s: remove redundant aliases

Code:User:AkkakkBot/code/08-remove-redundant-aliases

Function details: will remove aliases that are the same as the label. example --Akkakk 09:15, 3 May 2014 (UTC)

Actually, I do not see having the label in the aliases as a bad thing. This way, if we add a new label, the older one does not get completely lost. --Zolo (talk) 09:22, 3 May 2014 (UTC)
with this argumentation - maybe i should add aliases then. --Akkakk 09:47, 3 May 2014 (UTC)
That would be fine with me, but I can imagine the raised eyebrows - and doing it in all items may not really be that useful.--Zolo (talk) 20:04, 6 May 2014 (UTC)
Yes, I think that a larger discussion (RfC maybe) would definitely be needed for mass-adding aliases. The Anonymouse [talk] 17:19, 7 May 2014 (UTC)
Maybe the bot should watch for edits changing the label which do not add the older one to the aliases (except those edits which are fixing the capitalization of the label). Helder.wiki 20:52, 8 May 2014 (UTC)
maybe a bot should do that. i won't. (i won't mass add aliases identically to the label either. if there is no consensus to remove them, then just cancel this request)--Akkakk 14:12, 9 May 2014 (UTC)

BthBasketbot

BthBasketbot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: Bthfan (talk | contribs | logs)

Task/s: Import basketball players team history from Template:Infobox basketball biography from English Wikipedia

Code: User:Bthfan/bot_code.py

Function details: This bot uses pywikibot and is based on the harvest_template.py script (see https://git.wikimedia.org/blob/pywikibot%2Fcore.git/HEAD/scripts%2Fharvest_template.py). I modified the original code a lot to import the team history of a basketball player from Template:Infobox basketball biography on the English Wikipedia. That template has years1, team1, years2, team2, ... to specify the individual teams a player has played for, and in which years. This bot combines each years* parameter with the corresponding team* parameter to create the following Wikidata claim:
member of sports team (P54): team name
with qualifiers: start date (P580): (start year gets extracted from years*, the bot looks for a four-digit number); end date (P582): (end year gets extracted from years*, the bot looks for a four-digit number; if there is only one number it assumes start year = end year)
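The year extraction described above can be sketched like this (a hypothetical helper, not the bot's actual code; the four-digit regex and the start = end fallback follow the description above):

```python
import re

def parse_years(years_field):
    """Extract (start, end) years from a 'yearsN' infobox value by
    looking for four-digit numbers; with a single number, assume
    start year = end year, as described above."""
    years = re.findall(r"\b(\d{4})\b", years_field)
    if not years:
        return None  # no usable year found
    start = int(years[0])
    end = int(years[-1]) if len(years) > 1 else start
    return start, end
```

For example, `parse_years("2010-2013")` gives `(2010, 2013)` and `parse_years("2012")` gives `(2012, 2012)`.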

The bot re-uses existing member of sports team (P54) entries if there are no qualifiers attached to the claim yet. This reuse is needed because some basketball players already have a few member of sports team (P54) claims, as other bots imported for example categories like https://en.wikipedia.org/wiki/Category:Olimpia_Milano_players (every player in such a category got a member of sports team (P54) entry). But those entries have no start and no end date and thus lack some information that's included in the infobox.

The bot code is not completely finished yet; it needs the patch from https://gerrit.wikimedia.org/r/#/c/125575/ to use the editEntity and JSON feature, as far as I can see. The problem is that some players have played for two different teams in one year. Then the template entry in Wikipedia looks like this:
years1=2012
team1=Team A
years2=2012
team2=Team B

So I may need to reorder existing member of sports team (P54) claims so that the order of items is correct and corresponds to the order in the template/infobox. This is currently not possible with the existing pywikibot code (I would need to access the API function wbsetclaim, as this one allows setting the index of a claim). --Bthfan (talk) 08:17, 10 June 2014 (UTC)
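The years*/team* pairing the bot performs could look roughly like this (a sketch under my reading of the template layout; the parameter names come from the infobox, the helper name is made up):

```python
import re

def team_history(params):
    """Collect (years, team) pairs from harvested infobox parameters,
    ordered by their numeric suffix so the resulting claims match the
    infobox order."""
    suffixes = sorted(
        int(m.group(1))
        for m in (re.fullmatch(r"team(\d+)", k) for k in params)
        if m
    )
    pairs = []
    for n in suffixes:
        team = params.get("team%d" % n, "").strip()
        years = params.get("years%d" % n, "").strip()
        if team:
            pairs.append((years, team))
    return pairs
```

Sorting by the numeric suffix (rather than lexically) keeps team10 after team2, which matters for the claim-ordering problem described above.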

BTW: Basically I'm waiting for the patch at https://gerrit.wikimedia.org/r/#/c/125575/ to be finished, then I could edit the whole entity and with that way reorder existing claims. --Bthfan (talk) 22:08, 28 June 2014 (UTC)
Still blocked by https://gerrit.wikimedia.org/r/#/c/125575/, that patch is buggy in some way. I guess it will take a while until the bot code is finished so that it can do some test run :/ --Bthfan (talk) 07:01, 9 July 2014 (UTC)

@Bthfan: gerrit:125575 can manage qualifiers and sorting as well. Even if it's not close to being merged, you can test it locally (SamoaBot is running on Tool Labs with that change just now). However, your code does not appear to take full advantage of the new system. See an example of the correct usage:

# pid and value are placeholders; the claim and its qualifier will
# in general use different property IDs and targets.
claim = pywikibot.Claim(site, pid)
claim.setTarget(value)
# A qualifier is itself a Claim, created with isQualifier=True.
qual = pywikibot.Claim(site, pid, isQualifier=True)
qual.setTarget(value)
# Attach the qualifier to the claim's local qualifier dict;
# editEntity() must be called afterwards to apply the change.
if qual.getID() not in claim.qualifiers:
    claim.qualifiers[qual.getID()] = []
claim.qualifiers[qual.getID()].append(qual)

--Ricordisamoa 19:37, 23 July 2014 (UTC)

Ok, thanks for the example. I did test that patch; it broke the addQualifier function in pywikibot (I left a comment on Gerrit). So your example code is the new way to add a qualifier, and one should no longer use addQualifier? --Bthfan (talk) 20:17, 23 July 2014 (UTC)
That is the only fully working way. I plan to support addQualifier (in the same or in another patch), but editEntity() would have to be called to apply the changes. --Ricordisamoa 18:09, 24 July 2014 (UTC)
Ah, I see how it should work then :). Ok, I'll try that. --Bthfan (talk) 11:56, 25 July 2014 (UTC)
Just an update on this: the bot is currently blocked by a bug in the Wikidata API (at least it looks like a bug to me), see https://bugzilla.wikimedia.org/show_bug.cgi?id=68729 (wbeditentity ignores the changed claim order when using that API function). --Bthfan (talk) 08:19, 6 August 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:52, 11 August 2014 (UTC)
My understanding is that we are not yet ready for approval and waiting for a bug to be resolved.--Ymblanter (talk) 13:56, 11 August 2014 (UTC)
That's correct. I could try to use another API function for reordering the claims (there is one: wbsetclaim with the index parameter), but that means modifying pywikibot quite a bit, as this API function is not used/implemented in pywikibot yet. I currently don't have enough time for that :) --Bthfan (talk) 14:25, 11 August 2014 (UTC)

InfoRobBot 2

InfoRobBot (talk | contribs | SUL | block log | user rights log | user rights management)
Operator: TomT0m (talk | contribs | logs)

Task/s: put {{Item documentation}} at the top of each talk page

Code: my wp git repo, not coded yet.

Function details: Put {{Item documentation}} on item talk pages if it is not already present. --TomT0m (talk) 16:48, 11 July 2014 (UTC)

There is no such user.--Ymblanter (talk) 18:49, 18 July 2014 (UTC)
Thanks, now it has been created.--Ymblanter (talk) 20:50, 26 July 2014 (UTC)
@Bene*, Vogone, Legoktm, Ymblanter: Any 'crat to comment?--GZWDer (talk) 10:53, 11 August 2014 (UTC)
Is there a broad agreement that we should use talk pages that way? --Succu (talk) 18:43, 11 August 2014 (UTC)
Oppose I don't think there is a need for this template on all talk pages.--Pasleim (talk) 16:39, 28 September 2014 (UTC)
@Pasleim: It provides context about the item on all talk pages, plus basic links to external tools, basic linguistic information and basic information about the neighborhood of the item. Plus it's the only way we currently have to show the very important classification information about the item, and to make this available to non-Wikidatian power users without an army of extensions they might not want to set up, or external tools. TomT0m (talk) 11:30, 29 September 2014 (UTC)
@TomT0m: I do acknowledge the helpfulness of the template content but IMO a talk page is for discussions and not documentation. Moreover, to create 16M identical pages can't be the best approach. Why not provide the documentation by a gadget which might be enabled by default (sketchy at User:Pasleim/sandbox.js)? --Pasleim (talk) 09:11, 30 September 2014 (UTC)
+1 --Succu (talk) 09:18, 30 September 2014 (UTC)
@Pasleim: I don't see why documentation should be buried in gadgets; it should be easily accessible. If it does not belong in templates, does it belong in ... gadgets? The same argument applies :) Think of {{Property documentation}}. It is on talk pages and nobody seems to be shocked. I don't really think it is an important point. Maybe a more efficient way, if possible, would be to set a default wikitext header for pages of that type. TomT0m (talk) 16:28, 30 September 2014 (UTC)
To make it clear Symbol oppose vote.svg Oppose - this is a misuse of talk pages! --Succu (talk) 16:50, 30 September 2014 (UTC)

MergeBot[edit]

MergeBot (talkcontribsSULBlock logUser rights logUser rights management)
Operator: Qlsusu (talkcontribslogs)

Task/s: It supplements Wikidata items by adding statements extracted from the infobox of each item's corresponding Wikipedia (currently enwiki) article.

The statement value types currently handled are url and commonsMedia; eventually all 7 types will be supported.

Code: I am planning to publish the code as an open-source project on GitHub.

Function details:

It supplements Wikidata items by adding statements extracted from the infobox of each item's corresponding Wikipedia (currently enwiki) article.

1. For a given Wikidata item, extract the template data (mainly the infobox) from the corresponding Wikipedia article.

2. For every key in the infobox, find the corresponding property defined by Wikidata.

The matching strategy is as follows:

  • a. Each property has a label, aliases, and a description; I first find candidate properties for the key by calculating textual similarity.
  • b. Where a property defines its domain (and other ontology-related constraints), I then check whether the item belongs to the specified domain (the class hierarchy in Wikidata is taken into consideration).
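As a rough illustration of part (a) of this strategy — my own sketch, not Qlsusu's implementation — candidate properties can be ranked by string similarity between the infobox key and each property's label and aliases. The small property table here is hypothetical; a real run would load labels and aliases from a dump or the API.

```python
# Sketch of step 2(a): rank Wikidata properties against an infobox key
# by textual similarity over labels and aliases (hypothetical data).
from difflib import SequenceMatcher

# Hypothetical excerpt of a property table (real bot: loaded from a dump).
PROPERTIES = {
    "P18": {"label": "image", "aliases": ["picture", "photo"]},
    "P856": {"label": "official website", "aliases": ["website", "homepage"]},
    "P571": {"label": "inception", "aliases": ["founded", "established"]},
}

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_properties(infobox_key: str, threshold: float = 0.6):
    """Return (property id, score) pairs whose label or an alias resembles the key."""
    candidates = []
    for pid, names in PROPERTIES.items():
        best = max(similarity(infobox_key, n)
                   for n in [names["label"], *names["aliases"]])
        if best >= threshold:
            candidates.append((pid, best))
    return sorted(candidates, key=lambda c: c[1], reverse=True)

print(candidate_properties("website"))  # P856 ranks first (exact alias match)
```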

3. Construct the DataValue (which the claim needs) for the selected property by parsing the textual value of the key.

A property has one of 7 data types: url, commonsMedia, wikibase-item, string, globe-coordinate, time, quantity.

I have put effort into extraction for all 7 types, and I have finished validating these extractions against complete offline dumps of Wikidata and Wikipedia (every article in Wikipedia and every item in Wikidata; dump versions: wikidata-20140106, wikipedia-20140203). Because the extraction strategies may still need improvement, currently only the url and commonsMedia data types take part in the merging procedure; the other 5 types will be added later.
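As a hedged sketch of step 3 for the two types currently handled, value construction might look like this. This is my own illustration: the function names and regexes are assumptions, not the bot's actual code.

```python
# Sketch of step 3 for url and commonsMedia: turn a raw infobox value
# into something a claim could carry, rejecting values that don't parse.
import re

def parse_url(raw: str):
    """Return the URL string if the value looks like a usable http(s) URL."""
    raw = raw.strip()
    if re.fullmatch(r"https?://\S+", raw):
        return raw
    return None

def parse_commons_media(raw: str):
    """Return a bare file name for values like '[[File:Foo.jpg|thumb]]'."""
    m = re.match(r"\[\[(?:File|Image):([^|\]]+)", raw.strip())
    if m:
        return m.group(1).strip()
    return None

print(parse_url("https://example.org/"))                    # kept as-is
print(parse_commons_media("[[File:Foo.jpg|thumb|200px]]"))  # Foo.jpg
```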

4. Incremental dump handling

Wikidata produces an incremental dump of about 50 MB every day, and enwiki one of about 500 MB.

I first extract the updated Wikipedia article; merging is then triggered automatically by specifying the item id of the updated article.

Besides that, to avoid creating duplicate statements, I always persist each statement that is going to be created on Wikidata in a database (with an expiry strategy to clear old entries); if a later merge produces the same statement, it is ignored.
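The duplicate guard described here can be sketched with one persisted key per created statement; this is my illustration under the stated assumptions, not the bot's actual schema.

```python
# Sketch of the duplicate guard: persist a canonical key for every
# statement the bot creates, and skip later attempts to recreate it.
import sqlite3

def open_store(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS seen (key TEXT PRIMARY KEY)")
    return db

def should_create(db, item_id: str, prop_id: str, value: str) -> bool:
    """True the first time this (item, property, value) triple is seen."""
    key = f"{item_id}|{prop_id}|{value}"
    try:
        db.execute("INSERT INTO seen (key) VALUES (?)", (key,))
        db.commit()
        return True
    except sqlite3.IntegrityError:
        return False  # already created earlier; ignore the duplicate

db = open_store()
print(should_create(db, "Q64", "P856", "https://example.org"))  # True
print(should_create(db, "Q64", "P856", "https://example.org"))  # False
```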

I sincerely hope this MergeBot will be approved. I have noticed that the bot operator should do a test run of between 50 and 250 edits; please notify me to start the test when the timing is right. --Qlsusu (talk) 03:45, 11 August 2014 (UTC)

User:Qlsusu, I believe that you want to help (which is great), but I don't think you're quite able to do so yet. Almost all of your own contributions so far had to be reverted, so I don't see a chance for you to operate a bot here for now. Study the help pages and the reality of Wikidata items, make some useful contributions yourself, and maybe some day this might become an option. --YMS (talk) 06:35, 13 August 2014 (UTC) PS: The account User:MergeBot doesn't even exist yet, only the user pages do. I'd suggest that you either close this request and request the deletion of the user pages, so someone else has the chance to create User:MergeBot, or that you create the account yourself regardless of the outcome of the bot flag request. --YMS (talk) 06:35, 13 August 2014 (UTC)
Hi, User:YMS, thanks for your reply. When I saw the notification I was excited, even though you rejected my proposal. I am new to Wikidata, and I am sorry I did not reply to your warning ("I undid one or more of your recent contributions because it didn't appear constructive"); I could not find a way to do so. Let me explain. I have concentrated for several months on just one thing: supplementing Wikidata with new statements whose underlying data comes from the corresponding Wikipedia articles. I investigated the template structure of Wikipedia and the API defined by Wikidata. I did make some dirty statement insertions (such as incorrect or nonsensical data), but after each insertion I always reverted it. I also inserted some duplicate data, because at the early stage of my merge development a lot of testing was needed; at that time I fetched the Wikidata source (mainly claims) over the network, and it might not yet contain the statements I had just created, which resulted in duplicates. I am sorry for my ignorance in not using the sandbox. However, after receiving your warning I stopped making real statement insertions into Wikidata. Instead, I downloaded the offline Wikidata and Wikipedia dumps and checked the validity of my merging strategies for the 7 types, as well as the feasibility of the key-to-property mapping (which involves ontology mapping between Wikipedia/DBpedia and Wikidata). Just a few days ago I finished this check over a feed of about 15 million item-article pairs; that is why I made no statement insertions into Wikidata from April until now. Now I have registered a bot, am asking for a bot flag, and want to contribute my efforts to Wikidata, and that is why I feel so disappointed by your rejection. Again, I am not a destroyer; I am only asking for a chance to run a test on 300 Wikidata items so you can see whether I make a contribution (or do damage).
A bad actor should be punished, but a newcomer should be given a chance to prove something. I will create an account for my bot MergeBot right away, and I will never give up. Finally, please give me a chance to run the test!!! --Qlsusu (talk) 07:33, 13 August 2014 (UTC)
First of all, I am not the one who decides whether you can run the bot or not. This was just a comment; sorry if I created a wrong impression. But then: a month after I warned you on your talk page, your edits made User:Reaper35 put another warning on your talk page, and another week later, User:Zolo even saw the need to block you for three days to prevent damage to Wikidata. And even after that block you inserted statements into several items (most of them immediately reverted by the same IP; I certainly believe that was you, but contrary to what you say, you did make the statements in the first place). --YMS (talk) 07:45, 13 August 2014 (UTC)
Hi, YMS, thanks for your reply. I suspect you may not believe my claim that I made no statement insertions into Wikidata from April until now. As you mentioned, I made a mistake: I mixed up the dates marked on my talk page (when replying to your earlier comment I almost lost my composure, fearing that my efforts of the last few months were about to die). But I am telling the truth: after your warning, few statements were created, and even before that, the block was not for very many dirty statements either. If I were a destroyer, I would have spread damage over a much wider area. The ignorance was entirely my fault. Right now I just need a chance to demonstrate the functionality of my MergeBot and show how it contributes. I thought the start of the test run had to be approved first, which is why I am still waiting for the timing. If that is not the case, I will launch the test as soon as possible. If it still causes damage, block me forever; if not, please give me a chance to do something for Wikidata. My merging strategies for the 7 data types may still need improvement, but first of all what I am struggling for is acceptance from you administrators. Please tell me whether or not I should keep waiting before launching the test.
You mentioned that you are not the one to decide whether I can run my bot; please give me a hint: who is, and how will I be notified whether my bot is approved or not? --107.178.200.203 09:08, 13 August 2014 (UTC)

Symbol oppose vote.svg Oppose The generic name MergeBot suggests you will do automated merges, which is not the case. Your description of what your bot is intended to do remains unclear. --Succu (talk) 19:10, 14 August 2014 (UTC)

Hi, Succu, thanks for your reply. MergeBot may not be the right name; how about renaming it InfoboxMergeBot? I have also revised the function section to clarify the purpose of my bot.
This bot does indeed perform automatic merging. For example, given a Wikidata item id, it finds the corresponding Wikipedia article through the item's sitelink (currently enwiki). The infobox of the Wikipedia article is then extracted. For each key in this infobox, it finds the corresponding property defined by Wikidata and constructs the proper DataValue, which the claim needs, from the textual value of the key in the infobox. All these steps are performed automatically given just the item id. Therefore I think InfoboxMergeBot would be a proper name. My bot aims to supplement Wikidata with data coming from Wikipedia's infoboxes; that is its whole purpose.
Please tell me whether this modified name is okay, and whether there are any problems I need to fix. --Qlsusu (talk) 02:50, 15 August 2014 (UTC)
Hi, Succu, please tell me whether the modified name is okay, whether my revised description of the bot is okay, and whether there is anything I need to improve to help obtain the bot flag. Please! --Qlsusu (talk) 01:21, 18 August 2014 (UTC)
Hi, Succu, please give me a reply. A few days ago I modified the bot name and the description of my bot, and I wonder whether it is now okay to obtain the bot flag. PLEASE!! --Qlsusu (talk) 07:29, 19 August 2014 (UTC)
Sorry, I wasn't notified. Why not call your bot QlsusuBot? And I'm sorry, but I see no improvement in your bot's task description. --Succu (talk) 08:51, 19 August 2014 (UTC)

Symbol oppose vote.svg Oppose As it is the first bot of the user, I would like to see a more specific task (e.g. focus on one property only). In addition, the bot name is confusing. --Pasleim (talk) 11:15, 12 September 2014 (UTC)

@Qlsusu: Please make several test edits so the community knows what you are going to do; otherwise this request will be closed in 7 days. --GZWDer (talk) 12:38, 30 October 2014 (UTC)

SteinsplitterBot[edit]

SteinsplitterBot (talkcontribsSULBlock logUser rights logUser rights management)
Operator: Steinsplitter (talkcontribslogs)

Task/s: The bot moves categories (etc.) on Commons. When moving categories, the account makes an automated edit on Wikidata.

Code: https://gerrit.wikimedia.org/r/#/q/Iad9bd7065bb0874ebf52e65a,n,z

Function details: See above (example) --Steinsplitter (talk) 16:49, 20 August 2014 (UTC)

 Flagged Leaving this open for a bit for the case any concerns arise. Vogone (talk) 18:04, 20 August 2014 (UTC)
Flagged within less than two hours, Vogone, without any discussion; that's impressive. Why? --Succu (talk) 19:37, 20 August 2014 (UTC)
Because the edits are happening in any case and obviously affect commons itself only where the bot is approved. I intentionally left this request open in case someone disagrees with my interpretation and does not believe the bot should edit with a bot flag. Vogone (talk) 19:47, 20 August 2014 (UTC)
@Steinsplitter: How frequently do you expect your bot to make automatic edits on Wikidata as a result of category moves on Commons? I notice that not many Commons categories have a Wikidata item associated with them. There are many categories along the lines of "<Objects> in <City>" on Commons, which are not likely to have Wikidata items. Perhaps your bot will not cause flooding, even without a bot flag? --whym (talk) 10:10, 21 August 2014 (UTC)

JhealdBot[edit]

JhealdBot (talkcontribsSULBlock logUser rights logUser rights management)
Operator: Jheald (talkcontribslogs)

Task/s: To add about 35,000 new topic's main category (P910) / category's main topic (P301) property pairs based on matches on Commons category (P373)

Code: not yet written

Function details: About 35,000 potential new topic's main category (P910) / category's main topic (P301) pairs have been identified, based on a unique Commons category (P373) value shared by a category-like item and an article-like item, where the article-like item does not currently have any topic's main category (P910) property set.

A preliminary sample, for Commons cats starting with the letter 'D', can be found at User:Jheald/sandbox.

Still to do, before editing starts, is to remove "List of" articles, as these should not be the category's main topic (P301) of a category, and also to check the categories for any existing category's main topic (P301) and category combines topics (P971) properties. -- Jheald (talk) 23:30, 7 September 2014 (UTC)
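The matching rule described above can be sketched roughly as follows. This is an illustration only, with toy item ids and P373 values; the planned bot (code not yet written) would run over the ~35,000 identified candidates.

```python
# Sketch: pair an article-like item with a category-like item when both
# carry the same Commons category (P373) value and that value is unique
# on each side; such pairs would receive P910 / P301 statements.
from collections import defaultdict

# (item id, is_category_item, P373 value) — toy sample, not real data
ITEMS = [
    ("Q1", False, "Universe"),
    ("Q2", True, "Universe"),
    ("Q3", False, "Earth"),
    ("Q4", True, "Earth"),
    ("Q5", False, "Ambiguous"),
    ("Q6", False, "Ambiguous"),   # duplicate P373 on the article side: skip
    ("Q7", True, "Ambiguous"),
]

def p910_p301_pairs(items):
    arts, cats = defaultdict(list), defaultdict(list)
    for qid, is_cat, commonscat in items:
        (cats if is_cat else arts)[commonscat].append(qid)
    pairs = []
    for commonscat, art_ids in arts.items():
        cat_ids = cats.get(commonscat, [])
        if len(art_ids) == 1 and len(cat_ids) == 1:  # unique on both sides
            pairs.append((art_ids[0], cat_ids[0]))   # add P910 / P301 here
    return pairs

print(p910_p301_pairs(ITEMS))  # → [('Q1', 'Q2'), ('Q3', 'Q4')]
```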

Would you please make several dozen trial contributions?--Ymblanter (talk) 15:54, 24 September 2014 (UTC)