User talk:Ladsgroup/Archive 3

Commons category imports: wrong

It's wrong to unconditionally import Commons categories from Wikipedias, as they might not match the Wikidata subject closely enough, like here. Things that are correct for Wikipedias may be wrong for Wikidata; that's why Commons links are not fully automated on Wikipedias, and your bot should not act as if they were. Thanks. -- Bjung (talk) 18:14, 26 December 2013 (UTC)

Hi, my bot checks whether the linking is mutual. For example, about the link you cited: see the category on Wikimedia Commons, commons:Category:Cities_and_villages_in_Belgium; it has interwiki links to Wikipedia categories (e.g. [[nl:Categorie:Plaats in België]]). If you think it's wrong, then the linking was done incorrectly. Amir (talk) 22:46, 26 December 2013 (UTC)
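
For illustration, a minimal sketch of the mutual-link check Amir describes (a hypothetical helper using pywikibot; not Dexbot's actual code):

```python
import pywikibot

def is_mutual(wiki_cat, commons_cat_title):
    """True if the Commons category links back to the given Wikipedia category."""
    commons = pywikibot.Site('commons', 'commons')
    commons_cat = pywikibot.Category(commons, commons_cat_title)
    for link in commons_cat.langlinks():  # interwiki links on the Commons page
        target = pywikibot.Page(link)
        if target.site == wiki_cat.site and target.title() == wiki_cat.title():
            return True
    return False
```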

Replacing correct values with wrong ones

Hi, please see this edit. — Ivan A. Krestinin (talk) 20:26, 27 December 2013 (UTC)

Hi, I checked: it was because of the existence of a date of birth (the bot considered an item human when it had a date-of-birth category), so I'll add an exception for this. Amir (talk) 10:08, 28 December 2013 (UTC)
Thanks. I recommend not replacing unexpected existing values at all. It is a bad situation when a human corrects a value and a bot overrides it again with a wrong one; this discourages humans from improving Wikidata. There are several cases where a date of birth can be present in non-biographical articles. — Ivan A. Krestinin (talk) 10:33, 28 December 2013 (UTC)
You're right. Done. Amir (talk) 15:48, 28 December 2013 (UTC)

Your bot has made 10,000,000 edits

See Special:CentralAuth/Dexbot. --GZWDer (talk) 05:44, 31 December 2013 (UTC)

Oh, thank you, I was waiting for this :) Amir (talk) 15:45, 31 December 2013 (UTC)

Re-adding once-deleted claims

Hello, you added many Commons category (P373) claims from enwiki, but unfortunately many of these claims had been deleted from Wikidata long before, because they were totally wrong. [1] is only one of hundreds where I deleted an incorrect value long ago and you re-added it. In Wikidata:Database_reports/Constraint_violations/P373 the list once ended at the letter É; now it ends at S (yesterday it was T). Of the pages starting with 1, only 14 were left in the list - the rest had been solved. Etc.

It's very difficult to fix: my import work from cswiki took three or four months (10-20 pages daily). I manually checked every page remaining in Category:Commons category without a link on Wikidata (Q11925744) and Category:Commons category with local link different than on Wikidata (Q11925740), repaired or removed the value in the local wiki, and then imported it to Wikidata. On enwiki and the others it should be done the same way - I deleted some incorrect values, but this category grows every minute.

What about adding a check that the value was never in the item's history? I don't know if it is possible, but why not?

JAn Dudík (talk) 22:49, 3 January 2014 (UTC)

I'm in the middle of end-term exams; I'll answer you tomorrow. Sorry for the delay. Amir (talk) 04:43, 5 January 2014 (UTC)
Hi, when my bot does something wrong, please tell me so I can fix the problem. I added a double check on pages to see whether the linking is mutual before the bot adds P373, but that part of the code wasn't working properly; I'm fixing it. P.S. When you sent this message my bot had finished working on the Dutch Wikipedia, so there was no rush in answering this message anyway. Sorry for the delay. Amir (talk) 07:23, 6 January 2014 (UTC)

Human dog

Dear Amir,

I hope your exams went well. :-) On Pal (Q7126106), your bot stated a while ago that it is a human, see [2]. Perhaps it drew this conclusion from the dog having a date of birth, but it could have seen that it was a non-human animal because sex or gender (P21) was present as well. This has (if it is filled in correctly) a different value in these cases: male organism (Q44148) for male animals and female organism (Q43445) for female ones. Best regards, Bever (talk) 13:56, 12 January 2014 (UTC)

Hi, thank you :) I checked my code, and it happened because that article uses w:Template:Infobox person, not because of the birth date (the category must match "\d{1,4} births", not "\d{1,4} animal births"). I don't know why the English Wikipedia used that infobox for dogs, but if it's common, tell me and I'll run a script to double-check every edit my bot made this way. Best, Amir (talk) 14:57, 12 January 2014 (UTC)
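
A sketch of the stricter category test described here (illustrative only; per the thread, the real bot also inspects infoboxes):

```python
import re

# Treat a page as a human biography only when a category matches
# "<year> births" exactly, so "1941 animal births" no longer qualifies.
BIRTH_CAT = re.compile(r'^\d{1,4} births$')

def looks_like_human(category_titles):
    return any(BIRTH_CAT.match(c) for c in category_titles)

# looks_like_human(['1941 births'])        -> True
# looks_like_human(['1941 animal births']) -> False
```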

Please stop DexBot ASAP

Please stop DexBot from creating Wikisource items for now. Some things it is doing are already planned for other bots, possibly in a better manner. Come discuss it on Wikidata talk:Wikisource first! --LBE (talk) 09:20, 15 January 2014 (UTC)

1. It's Dexbot. 2. I stopped, but I'll run it again without making new items, just adding sitelinks to existing items (like this [3]). 3. I discussed it in WD:PC and there is nothing wrong with the bot. 4. How do you know my bot's approach is worse than what you planned? Amir (talk) 09:28, 15 January 2014 (UTC)
I don't *know* there is anything wrong with DexBot. I just know that Tpt has developed a dedicated bot for that purpose, whose strategy was discussed on the WS project page, and which includes some dedicated criteria to avoid duplication. I think that before you launch your own bot on this, you should check that his would not be more efficient for the job. It's worth waiting a few more hours. --LBE (talk) 09:50, 15 January 2014 (UTC)
The way I'm running it now there won't be duplicates (because it avoids making new items). I have experience importing more than 1M items from other wikis (Wikivoyage, Wikipedia, Commons) and I know what I'm doing. And it's a wiki; we don't say a task is dedicated to one person and nobody else has the right to do the job, do we? Amir (talk) 10:07, 15 January 2014 (UTC)
You'd better link an Author page in Wikisource to the article given in the "wikipedia" parameter of the author template in Wikisource, if there is one. --GZWDer (talk) 10:12, 15 January 2014 (UTC)
I'm doing exactly that; you can check my bot's edits. Amir (talk) 10:19, 15 January 2014 (UTC)
Well, a wiki is also a collaborative project, so contributors must synchronize with one another to agree on the way to properly handle a specific task. Wikidata:Wikisource has been the page where the people organizing the Wikisource landing have been talking about the best way to do it for the past few months. It's been pointed to in each and every preparatory announcement. I just want to make sure you don't go on your own assumptions on the topic, irrespective of the undisputed experience you have with handling items for the other projects. --LBE (talk) 10:31, 15 January 2014 (UTC)
I've read your page and I discussed it [4] [5]. I know what I'm doing, and importing categories and templates isn't really different from other projects. I have permission for it, and there is nothing against running it. Amir (talk) 10:38, 15 January 2014 (UTC)
That is somewhat arrogant. You have been asked by the Wikisource community to stop/pause your bot, and you have defied that request, saying that you know better. We are looking to be collaborative with our importing and to have some review. There is no hurry; we can do this in parts.  — billinghurst sDrewth 11:04, 15 January 2014 (UTC)
Please read this topic again: the user asked me to stop my bot "from creating Wikisource items", which I did right away, but I re-ran my bot just adding sitelinks to category items (in order to avoid making duplicates), so I saw no reason to keep the pause. Amir (talk) 11:47, 15 January 2014 (UTC)

Please pause your Wikisource bot

Can you please pause Dexbot at the Wikisources so that we have some time to review its edits, and to see whether we can get a bot designation through the wikis. Thanks.  — billinghurst sDrewth 10:58, 15 January 2014 (UTC)

I have temporarily blocked it at enWS; you are flooding RC, and we wish to review. There is no hurry.  — billinghurst sDrewth 11:17, 15 January 2014 (UTC)
Is the bot at all approved for adding sitelinks to Wikisource? -- Lavallen (talk) 11:27, 15 January 2014 (UTC)
I'm stopping it now. I have general approval for adding sitelinks (the first RfBA of my bot); if you think I need to file another RfBA, let me know. About running it on the English Wikisource: my bot is global, so it has permission to remove interwiki links. Do you accept global bots? Amir (talk) 11:34, 15 January 2014 (UTC)
Stopped. Amir (talk) 11:37, 15 January 2014 (UTC)
Most Wikisource projects do not accept global bots; I do not know how it is in every subdomain. The nature of interwiki in Wikisource is very different from other projects, so I would recommend precaution! -- Lavallen (talk) 11:40, 15 January 2014 (UTC)
I understand it's different, so I just want to run my bot on three namespaces: categories, templates, authors. I've imported templates and categories of several projects into Wikidata (more than 1M sitelinks), and the nature of interwikis for categories, templates and authors isn't very different from other projects. Is it? Amir (talk) 11:44, 15 January 2014 (UTC)
The purpose of categories does look different in some projects, yes. Some have added only works of Charles Darwin in the category with his name; others have also added works about him. Some categories related to years have a different purpose in some projects. That is being reviewed as we speak.
Many projects do not have any Author namespace, but still have such pages. The Author namespace also has some disambigs.
I also know that some pages in the MediaWiki namespace can be found in the Template namespace in at least one project.
Two Wikisource projects are to be found in the Wikipedia domain.
But above all, the attitude to interwiki and bots among the Wikilibrarians is unique in almost every subdomain. We still do not know how some of the subdomains will react to this change. -- Lavallen (talk) 11:56, 15 January 2014 (UTC)
I don't think the difference is so great that we need to make separate items; that's my opinion. Amir (talk) 12:32, 15 January 2014 (UTC)

Wrong creation of new item

Your bot has created a new item (Q15626841) for a page that should have been added to an already existing one (Q411810). This may have happened other times. Please do not make your bot create a new item when no matching item is found. This would require a lot of extra work in the future. Regards, Erasmo Barresi (talk) 16:22, 16 January 2014 (UTC)

Hello, I understood that and stopped my bot from creating new items. Thank you for the notice. P.S. I merged these two :) Amir (talk) 16:24, 16 January 2014 (UTC)

Kleingebiet is a Subregion

A task for the regex master ;-) The German Kleingebiet articles belong to the subregions:

Androoox (talk) 06:14, 31 January 2014 (UTC)

Okay, it's working now; an example is Pécs Subregion (Q15699491). We can use the labels later to merge them (I checked the links of subregion of Hungary (Q715398), but maybe something went wrong). Amir (talk) 12:19, 1 February 2014 (UTC)

Done now. Amir (talk) 13:01, 1 February 2014 (UTC)
The removal seems to have worked; all 10 I checked had the link removed. But duplicates of "Foo Subregion" have been created. Can the bot merge these? Also set the German names in the district tree (s/Kleingebiet/Bezirk), since there are now two items with the same German label "Kleingebiet Foo" and only one is actually related to a Kleingebiet (subregion) article. Androoox (talk) 20:12, 1 February 2014 (UTC)
I'll work on that tomorrow. Amir (talk) 21:06, 1 February 2014 (UTC)
I fixed the label issue; I'm going to run my bot to merge these items. Amir (talk) 10:58, 3 February 2014 (UTC)
I merged almost all of them. The full list is at User:Ladsgroup/here. Amir (talk) 13:03, 3 February 2014 (UTC)
I solved the rest, converting some to districts. Androoox (talk) 07:09, 4 February 2014 (UTC)

ta.wiki articles

I'm finding that I'm requesting the deletion of a lot of the items Dexbot is creating with only ta.wikipedia links. I think most of them don't need to be added to Wikidata, as the issue is at ta.wikipedia, where topics are duplicated. I've fixed some issues on ta.wiki, but I can't keep up with the bot; I gave up. 130.88.141.34 18:39, 3 February 2014 (UTC)

I was asked to import ta.wp articles by User:GerardM. It's bad for a wiki to have lots of duplicates, but creating them was okay per WD:N (my bot didn't do anything wrong), so I think it's not a big deal. Send me the list and I'll delete them en masse. Amir (talk) 18:43, 3 February 2014 (UTC)
Edits like this and this are odd. Not sure what's happening. 130.88.141.34 18:44, 3 February 2014 (UTC)
If you look closer, I already rolled back one of them; I'll take care of this kind of problem. Amir (talk) 19:00, 3 February 2014 (UTC)
Thanks. 130.88.141.34 19:03, 3 February 2014 (UTC)

Miscategorization

I already fixed several without looking at the details, but this time I did look. It says imported from enwiki [6], but I cannot see how the English Wikipedia claimed that item to be a municipality. Androoox (talk) 07:35, 16 February 2014 (UTC)

Language

Your bot adds labels based on the site link. The language code of the labels should depend on the content language of the target wiki, not on the database ID of the wiki. In some wikis they differ. See wgLanguageCode in https://noc.wikimedia.org/conf/InitialiseSettings.php.txt.

Here are some examples:

The labels with the wrong language code are not accessible, because Wikidata converts a wrong language code used as the user interface language to the right one: both uselang=als and uselang=gsw show the label with language code gsw. Please fix the bot to use the right language codes. --Fomafix (talk) 00:09, 2 March 2014 (UTC)

Thank you for telling me; I'll do it. Best, Amir (talk) 13:58, 2 March 2014 (UTC)

@Fomafix: I fixed my code and it won't happen again (from my bot). I'm writing a script to fix the older mistakes. Amir (talk) 17:11, 2 March 2014 (UTC)

Thanks for fixing. The new changes look good. Yes, the labels with the wrong language codes should be removed because they are not accessible. --Fomafix (talk) 18:10, 2 March 2014 (UTC)
My bot will do that too: [7] and [8]. Amir (talk) 18:54, 2 March 2014 (UTC)

At the moment it looks like your bot only converts als to gsw (example). It would be nice to convert all languages at once. For example, Q2111 also has be-x-old, zh-classical, zh-min-nan, zh-yue and simple. --Fomafix (talk) 20:15, 2 March 2014 (UTC)

That was a bug and it's fixed. Thank you for noticing it. Amir (talk) 20:27, 2 March 2014 (UTC)

simple is also a wrong language code. simplewiki uses en as wgContentLanguage. uselang=simple shows the content of uselang=en. The label with the language code simple is not accessible. But it is possible that an item has a sitelink to enwiki and to simplewiki, although these wikis don't have the same label. --Fomafix (talk) 22:38, 2 March 2014 (UTC)
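
A sketch of the dbname-to-content-language mapping this thread calls for. als→gsw and simple→en are stated above; the other target codes are the standard Wikimedia mappings for the wikis Fomafix lists, and a real bot would read wgLanguageCode from InitialiseSettings.php instead of hardcoding them:

```python
DB_TO_LANG = {
    'als': 'gsw',            # alswiki is written in Alemannic (gsw)
    'be-x-old': 'be-tarask',
    'zh-classical': 'lzh',
    'zh-min-nan': 'nan',
    'zh-yue': 'yue',
    'simple': 'en',          # simplewiki uses en as wgContentLanguage
}

def label_language(dbname):
    """Language code to use for a label imported from the given wiki."""
    return DB_TO_LANG.get(dbname, dbname)
```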

Added to the code; I'll run it to fix them in the next run (within the next two weeks). Amir (talk) 22:43, 2 March 2014 (UTC)

Import sitelinks from Wikisource

Since TptBot is not running, can you continue importing sitelinks from Wikisource? --GZWDer (talk) 13:26, 20 February 2014 (UTC)

Sure, give me a day or two. Amir (talk) 03:47, 21 February 2014 (UTC)
@GZWDer: I wrote a script to take care of new Wikisource categories. What else needs to be done? Amir (talk) 08:16, 24 February 2014 (UTC)
You should import sitelinks or create items for texts, authors and categories, and also for project/help/template pages, but only if there are interwiki links (for the time being). --GZWDer (talk) 10:40, 24 February 2014 (UTC)
Any update? --GZWDer (talk) 09:01, 1 March 2014 (UTC)
@GZWDer: Glad you asked; I forgot to give you an update. I have technical problems importing everything you mentioned except project, help and template pages. Let me explain: for authors, my bot has to check the templates and see whether the author already has an item in Wikidata (and create one if not), but internationalizing this task for other languages is really hard. As for texts, I'm not sure all of them are eligible for items; if you say ALL of them are eligible, it's not a big deal for me. Amir (talk) 09:13, 1 March 2014 (UTC)
At least import all non-subpages. --GZWDer (talk) 09:23, 1 March 2014 (UTC)
Okay, I just finished debugging and/or writing scripts to take care of categories, templates, help pages, project pages, and main-namespace pages (subpages excluded). The only thing left to take care of is authors. Amir (talk) 11:06, 1 March 2014 (UTC)
Any update? --GZWDer (talk) 16:41, 1 March 2014 (UTC)
It's programmed and ready. What else is needed? Amir (talk) 20:00, 1 March 2014 (UTC)
Just run it. It's not yet running. --GZWDer (talk) 06:42, 2 March 2014 (UTC)

I ran it once ([9]); this is the timetable for running the script in the future:

date                       what the bot will do
1, 10, 20 of each month    articles on the first 20 Wikipedias
2, 11, 21                  categories on the first 20 Wikipedias
3, 12, 22                  categories on the first 20 Wikisources
4, 13, 23                  pages on the first 20 Wikisources (in namespaces 4, 12, 10)
5, 14, 24                  texts (ns:0) on the first 20 Wikisources

Best, Amir (talk) 13:53, 2 March 2014 (UTC)

Not yet running. Why? --GZWDer (talk) 13:33, 4 March 2014 (UTC)
@GZWDer: There was a little problem that I fixed. Note that my bot only works on pages created between ten days and one month ago (we talked about it in WD:PC). Amir (talk) 11:50, 6 March 2014 (UTC)
And what about old pages? --GZWDer (talk) 13:58, 6 March 2014 (UTC)
I thought Tpt took care of them; if you think that's not true, let me know and I'll run it on the whole of the first 20 Wikisources. Amir (talk) 14:52, 6 March 2014 (UTC)
@Tpt: Are you still running your Wikisource bot? --GZWDer (talk) 15:36, 6 March 2014 (UTC)
@GZWDer: I think you misunderstood me. My bot works on new pages every ten days; for pages older than one month, I think running once over the whole of Wikisource is enough, but I'm not sure whether Tpt did this on all languages of Wikisource (especially the first 20) or only ran the bot on the English Wikisource. Amir (talk) 15:46, 6 March 2014 (UTC)
I've run the import bot once for Author pages on the ca, br, de, en, es, fr, it, pl, pt, ru, sv and uk Wikisources. I can run it periodically if needed. Please discuss on the talk page before doing anything for controversial namespaces like ns0. Tpt (talk) 16:06, 6 March 2014 (UTC)

Import sitelinks for articles and categories in zhwiki

There are now ~81,854 articles without items on zhwiki [10][11][12][13][14][15][16][17][18][19][20][21][22][23][24][25][26]

It's possible to create items for unlinked articles and categories. Do not create items for pages in zh:Category:怀疑侵犯版权页面 (copyvio pages) or zh:Category:維基百科軟重定向 (soft redirects), nor for categories in zh:Category:已重定向的分类 (redirected categories) or empty categories (since a bot will report them for speedy deletion). --GZWDer (talk) 07:57, 2 March 2014 (UTC)
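
A sketch of the requested skip logic (a hypothetical pywikibot helper mirroring the exclusions above, not the bot's actual code):

```python
import pywikibot

SKIP_CATEGORIES = {
    'Category:怀疑侵犯版权页面',  # suspected copyright violations
    'Category:維基百科軟重定向',  # soft redirects
    'Category:已重定向的分类',    # redirected categories
}

def should_create_item(page):
    """Skip excluded pages and empty categories before creating an item."""
    if page.namespace() == 14:  # category namespace
        cat = pywikibot.Category(page.site, page.title())
        if cat.isEmptyCategory():
            return False  # a bot would tag it for speedy deletion anyway
    cats = {c.title() for c in page.categories()}
    return not (cats & SKIP_CATEGORIES)
```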

Sure, I'll start today. Amir (talk) 14:02, 2 March 2014 (UTC)

It's working [27]. I'll start work on categories after this. Amir (talk) 15:49, 2 March 2014 (UTC)

Can you run the bot a bit faster? It'll now take 30+ days to create all the items. --GZWDer (talk) 13:32, 4 March 2014 (UTC)
I sped it up; it's now twice as fast. Amir (talk) 13:21, 5 March 2014 (UTC)

P131 in Sweden

This is fine! Can you manage to also add the province? It can, for example, be found in the landskap parameter of sv:Template:Ortsfakta Sverige. Provinces should have the same "rank" as municipalities, since there is no natural hierarchy between them. -- Lavallen (talk) 10:56, 13 March 2014 (UTC)

Happy to see that; I'll check and see how to import them. Amir (talk) 17:52, 13 March 2014 (UTC)
@Lavallen: I'm not very familiar with the hierarchy of Swedish populated places, but as far as I can see the municipality is in the province. I checked another usage of the template you mentioned: w:sv:Alingsås is in w:sv:Alingsås kommun (Alingsås municipality) and in the province of w:sv:Västergötland (landskap). So if there is a property like "is in province", I can add them easily. As I said, I'm not familiar with the system and I may be wrong; please correct me if so. Amir (talk) 22:06, 13 March 2014 (UTC)
Alingsås municipality is in Västergötland Province, that far is true. But there is no guarantee that municipalities are within only one province. Båstad Municipality (Q499464) is for example in both Halland and Skåne Province (but only in Skåne County), and any locality in Båstad municipality can be in Skåne or Halland Province or both. The capital Båstad is in both, while Grevie is only in Skåne Province and Östra Karup is only in Halland. The problem is that we have (at least) two parallel systems of hierarchy. The ~"simple design with a nice hierarchy" does not exist in Sweden, but P131 is the only property we have to describe it.
-- Lavallen (talk) 07:10, 14 March 2014 (UTC)
Okay, I'll add them. Amir (talk) 17:27, 14 March 2014 (UTC)
@Lavallen: It's working (I'll add the P17 too); here is an example: Ädelfors (Q10725529). Amir (talk) 17:51, 14 March 2014 (UTC)
Great! Ädelfors, that's not far from where I was born! :) -- Lavallen (talk) 18:28, 14 March 2014 (UTC)

Closed wiki

Why should we remove links to a closed wiki? It nevertheless has articles. Infovarius (talk) 21:10, 14 March 2014 (UTC)

Hi, I did it because it makes my bot crash. I know that's not a valid reason, but I considered the link useless, so I removed it. I don't know whether there is a policy about linking to closed wikis on Wikidata, but if there is not, it's better to open a topic in WD:PC and make a clear policy about them. Amir (talk) 21:34, 14 March 2014 (UTC)
I agree with you that it is better to open a topic on the project chat rather than simply removing content. Imho this is not acceptable, as there wasn't any bot request or discussion about removing those links. -- Bene* talk 22:07, 15 March 2014 (UTC)
At first I didn't know that this wiki is closed only for editing (not for viewing), but it doesn't matter anymore; I'll change the framework to handle this problem. Amir (talk) 01:35, 16 March 2014 (UTC)
P.S. I deleted just one link, no more (nor did I use my bot to remove any). Amir (talk) 02:35, 16 March 2014 (UTC)

P107

Hey Amir. There are many P107 claims which can be removed:

--Pasleim (talk) 22:47, 20 March 2014 (UTC)

Hey! Thank you for sharing this with me. I wrote another script to take care of them, but I'll run it on all of them except humans, because AFAIK it's not okay to have anything but P31:Q5 for humans; e.g. instance of king is not correct, and instance of human is. Am I wrong? Amir (talk) 06:47, 21 March 2014 (UTC)
Yes, it might be safer to exclude humans. --Pasleim (talk) 13:43, 21 March 2014 (UTC)

@Pasleim: The bot removed as many as it could; after several days (once the databases update) you can check and see how many remain. Amir (talk) 05:55, 23 March 2014 (UTC)

Thanks a lot! --Pasleim (talk) 10:10, 23 March 2014 (UTC)

P132 municipality of Hungary

Hi! Could you replace P132 (P132) with instance of (P31), but only for these: Wikidata:Database reports/Constraint violations/P939#Type Q486972? --JulesWinnfield-hu (talk) 14:31, 21 March 2014 (UTC)

Sure thing; give me one hour and consider it done. Amir (talk) 15:04, 21 March 2014 (UTC)
It's finished now. Amir (talk) 15:41, 21 March 2014 (UTC)
Many thanks! --JulesWinnfield-hu (talk) 15:47, 21 March 2014 (UTC)

Hi Ladsgroup, strange error by Dexbot: occupation as place of death. place of death (P20): archaeology (Q23498). Happened 13 Dec 2013. --Kolja21 (talk) 04:28, 30 March 2014 (UTC)

Hi, thank you for reporting it. It was a text-parsing problem (note the first link after the "death place" argument). It won't happen again, because I changed the whole parsing system a while ago. Amir (talk) 07:37, 30 March 2014 (UTC)

Language - nonsense association

Going from English to Portuguese, Communications Security Establishment Canada should lead to Communications Security Establishment - same name in English. Instead, when using the Portuguese-language link on Communications Security Establishment Canada, one ends up at a soccer club, Clube Sociedade Esportiva. It was correct before, and it looks like the bot made it wrong again...

Talencar (talk) 02:13, 3 April 2014 (UTC)

Hi, can you give me a link? I can't understand what bot problem you are describing. Amir (talk) 16:14, 4 April 2014 (UTC)

Only one item in P131 in Q199957

en:Småland says that Småland is located in 5 counties, but your bot only added one of them? -- Lavallen (talk) 12:53, 3 April 2014 (UTC)

I set my bot not to add all of the values (just one of them); if you think that's not right, tell me and I'll stop the bot and build something better. Amir (talk) 16:11, 4 April 2014 (UTC)

Heves Hungary

Hi, Heves (Q855011) is a city in Hungary and Heves County (Q191604) is a county in Hungary; this is not correct. --JulesWinnfield-hu (talk) 09:49, 4 April 2014 (UTC)

Actually it is correct, because P131 has to be the closest administrative entity to the geographical feature (e.g. the Statue of Liberty has to have P131 New York City instead of New York State). Amir (talk) 16:09, 4 April 2014 (UTC)
You misunderstood something. Vámosgyörk (Q375741) is a village in Heves county; it has nothing to do with Heves city, except that they are in the same county. I noticed that it was wrong on enwiki, where I corrected it. --JulesWinnfield-hu (talk) 16:44, 4 April 2014 (UTC)
Oh, sorry for that; I thought you meant that a thing (a monument, maybe) has to be in the county, not the city. And yeah, it came from the English Wikipedia; thank you for fixing it. Amir (talk) 16:48, 4 April 2014 (UTC)

Script for importing from wikiquote/wikisource

Hello, do you have a (pywiki) script which could import from a sister project and merge with existing items? E.g. take q:category:Foo, check whether w:category:Foo exists, and connect them? Could you send me this code for imports from cs/sk? Thanks. JAn Dudík (talk) 21:02, 8 April 2014 (UTC)

Hi Jan, I hope you're well. Sorry for the slow response here; I sent it to you about one week ago :) Amir (talk) 21:49, 16 April 2014 (UTC)

reasonator

Hi Ladsgroup! Please ping me when http://tools.wmflabs.org/?tool=reasonator is back again. Is there a reason why it has been disabled? Regards, gangLeri לערי ריינהארט (talk) 19:59, 16 April 2014 (UTC)

Hi, it's working for me [28]. Amir (talk) 21:49, 16 April 2014 (UTC)

Falsified claim

Can you check what went wrong in this edit [29]? The bot changed the correct value to a wrong one: the linked item is not the parent administrative entity but the capital. I checked all the Thai districts with my bot; the same wrong action was done for Fang (Q475498). Ahoerstemeier (talk) 10:39, 19 April 2014 (UTC)

Hello, the problem is misinformation in the English Wikipedia article w:Chaiya District, which mentioned w:Talad Chaiya as the closest administrative entity that Chaiya District is located in. Amir (talk) 11:09, 19 April 2014 (UTC)

I do not understand your reverts, please comment. Regards --Oursana (talk) 15:33, 22 April 2014 (UTC)

My bot removed a duplicate statement (please do not revert it); for example, there were two statements with country: Germany. Amir (talk) 15:38, 22 April 2014 (UTC)

Appreciation

Hi Dexbot / Ladsgroup / Amir! Thanks for refreshing the interwiki links of the many Hungarian pages and connecting them to Wikidata items. :) --Vakondka (talk) 05:27, 23 April 2014 (UTC)

You're welcome; I'm pretty happy that I can do something to make Wikipedia (in any language) better. Amir (talk) 06:36, 23 April 2014 (UTC)

Bot: setting proper labels

Your bot set the label for Q5052486 to Comana, Constanța and for Q5118535 to Comana, Giurgiu. The correct label would have been Comana in both cases. --Akkakk 08:27, 23 April 2014 (UTC)

It's a bug; I'm fixing it. Thank you for telling me. Amir (talk) 08:31, 23 April 2014 (UTC)
Fixed; it'll work properly in the next run. Amir (talk) 08:35, 23 April 2014 (UTC)

Just to make you aware of that. --Ricordisamoa 12:05, 17 April 2014 (UTC)

Hi! @Ricordisamoa: even though I usually hate changes (and prefer the status quo), I think I have to shape up, so I will certainly help you with deprecating it. First, I think we need to make a to-do list (porting the remaining scripts, writing good documentation about the migration, etc.) and finish it; after that we can start calling compat deprecated. Don't you agree? Amir (talk) 14:21, 18 April 2014 (UTC)
Of course, see the RFC page for more information and a link to the tracking bug about the migration :-) --Ricordisamoa 19:26, 23 April 2014 (UTC)
@Ricordisamoa: That's my point: I think just a page and a tracking bug aren't enough for work of this scale; we need to make another page and start working on it before starting this process. Amir (talk) 04:20, 25 April 2014 (UTC)

Bot - Italian description for items about Wikimedia categories

According to Help:Description/it#In minuscolo, it should be "categoria di un progetto Wikimedia", starting with a lowercase letter. However, it is probably not worth changing it where it has already been set, unless making other notable changes. Thanks, --Ricordisamoa 08:35, 24 April 2014 (UTC)

Hi, thank you for notifying me. I fixed the code, so it won't happen again. Amir (talk) 18:33, 24 April 2014 (UTC)

It's the same for Spanish, French, and Portuguese. So please use "categoría de Wikimedia" instead of "Categoría de Wikimedia", "page de catégorie de Wikimédia" instead of "Page de catégorie de Wikimédia", and "categoria de um projeto da Wikimedia" instead of "Categoria de um projeto da Wikimedia", respectively. Andreasm háblame / just talk to me 01:14, 19 May 2014 (UTC)

Hi, all of them are fixed now. Amir (talk) 18:09, 19 May 2014 (UTC)

Importing sitelinks from Wikiquote

Are you still importing sitelinks from Wikiquote? --GZWDer (talk) 15:03, 27 April 2014 (UTC)

Yeah, but it's not very visible due to the lots of other edits my bot is doing right now. Amir (talk) 15:39, 27 April 2014 (UTC)

Careful with Wikisource:

[30] Music drama instead of author... JAn Dudík (talk) 08:35, 28 April 2014 (UTC)

I need to make a small change in the code to fix it; I'll do it ASAP. Thank you for notifying me. Amir (talk) 08:44, 28 April 2014 (UTC)
I temporarily disabled the code for the main namespace of Wikisource; it needs to be redesigned in some parts. Amir (talk) 05:28, 29 April 2014 (UTC)
Fixed. It now works in a better way (it tries to parse the text and find the item based on templates), so if you just add a link to the Wikipedia page, the bot adds it to Wikidata and the interwiki links will be shown. Amir (talk) 07:16, 1 May 2014 (UTC)

United Kingdom: Films

Hello Dexbot / Ladsgroup / Amir! (I don't speak good English.)

Can you help with this: for cinema films, TV films and TV series made inside the United Kingdom, should they be attributed to the English, Scots, Welsh or Northern Irish, or to all of them jointly? --Vakondka (talk) 19:55, 1 May 2014 (UTC)

Hello, of course. Can you give me a source to import the data from? Amir (talk) 08:24, 2 May 2014 (UTC)

The Mr. Bean television series and the Mr. Bean television cartoon series are British. But within that, are they English, Scottish, Welsh, Northern Irish, or common to all? What kind of citizens made them? --Vakondka (talk) 12:30, 2 May 2014 (UTC)

I see, I'll do it very soon. Amir (talk) 13:10, 2 May 2014 (UTC)

I'll wait patiently. --Vakondka (talk) 14:29, 2 May 2014 (UTC)

I added data for all TV episodes (not just Mr. Bean; anything that has an article on Wikipedia); as an example you can see Mr. Bean Goes to Town (Q6928415). Amir (talk) 10:18, 4 May 2014 (UTC)

That is not really what I meant. :( --Vakondka (talk) 19:36, 5 May 2014 (UTC)

(To be exact: Mr. Bean (television series and television cartoon series) = English? or Scottish? or Welsh? or Northern Irish? or all 4?) --Vakondka (talk) 19:36, 5 May 2014 (UTC)

Can you explain more to me? Amir (talk) 10:17, 6 May 2014 (UTC)

I'll test it then. --Vakondka (talk) 18:59, 6 May 2014 (UTC)

Dexbot at Adrianne Wadewitz

Any ideas why Dexbot (talk | contribs | logs) was making those odd changes to Q16438247? — Cirt (talk) 12:20, 2 May 2014 (UTC)

Because someone used the wikipedia template in the header of s:Wikimedians meet with museum leaders. I'll fix this. Amir (talk) 12:52, 2 May 2014 (UTC)
Fixed; please tell me if it happens again. Amir (talk) 14:17, 2 May 2014 (UTC)
I removed links to Q16438247 from all pages at Wikisource except for the one intended linked page, the author page s:Author:Adrianne Wadewitz -- will that help fix things? — Cirt (talk) 18:13, 2 May 2014 (UTC)
Yeah, thank you so much. Amir (talk) 10:24, 4 May 2014 (UTC)
Okay, thank you! — Cirt (talk) 04:36, 19 May 2014 (UTC)

Human groups

Hi, your bot often makes one error: it adds human-related properties to items that represent groups of humans, for example Hughes brothers (Q960059). Could you add an extra check to the bot's code: if an item has the has part(s) (P527) property, do not touch the item. — Ivan A. Krestinin (talk) 17:49, 4 May 2014 (UTC)

Hi! I saw that the Italian WP uses the bio template for groups of people [31]. I skip pages with more than one bio template [32], but as you can check, this one used a single bio template for two people [33], so your concern is reasonable and I'll add more checks, including your suggestion, to the code. Amir (talk) 18:23, 4 May 2014 (UTC)
Thanks! More samples if needed: Montgolfier brothers (Q193751), Farrelly brothers (Q262337), Auguste and Louis Lumière (Q55965), Pang brothers (Q3089490), Katia and Maurice Krafft (Q1736578), Boulting brothers (Q3181105). — Ivan A. Krestinin (talk) 18:39, 4 May 2014 (UTC)
The check is added now; thank you for notifying me. Amir (talk) 20:17, 4 May 2014 (UTC)
[34] — Ivan A. Krestinin (talk) 18:32, 10 May 2014 (UTC)
I'm pretty sure I added this check; let me check again. Amir (talk) 18:47, 10 May 2014 (UTC)
Oh, my bad... :( I fixed it in one script but not in another. Is everything fixed now, or do I have to write something for it? Amir (talk) 20:23, 10 May 2014 (UTC)

Two questions

Hi dude, I have two questions:

  1. I tend to add the notable works of famous writers to their items. For example, 1 and 2. Then I noticed that these are usually done much more efficiently by bots. Should I keep doing it, or will the bot runners take care of them?
  2. All the Wikiquote pages of Saadi were linked to the disambiguation item. I manually moved them to the correct item. Is there any faster way to do this?

Gratefully, طاها (talk) 22:32, 5 May 2014 (UTC)

Hi!
  1. If there are Persian writers that haven't been added, tell me and I'll write a bot to do it based on the Persian Wikipedia.
  2. If you see this again, tell me and I'll run my bot on it.

I hope you enjoy being here. Amir (talk) 10:17, 6 May 2014 (UTC)

Thanks. Regarding the Persian writers, I haven't seen any without a WD item. But my question was about the main properties of the items, such as major works, which are almost always missing. It seems to me that it should be an all-robotic process and we are not supposed to do it manually, right? Taha (talk) 05:46, 7 May 2014 (UTC)
There are two ways to get that data: one is from w:Template:Infobox writer and its "notable works" parameter, and the other is to get the backlinks of an author and see where that person is mentioned as the writer of a work in Wikidata. I can work on both, but the former sounds more reasonable to me. Amir (talk) 11:46, 8 May 2014 (UTC)
I just started harvesting data from infobox writer; an example. It'll finish in the next week or so. Amir (talk) 21:04, 9 May 2014 (UTC)

hassan zaeri amirani-iran

Hi Ladsgroup. Item Q16758138 is empty (no statements) and unlinked, and at least the English description doesn't make it look like a notable item otherwise. However, could you please have a look at the Persian label and description and advise whether you think this should be deleted (or just delete it yourself, if you think so)? Thank you. --YMS (talk) 09:09, 6 May 2014 (UTC)

Hi, I checked the Persian description; it was nonsense (translation: "he loves potato and has friends"), so I deleted it. Amir (talk) 10:17, 6 May 2014 (UTC)

Elph

Elph has regained control of his account and his rights have been returned. Please unblock him on fawiki. --GZWDer (talk) 04:53, 9 May 2014 (UTC)

Both of them are unblocked now. Amir (talk) 07:08, 9 May 2014 (UTC)

Request for review

Greetings, dear Amir; good day to you. If possible, please take a quick look at Wikidata:Requests for permissions/Bot/Fatemibot (I told you my English is terrible!). What should I do now to convince them, and what should I do to get the flag sooner? (I already had the fawiki trial approval and completed it, and I'm waiting for dear Mahdi to come to the wiki and handle it :) )

Yours faithfully, H.Fatemi 09:02, 11 May 2014 (UTC)

They are saying that your code has to be corrected. See how it can be fixed. Amir (talk) 09:31, 11 May 2014 (UTC)

Dear Amir, the code for وپ:دار is correct and has worked, and still works, correctly; look at the bot's contributions and you will see it made about a hundred and twenty correct edits moving categories. H.Fatemi 09:41, 11 May 2014 (UTC)
He says the label and the sitelink must be corrected in a single edit, not two. Amir (talk) 09:57, 11 May 2014 (UTC)

Labels on new items

Hello. Other bots remove the content inside parentheses at the end of the Wikipedia title in order to create the label, which seems to make sense in most cases. But Dexbot keeps the content between parentheses, like here. Is it intentional? --Zolo (talk) 06:11, 17 May 2014 (UTC) <copyedited unintelligible content> Zolo (talk) 18:41, 17 May 2014 (UTC)

Hello, my bot takes what comes before " (" and ", " as the label, but I'm not sure that's done in all of my scripts, and the link you provided doesn't seem to have anything related to my bot's label fixer. Amir (talk) 08:05, 17 May 2014 (UTC)
Uh, I missed a digit in the &history part of the URL, and somehow that made it point to a different item. The link should have been [35]. -Zolo (talk) 18:41, 17 May 2014 (UTC)
So it was a bug in the script that harvests the remaining articles. I fixed it and now it works okay; thank you for telling me. Amir (talk) 18:53, 17 May 2014 (UTC)
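
A sketch of the label heuristic Amir describes (take what comes before " (" or ", " in the page title; illustrative, not the bot's exact code):

```python
def label_from_title(title):
    """Strip a trailing parenthetical or comma disambiguator from a title."""
    for sep in (' (', ', '):
        if sep in title:
            title = title.split(sep)[0]
    return title

# label_from_title('Comana, Constanța')     -> 'Comana'
# label_from_title('Foo (disambiguation)')  -> 'Foo'
```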

Weird edit at Reseda (Q7315195)

Please take a look at this edit. Multichill (talk) 17:24, 17 May 2014 (UTC)

Hi, it was a problem that I fixed in my code a very long time ago. How many errors remain? Amir (talk) 17:38, 17 May 2014 (UTC)
Just letting you know. Happy to hear that it has been fixed. Multichill (talk) 19:35, 18 May 2014 (UTC)

Bot additions

Hi Ladsgroup,

At Property talk:P1307, there are a few sources for the property. Would you want to add it with your bot? --- Jura 20:05, 17 May 2014 (UTC)

It seems that PLBot is doing it in the meantime. --- Jura 21:48, 17 May 2014 (UTC)
I'll start working on them very soon. Amir (talk) 11:36, 18 May 2014 (UTC)
I finished harvesting the information from the French Wikipedia, and the bot added several P1307 statements to items; example. Amir (talk) 12:23, 18 May 2014 (UTC)
Great, thanks! I think PLBot did just the German ones. Some may be left in the English Wikipedia. --- Jura 12:28, 18 May 2014 (UTC)
I checked the English Wikipedia now and nothing is left to be done. Amir (talk) 12:42, 18 May 2014 (UTC)
Ok. Cool. BTW, if you feel like it, there are a few properties that could be sourced in a similar way: Property_talk:P1291, Property talk:P1253, Property talk:P781, even Property talk:P1258. --- Jura 16:29, 18 May 2014 (UTC)
All of them are possible; the last one is a little bit tricky, but also possible. I'll work on them tomorrow or the day after. Amir (talk) 16:54, 18 May 2014 (UTC)
Except Rotten Tomatoes, I added all of them. Amir (talk) 21:14, 19 May 2014 (UTC)
Excellent! Looks good. Thank you. --- Jura 21:16, 19 May 2014 (UTC)
I started harvesting Rotten Tomatoes; it'll be finished by the time you read this. Amir (talk) 23:42, 19 May 2014 (UTC)

Question

Hi dear Amir, good day. I wanted to know how I should request speedy deletion for an item that is now empty (as a result of a merge)? For example, Q15713184 is empty now, and I didn't see anywhere to request speedy deletion. Thanks. H.Fatemi 05:04, 18 May 2014 (UTC)

Hi, you should post it at WD:RFD. Amir (talk) 11:32, 18 May 2014 (UTC)

Conflict with Wikidata

A lot of reports are not necessary. See User:Ladsgroup/Birth date report/Conflict with Wikidata/1:

Please remove them from the reports. --GZWDer (talk) 04:58, 20 May 2014 (UTC)

Due to a bug in Wikidata, dates with years less than 1000 have an extra zero, which causes this problem. I will fix it for the next reports, and I'll start removing these from the reports once the reporting is finished. Thank you for reporting it. Amir (talk) 05:06, 20 May 2014 (UTC)
Fixed. Amir (talk) 06:41, 20 May 2014 (UTC)

P166

? NBS (talk) 09:59, 20 May 2014 (UTC)

Please check the edit more carefully; the bot just removed two of three duplicate statements. Amir (talk) 11:25, 20 May 2014 (UTC)
It's not a duplicate - Peter Vlasov (Vladimirov) was awarded 3 identical orders. NBS (talk) 12:38, 23 May 2014 (UTC)
You have to use a qualifier (e.g. the year of winning the award) in these cases; otherwise it's considered a duplicate. Amir (talk) 17:58, 23 May 2014 (UTC)
Which property as qualifier? NBS (talk) 18:53, 23 May 2014 (UTC)
point in time (P585). Amir (talk) 19:05, 23 May 2014 (UTC)
And then it's not considered a duplicate? NBS (talk) 18:19, 24 May 2014 (UTC)
That's fine, yes. Amir (talk) 18:41, 24 May 2014 (UTC)
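
A sketch of the duplicate test implied by this exchange (hypothetical, assuming pywikibot claim objects): two claims with the same value count as duplicates only when their qualifiers, e.g. point in time (P585), are also identical.

```python
def find_duplicate_claims(claims):
    """Return claims whose value and qualifier set repeat an earlier claim."""
    seen = set()
    duplicates = []
    for claim in claims:
        key = (str(claim.getTarget()),
               frozenset((prop, str(q.getTarget()))
                         for prop, quals in claim.qualifiers.items()
                         for q in quals))
        if key in seen:
            duplicates.append(claim)
        else:
            seen.add(key)
    return duplicates
```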

Burial coords

Hi, some time ago we already discussed this. Please see [36] (a monument), [37] (a bridge). Could you review the bot's other edits and fix the errors? — Ivan A. Krestinin (talk) 18:34, 20 May 2014 (UTC)

Hi, sure. Do you mind if I start tomorrow? Amir (talk) 04:42, 21 May 2014 (UTC)
@Ivan A. Krestinin: I checked all of them and fixed them. I didn't touch items with P31:bog body (Q199414), because that's the best solution for them and I don't know what else I can do about this mess. Amir (talk) 16:30, 22 May 2014 (UTC)

P1258 conversion

Dexbot appears to be handling Rotten Tomatoes ID (P1258) strangely: [38] has quote marks around the value of Rotten Tomatoes ID (P1258); the true value does not have quote marks, and the link on the English Wikipedia does not appear to have quote marks. I haven't found any more examples of Rotten Tomatoes ID (P1258) from Dexbot yet. --Closeapple (talk) 23:42, 20 May 2014 (UTC)

Some more (which I have removed the quotes from): The Intouchables (Q595); (Q12018); Kick-Ass (Q2201); Swept Away (Q1365); Jab Tak Hai Jaan (Q44280); Sátántangó (Q53481); Charlotte's Web 2: Wilbur's Great Adventure (Q83161); The Skulls (Q83612); Resident Evil: Retribution (Q83542). I've only found one person with the property so far: Maria Walliser (Q1984), for which Rotten Tomatoes ID (P1258) did not have quotes and was not added by Dexbot. It appears a bot job may be useful to go through and clean up. --Closeapple (talk) 00:30, 21 May 2014 (UTC)
Oh, I'll start fixing them right now. Amir (talk) 04:41, 21 May 2014 (UTC)
I started; it'll be finished very soon. Amir (talk) 04:59, 21 May 2014 (UTC)

Finished. Amir (talk) 16:43, 22 May 2014 (UTC)

Other weird ones

Some more things in Wikidata:Database_reports/Constraint_violations/P1258#Format, probably from Dexbot importing nonstandard data from the English Wikipedia; User:Jura1 seems to have corrected them:

  • Sometimes there will be a %27 instead of an apostrophe in the URL; modern web browsers seem to display it in the address bar as apostrophe but send it and copy-paste it as %27. This should probably be converted to apostrophe in Wikidata, although it's valid to use %27 in the actual URL. Example: I'll Believe You (Q12124977) was m/10008669-i%27ll_believe_you (Actual percent signs in movie titles are not carried in the ID. See, for example, m/10_what_makes_a_hero for the film name 10%: What Makes a Hero?)
  • Full URLs instead of the directory/file part. Example: Nirut Sirijanya (Q6579476) => m/http://www.rottentomatoes.com/celebrity/nirut_sirichanya
  • Extraneous prefixes: Idle Hands (Q1213829) => m/m/idle_hands; Frederick Combs (Q5497564) => m/celebrity/frederick_combs
  • Extraneous slashes: Medic (Q6806231) => m//medic_classic_tv_series
  • Extraneous suffixes: Mangal Pandey: The Rising (Q1784272) => m/10005643-rising/?critic=all

Maybe you've already seen these. I did not go to the English Wikipedia to find out what the problematic original data was. --Closeapple (talk) 16:01, 23 May 2014 (UTC)
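
A minimal normalization sketch covering the cases listed above (the regexes are illustrative and only handle this thread's examples; they are not Dexbot's actual cleanup code):

```python
import re
from urllib.parse import unquote

def clean_rt_id(value):
    """Normalize a Rotten Tomatoes ID harvested from a Wikipedia template."""
    value = value.strip().strip('"')                     # stray quote marks
    value = re.sub(r'^m/https?://www\.rottentomatoes\.com/', '', value)  # full URLs
    value = re.sub(r'^m/(m|celebrity)/', r'\1/', value)  # extraneous "m/" prefix
    value = value.replace('m//', 'm/')                   # doubled slash
    value = re.sub(r'/?\?.*$', '', value)                # "/?critic=all" suffix
    return unquote(value)                                # %27 -> apostrophe

# clean_rt_id('m/m/idle_hands')                -> 'm/idle_hands'
# clean_rt_id('m/10005643-rising/?critic=all') -> 'm/10005643-rising'
```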

It's the original data. Overall, I like the final result of the import. Good work, Amir! --- Jura 16:27, 23 May 2014 (UTC)

There are just 8 items with problems, and they were w:GIGO. I'll fix them by hand. Amir (talk) 18:06, 23 May 2014 (UTC)

Jura took care of these issues. Thank you :) Amir (talk) 18:26, 23 May 2014 (UTC)
NP. The nice thing about moving them to Wikidata is that it makes it much easier for others to check and fix them. --- Jura 06:17, 24 May 2014 (UTC)

It seems the bot is not working. Is it broken? --GZWDer (talk) 09:44, 25 May 2014 (UTC)

I checked; it's working, but it seems to be going through a huge portion of the items, so it'll be a little bit slow. Amir (talk) 13:11, 25 May 2014 (UTC)

Your bot made a statement using a disambiguation page. Regards --Oursana (talk) 08:41, 27 May 2014 (UTC)

Hi, see Barbarigo (Q807751): it has P31: family name too. The problem is that the item has been marked as a disambiguation page and a family name at the same time. Amir (talk) 09:48, 27 May 2014 (UTC)

Person

Hi, this is not a person/human but a TV show. Sincerely, Taketa (talk) 10:15, 27 May 2014 (UTC)

Hi, the problem was caused by the use of the infobox person template in the article, but I added several checks to my code so it won't happen again. Amir (talk) 10:30, 27 May 2014 (UTC)

Difficulty in adding interwiki link for Maria Gustafsson (writer) in English Wikipedia

The article in the English Wikipedia, Maria Gustafsson (writer), should be linked with the article Maria Gustafsson (författare) in the Swedish Wikipedia and the article Maria Gustafsson in the Spanish Wikipedia. However, attempts to add the link are stymied by the notice that "Site link enwiki:Maria Gustafsson (writer) is already used by item Q16729517." Since you have been linking interwiki articles, I hope you can assist with this problem. Roman Spinner (talk) 11:43, 30 May 2014 (UTC)

I just confirmed that the matter has been resolved, insofar as the interwiki links for Maria Gustafsson (writer) are now present among the three languages. —Roman Spinner (talk) 15:12, 30 May 2014 (UTC)

Re: United Kingdom films (exact description)

Were the Mr. Bean TV series and the Mr. Bean TV cartoon series made in England, Scotland, Wales or Northern Ireland?

It seems so. Amir (talk) 09:16, 2 June 2014 (UTC)

P361 vs P131

Dexbot used P361 (part of) [39]; why not P131 (located in)? Tamawashi (talk) 05:07, 9 June 2014 (UTC)

That was my bad; I did it when Wikidata had just started and the data model wasn't very clear to me. Amir (talk) 07:49, 9 June 2014 (UTC)

P18

Hello Amir,
this time my request is about adding P18 (image) statements to some items. On the German Wikipedia, de:Kategorie:Local image but no image on Wikidata contains around 1,600 football players with an image, but the information is missing on Wikidata. Do you know a way for Dexbot to extract the filename from those articles and add it to the WD item? The parameter with the filename in the infobox is "bildname". Thanks. -- Pütz M. (talk) 15:48, 11 June 2014 (UTC)

Hello, sounds like a good idea. Give me two or three days. Amir (talk) 16:41, 11 June 2014 (UTC)
Just an update ... now there are also handball players in the category. The picture parameter there is "bild". In total there are more than 2,700 articles with images in the article but missing P18 information on Wikidata. -- Pütz M. (talk) 20:00, 12 June 2014 (UTC)

@Pütz M.: The bot just started, and it'll be finished very soon. Amir (talk) 06:23, 14 June 2014 (UTC)

Dear Amir, as always: thank you so much for your comprehensive help. It seems the feature got extended to other infoboxes as well: de:Vorlage:Infobox Tennisspieler (parameter "Bild"), de:Vorlage:Infobox Basketballspieler (parameter "bild"), de:Vorlage:Infobox Schwimmer (parameter "image"). Further, it seems the same feature got implemented on the Dutch Wikipedia (nl:Categorie:Wikipedia:Wel afbeelding lokaal en geen op Wikidata) and the French Wikipedia (fr:Catégorie:Local image but no image on Wikidata), with even more missing images. -- Pütz M. (talk) 10:03, 14 June 2014 (UTC)

Hi Amir, thanks for all the work. However, your bot just added an Italian local image, not a Commons image, to Wikidata. See [40]. Sincerely, Taketa (talk) 17:19, 14 June 2014 (UTC)

Hi, I wrote a check to avoid this kind of mistake and tested it, but something went wrong; I'm investigating what. Amir (talk) 20:03, 14 June 2014 (UTC)

I found the problem, fixed it, and wrote a script to report the issues (it's reporting now; I'll start removing them by hand once it's finished, thank you for noticing the error). On the other hand, I started adding other images from infoboxes of the German Wikipedia, and I'll work on French after that work is done. Amir (talk) 21:39, 14 June 2014 (UTC)

This is the list of mistakes: ~70 out of 2,200 (3.1%). I will fix them by hand. @Pütz M.: I finished German and started Dutch. Working on French is a little bit harder because they use a local repository, so my bot has to check whether the image exists on Commons or is only a local file. Amir (talk) 00:08, 15 June 2014 (UTC)
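
A sketch of the Commons-versus-local check described here (assuming pywikibot; simplified, not the bot's actual code):

```python
import pywikibot

def commons_file_or_none(wiki, filename):
    """Return the file as a Commons FilePage, or None if it is local-only."""
    local = pywikibot.FilePage(wiki, filename)
    if local.exists() and not local.file_is_shared():
        return None                          # local-only file: do not import
    shared = pywikibot.FilePage(wiki.image_repository(), filename)
    return shared if shared.exists() else None
```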

All of them were fixed by User:Ivan A. Krestinin's bot. Amir (talk) 14:03, 15 June 2014 (UTC)
This mistake is very common; my bot removes invalid links daily. Also please see Property talk:P18#Notes for botmasters. — Ivan A. Krestinin (talk) 14:22, 15 June 2014 (UTC)
Thank you, I'm adding these checks to my code. Amir (talk) 14:37, 15 June 2014 (UTC)
Added all of the checks. Amir (talk) 14:57, 15 June 2014 (UTC)
Hi Ladsgroup, your bot has re-added an image I removed from Wikidata [41]. Could you have a look, please? Also, thanks for the quick response on the German images. Sincerely, Taketa (talk) 22:11, 15 June 2014 (UTC)
Hi, it's malformed input from the Dutch Wikipedia, w:nl:Thomas Bælum; I ran the code twice (or three times) because my internet connection crashed during the process. It won't happen again. Amir (talk) 22:25, 15 June 2014 (UTC)
I guess User:Ladsgroup/sandy like Brazil is done. It seems Ivan has a routine for his KrBot to remove all P18 statements that cause a "file exists" violation. See also Wikidata:Database_reports/Constraint_violations/P18. -- Pütz M. (talk) 00:17, 16 June 2014 (UTC)