User talk:Ladsgroup


Dexbot deletions

Hello, I am quite curious why a couple of hundreds of redirects I created, have been deleted by Dexbot. As an example I give Q9760271. I merged and redirected, but 10 days later your bot just deletes the redirect. I am actually quite frustrated about redirects being deleted - some admins do that when I have dealt with a RfD request but I hope that is in error, but your bot seems to have been programmed to do this? Please explain. Kind regards, Lymantria (talk) 07:31, 22 October 2014 (UTC)

Hi, I'm about to restore all of the incorrect deletions. Amir (talk) 07:53, 22 October 2014 (UTC)
Ok, thank you. Kind regards, Lymantria (talk) 07:56, 22 October 2014 (UTC)

@Lymantria, Neo-Jay: I just started my bot to restore all of the redirects that had been deleted by my bot. See Special:Log/Dexbot; it'll be finished soon. Amir (talk) 06:49, 23 October 2014 (UTC)

Thank you! --Neo-Jay (talk) 05:01, 25 October 2014 (UTC)


Hi. Could you fix the Farsi label of this item (she seems to be from New Zealand, not the USA)? Best regards, --Wikijens (talk) 08:14, 27 October 2014 (UTC)

The same for Q832256. --Wikijens (talk) 08:40, 27 October 2014 (UTC)
Hi, both done. Amir (talk) 13:47, 27 October 2014 (UTC)

Badge templates on etwiki

Hi! Could you please remove the "Link FA", "Link GA" and possibly "Link FL" templates on the Estonian Wikipedia? The edit summary in Estonian could be "Eemaldatud mall Link FA; keelelinkide äramärkimine nüüd Vikiandmetes" (for a good article just swap "FA" with "GA"). If possible, remove all uses of these templates. Thanks in advance! Pikne 15:32, 27 October 2014 (UTC)

Hi, I just started my bot Amir (talk) 15:41, 27 October 2014 (UTC)
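For anyone curious how such a removal can be scripted, here is a minimal sketch (not Dexbot's actual code; the function name and regex are illustrative) that strips these badge templates from wikitext:

```python
import re

def remove_link_templates(wikitext, names=("Link FA", "Link GA", "Link FL")):
    """Strip {{Link FA|xx}}-style badge templates (and a trailing newline, if any)."""
    pattern = r"\{\{\s*(?:%s)\s*\|[^{}]*\}\}\n?" % "|".join(re.escape(n) for n in names)
    return re.sub(pattern, "", wikitext, flags=re.IGNORECASE)

cleaned = remove_link_templates("Intro\n{{Link FA|en}}\n{{Link GA|de}}\nBody")
```

A real bot run would additionally need to handle edge cases such as templates wrapped in comments or noinclude tags.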

Bot: setting proper label for it wikisource

When adding an Italian label to an item from its it.wikisource page title, please remove the "Autore:" prefix before the name.

Compare this with the manual fix.

Best regards, --Accurimbono (talk) 13:17, 29 October 2014 (UTC)

Hi, I'll fix it. Amir (talk) 16:59, 29 October 2014 (UTC)
It's fixed now. Amir (talk) 00:41, 8 November 2014 (UTC)
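A minimal sketch of the prefix handling being requested, assuming it.wikisource titles like "Autore:Dante Alighieri" (the function name is hypothetical):

```python
def label_from_title(title, prefix="Autore:"):
    """Derive a label from an it.wikisource page title, dropping the namespace prefix."""
    title = title.strip()
    if title.startswith(prefix):
        return title[len(prefix):].strip()
    return title
```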

Bot flag

Hi. Please don't forget to set the bot flag for Dexbot's edits. Thanks. --Pasleim (talk) 18:31, 5 November 2014 (UTC)

Hi, I added the flag. Thank you for the notice; I always forget. Amir (talk) 19:04, 5 November 2014 (UTC)

Incorrect P31 assignment

Watch out: your bot is assigning P31=television (Q289) instead of television series (Q5398426). If possible, adjust your analysis script to first check for longer multi-word matches, before checking for shorter words, when analyzing the first lines of articles. Note that nearly every category of en:12 Monkeys (TV series) was a TV series category. -- LaddΩ chat ;) 22:49, 6 November 2014 (UTC)

I stopped the bot. The analyzer only catches certain grammatical phrases (which should include "series"); I'll check tomorrow to see how I can make it better. Amir (talk) 22:57, 6 November 2014 (UTC)
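The longest-match-first fix suggested above can be sketched like this (illustrative only, not the analyzer's real code; the phrase-to-item mapping is an assumption):

```python
def classify_first_sentence(sentence, phrase_to_item):
    """Return the item for the longest phrase found in the sentence,
    so "television series" wins over plain "television"."""
    lowered = sentence.lower()
    for phrase in sorted(phrase_to_item, key=len, reverse=True):
        if phrase in lowered:
            return phrase_to_item[phrase]
    return None

mapping = {"television": "Q289", "television series": "Q5398426"}
result = classify_first_sentence("12 Monkeys is an American television series.", mapping)
```

Trying longer candidates first means the more specific classification always takes precedence when both phrases occur.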

List of international flags = Auguste Dietrich

Hi Ladsgroup, can you please check what went wrong here: Dexbot, Nov. 6. Cheers --Kolja21 (talk) 10:06, 7 November 2014 (UTC)

I reverted today a couple of similar edits. Either the Wikidata value in commons:Template:Creator was wrongly added by User:Jarekt or the leading Q was missing. --Pasleim (talk) 10:26, 7 November 2014 (UTC)
I fixed the bot so it doesn't introduce these kinds of problems, by checking P31. I think it would be nice if we add a constraint violation report for P1472. Amir (talk) 15:29, 7 November 2014 (UTC)

Commons Creator import

Hi, I found that on 6 November Dexbot in many cases imported Commons Creator (and VIAF, GND, etc.) statements to disambiguation items (in some cases a proper item exists, sometimes not). Please try to avoid these errors in the future, as there is no easy way (using the UI) to move these data to the proper item. --Jklamo (talk) 11:04, 8 November 2014 (UTC)

Hi, thank you for sharing this. I keep track of the errors (which won't happen again, or will happen very rarely) in Wikidata:Database reports/Constraint violations/P1472 and will fix all of them once the run is finished. Amir (talk) 11:09, 8 November 2014 (UTC)
(Same for villages or [[Q4167836|categories]].) OK, nice to hear that you are keeping track of these and planning to fix them. --Jklamo (talk) 11:18, 8 November 2014 (UTC)

Lucile de Chateaubriand interesting bug

Hi Amir, Category:1072 in England (Q8086788) showed up in the constraint violations. Turns out the correct item is Lucile de Chateaubriand (Q18086788), so it appears that the 1 fell off. I noticed at Commons:Creator:Lucile_de_Chateaubriand that the Q is missing. Let me guess: you just chop off the first character, and because the Q is missing you ended up with 8086788? ;-) Can you look into this? I'll fix this one, but some other cases might have happened. cc Ivan, he already solved a couple. Multichill (talk) 11:35, 8 November 2014 (UTC)

Hi Amir, looks like a bit over 100 items are affected. Multichill (talk) 12:45, 8 November 2014 (UTC)
Exactly. So everything on Commons is fixed now? If it's okay, let's wait for the new report and remove anything remaining afterwards. Amir (talk) 15:58, 8 November 2014 (UTC)
No, not at all. Here and on Commons all affected items need to be checked. Take for example Proportional counter (Q643003).
Maybe you can connect to irc, easier to talk. Multichill (talk) 16:51, 8 November 2014 (UTC)
I went through the list. User:Gymel already caught quite a few of them. I fixed the remaining.
Could you please start a new import run? If your bot adds Commons Creator page (P1472) to the right page and we missed something, it should show up in this report. Multichill (talk) 17:52, 8 November 2014 (UTC)
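The failure mode Multichill guesses at above, chopping off the first character whether or not it is a "Q", can be avoided by validating the prefix before stripping it. A hedged sketch (the function name is hypothetical):

```python
def parse_item_id(raw):
    """Parse a numeric Wikidata item id from template input.

    Naively doing raw[1:] turns a missing-Q value like "18086788" into
    "8086788", which is exactly the bug described above.
    """
    raw = raw.strip()
    if raw[:1].upper() == "Q" and raw[1:].isdigit():
        return int(raw[1:])
    raise ValueError("not a valid item id: %r" % raw)
```

Rejecting malformed input outright makes the bad Creator templates show up as errors instead of silently pointing at the wrong item.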

Commons creator import: Electorate of Saxony (Q156199) and other mistakes

Commons Creator page (P1472) was wrongly used by @Jheald: and my revert was redone by your bot. Please check this item; your bot made many more mistakes. This was the problem: the Wikidata number of the Electorate of Saxony. --Oursana (talk) 12:05, 8 November 2014 (UTC)

It should be fixed by now Amir (talk) 16:00, 8 November 2014 (UTC)

Wikipedia is no valid source

Stop creating invalid references like "Italian Wikipedia". --Eingangskontrolle (talk) 07:09, 11 November 2014 (UTC)

Dear Eingangskontrolle, adding imported from (P143) <some wikipedia> is common practice here at Wikidata. That's why English Wikipedia (Q328) is the most linked-to item. This way users can see where the data is coming from, which is better than no source at all. You're of course more than welcome to support a claim with a real reference. Multichill (talk) 17:27, 11 November 2014 (UTC)

Then I will delete this claim in the IT:WP. And common practice is not good practice. I think we need a "Meinungsbild" (a community RfC) in the DE.WP to protect us from the results of this practice. --Eingangskontrolle (talk) 18:56, 11 November 2014 (UTC)

Delete vs merge

Hi Amir, please stop deleting items like invalid ID (Q7267446), but instead redirect them. Pywikibot already contains that functionality. Multichill (talk) 09:21, 11 November 2014 (UTC)

Even though I'm personally against using redirects in databases (a redirect "row" is IMO a weird thing), I'll redirect them from now on. Amir (talk) 09:28, 11 November 2014 (UTC)

James Maubert

Pages Q18511487, Q18511876, Q18511761, Q18508335, Q18511679, Q18511546, Q18511609 are all identical new pages you created for James Maubert, based only on the VIAF number. There might be something wrong with the code. --Jarekt (talk) 19:17, 11 November 2014 (UTC)

I had added this check before this error happened:

    import json
    import urllib2

    try:
        f = urllib2.urlopen("[214:\"%s\"]" % aut.get('VIAF').strip())
        res = json.loads(f.read())
    except (urllib2.URLError, ValueError):
        print "couldn't get url"
        res = {'items': []}
    if res['items']:
        print "horay"
        add_it(page, text, res['items'][0])

I think it happened because of the lag in WDQ. I shouldn't run several scripts back to back. Amir (talk) 08:09, 12 November 2014 (UTC)

Similar situation with "Joseph McNally", Commons creator will link to Q18511889.--Jarekt (talk) 13:24, 12 November 2014 (UTC)
I'm working on Wikidata:Database_reports/Constraint_violations/P214#Unique_value and I hope to fix any mistakes my bot made. Amir (talk) 16:41, 12 November 2014 (UTC)
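One way to guard against the WDQ lag Amir mentions is to also remember identifiers already handled in the current run, instead of trusting the query service alone. A sketch under that assumption (the class and method names are hypothetical):

```python
class DedupGuard:
    """Remember identifiers handled in this run, so a lagging query service
    (which may not yet return freshly created items) can't cause duplicates."""

    def __init__(self):
        self.seen = set()

    def should_create(self, viaf_id, query_service_items):
        if query_service_items:   # item already exists on Wikidata
            return False
        if viaf_id in self.seen:  # created earlier in this run; service is lagging
            return False
        self.seen.add(viaf_id)
        return True
```

With this guard, running several scripts back to back would at worst skip an item, never create seven copies of it.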

Pywikibot error

Hey, since November 12 a copy of the archivebot script fails with

  File "/data/project/huwiki/scripts/archivebot", line 92, in <module>
    import pywikibot
ImportError: No module named pywikibot

on Tool Labs. Can you please help me, what is the problem? --JulesWinnfield-hu (talk) 09:36, 13 November 2014 (UTC)

Hi, at first please be sure you are running it from the pwb wrapper (and not directly):
python /data/project/huwiki/ "archivebot hu" ARGS

instead of

python /data/project/huwiki/scripts/archivebot ARGS

Also, if you're using a virtual environment, you should activate it first (read this manual). If the problem still persists, add me (Ladsgroup) to your service group so I can see what I can do about it. Amir (talk) 09:49, 13 November 2014 (UTC)

Thank you for your answer. Until November 11 it worked fine without pwb. I don't use a virtual environment. I didn't change anything. Why did it stop working? --JulesWinnfield-hu (talk) 10:20, 13 November 2014 (UTC)

In core we highly recommend that users run scripts via the pwb wrapper (it catches errors in arguments before sending them to the script, among other features), so please run it via the wrapper. When you run a script in the scripts folder directly, it can't import pywikibot from your local folder, so it uses the general pywikibot installation (in the dist-utils folder, not your local folder); I think that installation on Labs has been uninstalled, to prevent errors or for other reasons. Amir (talk) 10:33, 13 November 2014 (UTC)
Thank you. I changed the command. Will see tonight. --JulesWinnfield-hu (talk) 11:00, 13 November 2014 (UTC)
It worked. Thanks! --JulesWinnfield-hu (talk) 09:38, 14 November 2014 (UTC)

"Proper label issue"

Hi, apparently your bot has a bug. See Special:Diff/173715678. This occurred on dozens of pages. Best, Lugusto (talk) 06:14, 14 November 2014 (UTC)

Hi, thank you for notifying me. I fixed the bug and corrected all of the mistakes in English, but I need to check other languages as well. Amir (talk) 08:28, 14 November 2014 (UTC)
All mistakes in the other languages are fixed now too. Amir (talk) 07:25, 18 November 2014 (UTC)

Removing valid claims and making invalid claims

I see here that you have removed the claim type of administrative territorial entity (P132) = non-metropolitan district (Q1187580). This is not right. Chiltern District (Q1073057) is a non-metropolitan district (Q1187580) and has been since 1 April 1974. There's a whole load of these errors made by this bot. Danrok (talk) 21:44, 14 November 2014 (UTC)

You're confusing P131 and P132. Dexbot removed "is in the administrative territorial entity" (P131) => Q1187580, which is fine. --JulesWinnfield-hu (talk) 22:01, 14 November 2014 (UTC)
Danrok deleted the P31 which your bot had created. I fixed that by hand, but could your bot duplicate all P132 values to P31 in cases where there is no P31 at all? P132 is really a pain. Andrea Shan (talk) 05:55, 17 November 2014 (UTC)

Languages - move P133 to P279

Could your bot copy all P133 (super language family, OBSOLETE) statements to P279 and then delete the P133 statements? There is currently a mess with P31, P279 and P133 for language items. Removing P133 from the list of properties that one has to check makes editing easier for people actually editing the items. There are still more than 1500 items for "claim[133] and noclaim[279]". Andrea Shan (talk) 05:59, 17 November 2014 (UTC)

Hey, my bot fixed everything it could, and you can see that the report contains much less now. Amir (talk) 12:50, 17 November 2014 (UTC)
Great, thank you. User:Pasleim deleted P133, lots of thanks also to him. Andrea Shan (talk) 02:13, 18 November 2014 (UTC)

Duplicate P132 to P31

There are 6400+ items for "claim[132] and noclaim[31]". Could you copy P132 to P31? But don't delete P132; it is not marked as obsolete. Andrea Shan (talk) 02:25, 18 November 2014 (UTC)

Started my bot; it'll finish soon. Amir (talk) 04:29, 18 November 2014 (UTC)
Done. Check again :) Amir (talk) 06:56, 18 November 2014 (UTC)
Great, thanks a lot! The check still brings up 23 items, but maybe this is some delay in the DB. Again, thanks a lot! Andrea Shan (talk) 09:54, 18 November 2014 (UTC)
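The copy-if-missing logic requested in this section can be sketched over plain claim dictionaries (illustrative only; a real bot would go through the Wikidata API or pywikibot):

```python
def copy_p132_to_p31(claims):
    """Copy P132 values to P31 when the item has no P31 at all; P132 is kept."""
    if claims.get("P132") and not claims.get("P31"):
        claims["P31"] = list(claims["P132"])
    return claims
```

Items that already carry a P31 are left alone, matching the request to only fill in the missing classification.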

P107 removal, P31

"claim[107:618123] and noclaim[31]" - 110,000+ items. That means they are not classified. Could you set P31=Q618123, so the items appear within "claim[31:(tree[618123][][279])]", and delete P107 for these items? I started this task with Widar, but that is very slow. If it is done, GND will only have ~6,000 items left. Andrea Shan (talk) 10:01, 18 November 2014 (UTC)

Please don't do this. The idea behind the deletion of P107 was to get more specific claims with P31. By turning P107=Q618123 into P31=Q618123 we win nothing. During the last 30 days we have replaced 80,000 P107 claims with more specific P31 claims [1] and I'm optimistic we can continue with that. --Pasleim (talk) 10:13, 18 November 2014 (UTC)
(edit conflict) I'd like to do that, but during our talks about migrating away from P107, people decided not to do this task and instead to add a reasonable P31 from the start, and to remove P107 anywhere there is a P31 (I do this task once a month). Amir (talk) 10:16, 18 November 2014 (UTC)
I see no rationale. For editors who work manually, it is easier to change the value of a property than to create a new statement and delete another one. 22,000 in two days or less: that looks like a very special factor that helped reach the 80,000 in 30 days. There could be more such things, but the fewer items are left, the less likely it is. Ladsgroup could fix Property talk:P107#Having separate properties for controlled ontologies is silly in a day. For people who use bots, altering instanceOf to a more specific value is very easy too. Having items without instanceOf or subClassOf, apart from "entity", is silly too. Ladsgroup could help to get more items tagged with instanceOf. People who have nothing to do with GND editing but are interested in fixing this bug will benefit, since their list of items having this bug is reduced by 110,000 items in one day. Unbundling the two tasks
  • assigning any instanceOf and deleting P107 (within 24h)
  • improving instanceOf
allows people to do what they can do best. Andrea Shan (talk) 20:40, 18 November 2014 (UTC)
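The task Amir describes above, removing P107 only from items that already carry a P31, can be sketched the same way (illustrative, over plain claim dictionaries; real bot code would use the Wikidata API):

```python
def strip_p107_if_classified(claims):
    """Drop the obsolete P107 claim, but only when a P31 claim already exists."""
    if claims.get("P31"):
        claims.pop("P107", None)
    return claims
```

Items without any P31 keep their P107 claim, so no classification information is ever lost by the cleanup pass.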

Already in P31 tree 618123

"claim[107:618123] and claim[31:(tree[618123][][279])]" ~4500 items. P107 can be removed. Andrea Shan (talk) 00:24, 19 November 2014 (UTC)

Already done by PLBot. Amir (talk) 12:15, 19 November 2014 (UTC)

Replace P31=lake with P202 value

For all "claim[202]", could you copy the P202 value to P31 and remove any P31=lake if it exists? The items will still be lakes, since the P202 values are subclasses of lake, but the instanceOf values will be more specific. Depending on the outcome at Wikidata:Properties for deletion#{{PfD|Property:P202}}, P202 can be removed. Andrea Shan (talk) 23:08, 18 November 2014 (UTC)

I'll write a script for it tonight Amir (talk) 12:36, 19 November 2014 (UTC)
Done. Amir (talk) 08:41, 20 November 2014 (UTC)
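A sketch of the replacement logic requested above (illustrative; Q23397 is assumed here to be the generic "lake" item, and "Q123" in the usage below is a hypothetical subclass of lake):

```python
GENERIC_LAKE = "Q23397"  # assumption: the plain "lake" item

def specialize_lake(claims):
    """Copy P202 values to P31 and drop a generic P31=lake claim, if present."""
    if claims.get("P202"):
        p31 = [v for v in claims.get("P31", []) if v != GENERIC_LAKE]
        for value in claims["P202"]:
            if value not in p31:
                p31.append(value)
        claims["P31"] = p31
    return claims
```

Items without a P202 claim are left untouched, so the pass only ever narrows a classification, never removes one.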

Creator and institution imports

Hi Amir, how is it going? Nice work on the imports. See this list and this list. You should be able to import those based on the "homecat". Do you think you're able to match some more institution templates? See also this conversation. Multichill (talk) 19:48, 19 November 2014 (UTC)

Hey, I started importing as much information as possible from these lists; it'll be finished by tomorrow and you can see the results. Amir (talk) 21:16, 19 November 2014 (UTC)
It's finished; we have to wait until the next report comes out. Amir (talk) 09:14, 20 November 2014 (UTC)