Wikidata talk:Bots

On this page, old discussions are archived. An overview of all archives can be found at this page's archive index. The current archive is located at 2019.

Wikidata:List of wikis/python updated

More info on the talk page. Report any errors. Emijrp (talk) 07:25, 19 July 2017 (UTC)

Drop "if possible" for the need for P31/P279

"Bots should add instance of (P31) or subclass of (P279) if possible" I advocate to drop the "if possible", especially given that the sentence already says should and not must. ChristianKl (talk) 15:13, 10 November 2017 (UTC)

Maximum edits per minute

What is the maximum number of edits per minute that a bot with an approved task should set? I think the Pywikibot default of 10 seconds between consecutive edits (put_throttle = 10) is too long for massive edits. --Albert Villanova del Moral (talk) 06:41, 19 January 2018 (UTC)

The default Pywikibot throttle of 10 s between requests is definitely not used by many tools. While a Pywikibot script with that setting edits at most 6 pages per minute, QuickStatements usually makes about 30 edits per minute, and some bots achieve more than a hundred edits per minute, as can be seen in Edit Groups. I think respecting maxlag is more effective at keeping Wikidata fast for human contributors. --Pyfisch (talk) 16:21, 10 December 2018 (UTC)
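As a back-of-the-envelope check of the rates above (a sketch only; in practice put_throttle is set in Pywikibot's user-config.py, not computed like this):

```python
def edits_per_minute(put_throttle_seconds):
    """Maximum write rate given a fixed delay between consecutive edits."""
    return 60 / put_throttle_seconds

# Pywikibot's default put_throttle of 10 s caps a script at 6 edits/min.
print(edits_per_minute(10))  # 6.0

# A 2 s delay corresponds to roughly the ~30 edits/min observed for QuickStatements.
print(edits_per_minute(2))  # 30.0
```

The maxlag approach mentioned above is different in kind: instead of a fixed delay, the bot backs off whenever the database replicas lag, so it slows down exactly when the site is under load.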

Revoke bot User:MatSuBot

Please stop User:MatSuBot from editing, as it just stupidly adds Wikipedia article titles as "labels" in different languages, even though they are not labels. That's why there are, for example, thousands of entries in the format "English Title (film)" or "Title (2018)" and so on in the film data (example [1]). The operator of the bot is not willing to change anything about this; he just writes (User_talk:MatSuBot) "there's nothing easier to remove them by hand", which is ridiculous because it would mean checking every single one of the bot's many thousands of contributions manually.

The biggest problem of Wikidata is that it is becoming a giant messy dump. There are already so many wrong entries and nobody is able or willing to clean up this mess by hand. If you don't act soon, the only solution will be to delete Wikidata completely and start again from scratch. --146.60.144.241 11:51, 1 January 2019 (UTC)

  • It's tricky to find the right balance between
  1. not having any label (despite a sitelink)
  2. labels that should be there as the article title
  3. labels that need some change
  4. labels that could be completely different
Various bots take slightly different approaches. Before Matsubot operated, we had too many of #1. I found Matej fairly receptive to suggestions on balancing #2 and #3. I'm not sure if there is a solution for #4 other than editing them afterwards. --- Jura 13:13, 1 January 2019 (UTC)
Some comments:
  • it just stupidly adds - misleading, see [2] (or [3] if you don't trust me)
    There may be many cases where removing the parenthesis is undesired: for example, category/template titles, songs, chemical substances, actual titles, some interesting cases, etc. After the import, the bot tries to match the parenthesis to the description in the same language. This usually covers cases like "2018 film", "playwright", "Arizona", but not all of them, and first of all, the description actually needs to be there. So I hope these wrong additions will motivate users to insert the description and also clean up the label. I think it's worth it.
  • That's why there are for example thousands of entries... - I saw users who insert such labels by hand.
  • Jura very well revealed the motivation (ie. what was/wasn't before my bot).
  • I admit: I haven't got approval for this task, knowing there had been many bots doing the same thing before. My bot just does it regularly by scanning the previous week's sitelink additions (starting 14 days ago). (Guess why there's such a delay...)
  • Previous "incidents": Topic:Uq6c5fvmxcl073xy, Topic:Uh7junmm0ep4bppw.
Matěj Suchánek (talk) 15:55, 1 January 2019 (UTC)
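The matching step described above (strip the parenthetical disambiguator only when it is explained by the item's description) can be sketched roughly as follows. This is a hypothetical reconstruction, not the bot's actual code; the function name and regex are my own:

```python
import re

def label_from_title(title, description):
    """Sketch of the heuristic: use the sitelink title as the label,
    but drop a trailing parenthetical disambiguator only when it
    matches the item's description in the same language."""
    m = re.fullmatch(r"(.+?) \(([^()]+)\)", title)
    if m and description and m.group(2).casefold() == description.casefold():
        return m.group(1)  # disambiguator is explained by the description
    return title  # otherwise keep the title, parenthesis included

print(label_from_title("Halloween (2018 film)", "2018 film"))    # Halloween
print(label_from_title("Blackbird (song)", "chemical compound"))  # Blackbird (song)
```

Under this sketch, titles whose parenthesis carries real meaning (songs, chemical substances, actual titles) are left untouched unless a matching description exists, which is why the description "actually needs to be there" for the cleanup to happen.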
(Edit conflict) Not even suffixes like "... (film)" or "... (2018)" are removed. My experience with Wikidata is that it is becoming increasingly unreliable, and a lot of the easy-to-find inaccuracies come from bots. I have already corrected a lot of labels added by this bot (or maybe also others), but this is very annoying. (Slow) humans can't correct the mistakes of (fast) robots; it has to be the other way round! --146.60.144.241 16:10, 1 January 2019 (UTC)
  • I had a look at English labels for films and fixed some (<100), not really more than last time (before MatSuBot). There are others that have "()" that should remain.[4] --- Jura 06:55, 2 January 2019 (UTC)
That's nice, but what about all the other languages? --146.60.145.101 23:44, 27 January 2019 (UTC)
You'd need to create a user account if you want to use QuickStatements to fix them. Also, some items might not have P31 yet, so these would need to be added first. --- Jura 02:50, 28 January 2019 (UTC)
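For reference, batch label fixes in QuickStatements use an `Lxx` column (`L` plus the language code). A hypothetical tab-separated batch (the item ID below is the Wikidata sandbox item, and the label values are invented for illustration) might look like:

```
Q4115189	Len	"Example label"
Q4115189	Lde	"Beispiel"
```

Each line sets one label: the item ID, the language-qualified `L` column, then the new label in quotes.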

Add continent to all Wikidata pages with country Ukraine

Could someone with a bot please add the statement "continent" = Europe to all Wikidata pages that have "country" = Ukraine? --Francis McLloyd (talk) 13:45, 16 February 2019 (UTC)

Let's not do that. All these items will have the country set to Ukraine (Q212), and that item has the continent on it. Multichill (talk) 15:40, 16 February 2019 (UTC)