User talk:GZWDer/2020


Reckless false edits[edit]

I have reverted the reckless false edits you made to Template:Q11608 in which you falsified the birth and death dates. Jc3s5h (talk) 21:57, 3 January 2020 (UTC)[reply]

Two Olivias[edit]

I'm still of the opinion that we should merge Olivia Evelyn Mary Fletcher-Vane (Q76152859) into Olivia Vane (Q76361064), or delete the Peerage item. The heart of the matter is, how much confidence do we need to be allowed to say two records refer to the same person? It is pretty clear from Olivia's academic profiles that she is also known as Olivia Fletcher-Vane, she is from England, and the years listed in her education profile would imply that she must have been born around 1991 - all of these corroborate with the information from the Peerage entry. I would say that's enough confidence. Even if somebody published her date of birth, one could still argue it's insufficient - what if there were two Olivia Fletcher-Vanes born on the same day? There's no proof to the contrary, and we should act based on the most plausible representation of facts, not a rigid requirement on external proofs of equivalence. Deryck Chan (talk) 13:41, 10 January 2020 (UTC)[reply]

@Deryck Chan: It may be of interest that for several years Chinese Wikipedia had two articles (and thus two items) for one person, because of the lack of public sources about the relationship (see zh:Talk:鬼頭桃菜). The community also disputes the validity of self-published sources.--GZWDer (talk) 19:20, 10 January 2020 (UTC)[reply]
In the future, Wikidata is likely to have many more cases of multiple profiles (= items with sets of identifiers/sources) in different fields about one living person. One may argue that a simple merge may compromise the privacy of the people involved.--GZWDer (talk) 19:24, 10 January 2020 (UTC)[reply]
Another matter is that different wikis (and other sources) use different standards when handling BLP data. When one wiki decides to remove some personal information (e.g. this), it may still be included in other Wikipedias, and the data flows into Wikidata. Wikidata may also have dates imported from external databases (reliable or not) that come from sources considered unacceptable on Wikipedia (like information brokers).--GZWDer (talk) 19:33, 10 January 2020 (UTC)[reply]
Could always ask her if she thinks it should be merged or not, there's an email address at https://www.oliviavane.co.uk/research/about.html. Ghouston (talk) 00:57, 28 February 2020 (UTC)[reply]
Deryck Chan knows her personally.--GZWDer (talk) 00:59, 28 February 2020 (UTC)[reply]

Merge[edit]

You have caused such a mess that I can't stop finding duplicate items, e.g. Q75455126; please fix it...--Arnaugir (talk) 20:11, 10 January 2020 (UTC)[reply]

@Arnaugir: Duplicated items should not be taken to RFD. See Help:Merge.--GZWDer (talk) 20:15, 10 January 2020 (UTC)[reply]
This doesn't invalidate my assessment above.--Arnaugir (talk) 09:02, 11 January 2020 (UTC)[reply]

Same remark: can't you do something to fix the situation? Like doing some merges automatically, at least for the items with the same label, same birth date and same death date (with precision to the day)? See https://w.wiki/FM7 (sadly the query times out without the LIMIT, do you know a way to override that?). I did around 50 by hand and there were no false positives (with such precise conditions, it's very improbable). Cheers, VIGNERON (talk) 13:38, 12 January 2020 (UTC)[reply]
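For reference, a hedged sketch of such a query run in pages from Python against the public WDQS endpoint. The LIMIT/OFFSET paging is an assumed workaround for the 60-second timeout (it may itself be slow), and the day-precision condition is noted in a comment but omitted for brevity:

    import requests

    # Pairs of human items sharing label, birth date and death date;
    # STR(?a) < STR(?b) reports each pair only once. The "precision to
    # the day" condition from this thread would additionally check
    # p:P569/psv:P569 [ wikibase:timePrecision 11 ].
    QUERY = """
    SELECT ?a ?b WHERE {
      ?a wdt:P31 wd:Q5 ; wdt:P569 ?dob ; wdt:P570 ?dod ; rdfs:label ?label .
      ?b wdt:P31 wd:Q5 ; wdt:P569 ?dob ; wdt:P570 ?dod ; rdfs:label ?label .
      FILTER(STR(?a) < STR(?b))
    } LIMIT 200 OFFSET %d
    """

    def merge_candidates(offset=0):
        r = requests.get("https://query.wikidata.org/sparql",
                         params={"query": QUERY % offset, "format": "json"},
                         headers={"User-Agent": "dup-candidates-sketch/0.1"})
        r.raise_for_status()
        for row in r.json()["results"]["bindings"]:
            yield row["a"]["value"], row["b"]["value"]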

CAS numbers, InChIs etc.[edit]

Could you modify your future additions to not add statements that cause constraint violations? You have added many CAS numbers, InChIs etc. Many of them are correct, but in some items such statements were deleted for a reason during manual curation of the data (databases have errors, and these have to be curated manually, e.g. by moving statements to other items). Unfortunately, you can't deduce automatically which statements were deleted from an item and are not present in WD anymore (that's why deprecated rank should be used for such cases), but you should take constraints into consideration during automatic imports of data. Otherwise, all manual curation of data would be a Sisyphean task. Regards, Wostr (talk) 22:22, 13 January 2020 (UTC)[reply]

Help[edit]

Hello, please connect an item ID to Wikipedia for this article: Elmirasharifimoghadam

Mehdii158 (talk) 12:19, 16 January 2020 (UTC)[reply]

Peerage project[edit]

Hi! What's the current status of the import project from The Peerage website? I've been doing some reconciliation today from a London history source, and I have hit a number of cases where there are marriages recorded on the Peerage site for two people who have items here, but no spouse (P26) link. I thought these connections had been comprehensively imported. They're really important, for trying to trace chains of Peerage people who all need to be merged. Are there a lot still to do, or were most of them added? Jheald (talk) 19:55, 16 January 2020 (UTC)[reply]

I do have a plan to import spouses, but this requires analyzing free text. Another issue is the question of what counts as a spouse (a father/mother relationship is rather objective), so I also propose checking other sources before importing (The Peerage does not have a consistent format for describing spouses; genealogics.org does, but this would require matching and mapping against another external database).--GZWDer (talk) 21:46, 16 January 2020 (UTC)[reply]

The formats seem to be:

  • "(name) married <..>

If there is some other parent mentioned:

  • (He or she) married <..>
    (He or she) married, (firstly|secondly), <..>

Hope that helps. --- Jura 07:34, 30 January 2020 (UTC)[reply]
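A minimal extraction sketch based only on the two patterns Jura lists above (the patterns are assumptions from this thread, not a verified description of The Peerage's markup; the leading "(name) married" variant would need the subject's name substituted for the pronoun):

    import re

    # "He/She married <..>" or "He/She married, firstly|secondly, <..>";
    # group 1 is the ordinal (if any), group 2 the spouse fragment up to
    # the next comma or full stop.
    MARRIAGE = re.compile(
        r"\b(?:He|She)\s+married(?:,\s*(firstly|secondly|thirdly),)?\s+(.+?)[.,]",
        re.IGNORECASE)

    def spouses(entry_text):
        for m in MARRIAGE.finditer(entry_text):
            yield m.group(1) or "only", m.group(2).strip()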

It would also be very useful to have any "position held" information -- eg to chase down duplicate members of parliament, Lord mayors of London, bishops and archbishops, etc. Jheald (talk) 09:59, 30 January 2020 (UTC)[reply]
It looks like there may also be some further dates of birth & dates of death that were missed, eg on William Ireland Thomas de Courcy Wheeler (Q76354837). Jheald (talk) 08:42, 1 February 2020 (UTC)[reply]
This is a relatively new entry, not in Mix'n'Match yet.--GZWDer (talk) 11:41, 1 February 2020 (UTC)[reply]

Q22101484[edit]

Hi! I have noticed that there might be a problem with the above item created by your bot. The zhwiki article name in this item should be linked to Q7590467, according to the article names of the other Wikipedia versions. I have added the Chinese article name to Q7590467 and deleted the name on Q22101484 (which is now empty); however, when I checked the zhwiki article, the link to enwiki is "Churches in Galveston, Texas", but enwiki doesn't have an article with this name. I am not sure what to do in order to solve these issues and I would like to request your help. Thanks.--BenedictusFX (talk) 18:41, 29 January 2020 (UTC)[reply]

@BenedictusFX: Help:Merge--GZWDer (talk) 18:43, 29 January 2020 (UTC)[reply]
I have merged the two items, but it seems that the interwiki link to enwiki on zhwiki is still "Churches in Galveston, Texas" and not "St. Mary Cathedral Basilica (Galveston, Texas)".--BenedictusFX (talk) 18:54, 29 January 2020 (UTC)[reply]
The local interwiki link should be removed.--GZWDer (talk) 18:56, 29 January 2020 (UTC)[reply]
I have noticed your edit on zhwiki (at first I thought I had to make that edit myself, until I saw your edit there and realised what you meant; I have now reverted my own edit) and the problem is now solved. Thanks for your help!--BenedictusFX (talk) 19:05, 29 January 2020 (UTC)[reply]


Labels from import[edit]

Hi GZWDer, the problem mentioned at User_talk:GZWDer/2019#Prefixes_in_labels is still unresolved. To help you clean it up, I fixed some and another user fixed some more, but many are still there. Please attempt to fix these before adding more. --- Jura 10:48, 2 February 2020 (UTC)[reply]

Notability criteria for new items[edit]

I notice that you've been creating some new items from MnM, with information limited to e.g. just a Geni.com ID and a Genealogics ID, and no apparent connection to any person with an existing Wikidata item.

Perhaps I am wrong in the above, but do you consider these people notable? Because on the face of it, they don't appear to meet our criteria. Jheald (talk) 23:05, 5 February 2020 (UTC)[reply]

@Jheald: They (recent ones) may be found here. At least they have an ancestor that is notable.--GZWDer (talk) 23:09, 5 February 2020 (UTC)[reply]
Who was that? I'm wary about pushing "inherited notability" further than one or perhaps at most two generations or degrees of separation. Jheald (talk) 23:12, 5 February 2020 (UTC)[reply]
  • If I can butt in: "inherited notability" is a concept from English Wikipedia. "Our criteria" is Wikidata:Notability, in that entries on people must link to an outside source to show that they actually exist and were not created as a prank or vandalism. Any genealogical database could be imported, if someone was willing to do the tremendous workload of running mix_and_match and then merging all the duplicates, and running various error-detection queries. There appears to be a moratorium on adding any new large datasets of people because of current computational constraints. The query servers currently time out for even simple date queries for instance_of=human combined with at least two other fields. --RAN (talk) 00:43, 26 February 2020 (UTC)[reply]

wiktionary items[edit]

Please create Wiktionary items in Wikidata with your bot. Amirh123 (talk) 07:59, 10 February 2020 (UTC)[reply]

DOI upper-case constraint[edit]

I notice you added this constraint recently on property P356 - do you intend to run anything to fix these? Or is there an existing bot that does this? ArthurPSmith (talk) 20:57, 18 February 2020 (UTC)[reply]

This will be fixed once I have collected all DOIs (likely via this tool). But I have other tasks for now.--GZWDer (talk) 21:00, 18 February 2020 (UTC)[reply]
Looks like I added a lot of them; I'm working on fixing them now. Thanks for the pointer to the dumper tool, it's a lot more useful than the constraint violations page in cases like this! ArthurPSmith (talk) 15:21, 24 February 2020 (UTC)[reply]
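For anyone else doing this cleanup: DOIs are case-insensitive, and the P356 constraint wants the canonical upper-case form, so the fix per value is a plain case fold. A sketch that emits QuickStatements-style replacement commands (the input pairs are assumed to come from the dumper tool mentioned above):

    def normalize_doi(doi):
        # DOIs are case-insensitive; Wikidata stores them upper-cased.
        return doi.strip().upper()

    def qs_fix(qid, doi):
        fixed = normalize_doi(doi)
        if fixed == doi:
            return None                     # already canonical
        # QuickStatements v1: a leading "-" removes a statement.
        return '-{0}|P356|"{1}"\n{0}|P356|"{2}"'.format(qid, doi, fixed)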

? 91.197.junr3170 (talk) 15:32, 25 February 2020 (UTC)[reply]

Wikidata:Database reports/unmarked supercentenarians[edit]

I noticed you helping with the error correction for people who died before they were born; here is another good one: Wikidata:Database reports/unmarked supercentenarians. If they are over 110 years old, mark them as supercentenarians; otherwise they are more errors, typos, and vandalism. --RAN (talk) 00:28, 26 February 2020 (UTC)[reply]

merge[edit]

https://www.wikidata.org/wiki/Q4416508

https://www.wikidata.org/wiki/Q85800152

Abieule (talk) 19:26, 27 February 2020 (UTC)[reply]

Something wrong: you created a duplicate of an existing item[edit]

Q86372350 is obviously a duplicate of the long-existing Q13283399

--Slb nsk (talk) 10:54, 29 February 2020 (UTC)[reply]

Wikidata:Database reports/Humans with missing claims/P2600[edit]

Wikidata:Database reports/Humans with missing claims/P2600 is outdated; any way to update it? 77.11.15.97 13:07, 5 March 2020 (UTC)[reply]

Large ..[edit]

Hi GZWDer,

if some are available, would you be so kind as to upload articles about Q84263196 as a priority? Thanks. --- Jura 20:54, 7 March 2020 (UTC)[reply]

Creating duplicates[edit]

Hi GZWDer. What is the point of creating an item with no general label defined, when another item already exists on the same topic whose French label equals the title of the article on the French Wikipedia? Here is another example. Rather than creating a useless duplicate, could you please check whether an item exists and, if so, directly link the Wikipedia article to that item. Thanks! Regards. --Ideawipik (talk) 19:02, 9 March 2020 (UTC)[reply]

Thanks, GZWDer, for the answer and the links. I understand the case but probably don't get all the facts. I just regret the artificial inflation of the item ID numbers, and the fact that it is much more difficult to merge items than to link a wiki article (without a WD item) to an existing item.
Perhaps your bot (or script) should not create an item when another already exists with the same name. For example: Q3605395 (Adohoun) / Q86685478 (duplicate item with no name, frwiki article with the same title) and several articles from fr:Modèle:Palette Mono (département).
@Pasleim: couldn't this projectmerge list be made directly from the wikis (or dumps) instead of from Wikidata items, when no WD item has been created?
For sure, we should strongly recommend that page translators/creators link their new pages to existing items. Regards. --Ideawipik (talk) 19:02, 23 March 2020 (UTC)[reply]

TR genea prop|P7977[edit]

I don't see the underlying template but we have P7977. --RAN (talk) 21:27, 19 March 2020 (UTC)[reply]

Andrew Drake[edit]

Thanks for catching my error! I will add the correct generations and link them together. The DAR website is terrible: there are supposed to be 4 other searchable databases but I can't find the search page for them. There is a descendant database, a grave database and a bible database. Even finding the ancestor database from their home page seems impossible. Have you found the search page for their other dbs? If so let me know and I will add a link to my Wikidata home page. --RAN (talk) 16:53, 24 March 2020 (UTC)[reply]

Swedish nature reserves[edit]

Thank you for all the fine work you have done! I was happy to find 4900+ nature reserves of Sweden imported by you, today :) --So9q (talk) 12:16, 27 March 2020 (UTC)[reply]

A lot of new nature reserves are being created right now in Sweden. I just loaded the latest source http://gpt.vic-metria.nu/data/land/NR.zip into JOSM and it reported 5041 features (with names), some 100 more than we have. Could you import the missing ones? The NVRID/Naturvårdsregistret ID (P3613) is unique, I believe, so you could filter on it. Thanks in advance!--So9q (talk) 12:44, 27 March 2020 (UTC)[reply]
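A hedged sketch of the suggested filtering: fetch the P3613 values already in Wikidata and diff them against the IDs extracted from the NR.zip shapefile (how the register IDs are read out of the shapefile is left open; register_ids below is assumed to be that list):

    import requests

    def wikidata_nvrids():
        q = "SELECT ?id WHERE { ?item wdt:P3613 ?id }"
        r = requests.get("https://query.wikidata.org/sparql",
                         params={"query": q, "format": "json"},
                         headers={"User-Agent": "nvrid-diff-sketch/0.1"})
        r.raise_for_status()
        return {b["id"]["value"] for b in r.json()["results"]["bindings"]}

    def missing_reserves(register_ids):
        # Reserves in the national register but not yet on Wikidata.
        return set(register_ids) - wikidata_nvrids()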

Q86688005[edit]

Q86688005 is junk; I've nominated it for deletion. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:31, 7 April 2020 (UTC)[reply]

likewise Q87846062. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:33, 7 April 2020 (UTC)[reply]
and Q88114658 In each case you're creating items based, apparently, on invalid ORCID iDs. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:37, 7 April 2020 (UTC)[reply]
and Q87259374, where you added the ORCID iD "fix spelling". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:38, 7 April 2020 (UTC)[reply]
and Q86653963. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:44, 7 April 2020 (UTC)[reply]
Per What links here this rarely happens (compared with the several million ORCIDs the bot handled). I don't think special treatment is required, as the bot will not revisit the same ID, and the import from the current source (European PubMed Central) is mostly complete (more than 99% for the first 30 million IDs).--GZWDer (talk) 10:50, 7 April 2020 (UTC)[reply]
Why is your bot handling "several million ORCIDs"? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:05, 16 April 2020 (UTC)[reply]
I don't mean several million different ORCIDs (though it may reach that stage in the future), but the bot processed that many authors with ORCIDs in total (each person may appear multiple times) and added that many author (P50) statements.--GZWDer (talk) 22:15, 16 April 2020 (UTC)[reply]
These errors are rare; I don't see why they can't be fixed manually.--GZWDer (talk) 04:42, 13 May 2020 (UTC)[reply]

Your reply does not address why you are adding such bad values, nor say what you are going to do to reduce or prevent such issues in the future.

On Q88225145 you added the ORCID iD http://orcid.org/0000-0003-4122-373. That is clearly junk; it is nowhere near matching the REGEX for that property. Why are you not checking your data before adding it? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:28, 16 April 2020 (UTC)[reply]

On Q88335805 you added the ORCID iD [1], [1], [1], [1], [1], [1], [1], [1]. There are no other statements, other than that the subject is a human. Do you think that is acceptable? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:32, 16 April 2020 (UTC)[reply]


Yeah dude, your bot is bad. It can be good too, but also bad. Check out ORCID ID#Format violations. For example, it occasionally adds full URLs as ORCID IDs, and has trouble distinguishing between lowercase and uppercase X; see for instance Cafer Akkoz (Q92466947) and Elham Anisi (Q90621031). Please give your bot a stern talking to, and maybe set it (or some other bot) on the journey of fixing its mistakes. -Animalparty (talk) 04:36, 13 May 2020 (UTC)[reply]
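All three failure modes in this thread (truncated IDs, lowercase x, full URLs) could be caught before insertion with a normalize-then-checksum step; ORCID publishes the ISO 7064 MOD 11-2 algorithm for the final check digit. A sketch:

    import re

    ORCID = re.compile(r"^\d{4}-\d{4}-\d{4}-\d{3}[\dX]$")

    def valid_orcid(value):
        """Return the normalized iD, or None if it cannot be valid."""
        v = re.sub(r"^https?://orcid\.org/", "", value.strip()).upper()
        if not ORCID.match(v):
            return None                  # catches e.g. 0000-0003-4122-373
        digits = v.replace("-", "")
        total = 0
        for d in digits[:-1]:            # ISO 7064 MOD 11-2
            total = (total + int(d)) * 2
        check = (12 - total % 11) % 11
        expected = "X" if check == 10 else str(check)
        return v if digits[-1] == expected else None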

CAS COVID-19 Anti-Viral Candidate Compounds[edit]

This is another time you are adding statements without any consideration of property constraints. You've managed to add one statement per item with two constraint violations: catalog (P972) with violation of value-type constraint (Q21510865) and CAS Registry Number (P231) with violation of property scope constraint (Q53869507). Firstly, CAS number as a reference is nonsensical as it does not reference anything (and it should be modified or deleted), secondly – how do you plan to fix this situation you've created? Wostr (talk) 16:44, 14 April 2020 (UTC)[reply]

I'm coming to this because I repeatedly find CAS links with nonexistent CAS numbers. Seems you only make promises about cleaning up, with no actual work? --SCIdude (talk) 09:09, 2 June 2020 (UTC)[reply]
@SCIdude: Currently these items do not have constraint violations. Do you mean the CASID is invalid?--GZWDer (talk) 09:14, 2 June 2020 (UTC)[reply]
Yes, see Q90545647 for example. --SCIdude (talk) 09:17, 2 June 2020 (UTC)[reply]
@SCIdude: This means CAS itself published a list of entries with some invalid CAS IDs. I will deprecate these IDs, but you should confirm that they are not found in SciFinder (no other source has a complete list of CAS entries).--GZWDer (talk) 09:20, 2 June 2020 (UTC)[reply]
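Formally broken CAS numbers can at least be screened automatically, since the final digit is a checksum (the other digits weighted 1, 2, 3, ... from the right, mod 10); whether a well-formed number actually exists still needs SciFinder, as noted above. A sketch:

    import re

    CAS = re.compile(r"^(\d{2,7})-(\d{2})-(\d)$")

    def cas_checksum_ok(value):
        # E.g. water, 7732-18-5: 8*1 + 1*2 + 2*3 + 3*4 + 7*5 + 7*6 = 105 -> 5.
        m = CAS.match(value.strip())
        if not m:
            return False
        digits = (m.group(1) + m.group(2))[::-1]   # weight from the right
        total = sum((i + 1) * int(d) for i, d in enumerate(digits))
        return total % 10 == int(m.group(3))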

Q88579292[edit]

What is the point of items like Q88579292? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:04, 16 April 2020 (UTC)[reply]

@Pigsonthewing: Per Wikidata:Property_proposal/Archive/39#P2093, use of author name string (P2093) is temporary in nature. If an item can be found then it should be used. What's the problem with this item? Do you have evidence that it conflates multiple people? (In that case you can ask ORCID to lock the profile.)--GZWDer (talk) 22:12, 16 April 2020 (UTC)[reply]
The only issue I found is someone incorrectly merged it.--GZWDer (talk) 22:21, 16 April 2020 (UTC)[reply]
I didn't say anything about author name string. I didn't say anything about the bad merge, which I already fixed. I asked "what is the point of items like that"? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:30, 16 April 2020 (UTC)[reply]
Use of author name string (P2093) is temporary in nature, so author (P50) is preferred when available. Many users are resolving authors using various tools, but this should be done automatically where possible.--GZWDer (talk) 22:32, 16 April 2020 (UTC)[reply]
again: I didn't say anything about author name string. I asked "what is the point of items like that"? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:39, 16 April 2020 (UTC)[reply]
They work as normal items and more information can be added. Other tools also make use of them (example).--GZWDer (talk) 22:56, 16 April 2020 (UTC)[reply]

Bot appears to have created duplicate item[edit]

Hello. It looks like your bot has created a duplicate item? Q88643388 was created much more recently than Q16729936. I am new to Wikidata so not sure how this all works, but thought I should let you know. Cheers Ballofstring (talk) 22:50, 16 April 2020 (UTC)[reply]

@Ballofstring: In this case Philippa Howden-Chapman (Q16729936) did not link to an ORCID, so the bot could not find the item. See Help:Merge for how to handle this.--GZWDer (talk) 22:53, 16 April 2020 (UTC)[reply]
Ah right. Thanks for the link to the merge page. I have done that now. Many thanks Ballofstring (talk) 23:18, 16 April 2020 (UTC)[reply]

LargeDatasetBot imported invalid DOI please cleanup[edit]

Have you noticed that your bot LargeDatasetBot has created a few invalid DOIs? Sample query (not all from your bot): https://w.wiki/NL5. Any chance you're already in the process of cleaning these up? Wolfgang8741 (talk) 10:28, 19 April 2020 (UTC)[reply]

Once I have imported all entries (currently there are 1.28 million left, which will be completed in no more than 32 days) I will fix them. I am not planning an immediate fix, as I need to deal with the holes (entries that failed to import) first, and if the invalid DOIs are removed now they will just be imported again. Eventually the invalid DOIs may simply be removed.--GZWDer (talk) 14:54, 19 April 2020 (UTC)[reply]
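In the meantime, a structural screen would keep most of these out at import time; the pattern below is the common "10.<registrant>/<suffix>" shape, not Crossref's full grammar, so it only catches truncated or garbled values, not well-formed DOIs that happen to be unregistered:

    import re

    DOI = re.compile(r"^10\.\d{4,9}/\S+$")

    def plausible_doi(value):
        # Registrant prefix "10." + 4-9 digits, a slash, and a suffix
        # containing no whitespace.
        return bool(DOI.match(value.strip()))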

Block of LargeDatasetBot[edit]

Please, see User talk:LargeDatasetBot. Lymantria (talk) 17:23, 4 May 2020 (UTC)[reply]

Discussion at project chat[edit]

Hi! You might want to join this discussion as you have been indirectly mentioned. Bovlb (talk) 18:09, 12 May 2020 (UTC)[reply]

An item for every second[edit]

Please, stop your bot right now. All those items are perfectly useless at the moment, and have absolutely no need for being created. --Sannita - not just another it.wiki sysop 10:05, 22 May 2020 (UTC)[reply]

I am doing this as an echo of Wikidata:Property proposal/local time. People use refine date (P4241) to express the time of day, but it cannot express everything.--GZWDer (talk) 12:47, 22 May 2020 (UTC)[reply]

Requesting Q76335638 for deletion[edit]

I'm new, but I'm not sure this meets the notability guideline. Thanks, WikiMacaroons (talk) 11:21, 27 May 2020 (UTC)[reply]

William Graham[edit]

Just noticed your said-to-be-the-same-as edits to William Graham (Q8010075) - I've checked and all three items do seem to be the same man (the husband of each daughter has an ODNB entry, and both of these state their wife's father was the Glasgow MP). Do you mind if I just merge them directly or is there any reason I should hold off? Andrew Gray (talk) 18:05, 4 June 2020 (UTC)[reply]

@Andrew Gray: Yes, I only added them because I did not find any indication in the article that William Graham had further issue. Note I also asked on WikiTree.--GZWDer (talk) 18:09, 4 June 2020 (UTC)[reply]
@Andrew Gray: If possible, please also check Joseph Lawrence (Q76129051) and Joseph Lawrence (Q13529965).--GZWDer (talk) 18:11, 4 June 2020 (UTC)[reply]
Great, thanks - I'll merge the Grahams, and I've left a note at Wikitree. Lawrence is definitely mergable as well - Pollen's ODNB entry says "he married Maud Beatrice, daughter of Joseph Lawrence, a prominent Conservative MP". I'll do that as well. Andrew Gray (talk) 18:21, 4 June 2020 (UTC)[reply]
@Andrew Gray: Check Paulina Pepys (Q75495083) - the parents given in The Peerage seem wrong, but there are some websites following it.--GZWDer (talk) 18:34, 4 June 2020 (UTC)[reply]
Hmm. The ODNB for her son Edward has "... the second but eldest surviving son of Sir Sydney Montagu (c.1571–1644) of Hinchingbrooke, Huntingdonshire, MP for Huntingdonshire, master of requests, and groom of the bedchamber to James I, and his wife, Paulina, formerly Pepys (d. 1638)"; the two-volume biography of Edward just says she was not very rich and doesn't name her parents, but it does point to a copy of her father's will here. This confirms her father was the John Pepys of Cottenham who died in 1589, with surviving children Elizabeth, Edith, Susan, Paulina, John (died 1604), Thomas (the elder), Thomas (the younger, d 1615), Robert (d 1630), Apollo (d 1644) & Talbot. His wife at the time of his death was Anne.
The footnote on p 104 about the eldest son John (died 1604) states that he married Elizabeth Bendish. Talbot is the same as Talbot Pepys (Q7679037), whose History of Parliament entry has "6th s. of John Pepys (d.1589) of Cottenham, Cambs. and Edith, da. and h. of Edmund Talbot of Cottenham".
So I think this indicates that John Pepys (Q75588710) is the brother of Paulina, and John Pepys (Q75588707) is the father - the opposite of the way the Peerage has it. Andrew Gray (talk) 18:57, 4 June 2020 (UTC)[reply]
@Andrew Gray: Joan Champernowne (Q6204937) - are the wife of Robert Gamage and the wife of Anthony Denny one person? Wikipedia does not mention her marriage to Robert Gamage, and WikiTree has two profiles, but they have the same date of death.--GZWDer (talk) 19:38, 4 June 2020 (UTC)[reply]
I suspect two different people (sisters?). History of Parliament & ODNB for Denny say they married in 1538 and she survived him as his widow in 1549, dying 1553. There's no date given for Gamage's marriage but their son was born 1535 and Robert Gamage died 1553 as well. This means she couldn't have remarried after being widowed, so it would imply that they had divorced, which was pretty unusual at the time and so I would expect the sources to have mentioned it somewhere if it was the case. Andrew Gray (talk) 20:31, 4 June 2020 (UTC)[reply]

Duplicate[edit]

Hi,

did you see that LargeDatasetBot sometimes creates duplicates? See e.g. Q84983076 and Q61657899 (now merged).

Best, 86.193.172.227 14:09, 9 June 2020 (UTC)[reply]

As the other item did not have an ORCID, the bot could not find it. In the future, when more data is imported into these items, they will be easier to find.--GZWDer (talk) 17:42, 9 June 2020 (UTC)[reply]

RE: Q81110974[edit]

I have restored it. I have deleted a bunch of items for non-notable relatives. I deleted that one too because there were no sources other than the IDs. Esteban16 (talk) 21:34, 10 June 2020 (UTC)[reply]

Additions of unreferenced statements[edit]

According to Wikidata:Bots#Statement adding bots, bot operators should "Add sources to any statement that is added unless it has been agreed the data is 'common knowledge' in which case the bot should state where the information has been copied from." When I see a P31 statement saying "case report" or "meta-analysis", I expect this. Please comply with the policy. Charles Matthews (talk) 10:28, 17 June 2020 (UTC)[reply]

@Charles Matthews: If you open the European PMC reference, such as [3], you can see the pubType field (this is where the data comes from). Alternatively, the same thing appears when clicking the PubMed link. I will add a source when doing review articles.--GZWDer (talk) 10:33, 17 June 2020 (UTC)[reply]

Thank you for your opinion on this matter. I asked whether you will comply with the policy. It seems the answer is no. Charles Matthews (talk) 10:38, 17 June 2020 (UTC)[reply]

Invalid page numbers (Zootaxa)[edit]

Please fix this mess. --Succu (talk) 20:47, 17 June 2020 (UTC)[reply]

About the Xu Kexin incident[edit]

You previously felt that my description of the Xu Kexin incident in the item was inaccurate, so I have changed it to: controversy over remarks by a "second-generation official". Please take note. Assifbus (talk) 14:23, 18 June 2020 (UTC)[reply]

Your claim is clearly wrong. Xu Kexin herself said she is a second-generation official, and according to mainland netizens' digging, her father is the director of a local government agency, so I don't think it is going too far to call her a second-generation official. Assifbus (talk) 14:30, 18 June 2020 (UTC)[reply]

@Assifbus: If there are reliable sources describing the background of Xu Kexin's father, I suggest first writing that into zh:许可馨事件#当事人. A reminder: content considered a BLP violation on Wikipedia is not acceptable here either.--GZWDer (talk) 14:32, 18 June 2020 (UTC)[reply]

Then how about writing here: "speech controversy incident in mainland China"? Assifbus (talk) 14:36, 18 June 2020 (UTC)[reply]

@Assifbus: That works.--GZWDer (talk) 14:38, 18 June 2020 (UTC)[reply]

Q59039699[edit]

https://www.wikidata.org/w/index.php?title=Q59039699&diff=1030757198&oldid=964626363

I can't figure out the basis for matching these two David Ridgleys together, do you have any idea? I'm not sure that "David Ridgley" the librarian is the same as "David Latimer Ridgely" the son of the Governor. Gamaliel (talk) 12:41, 19 June 2020 (UTC)[reply]

@Gamaliel: The data is imported from Mix'n'Match system, see here.--GZWDer (talk) 12:43, 19 June 2020 (UTC)[reply]
Looks like I created this mess! I'll clean it up, because the more I dig the more I'm convinced these are different people. Thanks. Gamaliel (talk) 12:47, 19 June 2020 (UTC)[reply]

GZWDer_(flood) going at ~1000 edits per minute[edit]

GZWDer, your bot account currently creates new items at roughly 1000 edits per minute (see here, "max single user rate"), which is beyond the edit resources available for all editors at Wikidata. Please slow it down to something of the order ot 100/min maximum in order to avoid a block. —MisterSynergy (talk) 08:17, 12 July 2020 (UTC)[reply]

@MisterSynergy: I am monitoring the maxlag and currently it seems normal. Are there other issues involved in edit speed?--GZWDer (talk) 08:20, 12 July 2020 (UTC)[reply]
It is related to maxlag, and you have already brought it beyond 5 s. It was ~1 s before you started your bot. Be considerate with the edit resources, and be aware that the sawtooth pattern in maxlag starts at roughly 700 edits per minute *for all editors*. If you are going at 1000/min, you alone already bring the servers into trouble. —MisterSynergy (talk) 08:26, 12 July 2020 (UTC)[reply]
@MisterSynergy: Several weeks ago, when a bot was using an edit speed of <100/min, it took ~20 minutes for maxlag to climb to five seconds. Now it takes even longer. So I do not think such edits are really expensive for the query services. There are far fewer triples involved in a 300-byte item than in a 30 KB item.--GZWDer (talk) 08:31, 12 July 2020 (UTC)[reply]
It depends on other editors as well. Activity in the recent days was rather low, thus you cannot directly compare these situations.
Be aware that this is not a negotiable request. I am aware that there is no formal rate limit defined for bots currently. Yet, there is simply no reason to create pages at that rate which is known to strain the servers beyond all acceptable limits and thus I am prepared to issue a block in case I see this again. —MisterSynergy (talk) 08:36, 12 July 2020 (UTC)[reply]
Soon this will become a regular-running process using cron (instead of running a very large batch once or twice each year), so this will no longer happen.--GZWDer (talk) 08:37, 12 July 2020 (UTC)[reply]

As you seem to be unwilling to change the pacing and continued to go at edit rates of 500–1000 edits per minute with User:GZWDer (flood), I have now blocked the bot account after maxlag went beyond 5s again. Please implement a considerate throttling mechanism so that I can unblock it again. —MisterSynergy (talk) 10:18, 12 July 2020 (UTC)[reply]

@MisterSynergy: I have stopped the current process.--GZWDer (talk) 10:26, 12 July 2020 (UTC)[reply]
Okay, I am going to unblock now. If I see you going at this pace again, I will not hesitate to reblock, and request removal of the botflag from your bot account.
I have no idea how you edit, so I cannot help you how to fix the issue. You either need to drastically reduce the number of concurrent tasks (if there are any), or hard-code some sort of "sleep" command with a reasonable waiting time after each successful edit into your scripts (or do both). —MisterSynergy (talk) 10:33, 12 July 2020 (UTC)[reply]
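For reference, the standard shape of such a throttle: send maxlag with every write, sleep on the Retry-After header whenever the API answers with a maxlag error, and keep a fixed floor between edits so a healthy server is not hammered either. A minimal sketch (the one-second floor is an assumption tuned to the ~100/min ceiling requested above):

    import time
    import requests

    API = "https://www.wikidata.org/w/api.php"

    def throttled_post(session, data, min_interval=1.0, maxlag=5):
        data = dict(data, maxlag=maxlag, format="json")
        while True:
            r = session.post(API, data=data)
            body = r.json()
            if body.get("error", {}).get("code") == "maxlag":
                # Servers are lagged: back off as instructed, then retry.
                time.sleep(float(r.headers.get("Retry-After", 5)))
                continue
            time.sleep(min_interval)     # hard floor, ~60 edits/min at most
            return body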

Merged Q5651245, Q72500102[edit]

You created the duplicate Q72500102 so I merged them now. --SCIdude (talk) 08:19, 12 July 2020 (UTC)[reply]

Merged Q418993, Q72461336[edit]

You created the duplicate Q72461336. --SCIdude (talk) 08:35, 12 July 2020 (UTC)[reply]

Merged Q204178, Q72460898[edit]

You created the duplicate Q72460898. --SCIdude (talk) 08:43, 12 July 2020 (UTC)[reply]

Merged Q7103624, Q72497972[edit]

You created the duplicate Q72497972. --SCIdude (talk) 09:03, 12 July 2020 (UTC)[reply]

Merged Q26840979, Q72516856[edit]

You created the duplicate. --SCIdude (talk) 09:48, 12 July 2020 (UTC)[reply]

Merged Q22330463, Q72487289[edit]

You created the duplicate. --SCIdude (talk) 09:57, 12 July 2020 (UTC)[reply]

Auto-creating new items like that doesn't make sense. You created a duplicate. Emptywords (talk) 12:42, 12 July 2020 (UTC)[reply]

Redundant item[edit]

Q97280767 is redundant to Q63126171. —Justin (koavf)TCM 19:52, 12 July 2020 (UTC)[reply]

See also Q63137866 and Q97187299. Why are you doing this? @MisterSynergy:. —Justin (koavf)TCM 19:59, 12 July 2020 (UTC)[reply]
They are "hidden" duplicates, which are not easy to find until an item is created.--GZWDer (talk) 20:35, 12 July 2020 (UTC)[reply]

Merged Q27109396, Q72488957[edit]

You created the duplicate. --SCIdude (talk) 09:17, 13 July 2020 (UTC)[reply]

Merged Q27155622, Q72443030,[edit]

You created the duplicate. --SCIdude (talk) 09:42, 13 July 2020 (UTC)[reply]

Bot importing *everything" from danish Wikipedia[edit]

Yesterday I noticed you imported *every single* item from the Danish list of new articles. The page says that new pages will be bot-imported three weeks after creation at the earliest, to give us reasonable time to look at them. So, if you do this again, I will ask to have your account blocked. --Hjart (talk) 21:09, 13 July 2020 (UTC)[reply]

@Hjart: I have changed the page age limit to seven days. Are there any people actively handling unconnected pages?--GZWDer (talk) 23:17, 13 July 2020 (UTC)[reply]
3 weeks please. I know I am regularly checking out that page. --Hjart (talk) 07:12, 14 July 2020 (UTC)[reply]
Your bot is creating quite a lot of duplicates, as you should know. Finding the right item to connect a page to can be too tricky for a bot. That's why people are actively handling those pages. Hjart (talk) 05:40, 16 July 2020 (UTC)[reply]
I asked you to set your page age limit to 3 weeks, so why did you import Q98097278 after 1 week? Could you please respect our wishes for our wiki? --Hjart (talk) 18:45, 7 August 2020 (UTC)[reply]
@Hjart: Fixed.--GZWDer (talk) 18:47, 7 August 2020 (UTC)[reply]

Duplicate values[edit]

Wikidata:Database_reports/Constraint_violations/P8150#"Unique_value"_violations

Some 2000 there seem to be duplicates added by you, e.g. at [4] and [5]. --- Jura 07:42, 15 July 2020 (UTC)[reply]

@Jura1: I may fix them, though this is not the highest priority. Eventually these will be removed by KrBot.--GZWDer (talk) 07:46, 15 July 2020 (UTC)[reply]
Apparently, they aren't (they have been there for a month now). How come your tool adds them twice? Can you investigate the bug and fix it? What's the "sandbox identifier" that was added as well? --- Jura 07:50, 15 July 2020 (UTC)[reply]
@Jura1: It was Semantic Scholar corpus ID (P8299); at that time the property did not exist. I will fix them soon.--GZWDer (talk) 07:53, 15 July 2020 (UTC)[reply]
If there isn't too much noise, maybe you could try autofix for P8299. --- Jura 07:57, 15 July 2020 (UTC)[reply]
I will use QuickStatements.--GZWDer (talk) 07:58, 15 July 2020 (UTC)[reply]
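A sketch of what that batch could look like in QuickStatements v1 syntax, moving each value from the placeholder property it landed under to Semantic Scholar corpus ID (P8299); the source property argument is left open, since the thread does not name the sandbox property's ID:

    def qs_move(pairs, old_prop, new_prop="P8299"):
        # pairs: (QID, value) tuples, e.g. from the constraint-violations dump.
        # A leading "-" removes a statement; a plain line adds one.
        for qid, value in pairs:
            yield '-{0}|{1}|"{2}"'.format(qid, old_prop, value)
            yield '{0}|{1}|"{2}"'.format(qid, new_prop, value)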

Delete this[edit]

Q97417109. This element is not used; its value is Q8456102.--Jordan Joestar (talk) 05:23, 16 July 2020 (UTC)[reply]

Category items lacking property P31[edit]

Hi, your bot created the item Category:Eisenhower Fellows (Q97429282), which is good, but it should also have added property instance of (P31), which is mandatory. Can you do that in your next item creations, please? —capmo (talk) 13:47, 16 July 2020 (UTC)[reply]

Merge[edit]

Hi GZWDer. Can you merge Q97599100 and Q11276425? They are the same disease. I wrote a new article but forgot to link it to the Wikidata item. Thanks. --KediÇobanı🐈 11:45, 23 July 2020 (UTC)

@KediÇobanı: Help:Merge--GZWDer (talk) 11:47, 23 July 2020 (UTC)[reply]
Thanks. --KediÇobanı🐈 12:07, 23 July 2020 (UTC)

Please stop adding duplicates[edit]

Your practice of adding items for articles without making any check for pre-existing items degrades Wikidata and costs other users effort to clean up.

If you *must* pick up articles with no items, please at least allow a decent time - a fortnight after creation, perhaps - for users who take more care than your bot does to do a better job than your bot is doing. --Tagishsimon (talk) 00:23, 24 July 2020 (UTC)[reply]

  • 1. Merging duplicates should be as easy as creating new items; 2. It is helpful if items are created earlier, as they may be used by others. If there are projects that actively take care of unconnected pages, please let me know.--GZWDer (talk) 04:15, 24 July 2020 (UTC)[reply]
YOU are the one adding duplicates. It is YOUR JOB of 1. merging them, or 2. not creating duplicates. --SCIdude (talk) 08:31, 24 July 2020 (UTC)[reply]
Unconnected items may be hidden duplicates. Creating items will make them visible.--GZWDer (talk) 08:33, 24 July 2020 (UTC)[reply]
Unconnected items may be checked by humans. "Making them visible" by creating wikidata items causes them to disappear from the wikipedia lists of new articles (in effect making them invisible), tricking us into believing they may have been properly taken care of, when they really have not.--Hjart (talk) 12:27, 24 July 2020 (UTC)[reply]
@Hjart: This assumes that there are enough people regularly checking unconnected pages, which is not the case for many large wikis. If you know of a wiki doing so, let me know.--GZWDer (talk) 13:05, 24 July 2020 (UTC)[reply]
I am aware that many (even quite seasoned) wikipedia editors appear unfamiliar with Wikidata. The solution here is to communicate the purpose of Wikidata, how it works etc, more clearly, not trick all of us into believing everything is ok. --Hjart (talk) 13:22, 24 July 2020 (UTC)[reply]
nl-wiki is pretty much up to speed. But if your bot creates blank doubles one minute after an article is created, things might look better than they really are. And your argument "items can be merged" is also false: duplicate items will not be tracked down as easily as "no item yet" pages will. Better to put energy into creating 10 valuable items than 100 nonsense items "just to blank the list". Edoderoo (talk) 16:33, 6 August 2020 (UTC)[reply]
This is irrelevant. Adding duplicates only makes items visible if a person does the merging work. YOU make the duplicate, YOU do the work of merging, it is YOUR duty. --SCIdude (talk) 15:25, 24 July 2020 (UTC)[reply]
So the other solution is not making items at all; however, this also means it will be much more difficult to find duplicates.--GZWDer (talk) 15:31, 24 July 2020 (UTC)[reply]
Of the duplicates created by you that I've found, many were several years old. Many I found only by luck, and many others only by comparing Wikipedias. --Hjart (talk) 15:48, 24 July 2020 (UTC)[reply]
This does not mean not creating new items is a good idea. By comparing Wikipedias you can find some connections, but once an item is created there are more tools to find duplicates.--GZWDer (talk) 15:51, 24 July 2020 (UTC)[reply]
Please enlighten me then. What are those tools? You are creating tons of items with no data at all. Take a village somewhere with different names in different languages. With no data, how can we tell that it's the same village by looking at Wikidata? --Hjart (talk) 05:03, 25 July 2020 (UTC)[reply]
User:Pasleim/projectmerge, User:Ivan_A._Krestinin/To_merge, toollabs:wikidata-game/distributed/#game=1, none of which can be used without an item. The only tool that can work without an item is Duplicity, where I regularly find a huge backlog.--GZWDer (talk) 05:10, 25 July 2020 (UTC)[reply]
User:Pasleim/projectmerge works only for items with the same sitelinks. User:Ivan_A._Krestinin/To_merge appears to work only for items which actually have data. toollabs:wikidata-game/distributed/#game=1 appears to be for manually adding to items (which, as we can see, may be duplicates anyway). --Hjart (talk) 05:21, 25 July 2020 (UTC)[reply]
User:Ivan_A._Krestinin/To_merge works between sitelinks; for example, if someone created an article for en:1234 BC and de:1234 v. Chr., it will be reported by KrBot. Duplicates can also be found when searching Wikidata. In fact, I much regretted it when someone imported a list of geographic locations from various sources (before we emptied the cebwiki backlog), as there may exist cebwiki articles that users cannot easily find without items. Duplicates will eventually happen; better that they happen sooner rather than later. I hate infinitely growing unconnected-page backlogs.--GZWDer (talk) 05:31, 25 July 2020 (UTC)[reply]
Even then User:Ivan_A._Krestinin/To_merge appears to depend on items actually having data. With differently named sitelinks and no data it's useless. --Hjart (talk) 06:08, 25 July 2020 (UTC)[reply]
In User:Ivan_A._Krestinin/To_merge/enwiki you can find a number of items without statements. In addition, it is only one of the tools for finding duplicates. Appearing on this page makes duplicates discoverable; they may never be discovered if no item is ever created. Users working on connecting new articles can still do so.--GZWDer (talk) 06:10, 25 July 2020 (UTC)[reply]
There are a lot of items on that page. Please point me to actual items in there without statements. Studying the Danish version I found a bunch of false positives. Do also (again) note that (at least on the Danish WP) once items are created, articles are removed from the list of new articles, making them harder to check. --Hjart (talk) 06:25, 25 July 2020 (UTC)[reply]
e.g. Q97277816. If your Wikipedia has enough people handling the Wikidata connections of new pages (such as the Dutch one), that is fine. But most Wikipedias do not.--GZWDer (talk) 06:31, 25 July 2020 (UTC)[reply]
Why did you create Q97277816 without checking whether it was a duplicate? Why are you leaving it to other editors to manually merge items like this? --Hjart (talk) 07:18, 25 July 2020 (UTC)[reply]
In the English Wikipedia alone, there were 30,000-40,000 unconnected pages. It is impossible to check them one by one, and in reality nobody is doing so.--GZWDer (talk) 07:21, 25 July 2020 (UTC)[reply]
And you think that's an excuse for not at least trying? --Hjart (talk) 08:18, 25 July 2020 (UTC)[reply]
It is not more difficult to merge duplicates than to check individual unconnected pages.--GZWDer (talk) 08:27, 25 July 2020 (UTC)[reply]
I think your actions hide problems more than they help solve them. Did you communicate with anybody on e.g. the Danish WP before emptying our list of new articles?
This is an argument for not creating items for very new articles (to let people work on them), but on some wikis there seem to be no people working specifically on unconnected pages at all.--GZWDer (talk) 08:49, 25 July 2020 (UTC)[reply]
  • +1 It's not our job to clear up your shit. Lots of tools and 3rd party uses rely critically on WD being well de-duplicated; and for our own use it is crucial, to make sure that relevant info drawn in from external projects all gets added to the one item, not splattered all over the project. It's a fundamental issue of good data hygiene, of crucial importance to the project's success and wellbeing. I am fed up with putting my own work on hold because there is shit that you have created that has to be cleared up as a priority. And even more fed up that you seem to take no responsibility over what you do. No effort to avoid creating it. No effort to clean up after yourself. Not even any effort apparently to help others fix the shit you create. Rule #1 around here: if you create shit, then you own it, and it's on you to do all you can to clear it up. Take some responsibility, rather than forcing everybody else to clear up after your excrement. Jheald (talk) 16:47, 24 July 2020 (UTC)[reply]
  • GZWDer, it seems a lot of headaches are caused not just by your mass imports, but by your apparent aversion to pre-planning and community discussion. Even giving a courtesy notice like "Hey I'm going to be importing hundreds of thousands of names from a random database, lots of them probably already have Wikidata items, and may have unreliable data" would help, but even better would be to announce: "Here's what I plan to do: let's discuss how best to minimize duplication beforehand, reconcile duplicates and merges afterwards, and tackle this with concerted community effort." But no, you largely work alone, in silence, dumping truckloads of messy data, in hopes that someone, some day, will clean up your messes. Wikidata is a community of editors, not just a personal game for people who like operating bots. Please act accordingly. -Animalparty (talk) 18:22, 24 July 2020 (UTC)[reply]
  • We have discussed what the bot does before: Wikidata:Requests_for_permissions/Bot/GZWDer_(flood)_4.--GZWDer (talk) 18:26, 24 July 2020 (UTC)[reply]
This is a lie. That bot description only talks about one task, but the bot did much more than what is discussed there. See for example this discussion about your chemistry import. You have created 600 duplicates and have not helped with a single merge. --SCIdude (talk) 04:23, 25 July 2020 (UTC)[reply]

In general terms, the "mission creep" in GZWDer's bot permissions should be a concern for those who granted those permissions. There should be an audit. Charles Matthews (talk) 05:07, 25 July 2020 (UTC)[reply]

I agree. I think it's time to reconsider those permissions. --Hjart (talk) 05:24, 25 July 2020 (UTC)[reply]
+1 --Voyagerim (talk) 07:56, 25 July 2020 (UTC)[reply]
+1 --Sabas88 (talk) 16:26, 6 August 2020 (UTC)[reply]
Ideally, if every wiki had enough users to clean up unconnected pages, my bot would be useless. Otherwise, there is a need to clean up the infinitely growing backlog of unconnected pages.--GZWDer (talk) 16:37, 6 August 2020 (UTC)[reply]
Just wanted to say, reasoning on a per-wiki basis is not the complete story, as there is also per-project/theme. There may or may not be editors combing through all en.wiki unconnected articles; but I (and others) certainly do comb through en.wiki articles about video games to connect them (see here). I also comb through empty items connected to en.wiki articles about video games (the overwhelming majority of which you created) and merge a large part of them.
So from my perspective, you're just emptying one backlog to fill up another one that is more annoying to process.
I also disagree with what you say above regarding tool support: on backlog n°1, the PetScan integration with Duplicity makes it overall a breeze to spot articles with potential matches; and for the articles where really no item exists, at the very least I'd create them with P31:Q7889 (again, PetScan makes it easy). Conversely, when processing backlog n°2, I can either start improving the item by adding statements and identifiers (and maybe find out it has a dupe later on, making most of that work a waste of time) or try my luck with Special:Search, an annoying process. I have a stab at backlog n°2 pretty much every day and I have not been able to get it under 100 empty items for months on end.
I would really suggest to reconsider this workflow.
Jean-Fred (talk) 18:06, 6 August 2020 (UTC)[reply]
Empty items can be improved (including merging) via various tools. Not creating them just hides work that should be done. It is useful to convert the backlog to another form when there are not enough people cleaning up unconnected pages. The number of unclaimed items is actually decreasing even though many new ones are created (see [6]).--GZWDer (talk) 18:16, 6 August 2020 (UTC)[reply]
I am well aware that empty items can be improved, thank you very much.
This conversation is going nowhere. You're basically refusing to even acknowledge (let alone address!) my position and my concerns (and those of others). I am telling you in no uncertain terms that, to me, you're just emptying one backlog to fill up another one that is more annoying to process, and I try to explain why: I outline workflows that work for me and for the project I am involved with. You just plain ignore what I say and repeat the same talking points you've been parroting to everyone here. This is uncollaborative, and borderline rude.
Jean-Fred (talk) 19:35, 6 August 2020 (UTC)[reply]
Jean-Fred, you hit the nail on the head. Edoderoo (talk) 19:39, 6 August 2020 (UTC)[reply]
The issue is that the workflow is not scalable, at least on most wikis. On many wikis there are no people doing that at all.--GZWDer (talk) 04:20, 7 August 2020 (UTC)[reply]
You should at least try to communicate your intent to the managers of the various wikis before doing anything. On the Danish wiki, for years we've had a well-maintained list of new articles not yet in Wikidata: da:Wikipedia:Database_rapporter/Nye_artikler_som_ikke_er_i_wikidata, which we look at and clean out from time to time. You didn't know anything about that, did you? Last time you cleaned it out, we had a fairly large backlog and you just assumed nobody was looking at it? You didn't even bother asking for permission? Some of those new articles were mistakes or spam that were just awaiting deletion, and you imported all of it to Wikidata. --Hjart (talk) 06:31, 7 August 2020 (UTC)[reply]
If there is someone actively cleaning up, please let me know; I will increase the minimum age of the page.--GZWDer (talk) 06:40, 7 August 2020 (UTC)[reply]
Please ask. Managers of the various wikis may not be aware of the real reason their lists get cleaned out. They may very well just think one of their members did it properly. --Hjart (talk) 07:18, 7 August 2020 (UTC)[reply]
I have revoked the flag, see Wikidata:Bureaucrats' noticeboard#Please reconsider bot permissions for User:GZWDer (flood)--Ymblanter (talk) 22:12, 7 August 2020 (UTC)[reply]

Q97621619 and similar[edit]

Hello. I notice that your bot account seems to create items for new Wikinews articles. I would suggest changing this to only create an item for a published article rather than drafts like this. It is more than likely that this WN draft will be deleted, and consequently leave an empty WD item. Green Giant (talk) 21:45, 26 July 2020 (UTC)[reply]

@Green Giant: I will skip articles with Template:Develop (Q20765099) or Template:Abandoned (Q17586361). If there are other templates for drafts, please let me know.--GZWDer (talk) 21:47, 26 July 2020 (UTC)[reply]
Cheers. I would also add any page that has one of the following: Template:Editing (Q17588240), Template:Tasks (Q13420881), Template:Minimal (Q17589095), Template:Future (Q17586294), Template:Review (Q13421187), n:Template:Quick review and Template:Breaking review (Q17586502). Green Giant (talk) 22:03, 26 July 2020 (UTC)[reply]
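One hedged way to implement that skip list: ask the Wikinews API which templates a page transcludes before creating an item. The endpoint and the exact template titles below are assumptions based on this thread:

    import requests

    API = "https://en.wikinews.org/w/api.php"
    DRAFT_TEMPLATES = {
        "Template:Develop", "Template:Abandoned", "Template:Editing",
        "Template:Tasks", "Template:Minimal", "Template:Future",
        "Template:Review", "Template:Quick review", "Template:Breaking review",
    }

    def is_draft(title):
        r = requests.get(API, params={"action": "query", "prop": "templates",
                                      "titles": title, "tllimit": "max",
                                      "format": "json"})
        page = next(iter(r.json()["query"]["pages"].values()))
        used = {t["title"] for t in page.get("templates", [])}
        return bool(used & DRAFT_TEMPLATES)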

Deny option exists?[edit]

I've unlinked a file you created (Q97798633) leaving it dangling in the wind, and I'd unlink Q97861414 as well, only it would probably just encourage yet another uselessly created item, so I left it. Please see this discussion at the community portal. Thanks, Mathglot (talk) 00:58, 30 July 2020 (UTC)[reply]

@Mathglot: See Help:Merge and WD:RFD. For this article, I have added a soft redirect template so that it will not be imported again.--GZWDer (talk) 01:08, 30 July 2020 (UTC)[reply]
Thank you! Mathglot (talk) 01:12, 30 July 2020 (UTC)[reply]

Connect ms:HTTP 404 to Q206219[edit]

Hi. Can you help me connect ms:HTTP 404 to Q206219? --Syed Muhammad Al Hafiz (talk) 14:11, 30 July 2020 (UTC)[reply]

@Syed Muhammad Al Hafiz: Help:Merge--GZWDer (talk) 14:12, 30 July 2020 (UTC)[reply]

Thank you.--Syed Muhammad Al Hafiz (talk) 14:13, 30 July 2020 (UTC)[reply]

DSSTOX compound ids[edit]

You imported these IDs yesterday. Where is your bot permission for this kind of task? --SCIdude (talk) 05:53, 6 August 2020 (UTC)[reply]

I have discussed the intention in the property proposal.--GZWDer (talk) 06:00, 6 August 2020 (UTC)[reply]
So there is no permission? What is your opinion on the notion that imports to items that are part of a WikiProject require at least a note on their talk page? Also, if you have a DSSTOX entry, how do you identify the item where you put that statement? --SCIdude (talk) 07:26, 6 August 2020 (UTC)[reply]
The DSSTOX dump contains a mapping from DTXSID to DTXCID.--GZWDer (talk) 07:28, 6 August 2020 (UTC)[reply]
And you have used that mapping? So, you pretend to have no opinion on communicating with a WikiProject, despite failing to do so frequently. What is your approach to teamwork, in general? --SCIdude (talk) 07:40, 6 August 2020 (UTC)[reply]


Mass creation of items without labels[edit]

Please check https://editgroups.toolforge.org/b/QSv2/28721/ and these from https://quickstatements.toolforge.org/#/batch/28721 (+ possibly others). --- Jura 12:12, 11 August 2020 (UTC)

I no longer use QuickStatements to create new items. For existing items, they will be fixed by MatSuBot.--GZWDer (talk) 12:14, 11 August 2020 (UTC)[reply]
@Jura1: Do you find other examples?--GZWDer (talk) 17:42, 11 August 2020 (UTC)[reply]

James McCutcheon Baker[edit]

There is a detailed record of James McCutcheon Baker's military career at [7] and extensive primary records at FamilySearch [8]. Brother of Page Mercer Baker, editor-in-chief of the New Orleans Times-Democrat, who certainly deserves an item (see front page obituary in Times-Democrat May 29, 1910; Find a Grave; Confederate Military History, Volume 10, pages 330-332, etc.). Son of James McCutcheon Baker (c. 1777-1861), who served in the War of 1812 and Mexican War but doesn't seem to have lived a very memorable life. — Levana Taylor (talk) 04:25, 19 August 2020 (UTC)[reply]

The Peerage[edit]

Please look at Wikidata:Administrators%27_noticeboard#Please_restore_the_red_link%2Flinks_in_this_family_tree, where individual people from The Peerage upload are being deleted, creating gaps in the family trees. --RAN (talk) 17:12, 20 August 2020 (UTC)[reply]


GZWDer (flood)[edit]

Hi, could you please stop your bot "GZWDer (flood)" from creating Wikidata items based on articles from the Bulgarian Wikipedia, as it's doing more damage than help. Thanks. --StanProg (talk) 11:52, 28 August 2020 (UTC)[reply]

@StanProg: This bot is currently not running. You are welcome to comment on Wikidata:Requests for permissions/Bot/RegularBot 2 (its intended successor) and report the issues you have identified (but there is no need to repeat what others have already said).--11:55, 28 August 2020 (UTC)

2020-09 Catholic Encyclopedia[edit]

Hello,

Please explain special:diff/1275840022 + special:diff/1275984724 (maybe there are other diffs of this kind). Visite fortuitement prolongée (talk) 07:47, 12 September 2020 (UTC)[reply]

Plenty of users link such encyclopedia articles to the items about their topics, which is wrong. The articles should have their own items. Creating them will prevent incorrect sitelinks from being added.--GZWDer (talk) 09:55, 12 September 2020 (UTC)[reply]
Thank you for your quick answer. Visite fortuitement prolongée (talk) 10:42, 12 September 2020 (UTC)[reply]

Translation notification: Template:Hello[edit]

Hi GZWDer,

You are receiving this notification because you registered as a translator for Chinese on Wikidata. The page Template:Hello is available for translation. You can translate it here:

The priority of this page is medium.


Your help is greatly appreciated. Translators like you keep helping Wikidata become a truly multilingual community.

You can change your notification preferences.

Thank you! Wikidata translation coordinators, 19:10, 22 September 2020 (UTC)

American National Biography[edit]

Why are you removing this from P1343? It is a valid use of this property. Gamaliel (talk) 12:34, 26 September 2020 (UTC)[reply]

@Gamaliel: See Wikidata:Project_chat#Possible_duplicated_information.--GZWDer (talk) 22:36, 26 September 2020 (UTC)[reply]

Same author added several times[edit]

I noticed a few times that your bot sometimes adds the same author twice, such as here: Microarray analysis of serum mRNA in patients with head and neck squamous cell carcinoma at whole-genome scale (Q33597997) - two of the authors there are stated twice. Why do you think this keeps happening and what can we do to fix it? Vojtěch Dostál (talk) 14:45, 26 September 2020 (UTC)[reply]

See also the discussion at Wikidata:Requests for permissions/Bot/Orcbot for this kind of issue.--GZWDer (talk) 22:38, 26 September 2020 (UTC)[reply]
Thanks, @ArthurPSmith: who seems to have noticed those errors too. Vojtěch Dostál (talk) 14:09, 28 September 2020 (UTC)[reply]
The case Vojtěch Dostál points out seems odd to me, in that I can't see how your bot was confused in this case - there's only one copy of the duplicated author listed in the source (the referenced PubMed link), and no co-author with the same last name, etc. Cases I've seen were more where there was a problem in the source, or there were several authors with the same surname (or even the same exact name string). Generally only one of the matching authors in those cases is the correct match; however I have seen a few papers where the same author was (accidentally I assume) duplicated in two different places, so there are legitimate cases where you could have the same P50 value with two different series ordinal numbers in the same author list. But usually that indicates something went wrong... ArthurPSmith (talk) 17:02, 28 September 2020 (UTC)[reply]
@ArthurPSmith: Yup. I've seen this problem at least 5 times when going through recent imports. In all of these cases, the author was only stated once in the original publication and there weren't two authors of the same name present. I will post another examples here when I come across them. Vojtěch Dostál (talk) 17:55, 28 September 2020 (UTC)[reply]
@ArthurPSmith: Same here: Q35094372. Here, it is caused by the same author having two separate records (Q63982890 and Q79878785), which for some reason both get added and numbered. Vojtěch Dostál (talk) 20:43, 29 September 2020 (UTC)[reply]
@Vojtěch Dostál: Right, that's the kind of case I was talking about regarding the source - if you go to the linked EuropePMC reference you will see they are the source of the error (well, ultimately it's because this author has two different ORCID iDs, and EuropePMC interpreted that strangely). ArthurPSmith (talk) 15:11, 30 September 2020 (UTC)[reply]
@ArthurPSmith: Ah, now I get it, interesting. Vojtěch Dostál (talk) 18:45, 30 September 2020 (UTC)[reply]
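For anyone cleaning these up: the affected items can be listed with the query service. A minimal sketch in Python (my own illustration, not part of the bot; the LIMIT is there because the unbounded query may time out):

  # Find works that cite the same author item (P50) in two different statements.
  import requests

  QUERY = """
  SELECT DISTINCT ?work ?author WHERE {
    ?work p:P50 ?st1 , ?st2 .
    ?st1 ps:P50 ?author .
    ?st2 ps:P50 ?author .
    FILTER(?st1 != ?st2)
  }
  LIMIT 100
  """
  r = requests.get(
      "https://query.wikidata.org/sparql",
      params={"query": QUERY, "format": "json"},
      headers={"User-Agent": "duplicate-author-check (example script)"},
  )
  for row in r.json()["results"]["bindings"]:
      print(row["work"]["value"], row["author"]["value"])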

Importing from Geni[edit]

Hello! Does your bot import from Geni in a random way or is it possible to import things by demand? Thanks! -Theklan (talk) 10:20, 29 September 2020 (UTC)[reply]

@Theklan: The code is here. To use it:
  • Open [9] and create a terminal (or use the terminal on your own computer) and start Python 3
  • Paste all of the code into the terminal
  • Log in to Geni and get an access token here (the token expires after 24 hours, so you need to get a new one every day)
  • Set the access token via access_token="XXX"
  • Find an item with Geni ID
  • Find the ahnentafel number you want to import (e.g. father is 2, mother is 3, paternal grandmother is 5)
  • Run dogeni("Item QID or Geni ID", ahnentafel number) - the QID is preferred, as the code uses the query service to map Geni IDs to QIDs, and that mapping may lag
  • If you merged any items created in this process, please wait some time (to let the query service catch up) and run geni_items={} to clear the cached mapping
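Put together, a session might look like this (a minimal sketch; the QID and token are placeholders, and it assumes the helper code from the repository above has already been pasted into the Python 3 terminal):

  access_token = "XXX"  # token from the Geni page above; it expires after 24 hours
  dogeni("Q42", 2)      # placeholder QID; ahnentafel number 2 imports the father
  dogeni("Q42", 5)      # ahnentafel number 5 imports the paternal grandmother
  geni_items = {}       # run this after merging any items created in the process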

--GZWDer (talk) 11:41, 29 September 2020 (UTC)[reply]

Thanks! I will try! -Theklan (talk) 14:57, 29 September 2020 (UTC)[reply]
I got lost in the second step... :( -Theklan (talk) 20:42, 9 October 2020 (UTC)[reply]

@Theklan:

--GZWDer (talk) 22:35, 9 October 2020 (UTC)[reply]

Items for category redirects[edit]

Hi GZWDer,

It seems that this (19:25, 22 February 2018‎) created an item for a category redirect [10] ( 17h41min de 22 de fevereiro de 2018‎). There are a few similar ones. I'm cleaning up some of them, but more is probably needed. --- Jura 14:06, 2 October 2020 (UTC)[reply]

See User_talk:GZWDer/2019#Q61016578. This may be simply checked by finding items linked to pages in Categoria:!Redirecionamentos de categorias.--GZWDer (talk) 14:59, 2 October 2020 (UTC)[reply]
So, are you finally going to take care of it, or should I add it to Wikidata:Bot requests? --- Jura 19:07, 2 October 2020 (UTC)[reply]
This may be found using PetScan, but currently PetScan is down.--GZWDer (talk) 19:10, 2 October 2020 (UTC)[reply]
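In the meantime, a rough sketch of the same check against the ptwiki API directly (my own illustration, not GZWDer's method; result paging is omitted for brevity):

  # List members of the redirect category and report pages that still carry a Wikidata item.
  import requests

  API = "https://pt.wikipedia.org/w/api.php"
  params = {
      "action": "query",
      "generator": "categorymembers",
      "gcmtitle": "Categoria:!Redirecionamentos de categorias",
      "gcmlimit": "max",
      "prop": "pageprops",
      "ppprop": "wikibase_item",
      "format": "json",
  }
  pages = requests.get(API, params=params).json().get("query", {}).get("pages", {})
  for page in pages.values():
      item = page.get("pageprops", {}).get("wikibase_item")
      if item:  # a category redirect that is still linked to an item
          print(item, page["title"])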

DAHR Database[edit]

Did you see that the people at the DAHR database want to load the entire database the way you loaded The Peerage? Do you want to work with them? Do you have a tutorial on how you did it? --RAN (talk) 15:41, 3 October 2020 (UTC)[reply]

Translation notification: Wikidata:Eighth Birthday[edit]

Hi GZWDer,

You are receiving this notification because you registered as a translator for Chinese on Wikidata. The page Wikidata:Eighth Birthday is available for translation. You can translate it here:

This page has low priority.

Your help is greatly appreciated. Translators like you are working hard to help Wikidata become a truly multilingual community.

You can change your notification preferences.

Thank you! Wikidata translation coordinators, 10:57, 11 October 2020 (UTC)

Translation notification: Wikidata:Vandalism[edit]

Hi GZWDer,

You are receiving this notification because you registered as a translator for Chinese on Wikidata. The page Wikidata:Vandalism is available for translation. You can translate it here:

This page has high priority.

Your help is greatly appreciated. Translators like you are working hard to help Wikidata become a truly multilingual community.

You can change your notification preferences.

Thank you! Wikidata translation coordinators, 02:46, 15 October 2020 (UTC)

German lexemes[edit]

Can you please stop mass importing empty German lexemes? You're creating TONS of work for everyone else. - Nikki (talk) 17:57, 3 November 2020 (UTC)[reply]

@Nikki: Stopped, but what should be done with the existing lexemes and with further new imports? Note that forms are planned to be added in a later stage.--GZWDer (talk) 17:59, 3 November 2020 (UTC)[reply]
Then tell us how you're going to add forms, instead of spamming thousands of almost-worthless lexemes… (For instance, and this is just from a cursory glance through your contributions, you're creating old-spelling lexemes like (L345233) and (L345230) using the wrong language code.) --Lucas Werkmeister (talk) 18:06, 3 November 2020 (UTC)[reply]
@Nikki, Lucas Werkmeister: the data source does include forms. However, there is an open question of how to model them: for example, heiter (L2273) (this lexeme already exists and the bot will not create a new one) contains only one form in Wikidata, four in the German Wiktionary, 208 in the English Wiktionary, and 219 in the source. Until we have an agreement on how to model them, I will only import the lexemes (forms can be added at any time). You can download the source I used (this and two others are also imported to s54101__flexion_p).--GZWDer (talk) 18:13, 3 November 2020 (UTC)[reply]
Prior to resolving the concerns of the above users (plus @Robin van der Vliet, VIGNERON, Bodhisattwa: who have also expressed concerns about these imports in other fora), I have blocked your bot from editing the Lexeme: namespace indefinitely. Mahir256 (talk) 21:04, 3 November 2020 (UTC)[reply]
@Mahir256: Do you have copy of relevant discussion "in other fora"? So that I can see what is the concern.--GZWDer (talk) 23:55, 3 November 2020 (UTC)[reply]
@GZWDer: It is perhaps better if you join these fora (which, incidentally, are Wikidata-related Telegram channels) rather than have me cause a privacy issue by copying messages over here. Mahir256 (talk) 01:34, 4 November 2020 (UTC)[reply]
@Mahir256: Could you summarize the points they raised? I do not use Telegram.--GZWDer (talk) 06:40, 4 November 2020 (UTC)[reply]
The summary is exactly what Nikki said: « Can you please stop mass importing empty German lexemes? ». Your creations do not meet the minimal expectations for a lexeme: in particular, there are no statements, no senses and no forms; in addition, you did not consult the community beforehand, you do not cite your source, and you made some mistakes. Separately, these points are not a problem, but all together this is unacceptable. Cdlt, VIGNERON (talk) 07:25, 4 November 2020 (UTC)[reply]
@VIGNERON: We do need some agreement on how to add forms (I have posted to Wikidata_talk:Lexicographical_data#German_adjective). But on the other hand, there are many lexemes with no forms at all, such as fest (L1776) and platt (L311959), so my plan was to create bare lexemes first and fill in the forms later, once an agreement is reached.--GZWDer (talk) 08:40, 4 November 2020 (UTC)[reply]
Note that the Flexion database does not provide senses, and even if it did, importing them might be a copyright violation.--GZWDer (talk) 08:44, 4 November 2020 (UTC)[reply]
As I said, « Separately these points are not a problem ». For instance, I would have no problem if you added no forms and senses but gave the source and some statements. But here you gave absolutely nothing, and that is problematic. fest (L1776) and platt (L311959) are bad examples: the fact that other people made bad creations does not mean that others can (or should) do the same. And at the very least, those lexemes give an identifier (still bad, but a bit less so). Thanks for starting the discussion on Lexicographical data; it's a bit late, but better late than never. PS: speaking of copyright, is this source compatible with Wikidata? Cheers, VIGNERON (talk) 09:35, 4 November 2020 (UTC)[reply]
Per m:Wikilegal/Lexicographical_Data#Conclusion, the following are not under copyright: (1) individual lemmas and (2) inflection forms.--GZWDer (talk) 11:43, 4 November 2020 (UTC)[reply]

Why?[edit]

Hello GZWDer, I noticed your bot is creating a lot of Q-items that seem to be unwanted. Take Q101228743: next to the label with this woman's name there is a link to a genealogical site and a link to a child, and that's about it. There is no description, so it is unclear what makes her notable enough to have a Q-item; there is not even an indication of what century she lived in or whether she is still alive. No Wikimedia project seems to have a page about her or that child. Why do you think your bot should create an item like this? Please stop your bot from creating items like this for now. - Robotje (talk) 11:42, 6 November 2020 (UTC)[reply]

@Robotje: The person you mention is described in Geschichten Und Thaten Kaiser Richards Aus Dem Geschlecht Der Könige in Engelland So Insgemein Die Historie Des Interregni Genennet Wird; Nach Denen Schrifften Derselben Zeit Ausgefertiget (Classic Reprint) ([11]), although the URL is currently not working. More importantly, Wikidata intentionally has a lower notability standard than Wikipedia.--GZWDer (talk) 13:43, 6 November 2020 (UTC)[reply]
So you think there used to be a page in the German language on a Russian website that mentioned this lady, who is supposed to have a remote link to kings of England. The link is not working, and archive.org never made a copy of it. I hope you understand this is absolutely not convincing to me. How can I nominate that Q-item for deletion? - Robotje (talk) 14:37, 6 November 2020 (UTC)[reply]
@Robotje: Wikidata does not require any significant coverage; only one sourced statement is enough.--GZWDer (talk) 14:40, 6 November 2020 (UTC)[reply]
So far you have not provided me with a sourced statement! On top of that, it looks like you are deliberately making it extra hard for me to find out how to nominate it for deletion. OK, I will find out myself. No wonder I see a lot of complaints about you and your bots on this page. - Robotje (talk) 14:50, 6 November 2020 (UTC)[reply]
@Robotje: As you can see, Genealogics is not a primary source. It gets its data from Voorouderstafel van het Belgisch Koningshuis, Brussel, 1998, which is a book I (currently) have no access to.--GZWDer (talk) 14:55, 6 November 2020 (UTC)[reply]
Sure: first you think she was mentioned in a German text on a Russian website about kings of England, but the link is not working and the Web Archive never considered it important enough to make a copy. Now you come up with a book in Dutch referenced on a genealogical website (in general, those websites are not so reliable). And now it is no longer about a remote link to a king of England but about the Belgian royal family. The first person to become a king in Belgium did so around 1830; the father of Dorothea's daughter died in 1589, so there are some 8 or 10 generations in between! And that book in Dutch you don't have access to. You persist in not helping me find out how to nominate the item. Never mind, I found Wikidata:Requests_for_deletions myself. If you create more of these pages/items, I will contact you again, but I hope that won't be necessary. - Robotje (talk) 15:15, 6 November 2020 (UTC)[reply]

Raja[edit]

Hi there,

I can't understand this revert. Everything looks right to me. Could you please elaborate?

Best, Nomen ad hoc (talk) 10:01, 20 November 2020 (UTC).[reply]

@Nomen ad hoc: The previous version contained two advisors with the same name, Q16018830 and Q53772616. Please check which one is correct.--GZWDer (talk) 10:02, 20 November 2020 (UTC)[reply]
Ah yes, I didn't notice that. Thanks! Nomen ad hoc (talk) 13:02, 20 November 2020 (UTC).[reply]

MGP imports[edit]

It's a great idea importing this information, but when you do so, please add advisors as doctoral advisor (P184) instead of student of (P1066), as you did for instance here. Similarly, add students as doctoral student (P185) instead of student (P802), as you did here. Thanks. --Bender235 (talk) 18:10, 21 November 2020 (UTC)[reply]

@Bender235: Many pre-modern scholars hold some degree that is not a doctorate. Since Dr. rer. pol. may be considered a doctorate, I have updated the code; there are currently 30 such cases. (In the future I may revisit the list for P1066.)--GZWDer (talk) 18:18, 21 November 2020 (UTC)[reply]
This is certainly correct when you look at scholars from the days of Euler and Gauss, but for 20th-century and even 19th-century academics, MGP almost exclusively lists doctorates (in their national varieties). --Bender235 (talk) 18:22, 21 November 2020 (UTC)[reply]
I found some entries as recent as the 1910s whose highest degree is only a Master's or even a Bachelor's, but which are included because they are advisors of someone else.--GZWDer (talk) 18:39, 21 November 2020 (UTC)[reply]

90x identical "ACS Publications", 70x "Spotlights on Recent JACS Publications"[edit]

This search finds, among a few other things:

1. about 90 items labeled "ACS Publications", which are legitimate articles but mislabeled

2. about 70 items labeled "Spotlights on Recent JACS Publications", which seem to be the result of misidentifying a recurring item

3. Dozens of items labeled "Publications of <some name>", (example: Publications of William M. Gelbart (Q87835990)) which seem to originate from a similar mistake as (2), above

--Matthias Winkelmann (talk) 04:19, 22 November 2020 (UTC)[reply]

Geni import[edit]

Wouldn't it be better to import women under their maiden names instead of their married names? It would match the import from The Peerage, and we would be better able to merge duplicates. --RAN (talk) 20:14, 23 November 2020 (UTC)[reply]

As an aside to the above, names from Geni.com often bear little resemblance to names as commonly used in reliable sources. Geni should be considered one of the least reliable sources for personal names. -Animalparty (talk) 03:20, 24 November 2020 (UTC)[reply]

Mathematics Genealogy Project import[edit]

Note that some universities have wrong or non-unique names on the MGP website, which results in imported statements linking to disambiguation items (esp. Polytechnic (Q17071325)!): https://w.wiki/nqB --Mormegil (talk) 11:58, 24 November 2020 (UTC)[reply]

You can create a list of the schools affected. After the import, they can be corrected or removed.--GZWDer (talk) 15:02, 24 November 2020 (UTC)[reply]
That was exactly the point. The list is in the result of the above-linked query. --Mormegil (talk) 10:28, 25 November 2020 (UTC)[reply]
As more may come, they will be fixed once the import is complete.--GZWDer (talk) 11:54, 25 November 2020 (UTC)[reply]

BreezeAgent[edit]

Hi there,

I'm not sure that you were pinged correctly; may I ask what you think of the update on the Wikidata:Requests_for_deletions#Q92070347 BreezeAgent case? His IMDb page is now empty: don't you think that your vote is now obsolete (as mine is)? Best, Nomen ad hoc (talk) 08:28, 3 December 2020 (UTC).[reply]

Thanks, and all the best, Nomen ad hoc (talk) 20:48, 4 December 2020 (UTC).[reply]

More items with problematic labels (inclusion of military ranks and other prefixes in English labels)[edit]

Hi GZWDer,

I thought the problem with prefixes in labels was understood. How come we got "Lieut. Commander Charles William Gerald Coventry" [12] in June 2020? The correct label would have avoided a duplicate of "Charles William Gerald Coventry" (Q75322544). This means that more cleanup is probably needed.

Previous discussions were at User_talk:GZWDer/2019#Prefixes_in_labels and Wikidata:Requests for permissions/Bot/RegularBot 3.

If you need help fixing them, please make a request at Wikidata:Bot requests. --- Jura 07:50, 6 December 2020 (UTC)[reply]

Discussion at project chat[edit]

There is a discussion at project chat that would benefit from your input. Bovlb (talk) 05:26, 10 December 2020 (UTC)[reply]

DAHR Database[edit]

Did you ever get in contact with User talk:Saverkamp about uploading the DAHR database? --RAN (talk) 04:08, 11 December 2020 (UTC)[reply]

Why replace the template for Wikiprojects?[edit]

https://www.wikidata.org/w/index.php?title=Template%3AAnatomy_properties&type=revision&diff=1324359495&oldid=1301428705 It's not clear to me what benefit this change brings. On the other hand, it has the clear disadvantage of making it harder to edit the content of the template. ChristianKl 17:52, 17 December 2020 (UTC)[reply]

Import Data from Geni[edit]

Hi GZWDer, I would like to discuss making a bot with you to import data from genealogy sites. Would you be interested in working together? Germartin1 (talk) 04:38, 21 December 2020 (UTC)[reply]

@Germartin1: See User_talk:GZWDer#Importing_from_Geni - please check before importing, and do not import data at any large scale, as Geni contains many errors.--GZWDer (talk) 21:02, 22 December 2020 (UTC)[reply]

MGP IDs[edit]

Hi. I noticed several errors related to MGP IDs. For example, you added the (wrong) MGP ID 240646 to the item Q102198036; this ID actually belongs to a student of his. Moreover, the item already had an MGP ID (42403, the correct one), and those IDs are supposed to be unique. The same bug appears in Q102188292. --Zbmath authorid (talk) 07:47, 21 December 2020 (UTC)[reply]

  • @Zbmath authorid: On https://mathgenealogy.org/id.php?id=240471 the MathSciNet link goes to Jason Michael Starr. You may report the error to MGP.--GZWDer (talk) 10:50, 21 December 2020 (UTC)[reply]
  • @GZWDer: I think you mixed up the IDs of the persons themselves and the IDs of their advisors or students. https://mathgenealogy.org/id.php?id=240471 links to a person named "Dingxin Zhang", whose PhD advisor was MGP ID 42403 (Jason Michael Starr).
    • @Zbmath authorid: The page has a link to https://mathscinet.ams.org/mathscinet/search/author.html?mrauthid=703981.--GZWDer (talk) 15:24, 21 December 2020 (UTC)[reply]
    • @GZWDer: Yes indeed, and that link is also wrong (I reported it), but this is a different mistake. Please just observe that you mixed up the person and their advisor. The MathSciNet<->MGP link is one thing, but I'm speaking about the Wikidata<->MGP link, which was wrong in Q102198036. Another bug, which might be related: in the profile Q102198036, you can see that the item (Jason Michael Starr) appears in his own list of doctoral students!--Zbmath authorid (talk) 13:53, 23 December 2020 (UTC)[reply]
    • @GZWDer: The "Self link violations" report is a good idea, and it definitely shows a bug that mixes up items with their own doctoral students (such self-links can also be listed directly with the query service; see the sketch after this thread). On the other hand, I do not see the point in using MathSciNet: first of all, the presence of duplicates in MGP is well known. Although they are corrected rather quickly when somebody reports them, they cannot be avoided entirely. (But it is good that there is a "Single_value_violations" report.) Also, I would not rely on the MathSciNet links, which are sometimes wrong as well (as in the case of MGP 240471 = "Dingxin Zhang"), since this is a different matter. Many MGP items have no MathSciNet link.--Zbmath authorid (talk)
      • The import is a one-off, and because there are many preexisting items with MathSciNet IDs, we need to prevent new duplicates. Some of the entries with the same IDs are true duplicates.--GZWDer (talk) 16:45, 21 December 2020 (UTC)[reply]
      • @GZWDer: Do I understand correctly that you intended to import the full MGP database and create a new Wikidata item for each entry? Did you propose this project for discussion before implementing it? I am not sure that it is a good idea in this form. First of all, there are many MGP entries that contain very little information; those are definitely not worth creating a Wikidata item for. This would create many superfluous Wikidata items with almost no information and almost no way of verifying that information. The source of many MGP entries is known only to one person (the maintainer of the MGP database, which is more or less a one-person project) and is recorded nowhere. Secondly, how do you check that an MGP entry needs a new Wikidata entity? Do you at least perform some kind of name matching to check whether a Wikidata item already exists? I noticed many duplicates created by your bot...--Zbmath authorid (talk)
    • @GZWDer: Can you please answer my question: did you identify the bug in the bot that mixes up MGP items with their own advisors/students?--Zbmath authorid (talk)
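A minimal sketch of the self-link check mentioned above (my own illustration, not part of the bot's code): ask the query service for items that appear among their own doctoral students (P185); doctoral advisor (P184) can be checked the same way.

  # Find items listed as their own doctoral student (P185).
  import requests

  QUERY = "SELECT ?item WHERE { ?item wdt:P185 ?item . }"
  r = requests.get(
      "https://query.wikidata.org/sparql",
      params={"query": QUERY, "format": "json"},
      headers={"User-Agent": "self-link-check (example script)"},
  )
  for row in r.json()["results"]["bindings"]:
      print(row["item"]["value"])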