User talk:Maximilianklein

From Wikidata

Welcome to Wikidata, Maximilianklein!

Wikidata is a free knowledge base that you can edit! It can be read and edited by humans and machines alike, and you can go to any item page now and add to this ever-growing database!

Need some help getting started? Here are some pages you can familiarize yourself with:

  • Introduction – An introduction to the project.
  • Wikidata tours – Interactive tutorials to show you how Wikidata works.
  • Community portal – The portal for community members.
  • User options – including the 'Babel' extension, to set your language preferences.
  • Contents – The main help page for editing and using the site.
  • Project chat – Discussions about the project.
  • Tools – A collection of user-developed tools to allow for easier completion of some tasks.

Please remember to sign your messages on talk pages by typing four tildes (~~~~); this will automatically insert your username and the date.

If you have any questions, please ask me on my talk page. If you want to try out editing, you can use the sandbox to try. Once again, welcome, and I hope you quickly feel comfortable here, and become an active editor for Wikidata.

Best regards!

--Ymblanter (talk) 17:47, 28 January 2013 (UTC)

Wikidata:Requests for permissions/Bot/VIAFbot[edit]

Discussion would be good; the only issue is that I never use IRC. I got it to work once, but it seems to be blocked on the network I am on. I just tried to connect to #wikimedia-commons using Mibbit and my connection was terminated after less than a second :( . I was also thinking about possibly jumping into the wikidata bot business. So far, I was waiting for sitelinks and linking to Commons, but I could help out with this task. --Jarekt (talk) 20:34, 7 March 2013 (UTC)

Conference call would be the simplest, I am US based. --Jarekt (talk) 21:11, 7 March 2013 (UTC)
Your doodle poll is for 3 weeks from now? --Jarekt (talk) 21:45, 7 March 2013 (UTC)


Hello, I imagine you will have some busy time with authority controls, but I was wondering if you know of any possibility to upload WorldCat data here (I do not know if that would raise copyright issues). In any case, you may be interested in Wikidata_talk:Notability#Books. --Zolo (talk) 11:21, 23 March 2013 (UTC)

With the experience gathered from writing the Authority control bot, I plan to write another bot that will fill in Infobox book type data. It's in the works. Maximilianklein (talk) 00:05, 24 March 2013 (UTC)
Great, thanks. By the way, do you plan to upload VIAF data only when it is already in Wikipedia, or for all people, even those without Wikipedia articles? If the latter is manageable, that would surely be very useful. We will need tons of items on books, and we will need items for their authors. --Zolo (talk) 16:49, 24 March 2013 (UTC)
That is an interesting question. If you'd like to bring it up, try and come to the meeting we're having soon about AC data Wikidata:Requests_for_permissions/Bot/VIAFbot/Meeting_agenda Maximilianklein (talk) 23:15, 24 March 2013 (UTC)

GND main types[edit]

Hi Max, I just saw that an RfD has been filed for Property:P107 (GND main types). Could you take a look at Wikidata:Requests for deletions#Property:P107? Thanks --Kolja21 (talk) 23:50, 28 March 2013 (UTC)

Thanks, commenting now. Maximilianklein (talk) 00:27, 29 March 2013 (UTC)

VIAF: "Cluster has been deleted"[edit]

Hi Max, since you're a VIAF expert, can you help me with de:Åke Blomström? It's not the first time I've gotten the message: "This VIAF Cluster has been deleted". But in this case I can't find any new or duplicate VIAF identifier. (Åke Blomström @BIBSYS) --Kolja21 (talk) 04:51, 29 March 2013 (UTC)

Interesting case. So for every record in a national library file there should be at least one VIAF cluster. But since this is the only record in the VIAF cluster, and it's deleted (and not redirected), what does that mean? Well, it implies that the person is somehow no longer in the Norwegian BIBSYS national file, according to VIAF: BIBSYS is telling VIAF the person isn't in their file anymore. Maximilianklein (talk) 19:11, 29 March 2013 (UTC)
It's a pity BIBSYS has no authority database that we could check. Do you know a person at BIBSYS I could contact? They have two email addresses (contact information: ) and I've tried both: "Mail delivery failed. Domain has no mail exchangers." --Kolja21 (talk) 13:17, 30 March 2013 (UTC)
Sorry, I don't know anybody at BIBSYS personally. What's your interest in this record specifically? Maximilianklein (talk) 16:18, 1 April 2013 (UTC)
I created the stub for Åke Blomström but also I would like to know more about BIBSYS. --Kolja21 (talk) 19:14, 1 April 2013 (UTC)
From browsing VIAF recently, I got the impression that BIBSYS is no longer part of the VIAF sources. See the list on the VIAF homepage (also, in the history section of a VIAF entity, BIBSYS entries were deleted in November/December 2012). Unfortunately I can't remember if the BIBSYS icon had a warning sign (exclamation mark in a yellow triangle) attached prior to the removal from VIAF. -- Make (talk) 17:21, 10 April 2013 (UTC)
I heard they are "scaling back"; I'm not sure if that means not contributing at all. I will check. Maximilianklein (talk) 17:46, 10 April 2013 (UTC)
That would be great. Thanks! -- Make (talk) 18:42, 10 April 2013 (UTC)
BIBSYS are still contributing; it's just that, at their request, the name has been changed to National Library of Norway. You should still be able to see that name on the front page next to the Norwegian flag. 19:09, 10 April 2013 (UTC)
Yes, they are still contributing, but as I understand it they are also changing which records they share, which explains the behaviour. Maximilianklein (talk) 21:06, 6 May 2013 (UTC)
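The deleted-versus-merged distinction discussed in this thread can be sketched in a few lines. This is a hypothetical illustration, not VIAFbot's actual code, and the behaviour it encodes is an assumption rather than VIAF's documented API: a live cluster answers 200 at its own URL, a merged cluster redirects to the surviving cluster, and a deleted cluster with no surviving target returns 404/410.

```python
def classify_cluster(status: int, requested_url: str, final_url: str) -> str:
    """Interpret the HTTP round-trip for a https://viaf.org/viaf/<id> lookup.

    Assumed (not VIAF-documented) behaviour: a live cluster serves 200 at its
    own URL; a merged cluster redirects, so the final URL differs from the
    requested one; a deleted cluster with no redirect target returns 404/410.
    """
    if status in (404, 410):
        return "deleted"       # cluster gone, no redirect target
    if status == 200 and final_url != requested_url:
        return "redirected"    # cluster merged into another one
    if status == 200:
        return "live"
    return "unknown"
```

Under these assumptions, "redirected" is the case where a stored value should be updated to the surviving cluster, while "deleted" is the Åke Blomström case above, where the backing record has left the source file entirely.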

Please see Property talk:P214#identifying Wikidata page where a VIAF identifier is used about "This VIAF Cluster has been deleted. It is no longer part of VIAF."
Can you please comment there? Thanks! לערי ריינהארט (talk) 10:12, 21 October 2013 (UTC)


Congratulations. I've just looked at the first edits of VIAFbot and found no errors. Good job! --Kolja21 (talk) 21:46, 2 April 2013 (UTC)

Yay, thanks. I now have to polish the code, do some internal stat counting, and generate the full database files for VIAF redirect following and deadness. Maximilianklein (talk) 23:28, 2 April 2013 (UTC)

this edit should not have happened? Could you have a look? Cheers -- Make (talk) 17:27, 10 April 2013 (UTC)

Sorry, I've been staring at a lot of these. What's wrong with it exactly? Maximilianklein (talk) 17:46, 10 April 2013 (UTC)
The item already had the property "LCCN identifier" when VIAFbot touched it: [1] (user adds property "LCCN identifier" but gets standardization wrong), [2] (that was me, correcting the LCCN). VIAFbot just added the source statement "Imported from: English Wikipedia" -- Make (talk) 18:38, 10 April 2013 (UTC)
If the statement on Wikidata already matches what was in English Wikipedia, I add a source that says "Imported from English Wikipedia", because I'm basically corroborating that English Wikipedia supports this statement. I can't add a second statement that has the same value, so I have to add a source. Can you think of a better way to do this? 19:13, 10 April 2013 (UTC)

404 errors[edit]

Hello, there seems to be a problem with some LCCN links, would you have a look at MediaWiki_talk:Gadget-AuthorityControl.js#P244 ? Thanks --Zolo (talk) 06:26, 3 April 2013 (UTC)

+1. The LCCN is tricky, see Property talk:P244: Format Summary. de:Template:Normdaten changes "n/93/77779" (Miguel Indurain) into "n93077779". --Kolja21 (talk) 23:23, 3 April 2013 (UTC)
Ok, I've fixed that in my program, thanks. Maximilianklein (talk) 23:47, 3 April 2013 (UTC)
BTW, do you know who User:SamoaBot is? It seems like they are importing all of Normdaten, except it's indiscriminate: not just TYP = P, even TYP = N. I left them a note, but you might want to talk to them as well. It's clashing with my bot a bit, since I'm writing LCCNs with '/' with the expectation that they'll be formatted, but SamoaBot is writing them without '/', which is important; as you know, Normdaten already gets it right. Maximilianklein (talk) 23:37, 3 April 2013 (UTC)
No, I don't know who that is. The outcome of the LCCN discussion was "Normalized LCCN" (not separating the 3 parts). I'll take a closer look at SamoaBot. --Kolja21 (talk) 23:51, 3 April 2013 (UTC)
So can I change the link to '$1' as suggested by user:Docu ? --Zolo (talk) 07:06, 4 April 2013 (UTC)
Yes, I think this is correct. --Kolja21 (talk) 17:43, 4 April 2013 (UTC)
Ok, done. --Zolo (talk) 18:42, 4 April 2013 (UTC)
has some info on LCCN normalization with links to other standardizing resources. -- Make (talk) 17:24, 10 April 2013 (UTC)

I read that, and think I've implemented it correctly now. Maximilianklein (talk) 17:46, 10 April 2013 (UTC)
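The normalization discussed in this section, where de:Template:Normdaten turns "n/93/77779" into "n93077779", can be sketched as a small helper. A hedged sketch only: it assumes the slash-separated form is always prefix/year/serial (the Wikipedia template parameter style), with the serial zero-padded to six digits, and that already-joined values pass through unchanged. The function name is illustrative, not from the bot.

```python
def normalize_lccn(raw: str) -> str:
    """Normalize a Wikipedia-style LCCN like 'n/93/77779' to 'n93077779'.

    Assumption: the slash form is prefix/year/serial, as used by
    de:Template:Normdaten; normalization joins the parts and zero-pads the
    serial to six digits. Values without slashes are returned unchanged.
    """
    value = raw.strip().replace(" ", "")
    if "/" not in value:
        return value
    prefix, year, serial = value.split("/")
    return f"{prefix}{year}{serial.zfill(6)}"
```

For example, the Miguel Indurain case from this thread, `normalize_lccn("n/93/77779")`, yields `"n93077779"`.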


Also BnF "is difficult". Jules César has the BnF identifier 11894764. VIAFbot added 11894764p. The "p" is part of the web link but not part of the identifier. (Other links have a "q" or even no letter.) --Kolja21 (talk) 23:42, 3 April 2013 (UTC)


Should we add "0000 0001 2095 5689" with spaces (like in Vincent van Gogh)? See Property talk:P213. --Kolja21 (talk) 00:40, 4 April 2013 (UTC)

I've replied now. Can you tell me how certain properties are rendered into links? I don't understand how the software does it because it seems like the data type even on LCCN is "string". How does this work? Maximilianklein (talk) 18:15, 4 April 2013 (UTC)
This job is done by MediaWiki:Gadget-AuthorityControl.js - at least it was last month. I don't know if I'm up to date. --Kolja21 (talk) 19:27, 4 April 2013 (UTC)
I left a note on the talk page. --Kolja21 (talk) 19:37, 4 April 2013 (UTC)
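On the ISNI spacing question above (whether to store "0000 0001 2095 5689" with spaces, as on Vincent van Gogh's entry): an ISNI is 16 characters displayed in blocks of four, and its final character is an ISO 7064 MOD 11-2 check character, the same scheme ORCID uses. A small sketch of both the checksum and the grouping, using Van Gogh's ISNI from this thread; the function names are illustrative.

```python
def isni_check_digit(first15: str) -> str:
    """ISO 7064 MOD 11-2 check character over the first 15 digits of an ISNI
    (the same algorithm ORCID publishes for its identifiers)."""
    total = 0
    for ch in first15:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def format_isni(compact: str) -> str:
    """Group a 16-character ISNI into blocks of four, as VIAF displays it."""
    return " ".join(compact[i:i + 4] for i in range(0, 16, 4))
```

So whichever storage format is chosen, the two forms are mechanically interconvertible, and the check digit lets a bot reject mistyped values before saving them.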

Two VIAF identifiers[edit]

I posted the following query on WD:PC and was told I might ask you.

The author Carlos Castaneda (Q158878) has two VIAF identifiers associated with him. I added both, since the VIAF database contains two numbers for him. Probably the duplicate numbers are due to a mix-up regarding his year of birth, see for example [3] or [4].

It seems to me there should be some place where this should be reported, but I don't know where. Gabbe (talk) 10:36, 4 April 2013 (UTC)

Great, thanks for adding the multiple. Maximilianklein (talk) 16:04, 4 April 2013 (UTC)
+1, with one exception: don't add VIAF identifiers that have only an "undifferentiated" entry. (These are no persons, only placeholders for names.) --Kolja21 (talk) 17:49, 4 April 2013 (UTC)
So basically, I've already (foresightedly) done everything that ought to be done? Great! :) Gabbe (talk) 18:23, 4 April 2013 (UTC)
Yes! that's the whole point. Great foresight ;) Maximilianklein (talk) 18:32, 4 April 2013 (UTC)

This seems great. What are your thoughts for how to develop VIAF and other meta-auth systems in parallel? Is there a collab roadmap among those entities? Sj (talk) 19:10, 11 April 2013 (UTC)

  • If the question is "How is VIAF collaborating with other Authority Control aggregators (Bridge Identifiers)?" then I'd ask which specific competing identifier you are referring to. Wikidata could be considered a bridge identifier as well, and the relationship is already symbiotic, in that one provides authority and the other provides algorithmic improvement. Wikidata item IDs may well become the most popular disambiguation method, but they will still lean on VIAF, as it contains information that Wikidata doesn't. Conceivably, Wikidata could duplicate VIAF and render it almost obsolete, although Wikidata would not have direct data partnerships with the underlying libraries. Still, that would be interesting territory. I presume it will be telling of how far society has come in embracing crowd-sourcing as authoritative.
  • Is there a road map? The closest thing so far, which might be difficult to jump into, is this transcript of an IRC meeting on the subject: Wikidata:Requests_for_permissions/Bot/VIAFbot/Meeting_agenda#Meeting_transcript. Maximilianklein (talk) 19:48, 11 April 2013 (UTC)
Very nice. Thank you for this thorough response. Other aggregators include those listed at Wikisource:Authority_control (are all of these part of VIAF?) and Google's KG identifiers (or the subset of it exposed via Freebase). Sj (talk) 22:14, 18 April 2013 (UTC)

Hi, You might be interested in Wikidata:Database reports/Constraint violations/P214. --  Docu  at 15:11, 5 May 2013 (UTC)

Oh, I didn't know about this. Excellent, thank you very much. Maximilianklein (talk) 16:37, 6 May 2013 (UTC)


Should I continue to avoid running SamoaBot for VIAF... etc. ? Is there a solution? (Note: at the moment, it's importing only IMDb codes) --Ricordisamoa 22:28, 5 April 2013 (UTC)

I don't mind doing VIAF. It might be better for me to do it since I'm following VIAF redirects, so there will potentially be fewer conflicts with competing sources. You can run what you like, I don't want to be "territorial" about Authority Control. It's cool that you're doing IMDB - I didn't think about that. Maximilianklein (talk) 22:35, 5 April 2013 (UTC)

Also, do you plan to write any other import bots in the future? My next one planned is ISBNs. Maximilianklein (talk) 22:37, 5 April 2013 (UTC) Also, also, how are you getting descriptions for items? I was thinking about how to do this. Mostly by following the VIAF links and getting the alternate names and professions. Maximilianklein (talk) 22:43, 5 April 2013 (UTC)

  • I'll continue to import IMDb only - until the conflicts are proven to be solved;
Great, I'll run Authority Control. Ran 1k items just now. Going to look over them for testing. Then I'll run the full set. Maximilianklein (talk) 03:58, 7 April 2013 (UTC)
  • The code is proven to work well :-) so I think I'll build an entire JS framework around it;
It definitely seems like good code to adapt for design patterns. Maximilianklein (talk) 03:58, 7 April 2013 (UTC)
  • "Setting description for some items" is SamoaBot's 2nd task (not approved yet, currently in testing, see here); it's completely unrelated to authority control. --Ricordisamoa 22:55, 5 April 2013 (UTC)

Signalling erroneous VIAF entries[edit]

Hello, I have noticed that the VIAF entry for the muséum de Toulouse has an incorrect BNF link (it links to something different). Is there a place where we should report that? --Zolo (talk) 09:16, 29 April 2013 (UTC)

If you just correct the link on the page, that's sufficient, because I plan to run some statistics in the future comparing the identifier values as imported versus how they stand 6 months later, to try to check the error rate. If you want to start a manual list of reports for the BNF now, I'm sure they'll appreciate it. If you start the list I'll contact the BNF and tell them about it. I'm not sure where the best place for such a list is. A subpage of VIAFbot? Maximilianklein (talk) 17:15, 29 April 2013 (UTC)
Ok, thanks, I think it is ok, as the BNF entry itself seems to be correct, the only thing wrong is the VIAF cluster. --Zolo (talk) 09:31, 30 April 2013 (UTC)
Another case: I doubt both links refer to the same person, as one is a mathematician and the other wrote a dissertation about cholesterol metabolism. But given that we do not have the Canadian Library in Wikidata, it cannot be detected by analyzing Wikidata. --Zolo (talk) 16:40, 4 July 2013 (UTC)

Authority control in documentation pages[edit]

Your bot should not add any authority control properties to items that belong to non-articles, especially Q9605193 and Q8614281 (and Q3907614, but your bot hasn't edited that item yet). --Schnark (talk) 07:43, 4 May 2013 (UTC)

Oh, yes, good point. These come from the fact that those pages carry authority control templates as examples of how to use them. Those tutorial pages should be relatively contained, so I'll clear them out by hand now. Thanks for pointing this out. Maximilianklein (talk) 21:25, 4 May 2013 (UTC)
+Q4657349. --Schnark (talk) 09:27, 11 May 2013 (UTC)
Whatever you did, it didn't work: Q9605193. --Schnark (talk) 07:21, 16 May 2013 (UTC)
I removed the IDs from some of the tutorial pages you requested. I'm running the bot to look at German Wikipedia now, so maybe it's adding more from German help pages. It's easier to just delete these items as they arise than to program the bot to skip certain pages. I'm happy to delete the IDs on any pages that need it; it's <10 pages versus >400,000 that have legitimate VIAFs. Can I help you more? I'm afraid I don't feel like I'm answering your question properly. Maximilianklein (talk) 18:09, 16 May 2013 (UTC)

A hostname for VIAF identifier[edit]

Any idea what might have caused this edit? Gabbe (talk) 16:02, 4 May 2013 (UTC)

My bot attempts to follow VIAF URL redirects, because sometimes the clusters move around, and so I thought it'd be clean to point to the current location rather than the cluster number as it was when it was entered into Wikipedia. Maybe there was a minor hiccup with VIAF redirecting somewhere else when I tried to follow that redirect. Please let me know if you see this in any other places. Maximilianklein (talk) 21:30, 4 May 2013 (UTC)
There's also this one, but I think that's all of them. I've fixed both of them now. Gabbe (talk) 07:34, 5 May 2013 (UTC)
Great. Weird bug, I'm still confused about where it came from. Maximilianklein (talk) 16:37, 6 May 2013 (UTC)
There are a few more listed on Constraint_violations/P214. Please fix them and check it once in a while. --  Docu  at 05:18, 8 May 2013 (UTC)
This is really useful information. I will comment on the talk page. Can you explain to me what "Format" violations with the offending value "somevalue" mean? Thanks Maximilianklein (talk) 06:11, 8 May 2013 (UTC)
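One way to guard against the hostname-as-identifier bug discussed in this section is to accept the final URL of a redirect chain only if its path parses as a plain numeric cluster id. A hypothetical sketch, not VIAFbot's code; `extract_viaf_id` is an illustrative name, and the id in the example comes from a thread further down this page.

```python
from urllib.parse import urlparse

def extract_viaf_id(final_url: str):
    """Pull the numeric cluster id out of a URL like
    https://viaf.org/viaf/64801193/. Returns None for anything else, so a
    redirect that lands on an unexpected page (or a bare hostname) can never
    be written into P214 as if it were an identifier."""
    path = urlparse(final_url).path
    parts = [p for p in path.split("/") if p]
    if len(parts) >= 2 and parts[0] == "viaf" and parts[1].isdigit():
        return parts[1]
    return None
```

With a guard like this, the bot would skip the edit (and could log it for review) instead of storing a hostname when VIAF redirects somewhere unexpected.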

en.wikipedia Authority Control template[edit]

I've added support for Wikidata content to the Authority template of the English Wikipedia. For more information see the related discussion.

PS: Thanks for your amazing bot!

Tpt (talk) 07:59, 11 May 2013 (UTC)


Hi Maximilianklein, please could you help with wikidata, there are 2 items for the same artist, they have all to be linked, see also Q782519 =Giuseppe Torretto/EN,it...Regards--Oursana (talk) 00:45, 13 May 2013 (UTC)

I have merged both items into Q782519 --Alexander.Meier (talk) 07:02, 13 May 2013 (UTC)
Excellent, thank you.--Oursana (talk) 07:33, 13 May 2013 (UTC)

Please see...[edit]

...this. --Ricordisamoa 12:26, 16 May 2013 (UTC)

Hmmm, I thought I was checking for that malformed data. Thanks for showing this to me. I'll fix this bug. Maximilianklein (talk) 18:09, 16 May 2013 (UTC)

Errors of VIAF bot in en.wikipedia[edit]

Hi! I've investigated the list of "unique value" violations for the ISNI identifier and I've found that most of them were caused by duplicate entries for the same person or by wrong identifiers added by your bot to the English Wikipedia. I've merged most of the duplicates, so the violations remaining on 16 May are mostly errors of your bot. Tpt (talk) 13:28, 16 May 2013 (UTC)

I will look into this. I don't think that the ISNI identifier should have a unique value constraint, though. Sometimes one person has two names, like a pseudonym, and they have two different Wikidata items but the same ISNI id. Maximilianklein (talk)
Thanks!
I've understood (but I may be wrong) the reverse: there is only one Wikipedia article, and so one Wikidata item, per person, and one ISNI id per name. So for the same item there can be more than one ISNI. For example, Lewis Carroll has only one item but two ISNIs, one for his real name and one for his pseudonym. So we can use a "unique value" constraint but not a "single value" constraint. Tpt (talk) 18:46, 16 May 2013 (UTC)

VIAF using Wikidata[edit]

Hi Max, your bot is working great, and pages like Wikidata:Database reports/Constraint violations/P107 help improve Wikipedia. Do you know when VIAF will start to use Wikidata? I'm sure it will reduce the duplicates they have. Cheers --Kolja21 (talk) 03:00, 20 May 2013 (UTC)

The system I've talked about with the VIAF team is that they will start watching Wikidata in-links. So you make a correction to VIAF by fixing the Wikidata entry, like you just did; later on their algorithm will scan what's happened and 'see' your report. Maximilianklein (talk) 18:55, 20 May 2013 (UTC)


Hello, would it be possible for VIAFbot to add ULAN ID (P245), as it is part of VIAF? --Zolo (talk) 14:45, 20 May 2013 (UTC)

Sure, I can look into the database and see how many WD<->ENWIKI<->VIAF<->ULAN links there are. The question is whether that's necessary, because the link already exists, as I said; it's just a matter of shortening the chain. If you think that's useful, then we could cut out VIAF as the middle bridge. I really have no investment either way. Maximilianklein (talk)
ULAN is used in en:Template:Authority control, so it makes sense to have it in Wikidata, just as we have LCCN or GND. There may not be that many articles with a corresponding ULAN, but when there is one, it is often the most interesting of all VIAF links. --Zolo (talk) 19:30, 20 May 2013 (UTC)

[edit]

Hi Max, in some cases VIAF bot added "" as Property:P214 (VIAF identifier).[5] Examples: Q949921, Q949940, Q979137 etc. More: Wikidata:Database reports/Constraint violations/P214. Any idea what went wrong? --Kolja21 (talk) 02:40, 24 May 2013 (UTC)

It's a weird bug. I'll add a check for it in the future, and write a script to delete previous occurrences. Maximilianklein (talk) 04:43, 24 May 2013 (UTC)
Thanx for deleting VIAF = "unknown value" as well. Can you do the same for GND = "unknown value"? The maintenance list is crowded with these items. --Kolja21 (talk) 20:54, 25 May 2013 (UTC)
Kolja21, if you want to give it a shot you can modify my VIAFClean script and run it yourself. Maximilianklein (talk) 22:28, 29 May 2013 (UTC)
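The cleanup described here, deleting empty-string and "unknown value" claims, reduces to a simple filter. The real VIAFClean script mentioned above is not reproduced here; this is a minimal sketch of the selection logic only, with illustrative (item, value) pairs built from the examples in this thread.

```python
def claims_to_delete(claims):
    """Given (item_id, value) pairs for an identifier property, pick out the
    ones a cleanup pass should remove: empty strings and 'unknown value'
    placeholders. A sketch of the idea, not the VIAFClean script itself."""
    bad = {"", "unknown value"}
    return [item for item, value in claims if value is None or value.strip() in bad]
```

The same filter works unchanged for GND = "unknown value": only the property whose claims are fed in differs.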


Hello, Maximilianklein. You have new messages at Wikidata:Requests for permissions/Bot/VIAFbot 3.
You can remove this notice at any time by removing the {{Talkback}} or {{Tb}} template.

--Ricordisamoa 23:27, 29 May 2013 (UTC)

✓ approved --Ricordisamoa 07:36, 30 May 2013 (UTC)

Edition data[edit]

Hi Max, please take a look to my last comments after the Hackathon, if that is an acceptable solution we should aim to close the RFC on June 15.--Micru (talk) 23:52, 30 May 2013 (UTC)

As always, so very glad to see your input here. Hope to catch up in more detail soon; come back to visit Boston one of these days and we'll organize a talk for you :-) Sj (talk) 01:49, 1 June 2013 (UTC)

Wikidata:Requests for permissions/Bot/SamoaBot 32[edit]

For your information: SamoaBot is planning to import birth/death dates from the BNF.

I notice VIAFbot is currently importing sex with the source "imported from VIAF". If the data are mostly imported from BNF authority files, would it be possible to provide more precise sources? See my comment toward the bottom of Wikidata:Requests for permissions/Bot/SamoaBot 32. --Zolo (talk) 06:04, 5 June 2013 (UTC)

wrong VIAF entries[edit]


While checking ISNI constraint violations, I found [6] and removed the VIAF id your bot added, since it is NOT (please check the dates).

This is not the first erroneous VIAF value added by the bot that I've found… you should perhaps add date checks to it :) (for ex, Q6490296, which was mixed up with his homonym Q3218046).

Also Q7816097 (UK actor) is certainly not (even if they were born the same year), since this Tom Harper is a librarian specialized in maps at the British Library (, and Q1249485, which was given the same VIAF (by SamoaBot, I believe), is even more wrong (he's…

I don't know how your bot works, but I really think VIAF ids should NOT be added automatically… considering the very large number of homonyms and the sometimes great difficulty of finding the right one among them (and the fact that there is, so far, no VIAF for most sportsmen and actors, unless they wrote their memoirs), human checking should be systematic, and when in doubt, the addition should be refused… I've removed dozens of wrong VIAFs, caused by homonymy, that a simple date check could have avoided.

On the other side, once the VIAF is fixed with certainty, your bot could be very useful for automatically importing all the linked AC values :)

--Hsarrazin (talk) 04:33, 7 June 2013 (UTC)

VIAFbot on Wikidata copies VIAF ids from the English, German, and French Wikipedias, so it could be copying bad info. VIAFbot on English Wikipedia did do matching and date comparison. Even by comparing dates you don't get the right ID all the time; it's >98% accurate though. Here's the good news: if you make corrections on Wikidata, VIAF is watching Wikidata and will correct itself, and become more accurate. Maximilianklein (talk) 22:10, 8 June 2013 (UTC)
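The date comparison mentioned above can be sketched as a compatibility check: reject a candidate Wikipedia-to-VIAF match when both sides state a year and the years disagree, and don't hold a missing year against the match. This is an illustration of the idea only; the field names are made up here, not the bot's real data model, and the real matching (which reached the >98% accuracy quoted above) involved more than this.

```python
def dates_compatible(wiki_person: dict, viaf_person: dict) -> bool:
    """Accept a match only when every year stated by BOTH sides agrees.

    A year present on only one side is treated as unknown rather than as a
    mismatch. Keys 'birth_year'/'death_year' are illustrative assumptions.
    """
    for key in ("birth_year", "death_year"):
        a, b = wiki_person.get(key), viaf_person.get(key)
        if a is not None and b is not None and a != b:
            return False
    return True
```

A check like this catches exactly the homonym cases described above, where two people share a name but not a birth year, while still allowing matches for records with incomplete dates.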

"The most unique Wikipedias"[edit]

Hi, I read your blog post, that's quite interesting. As it seems we can't comment there, I'll point out here that the Vietnamese-Dutch (and perhaps Cebuano-Waray-Swedish) cluster is most probably due to the high number of items about taxons. That also explains why they have relatively few "unique items". --Zolo (talk) 21:04, 12 June 2013 (UTC)

Weird that you can't comment there. Thanks for explaining that to me though. Maximilianklein (talk) 21:09, 12 June 2013 (UTC)

Wrong information in VIAF[edit]

So I was looking at some entries in the items that have multiple sexes (which is a constraint violation), and a lot of them seem to follow the pattern that the correct sex doesn't have a source while the wrong sex, added by your bot, has the VIAF as source; apparently, for these persons the VIAF is messed up and has the wrong gender. What's the correct way to deal with this? If the wrong sex is simply deleted, will your bot re-add it? --DSGalaktos (talk) 18:33, 25 June 2013 (UTC)

@DSGalaktos: There are incorrect assertions in both databases; I wrote a blog post explaining the differences and similarities. If you find a correction to make, then just edit the Wikidata item, with a source if possible, and I will be able to monitor the changes for VIAF. Maximilianklein (talk) 19:49, 25 June 2013 (UTC)

RfC on Wikidata's primary sorting property[edit]

You recently participated in a deletion discussion for P107 - main type (GND). The discussion has been closed, as it is clear that a resolution won't come from PfD, and an RfC has been opened on the matter at Wikidata:Requests for comment/Primary sorting property. You are invited to participate there. Please note that this is a mass delivered message, and that I will not see any replies you leave on this page.

Yours, Sven Manguard Wha? 18:25, 30 June 2013 (UTC)

VIAFbot: alias values[edit]

In some of the items that have multiple values, one of them is a redirect to the other. The redirect can be removed. For example: I removed 64801193 in Billy the Kid (Q44200), since it's an alias for 84074643. Can your bot do that? --Akkakk 08:51, 24 July 2013 (UTC)

Actually I have attempted to do this, and did remove many aliases. I think what might be happening is that VIAF is updated and consolidated monthly so more and more IDs will become aliases. I will think about how to do monthly updates. I think the first thing is to see if I can get diffs from VIAF, otherwise I'll have to check all the IDs every month - which isn't bad, but in the future might not finish within a month since it took 3 weeks to upload last time. Maximilianklein (talk) 05:34, 5 August 2013 (UTC)
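The alias removal discussed here can be sketched as a pure function: given an item's VIAF ids and a resolver that follows redirects to the current canonical cluster, drop any id that redirects to another id already on the item. The resolver is passed in as a plain function (in practice it would consult VIAF's monthly-updated redirect data); the ids in the example are the Billy the Kid pair from this thread, and the function name is illustrative.

```python
def drop_alias_ids(viaf_ids, resolve):
    """Remove VIAF ids that are merely aliases (redirects) of another id in
    the same list. `resolve` maps an id to its current canonical cluster id;
    supplying it as a function keeps the dedup logic testable offline."""
    canonical_of = {vid: resolve(vid) for vid in viaf_ids}
    keep = []
    for vid in viaf_ids:
        canon = canonical_of[vid]
        # drop vid only if it redirects to a DIFFERENT id that is also present
        if canon != vid and canon in viaf_ids:
            continue
        keep.append(vid)
    return keep
```

Note the guard: an id whose canonical target is not on the item is kept, so a monthly pass only consolidates duplicates rather than silently deleting lone identifiers.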

My bot can now do that: Wikidata:Requests for permissions/Bot/SamoaBot 38 --Ricordisamoa 00:37, 11 September 2013 (UTC)

request at Wikidata:Property proposal/Authority control[edit]

Dear Maximilianklein, Please note the requests for

  1. WKP identifier
  2. NSZL identifier
  3. BNE identifier
  4. SELIBR identifier - obsolete see note
  5. BIBSYS identifier

לערי ריינהארט (talk) 12:29, 7 October 2013 (UTC)

WKP identifier[edit]

I am surprised that I found persons without a WD entry but with WKP in the VIAF number.

  1. Could you please check if all viaf entries having WKP have a wikidata entry as well?
    1. My bot did make Wikidata entries for all VIAF pages having Wikipedia links, but that last happened about a year ago in November, so it's possible that it's changed since then. Maximilianklein (talk) 19:13, 7 October 2013 (UTC)
  2. I started to contribute at wikidata a few days ago. I would be very happy if your bot could watch my contributions.
    1. I realized that with copy and paste I added a lot of ISNI values. Is it possible to reformat them and to delete the duplicates / equivalents? The insertion method should not matter, and reformatting should be done by the MediaWiki software.
      1. From where to where did you copy and paste ISNIs? It shouldn't matter too much if there are duplicates in Wikidata. I think another bot will delete obvious duplicates.
    2. Many entries are missing "person", birth and death dates. Maximilianklein (talk) 19:13, 7 October 2013 (UTC)
      1. Yes, I never bothered to write "person", but this could be done. As for birth and death dates, I am waiting for dates to be supported in the pywikipedia bot. Maximilianklein (talk) 19:13, 7 October 2013 (UTC)

In 2010 I started to work on AC issues and experimented with template variants at eowiki, rowiki, yiwiki and others. You may find undocumented template parameters and multiple parameters such as VIAF-1 etc. At that time I believed that redirects of AC values should be preserved in databases, as they might be regarded as permanent links. This does not seem to be the case.
I am using here mainly User:Magnus Manske/authority control.js. It reminds me that the relations between the AC identifiers are not one-to-one, nor is it simple to detect conflicts and manage a redirect database. Some of my edits here might have created such conflicts. Regards לערי ריינהארט (talk) 12:29, 7 October 2013 (UTC)

There are database constraint reports to uncover those, so I wouldn't worry about it. Maximilianklein (talk) 19:13, 7 October 2013 (UTC)
P.S. I added many BnF identifier values. Later, while reading some discussions, I realized that there are some issues. At a later point the values, links, and WP parameters need to be verified and corrected. לערי ריינהארט (talk) 12:46, 7 October 2013 (UTC)

Thanks for the answers!
Status update: Q2203725 Philip Slier is an example of Wikidata having AC values, with the VIAF id viaf:68342673 containing a WKP cluster, but with the template missing at enwikipedia. לערי ריינהארט (talk) 23:50, 7 October 2013 (UTC)
I hope WKP will be available next week. Then it would be interesting to identify the authors with VIAF ids but without articles at enwikipedia. Not sure how it would be possible to generate such reports / lists. לערי ריינהארט (talk) 23:55, 7 October 2013 (UTC)

@לערי ריינהארט: see here: Wikidata:Database reports/Constraint violations/P214. Maximilianklein (talk) 16:24, 8 October 2013 (UTC)

NLI identifier[edit]

Hi! Wikidata talk:Authority control#NLI_identifier relates to the Israel Union List identifier. Would you support the addition of this parameter? Do you have an idea how it can be linkified?
Some time ago I found a note about the people involved in adding WKN identifiers to VIAF. I do not remember where. Do you know the names? Regards לערי ריינהארט (talk) 18:41, 9 October 2013 (UTC)

Well, it is OCLC Research that manages all of VIAF. I am currently working temporarily for OCLC Research, so I can also answer specific questions. Maximilianklein (talk)

Thanks for the answer. First I am mainly interested in the following questions:

  1. How can we generate URLs using the NLI identifier?
  2. How can we generate URLs using the RSL identifier? See Property_talk:P947 RSL identifier.
  3. You may also look and comment at Property talk:P268 BnF identifier. We need the algorithm to be able to calculate the checksum digit.

Regards and greetings from Munich Germany. לערי ריינהארט (talk) 22:13, 9 October 2013 (UTC)
P.S. I am still an admin at eo: yi: and some other wikis but took some sabbatical years. I used to contribute (also about VIAF issues) at Librarything until my commemorative events were moved to the "Pacific pole of inaccessibility" (48°52.6′S 123°23.6′W), "the point in the ocean farthest from any land mass". So I spent one year in exile (Q1779748 Pitcairn Island). BTW: Today is the birthday of Moshe Flinker

See also librarything: Publisher Series: WP: List of posthumous publications of Holocaust victims · Q2112755 · en:List of posthumous publications of Holocaust victims · T
Can you please add wikidata links to the last talk page? לערי ריינהארט (talk) 22:26, 9 October 2013 (UTC)
It is my understanding that talk pages are not supposed to have Wikidata links. Maximilianklein (talk) 19:46, 14 October 2013 (UTC)

VIAF issues to be handeled by bots[edit]

Hi! First a manually created table about WMF projects listed at Template:Authority control (Q3907614)
The list / table is ( 'als', 'ar', 'arz', 'as', 'bar', 'be', 'bn', 'bs', 'ca', 'cs', 'cv', 'da', 'de', 'en', 'eo', 'ee', 'fa', 'fr', 'hu', 'id', 'ilo', 'it', 'ja', 'km', 'ko', 'mk', 'nds', 'ne', 'nn', 'no', 'or', 'pl', 'pt', 'ro', 'ru', 'sl', 'st', 'th', 'uk', 'ur', 'vi', 'yi', 'zh', 'commons' ). Note: I deleted 'es' some days ago.

  1. Where should the "VIAF related bot features" required for bots be discussed?
  2. Are multiple-line issues addressed? A multiple-line template is helpful at sites using RTL scripts such as yi.wikipedia.

Regards לערי ריינהארט (talk) 09:55, 10 October 2013 (UTC)

NLI identifier and BAV identifier[edit]

Hi! I added a proposal for "NLI identifier" at Wikidata:Property proposal/Authority control#NLI identifier. Can you please support this?
FYI Wikidata talk:Authority control#Bonnie and Clyde type pages. I created some pages related to "Righteous Among the Nations" persons. Please search for that string in my contributions. Can the bot verify these newly created pages?

I noticed that the "Vatican Library" is using the prefix "ADV", as in ADV10180492 in Q52570 Valdemar Langlet. I wonder if all BAV entries have this prefix or not. Thanks for any help on this question! Best regards!
לערי ריינהארט (talk) 11:24, 11 October 2013 (UTC)


Hi Max, Wikidata now has, besides ISBN-13 (P212), a property for the old ISBN-10 (P957). I hope it helps VIAFbot. --Kolja21 (talk) 17:38, 17 October 2013 (UTC)

This is not exactly what I wanted, or what I think is best. At the moment I am actually converting ISBN-10s to ISBN-13s. I wonder if I should change that now. Maximilianklein (talk) 20:29, 17 October 2013 (UTC)
You don't need to use ISBN-10. But checking the error report I found that a lot of users mixed ISBN-10 and 13 since there was no property for the ten digit version. --Kolja21 (talk) 21:25, 17 October 2013 (UTC)
Another ISBN problem: In World War Z (Q28172) VIAFbot has imported the parameter "isbn" (= French edition) from the French WP. Since the book was originally published in English, the parameter "isbn_orig" (= original edition) should be taken into account. --Kolja21 (talk) 01:59, 18 October 2013 (UTC)
Of course, it wouldn't be a problem if users put ISBN-10s in the ISBN-13 property, if we converted ISBN-13 to a property "ISBN" which accepted both. As for isbn_orig, I thought about importing both. For most books there is just one ISBN, on average. But for popular books there could be 400 or more ISBNs. Even though it would be too much to import them all, I thought that it would be good to show that there is not an injection (Q182003) relationship. I posted about this issue in my RFBOT. I am happy to listen to an alternate proposal. Would you like me to include the French isbn_orig? In that case it would probably have been picked up when I get to harvesting from English. Maximilianklein (talk) 17:48, 21 October 2013 (UTC)
Good question. Imho Wikidata:Books task force should discuss how many ISBNs we need. I've started a thread. --Kolja21 (talk) 19:27, 21 October 2013 (UTC)
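As background to the conversion discussed above: ISBN-10 to ISBN-13 is fully deterministic, so a bot can do it safely. Prefix 978, drop the old check digit, and recompute the check with alternating 1/3 weights. A minimal sketch:

```python
def isbn10_to_isbn13(isbn10: str) -> str:
    """Convert an ISBN-10 (hyphenated or not) to its ISBN-13 form.

    Prefix '978', drop the old check digit, then recompute the check:
    weights alternate 1, 3, 1, 3, ... over the first 12 digits, and the
    check digit brings the weighted sum up to a multiple of 10.
    """
    digits = isbn10.replace("-", "").replace(" ", "")
    if len(digits) != 10:
        raise ValueError("not an ISBN-10: %r" % isbn10)
    body = "978" + digits[:9]
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(body))
    return body + str((10 - total % 10) % 10)

# e.g. the ISBN-10 0-14-118126-5 mentioned elsewhere on this page
# converts to "9780141181264"
```

The reverse direction (ISBN-13 to ISBN-10) only exists for the 978 prefix, which is one argument for treating ISBN-13 as the canonical form.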

Property:P50: Q1 - nonsense[edit]

for example. NBS (talk) 18:39, 21 October 2013 (UTC)

The Battle for Bond (Q7715841), P50: "Datenherkunft: englischsprachige Wikipedia". The author is "Robert Sellers" (enWP) = "Wikidata item not found". Has this caused the problem? --Kolja21 (talk) 19:32, 21 October 2013 (UTC)
Thanks for pointing out this bug. When pywikibot can't find a wikidata item it sets it to Q-1, but I thought I was catching this. Good bug report, thanks @NBS:.
Can you undo this easily? Q1 is linked in several hundred items now, and all that I checked were introduced wrongly by VIAFbot. --YMS (talk) 11:34, 23 October 2013 (UTC)
I am stopping the bot, and I will undo all these mistakes. Thank you for being watchful. Maximilianklein (talk) 16:00, 23 October 2013 (UTC)
@NBS: @YMS: I just deleted 241 cases where author (P50) was Universe (Q1). Thanks again for the bug reports. Maximilianklein (talk) 21:51, 23 October 2013 (UTC)
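For what it's worth, the fix for this class of bug can be sketched as a guard around item lookup. The helper names here are hypothetical (the real bot used pywikibot); the point is that the sentinel must be dropped before any claim is written.

```python
# Hypothetical helper names; the real bot used pywikibot, whose lookup
# returned a '-1' sentinel for missing items. Q1 appears here only as the
# mangled form of that sentinel -- a legitimate claim pointing to Q1
# (Universe) never occurs for author/illustrator/genre.
MISSING_SENTINELS = {None, "", "-1", "Q-1", "Q1"}

def resolve_item(title, lookup):
    """Return the QID for `title`, or None when no item exists.

    The redlink bug above happened because the sentinel was written out
    as a claim instead of being dropped here.
    """
    qid = lookup(title)
    return None if qid in MISSING_SENTINELS else qid
```

A caller then skips the claim entirely when `resolve_item` returns None, rather than writing whatever the lookup handed back.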

bot adding Wikipedia as a source[edit]

VIAFbot is adding Property:P143 with the different language Wikipedias as the data value. The Property_talk:P143 discussions "Wikipedia as a source?" and "Deprecated" suggest that this is inappropriate and that the property itself is deprecated. Perhaps the bot should be updated. 13:37, 23 October 2013 (UTC)

Taking a look now. Maximilianklein (talk) 16:00, 23 October 2013 (UTC)

Removing OCLC Properties[edit]

Hello Snipre,

I noticed you have been removing some of the OCLC properties that my bot has been adding (see diff). I would like to know why you are doing this. I feel that adding them is the right thing to do, as I got permission for it at wd:RFBOT. Maximilianklein (talk) 18:17, 23 October 2013 (UTC)

Yes, I removed the property, but you will understand why I did that if you look at the item. The item The King of Rome (Q594) is about a pigeon, not a book. So if you want to add data about the book, please create the appropriate item rather than putting it on the pigeon. An item is not a category which mixes different subjects with a common topic. Snipre (talk) 18:26, 23 October 2013 (UTC)
@Snipre: I understand completely. Well done, good work. Can you explain, since I don't read French, why there is an ISBN on w:fr:Le_roi_de_Rome? Maximilianklein (talk) 18:33, 23 October 2013 (UTC)
The problem is that the wikipedia article, the French one but the English one too, mixes into the same article the famous pigeon, the book about that bird, and a song which originates from the book. So for the same article we have in reality 3 items in Wikidata. Snipre (talk) 18:41, 23 October 2013 (UTC)
Your bot added the book data to the wrong item again. If I create the item about the book, how can I tell your bot that the good item is the one I created? Snipre (talk) 09:02, 24 October 2013 (UTC)
Yes, sorry, I will build a feature to ignore pages. Sorry to bother you @snipre:.
Can't you just stop the importation? Once your bot has done the importation there is no interest in continuing to look for new elements: wikidata has its starting data set and can now live without wikipedia. The goal is to attract people to wikidata, not to import data that people are adding in wikipedia. As a Wikidata contributor I work in wikidata and I don't fix errors in wikipedia, so bots should do the importation task only once and not as an ongoing task, because there will be no way to correct data if each correction is erased by a wrong data importation. This is not a criticism, just a simple remark. Snipre (talk) 17:24, 24 October 2013 (UTC)


VIAFbot is adding invalid numbers:

--Miredcessy (talk) 19:32, 24 October 2013 (UTC)

Thanks @Miredcessy: I'll stop the bot and inspect this now. Maximilianklein (talk) 19:43, 24 October 2013 (UTC)
There was a bug in my regular expression. Fixed now. Will remove those bad ones, thanks again @Miredcessy: Maximilianklein (talk) 20:45, 24 October 2013 (UTC)


Stop the importation of ISBN data: your bot is doing the wrong things: 1) the bot adds ISBN-10 numbers as values of the ISBN-13 property (see here or here), 2) your import doesn't respect the help:sources guidelines, which split works and editions into different items, 3) your import doesn't respect the constraints, especially the format constraint of ISBN-13 (see Property_talk:P212), and now we have to treat manually more than 6000 statements (see Wikidata:Database_reports/Constraint_violations/P212#Format). Snipre (talk) 07:54, 25 October 2013 (UTC)

@snipre: We don't have to fix them manually; I will fix them with my bot. I will take care of these database constraints. I talked to Kolja and thought we were moving toward merging ISBN-10 and ISBN-13. No problem though, I will migrate these over to the new property.
Regarding the edition/work difference, the only source I am adding is "imported from xx:wikipedia". Help:sources refers to when a claim on an item1 uses another item2 as a source. Then item2 has to be an edition linked to a work item3. So that is a different issue that I am not violating. Maximilianklein (talk) 17:38, 25 October 2013 (UTC)
Ok, if you can treat the format problem with the bot, but it would be simpler to do it when adding the statement.
About the source definition, this is not like you describe: normally you don't have to add any edition parameter to the work item. And most of the items about books are work items, because they focus on the work and not on a specific edition. We can make an exception if the edition parameters are for the original edition, but this means you have to be sure that the edition parameters are relevant for the original edition when adding this data. You can see the problem by adding ISBN numbers from different wikipedias: depending on the language we can add more than one ISBN number because of the translations. For me, ISBN numbers can't be added by bots because you need to assess whether they are relevant for the edition described in the item. Snipre (talk) 13:16, 28 October 2013 (UTC)


Hi Max, sorry to bother you again but VIAFbot seems to have problems with ISBN-10 (P957):

  1. removed hyphens: 0-14-118126-5
  2. added: 0141181265
  3. added: 0141181265 (= duplicate)

See also Property talk:P957#changing english descrption. --Kolja21 (talk) 23:57, 25 October 2013 (UTC)

@Kolja21: I'm fixing it right now. I am going through the database constraint reports. Maximilianklein (talk) 00:00, 26 October 2013 (UTC)
Great. Thanks for the fast reply. --Kolja21 (talk) 00:02, 26 October 2013 (UTC)
@Kolja21: As I'm fixing this bug, I'm realizing that this is another argument for making ISBN-10 and ISBN-13 one property. In Wikipedia infoboxes they are stored as one parameter; what VIAFbot would be doing now is splitting that out into two properties. That means that reusing this data back in infoboxes will be difficult, because there will be two properties to choose from. Can you imagine an infobox that has two fields, ISBN-13 and ISBN-10? It would look a little bit silly in my opinion. Maximilianklein (talk) 00:46, 26 October 2013 (UTC)
Also, the ISBN-13 constraint regex is wrong, because the valid ISBN-13 on A Deadly Secret: The Strange Disappearance of Kathie Durst (Q4656350) shows up on the database constraint report. What do you want to do about that?
1) I think we should convert all ISBNs to ISBN-13. In Wikidata this is the main property. We started with ISBN-13 in March, but since a lot of users entered the 10-digit number, imho it's good to provide them with an ISBN-10 property. (I know from discussions on WP that some users insist on the "original" number, but we can add a comment like: "ISBN-13 preferred. Please use a converter.") Both numbers are correct, and a bot can once in a while convert ISBN-10 into ISBN-13 and afterwards delete ISBN-10. (In the worst case, if users disagree with deleting, we can treat ISBN-10 as a duplicate that does no harm.)
2) An ISBN-13 has five elements: EAN (new) - Group (= language/territory) - Publisher - Title - Check digit
A Deadly Secret: The Strange Disappearance of Kathie Durst (Q4656350):
ISBN-13: 978-0425192078 should be converted into ISBN 978-0-425-19207-8
--Kolja21 (talk) 04:02, 26 October 2013 (UTC)
@Kolja21: The problem is how you can detect the different parts in an ISBN number when you have something like 978-0425192078. You could have 978-0-4251-9207-8 or 978-04-251-9207-8. For me, the only thing you can do when you have a number like 978-0425192078 is to format it as 978-042519207-8. Snipre (talk) 13:21, 28 October 2013 (UTC)
@Snipre: I don't know how ISBN converters manage to calculate the correct number, but there must be an algorithm (or a database?). Imho it's better to enter ISBN 9780425192078 (without any hyphens) instead of using a third variant. --Kolja21 (talk) 14:52, 28 October 2013 (UTC)
@Kolja21: ISBN-13s are never ambiguous, even without hyphenation. When I use the pyisbn library to convert ISBN-10 to ISBN-13 it gives back the '123-1234567890' form, which is still perfectly human- and machine-readable. I was taking out the human hyphens because, in my mind, hyphenation just confuses and complicates the identifier, leaving more room for error. But now I will import them just as they are in the Wikipedias. I'm sorry this has become such a problem, but I'm really dedicated to fixing it. Maximilianklein (talk) 18:25, 28 October 2013 (UTC)
O.k., but please keep in mind that "perfectly readable" is missing the point. For human readers '123-1234567890' is just a collection of numbers, without the basic info about the language and publisher of a book. --Kolja21 (talk) 00:22, 29 October 2013 (UTC)
The bot still has problems. Example: Globalization and Its Discontents (Q27466)
  1. 24. Okt. 2013‎ VIAFbot adds: ISBN-13 0-393-05124-2
  2. 24. Okt. 2013‎ VIAFbot adds: Source for ISBN-13
  3. 25. Okt. 2013‎ VIAFbot adds: ISBN-10 0393051242
The duplicate is not only missing the hyphens but also the source. "ISBN-13" has not been deleted. --Kolja21 (talk) 00:52, 29 October 2013 (UTC)
Thank you. I am fixing these now; it is going to take a while, as I am relying on the database constraints bot to understand the mistakes. Bot throttling has also increased. Please bear with me, it might take a while, but I am dedicated to fixing and importing ISBNs properly. Maximilianklein (talk) 00:56, 29 October 2013 (UTC)
@Kolja21: OK, I see the problem. The constraints report does not necessarily remove items from the list even when they have no ISBN-13 any more, which caused some double adding of ISBN-10. It's not wise to operate off those constraint reports anymore, since they can be out of date. So here is my plan: I will finish the import from the Wikipedias, distinguishing between ISBN-10 and ISBN-13. Then I will go back and remove ISBN-10s that are in the ISBN-13 property (which occurred before there was an ISBN-10 property). And then I will attempt to remove duplicates for both properties. Does that sound like the right plan? Maximilianklein (talk) 17:14, 30 October 2013 (UTC)
Sounds like a lot of work: Good luck! (I'm afraid the ISBNs will keep bothering us. As Snipre pointed out the help:sources guidelines splits work and editions into different items. I've no idea how a Wikipedia article - linked with the work - will be able to use the ISBN of an edition ...) --Kolja21 (talk) 20:35, 30 October 2013 (UTC)
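The cleanup plan described above can be sketched in outline. This is a hedged sketch operating on plain dicts of strings, not live pywikibot claim objects, and it strips hyphens for comparison:

```python
def migrate_isbn_claims(claims):
    """Sketch of the cleanup plan: given {'P212': [...], 'P957': [...]},
    move 10-digit values found under ISBN-13 (P212) to ISBN-10 (P957)
    and drop duplicates within each property.

    Operates on plain dicts of strings (not live pywikibot claims);
    hyphens are stripped so duplicates are detected regardless of form.
    """
    p212 = [v.replace("-", "") for v in claims.get("P212", [])]
    p957 = [v.replace("-", "") for v in claims.get("P957", [])]
    moved = [v for v in p212 if len(v) == 10]   # ISBN-10s misfiled as ISBN-13
    kept = [v for v in p212 if len(v) == 13]
    # dict.fromkeys de-duplicates while preserving first-seen order
    return {
        "P212": list(dict.fromkeys(kept)),
        "P957": list(dict.fromkeys(p957 + moved)),
    }
```

The same pass could be extended to convert the moved ISBN-10s to ISBN-13 instead, if the merge direction discussed above is preferred.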

VIAFbot: property / author[edit]

I didn't like this edit: [7] An Introduction to the Rock-Forming Minerals (editions) (Q4750099). Regards --Chris.urs-o (talk) 05:46, 5 November 2013 (UTC)
I can see that's not a fantastic edit. The problem was that the method I use looked through the English Wikipedia infobox under the author parameter and found w:en:Robert_Andrew_Howie, except that that author now redirects to his book. Not much can be done to automatically detect this, so thanks for pointing it out. Maximilianklein (talk) 18:29, 5 November 2013 (UTC)
Ok ;) --Chris.urs-o (talk) 15:48, 6 November 2013 (UTC)

VIAFbot: Also known as Member of[edit]

It looks like in July your bot added text like "Member of Royal Acadamy" and "Member of Political Economy Club" to the Also Known As field on a bunch of pages about people. Should these all be moved to the description field instead? I started to change them manually, but I keep finding more, so maybe there are too many for a human to do. Arctic.gnome (talk) 22:39, 5 November 2013 (UTC)

@Arctic.gnome: Ok, I hadn't seen that. So would you like me to automatically delete or move aliases that begin with "Member of [X]"? Maximilianklein (talk) 17:08, 7 November 2013 (UTC)
Moving them all to the description field would be good, thanks. --Arctic.gnome (talk) 21:04, 9 November 2013 (UTC)

VIAFbot: pt - a lot of mistakes[edit]


NBS (talk) 20:55, 9 November 2013 (UTC)

@NBS: I see that mistake, I am planning a big fix of the mistakes that harvest infobox book has made, and I'll include this problem. Thanks for pointing it out. Maximilianklein (talk) 21:14, 9 November 2013 (UTC)


Hi, it appears that VIAFbot creates claims pointing to Universe if the infobox has a redlink there. There's author (P50) mentioned above, but it's for illustrator (P110) as well; example. --Magnus Manske (talk) 13:04, 11 March 2014 (UTC)

Forgot: all P110 instances pointing to Universe, as of last week (Wikidata is a little slow with the data dumps these days...) --Magnus Manske (talk) 13:07, 11 March 2014 (UTC)
@Magnus Manske: thanks for reporting that. I will address this bug. Maximilianklein (talk) 20:43, 19 March 2014 (UTC)

Also happened for P136 (example). -JanZerebecki (talk) 18:39, 22 March 2015 (UTC)

@Maximilianklein: that would explain why "Kung Fu High School" and "The Complete Manual of Suicide" were of genre (P136) the Universe ^^ The full list of entities with P136:Q1 -- Zorglub27 (talk) 17:44, 9 June 2015 (UTC)

Thanks,  at . Maximilianklein (talk) 19:25, 15 June 2015 (UTC)

Is anything happening on this issue? I'm seeing a lot of erroneous entries that still need to be fixed up, e.g. for genre. Cheers, Bovlb (talk) 16:37, 23 August 2016 (UTC)

Never mind. I fixed it. Bovlb (talk) 17:13, 23 August 2016 (UTC)
@Bovlb: Great, I'm pleased it got fixed. Maximilianklein (talk) 07:39, 1 September 2016 (UTC)

Reporting VIAF duplicates?[edit]

As you are somewhat in the know on these things, can you please answer …

  1. Is there a reporting process for when duplicates are found in VIAF? Do they care to have them reported, or do they simply believe that they will be found and removed in time? If they are to be reported, can you point to the process?
  2. When there are duplicates, which way will a merge happen? Is there a defined process, or is it a bit of luck?

I am trying to work out how to handle differences with regard to what is here and what is in enWS, and also how to handle them here. I have been going through and tidying and managing where I can, though it would be useful to have some authoritative statements to aid those activities. Thanks for the help where you can provide it. Would you please {{Ping}} me when you respond.  — billinghurst sDrewth 04:02, 4 April 2014 (UTC)

+1. Especially SUDOC is a good candidate for dups since SUDOC authority files are often imported to VIAF without book titles. This would be worth looking at. --Kolja21 (talk) 05:46, 4 April 2014 (UTC)

Strange alias[edit]

Hello, I noticed that VIAFbot added "Friend to the Church of England and a lover of truth and peace," to Q6230004 in this edit. I don't know how it happened, but I thought I'd let you know in case similar errors were made elsewhere that could be reversed. --Jfhutson (talk) 19:13, 3 October 2014 (UTC)

Another one: [8] --Jfhutson (talk) 20:50, 3 October 2014 (UTC)
@Jfhutson: Sounds strange but it's correct. VIAF:77377038: Jan Dounam, name variant LoC and ISNI = "Friend to the Church of England and a lover of truth and peace". It's a pseudonym of this author.[9] --Kolja21 (talk) 20:58, 3 October 2014 (UTC)
Interesting. I'm a little dubious that this should be included, but if it is common practice I'll leave it alone. --Jfhutson (talk) 21:09, 3 October 2014 (UTC)
All I'm saying is: It's not a bot error. If a name variant imported from VIAF should be kept depends on the individual case. --Kolja21 (talk) 21:13, 3 October 2014 (UTC)

Many ISBNs inserted in Wikidata in 2013 were invalid[edit]

I have a list of them. I think the bot should go back and remove them.

Wikidata items with invalid ISBNs. Property:P957 (list in edit history) -- Magioladitis (talk) 09:20, 2 April 2016 (UTC)

And those on Property:P212 (list in edit history)

-- Magioladitis (talk) 09:23, 2 April 2016 (UTC)

@Magioladitis:, thanks. I'll clear those up soon.

Quick note: Some may have been fixed by other editors meanwhile. -- Magioladitis (talk) 09:07, 5 April 2016 (UTC)

You may need:

-- Magioladitis (talk) 09:11, 5 April 2016 (UTC)

@Magioladitis: Done. I went through those lists and ran a python (pyisbn) validator on them. If they were invalid, I checked to see if WorldCat had an entry for them. Surprisingly, in about 5 cases "invalid" isbns resolved to entries in WorldCat, so I left them untouched. Maximilianklein (talk) 20:59, 10 April 2016 (UTC)

Thank you very much! -- Magioladitis (talk) 06:22, 12 April 2016 (UTC)

Katherine Mayo (Q1736246)[edit]

Hey. I saw your edit 4 years ago. How did you do that? Do you still run this bot? We could really use this in hebrew.--Mikey641 (talk) 16:20, 27 January 2017 (UTC)

@Mikey641: The way I did this was that I compared all the labels that were at VIAF, using the VIAF ID, with all the labels on wikidata, and added any missing ones. So the bot is only as clever as VIAF. It's possible that VIAF has been updated and has some extra names in the last 3 years, but I don't think it will be a huge gain. I have the code on github [10]. Maximilianklein (talk) 16:40, 29 January 2017 (UTC)
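The comparison described here can be sketched roughly as follows. This is a hypothetical helper, not the actual github code; the Levenshtein cutoff hinted at by the 'lev65' filename mentioned later is omitted:

```python
def missing_aliases(viaf_names, labels, aliases):
    """Return VIAF name variants not already present on the item.

    A rough sketch of the comparison described above: names match
    case-insensitively after trimming whitespace, so only genuinely
    new variants are proposed as aliases.
    """
    have = {n.strip().casefold() for n in list(labels) + list(aliases)}
    return [n for n in viaf_names if n.strip().casefold() not in have]
```

The returned list would then be written to the item's alias field (the write step, via pywikibot, is omitted here).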
Thank you! I'm gonna use this code.--Mikey641 (talk) 18:51, 29 January 2017 (UTC)
@Mikey641: Great, tell me if it doesn't work. 20:41, 30 January 2017 (UTC)
@ Haha, who are you?--Mikey641 (talk) 20:57, 30 January 2017 (UTC)
@Mikey641: Sorry, that was me. Maximilianklein (talk) 21:40, 30 January 2017 (UTC)
Ahhh... wow, going by the name I wouldn't have guessed that you know Hebrew :-) --Mikey641 (talk) 21:42, 30 January 2017 (UTC)
Hey. What is the file "hasAKALC_shrink_lev65.json" and where can I download it from. Thanks--Mikey641 (talk) 17:51, 11 February 2017 (UTC)
@Mikey641: I sent you an email; I need to send you the data privately. Maximilianklein (talk) 23:03, 12 February 2017 (UTC)