User talk:Bargioni


Welcome to Wikidata, Bargioni!

Wikidata is a free knowledge base that you can edit! It can be read and edited by humans and machines alike and you can go to any item page now and add to this ever-growing database!

Need some help getting started? Here are some pages you can familiarize yourself with:

  • Introduction – An introduction to the project.
  • Wikidata tours – Interactive tutorials to show you how Wikidata works.
  • Community portal – The portal for community members.
  • User options – including the 'Babel' extension, to set your language preferences.
  • Contents – The main help page for editing and using the site.
  • Project chat – Discussions about the project.
  • Tools – A collection of user-developed tools to allow for easier completion of some tasks.

Please remember to sign your messages on talk pages by typing four tildes (~~~~); this will automatically insert your username and the date.

If you have any questions, don't hesitate to ask on Project chat. If you want to try out editing, you can use the sandbox to try. Once again, welcome, and I hope you quickly feel comfortable here, and become an active editor for Wikidata.

Best regards! --Epìdosis 13:41, 23 April 2019 (UTC)

Problematic clusters

@Epìdosis: Obviously we are also importing VIAF's own errors. Identifying them (only by eye, I fear) will help VIAF correct them. Thank you for your support. Next imports no earlier than Friday. --Bargioni (talk) 21:30, 12 November 2019 (UTC)

Great work with VIAF! --Alexmar983 (talk) 02:35, 15 November 2019 (UTC)

@Alexmar983: Thanks. But we are only at the beginning, that is, at one tenth of the work. Today I'll continue with the imports.
Because of VIAF's clustering process, and above all because of the data it harvests from Wikidata, this alignment will have to be repeated. I should write down somewhere how it is done. What is the most suitable place? --Bargioni (talk) 09:18, 15 November 2019 (UTC)
I'd say some thematic project about identifiers, not one specific to a country. User:Epìdosis, today is the first day of WSC2019 in hundreds of countries; can you point out the best one? --Alexmar983 (talk) 11:28, 15 November 2019 (UTC)
There are several options: Wikidata talk:Identifiers should really serve for general discussions about identifiers, so it is not suitable; Wikidata talk:WikiProject Properties is too generic; Wikidata talk:WikiProject Biographical Identifiers is too specific (VIAF is not only biographical, quite the opposite!); Wikidata:Project chat is far too generic; in conclusion, I think the best choice is Property talk:P214. --Epìdosis 13:51, 15 November 2019 (UTC)
Yes, but given how crucial this identifier is, I would add a pointer from the identifiers project if nobody shows up after a few days. --Alexmar983 (talk) 17:45, 15 November 2019 (UTC)
True. The best thing is to write on the identifier's talk page and then leave notices elsewhere. --Epìdosis 17:49, 15 November 2019 (UTC)

VIAF confusion

Are you actually checking VIAF records before adding them? I already reverted your addition of VIAF 110615942 to Q1173887 and I said why in the edit summary. That VIAF record is linked to a mixture of records for the person described by Q1173887 and a similarly named person Q73178408. This situation is not uncommon. It is bad enough that people add these without checking them, but when it's been reverted by someone who takes the time to explain the problem, and the same editor just puts it back, this is even worse. All we are doing is pointlessly reproducing VIAF's errors. Iepuri (talk) 11:38, 16 November 2019 (UTC)

@Iepuri: Hi! Thank you for your observation. At the moment Bargioni is just importing all the cases in which a VIAF cluster links to Wikidata, as I suggested to him here; the next step will be to find problematic cases and list them, in order to send a report to VIAF. The old en:Wikipedia:VIAF/errors is no longer used (but probably contains a lot of still-useful reports), because VIAF prefers to receive reports at bibchange@oclc.org; I think, however, it would be useful to create a new page for reports here on Wikidata (the title might be Wikidata:WikiProject Authority control/VIAF errors) in order to have a collective list of problematic clusters - this page would also be useful for progressively checking whether the reports contained in the old en.wikipedia page have been addressed or not. What's your opinion? Bye, --Epìdosis 12:11, 16 November 2019 (UTC)
@Iepuri: Thank you for your correction(s). I know that while adding a lot of VIAF ids, I'm importing (a very small number of) VIAF errors too. Their clustering process may create problems that, in my opinion, can only be resolved by humans. --Bargioni (talk) 15:02, 16 November 2019 (UTC)
And Convento della Santissima Annunziata (Q34351913), Monastero di San Vittore (Milan, Italy) (Q30063100), Palazzina del Belvedere (Q30134887). --Yiyi .... (talk!) 20:21, 18 November 2019 (UTC)
I'm trying to follow the data import in order to clean up potential errors, but we should define a structured way to provide feedback on such errors to VIAF and their sources. For example, viaf/246353202 and viaf/313420111 should refer to the same church Q24034979. The error comes from two different GND items, 7719397-0 and 1065019327, which should be merged. Instead, in VIAF they are kept disjoint and have different links to the Wikidata items for the church and for the village where the church is located. Another example is a church named after San Nicola that in VIAF merges sources related to very different places. How should we manage such a process? Pietro (talk) 17:53, 20 November 2019 (UTC)

VIAF

Please pay a bit more attention. Do you really believe the Hungarian handball player is the same as the Italian physician? -- Marcus Cyron (talk) 17:06, 21 November 2019 (UTC)

@Marcus Cyron: Hi, the modification of Gabriella Landi (Q57981021) was made by a batch job that is part of a large project: adding about 570,000 VIAF ID (P214) statements to Wikidata items, using data from Virtual International Authority File (Q54919). This means that unfortunately we also import some wrong associations introduced by VIAF itself. So, sorry. And please feel free to modify, delete or deprecate this new value. --Bargioni (talk) 17:52, 21 November 2019 (UTC)
Because yesterday I reverted around 90% of the additions in the field of badminton players (not clubs), I suggest not proceeding with this project, or reverting all batches so that users can re-add the few correct entries. People look at Wikidata as a very correct database, but now we are putting a lot of wrong things into it, making incorrect databases seem more reliable. I did not check professions other than the badminton-related ones, but if it looks similar there, some action must be taken. Probably too many entries are not on watchlists either, so the wrong data will remain here for a long time and be used as incorrect input for other databases (@Marcus Cyron:). Another solution would be that you, Bargioni, check every line you added yourself for independent sources, and revert every unsourced entry yourself. Florentyna (talk) 06:15, 22 November 2019 (UTC)
Amen. The decision to knowingly import false data is strange and, in my opinion, wrong. -- Marcus Cyron (talk) 07:02, 22 November 2019 (UTC)
@Florentyna, Marcus Cyron: In my opinion we should first do the synchronisation and then check the data: I checked about one hundred added identifiers in the field of ancient Greek and Roman authors and they are nearly always correct, so correctness probably depends on the field. Simply ignoring VIAF is not the right way: we should import the data, then collect the wrong data on one page and report them to VIAF, in order to have them corrected. --Epìdosis 08:55, 22 November 2019 (UTC)
@Florentyna: If I'm not wrong, badminton players with a VIAF ID (P214) can be selected using
select ?Q ?viaf where { ?Q wdt:P214 ?viaf ; wdt:P106 wd:Q13141064 . }
(result is 225). We can thus delete VIAF ids from items of badminton players in a simple way. Of course, if a badminton player is also an author, we would like to preserve his/her VIAF ID (P214).
As @Epìdosis: noticed, it is also interesting to gather the VIAF errors imported by my batches: we could help VIAF's clustering process, which is as hard as any other reconciliation process. --Bargioni (talk) 10:38, 22 November 2019 (UTC)
Exactly, I was going to suggest that query. I've just created Wikidata:WikiProject Authority control/VIAF errors, where all errors should be listed in order to help VIAF with the clustering. The import will finish in a few hours, then we will start with some tidy-up: I already have two ideas for removing wrong clusters ;-) --Epìdosis 10:43, 22 November 2019 (UTC)
"We can thus delete VIAF ids from items of badminton players in a simple way" If you do that, you will also be deleting some good data. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:33, 6 December 2019 (UTC)
I'm still cleaning up bad matches from this batch. Example: [1]. What are you doing to reduce this burden on me and other volunteers? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:32, 6 December 2019 (UTC)
@Pigsonthewing: Problematic cases are being listed at Wikidata:WikiProject Authority control/VIAF errors; when you find errors, please add them there so they can be reported to VIAF. --Epìdosis 11:04, 6 December 2019 (UTC)
We tried that on Wikipedia: en:Wikipedia:VIAF/errors#Status of this page. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:03, 6 December 2019 (UTC)

VIAF confusion (2)

Hi! I have reverted your VIAF link after checking that it does not refer to the same cave, even though it has the same name. Although both caves are located in the Region of Murcia, the cave referred to in Q8352480 is located in Cartagena, while the one your link refers to is in Pliego. Greetings, P4K1T0 (talk) 10:56, 22 November 2019 (UTC).


It seems to me that adding these VIAF statements in this manner (automatically or semi-automatically) is generally a bad idea. I've checked some identifiers added to places in Estonia, and very often the problem is that the VIAF entry is rather vague or mixes different entities, so that it's rather impossible to tell which Wikidata item it should actually match. E.g. this VIAF entry is currently associated with a settlement, but names like "Tapa Region" suggest that it rather matches something else. This something else may be the rural municipality, but I don't really know, as the VIAF entry provides pretty much no clear context. Or, this VIAF entity is currently associated with a settlement, but some alternative names suggest that it may match the parish instead. Or, for example, there are three settlements named "Kurtna" in Estonia, and it's unclear which of these the VIAF entry is about and why it's associated with this Kurtna. 2001:7D0:81F7:B580:F1CB:428E:AC85:CC4A 11:00, 22 November 2019 (UTC)

Batch too coarse

I think your huge VIAF batch was too coarse. Please see here. Asaf Bartov (talk) 13:07, 22 November 2019 (UTC)

I answered on the appropriate page. --Epìdosis 13:42, 22 November 2019 (UTC)

Synchronisation complete - new challenges

Dear friend, the 23 batches are finally complete! I announced the finished synchronisation at Property talk:P214#Recent synchronisation, inviting people above all to use the new page Wikidata:WikiProject Authority control/VIAF errors to report confused clusters. Let me leave you here a few ideas I have for further refining our connections with VIAF:

  1. delete all the VIAF ID (P214) that link to clusters containing only "undifferentiated" and/or "sparse" IDs (I don't know whether they can easily be spotted in the dumps) - there are probably not many, but they should undoubtedly be removed;
  2. use the fields to build lists of possibly wrong identifiers: e.g. list the items without instance of (P31) human (Q5) whose VIAF ID (P214) points to a cluster in the "Personal Names" field, or the items with sport (P641) whose VIAF ID (P214) points to a cluster of the "Geographic Names" type ... and similar things; we can think about which lists to try to obtain (here too, I don't know whether the field a cluster belongs to can easily be extracted from the dumps);
  3. start examining the items with instance of (P31) human (Q5) that have more than one VIAF ID (P214), one of which was added in this import; I have already prepared a rough list containing about 16,000 items, but I think it can be filtered further; I need to think about it. This last point, in any case, requires a manual check, while the first is automatic and the second can be almost automatic.

In short, now the fun begins: the error hunt! At the same time it is important that VIAF notices the page Wikidata:WikiProject Authority control/VIAF errors as soon as possible, so that it starts taking care of those reports (and the others that will be added in the coming days, weeks and months). In the meantime, congratulations again on the excellent work! I'm sorry it has drawn some criticism on you, in my opinion undeserved (it is obvious that every import of this kind carries a certain percentage of errors, but the only solution is a manual, and possibly semi-automatic, check afterwards, like the one we are now going to set up - not rejecting the import and pretending nothing happened), but I'm sure most of the community will greatly appreciate this big piece of work. Thanks again and talk to you in the coming days! Good night :) --Epìdosis 00:05, 23 November 2019 (UTC)

@Epìdosis: Thank you for the initial idea, for the checks during the batches and for the support in countering the criticism. Indeed a complex phase is now starting which, besides what you rightly say, includes - I'd say - tracking merges and new links between Q items and VIAF. I presume we should exploit the differences between the files "http://viaf.org/viaf/data/viaf-AAAAMMGG-persist-rdf.xml.gz" and the files "http://viaf.org/viaf/data/viaf-AAAAMMGG-links.txt.gz" month by month. This way we should avoid repeating operations already done and, above all, avoid re-importing the wrong VIAFs. As for what you say above, it seems to me you are envisaging an analysis that, lacking a search facility over the VIAF data, requires the Wikidata data and the VIAF cluster data on the same machine. If I'm wrong, tell me. If I'm right, it is quite an undertaking :-) , given the number of clusters and hence the need for a very powerful system. I'll think about it. --Bargioni (talk) 11:20, 23 November 2019 (UTC)
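The month-by-month comparison described above could be sketched roughly like this. It is only an illustration: the assumed line format "cluster-URI<TAB>SOURCE|local-id" for the viaf-links file is a guess that must be checked against the real dump before use.

```python
# Sketch: compare two monthly viaf-links dumps to find Wikidata (WKP) links
# that appear only in the newer file, so links already imported are not
# processed again. Line format "cluster-URI<TAB>SOURCE|local-id" is assumed.

def wkp_links(lines):
    """Collect (viaf_id, qid) pairs for lines whose source is WKP (Wikidata)."""
    pairs = set()
    for line in lines:
        cluster, _, source = line.strip().partition("\t")
        if source.startswith("WKP|"):
            pairs.add((cluster.rsplit("/", 1)[-1], source[len("WKP|"):]))
    return pairs

def new_links(old_lines, new_lines):
    """Pairs present in the newer dump but absent from the older one."""
    return wkp_links(new_lines) - wkp_links(old_lines)
```

Only the new pairs would then need to be checked and, if correct, imported.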
@Epìdosis: The clusters built only on sparse sources, or only on undifferentiated ones, can be extracted from the dump file http://viaf.org/viaf/data/viaf-20191104-clusters-rdf.xml.gz, with a dedicated filter that also checks for the presence of the WKP (Wikidata) source:

curl -s 'http://viaf.org/viaf/data/viaf-20191104-clusters-rdf.xml.gz' | gzip -cd | grep -E 'sparse|undifferentiated' | filtro
The filter still has to be written, but that's not a promise :-) The command will in any case be very slow, given the size of the .gz file (6 GB?)
Similarly, we could build a list of  VIAF | type | Q  that would let us work on the type-2 cases. But it would have very many rows, so I wouldn't know how to use it in a SPARQL query. --Bargioni (talk) 10:58, 25 November 2019 (UTC)
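The unwritten "filtro" stage in the pipeline above might look like the following sketch. It assumes one cluster record per input line (as produced by the grep stage) and uses string/regex heuristics instead of real RDF/XML parsing, so it is a starting point, not a finished tool.

```python
# Rough sketch of the "filtro" stage: keep cluster records that mention
# sparse/undifferentiated sources AND contain a WKP (Wikidata) link,
# emitting the VIAF id and the Q id. One record per line is assumed.
import re

VIAF_ID = re.compile(r"viaf/(\d+)")
QID = re.compile(r"WKP\|(Q\d+)")

def filter_clusters(lines):
    """Yield (viaf_id, qid) for suspicious clusters linked to Wikidata."""
    for line in lines:
        if ("sparse" in line or "undifferentiated" in line) and "WKP|" in line:
            viaf, qid = VIAF_ID.search(line), QID.search(line)
            if viaf and qid:
                yield viaf.group(1), qid.group(1)
```

Fed from the gzip/grep pipeline, this would print one candidate pair per line for manual review.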

Happy New Year!

Hi! I wanted to stop by to thank you for your latest round of additions of Pontificia Università della Santa Croce ID (P5739) and to wish you a happy 2020! I'm sure we will be able to continue the work we started in recent weeks in the best possible way. See you soon, --Epìdosis 13:11, 31 December 2019 (UTC) P.S. I just found a VIAF redirect; if and when you have time, we could consider a new clean-up pass over redirected or deleted VIAF ID (P214)

@Epìdosis: Just as well, you wouldn't have found me: I'm on holiday until the 6th. But it would have been a great pleasure, so it has to be done.
I received useful material from the library manager of the library related to Angelicum ID (P5731), so I launched the last batch.
As for ongoing maintenance of VIAF ID (P214) based on the monthly dumps of the VIAF project, it is certainly worth stabilising the procedures. We should also think about the excessive gap between the number of VIAF clusters and the number of Wikidata items with VIAF ID (P214). That is, I'd say Wikidata is missing a great many authors. Happy New Year. --Bargioni (talk) 13:42, 31 December 2019 (UTC)
Hi, I wanted to tell you that I've just noticed that the VIAF update (fixing redirected clusters and deleting removed clusters), as well as ISNI and NLA, is performed periodically by KrBot, so I don't think a pass of ours is needed for this aspect; the last pass was on 10 January. Talk to you in the coming days. Have a good Sunday, --Epìdosis 12:06, 12 January 2020 (UTC)
@Epìdosis: Thanks, very useful info. We can work on LCNAF and above all GND: I've found a way. But maybe it should be discussed with others, see Property_talk:P227#(careful)_import_from_VIAF?. --Bargioni (talk) 14:26, 12 January 2020 (UTC)

Incorrect VIAF on Q60527326

Good morning, I have reverted your VIAF changes on Office of Inspector General, Export-Import Bank of the United States (Q60527326) because that VIAF record refers to the parent organization. William Graham (talk) 14:49, 17 January 2020 (UTC)

@William Graham: Thanks a lot. Errors like this come from VIAF, unfortunately. We have to push them to fix their clusters. So please add the item Office of Inspector General, Export-Import Bank of the United States (Q60527326) to the page Wikidata:WikiProject Authority control/VIAF errors. --Bargioni (talk) 16:00, 17 January 2020 (UTC)
@Bargioni: I can do that. However, I want to note that even though you are using an external data source and semi-automated tooling, you are still responsible for the accuracy of the data you add to Wikidata, and you have a responsibility to manually vet your changes and/or discontinue using your tools if you are knowingly inserting incorrect data. Thank you and have a nice day. William Graham (talk) 16:14, 17 January 2020 (UTC)
@William Graham: The edit was correct, I restored it: https://viaf.org/viaf/154029415/ refers to Office of Inspector General, Export-Import Bank of the United States (Q60527326) and https://viaf.org/viaf/130175936/ refers to Export-Import Bank of the United States (Q1384697). Bye, --Epìdosis 17:07, 17 January 2020 (UTC)
@William Graham: Sorry for bothering you once again... my first reply was wrong. Thanks to Epìdosis, I can say that my batch job didn't generate any error. As you know, the VIAF project uses clusters. In VIAF, Q60527326 was associated with cluster https://viaf.org/viaf/288007377, and now it is part of cluster https://viaf.org/viaf/154029415/. So, if you click on the current (let's say, your) P214 in Q60527326, you will be redirected by VIAF to the cluster identified by the second value. This means, in my opinion, that what you dislike is independent of my batch job, since both values now link to the same VIAF cluster. --Bargioni (talk) 17:16, 17 January 2020 (UTC)
@Bargioni: Sorry for my confusion, need my coffee before editing in the morning. :) William Graham (talk) 17:17, 17 January 2020 (UTC)
@William Graham: Me too :-) Do not add Q60527326 to the page Wikidata:WikiProject Authority control/VIAF errors. Bye! --Bargioni (talk) 21:55, 17 January 2020 (UTC)

VIAF - geographic errors

VIAF - Italian municipalities

Hi! Before going away for next week's exam, I wanted to tell you something I noticed today: consulting Wikidata:Database reports/Constraint violations/P214, I realised that our November import significantly increased (as expected, but perhaps more than expected) the constraint violations, raising not only the "single value" ones from 12,000 to 32,000 (but those can easily be explained, in most cases, by VIAF errors) but also the "unique value" ones from 1,000 to 17,000 (and these should be our errors - quite a lot!). Well, I started going through the unique-value list to try to understand what kinds of errors there might be and ... I found the source of almost 4,000 of them: the Italian municipalities.

On Wikidata every Italian municipality (except perhaps the very recently created ones) has two items:

  • one for the municipality itself (with all the identifiers and all the Wikipedia pages: e.g. Albenga (Q241298));
  • one for its main settlement, the capoluogo (meaning the part of the municipality that shares the municipality's name but is distinct from the other frazioni it contains; these items exist because the Cebuano Wikipedia created such pages by bot based on GeoNames, so they tend to contain only cebwiki and GeoNames ID (P1566): e.g. Albenga (Q30022077)).

Now, for obscure reasons it appears that in 3824 cases VIAF put into its clusters (which very clearly refer to the municipalities as a whole) the useless Wikidata items of the capoluoghi, and then our import added VIAF ID (P214) to these items, thus creating a "unique value" violation with respect to the municipality items.

Here is the list of cases:

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q747074 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P31 wd:Q15303838 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],it,en". }
}

Consequently, it would be good to proceed as follows:

  1. insert the list of the 3824 cases into Wikidata:WikiProject Authority control/VIAF errors, as a table (like the "VIAF with sparse or undifferentiated records" section), with three columns: link to VIAF, incorrect Wikidata item, and correct Wikidata item
  2. remove VIAF ID (P214) from all 3824 capoluogo items with QuickStatements

If you want, I can take care of it after the 29th; otherwise you can proceed yourself, it should be a fairly quick job; this way we start decongesting the list of constraint violations. They could be a bit more careful over at VIAF, though ... Thanks a lot as always, talk to you at the end of the month! --Epìdosis 17:34, 20 January 2020 (UTC)
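Step 2 of the plan (the QuickStatements removal) can be sketched as follows, assuming the query results are exported as (item URI, VIAF id) pairs - that export shape is an assumption for illustration. In QuickStatements V1 syntax a leading "-" marks a statement removal.

```python
# Sketch: turn (item URI, VIAF id) rows from the query above into
# QuickStatements V1 removal commands ("-QID<TAB>P214<TAB>"value"").
def qs_remove_viaf(rows):
    """One removal command per wrong VIAF ID (P214) statement."""
    commands = []
    for item_uri, viaf in rows:
        qid = item_uri.rsplit("/", 1)[-1]   # ...entity/Q30022077 -> Q30022077
        commands.append(f'-{qid}\tP214\t"{viaf}"')
    return commands
```

The resulting lines can be pasted into QuickStatements and reviewed before running.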

@Epìdosis: Very interesting... said also somewhat ironically. Thanks for pointing it out. I'm looking into it. --Bargioni (talk) 16:57, 21 January 2020 (UTC)
✓ Done problem finally solved: VIAF has replaced the links to the capoluoghi in its clusters with links to the municipalities. --Epìdosis 20:52, 9 May 2020 (UTC)

VIAF - Dutch municipalities

Another 55 cases found in the Netherlands; the principle is exactly the same, I'd say (municipality vs main settlement):

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q2039348 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P131 ?comune .
  ?capoluogo wdt:P17 wd:Q55 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],it,en". }
}

I'll continue the search. --Epìdosis 20:49, 29 January 2020 (UTC)

VIAF - Belgian municipalities

Another 124 in Belgium:

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q493522 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P131 ?comune .
  ?capoluogo wdt:P17 wd:Q31 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],it,en". }
}

--Epìdosis 20:56, 29 January 2020 (UTC)

VIAF - Spanish municipalities

Another 110 in Spain:

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q2074737 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P31 wd:Q15303838 .
  ?capoluogo wdt:P17 wd:Q29 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],es,ceb". }
}

--Epìdosis 21:22, 29 January 2020 (UTC)

Removed invalid VIAFs might be re-imported from other wikis, as long as they still exist there

Hello Bargioni,

you recently removed a lot (about 4,000?) of VIAF IDs which became invalid over time (example). One problem is that many of them were imported from other wikis, mostly the German one (example). So the invalid IDs might still exist in the wikis they were imported/harvested from (e.g. using HarvestTemplates). Sooner or later they might be re-imported from those other wikis into Wikidata, as long as they are not removed there as well.

Do you know a way to mass-remove these invalid IDs in the wikis they were imported from as well? --M2k~dewiki (talk) 00:22, 23 January 2020 (UTC)

@M2k~dewiki: Hi, interesting question... Please confirm that an example of the issue you refer to is this one: the now-invalid VIAF 10512949, deleted by my batch from Wolfgang Leidig (Q1718929), is still present in https://de.wikipedia.org/wiki/Wolfgang_Leidig. And you are asking for a procedure to remove it from the de.wiki page as well. Please also ping @Epìdosis: in your reply. Thanks a lot. --Bargioni (talk) 13:55, 23 January 2020 (UTC)
Yes, @M2k~dewiki: refers to cases such as Wolfgang Leidig, and he is right: the obsolete IDs still present in some Wikipedias (mainly de.wiki, because most Wikipedias read identifiers only from Wikidata and don't keep them in their pages) can then be re-imported into Wikidata. The problem has already been noted here, and I think there is (or there would be) consensus for the removal of obsolete IDs.
My suggestion is: first of all, @M2k~dewiki: opens a thread at de:Hilfe Diskussion:Normdaten asking for consensus on correcting obsolete VIAF IDs (which includes deleting obsolete IDs and substituting redirected ones) and asking whether someone could program a bot to do this job periodically; if, once consensus about the correction is reached, a bot programmer has been found, all is well; otherwise, I will ask the Italian bot programmers I know for help. Do you agree with this plan? --Epìdosis 15:41, 23 January 2020 (UTC)
Hello @Epìdosis, Bargioni:, as suggested I started a discussion at de:Hilfe_Diskussion:Normdaten#Ungültige_VIAF-Kennungen_in_Artikeln. Thanks a lot! --M2k~dewiki (talk)

VIAF clusters with two or more Wikidata items

Hi! Today, after noticing (with some disappointment) this cluster, I thought that some share of our unique-constraint violations - I don't know how large - is due to the fact that some VIAF clusters bundle together two or more Wikidata items. Could you build a table based on the latest dump and paste it into Wikidata:WikiProject Authority control/VIAF errors/Two or more Wikidata items, so we can then check it gradually? Going forward we can directly remove the rows where they are right (i.e. when we realise Wikidata actually has duplicates, and merge them) or conclude that they are wrong and add a comment (so you can leave an empty cell on the right of each row). Thanks a lot as always and good evening, --Epìdosis 19:06, 10 February 2020 (UTC)

@Epìdosis: The situation is not rosy...: there are 4489 clusters in VIAF that contain different Wikidata items. And they are not just pairs. Here are the top cases:
occ VIAF cluster
6 http://viaf.org/viaf/308723015
6 http://viaf.org/viaf/242677520
6 http://viaf.org/viaf/122624553
select ?viaf ?q ?qLabel ?ins ?insLabel where {
  ?q wdt:P214 ?viaf ;
     wdt:P31 ?ins .
  values ?viaf {"308723015" "242677520" "122624553"}
  service wikibase:label { bd:serviceParam wikibase:language "it,en,de,fr,es,pt". }
}
order by ?viaf
Tomorrow I hope to build the table on the new page. --Bargioni (talk) 22:13, 10 February 2020 (UTC)
Good grief! There will be work to do ... let's do it, slowly! See you tomorrow :) --Epìdosis 22:16, 10 February 2020 (UTC)
@Epìdosis: Wikidata:WikiProject Authority control/VIAF errors/Two or more Wikidata items is born. If you edit it and click Preview, it may never respond. That happened to me. I fear it has too many links to build. --Bargioni (talk) 15:08, 11 February 2020 (UTC)
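A table like the one requested can be derived from a links dump by grouping Wikidata entries per cluster. This is a sketch under an assumed line format, "cluster-URI<TAB>WKP|QID", which should be verified against the actual dump.

```python
# Sketch: find VIAF clusters linked to two or more Wikidata items by
# grouping WKP (Wikidata) entries per cluster URI.
from collections import defaultdict

def multi_item_clusters(lines, min_items=2):
    """Map cluster URI -> set of QIDs, keeping clusters with >= min_items."""
    by_cluster = defaultdict(set)
    for line in lines:
        cluster, _, source = line.strip().partition("\t")
        if source.startswith("WKP|"):
            by_cluster[cluster].add(source[len("WKP|"):])
    return {c: q for c, q in by_cluster.items() if len(q) >= min_items}
```

Sorting the result by the number of QIDs per cluster reproduces the "top cases" ranking shown above.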
I arrived here because I saw this and this wrong edit. There is a problem with the VIAF, right? --Vanbasten 23 (talk) 20:52, 31 March 2020 (UTC)
@Vanbasten 23: VIAF has errors. My huge batch import (Nov 2019) (un)fortunately included them. Please append any errors you find to Wikidata:WikiProject_Authority_control/VIAF_errors. -- Bargioni 🗣 21:11, 31 March 2020 (UTC)
Bargioni, I recently created all the libraries in Spain and I would like to add the VIAF code to them; how can I do it? Thanks. --Vanbasten 23 (talk) 15:15, 1 April 2020 (UTC)

Viaf Alvesta kommun

The entry was wrong: that VIAF does not belong to the town of Alvesta. Yger (talk) 04:58, 11 February 2020 (UTC)

Spanish Libraries

Hi, @Vanbasten 23:, I prefer to open a new thread. Please let me know - even in Spanish, or in Italian, if you like - what you mean by "created all the libraries in Spain". New items? If so, please give me some examples. Thanks, sorry. -- Bargioni 🗣 15:28, 1 April 2020 (UTC)

Perfect, and thanks. Yes, new items. This is the query. There you can see only a few VIAF ids... I introduced the address, telephone number, email, instance of, descriptions, labels, coordinates... and I'm working on the image, Commons... but I don't have the identifiers... Thanks. --Vanbasten 23 (talk) 18:02, 1 April 2020 (UTC)
@Vanbasten 23: Hi! I've seen the message, so I'll try to leave just a little comment, as far as I know the topic. First of all, thank you for the great work you have done!!! Regarding the addition of VIAF ID (P214), I think it is probably not a priority: the great majority of libraries don't have a VIAF code, because national libraries (including National Library of Spain ID (P950)) don't usually have an identifier for them, with the exception of the biggest ones, which are covered because they have often published something or have been the subject of some publication. Probably only a few tens of libraries can receive a VIAF. I would look at ISIL (P791) instead, which is the most important identifier for libraries all over the world:
SELECT DISTINCT ?biblio ?biblioLabel ?isil ?viaf
WHERE {
  ?biblio (wdt:P31/(wdt:P279*)) wd:Q7075;
    wdt:P17 wd:Q29.
  OPTIONAL { ?biblio wdt:P791 ?isil. }
  OPTIONAL { ?biblio wdt:P214 ?viaf. }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],es". }
}
ISIL seems nearly always absent, but certainly hundreds (I guess) of libraries have one (I'm sure there are more libraries with ISIL than libraries with VIAF, at least). Unfortunately, not being a librarian, I don't know at the moment where to find some list of ISIL codes. Maybe you can ask a question in the talk page of Wikidata:WikiProject Libraries. @Bargioni:, of course correct me if I've said something wrong :) --Epìdosis 19:39, 1 April 2020 (UTC) P.S. Little advertisement to Italian public :)[reply]
Thank you so much. I asked Bartioni about the Viaf id because I saw that he had several batches about it, but yes, my idea is to introduce those identifiers that can contain libraries, like isil. I'll look, thank you very much. --Vanbasten 23 (talk) 20:17, 1 April 2020 (UTC)[reply]
@Vanbasten 23: I agree with @Epìdosis:: VIAF is for personal, corporate, meeting authors, places, ... but not libraries. The MARC org code is an ISIL compliant id that can be assigned to a library upon request. At the moment, 399 Spanish libraries have an ISIL code: https://www.loc.gov/marc/organizations/org-search.php?countryID=185&submit=Search. You could report it in P791. Does a national Spanish code for libraries exist? It could also be added to your items. Sincerely.  – The preceding unsigned comment was added by Bargioni (talk • contribs).
@Vanbasten 23: Very good, I hoped there was a list of ISIL codes but I didn't know where :) Great! --Epìdosis 08:31, 2 April 2020 (UTC)[reply]
Yes, I have an ID; this is actually the ID assigned by the it:Instituto Nacional de Estadística (Spagna) to identify libraries. I was thinking of asking for a new property... --Vanbasten 23 (talk) 14:28, 2 April 2020 (UTC)[reply]
About the ISIL code, perfect, and thanks. I will add it ;) --Vanbasten 23 (talk) 14:32, 2 April 2020 (UTC)[reply]

Author IDs should be added to author data items, not to work data items. --EncycloPetey (talk) 14:28, 21 April 2020 (UTC)[reply]

@EncycloPetey: ✓ Done, just created Pseudo-Anacreon (Q91332057). Thanks, --Epìdosis 14:50, 21 April 2020 (UTC)[reply]
@EncycloPetey, Epìdosis: Thx. Incoming Perseus author ID (P7041) values could reflect errors from the Perseus catalog... A total of 813 links to Wikidata are contained in its records. -- Bargioni 🗣 15:32, 21 April 2020 (UTC)[reply]

Creation of duplicates[edit]

Hi, your QuickStatements seem to create duplicates:

-- Discostu (talk) 12:00, 4 May 2020 (UTC)[reply]

@Discostu: Thx a lot. I'll check it. -- Bargioni 🗣 13:41, 4 May 2020 (UTC)[reply]
This was reported at Topic:Vkhd578n4cv9ndew. Luckily the duplicates are easy to find and merge.--GZWDer (talk) 13:46, 4 May 2020 (UTC)[reply]
@GZWDer: My QS commands do not contain more than one CREATE for items that were duplicated, like Krzysztof Lala (Q93244134)... What's wrong? -- Bargioni 🗣 13:59, 4 May 2020 (UTC)[reply]
The backend QuickStatements runs tasks in several threads, but there is no lock mechanism, so multiple threads performed the same task. You can use the frontend QuickStatements, which has no such problem. (This problem seems to happen on item creation only.)--GZWDer (talk) 14:04, 4 May 2020 (UTC)[reply]
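GZWDer's point, that several backend workers can each run the same CREATE because nothing guards the check-then-create step, can be illustrated with a small Python sketch; this is purely illustrative, not the actual QuickStatements code:

```python
import threading

# Simulate several batch workers all processing the same CREATE command.
# Without a lock, two threads can both see "item missing" and both create
# it; making the check-then-create atomic yields exactly one creation.
created = []
lock = threading.Lock()

def create_item_once(label):
    with lock:                    # the missing piece in the backend runner
        if label not in created:  # check ...
            created.append(label) # ... then create, atomically

threads = [threading.Thread(target=create_item_once, args=("Q-new",))
           for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(created))  # 1 — exactly one item created
```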
Thx for your explanation. Happy to know that the duplicates are independent of my QS commands :-) -- Bargioni 🗣 14:35, 4 May 2020 (UTC)[reply]
Also see https://phabricator.wikimedia.org/T234162 --M2k~dewiki (talk) 18:57, 24 May 2020 (UTC)[reply]

Check[edit]

I'm not entirely sure about the merge of Ángel González Muñiz (Q715118) and Ángel González Muñiz (Q80325986). Can you investigate? Thanks a lot, --Epìdosis 15:27, 6 May 2020 (UTC)[reply]

@Epìdosis, Gentile64: Forwarding the request. -- Bargioni 🗣 07:06, 7 May 2020 (UTC).[reply]
It's the same person. If you click on the ISNI you'll see that two VIAFs are present. Comparing them shows that the author is the same. Unfortunately, two or more VIAF clusters very often exist for the same author. Regards  – The preceding unsigned comment was added by Gentile64 (talk • contribs).
@Gentile64: ✓ Done, merged. --Epìdosis 08:15, 7 May 2020 (UTC) P.S. To sign your messages, use the dedicated button, as explained here :)[reply]

Quickstatements: 1589290271422[edit]

Questions from your QS load batch: https://tools.wmflabs.org/editgroups/b/QSv2T/1589290271422/

I was wondering whether the batch above included all the properties shown on screen in one single action to create 4946 new entities, and whether you could share a sample of a few entities from the set, to see how the input file is prepared. Thank you for your help.

jshieh (talk) 19:30, 14 May 2020 (UTC)[reply]

@ShiehJ, Epìdosis: With the strong collaboration of Epìdosis, we planned to create items from unmatched entries of the MnM FAST catalog https://tools.wmflabs.org/mix-n-match/#/catalog/150. We then filtered entries with a VIAF ID and well-formed dates, and enriched them by accessing both FAST and VIAF records using http://fast.oclc.org/fast/$fast_id/marc21.xml and http://viaf.org/viaf/$viafid/viaf.json. We used a Perl script whose output is the CREATE/LAST command sequence for QuickStatements (including references). We plan to add about 59,000 items. Here is an example of QS commands. Please let me know if you are interested in more info. -- Bargioni 🗣 07:31, 15 May 2020 (UTC)[reply]
CREATE
LAST    Len     "Alexander Murray"
LAST    P31     Q5
LAST    P569    +1727-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P570    +1793-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P214    "51558010"      S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P244    "nr92041797"    S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P7859   "lccn-nr92041797"       S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "51558010"
LAST    P2163   "1508922"
CREATE
LAST    Len     "Joseph Dawson Murray"
LAST    P31     Q5
LAST    P569    +1785-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P570    +1852-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P214    "29401466"      S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P244    "nr92041800"    S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P7859   "lccn-nr92041800"       S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "29401466"
LAST    P2163   "1508924"
CREATE
LAST    Len     "John Barton Derby"
LAST    P31     Q5
LAST    P569    +1792-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P570    +1867-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P214    "12174577"      S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P244    "nr92040978"    S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P213    "0000 0000 4805 0059"   S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "12174577"
LAST    P7859   "lccn-nr92040978"       S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "12174577"
LAST    P2163   "1508928"
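For readers trying to reproduce this kind of batch: the tab-separated rows above can be emitted by a short script. The original pipeline was a Perl script; the following is a minimal Python sketch of the same idea, with an illustrative input record (the dict field names are assumptions, not the real pipeline's):

```python
# Emit QuickStatements CREATE/LAST rows (tab-separated, as in the batch
# above) for one FAST entry. The shared reference states: stated in (S248)
# FAST (Q3294867), retrieved (S813), FAST ID (S2163).
REF = 'S248\tQ3294867\tS813\t+2020-05-07T00:00:00Z/11\tS2163\t"{fast}"'

def qs_rows(rec):
    ref = REF.format(fast=rec["fast_id"])
    rows = [
        "CREATE",
        'LAST\tLen\t"{}"'.format(rec["label_en"]),
        "LAST\tP31\tQ5",  # instance of: human
        "LAST\tP569\t+{}-00-00T00:00:00Z/9\t{}".format(rec["birth_year"], ref),
        "LAST\tP570\t+{}-00-00T00:00:00Z/9\t{}".format(rec["death_year"], ref),
        'LAST\tP214\t"{}"\t{}'.format(rec["viaf_id"], ref),
        'LAST\tP2163\t"{}"'.format(rec["fast_id"]),
    ]
    return "\n".join(rows)

print(qs_rows({
    "fast_id": "1508922", "label_en": "Alexander Murray",
    "birth_year": "1727", "death_year": "1793", "viaf_id": "51558010",
}))
```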
@Bargioni, Epìdosis: This is a much clearer example than what the Help:QuickStatements doc provides. It would be ideal if your examples were added to the Help doc for novices like me! THANK YOU!

What you described is precisely what we will try to accomplish: uploading ca. 43K artist names not found in Wikidata, grabbing content from the VIAF JSON file to upload. I am experimenting with extracting the necessary data from the VIAF JSON URL and parsing it into columns via OpenRefine. However, the results are rather disappointing at the moment, likely due to my lack of proficiency in OpenRefine and JSON. The VIAF MARCXML format was not considered, since the workflow is to be conducted by colleagues using a browser. Otherwise, combining the MARC::Perl module with MARCXML to extract data to feed into QuickStatements would be simpler.

This artists set will be over 60K names. Afterwards, the following sets will be over 600K names (person, corporate and family). The method will first match against Wikidata to ensure that the entry to be added is indeed a new item. The method we are devising at this time is browser-based, in the hope that it will have broader appeal to colleagues who are catalogers and reference librarians, engaging them in Wikidata activities. — jshieh (talk)
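As a starting point for the JSON extraction jshieh describes, a few lines of Python can pull the needed columns out of a viaf.json-style record. The field names below ("viafID", "mainHeadings", "birthDate", "deathDate") match what viaf.json commonly exposes, but treat the exact structure as an assumption and verify against a live record:

```python
import json

# Sketch of extracting a few columns from a viaf.json-style record.
# The sample mimics the record layout; check a live record before relying
# on the exact structure.
sample = json.loads("""
{
  "viafID": "51558010",
  "birthDate": "1727",
  "deathDate": "1793",
  "mainHeadings": {"data": [{"text": "Murray, Alexander"}]}
}
""")

def extract(rec):
    heads = rec.get("mainHeadings", {}).get("data", [])
    if isinstance(heads, dict):  # XML-derived JSON may collapse 1-element lists
        heads = [heads]
    return {
        "viaf": rec.get("viafID"),
        "name": heads[0]["text"] if heads else None,
        "birth": rec.get("birthDate"),
        "death": rec.get("deathDate"),
    }

print(extract(sample))
```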

Query for duplicates with ru.wiki[edit]

# ?item1 is the imported one
# ?item2 is human and sitelink to ruwiki
# ?item1 and ?item2 are born the same date
SELECT ?item1 ?label_en ?birthyear ?deathyear ?item2 ?label_ru
WITH
{
  SELECT ?item1 ?FAST_ID ?label_en
  WHERE
  {
    VALUES ?item1 { wd:??? } .
    #?item1 wdt:P31 wd:Q5.
    #?item1 p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] .
    ?item1 rdfs:label ?label_en.
    FILTER (LANG(?label_en) = "en")
  }
  #LIMIT 20
} AS %get_humans_with_FAST_ID
WHERE
{
  INCLUDE %get_humans_with_FAST_ID
  ?item1 wdt:P569 ?birth.
  ?item2 wdt:P569 ?birth.
  BIND(str(YEAR(?birth)) AS ?birthyear)
  ?item1 wdt:P570 ?death.
  ?item2 wdt:P570 ?death.
  BIND(str(YEAR(?death)) AS ?deathyear)
  FILTER (?item1 != ?item2)
  ?item2 wdt:P31 wd:Q5.
  ?ruwiki_sitelink schema:about ?item2 .
  ?ruwiki_sitelink schema:isPartOf <https://ru.wikipedia.org/>.
  { ?item2 wdt:P27 wd:Q159. } UNION { ?item2 wdt:P27 wd:Q15180 . } UNION { ?item2 wdt:P27 wd:Q34266 . }
  ?item2 rdfs:label ?label_ru.
  FILTER (LANG(?label_ru) = "ru")
}
ORDER BY ?label_ru
Try it!

Leaving this here as a reminder :) --Epìdosis 10:33, 15 May 2020 (UTC)[reply]

This one finds the items with "vich" in the label:

SELECT ?item ?en_label
WHERE {
  ?item p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] ;
        rdfs:label ?en_label .
  FILTER(LANG(?en_label) = "en") .
  FILTER(CONTAINS(?en_label,"vich"))
}
ORDER BY ?en_label
Try it!

--Epìdosis 10:44, 15 May 2020 (UTC)[reply]

Final queries[edit]

It follows that

# ?item1 is the imported one
# ?item2 is human and sitelink to ruwiki
# ?item1 and ?item2 are born the same date
SELECT DISTINCT ?item1 ?label_en ?birthyear ?deathyear ?item2 ?label_ru
WITH
{
  SELECT ?item1 ?FAST_ID ?label_en
  WHERE
  {
    #VALUES ?item1 { wd:??? } .
    ?item1 p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] .
    ?item1 rdfs:label ?label_en .
    FILTER(LANG(?label_en) = "en")
    FILTER(CONTAINS(?label_en,"vich"))
  }
  #LIMIT 20
} AS %get_humans_with_FAST_ID
WHERE
{
  INCLUDE %get_humans_with_FAST_ID
  ?item1 wdt:P569 ?birth.
  ?item2 wdt:P569 ?birth.
  BIND(str(YEAR(?birth)) AS ?birthyear)
  ?item1 wdt:P570 ?death.
  ?item2 wdt:P570 ?death.
  BIND(str(YEAR(?death)) AS ?deathyear)
  FILTER (?item1 != ?item2)
  ?item2 wdt:P31 wd:Q5.
  ?ruwiki_sitelink schema:about ?item2 .
  ?ruwiki_sitelink schema:isPartOf <https://ru.wikipedia.org/>.
  { ?item2 wdt:P27 wd:Q159. } UNION { ?item2 wdt:P27 wd:Q15180 . } UNION { ?item2 wdt:P27 wd:Q34266 . }
  ?item2 rdfs:label ?label_ru.
  FILTER (LANG(?label_ru) = "ru")
}
ORDER BY ?label_en ?label_ru
Try it!

is the perfect solution. It works very well and catches all the items that have both a birth date and a death date; for those with only a birth date, there follows

# ?item1 is the imported one
# ?item2 is human and sitelink to ruwiki
# ?item1 and ?item2 are born the same date
SELECT DISTINCT ?item1 ?label_en ?birthyear ?deathyear ?item2 ?label_ru
WITH
{
  SELECT ?item1 ?FAST_ID ?label_en
  WHERE
  {
    #VALUES ?item1 { wd:??? } .
    ?item1 p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] .
    ?item1 rdfs:label ?label_en .
    FILTER(LANG(?label_en) = "en")
    FILTER(CONTAINS(?label_en,"vich"))
  }
  #LIMIT 20
} AS %get_humans_with_FAST_ID
WHERE
{
  INCLUDE %get_humans_with_FAST_ID
  ?item1 wdt:P569 ?birth.
  ?item2 wdt:P569 ?birth.
  BIND(str(YEAR(?birth)) AS ?birthyear)
  ?item1 wdt:P570 ?death.
  ?item2 wdt:P570 ?death.
  BIND(str(YEAR(?death)) AS ?deathyear)
  FILTER (?item1 != ?item2)
  ?item2 wdt:P31 wd:Q5.
  ?ruwiki_sitelink schema:about ?item2 .
  ?ruwiki_sitelink schema:isPartOf <https://ru.wikipedia.org/>.
  { ?item2 wdt:P27 wd:Q159. } UNION { ?item2 wdt:P27 wd:Q15180 . } UNION { ?item2 wdt:P27 wd:Q34266 . }
  ?item2 rdfs:label ?label_ru.
  FILTER (LANG(?label_ru) = "ru")
}
ORDER BY ?label_en ?label_ru
Try it!

, which obviously has many more results, and I don't think it can be filtered any further. Excellent! --Epìdosis 10:52, 15 May 2020 (UTC)[reply]

Same dates[edit]

came up on Wikidata:Database_reports/identical_birth_and_death_dates/1. You created the latter recently. There is also Quentin Debray (Q3414170). There seems to be some mixup. --- Jura 09:23, 20 May 2020 (UTC)[reply]

GND saturation cleanup[edit]

Hi Bargioni,

Would you comment at Property_talk:P227#GND_saturation_of_Wikidata? While I trust Epìdosis, it would obviously be better if that came from you. --- Jura 10:27, 25 May 2020 (UTC)[reply]

Map of Rome[edit]

#Libraries, museums and churches in Rome
#defaultView:Map
SELECT ?luogo ?luogoLabel ?coordinate
WHERE {
  ?luogo wdt:P131 wd:Q220 .
  OPTIONAL { ?luogo wdt:P625 ?coordinate . }
  { ?luogo wdt:P31/wdt:P279* wd:Q7075 . }
  UNION
  { ?luogo wdt:P31/wdt:P279* wd:Q33506 . }
  UNION
  { ?luogo wdt:P31 wd:Q16970 . }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "it,en". }
}
Try it!

--Epìdosis 10:18, 3 June 2020 (UTC)[reply]

With different colours, thanks to @Dipsacus fullonum:
#Libraries, museums and churches in Rome
#defaultView:Map
SELECT DISTINCT ?luogo ?luogoLabel ?coordinate ?layer
WHERE {
   BIND(wd:Q7075 AS ?biblioteca).
   BIND(wd:Q33506 AS ?museo).
   BIND(wd:Q16970 AS ?chiesa).
  ?luogo wdt:P131 wd:Q220 .
  OPTIONAL { ?luogo wdt:P625 ?coordinate . }
  { ?luogo wdt:P31/wdt:P279* ?biblioteca . BIND(1 AS ?layer) }
  UNION
  { ?luogo wdt:P31/wdt:P279* ?museo . BIND(2 AS ?layer) }
  UNION
  { ?luogo wdt:P31/wdt:P279* ?chiesa . BIND(3 AS ?layer) }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "it,en". }
}
Try it!
--Epìdosis 10:13, 4 June 2020 (UTC)[reply]
Simplified version, also by @Dipsacus fullonum:
#Libraries, museums and churches in Rome
#defaultView:Map
SELECT DISTINCT ?luogo ?luogoLabel ?coordinate ?layer
WHERE {
  VALUES ?layer { wd:Q7075 wd:Q33506 wd:Q16970 } # library, museum and church
  ?luogo wdt:P131 wd:Q220 .
  OPTIONAL { ?luogo wdt:P625 ?coordinate . }
  ?luogo wdt:P31/wdt:P279* ?layer.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "it,en". }
}
Try it!
--Epìdosis 10:25, 4 June 2020 (UTC)[reply]

Fixing dates supposedly imported from VIAF[edit]

In the following query, and in other similar cases,

SELECT ?p ?db
WHERE
{
  SERVICE wikibase:mwapi
  {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "exturlusage" .
    bd:serviceParam mwapi:geuprop "title" .
    bd:serviceParam mwapi:geunamespace "0" .
    bd:serviceParam mwapi:geuprotocol "https" .
    bd:serviceParam mwapi:geuquery "viaf.org/viaf/" .
    bd:serviceParam mwapi:geulimit "max" .
    ?p wikibase:apiOutputItem mwapi:title .
  }
  hint:Prior hint:runFirst "true".
  
  ?p p:P569 [ps:P569 ?db ; prov:wasDerivedFrom [pr:P854 ?site] ].
  FILTER("1950-00-00"^^xsd:dateTime = ?db)
  FILTER(CONTAINS(STR(?site),"viaf.org/viaf/"))
}
Try it!

there are things to fix. I'll explain in more detail tomorrow. Good night, --Epìdosis 21:25, 5 June 2020 (UTC)[reply]

That would be great. It's the same as Wikidata:Bot_requests#Cleanup_VIAF_dates. --- Jura 21:35, 5 June 2020 (UTC)[reply]
@Jura1: Wow, so it's a long-standing problem; I think I only noticed it today. How about removing them all, in your opinion? --Epìdosis 21:46, 5 June 2020 (UTC)[reply]
Given how long it's been there, I'm glad of any approach that solves it.
I suppose re-reading the source to check "ns1:dateType" (see explanation here) for all dates would be too much work.
An intermediate solution could be to change the property of these statements to floruit (P1317) (if no other reference is given). --- Jura 08:47, 6 June 2020 (UTC)[reply]

A more complete list of cases (note: it includes references beyond VIAF URLs, so it certainly has some false positives):

SELECT ?p ?db
WHERE
{
  SERVICE wikibase:mwapi
  {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "exturlusage" .
    bd:serviceParam mwapi:geuprop "title" .
    bd:serviceParam mwapi:geunamespace "0" .
    bd:serviceParam mwapi:geuprotocol "https" .
    bd:serviceParam mwapi:geuquery "viaf.org/viaf/" .
    bd:serviceParam mwapi:geulimit "max" .
    ?p wikibase:apiOutputItem mwapi:title .
  }
  hint:Prior hint:runFirst "true".
  
  ?p p:P569 [psv:P569 ?dbv ; prov:wasDerivedFrom [pr:P854 ?site] ].
  FILTER CONTAINS(STR(?site),"viaf.org/viaf/")
  ?dbv wikibase:timeValue ?db; wikibase:timePrecision ?precision.
  BIND (YEAR(?db) AS ?year)
  FILTER(?precision = 9)
  FILTER IF(?year > 0,
            ?year - FLOOR(?year / 100) * 100 = 50, # year is AD
            ?year - FLOOR(?year / 100) * 100 = 51) # year is BC, 1 BC is encoded as "0", 2 BC as "-1" etc.
}
Try it!

--Epìdosis 16:17, 8 June 2020 (UTC)[reply]
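The year arithmetic in the filter above can be double-checked with a short sketch; it relies on Wikidata's astronomical year numbering, where 1 BC is stored as year 0, so 50 BC is stored as -49:

```python
# The SPARQL filter flags year-precision dates whose year ends in 50 (AD),
# or whose stored value maps to a year ending in 50 BC. Python's % floors
# for negative numbers just like the FLOOR-based SPARQL arithmetic.
def is_suspect_viaf_year(year):
    # AD years: remainder 50; BC years (stored as year - 1, negated): remainder 51
    return year % 100 == (50 if year > 0 else 51)

print([y for y in (1950, 1850, 1949, -49, -50) if is_suspect_viaf_year(y)])
# [1950, 1850, -49]   (-49 is 50 BC)
```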

NSZL VIAF without a real NSZL[edit]

SELECT ?item
WHERE {
  ?item wdt:P951 ?nszlviaf
  MINUS { ?item wdt:P3133 ?nszl . }
}
Try it!

--Epìdosis 17:39, 8 June 2020 (UTC)[reply]

Wrong VIAF[edit]

The VIAF-ID added here described a completely different person. I added the correct one, but it is often hard to find mistakes like this. --Christian140 (talk) 18:51, 8 June 2020 (UTC)[reply]

@Christian140: Thx. This item was updated by a huge import I made 8 months ago. Some VIAF errors were imported too, of course, and this will allow the WD community to help VIAF correct them. So please add this error to Wikidata:VIAF/cluster/conflating_entities. --  Bargioni 🗣 11:37, 9 June 2020 (UTC)[reply]

Cleanup[edit]

Hi Bargioni,

Epìdosis gave an estimate of two weeks for the cleanup. Can you give us an update at W:AN? What's your view on the create-if-needed vs. check/complete approach? --- Jura 08:46, 29 June 2020 (UTC)[reply]

Wrong VIAF[edit]

Hi,

the VIAF added here is the wrong instance: it describes an administrative district and not the river Toss. --Hannes Röst (talk) 14:15, 29 June 2020 (UTC)[reply]

@Hannes Röst: Hi, this item was updated by a huge import I made in Nov 2018 2019. Some VIAF errors were imported too, of course. And this will allow the WD community to help VIAF to correct them. So please add this error (and more, if any) to Wikidata:VIAF/cluster/conflating_entities. -- Bargioni 🗣 16:13, 29 June 2020 (UTC)[reply]
I think November 2019, time passes but not so quickly ;-) --Epìdosis 19:10, 29 June 2020 (UTC)[reply]
Thx, @Epìdosis:, a typo! I corrected it. -- Bargioni 🗣 20:38, 29 June 2020 (UTC)[reply]

Procurators not of Augustus[edit]

Take a look:

SELECT DISTINCT ?item
WHERE {
  ?item p:P106 ?statement .
  ?statement ps:P106 wd:Q499165 .
  ?statement prov:wasDerivedFrom ?reference .
  ?reference pr:P227 ?gnd .
}
Try it!

It seems that in 76 items we imported from GND occupation (P106)Procurator (Q499165) instead of occupation (P106)prosecutor (Q600751) ... can you do the fix with QS, removing the wrong statement and re-adding the correct one with the reference intact? Thanks a lot, --Epìdosis 16:00, 13 July 2020 (UTC)[reply]
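A sketch of what the requested fix could look like in QuickStatements: a removal line (leading minus) followed by a re-addition carrying the GND reference (stated in S248 = Integrated Authority File Q36578, GND ID S227, retrieved S813). The item ID, GND ID and date below are placeholders; the actual reference values should be copied from the statements being fixed:

```
-Q99999999	P106	Q499165
Q99999999	P106	Q600751	S248	Q36578	S227	"123456789"	S813	+2020-07-13T00:00:00Z/11
```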

Another error: you can find it by putting women in music (Q25095122) as the value of occupation (P106) in the previous query. All these occurrences must be replaced, by hand or via QS, with composer (Q36834). Thanks a lot and good night, --Epìdosis 23:28, 18 July 2020 (UTC)[reply]
@Epìdosis: I doubt the GND reference can be preserved: women in music (Q25095122) is very generic compared to composer (Q36834). -- Bargioni 🗣 09:03, 19 July 2020 (UTC)[reply]
In my opinion it can be kept instead: http://d-nb.info/gnd/4032010-8 clearly says "Komponistin". --Epìdosis 09:07, 19 July 2020 (UTC)[reply]

Addition of asteroid to Isabelle de Charrière[edit]

Hello Bargioni, would you add the naming of the asteroid 9604 Bellevanzuylen (Q268097) to Isabelle de Charrière (Q123386) at Wikidata? That would be appreciated. Boss-well63 (talk) 20:08, 20 July 2020 (UTC)[reply]

@Boss-well63: 9604 Bellevanzuylen (Q268097) has named after (P138) that links to Isabelle de Charrière (Q123386). In my opinion, it is not possible to do the opposite. -- Bargioni 🗣 21:41, 20 July 2020 (UTC)[reply]
@Boss-well63: Activating "relateditems" in your Preferences > Gadgets you can visualize in Isabelle de Charrière (Q123386) the inverse statement. --Epìdosis 21:51, 20 July 2020 (UTC)[reply]

Mass deleting VIAF IDs[edit]

Hi, has there been a discussion about deleting abandoned VIAF clusters? I think deprecating would be a far better solution than deletion. – Máté (talk) 08:15, 23 July 2020 (UTC)[reply]

@Máté: Hi! This discussion, although with little participation, reached consensus for the removal (mainly through KrBot, but obviously also manually) of redirected and deprecated VIAF IDs; a more general discussion, Wikidata:Requests for comment/Handling of stored IDs after they've been deleted or redirected in the external database, is ongoing and currently stuck, so I think the result of the previous discussion is still valid. --Epìdosis 09:16, 23 July 2020 (UTC)[reply]
Hello @Epìdosis, Máté:

if the deleted VIAF IDs still exist in other projects (German-language Wikipedia, Commons, ...), they might be re-imported by HarvestTools, for example. Also see

The German community has been informed at

I prefer to keep abandoned VIAF IDs and set them to deprecated rank with Wikibase reason for deprecated rank (Q27949697) = withdrawn identifier value (Q21441764). This avoids re-importing from Wikipedia projects or third-party databases. Raymond (talk) 13:55, 23 July 2020 (UTC)[reply]

Also, in these cases the IDs do lead to the old clusters with all the information still present, albeit with a warning that the cluster has been deleted. I don't really see the advantages of deleting over deprecating. – Máté (talk) 14:12, 23 July 2020 (UTC)[reply]

In some cases it is helpful to keep a deprecated VIAF ID. But VIAF has no stable IDs; it has clusters that keep changing. VIAF clusters merge multiple authority files of persons or organisations with the same name: sometimes they are identical, often not. So keeping all deprecated VIAF IDs only creates chaos. --Kolja21 (talk) 15:28, 23 July 2020 (UTC)[reply]

Barnstar[edit]

The Wikidata Barnstar
Thank you for your work on the VIAF identifiers--Alexmar983 (talk) 02:57, 1 August 2020 (UTC)[reply]

It seemed well deserved.--Alexmar983 (talk) 02:57, 1 August 2020 (UTC)[reply]

@Alexmar983: Thx a lot, grazie mille...! -- Bargioni 🗣 09:17, 1 August 2020 (UTC)[reply]

VIAF error[edit]

Another quite serious VIAF mistake. --Hannes Röst (talk) 20:52, 11 August 2020 (UTC)[reply]

How to improve this bot?[edit]

Hello. At least for Koreans, this bot makes lots of mistakes, adding wrong VIAF IDs to Wikidata items. A reason could be that in Korea many people have the same name. However, if you take the name, year of birth and description from the Korean National Library, you can identify a person quite well. And from the library record, you get a link to VIAF, e.g. here: http://www.nl.go.kr/authorities/KAC201807861 --Christian140 (talk) 07:25, 18 August 2020 (UTC)[reply]

@Christian140: Hi, I presume you are talking about the huge batch I performed in Nov 2019. No bot is running, so fortunately we do not have a recurring error.
Anyway, thx for detecting this problem, and for the hint on how to solve it. I'll try to improve the items related to Koreans. My goal is to detect conflations or other errors generated by VIAF that were unfortunately propagated to Wikidata by my batch (and not only by mine...).
I'm also pinging @Epìdosis:, who usually helps me study this kind of problem. -- Bargioni 🗣 09:31, 18 August 2020 (UTC)[reply]
@Christian140: Hi, in November 2019 a big synchronisation with VIAF was performed: in every case where a VIAF cluster X linked to a Wikidata item Y, a link to the VIAF cluster X was added to the Wikidata item Y; unfortunately, there was a significant percentage of mistakes, all due to VIAF confusions. If you find mistakes, please remove the wrong VIAF cluster and, if the cluster conflates different subjects, please add a report to Wikidata:VIAF/cluster/conflating entities. If you have any doubt, obviously ask us. Bye, --Epìdosis 09:48, 18 August 2020 (UTC)[reply]

Didier Proton in the Santa Croce catalogue[edit]

Hello,

in all likelihood (see this discussion in French), Q3026887 and Q2366553 are the same person. However, there are two articles on Wikipédia, two items here... and also two identifiers in the Santa Croce catalogue: 113707 on the one hand, and 64560 on the other. Perhaps something can be done?

Best regards, Nomen ad hoc (talk) 09:31, 30 November 2020 (UTC).[reply]

@Nomen ad hoc: Thanks. According to me and @Gentile64: (who did a lot of research), these are two different people, although we're not 100% sure. One would be a priest named Didier Marie Proton, born in 1941; the other would be a writer named Didier Edmond Proton, born in 1942. Several libraries also distinguish them by date, but IDREF and BNF distinguish them while dating both 1941. We would not merge them. -- Bargioni 🗣 15:58, 30 November 2020 (UTC)[reply]
Thanks for the prompt reply. However, let me point out that in the end the two WP articles and the two WD items were merged (by Nouill). Nomen ad hoc (talk) 09:52, 3 December 2020 (UTC).[reply]
@Gentile64, Nomen ad hoc: OK, thanks for the news. But it would be useful to know the rationale. -- Bargioni 🗣 16:32, 3 December 2020 (UTC)[reply]
No doubt following the aforementioned discussion; but perhaps it's better to let Nouill confirm it. Nomen ad hoc (talk) 16:59, 3 December 2020 (UTC).[reply]

Hello. I'm back from the archives of the French Church where, for your information, I was able to verify that there is only one Catholic priest named Didier Proton, ordained in 1976 and incardinated in the diocese of Fréjus-Toulon. Best wishes, Nomen ad hoc (talk) 17:16, 11 December 2020 (UTC).[reply]

moreIdentifiers[edit]

Hi! A short message to let you know that I have recently been using the moreIdentifiers script and I absolutely love it. The time saving is massive, and it's also really helpful to see directly from the Qid that some identifiers are missing. Cheers, --Jahl de Vautban (talk) 15:46, 6 January 2021 (UTC)[reply]

@Jahl de Vautban: Thx a lot! moreIdentifiers was a great idea of @Epìdosis:. He helped me write and test it. -- Bargioni 🗣 16:19, 6 January 2021 (UTC)[reply]

"stated in" in references with external identifiers[edit]

Hi! Your UseAsRef.js tool looks very nice. But is it really necessary to add stated in (P248) alongside an external ID, if the external-ID property has an applicable 'stated in' value (P9073) statement? I'd argue it's actually a bit unhelpful: if we want to revise what the "stated in" item should be for a particular external ID (e.g. revising it from a particular organisation's website to a newly created item specifically for the database in question), it's better if we only have to make that change in one place, by updating the applicable 'stated in' value (P9073) statement, rather than having to adjust thousands of references throughout Wikidata. Would you agree? Jheald (talk) 22:57, 13 February 2021 (UTC)[reply]

Hi @Jheald:! In my opinion the use of applicable 'stated in' value (P9073) is not a problem: I think only a minority of applicable 'stated in' value (P9073) values could be made more precise, while in the great majority of cases they can probably be considered stable. Anyway, making mass substitutions is not difficult through bots (see e.g. this case), and the use of stated in (P248) is clearly prescribed in the guideline Help:Sources. I would also note that references added through this gadget (at the moment used by 22 users) will be a much smaller number than the references added by Reinheitsgebot every day. Maybe we could just organize ourselves in these days to massively check applicable 'stated in' value (P9073) values and reflect collectively on the possibility of improving single cases, before a higher number of references gets added. But, according to the guideline Help:Sources, I don't think the gadget should be changed. --Epìdosis 21:05, 14 February 2021 (UTC)[reply]
Thx to you too, @Jheald, Epìdosis:. Of course, the gadget could have settings or preferences, or could even be forked and modified. But my opinion is the same as Epìdosis's, since in writing it we tried to follow Help:Sources. -- Bargioni 🗣 21:44, 14 February 2021 (UTC)[reply]

Hello! Congratulations to you and @Epìdosis: for creating this tool! Besides thanking you, I wanted to suggest that it be extended beyond external identifiers to also include the properties described by source (P1343) and described at URL (P973), which are often useful as sources for the data entered. Grateful. --Luan (talk) 16:10, 19 February 2021 (UTC)[reply]

@Luan: Good idea. We will be working on it over the next two weeks. --Epìdosis 11:53, 21 February 2021 (UTC)[reply]
@Epìdosis: is there any news on that extension? This would make your already invaluable tool (thank you!) even more helpful for my daily work! --Emu (talk) 08:09, 24 April 2021 (UTC)[reply]
@Emu: I perfectly agree. Unfortunately it has proven to be a bit more difficult than we expected, but in May we will probably succeed in the improvement. Of course we will keep working on it! Bye, --Epìdosis 13:01, 24 April 2021 (UTC)[reply]

@Luan, Emu: Although the task of extending UseAsRef has proven to be much more difficult than expected (and other concurrent tasks took us a lot of time), after some months of "planning" I can say we are somewhat nearer to the goal; we have established, from a theoretical point of view, the characteristics of this extension, and we plan to start writing it down in the next weeks. Hopefully (although I know my two previous forecasts went wrong) it could be ready before the 9th birthday of Wikidata. Be confident, we are working on it! See you soon, --Epìdosis 16:49, 10 September 2021 (UTC)[reply]

@Luan, Emu: We finally managed to create UseAsRef 2.0. Great work by Bargioni in the last two weeks! You can find the documentation in the usual page User:Bargioni/UseAsRef. Report any bugs here. Enjoy! --Epìdosis 08:59, 22 October 2021 (UTC)[reply]

@Epìdosis: That’s great news, thank you so much! I wanted to try it on Hans Liebhardt-Kreutzinger (Q109061888) but it wouldn’t let me (“UseAsRef - Error modification-failed”) – do you think it’s a caching problem on my end? --Emu (talk) 09:17, 22 October 2021 (UTC)[reply]
@Emu: You found a problem with P1343. We are trying to solve it. --Epìdosis 09:39, 22 October 2021 (UTC)[reply]
@Emu: It should be solved now. --Epìdosis 09:52, 22 October 2021 (UTC)[reply]
@Epìdosis: Perfect, thank you! A very small suggestion for future development: it would be great if page(s) (P304), volume (P478) and section, verse, paragraph, or clause (P958) could also be used, but I don’t know how hard that would actually be programming-wise. --Emu (talk) 14:13, 22 October 2021 (UTC)[reply]
@Emu: Tried it out in User:Epìdosis/UseAsRef try.js; it seems to work. @Bargioni: you can check and, if appropriate, copy it into yours. --Epìdosis 15:47, 22 October 2021 (UTC)[reply]
@Emu: User:Bargioni/UseAsRef.js now includes my addition of the properties you requested; I've updated the documentation in User:Bargioni/UseAsRef. Good night, --Epìdosis 21:56, 22 October 2021 (UTC)[reply]
Bargioni and @Epìdosis: congratulations and thank you very much!! Regards, --Luan (talk) 02:34, 24 October 2021 (UTC)[reply]

[WMF Board of Trustees - Call for feedback: Community Board seats] Meetings with the Wikidata community[edit]

The Wikimedia Foundation Board of Trustees is organizing a call for feedback about community selection processes between February 1 and March 14. While the Wikimedia Foundation and the movement have grown about five times in the past ten years, the Board’s structure and processes have remained basically the same. As the Board is designed today, we have a problem of capacity, performance, and lack of representation of the movement’s diversity. Our current processes to select individual volunteer and affiliate seats have some limitations. Direct elections tend to favor candidates from the leading language communities, regardless of how relevant their skills and experience might be in serving as a Board member, or contributing to the ability of the Board to perform its specific responsibilities. It is also a fact that the current processes have favored volunteers from North America and Western Europe. In the upcoming months, we need to renew three community seats and appoint three more community members in the new seats. This call for feedback is to see what processes we can all collaboratively design to promote and choose candidates that represent our movement and are prepared, with the experience, skills, and insight, to perform as trustees.

In this regard, two rounds of feedback meetings are being hosted to collect feedback from the Wikidata community. Both rounds have the same agenda, to accommodate people from various time zones across the globe. We will be discussing ideas proposed by the Board and the community to address the above-mentioned problems. Please sign up for whichever is most convenient for you. You are welcome to participate in both as well!

Also, please share this with other volunteers who might be interested in this. Let me know if you have any questions. KCVelaga (WMF), 14:32, 21 February 2021 (UTC)[reply]

UseAsRef[edit]

I might be missing something, but I don't see any "copy this as ref" icon. Trade (talk) 11:48, 22 February 2021 (UTC)[reply]

@Trade: I assume you added UseAsRef.js to your common.js... If so, please add more info: the item you are trying to open, and any errors in the browser console. Thx. -- Bargioni 🗣 13:03, 22 February 2021 (UTC)[reply]
User:Trade/common.js, did I place it correctly? --Trade (talk) 13:28, 22 February 2021 (UTC)[reply]
@Trade: In your common.js, copy (selected lines of) my User:Bargioni/common.js, and use it only as a list of importScript calls. Do not copy entire scripts written by others into your common.js. HTH. -- Bargioni 🗣 16:24, 22 February 2021 (UTC)[reply]
Better? --Trade (talk) 00:27, 23 February 2021 (UTC)[reply]
@Trade: Maybe you can try moving this script to the first line of your common.js and closing the line "importScript("User:EpochFail/ArticleQuality.js")" with ";". I hope it will work! --Epìdosis 08:55, 23 February 2021 (UTC)[reply]
@Trade: In my opinion, your common.js could be simplified to 4 lines:
importScript( 'User:Melderick/moveClaims2.js' );
importScript( 'User:Tohaomg/rearrange values.js' );
importScript( 'User:EpochFail/ArticleQuality.js' );
importScript( 'User:Bargioni/UseAsRef.js' );

Append your comments to lines using "// test", if you like. Bye. -- Bargioni 🗣 16:42, 23 February 2021 (UTC) It works now, thanks! I was thinking, would it be possible for you to create a script that can generate references from items? For example, using Notorious Gangsters and No Games: The Gory Story of the Gizmondo (Q61996727) to generate a reference like this. Trade (talk) 00:36, 24 February 2021 (UTC)[reply]

moreIdentifiers scope fix[edit]

Hi. I just want to start by saying -- very useful tool :-). One problem though: the tool is leaking functions and variables into the global scope. So e.g. if two scripts define `viaf_sources`, then your script might break. Similarly for functions like `escapeHtml` or `adapt`, which might mean something different in different scripts.

So probably the easiest solution is to add a scope over your code, from `var wgUserLanguage...` until `$( function($) {`. Something like this should work:

importScript( 'User:Bargioni/moreIdentifiers_system_regex.js' );

(function(){

var wgUserLanguage = mw.config.get( 'wgUserLanguage' );

// [...]

})()

$( function($) {

Let me know if you have any doubts. And thanks again for your tool 🙂 Nux (talk) 17:24, 19 March 2021 (UTC)[reply]

Sorry, actually `})()` needs to be at the end. You can see a working version here: https://www.wikidata.org/w/index.php?title=User%3ANux%2FmoreIdentifiers.js&type=revision&diff=1385673170&oldid=1385668108 Nux (talk) 17:35, 19 March 2021 (UTC)[reply]
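A self-contained illustration of the scoping issue discussed in this thread (the names `viaf_sources` and `escapeHtml` come from the messages above; the values are invented):

```javascript
// Wrapping the gadget code in an IIFE keeps its names private to the
// closure, so they cannot clash with other gadgets' globals.
var result = (function () {
    var viaf_sources = ['LC', 'BNF', 'NUKAT']; // no longer a global
    function escapeHtml(s) {                   // private helper
        return s.replace(/&/g, '&amp;')
                .replace(/</g, '&lt;')
                .replace(/>/g, '&gt;');
    }
    return escapeHtml('<b>' + viaf_sources.length + '</b>');
})(); // note: the })() closes at the very end, as Nux points out

console.log(result);              // "&lt;b&gt;3&lt;/b&gt;"
console.log(typeof viaf_sources); // "undefined" — nothing leaked
```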
@Nux: I appreciate your help a lot! I wrote MI copying parts of other scripts, and I'm not experienced enough to ensure a correct environment for my vars and functions :-| I'm going to apply your fix. Thx a lot. -- Bargioni 🗣 10:05, 20 March 2021 (UTC)[reply]

Don't let NUKAT ids go missing[edit]

Hello! Thank you for moreIdentifiers, I greatly enjoy it! I'd like to suggest fixing the NUKAT ids, which come as

"NUKAT":["vtls008633095"],

from justlinks.json. These are not persistent and are therefore excluded, but this way the persistent ids of NUKAT go missing. Since you load the full viaf.json record anyway for the headings, I suggest postprocessing the NUKAT div in the function add_heading(div, full_cluster) and changing it to the persistent NUKAT value n 2012036794. --Frlgin (talk) 15:12, 10 May 2021 (UTC)[reply]

@Frlgin: Thx for your appreciation and suggestion! I hope to apply it ASAP. -- Bargioni 🗣 20:39, 10 May 2021 (UTC)[reply]

I prototyped a fix at the end of function add_heading. It does not use jQuery, nor does it yet add a hyperlink like id2 = '<a href="'+id2_href+'" target="_blank">' + id2 + '</a>';. There is a little mess with NUKAT ids in Wikidata: ids starting with n are cleared of the whitespace, but the others are not, see the examples. But the official NUKAT URL does use spaces, so Wikidata has to go through wikidata-externalid-url.toolforge.org to make a valid request to NUKAT for such ids. This means that ids starting with n... simply scraped from VIAF do not match the current NUKAT regex. I therefore added a fix for this; you can try it by embedding User:Frlgin/moreIdentifiers_nukat_fix.js in your common.js

if (div.firstChild.name === 'NUKAT') {
    var input = div.querySelector('input[data-raw]');
    var nukat_vtls = input.getAttribute('data-raw');
    var nukat_src = full_cluster.sources.source.reverse().find(s => s['@nsid'] === nukat_vtls);
    var nukat_persistent_id = nukat_src['#text'].split('|')[1];
    // wikidata normalizes (only) ids starting with n - https://www.wikidata.org/wiki/Property:P1207#P3202
    if (nukat_persistent_id.startsWith('n')) nukat_persistent_id = nukat_persistent_id.replace(' ', '');
    input.setAttribute('data-raw', nukat_persistent_id);
    input.setAttribute('data-id', nukat_persistent_id);
    input.removeAttribute('disabled');

    var span = div.querySelector('span.line-through');
    span.innerText = nukat_persistent_id;
    span.classList.remove('line-through');
    span.setAttribute('title', '');
}

Frlgin (talk) 10:58, 15 May 2021 (UTC)[reply]

@Frlgin , Epìdosis: Thx a lot for your fix! I applied it. -- Bargioni 🗣 08:53, 16 May 2021 (UTC)[reply]
Hi @Bargioni, Frlgin: sometimes NUKAT ids have multiple spaces, so replaceAll should be used.
For example : Q463748
I tested this patch: User:Eru/moreIdentifiers.js — eru [Talk] [french wiki] 20:48, 17 May 2021 (UTC)[reply]
@Eru, Epìdosis: Patched, thanks. -- Bargioni 🗣 20:58, 17 May 2021 (UTC)[reply]
Thanks, it was quick — eru [Talk] [french wiki] 21:03, 17 May 2021 (UTC)[reply]
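The difference Eru is pointing at can be shown in two lines (the id below is an invented example with two spaces):

```javascript
// With a string pattern, String.prototype.replace removes only the
// first occurrence; replaceAll removes every occurrence.
const nukatId = 'n 2012 036794';
const once = nukatId.replace(' ', '');
const all = nukatId.replaceAll(' ', '');
console.log(once); // "n2012 036794" — the second space survives
console.log(all);  // "n2012036794"
```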
Hi @Bargioni:
Following this, I added some code to fix the issue with FRBNF, looking at User:Magnus Manske/authority control.js. Can you update your script with User:Eru/moreIdentifiers.js?  – The preceding unsigned comment was added by eru (talk • contribs).
@Eru: Tried, it works perfectly. We can surely include it! --Epìdosis 21:50, 27 May 2021 (UTC)[reply]
Actually, I see there is no link; I found a way to optimize it, I will come back soon with better code. — eru [Talk] [french wiki] 06:45, 28 May 2021 (UTC)[reply]
@Bargioni, Epìdosis: ok it's better now: User:Eru/moreIdentifiers.js
It checks the BNF on start so the link is correct, and for NUKAT it sets the link afterwards, because we don't have the necessary value before. — eru [Talk] [french wiki] 15:27, 28 May 2021 (UTC)[reply]
@Eru, Epìdosis: Patch applied! Thx a lot, Eru! -- Bargioni 🗣 10:54, 29 May 2021 (UTC)[reply]
@Eru: Hi! I've noticed a little problem in the last version of your patch (the version which also has the link to BNF and NUKAT): for BNF, it shows a "?" instead of the label which BNF has in VIAF. Could you have a look? Obviously it is a minor problem, don't worry. Thank you very much! --Epìdosis 10:17, 6 June 2021 (UTC)[reply]
Thanks @Epìdosis: I didn't notice it. I shouldn't have touched the "id" variable.
@Bargioni: here is the hotfix : User:Eru/moreIdentifiers.js. — eru [Talk] [french wiki] 10:49, 6 June 2021 (UTC)[reply]
@Eru, Epìdosis: I applied the patch. Thx, Eru. -- Bargioni 🗣 13:23, 6 June 2021 (UTC)[reply]

Call for participation in the interview study with Wikidata editors[edit]

Dear Bargioni,

I hope you are doing well,

I am Kholoud, a researcher at King’s College London, and I work on a project as part of my PhD research that develops a personalized recommendation system to suggest Wikidata items for the editors based on their interests and preferences. I am collaborating on this project with Elena Simperl and Miaojing Shi.

I would love to talk with you to know about your current ways to choose the items you work on in Wikidata and understand the factors that might influence such a decision. Your cooperation will give us valuable insights into building a recommender system that can help improve your editing experience.

Participation is completely voluntary. You have the option to withdraw at any time. Your data will be processed under the terms of UK data protection law (including the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018). The information and data that you provide will remain confidential; it will only be stored on the password-protected computers of the researchers. We will use the anonymized results to provide insights into the practices of editors in item selection for editing, and publish the results of the study at a research venue. If you decide to take part, we will ask you to sign a consent form, and you will be given a copy of this consent form to keep.

If you’re interested in participating and have 15-20 minutes to chat (I promise to keep the time!), please either contact me at kholoudsaa@gmail.com or kholoud.alghamdi@kcl.ac.uk or use this form https://docs.google.com/forms/d/e/1FAIpQLSdmmFHaiB20nK14wrQJgfrA18PtmdagyeRib3xGtvzkdn3Lgw/viewform?usp=sf_link with your choice of the times that work for you.

I’ll follow up with you to figure out what method is the best way for us to connect.

Please contact me if you have any questions or require more information about this project.

Thank you for considering taking part in this research.

Regards

Kholoudsaa (talk)

Cleanup[edit]

Hi Bargioni

In June, there was a user who created a series of items despite not being authorized to edit WMF sites. It seems that besides this user, you (or a script of yours) edited some of these items. If the script works well, I suppose it doesn't matter whether you edited one or hundreds of such items. Would you mind if we deleted your contributions together with those of that user? --- Jura 13:41, 14 July 2021 (UTC)[reply]

@Jura1: Yes, please, delete them. -- Bargioni 🗣 14:01, 14 July 2021 (UTC)[reply]

difference between moreIdentifiers_defaultconf.js & moreIdentifiers.js[edit]

hello Bargioni, I have activated both of them, however only moreIdentifiers.js seems to be working. Can you please elaborate on what moreIdentifiers_defaultconf.js is for? -28july21 (talk) 00:51, 19 August 2021 (UTC)[reply]

Hi @28july21:, the gadget is made of two parts, moreIdentifiers.js and moreIdentifiers_defaultconf.js; the second part only supports the functioning of the first one. --Epìdosis 08:42, 19 August 2021 (UTC)[reply]
hello @Epìdosis: ok, I thought the two differed in functionality. Is it ok to have both of them in my common.js? -28july21 (talk) 08:46, 19 August 2021 (UTC)[reply]
@28july21: Yes, it is ok :) --Epìdosis 08:47, 19 August 2021 (UTC)[reply]

moreIdentifiers and CANTIC[edit]

VIAF has moved the CANTIC id from field 1 to field 35. You can read the explanation by Epìdosis here. Is it possible to update moreIdentifiers in order to still be able to import CANTIC id? Thanks! --FogueraC (talk) 11:49, 31 August 2021 (UTC)[reply]

@FogueraC: Yep, I read it. We hope to update the tool. -- Bargioni 🗣 12:56, 31 August 2021 (UTC)[reply]

moreIdentifiers and dob/dod[edit]

Hi! I see that if you load all the info of a VIAF ID, the date of birth and date of death are also listed. Would it be an idea to also import that information? I think it might also be possible to import labels and aliases from VIAF? I don't really understand the language codes, but there are versions of "Jane Austen" in there in scripts other than the Latin alphabet. 1Veertje (talk) 21:36, 27 September 2021 (UTC)[reply]

@1Veertje: Hi, I'll look into labels. About dates, I prefer not to add them automatically from VIAF. It is not a primary source, so we would have to build a reference to the VIAF source of your choice that reports the date. I also detected some issues with dates in VIAF: please see page 181 of my paper https://www.jlis.it/article/view/12595. This is why I suggest you look at two gadgets I wrote with @Epìdosis:

Thx for your idea. -- Bargioni 🗣 14:17, 28 September 2021 (UTC)[reply]

quickpreset[edit]

hi Stefano, I'm going over this lesson again and I wonder whether, with some adjustments, the very useful quickpresets function could work not only for humans but also for buildings. I would need it for the cultural heritage cataloguing work I do for Wiki Loves Monuments. Bye, Susanna Giaccai (talk) 18:04, 14 October 2021 (UTC)[reply]

@Giaccai: Hi Susanna! QuickPresets can certainly be extended to buildings too; its main limitation is that it can only facilitate the insertion of properties whose datatype is "item" or "string", so for example it cannot be used for dates, coordinates or addresses. I have now taken care of it. --Epìdosis 20:19, 14 October 2021 (UTC)[reply]
@Giaccai, Epìdosis: Thanks. But what do you mean by "taken care of it"? -- Bargioni 🗣 20:37, 14 October 2021 (UTC)[reply]
I extended User:Epìdosis/quickpresets - base.js. --Epìdosis 20:40, 14 October 2021 (UTC)[reply]
Ok, I see that Susanna is using it too. -- Bargioni 🗣 20:49, 14 October 2021 (UTC)[reply]
thanks @Epìdosis: that's already a nice step forward! Bye, Susanna Giaccai (talk) 13:36, 15 October 2021 (UTC)[reply]

moreIdentifiers proposal[edit]

Currently, moreIdentifiers submits one edit for each identifier, and submitting every identifier takes a lot of edits. However, I think it's possible to submit a single edit that adds all those identifiers at once, as shown by OpenRefine and the beta version of User:Jitrixis/nameGuzzler.js. Could you enable an option to add all those identifiers at once instead of over multiple edits, either by updating the gadget or by creating a beta version? The reason I'm asking is so that my edit count doesn't grow too high. Also, you have a year-old, as-yet-unaddressed request at User_talk:Bargioni/moreIdentifiers#Warn_about_duplicated_identifiers. ミラP@Miraclepine 20:16, 9 January 2022 (UTC)[reply]
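For what it's worth, a hedged sketch of the single-edit idea: the wbeditentity API accepts one data payload containing many claims, so several identifiers could land in a single edit. The property/value pairs below are invented examples, and `buildBatchPayload` is a hypothetical helper, not the gadget's actual code.

```javascript
// Build one wbeditentity payload carrying several identifier claims,
// instead of one wbcreateclaim request per identifier.
function buildBatchPayload(pairs) {
    // pairs: [{ property: 'P...', value: '...' }, ...]
    return {
        claims: pairs.map(p => ({
            mainsnak: {
                snaktype: 'value',
                property: p.property,
                datavalue: { value: p.value, type: 'string' }
            },
            type: 'statement',
            rank: 'normal'
        }))
    };
}

const payload = buildBatchPayload([
    { property: 'P214', value: '308687920' },   // VIAF ID (invented value)
    { property: 'P1207', value: 'n2012036794' } // NUKAT ID (invented value)
]);
console.log(payload.claims.length); // 2

// A gadget could then submit everything as a single edit, e.g.:
// new mw.Api().postWithToken('csrf', { action: 'wbeditentity',
//     id: 'Q42', data: JSON.stringify(payload) });
```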

Request[edit]

In the Wikidata page of "National Olympic Committee of the Azerbaijani Republic" (Q1208145), can you replace the defunct weblink « http://www.noc-aze.org/ » with the current official weblink « https://olympic.az »? « http://www.noc-aze.org/ » redirects to « https://www.aztelekom.org/ ».

Yours sincerely, 31.200.21.19 08:45, 12 April 2022 (UTC)[reply]

✓ Done. --Epìdosis 08:52, 12 April 2022 (UTC)[reply]

QuickNames error (?)[edit]

Hi! Thank you for this great tool! It's really extremely useful! I think I may have found an error; please check it. For the items Jan Kadłubiski (Q111601942) and Czesław Kajdas (Q111602202), the tool offered not just to create an item for the surname (which was totally correct), but also for the first name (which was wrong, because first names were already added to those items). I don't know why that happened, so I'm reporting it here. Thanks again for the great work! Powerek38 (talk) 16:14, 17 April 2022 (UTC)[reply]

Thx for reporting! The tool is very young, and requires careful testing, since sur/names can be complicated.
Please report the same problem with another example, without P734/5. Both of your examples already have sur/names.  Bargioni 🗣 16:25, 17 April 2022 (UTC)[reply]
Ok, here's an example which I haven't edited yet: Czesław Radzewicz (Q92872024). I think it reacts that way to some Polish first names, in this case Czesław (Q1149600), I don't know why.
Another thing I've noticed is that it is useless for Polish women with double surnames, for example Alicja Resich-Modlińska (Q9148217). In our language, when a woman (legally it's possible for men too, but it happens very rarely) wants to use both her maiden name and her husband's name, the two names are separated by a "-" sign. In terms of Wikidata, we usually express this by adding two values of family name (P734), both with series ordinal (P1545) qualifiers. This is not so important, but just to let you know. Powerek38 (talk) 16:50, 17 April 2022 (UTC)[reply]
A name like Czesław (Q1149600) won't be selected as a male given name if its English description is just "given name". QuickNames requires an English description of "male/female/unisex given name" to work. I added "male": please try again with Czesław Radzewicz (Q92872024).
I'll study the surname containing "-". Interesting. Thx again.  Bargioni 🗣 17:08, 17 April 2022 (UTC)[reply]
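A minimal sketch of the English-description check explained above (the exact wording the gadget matches is an assumption based on this thread, not the gadget's verified source):

```javascript
// An item qualifies only if its English description names the gender
// explicitly, e.g. "male given name" — a bare "given name" is skipped.
function isTypedGivenNameItem(enDescription) {
    return /^(male|female|unisex) given name$/.test(enDescription || '');
}

console.log(isTypedGivenNameItem('male given name')); // true
console.log(isTypedGivenNameItem('given name'));      // false — as with Czesław
```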
Thank you for the explanation! One more issue which I found today: when using the tool on Antoni Iskra (Q107436791), it suggested creating the surname "Iskra". I thought that was quite strange, because it's a fairly common Slavic name, so I manually found Iskra (Q12791467). Having said that, I must have created well over 100 items today using the tool, thanks again for developing it! Powerek38 (talk) 19:13, 17 April 2022 (UTC)[reply]
Please take a look at version 1.1, and especially at the new Settings tab. I hope it solves Polish compound family names.  Bargioni 🗣 09:55, 19 April 2022 (UTC)
Please also take a look at version 1.2 for the issue related to Antoni Iskra (Q107436791).  Bargioni 🗣 09:55, 19 April 2022 (UTC)[reply]
Thank you for the update, it's very useful! I have one last suggestion (last for now, anyway): have you considered integrating this script into your code? I use it every time after I create new item with your tool. Powerek38 (talk) 11:10, 19 April 2022 (UTC)[reply]

Hi, I often have a problem with this gadget: it does not load on some items... for example, on Musica Elettronica Viva (Q1500785) it doesn't appear for me... and yet, the BNF ID (P268) in the cluster is an error (a publishing house in Versailles, not the musical project in Italy)...

but since I cannot activate the gadget, I don't know how to report this ID as wrong to VIAF...

Would it be possible to get this functionality in some other way, so as to be able to report errors, please? Doing it manually is quite painful :(

Thanks for your help :) Hsarrazin (talk) 14:08, 31 May 2022 (UTC)[reply]

Hello, partly solved thanks to @Jahl de Vautban, who advised me to create my own User:Hsarrazin/moreIdentifiers defaultconf.js...
I had to put "any" to be able to get the gadget on all "organization" clusters, which is much more than needed, because I don't need Works...
Could you please help me on this ? Hsarrazin (talk) 14:55, 31 May 2022 (UTC)[reply]
@Hsarrazin: Hi! The default setting is that it appears only on items being instance of (P31)human (Q5), mainly because the quality of non-personal VIAF clusters is often poor and so we want to limit cases in which users in good faith extract wrong IDs from conflated VIAF clusters; however, it is easily possible to make moreIdentifiers appear on all types of items by changing its settings to "any", as you saw. If you are interested in persons and works, "accepted_P31": [ "5", "7725634" ] could probably work (assuming that every work has instance of (P31)literary work (Q7725634)). --Epìdosis 15:00, 31 May 2022 (UTC)[reply]
Thanks @Epìdosis,
I am presently (manually) aligning my professional "Collectivities (author)" list with VIAF, and happen to stumble on errors (BIG ones), so I wanted to use this tool to add those to the VIAF errors collected on wikidata... (like I usually do when I find errors on human authors)
but I cannot find an appropriate conf to get all "Collectivités" (is it "organisations" in English? I read VIAF in French) without using the "any" conf, which covers really too many items, since I am NOT interested in "works" (on VIAF, I mean) :) Hsarrazin (talk) 07:48, 1 June 2022 (UTC)[reply]
@Hsarrazin: OK, I understood you were interested in persons and works, not persons and organizations ... I read too fast! So, the problem is that the configuration of the types of items on which MI should appear is based on the instance of (P31) value, not on the type of VIAF cluster, because reading the first is much easier than reading the latter. For this reason, since on Wikidata organisations (like places) can have very different instance of (P31) values (while persons and works are very predictable), I think that "any" is effectively the only solution :( --Epìdosis 07:53, 1 June 2022 (UTC)[reply]
okay... Once I finish my list, I will probably deactivate the "any" parameter, because it slows down item display for no real benefit (for me)...
Thanks a lot for your help :) Hsarrazin (talk) 10:02, 1 June 2022 (UTC)[reply]
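For later readers of this thread, the per-user configuration discussed above can be sketched roughly as follows (the variable name and exact layout are assumptions, not the gadget's verified API; see the gadget's own documentation page for the real format):

```javascript
// Rough sketch of a per-user moreIdentifiers configuration.
// The key accepted_P31 comes from this thread.
var moreIdentifiers_conf = {
    // numeric parts of the accepted instance of (P31) values:
    // 5 = human (Q5), 7725634 = literary work (Q7725634)
    accepted_P31: [ '5', '7725634' ]
    // ... or 'any' to make the gadget appear on every item
};

console.log(moreIdentifiers_conf.accepted_P31.length); // 2
```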

User:Bargioni/moreIdentifiers.js - information loads only sometimes after reloading the object (but not every time)[edit]

Hello Bargioni, thanks a lot for this tool!

I have the problem that the additional identifiers only load occasionally, so the item has to be reloaded several times until the additional identifiers show up.

Example: d:Q113224926 would offer more identifiers, but they show up only sometimes after reloading the item with the F5 key in the browser again and again.

Is there a way to explicitly force "moreIdentifiers.js" to be called on user request?

For example, tools like

add an additional entry in the navigation bar on the left side.

Thanks a lot! M2k~dewiki (talk) 12:25, 23 July 2022 (UTC)[reply]

Unfortunately I have also experienced the same problem, on some items. We haven't spotted the cause yet, but we continue the investigation ... --Epìdosis 13:16, 23 July 2022 (UTC)[reply]
From my point of view, a workaround might be to add an entry in the navigation bar on the left (similar to User:YMS/labelcollect.js or User:Frettie/consistency check add.js), like "add moreIdentifiers (now)", to explicitly force "moreIdentifiers.js" to be called on user request (?!?). --M2k~dewiki (talk) 13:21, 23 July 2022 (UTC)[reply]
Yep, but MI requires a config, so we have to load two scripts, the second one only after the first has fully loaded.
Do any errors related to MI appear in your browser console? Thx.  Bargioni 🗣 13:27, 23 July 2022 (UTC)[reply]
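The ordering requirement described here can be sketched generically with promises (`loadScript` below is a stand-in for a real script loader; here it only records the order in which "scripts" finish loading):

```javascript
// Generic sketch of "load the config, then the script".
const order = [];
function loadScript(name) {
    return Promise.resolve().then(() => { order.push(name); });
}

// Chaining guarantees the second load starts only after the first completes.
const ready = loadScript('moreIdentifiers_defaultconf.js')
    .then(() => loadScript('moreIdentifiers.js'));

ready.then(() => console.log(order.join(' -> ')));
// prints "moreIdentifiers_defaultconf.js -> moreIdentifiers.js"
```

In a real common.js one could similarly chain promise-returning loads instead of firing two independent importScript lines; this is only the general pattern, not the gadget's actual loading code.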

For https://www.wikidata.org/wiki/Q113224926 I get with Google Chrome:

index.php?title=User:Bargioni/moreIdentifiers.js&action=raw&ctype=text/javascript:308 Uncaught (in promise) ReferenceError: moreIdentifiers_props is not defined
    at moreIdentifiers_addinterface (index.php?title=User:Bargioni/moreIdentifiers.js&action=raw&ctype=text/javascript:308:51)
    at index.php?title=User:Bargioni/moreIdentifiers.js&action=raw&ctype=text/javascript:636:3
moreIdentifiers_addinterface @ index.php?title=User:Bargioni/moreIdentifiers.js&action=raw&ctype=text/javascript:308

--------

Access to XMLHttpRequest at 'https://wikidata-primary-sources.toolforge.org/entities/Q113224926?dataset=' (redirected from 'https://tools.wmflabs.org/wikidata-primary-sources/entities/Q113224926?dataset=') from origin 'https://www.wikidata.org' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.

--------

wikidata-primary-sources.toolforge.org/entities/Q113224926?dataset=:1          Failed to load resource: net::ERR_FAILED

--------

--M2k~dewiki (talk) 13:33, 23 July 2022 (UTC)[reply]

Even if MI is not perfect while loading, you simply missed
importScript( 'User:Bargioni/moreIdentifiers_defaultconf.js' );
in your common.js. Please add this line before the one loading moreIdentifiers.js, and try again.  Bargioni 🗣 13:41, 23 July 2022 (UTC)[reply]

I have now added the config in my common.js; the first error message disappeared, the second and third messages still exist. --M2k~dewiki (talk) 13:44, 23 July 2022 (UTC)[reply]

They are both related to toolforge.org, and toolforge.org services seem to be down today... :-(
Bye.  Bargioni 🗣 16:07, 23 July 2022 (UTC)[reply]

Hi, could you add the possibility to add Scandinavian middle family name (P6978) to Quicknames 2? Thank you! --Emu (talk) 16:58, 25 August 2022 (UTC)[reply]

Added. Please, try it.  Bargioni 🗣 22:03, 25 August 2022 (UTC)[reply]
I just tried it, it works! Thank you! --Emu (talk) 23:06, 25 August 2022 (UTC)[reply]

User:Bargioni/MnM ext2.js[edit]

Hi, I just saw User:Bargioni/MnM ext2.js, which I might have inadvertently broken with my last update. I can see it adds a "remove" link for preliminary matches, which I should probably add myself. But I don't quite understand what the mutation observer does. Is it just to trigger the UI update once my MnM stuff is in place? --Magnus Manske (talk) 13:27, 14 September 2022 (UTC)[reply]

Update: Added match removal to my MnM script. --Magnus Manske (talk) 14:47, 14 September 2022 (UTC)[reply]
Fine, I'll disable my extension.
Please also add the possibility to declare an entry as N/A. This will make it possible to work entirely in your MnM gadget when editing an item. Bye.  Bargioni 🗣 16:13, 14 September 2022 (UTC)[reply]
N/A option will now appear once you remove the automatch. --Magnus Manske (talk) 08:26, 15 September 2022 (UTC)[reply]
Yes, it waits for your interface to load and adds the missing functionalities. I'm not sure it is technically correct; anyway, it seems to work.  Bargioni 🗣 16:11, 14 September 2022 (UTC)[reply]

@Magnus Manske: thanks very much for the addition of removal to MnM script! --Epìdosis 17:23, 14 September 2022 (UTC)[reply]

Transliteration source[edit]

Dear Bargioni,

I would like to ask you where the transliterations of Julius Guttmann (Q75333), as an example, come from? They do not follow any DIN/DMG standard and may apply to a language other than German.-- Oberbefehlshaber Ost (talk) 17:38, 1 November 2022 (UTC)[reply]

Hi, the transliterations come from GND, since I'm using Magnus Manske's "GND reveal" gadget. If you are interested in testing/using it, add this line of code
importScript( 'User:Magnus_Manske/gnd_reveal.js' );
to your common.js. A link with the same name will appear in the left column of an item page, if the item contains GND ID (P227).
Anyway, even if the data come from a valuable source like GND, the user is responsible for the quality. So, thanks to your feedback, in the future I'll omit aliases that are transliterations, since I'm not an expert in this field, and aliases do not carry a reference (unfortunately).
Please feel free to modify Julius Guttmann (Q75333) and even to discuss transliterations with the German national library, if you like.  Bargioni 🗣 11:48, 2 November 2022 (UTC)[reply]

FAST ID[edit]

Was the item Ezra Attiya imported from FAST or from another catalogue? Budy Greene (talk) 13:48, 9 November 2022 (UTC)[reply]

Yes.  Bargioni 🗣 17:01, 9 November 2022 (UTC)[reply]

Wrong websites from the GND script[edit]

Hi Bargioni,

Francis Wolf (Q115707619) is said to have 2 personal websites:

This is bogus. Could you please fix the GND script? İdealist Kadınlar (talk) 20:56, 16 December 2022 (UTC)[reply]

@İdealist Kadınlar: Both are clearly wrong as official website (P856); removed. Unfortunately, the script cannot be corrected by Bargioni or me directly, because it is User:Magnus Manske/gnd reveal.js and has to be edited by @Magnus Manske:; I would suggest simply removing the equivalence between "Quelle" and official website (P856). --Epìdosis 21:36, 16 December 2022 (UTC)[reply]

Teognide VcBA[edit]

Hi Bargioni, best wishes for the new year! You added a VcBA ID on Theognis of Megara (Q336115) that has been withdrawn. I don't know where it comes from, but if it is used by a particular library it would be useful to have it as a reference. --Jahl de Vautban (talk) 08:58, 7 January 2023 (UTC)[reply]

Thanks, best wishes to you too. In my opinion it comes from a Mix'n'match sync of VcBA. If it is not correct, I'll inform VcBA. I'm grateful for your attention. BTW, I notice the huge number of VIAF clusters :-(  Bargioni 🗣 09:16, 7 January 2023 (UTC)[reply]
It's inevitable... see this query [2]: VIAF loses its way with many ancient or medieval authors (Theognis doesn't appear because the dump is from 4 January and I added everything on the 5th). To the point that I become suspicious when there are not at least 3 or 4 VIAF clusters for these authors (which does happen, for example with Gaius Valerius Flaccus (Q316090)). I read your article "Beyond VIAF"; it seems there is still a lot to do regarding these authors. --Jahl de Vautban (talk) 10:35, 7 January 2023 (UTC)[reply]
Thanks for reading the article, and also for writing here in Italian!
We don't know why VIAF doesn't manage to take advantage of the remarkable work done on Wikidata, at least for the clustering of ancient authors. Let's hope they notice this opportunity. Bye.  Bargioni 🗣 10:59, 7 January 2023 (UTC)[reply]

Request[edit]

Hello.

Can you add the photo File:British actress Mary Irene Miller 1997.jpg to the Wikidata page of the actress "Mary Millar" (Q2188163)?

Yours sincerely, 31.200.17.120 23:38, 12 January 2023 (UTC)[reply]

https://www.wikidata.org/w/index.php?title=Q2188163&action=history shows it was added at 03:01, 13 January 2023 by user Lockal. Bye.  Bargioni 🗣 10:29, 13 January 2023 (UTC)[reply]

Strange behaviour of moreIdentifiers with two NUKAT IDs[edit]

Hello Bargioni, writing in English this time. I stumbled upon a strange behaviour of moreIdentifiers on Q3279558#P1207. There are two NUKAT IDs in two different clusters, but the gadget apparently considers the first ID part of both clusters, while the second one isn't linked to any. I'd be most grateful if you could have a look. Cheers, --Jahl de Vautban (talk) 09:56, 17 January 2023 (UTC)[reply]

You are right, thanks. The correct attribution of an identifier to its cluster is indeed not so simple. I'll try to patch MI.  Bargioni 🗣 10:41, 17 January 2023 (UTC)[reply]

Stop[edit]

Please stop your quickstatements batch and check messages — Martin (MSGJ · talk) 22:43, 18 February 2023 (UTC)[reply]

Wikidata:Edit groups/QSv2/123260 — Martin (MSGJ · talk) 22:59, 18 February 2023 (UTC)[reply]
Sorry, I had to block your account, as I have no other way to stop this batch and you are not responding to messages. Any admin may unblock once assured that the batch is cancelled, and I will discuss further in the morning — Martin (MSGJ · talk) 23:02, 18 February 2023 (UTC)[reply]
@MSGJ: Batch stopped and I'm undoing it (although it will be a slow process, seemingly); thanks for noticing the problem. Some years ago Wikidata admins had the possibility to stop QuickStatements batches, but unfortunately it has been removed ... --Epìdosis 08:29, 19 February 2023 (UTC)[reply]
Thx for your action. I'll try to submit a better batch. Also thx to Epìdosis for reactivating my account.  Bargioni 🗣 10:47, 19 February 2023 (UTC)[reply]
Apologies again, and thanks to Epidosis from me also — Martin (MSGJ · talk) 13:28, 19 February 2023 (UTC)[reply]
Don't worry. Absolutely: I learnt a lot about QS thanks to your attention.  Bargioni 🗣 14:15, 19 February 2023 (UTC)[reply]

Please Help for correction on Q109852921#p8034[edit]

Hello,

The item mainly references a living French literature historian who edited 16th century texts, but the VIAF cluster (68948096) also joins a probable 18th century homonym, which caused a very wrong import of dates on Q109852921, because of Vatican Library VcBA ID (P8034) "495/6220"...

Could you please help me fix this? I think you are used to working with the Vatican Library... Thanks :)

also @Epìdosis: who may help on this... Hsarrazin (talk) 14:19, 20 March 2023 (UTC)[reply]

@Hsarrazin: I think this can be solved only at the VIAF level, not at the BAV level; notoriously, we unfortunately don't have an effective way to get VIAF conflations solved. I created a new item for Marie-Madeleine Fontaine (Q117223661): French nun (1723-1794), distinct from Marie-Madeleine Fontaine (Q109852921): French literary historian, hoping it could maybe help a bit. --Epìdosis 14:50, 20 March 2023 (UTC)[reply]
Thanks! I know how to report a VIAF conflation using Bargioni's tool when the ID has not been added to an item... but I could not find a way after it was added... and I fear a bot automatically re-adding it if I remove it from the item :) Hsarrazin (talk) 15:09, 20 March 2023 (UTC)[reply]

moreIdentifiers.js[edit]

Hello Bargioni,

I'm wondering if there is a possibility to get a list of items that have a VIAF ID whose cluster has a J9U ID, but where the J9U ID has not been added to the item (maybe using SPARQL?). If we have a list, librarians from the National Library of Israel can check it manually. Geagea (talk) 13:32, 3 May 2023 (UTC)[reply]

A single SPARQL query is not sufficient. I think it's possible to do the job starting from the VIAF links dump. Please wait.  Bargioni 🗣 20:55, 3 May 2023 (UTC)[reply]
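The dump-based approach mentioned above could look like this (a minimal sketch, assuming the tab-separated format of the VIAF links dump, `http://viaf.org/viaf/<cluster>[TAB]<SOURCE>|<id>`; the function name and sample data are made up):

```python
from collections import defaultdict

def j9u_candidates(lines):
    """Group VIAF links-dump lines by cluster and yield (QID, J9U id)
    pairs for clusters linking both Wikidata (WKP) and J9U.
    Assumes lines like 'http://viaf.org/viaf/123\\tWKP|Q42'."""
    clusters = defaultdict(dict)
    for line in lines:
        cluster, _, link = line.strip().partition("\t")
        source, _, value = link.partition("|")
        clusters[cluster][source] = value
    for links in clusters.values():
        if "WKP" in links and "J9U" in links:
            yield links["WKP"], links["J9U"]

sample = [
    "http://viaf.org/viaf/1\tWKP|Q42",
    "http://viaf.org/viaf/1\tJ9U|987007305652505171",
    "http://viaf.org/viaf/2\tWKP|Q1",
]
print(list(j9u_candidates(sample)))  # [('Q42', '987007305652505171')]
```

The remaining step would be checking each candidate item on Wikidata and keeping only those where J9U (P8189) is actually missing.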

Wikipedia in Cebuano[edit]

Hello Bargioni, the articles in Cebuano Wikipedia (Q837615) have been created mostly by bots. If an item has only a link to ceb.wikipedia adding VIAF is often wrong. Example: Oberfeld (Q33393272) ≠ VIAF 8145602335501361591.[3] Is there a way to check these batch edits? Kolja21 (talk) 14:42, 21 May 2023 (UTC)[reply]

@Kolja21: that first import in November 2019 was my first collaboration with Bargioni (it was based on a very simple reasoning: if VIAF X links to Wikidata Y, add a link to VIAF X in Wikidata Y if not already present) and surely lacked some prudence which I would now advise to myself and to others. So, unfortunately, since we didn't add any reference to the VIAF ID (P214) values, tracing the IDs added in that import is a bit complicated. Instead, it is technically possible to query all items having only 1 sitelink, precisely to ceb.wikipedia, and a VIAF ID (https://w.wiki/6jgG: 13249 results presently); if you think it a good idea (I tend to agree, I would just like to do a few random checks), these VIAFs (most of which were probably imported on that occasion) can just be massively removed. We can discuss it here. Bye from Athens with Bargioni (User:Epìdosis/NLG), --Epìdosis 15:44, 21 May 2023 (UTC)[reply]
I've checked the first five imports. All are wrong or duplicates:
--Kolja21 (talk) 16:04, 21 May 2023 (UTC)[reply]
@Kolja21: so I adopted the massive solution: I used the above query (https://w.wiki/6jgG) to start a massive removal, both of VIAFs (https://quickstatements.toolforge.org/#/batch/207049) and of related WorldCat Identities IDs. I hope it will clean a bit. --Epìdosis 09:39, 1 June 2023 (UTC)[reply]
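For reference, a removal batch like the one linked above can be expressed as QuickStatements v1 commands, where a leading `-` deletes a statement instead of adding it (a sketch with made-up item/ID pairs; the WorldCat removals would be analogous):

```python
def qs_remove_commands(pairs):
    """Build QuickStatements v1 removal commands for VIAF ID (P214):
    one tab-separated '-QID P214 "value"' line per statement."""
    return "\n".join(f'-{qid}\tP214\t"{viaf}"' for qid, viaf in pairs)

print(qs_remove_commands([("Q33393272", "8145602335501361591")]))
```

Feeding such lines to QuickStatements removes the statements in bulk, which is what batch 207049 did for the ceb.wikipedia-only items.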

Mix'n'match catalogs[edit]

Hello,

I see you are experienced with importing catalogs so hope you can help me with my projects: I would like to create two new catalogs to be associated with Rodovid ID (P1185) and Africultures structure ID (P11462) respectively. Is this feasible? I have tried myself but was not successful. Moumou82 (talk) 08:26, 6 August 2023 (UTC)[reply]

Thx for involving me in your project.
I'll study these catalogs in the coming days.  Bargioni 🗣 08:45, 6 August 2023 (UTC)[reply]
@Moumou82 Scrape is in progress.  Bargioni 🗣 14:53, 27 August 2023 (UTC)[reply]
Hello, may I ask what the status of the scraping is? Moumou82 (talk) 14:08, 16 October 2023 (UTC)[reply]
The scrape was difficult for letters outside the A-Z range.
The file is very large, so I asked Magnus (more than one month ago) to upload it to MnM. No reply so far. What do you prefer to do? Thx.  Bargioni 🗣 17:58, 17 October 2023 (UTC)[reply]
Do you refer to Rodovid or Africultures? I am wondering if someone else can help if @Magnus Manske is busy. Moumou82 (talk) 19:37, 18 October 2023 (UTC)[reply]
Yep, I refer to Rodovid ID (P1185) only. The Africultures structure ID (P11462) website, in my opinion, unfortunately cannot be crawled for scraping.
I'm not aware of other people who could upload big catalogs to MnM, other than Magnus. Maybe @Epìdosis can suggest a name?  Bargioni 🗣 15:51, 19 October 2023 (UTC)[reply]
I'm pretty sure only Magnus can do it, unfortunately. --Epìdosis 20:50, 19 October 2023 (UTC)[reply]
I sent the file to Magnus: https://mix-n-match.toolforge.org/#/catalog/6068 is up and running, but an error stopped the import at about 15%. Use it anyway; in the meantime I hope that Magnus will upload it once again.  Bargioni 🗣 16:21, 28 October 2023 (UTC)[reply]
@Moumou82 The error in the import was corrected and Magnus completed the upload. A "match person dates" operation is running right now.  Bargioni 🗣 16:50, 1 November 2023 (UTC)[reply]

Shanghai (band)[edit]

Hi, you've added the same VIAF on both Q10666965 and Q12335101‎; I... guess you'll remove it from the wrong one, thanks in advance. TherasTaneel (talk) 06:54, 24 August 2023 (UTC)[reply]

@TherasTaneel: seems wrong on both, removed. --Epìdosis 07:33, 24 August 2023 (UTC)[reply]
Hi, thanks a lot for detecting this error. Please, always feel free to deprecate or delete a statement :-)  Bargioni 🗣 08:45, 24 August 2023 (UTC)[reply]

VIAF of Paul D. Miller[edit]

Hi Bargioni,

I think that the VIAF of Paul Dennis Miller (Q90450531) is a conflation. Kind regards.-- U. M. Owen (talk) 11:22, 18 September 2023 (UTC)[reply]

J9U[edit]

Hi Bargioni,

I've been working on this page for a week to clean up the mess made by a sockpuppet of Matlin - User:Ewa Lipton - https://quickstatements.toolforge.org/#/batch/212059. But now you have restored the sockpuppet's mistaken edits - https://quickstatements.toolforge.org/#/batch/214126 ... Geagea (talk) 08:47, 8 October 2023 (UTC)[reply]

Hi @Geagea:! Due to a well known issue (Wikidata talk:Mix'n'match#Synchronisation and feedback loops), about which I tried to raise a discussion but as of now with very scarce success (only 2 comments), unfortunately the removal of wrongly matched IDs from Wikidata items doesn't affect Mix'n'match entries, whose matches have to be fixed manually; if a wrong match is removed only from Wikidata, any user could re-add it, triggering the synchronisation of the catalogue; the only way to avoid this negative loop is in fact solving the wrong matches both on Wikidata and on Mix'n'match, which I periodically try to do for most VIAF members. If you need explanations about how to use Mix'n'match, of course you can contact me. I will empty Property talk:P8189/Duplicates/humans in the next days, as I did in September 2022 and in January 2023. Bye, --Epìdosis 09:15, 8 October 2023 (UTC)[reply]
Thanks, Epìdosis. Maybe there could be some kind of tool or automatic way to remove matches in Mix'n'match when using pages like this, let's say when using "Manually update list". Geagea (talk) 09:30, 8 October 2023 (UTC)[reply]
@Geagea: of course it would be useful; it should probably give you an option, when removing an identifier statement, to remove also its match on Mix'n'match without having to open it. You can comment on the thread I linked above if you want, although it probably won't have effects soon. --Epìdosis 09:33, 8 October 2023 (UTC)[reply]

MassMessages[edit]

Hi Bargioni, please take a look at de:Benutzer Diskussion:Bargioni and archive / remove old newsletters. The talk page has exceeded the maximum page size, therefore massmessage delivery is failing [4]. Johannnes89 (talk) 08:37, 12 October 2023 (UTC)[reply]

GND P227 and DtBio P7902 - single value violations[edit]

re GND ID (P227) and Deutsche Biographie (GND) ID (P7902)

https://web.archive.org/web/20231020160350/https://data.dnb.de/opendata/ states that there are 528920 redirects. Could you take that file and resolve the redirects in P7902? If the target value is already in P7902, then delete the old value. For safety, do this only if there is no qualifier on any P7902 statement on that item.

@Kolja21 ok?

Currently there are 1300+ single value violations in P7902. I resolved ca. 200 https://www.wikidata.org/w/index.php?title=Wikidata%3ADatabase_reports%2FConstraint_violations%2FP7902&diff=1993934867&oldid=1991114724 Asterix2023 (talk) 16:31, 20 October 2023 (UTC)[reply]

Looks like the first line of this message is damaged. Please, write it again. Thx.  Bargioni 🗣 17:10, 20 October 2023 (UTC)[reply]
It doesn't look damaged to me and I see no reason to write the message again. Asterix2023 (talk) 17:35, 20 October 2023 (UTC)[reply]

In the past you removed P7902 values; at least some of them are now redirects. But you didn't add the target values, and in at least some cases the target value is currently valid.

So, this request is a preparation to repair possibly flawed past removals. Asterix2023 (talk) 16:27, 21 October 2023 (UTC)[reply]
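The requested procedure could be sketched as follows (a hypothetical helper with made-up data; `redirects` stands for the old-ID → target mapping taken from the DNB redirects file, and the qualifier check implements the safety condition above):

```python
def resolve_p7902(values, redirects, has_qualifier):
    """Resolve redirected Deutsche Biographie (GND) IDs in one item's
    P7902 values. If any P7902 statement carries a qualifier, leave the
    item untouched; if a redirect target is already present, the old
    value is simply dropped."""
    if has_qualifier:
        return values
    out = []
    for v in values:
        target = redirects.get(v, v)   # follow the redirect, if any
        if target not in out:          # drop old value if target already there
            out.append(target)
    return out

redirects = {"sfz12345": "sfz67890"}   # hypothetical old -> new GND IDs
print(resolve_p7902(["sfz12345", "sfz67890"], redirects, has_qualifier=False))
```

On an item holding both the redirected ID and its target, this leaves a single statement, which would also clear the corresponding single-value violation.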

GND P227 and DtBio P7902 - missing VIAF[edit]

re GND ID (P227) and Deutsche Biographie (GND) ID (P7902)

Asterix2023 (talk) 17:46, 20 October 2023 (UTC)[reply]

This is the (plain) text I read in my browser:
re identificativo GND (P227) and identificativo Deutsche Biographie (P7902)  Bargioni 🗣 17:51, 20 October 2023 (UTC)[reply]
2 min before you wrote this, I wrote that this message is damaged and that I write again https://www.wikidata.org/w/index.php?title=User_talk:Bargioni&diff=prev&oldid=1994201696 Asterix2023 (talk) 18:02, 20 October 2023 (UTC)[reply]

This message is damaged; the special editing tool removed everything I added after the pasted content.

Writing again...:

Could you add VIAF V to each item Q having GND G, if the VIAF cluster contains only G, or G and Q? Restrict to items having P31=Q5 or having P7902. The G for these items should be in the VIAF DB; if not, the cluster should be fixed by a new import, a task that the GND DB operator, the DNB, should have done long ago. Asterix2023 (talk) 17:58, 20 October 2023 (UTC)[reply]
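The rule described above can be sketched as a predicate (hypothetical data shapes; `cluster_sources` stands for the set of `(source, id)` pairs found in a VIAF cluster, and the example values are made up):

```python
def should_add_viaf(cluster_sources, g, qid):
    """Return True if the VIAF cluster contains the GND ID G and
    nothing else except, at most, the Wikidata item Q - i.e. the
    'only G, or G and Q' condition of the request above."""
    allowed = {("GND", g), ("WKP", qid)}
    return ("GND", g) in cluster_sources and cluster_sources <= allowed

cluster = {("GND", "118540238"), ("WKP", "Q5879")}
print(should_add_viaf(cluster, "118540238", "Q5879"))  # True
```

A cluster that also contains any other source (LC, BNF, ...) fails the subset test, so only unambiguous GND-anchored clusters would get the VIAF ID added.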

Hello Bargioni, wondering if these are the same? Thank you so much for your time. Lotje (talk) 12:45, 8 December 2023 (UTC)[reply]

@Lotje: they seem clearly the same, merged. @Kolja21: for final check. Thanks! --Epìdosis 13:36, 8 December 2023 (UTC)[reply]
Thank you ever so much Lotje (talk) 13:38, 8 December 2023 (UTC)[reply]
Clearly the same. I've corrected the typo in GND 116158972. Thanks for the hint. --Kolja21 (talk) 22:10, 8 December 2023 (UTC)[reply]

Widget ShowProperties[edit]

Hi @Bargioni!

I'd like to report that I used the show_properties widget for a while.

As a side effect, it also colored lexemes, senses and forms red, probably due to a collision on the class selectors.

As an example, see the lexeme casa (L350), where derived from lexeme (P5191), translation (P5972) and usage example (P5831) have their values colored red by the widget.

Obviously this is a very low-priority report. Thank you for your work, best regards. Luca.favorido (talk) 05:29, 11 April 2024 (UTC)[reply]

Thanks a lot, noted. Either I adapt it to L entities, or I disable it for non-Q pages.  Bargioni 🗣 12:02, 11 April 2024 (UTC)[reply]