The WikiCite annual report 2019-20 has been published. It describes the "satellite event grants" program that was run (and details of the nine successful proposals), the changes that resulted from the COVID-19 shutdown of all in-person events, and a summary of WikiCite-related news from across the movement. For more details see Meta:WikiCite/Administration. For more information, contact LWyatt (WMF).
Past video: Querying Wikidata with a glimpse of SPARQL - Thorsten Butz - PSCONFEU 2020. YouTube
Upcoming video: Wikipedia Weekly Network Live Wikidata editing scheduled for 1st June at 7:00 PM UTC: YouTube, Facebook
Upcoming video (in Spanish): Wikidata Online Workshop: "Connecting Authority Resources with the Knowledge Graph" scheduled for 2nd June at 3:00 PM UTC: YouTube
Tool of the week
wdumps allows you to create a limited RDF dump from Wikidata, for those times when your SPARQL queries keep timing out. It is not particularly user-friendly, and it typically takes several hours to get a complete dump, but it is the best way to get, for example, a list of all English names of humans in Wikidata, or a list of every scientific article with its title and DOI.
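To make the "humans with English names" example concrete, here is a sketch of the kind of query that tends to time out on the public endpoint, which is exactly where a wdumps-style partial dump helps. The query text and the helper are illustrative, not taken from wdumps itself; only the query-service URL is real.

```python
from urllib.parse import urlencode

# Illustrative query: all humans (instance of Q5) with their English labels.
# On the full public endpoint this result set is far too large to return
# within the timeout, which is why a partial dump is the better route.
HUMANS_WITH_ENGLISH_NAMES = """
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q5 .            # instance of (P31): human (Q5)
  ?item rdfs:label ?itemLabel .
  FILTER(LANG(?itemLabel) = "en")  # keep English labels only
}
"""

def endpoint_url(query: str) -> str:
    """Build a GET URL for the Wikidata Query Service for a given query."""
    return "https://query.wikidata.org/sparql?" + urlencode(
        {"query": query, "format": "json"}
    )
```

For small result sets the URL produced here can be fetched directly; for a query like the one above, a dump is the only practical option.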
As part of a thesis project, the search engine "Lister" has been developed to make it easier for the general public to access data on the Semantic Web in general, and Wikidata specifically. It is currently in a usability-testing phase, and if you'd like to try it out and help improve it, you can do so at this link. The test should take at most 15 minutes.
Follow the discussion about restricting editing of properties to autoconfirmed users.
New domain toolforge.org to be adopted by our Toolforge community. The URL scheme for Toolforge-hosted webservices will change from tools.wmflabs.org/toolname to toolname.toolforge.org, with the aim of introducing permanent redirects for the legacy URLs on 2020-06-15.
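The announced scheme change is mechanical, so tool maintainers updating hard-coded links can script it. A minimal sketch of the rewrite (an illustrative helper, not an official migration script):

```python
from urllib.parse import urlsplit, urlunsplit

def migrate_toolforge_url(url: str) -> str:
    """Rewrite a legacy tools.wmflabs.org/<tool>/... URL to the new
    <tool>.toolforge.org scheme described in the announcement.
    Non-Toolforge URLs are returned unchanged."""
    parts = urlsplit(url)
    if parts.netloc != "tools.wmflabs.org":
        return url  # not a legacy Toolforge URL; leave untouched
    # The first path segment is the tool name; it becomes the subdomain.
    tool, _, rest = parts.path.lstrip("/").partition("/")
    return urlunsplit(
        ("https", f"{tool}.toolforge.org", "/" + rest, parts.query, parts.fragment)
    )
```

For example, `https://tools.wmflabs.org/mix-n-match/` becomes `https://mix-n-match.toolforge.org/`.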
The Wikidata development team is currently running a survey (until June 9th) to better understand how people access and reuse Wikidata's data from the code of their applications and tools (for example through APIs), and how the tools can be improved to make your workflows easier. If you would like to participate, please use this link (Google Forms, estimated fill-in time: 5 minutes).
Part 2 - The residency at the Bodleian Libraries - YouTube
Part 3 - The Astrolabe Explorer & other datasets - YouTube
Tool of the week
QuickStatements lets you edit thousands of Wikidata items at a time. Add labels and descriptions, or statements with sources and qualifiers, add sitelinks, create or merge items, or remove statements that were created in error. Batches can be discussed and, if necessary, reverted in full through the EditGroups tool.
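For readers new to QuickStatements, here is a minimal sketch of its V1 command format: one TAB-separated command per line, with `Len` addressing the English label. The item/property IDs below are illustrative (Q4115189 is the Wikidata sandbox item):

```python
def qs_add_statement(item: str, prop: str, value: str) -> str:
    """One QuickStatements (V1) command line: TAB-separated
    item, property, value."""
    return "\t".join([item, prop, value])

# A tiny illustrative batch: add a statement, then an English label.
batch = [
    qs_add_statement("Q4115189", "P31", "Q5"),          # instance of: human
    qs_add_statement("Q4115189", "Len", '"Sandbox label"'),  # English label
]
```

Pasting such lines into the QuickStatements interface runs them as a batch, which can then be tracked (and reverted if needed) via EditGroups as described above.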
Mycroft, the open-source digital personal assistant, now has a Wikidata skill to answer questions about current and historical facts and information about a person.
Do you know a Wikidata/Wikibase community member who has accomplished something new or has been inspiring in the last year? If so, please take a few minutes and nominate someone in the 2020 Community Spotlight survey.
Wikidata Bridge: Made a number of technical improvements to wrap up the work in the first version.
Reference Hunt: Making final technical improvements and are now running the process to find references on the whole of Wikidata (as opposed to the initial small subset for testing). Once that's done we'll release them as a dump.
Easier access to data for programmers: Researching and conducting interviews on ways to improve the APIs in preparation for working on them.
Query builder: Working on defining the first version and creating mockups based on the feedback we received.
Part 1 - Meet Ewan, WiR at the Uni' of Edinburgh - YouTube
Part 2 - the Witch Hunts projects & Wikidata - YouTube
Part 3 - Why Wikidata is important for education - YouTube
Tool of the week
Mix'n'match lets you match lists of external identifiers to Wikidata items, or create new items if necessary. Thousands of catalogs already exist, in dozens of different topic areas, and it's easy to import new ones.
Collecting a lot of feedback and doing research on how to improve our APIs, to start making it easier for programmers to access Wikidata's data
Finalized the initial feature list for the query builder and working on designs and prototypes for testing
Doing user research on the Merge gadget to prepare for making it part of the proper code-base so that other Wikibase instances also benefit from easier merging
Federated Properties: continued working on the first version of Federation that will make it possible to use Wikidata's Properties in another Wikibase installation
Start of a new project (Wikibase Decoupling & Extension Registration) in order to clean up our codebase a bit and make it easier to extend with new features in the future
More work on the design system to have a set of unified components for Wikidata that will make it easier to develop new features in the future because we don't have to rewrite components that are used in a lot of places
Some MediaWiki skin changes meant broken edit links on Wikipedia and co. Fixed now. (phabricator:T252800)
Thank you for your work in maintaining Wikidata. I have a small suggestion to improve your future work. If you notice that two items are duplicates, please merge them instead of blanking one of them as you did with the page Q96483499. External sites use Wikidata identifiers, so it is important that we preserve the chain of references. We do this by making one item a redirect for the other. In particular, item ids are intended to be a permanent identifier, so we never reuse them for another concept. See Help:Merge for more information on how to merge items, and consider installing the Merge Gadget. Thanks! Bovlb (talk) 23:30, 21 June 2020 (UTC)
@Bovlb: Thanks for your remark. I understand your point, but in this specific situation I created the item myself, erroneously. I had not carefully enough checked to find the existing one, that was slightly differently spelled. Greetings, --Dick Bos (talk) 15:06, 22 June 2020 (UTC)
OK. In this case you can post a request at WD:RFD to have the item deleted, but merging is also a valid way to deal with it and has the advantage that you can do it yourself. Simply deleting all the claims is not a good approach, not least because it tends to trigger vandalism detectors. :) Happy editing! Bovlb (talk) 15:25, 22 June 2020 (UTC)
The upcoming domain name migration on the Wikimedia Toolforge means that OpenRefine users need to update their Wikidata reconciliation service to the new endpoint, which will be available by default in the upcoming release of OpenRefine (3.4). phab:T254172
Wikirecords - proposal for a Wikibase-based sister-project
Apologies for both of the problematic queries ("MWAPI searches in wikidata about people described as slave traders by citizenship" in Weekly Summary #419 and "Monuments named after/commemorating/depicting slave traders" in Weekly Summary #420). The first lists any people with the word "slave" in a label, description or alias, among them several people who are not slave traders. The second incorrectly linked this query for enslaved people instead of this query for slave traders.
Upcoming: Next Linked Data for Libraries LD4 Wikidata Affinity Group call: Merrilee Proffitt, Chris Cyr, and Rob Fernandez on a project to surface library holdings to indicate possible notability for persons, June 30th. Agenda
Ptable displays the periodic table automatically extracted from information provided by Wikidata; it also provides a check that all the elements are there with some basic properties. Additional pages provide charts of the nuclides under different criteria such as half-life. Each element or nuclide is linked to its Wikidata item for more information or to edit if necessary.
Polishing the first step of Federation (using Wikidata's Properties in another Wikibase installation), including preventing users from selecting a federated property with an unsupported data type (phab:T252012), and preventing users from accessing Special:NewProperty (phab:T255576) or viewing a list of all properties (phab:T246339) when federation is enabled
Continuing research and interviews around the topic of making it easier to access Wikidata's data for programmers
Doing first testing of mockups and prototypes of the first version of the Query Builder - coding can start soon
Convert a few properties from string to external identifier: Linguasphere code (P1396), KOATUU identifier (P1077) and ISIN (P946)
Continued building out documentation for Federated Properties (phabricator:T255651) and making interface improvements to the first stage of the feature (incl. phabricator:T246886, changes to special pages that interact with both Items/Properties, and phabricator:T255581, changes to Special:ListDataTypes when federation is enabled)
More work on the consistent design system
More work on decoupling the different Wikibase extensions from each other to make development easier
Finalizing research and interviews to better understand what could be improved in the way developers access Wikidata's data (APIs, SPARQL)
Testing the first prototype of the Simple Query Builder with some editors to get final input before coding starts
Sorting of language links on Wikipedia and the other Wikimedia projects was broken (presumably by a change in MediaWiki core). A fix is being worked on. (phabricator:T257625)
Upcoming: next Wikidata office hour, July 21st at 16:00 UTC (18:00 CEST) in the Wikidata Telegram group. Query Service special with guests from WMF Search Team.
Upcoming: Wikidata Lab XXIV: Posicionamento digital relativo with Ederporto - July 23 17:00 UTC (14:00 BRT). In this technical training, we'll study the possibilities and functionalities of relative digital positioning in images and do practical activities on this topic using historical photographs of the city of São Paulo. The event will be held in Portuguese. Join us!
Upcoming video: July 21 - Wikipedia Weekly Network - Entity Schemas and Shape Expressions (ShEx): Facebook, YouTube
Upcoming video: July 25 - Wikipedia Weekly Network - LIVE Wikidata editing #13: Facebook, YouTube
Upcoming: Kidok-Workshop, online workshop about church building data. In German, non-native users welcome. Currently looking for a date in the upcoming week and people to help!
Upcoming: Next Linked Data for Libraries LD4 Wikidata Affinity Group call: Liam Wyatt on WikiCite and its future plans, ways to get involved, and discussions that are happening in the community, 28 July. Agenda
Library’s linked-data project gets new grant. "Known as Linked Data for Production, the project is part of a long-term collaboration among Cornell University Library, Stanford Libraries and the School of Library and Information Science at the University of Iowa. Through linked data, information about books and other items in library records will be enhanced by related information from external online sources". By Jose Beduya
Wikidata Training Workshop 1, by Canadian Arts Presenting Association
Part 1 - Introduction to Wikidata - YouTube (En, Fr)
Part 3 - Components of a Wikidata item - YouTube (En, Fr)
Video: Wikidata Lab XXIV on relative digital positioning (in Portuguese). YouTube
Video: Women Writers in Review: Integrating special collections into Wikidata. YouTube
Video: Wikipedia Weekly Network - Entity Schemas and Shape Expressions (ShEx): Facebook, YouTube
Video: Wikipedia Weekly Network - LIVE Wikidata editing #13: Facebook, YouTube
Tool of the week
We would love suggestions for tools to include in this section of the weekly summary. Please add your suggestions directly under Status updates/Next#Backlog after checking that the tool isn't already listed.
Changed the size of image previews to 1024 in the gallery view of the query service to avoid images occasionally failing to load (phabricator:T258241)
Added an actual space between the entity title and the name of the fallback language (if any), so that the fallback language isn't selected anymore when double-clicking the entity title for copying (phabricator:T256857)
Fixed the directionality of text pieces in placeholders that mix LTR and RTL (phabricator:T253812)
Continued work on first pieces of design system to make coding new features easier in the future
Continued untangling the code of Wikibase Client and Wikibase Repo to make it easier to develop on them
Finished first piece of research on how to make it easier to access Wikidata's data for programmers - more work to be done
Preparing to start coding on the Query Builder to make it easier to create queries without having to know SPARQL
Finished running the scraper that gets potential new references for unreferenced statements and preparing it for publishing
The last week was our quarterly prototyping week. We worked on the following projects. None of them are ready for prime-time yet but we'll continue with them.
Slices: We've had a lot of requests for dumps of smaller parts of Wikidata's data, since hardly anyone needs the complete data in Wikidata. The tricky part is figuring out which part is needed and whether any of that can be generalized. We looked, for example, into how to make dump generation faster so we could potentially produce more, smaller dumps that cover only a part of Wikidata's data, either thematically (e.g. humans) or by type of data (e.g. only statements and English labels and aliases, but not sitelinks or descriptions).
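The "by type of data" variant of slicing described above can be sketched as a filter over entities from the JSON dump. This is an illustration of the idea only, not the team's actual prototype; the field names are those of the real Wikidata JSON dump format:

```python
import json

def slice_entity(entity_json: str) -> str:
    """Keep only the claims and the English labels/aliases of one
    Wikidata JSON-dump entity, dropping sitelinks and descriptions.
    Illustrative sketch of the 'slices' prototyping idea."""
    entity = json.loads(entity_json)
    sliced = {
        "id": entity["id"],
        "claims": entity.get("claims", {}),
        "labels": {k: v for k, v in entity.get("labels", {}).items() if k == "en"},
        "aliases": {k: v for k, v in entity.get("aliases", {}).items() if k == "en"},
    }
    return json.dumps(sliced)
```

Run line by line over a dump, this produces a much smaller file that still answers label- and statement-level queries.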
REST API: As part of our effort to make it easier to access Wikidata's data for programmers we looked into a REST API. We tried to see if we could cover the existing action API modules in a REST API. We could. We'll take this as input for our ongoing API work now.
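To illustrate what "covering the action API in a REST API" means in practice, here is a side-by-side sketch. The `wbgetentities` parameters are the real action API; the REST path is hypothetical, since the actual route names were still being drafted at the time of this update:

```python
def action_api_params(item_id: str) -> dict:
    """Query parameters for the existing action API lookup (wbgetentities)."""
    return {"action": "wbgetentities", "ids": item_id, "format": "json"}

def rest_api_path(item_id: str) -> str:
    """A hypothetical REST-style path for the same Item lookup.
    The concrete route names are an assumption for illustration."""
    return f"/w/rest.php/wikibase/v0/entities/items/{item_id}"
```

The REST shape puts the resource (the Item) in the URL rather than in parameters, which is the main ergonomic difference the prototyping week explored.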
Improving quality ratings through ORES: ORES can judge the quality of an Item automatically; however, it is currently not very good at it. We tried a few things to make it more accurate and found some easy wins we'll probably implement in the next weeks.
Query manipulator: One of the ways we could potentially improve the load situation of the Wikidata Query Service is by automatically analyzing and then redirecting a bunch of queries to other systems that are more suitable for that particular type of query. The nice thing about that would be that the person/program sending the query wouldn't have to care about it but it'd be done automagically for them. We tried to build such a system and the results look very promising but more work/experimenting is needed, especially together with the WMF Search team.
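The query-manipulator idea above can be pictured as a small dispatcher in front of the query service. The routing rules and backend names below are invented purely for illustration; the real prototype's analysis is certainly more sophisticated:

```python
def route_query(sparql: str) -> str:
    """Toy illustration of the 'query manipulator' idea: inspect a query
    and pick a backend better suited to it. Rules and backend names here
    are assumptions for illustration only."""
    q = sparql.lower()
    if "wikibase:mwapi" in q:
        return "search"        # full-text lookups could go to the search cluster
    if "schema:about" in q:
        return "sitelinks-db"  # sitelink-only queries could use another store
    return "wdqs"              # everything else stays on the main query service
```

The point of the design is the one named in the update: the sender never sees the routing, so existing tools keep working unchanged while load is spread across more suitable systems.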
Upcoming: Next Linked Data for Libraries LD4 Wikidata Affinity Group call: Daniel Mietchen and Lane Rasberry about Scholia, a project to present bibliographic information and scholarly profiles of authors and institutions, 11 August. Agenda
Video: Editing Wikidata with information from Son jarocho (in Spanish). YouTube
Tool of the week
Entity Explosion: a new multilingual Chrome browser extension. "Taking the power of Wikidata with me wherever I go across the web!". It uses API calls to the Wikidata Query Service to match the URL you are browsing to a Wikidata item, and then displays data and links to other sites about the same entity. (Video)
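Matching a URL to an item generally comes down to looking up an external identifier carried by that URL. Here is a sketch of the kind of query-service lookup an extension like Entity Explosion could use; the extension's real queries may well differ, and the IMDb example (P345) is just an illustration:

```python
def url_match_query(external_id: str, property_id: str) -> str:
    """Build a SPARQL query finding the item that carries a given external
    identifier, e.g. an IMDb ID (P345) extracted from the page URL.
    Sketch only -- not the extension's actual implementation."""
    return f'''
SELECT ?item WHERE {{
  ?item wdt:{property_id} "{external_id}" .
}}
LIMIT 1
'''
```

Given the item returned, the extension can then pull labels, statements, and sitelinks to show data about the same entity on other sites.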
e-Scholarships: per-diem calculated based on your city; 1-5 people (single, or as a 'remote group') for 2-4 days, for COVID-era "stay at home" projects. Living allowance paid in advance; no expense report required.
Upcoming: Next Linked Data for Libraries LD4 Wikidata Affinity Group call: Rob Fernandez (Wikimedia District of Columbia) on Listeria, a tool that uses SPARQL queries to define a list, and provides a bot that will update a wiki page containing that list if the results of that SPARQL query change, all based on Wikidata, 08 September. Agenda
"Our admiration for Wikidata and for the people who work with it knows no bounds. Having a single source of well modelled, massively interlinked, well managed data that anyone can query at the press of a button is a real thing of wonder." - says the UK Parliament data team
Fixing an error that sometimes causes the Wikidata UI to report two error messages instead of one when saving a sitelink in an Item. (phabricator:T260869)
Development of version one of Federated Properties has concluded! Expect an announcement with timing of the release of this feature soon. Wikibase users who want an early look at the feature are invited to reach out to participate in the pre-release testing round.
Development of the WikibaseManifest extension has continued into its second sprint; we focused on determining a product specification for the Manifest output.
Finishing draft documentation for a REST API to get it ready for a round of feedback before implementation.
Finished improvements to the automated scoring of the quality of Items with ORES. Still need to retrain ORES and deploy the changes before the scores are actually different though.
Upcoming: Next Linked Data for Libraries LD4 Wikidata Affinity Group call: A starting point for newer institutions to think through what is involved in coordinating a Wikidata project, including shared infrastructure, training, and documentation, 22 September. Agenda
Upcoming video: Wikipedia Weekly Network - LIVE Wikidata editing #21 WikiDojo: Facebook, YouTube, September 25
WDQS/WCQS Status update (September 2): "We are planning to spend more time doing some analytics on our data. (1) What are the most expensive queries, what are they trying to achieve and is that reasonable? (2) Do we have performant subgraphs that we could expose independently?"
Continuing to write a draft for a REST API specification
Finishing the remaining work needed to get the improved quality scoring for Items deployed to ORES
Continuing work on WikibaseManifest: determined the essential metadata that will be included in the WikibaseManifest file, added some new features (mostly MediaWiki metadata) that were requested by the OpenRefine team (phab:T262805 and phab:T262804), and set up a test system that will soon be ready for tool builders to use for testing the integration of their tools with WikibaseManifest
The International Federation of Library Associations and Institutions (IFLA) has published a series of six videos: "discussions with professionals in order to discuss projects, issues, progress of Wikidata, Wikibase and bibliographic data in the field of libraries." They are currently available as a playlist on YouTube under CC BY (Wikimedia Commons upload coming soon), with subtitles in English, Spanish, Portuguese, and French (Arabic and Chinese coming soon). The videos were produced by the IFLA Wikidata Working Group and funded by a WikiCite grant.
Como is a new Android app that uses Wikidata lexemes and senses to create a word-guessing game. It lets players create new senses and tests them on other players before finally saving them in Wikidata. The app is developed as part of a BA thesis to determine whether this concept is useful for creating more lexicographical data, and testers would be very welcome.
A long-standing bug has been fixed (T217144) where Items and Lexemes created via OAuth would not be added to the user’s watchlist even if the user had the Add pages I create and files I upload to my watchlist setting enabled. (The Add pages and files I edit to my watchlist setting was probably likewise ineffective, but this was not tested specifically.) Affected tools include QuickStatements, Mix'n'Match, and Wikidata Lexeme Forms; users of these and other tools may see more pages being added to their watchlists now. (This only applies to new edits and page creations; previously created or edited pages will not be automatically added to the watchlist retroactively.)
Working on the basic building blocks of the Query Builder towards making it possible to create the first very simple query with it
Talking to people about comparing Wikidata's data against other databases and flagging mismatches
Fixing an issue with Item creations via the API by blocked users leading to skipped entity IDs (phab:T232620)
Fixed an input issue with invisible characters (phab:T261071)
Finishing the draft of the REST API spec to get it ready for feedback
WikibaseManifest: created a separate key for local entities, decided how to handle non-local entity sources based on tool-builder feedback (phab:T263527), and specified the API in OpenAPI format (phab:T262919)