Wikidata:WikiProject LD4 Wikidata Affinity Group/Affinity Group Calls/Meeting Notes/2021-12-14


Call Details

  • Date: 2021-12-14
  • Topic: Wikidata Project Lightning Rounds
  • Speakers: Steve Baskauf, Anne Chen, Jim Hahn


Presentation Materials

Vanderbilt Fine Arts Gallery Update

WikiProject Vanderbilt Fine Arts Gallery update slides

Reference links:

Roundtripping Wikidata into Alma with AlmaRefine

Presentation Slides


Meeting Notes

  • Anne Chen, Yale University, Use of Wikidata in a site-specific archaeological case study (Dura-Europos, Syria)
    • This is a nascent project; Anne showed a series of queries and webpages.
    • Anne is an art historian/archaeologist. The work involves a site called Dura-Europos, excavated in the 1920s. Over ⅓ of the artifacts ended up at Yale, scattered across different departments.
    • Some artifacts ended up in Damascus.
    • They’ve added about 15,000 objects, not including archival photographs, which are the next major mission.
    • They wanted to find a way to virtually join these objects up, since they are scattered.
    • There is also the issue of colonialism, which caused the artifacts to end up in Western collections (where they are only searchable in English).
    • They are interested in using Wikidata because of its multilingual features, as well as its linked data aspects and collaborative editorial opportunities.
    • At this stage, they are working on getting metadata into Wikidata and creating an urban gazetteer: an authority that defines spatial information. At the macro level, a gazetteer disambiguates two places; they are taking the same concept and zooming in. For example, the “Temple of Bel” could be disambiguated using defined coordinates and stable URIs.
    • There is a linguistic factor, as well as the time that has passed since excavation (nearly a century). They are defining URIs within the Pleiades framework and inserting them into Wikidata.
    • They are defining spatial entities down to the room level when there is something of interest, such as a wall fresco, that they want to place in an individual setting: room → wall level. They specify this in the “location of discovery” (P189) Wikidata field.
    • They can run a Wikidata query to visually display all the objects found in a certain place; for the first time, these artifacts are searchable in the same interface. This makes it possible to contextualize artifacts by where they were found spatially (see the query sketch after this section).
    • Another query can show the towers. They are looking to parse the city in its entirety.
    • Wikidata will serve as the backend of a “Dura-Europos” site for users who aren’t familiar with the Wikidata interface. Users will be able to click into objects, powered by queries, and view maps.
    • They are starting to think about how to relate physical entities to archival representations of those entities. Everything is uploaded into ArtStor; the plan is to go there and standardize the metadata.
    • They are creating one item for the physical entity itself and a second item for the facsimile, and working out how to link those objects in a way that preserves the meaning. Example shown: a photo of a graffito.
    • They are setting up the infrastructure and working with Syrian archaeologists who have agreed to contribute their expertise.
    • In the future, building this in Wikidata allows them to keep it open, partner with scholars in other parts of the world (specifically Arabic speakers), and have those scholars participate in the curation of the objects.
    • The interface is open source and can be adopted by others.
    • Wikidata Project Page
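
As an illustration of the place-based queries described above, here is a minimal Python sketch against the Wikidata Query Service. It is not the project's actual query: the place QID is a hypothetical placeholder, and only the properties used (P189 "location of discovery", P625 "coordinate location") are standard Wikidata properties.

  import requests

  # Minimal sketch: list objects whose "location of discovery" (P189) is a
  # given spatial entity, with optional coordinates (P625) for mapping.
  Q_PLACE = "Q99999999"  # hypothetical placeholder; substitute the real room/building QID

  query = f"""
  SELECT ?object ?objectLabel ?coords WHERE {{
    ?object wdt:P189 wd:{Q_PLACE} .              # discovered at this place
    OPTIONAL {{ ?object wdt:P625 ?coords . }}    # coordinate location, if any
    SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en,ar". }}
  }}
  """

  r = requests.get(
      "https://query.wikidata.org/sparql",
      params={"query": query, "format": "json"},
      headers={"User-Agent": "ld4-example/0.1"},
  )
  r.raise_for_status()
  for row in r.json()["results"]["bindings"]:
      print(row["objectLabel"]["value"], row.get("coords", {}).get("value", ""))

The same SPARQL body can also be pasted directly into https://query.wikidata.org/, where the built-in map view will plot any results that have coordinates.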
  • Jim Hahn, University of Pennsylvania, Round-tripping Wikidata into Alma using Alma Refine
    • Penn Libraries participated in the PCC Wikidata Pilot project and has established over 5,000 Wikidata items for serials from the Penn Libraries deep backfiles.
    • They are now in the process of round-tripping Wikidata entities back into their source metadata store, Alma, using AlmaRefine, a version of OpenRefine integrated with Alma.
    • Sets can be created in Alma to narrow searches. They are interested in bringing in QIDs.
    • Support for other Wikibase reconciliation services is also enabled: https://openrefine-wikibase.readthedocs.io/en/latest/ (a sketch of the reconciliation protocol follows this section).
    • BibCard knowledge panel mockups were shown. References to Wikidata can also be included in 758 fields to provide information similar to Cornell’s Discogs implementation (https://newcatalog.library.cornell.edu/catalog/8058246).
    • They used Wikidata Reasonator embedded in Franklin (the catalog); data can be pulled in using jQuery. Screenshots of the front-end mockups were shown, since user testing still needs to be done.
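
For readers curious what AlmaRefine-style reconciliation looks like at the protocol level, here is a rough Python sketch against the Wikidata reconciliation endpoint documented at the link above. The serial title and type QID are illustrative assumptions, not details of Penn's workflow.

  import json
  import requests

  # The public Wikidata reconciliation service (OpenRefine reconciliation protocol).
  ENDPOINT = "https://wikidata.reconci.link/en/api"

  queries = {
      "q0": {
          "query": "Journal of the Franklin Institute",  # hypothetical serial title
          "type": "Q5633421",  # "scientific journal" -- use whatever type fits
      }
  }

  r = requests.get(
      ENDPOINT,
      params={"queries": json.dumps(queries)},
      headers={"User-Agent": "ld4-example/0.1"},
  )
  r.raise_for_status()
  for candidate in r.json()["q0"]["result"]:
      # Each candidate carries a QID, label, score, and match flag that a tool
      # like AlmaRefine can use to write the QID back into the Alma record.
      print(candidate["id"], candidate["name"], candidate["score"], candidate["match"])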
  • Q&A
  • Q for Steve: Interested in how to obtain page view counts for images in Commons: can you talk about that?
    • There is an API that can read this information; it operates across wikis. You need to supply the platform, the page, and a date range. His code is at https://github.com/HeardLibrary/dashboard/tree/master/gallery (a sketch of the API call follows below).
    • Also, https://www.postman.com/ is a way to use APIs in your browser (audience comment).
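
A minimal sketch of that call, using the Wikimedia REST pageviews API: you supply the wiki (here Commons), the page title, and a date range. The file name is a made-up example.

  import requests
  from urllib.parse import quote

  # Per-article pageviews for a Commons file over 2021, at monthly granularity.
  title = quote("File:Example_artwork.jpg", safe="")  # hypothetical Commons file
  url = (
      "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
      f"commons.wikimedia.org/all-access/user/{title}/monthly/20210101/20211231"
  )
  r = requests.get(url, headers={"User-Agent": "ld4-example/0.1"})
  r.raise_for_status()
  for item in r.json().get("items", []):
      print(item["timestamp"], item["views"])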
  • Q for Steve: You wrote your own code for this project. Why didn’t you use existing tools?
    • There are a couple of reasons: he has various scripts for different stages of processing and wants to create a pipeline to feed the data in, so that it automatically goes through the different steps without repeated clicks. While there are a number of great tools, they involve a lot of clicking, exporting, etc., and he would like to just create a shell script to shepherd the data through the process (a toy sketch follows below).
    • Andrew Lih spoke of the “coding chasm” in a previous Affinity Group talk: how do you occupy the middle ground above graphical tools but below professional software development, where you can understand and write code to accomplish tasks? The software is not yet ready for other users, but he tries to insert lots of comments as he writes it.
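
A toy sketch of the pipeline idea, with hypothetical stage script names (not Steve's actual scripts): one driver shepherds data through each processing stage without manual clicks.

  import subprocess

  # Hypothetical processing stages, run in order.
  STAGES = [
      ["python", "download_metadata.py"],
      ["python", "clean_and_reconcile.py"],
      ["python", "write_to_wikidata.py"],
  ]

  for cmd in STAGES:
      # check=True stops the pipeline if any stage fails, so bad data
      # never propagates to the next step.
      subprocess.run(cmd, check=True)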
  • Q for Jim: Have you ever thought about the impact of a catastrophic failure of the Wikidata Query Service and how that would affect your project? The asker had just been reading a playbook for catastrophic failure that proposes deleting scholarly articles from the knowledge graph.
    • Jim plans to read up more on this and is thinking about it: maybe there is a need to keep a duplicate copy of such data, just in case (see the sketch below).
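
One minimal way to keep such a duplicate copy, sketched here under the assumption that archiving each item's full JSON from Special:EntityData would meet the project's needs (the QIDs are illustrative):

  import requests

  # Archive each project item's full JSON locally as a backup copy.
  for qid in ["Q42", "Q64"]:  # replace with the project's own item list
      r = requests.get(
          f"https://www.wikidata.org/wiki/Special:EntityData/{qid}.json",
          headers={"User-Agent": "ld4-example/0.1"},
      )
      r.raise_for_status()
      with open(f"{qid}.json", "w", encoding="utf-8") as f:
          f.write(r.text)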