Perl module to import entities from Wikidata for processing with the Catmandu ETL framework
Simple Python CLI to load subsets of Wikidata into ElasticSearch from a dump or SPARQL query. Automatically paginates SPARQL queries for bulk loading.
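The automatic-pagination idea can be sketched in a few lines of stdlib Python (an illustrative helper, not the tool's actual interface): each page of a bulk load is the same SPARQL query with a LIMIT/OFFSET clause appended.

```python
import itertools

def paginate_sparql(query: str, page_size: int = 1000):
    """Yield copies of `query` with LIMIT/OFFSET appended, one per page.
    In real use you would stop once a page comes back empty; this sketch
    just yields pages indefinitely."""
    offset = 0
    while True:
        yield f"{query}\nLIMIT {page_size} OFFSET {offset}"
        offset += page_size

# Usage: take the first three pages of a query.
pages = list(itertools.islice(
    paginate_sparql("SELECT ?item WHERE { ?item wdt:P31 wd:Q5 }", 500), 3))
```

Each page can then be sent to a SPARQL endpoint and its results bulk-indexed before fetching the next one.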
Import a subset or a full Wikidata dump into a CouchDB database
A PHP library that provides ways to read from, and iterate through, the Wikibase entities in a Wikibase Repository JSON dump such as the Wikidata JSON dumps.
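The Wikidata JSON dumps that such a reader iterates over are one large JSON array with one entity object per line. A minimal Python sketch of the iteration idea (not the PHP library's API) looks like this:

```python
import json

def iter_dump_entities(lines):
    """Iterate over entities in a Wikidata-style JSON dump: a JSON array
    with '[' and ']' on their own lines and one entity object per line,
    each but the last terminated by a comma."""
    for line in lines:
        line = line.strip()
        if not line or line in ("[", "]"):
            continue
        yield json.loads(line.rstrip(","))

# Usage with a tiny in-memory sample standing in for a dump file.
sample = [
    "[",
    '{"id": "Q1", "type": "item"},',
    '{"id": "Q2", "type": "item"}',
    "]",
]
entities = list(iter_dump_entities(sample))
```

Reading line by line keeps memory use constant even for multi-hundred-gigabyte dumps.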
A ranking signal for Wikidata. Periodically (re)computed from Wikimedia pageviews, available for bulk download. For an introduction, see the README file.
qwikidata is a Python package with tools that allow you to interact with Wikidata. The package defines a set of classes that allow you to represent Wikidata entities (items, properties, and lexemes). It also provides tools for getting data from the linked data service, the SPARQL query service, and the full JSON dumps.
A Python tool for fast inserts into a Wikibase instance. RaiseWikibase 1) uploads up to a million entities and wikitexts per hour, 2) fills data directly into the MariaDB database, and 3) can create a bot account for the wrappers of the Wikibase API.
Replicator is a CLI application for replicating a Wikibase entity base such as Wikidata. It can import entities from the Wikidata API and from Wikibase dumps in various formats. It features abort/resume, graceful error handling, progress reporting, dynamic fetching of dependencies, API batching and standalone installation (no own MediaWiki or Wikibase required). Furthermore, it uses the same deserialization code as Wikibase itself, so it is always 100% compatible.
ShExStatements allows users to generate shape expressions (ShEx/entity schemas) from simple CSV statements and files. It can be used from both a web interface and the command line.
Offers a centralized, user-friendly way for The Community (TM) to provide translations for tools.
WikibaseIntegrator is a Python library forked from WikidataIntegrator in July 2020 by User:Myst. It allows reading from and writing to Wikidata/Wikibase via the REST APIs. It also supports running SPARQL queries and returning WDQS results as pandas dataframes.
A comprehensive .NET Standard MediaWiki client library that allows you to fetch and edit Wikibase/Wikidata items conveniently. It also supports working with JSON dumps. See the "Getting started" section in the repository wiki for usage examples.
Read and edit Wikidata from the command line. Edits are signed #wikidatajs/cli by default.
A Wikidata client library for Python. Supports Python 3.4 or higher.
A library to edit Wikidata from NodeJS. Edits are signed #wikidatajs/edit by default.
WikidataIntegrator is a Python library for reading and writing to Wikidata/Wikibase. It also supports running SPARQL queries and returning WDQS results as pandas dataframes.
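The WDQS-results-to-table conversion the library performs rests on the standard SPARQL 1.1 JSON results format, which nests variable names under `head.vars` and values under `results.bindings`. A stdlib-only sketch of that flattening step (the library itself returns pandas dataframes):

```python
def bindings_to_rows(result: dict) -> list[dict]:
    """Flatten a SPARQL JSON result into a list of {var: value} rows,
    ready to be handed to a dataframe constructor."""
    vars_ = result["head"]["vars"]
    return [
        {v: b[v]["value"] for v in vars_ if v in b}
        for b in result["results"]["bindings"]
    ]

# Usage with a minimal WDQS-style response.
sample = {
    "head": {"vars": ["item", "itemLabel"]},
    "results": {"bindings": [
        {"item": {"type": "uri",
                  "value": "http://www.wikidata.org/entity/Q42"},
         "itemLabel": {"type": "literal", "value": "Douglas Adams"}},
    ]},
}
rows = bindings_to_rows(sample)
```

Variables left unbound in a given solution are simply omitted from that row, mirroring the optional-binding semantics of SPARQL.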
A JSON/JS index of 183 languages, each accessible either by its two-letter language code or by its Wikidata Qid.
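A dual-keyed index like this can be sketched as two lookup tables built over the same records (the field names here are illustrative, not the package's actual schema; the Qids shown are the real Wikidata items for each language):

```python
# Each language record carries both keys; two dicts give O(1) lookup by either.
languages = [
    {"code": "en", "qid": "Q1860", "label": "English"},
    {"code": "fr", "qid": "Q150", "label": "French"},
]
by_code = {lang["code"]: lang for lang in languages}
by_qid = {lang["qid"]: lang for lang in languages}
```

Keeping one list of records and deriving both indexes from it avoids the two views ever drifting out of sync.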
R package and wrapper for the Wikidata Query Service (WDQS), which provides a way for tools to query Wikidata via SPARQL.
R package to read from and write to Wikidata via the Wikidata and QuickStatements APIs.
Free Java library for working with Wikidata. Helps developers to download, process, and query data from Wikidata and other Wikibase sites. It is also used to create exports, such as the Wikidata RDF exports. Sources can be found at https://github.com/Wikidata/Wikidata-Toolkit
Tools to set up an ElasticSearch instance fed with subsets of Wikidata.
wptools is a Python library intended to make it as easy as possible to get data from MediaWiki instances, expose more Wikidata, and extend Wikimedia APIs just for kicks. We say (for Humans) because that is a goal.