This page is a translated version of the page Wikidata:Creating a bot and the translation is 43% complete.





  • Coding skills (Python, Perl, PHP...)
  • A framework (one of the ones below) and code to run and complete the task
  • A bot account (a bot flag, requested before use, is also required)
  • A source code editor (Notepad++, Geany, vi, emacs)




For details on installing Pywikibot, see mw:Manual:Pywikibot/Installation and Wikidata:Pywikibot - Python 3 Tutorial/Setting up Shop.





You can reduce the delay between successive edits by adding put_throttle = 1.





Example 1: Get data

This example fetches the data of the item for Douglas Adams. Save the following source code in a file and run it with python example1.py.
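The script itself is missing from this translated copy; what follows is a minimal sketch based on the standard Pywikibot API (the file name example1.py comes from the text above):

```python
import pywikibot

# Connect to the Wikidata repository
site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

# Load the item about Douglas Adams (Q42)
item = pywikibot.ItemPage(repo, "Q42")
item_dict = item.get()           # fetch all item data from the live site

print(item_dict)                 # the full data dictionary
print(list(item_dict.keys()))    # the keys of that dictionary
print(item)                      # shows which item this is
```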


{
    u'claims': {
        u'P646': [<instance at 0x7f1880188b48>],
        u'P800': [<instance at 0x7f1880188488>, <instance at 0x7f1880188368>],
        ...
    },
    u'labels': {
        u'gu': u'\u0aa1\u0a97\u0acd\u0ab2\u0abe\u0ab8 \u0a8f\u0aa1\u0aae\u0acd\u0ab8',
        u'scn': u'Douglas Adams',
        ...
    },
    u'sitelinks': {
        u'fiwiki': u'Douglas Adams',
        u'fawiki': u'\u062f\u0627\u06af\u0644\u0627\u0633 \u0622\u062f\u0627\u0645\u0632',
        u'elwikiquote': u'\u039d\u03c4\u03ac\u03b3\u03ba\u03bb\u03b1\u03c2 \u0386\u03bd\u03c4\u03b1\u03bc\u03c2',
        ...
    },
    u'descriptions': {
        u'eo': u'angla a\u016dtoro de sciencfikcio-romanoj kaj humoristo',
        u'en': u'English writer and humorist',
        ...
    },
    u'aliases': {
        u'ru': [u'\u0410\u0434\u0430\u043c\u0441, \u0414\u0443\u0433\u043b\u0430\u0441'],
        u'fr': [u'Douglas Noel Adams', u'Douglas No\xebl Adams'],
        ...
    }
}
['claims', 'labels', 'sitelinks', 'descriptions', 'aliases']

It prints a dictionary with keys for

  • the set of claims in the page: Property:P646 is the Freebase identifier, Property:P800 is "notable work", etc.
  • the label of the item in many languages
  • the sitelinks for the item, not just Wikipedias in many languages, but also Wikiquote in many languages
  • the item description in many languages
  • the aliases for the item in many languages

Then it prints a list of all the keys of the key-value pairs in the dictionary. Finally, you can see that the Wikidata item about Douglas Adams is Q42.



Example 2: Get interwiki links

After item.get(), the sitelinks, for example, can be accessed. These are links to all Wikipedias that have the article.
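A sketch of that access pattern, reusing the Q42 item from Example 1:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")
item.get()              # load the item data first
print(item.sitelinks)   # maps site codes (e.g. 'enwiki') to page titles
```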

The output is:

{u'fiwiki': u'Douglas Adams', u'eowiki': u'Douglas Adams', u'dewiki': u'Douglas Adams', ...

With item.iterlinks(), an iterator over all these sitelinks is returned, where each article is given not as plain text as above but already as a Page object for further treatment (e.g., edit the text in the corresponding Wikipedia articles).
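For example (a sketch; printing is illustrative, a real bot would edit the pages):

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")
item.get()
for page in item.iterlinks():
    # each entry is a pywikibot.Page on the corresponding wiki
    print(page.site, page.title())
```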

Example 4: Set a description

This example sets an English and a German description for the item about Douglas Adams.

Setting labels and aliases works accordingly.
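A sketch using ItemPage.editDescriptions; the German text and the edit summaries are illustrative:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")

new_descriptions = {
    "en": "English writer and humorist",
    "de": "britischer Schriftsteller",   # illustrative value
}
item.editDescriptions(new_descriptions, summary="Setting new descriptions.")

# Labels and aliases work the same way:
# item.editLabels({"en": "Douglas Adams"}, summary="Setting labels.")
# item.editAliases({"en": ["Douglas Noel Adams"]}, summary="Setting aliases.")
```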

Example 6: Set a sitelink

To set a sitelink, we can either create a dict analogous to Example 4 or use Page objects:
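A sketch of both variants (the edit summary is illustrative):

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")

# Variant 1: a dict with the site code and page title
item.setSitelink({"site": "enwiki", "title": "Douglas Adams"},
                 summary="Setting sitelink.")

# Variant 2: a Page object
enwiki = pywikibot.Site("en", "wikipedia")
page = pywikibot.Page(enwiki, "Douglas Adams")
item.setSitelink(page, summary="Setting sitelink.")
```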

Example 7: Set a statement

Statements are set using the Claim class. In the following, we set place of birth (P19): Cambridge (Q350) for Douglas Adams.
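A sketch of that edit (the summary is illustrative):

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")

claim = pywikibot.Claim(repo, "P19")        # place of birth
target = pywikibot.ItemPage(repo, "Q350")   # Cambridge
claim.setTarget(target)                     # set the target value
item.addClaim(claim, summary="Adding claim P19.")
```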

For other datatypes this works similarly. In the following, we add claims with the string datatype (IMDb ID (P345)) and the coordinate datatype (coordinate location (P625)); URL works the same as string:
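A sketch for both datatypes; the ID string, coordinates and summaries are illustrative values:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")

# String datatype: IMDb ID (P345)
stringclaim = pywikibot.Claim(repo, "P345")
stringclaim.setTarget("nm0010930")          # illustrative ID value
item.addClaim(stringclaim, summary="Adding a string claim.")

# Coordinate datatype: coordinate location (P625)
coordclaim = pywikibot.Claim(repo, "P625")
coordinate = pywikibot.Coordinate(lat=52.208, lon=0.1225,
                                  precision=0.001, site=site)
coordclaim.setTarget(coordinate)
item.addClaim(coordclaim, summary="Adding a coordinate claim.")
```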

Example 8: Add a qualifier

Qualifiers are also represented by the Claim class. In the following, we add the qualifier incertae sedis (P678): family (Q35409) to the Claim "claim". Make sure the claim has been added to the item before adding the qualifier.
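A sketch; which existing claim is picked as "claim" is illustrative:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")
item.get()
claim = item.claims["P31"][0]    # an existing claim on the item (illustrative)

qualifier = pywikibot.Claim(repo, "P678")     # incertae sedis
target = pywikibot.ItemPage(repo, "Q35409")   # family
qualifier.setTarget(target)
claim.addQualifier(qualifier, summary="Adding a qualifier.")
```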

Example 9: Add sources

Sources are also represented by the Claim class. Unlike qualifiers, a source may contain more than one Claim. In the following, we add stated in (P248): Integrated Taxonomic Information System (Q82575) with retrieved (P813) March 20, 2014 as the source of the Claim "claim". The claim has to be either retrieved from Wikidata or added to an item page beforehand.
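A sketch; which existing claim is sourced is illustrative:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")
item.get()
claim = item.claims["P31"][0]    # an existing claim (illustrative)

statedin = pywikibot.Claim(repo, "P248")                 # stated in
statedin.setTarget(pywikibot.ItemPage(repo, "Q82575"))   # ITIS

retrieved = pywikibot.Claim(repo, "P813")                # retrieved
retrieved.setTarget(pywikibot.WbTime(year=2014, month=3, day=20))

# A source may bundle several Claims, so addSources takes a list
claim.addSources([statedin, retrieved], summary="Adding sources.")
```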

Example 10: Page generators
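The code for this example is missing from this copy. One common pattern is to drive a bot from a SPARQL query via the pagegenerators module; the query below is illustrative:

```python
import pywikibot
from pywikibot import pagegenerators

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

# All items that are an instance of (P31) short film (Q24862)
query = "SELECT ?item WHERE { ?item wdt:P31 wd:Q24862 . }"
generator = pagegenerators.WikidataSPARQLPageGenerator(query, site=repo)

for item in generator:     # yields pywikibot.ItemPage objects
    print(item.id)
```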

Example 11: Get values of sub-properties

In the following, we get the values of sub-properties from the branch described by source (P1343) -> Great Soviet Encyclopedia (1969–1978) (Q17378135), namely the properties reference URL (P854) and title (P1476).
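A sketch of that traversal, assuming the Q42 item from the earlier examples:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()
item = pywikibot.ItemPage(repo, "Q42")
item.get()

# Walk the "described by source" (P1343) claims
for claim in item.claims.get("P1343", []):
    target = claim.getTarget()
    # Keep only the Great Soviet Encyclopedia (Q17378135) branch
    if target is not None and target.id == "Q17378135":
        for qual in claim.qualifiers.get("P854", []):   # reference URL
            print(qual.getTarget())
        for qual in claim.qualifiers.get("P1476", []):  # title
            print(qual.getTarget())
```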


Some users share their source code. Learn more at the following links:

Wikidata Integrator

WikidataIntegrator is a library for reading and writing to Wikidata/Wikibase. We created it to populate Wikidata with content from authoritative resources on genes, proteins, diseases, drugs and others. Details on the different tasks can be found on the bot's Wikidata page.

Pywikibot is an existing framework for interacting with the MediaWiki API. The reason we came up with our own solution is that we need tight integration with the Wikidata SPARQL endpoint in order to ensure data consistency (duplicate checks, consistency checks, correct item selection, etc.). Compared to Pywikibot, WikidataIntegrator is currently not a full Python wrapper for the MediaWiki API but is solely focused on providing an easy means to build Python-based Wikidata bots.

For more information, documentation, download & installation instructions, see here:

Example Notebook

An example notebook demonstrating an example bot to add therapeutic areas to drug items, including using fastrun mode, checking references, and removing old statements:


Wikibase.NET is the API that replaces the now-deprecated DotNetDataBot. The two aren't compatible because Wikibase.NET no longer needs the DotNetWikiBot framework.

Download & Installation

The framework can be downloaded from GitHub here. Just follow the instructions on that page.

Known issues


DotNetDataBot (Deprecated)



After unpacking the package you will see a file called DotNetDataBot.dll and one called DotNetDataBot.xml. The XML file is only for documentation. To use the framework, add a new reference to the DLL in your project. Then you can write using DotNetDataBot; to import the framework.


To log in, you have to create a new Site object with the URL of the wiki, your bot's username, and its password.

Example 1: Get the ID used by a wiki page

You can access the ID of an item by searching for it using the site and the title of the connected page.

Example 2: Get interwiki links

You can get the interwiki links of an item by loading the content and accessing the links field of the object.

Example 3: Set a description

To set a description, you must call the setDescription function.

Example 4: Set a label

It works the same way for setting a label. Just call setLabel.

Example 5: Get interwiki links for 100 pages

This feature is not supported. Just iterate over the list.

Wikibase api for PHP

This is an API client for Wikibase written in PHP. It can be downloaded from here.

Example 1: Basic example

Take a look at the source comments to understand how it works.

Example 2: Create a claim

Take a look at the source comments to understand how it works.


A framework for Wikidata and Wikipedia. It reads and writes on Wikidata and other Wikimedia projects, and has a useful list generator to generate lists of Wikipedia pages and Wikidata entities. It can also read JSON dumps of Wikidata.


A bot to read and edit Wikidata and Wikipedia.

  • License: CC0 1.0
  • Language: C#
  • Can read and write entities with all datatypes on Wikidata
  • Can read and write pages on all Wikimedia projects
  • Can read parameters from templates on wiki pages
  • Can read JSON dumps
  • Can create lists using:
  • Tested with Visual Studio Express 2013 for Windows Desktop.
    • Newtonsoft.Json is required. You can install it with NuGet inside Visual Studio
    • A reference to System.Web must be added manually for "HttpUtility.UrlEncode"


The framework can be downloaded from GitHub here.



Update the en label for all items with instance of (P31): short film (Q24862) that have director (P57) and publication date (P577) in 1908. (Uses a Wikidata query.)

Using the Wikidata API directly

The other sections describe how to use bot frameworks to access and update Wikidata information. You can also interact directly with the Wikibase API that Wikidata provides. You need to do this if you're developing your own framework or if you need to do something that no framework supports. The documentation for the Wikibase API can be found on mediawiki.org. You can also play around with it at Special:ApiSandbox; try action=wbgetentities.

Wikibase provides its API as a set of modules for MediaWiki's "action" API. You access it by making HTTP requests to /w/api.php. The default response format is JSON. So, in your language of choice, you only need a library to perform HTTP requests and a JSON (or XML) library to parse the responses.

Example 1: Get Q number

This example gets the item's Q number for the English Wikipedia article about the Andromeda Galaxy. The Wikibase API's main "workhorse" module, action=wbgetentities, provides this information. The HTTP request (using the jsonfm format for human-readable JSON output) is simply

Try following the link. This requests no additional information about the entity; remove &props= from the URL to see much more information about it. See the generated help for wbgetentities for more parameters you can specify.
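The same request can be built and issued from Python using only the standard library; this is a sketch, with the actual HTTP call left commented out:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

params = {
    "action": "wbgetentities",
    "sites": "enwiki",
    "titles": "Andromeda Galaxy",
    "props": "",          # request no extra data; remove to get everything
    "format": "json",     # "jsonfm" is the human-readable variant
}
url = "https://www.wikidata.org/w/api.php?" + urlencode(params)
print(url)

# Uncomment to perform the request:
# with urlopen(url) as response:
#     data = json.load(response)
#     print(list(data["entities"].keys()))  # the item's Q number
```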


The output is: