Wikidata:Project chat/Archive/2016/04


This page is an archive. Please do not modify it. Use the current page, even to continue an old discussion.


Wikidata weekly summary #201

Instance of

  • "Identifying problematic statements in Wikidata via multi-level modeling theory" is a very important paper for us: it shows why our (predominant) use of instance of (P31) is a bad thing. --Izno (talk) 13:48, 21 March 2016 (UTC)
    • And let me clarify: Using the instance of (P31) relation for things which are not instances (except where we use it as a metaclass statement). --Izno (talk) 16:02, 21 March 2016 (UTC)
      • @Izno: I think I understand your latter point, but for the benefit of me and others please can you explain/ give examples? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:35, 21 March 2016 (UTC)
        • @Pigsonthewing: Immediately to mind is any video game, where we use instance of (P31) video game (Q7889); this is not a true statement because video games (and books) as we document them on Wiki* are not instances but instead classes, where there are no useful instances to document. We have similar issues in disease (as I came across yesterday) and in genes, and in chemicals. An instantiation of a gene, for example, is the individual molecule representing a gene in my or your body, but we document genes on Wikidata as instance of gene rather than the proper subclass of gene. An area where we got it right is in our treatment of biological taxonomy, which treats taxa as instance of a taxon rank and subclass of organism (by inheritance). (Side note: That particular domain did do something wrong in using domain-specific properties where P31 and P279 would have worked, but that's not an interesting part of this discussion.) --Izno (talk) 18:43, 21 March 2016 (UTC)
          • Side note: Only a minor sidekick. --Succu (talk) 21:02, 21 March 2016 (UTC)
          • Obviously we've had extensive discussions about this sort of thing before - This recent RFC for instance. I don't believe we either have a consensus on the correctness of this sort of instance/subclass distinction, or even that the exclusive multi-level modeling described in this article (while interesting) is really a valid description of the world that we ought to emulate. If you've ever read, for example, Douglas Hofstadter's books (I'm thinking specifically of I Am a Strange Loop (Q3616600)) you'll realize this kind of sharp level distinction is not the way the world actually works, and intermixing of levels is natural. That's not to say that some of the modeling in wikidata is not currently wrong and in need of fixing - the "earthquake" example in the paper is a good one. But I'm not so sure about genes or chemicals, or even books or video games. I Am a Strange Loop (Q3616600) subclass of (P279) book (Q571) feels wrong, while instance of (P31) feels right. The confusion is I think because "book" has two meanings in English: a physical object that you can throw at the wall, and an abstract entity described by its title, author, and content irrespective of physical manifestation. I believe book (Q571) describes the latter, and instance of (P31) is the correct relationship in that case. But such ambiguities are almost inevitable and it may be quite correct for a wikidata entity to represent more than one meaning in this sort of sense. ArthurPSmith (talk) 20:18, 21 March 2016 (UTC)
            • @ArthurPSmith: The video game token can be considered (exactly as with an ordinary game) as the class of all the concrete games ever played, the concrete experience. Then a video game like "Quake 1" is indeed a class of games. Each video game is a type of game experience, specifically tied to a single set of code/data; more specifically, according to the philosophical definition, virtual is a kind of potentiality that becomes fulfilled in the actual, like a crop is a virtual plant (see en:Virtuality_(philosophy), which mentions Charles Sanders Peirce, the same one as in the type/token distinction). So to use instance of (P31) for video games I'd put something like:
              < Quake > instance of (P31) < virtual game experience >
              which seems totally philosophically justified, and also justifies using a class as if it were an instance; here it's a potential instance. The same construction can be made for genes, I think. author  TomT0m / talk page 14:58, 22 March 2016 (UTC)
      • @Izno:, you've written examples of what you consider wrong. Can you explain why you consider it wrong? --Infovarius (talk) 11:08, 1 April 2016 (UTC)
    • I was wondering where they got the TBL sample from and was reassured that its P31 wasn't what they suggest.
      --- Jura 16:41, 21 March 2016 (UTC)
      • That may be because I fixed most of their samples (an IP got to one of them before me). --Izno (talk) 18:43, 21 March 2016 (UTC)
        • There is no trace of it for Q80#P31, but the value at Property:P106#P1647 is (currently) probably wrong and could have led to the sample.
          --- Jura 19:00, 21 March 2016 (UTC)
          • Right, they were also using subproperties in their queries. No, I don't think it's wrong--indeed, TBL is a computer scientist (and he is an instance, not a subclass), so that statement is right. But I'm not sure about computer scientist subclass of profession. I think it would be correct to say computer scientist subclass of professional. --Izno (talk) 23:24, 21 March 2016 (UTC)
            • That should be two separate items, one for "computer scientist" (person) and another one for "computer scientist" (profession). However, since it is too much effort to maintain two items for each, I think it is an acceptable compromise to just use "computer scientist" (profession) and instantiate as human.--Micru (talk) 08:27, 22 March 2016 (UTC)


The main problem is the definition of "instance of". For me an instance is "the most detailed item in a hierarchy in terms of the properties used". So even if it is possible to describe a single molecule of water in a glass or a single grain of rice in a sack, we don't have the properties to describe these two elements among the others. The location property cannot define the exact position at time t and distinguish one molecule/grain from the others. We can't describe this level of precision, so why do we have to bother about that level in the hierarchy? Second question: do we want to create items about a unique molecule/grain of rice? If not, why again do we need to think about that level?
An ontology always depends on the purpose we want to reach. So we should adapt the classification according to what we want to do, and not according to what it is possible to do. So if we don't want to describe water molecules we can consider water (Q283) as an instance of chemical compound (Q11173).
Then about the famous problem: B is a subclass of A and C is an instance of B. Sometimes it is worth saying that C is an instance of A too, because this is a good way to be sure to get all instances of A in a query. Here the query tool should be able to deduce that all instances of B are instances of A too. Currently I don't know if SPARQL can do that with a simple query, or if we have to specify it in the query request. Snipre (talk) 13:25, 22 March 2016 (UTC)
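(As a side note for readers: SPARQL 1.1 property paths can express this inference directly, so the derived "C instance of A" statement does not need to be stored; the anti-pattern query later in this thread uses wdt:P31/wdt:P279* for exactly this. A rough sketch, in Python purely for illustration, on a made-up two-triple graph rather than real Wikidata code:)

```python
# Illustrative sketch (toy graph, not Wikidata code): a query engine can
# derive "C instance of A" from "C instance of B" + "B subclass of A"
# without the redundant statement being stored. In SPARQL this is the
# property path wdt:P31/wdt:P279*; here we emulate it by hand.

p31 = {"C": {"B"}}               # item -> classes it is a direct instance of
p279 = {"B": {"A"}, "A": set()}  # class -> direct superclasses

def superclasses(cls):
    """All classes reachable from cls via zero or more P279 steps."""
    seen, stack = set(), [cls]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(p279.get(c, ()))
    return seen

def instances_of(target):
    """Items matching the SPARQL pattern ?x wdt:P31/wdt:P279* target."""
    return {item for item, classes in p31.items()
            if any(target in superclasses(c) for c in classes)}

print(instances_of("A"))  # C is found although "C instance of A" is never stated
```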

"For you": This is not the consensus meaning of P31, so I'm not sure why you even mention it as a point.

The problem is when we introduce a hypothetical "frozen water molecule". This is clearly a subclass of "water molecule", but if we say "water molecule" instance of "molecule"... we have an issue. This is clearly wrong.

I broadly agree with "purpose to reach", but our purpose is to be used. And to be used usefully, we must indicate that an item of interest is classified correctly. This is basic Help:Membership properties.

Your comment about "famous problem" is not in keeping with consensus best practice here (except in a handful of cases where there are sources calling out an item as a subclass/instance of one thing and a subclass/instance of another thing, for which we have Help:Ranks and Help:Sources and Help:Qualifiers, which should all be used together). I don't see a reason why tool support should have an impact here--SPARQL is good at its job. --Izno (talk) 13:48, 22 March 2016 (UTC)

I summarized points that look a lot like this article's in Help:Classification, which I proposed as a project guideline. It's the same, except that they don't explicitly use the en:Type–token_distinction as the first level, and that they use more levels than I do in that document; but indeed their example of taxon ranks feels right to me at first reading. author  TomT0m / talk page 14:58, 22 March 2016 (UTC)
@Izno: Thanks for your kind reception of my opinion in an open debate; I can say the same about your opinion of my opinion: I'm not sure why you even mention it as a point.
But your next words are more interesting. Your arguments are contradictory: you agree about the purpose to reach, but just after you speak about a hypothetical water molecule. Isn't it better to solve the question now, if we want to start creating items about special states of molecules? You can't fix purposes if you want to keep everything open in case of hypothetical things. And if you are afraid of changing the classification of some hundreds of elements in the case of higher granularity, I suppose you never heard about bots and their capacity to modify items.
And by the way, how can you use instance now for humans? Don't you plan one day to create more than one item for each human, in order to specify their development or particular states like X as a child or X as a teenager? With your reasoning about hypothetical items you shouldn't use instance in any case, because you can always think of one hypothetical state.
At least my definition doesn't require a help page to be explained: the instance in a set of items of the same species is the most detailed item according to the number of properties used to characterize it. And with that definition you know that the last element in a hierarchy is always an instance. Snipre (talk) 18:36, 22 March 2016 (UTC)
That's a self-contained definition: you relate the definition of an instance only to the way we model stuff in Wikidata. So this is pretty close to "an instance is an instance"... It also fails to capture that both an instance of human and an instance of molecule are concrete objects. Plus, what do you do with instances of types of administrative division, like French commune? You just fail to capture that there are instance-of relationships that are not really on the last level, don't you? Last but not least: why would we need two properties if we just have to check whether we are on the most detailed level to see whether or not we are speaking of an instance? That would be way simpler to maintain. Why bother having instance of (P31) at all with your (non-)definition? author  TomT0m / talk page 21:21, 22 March 2016 (UTC)
TomT0m It is just a definition. But everyone can now see that even with a help page on how to use instance of/subclass of, the current approach is completely useless, because no one understands it or takes care of it (the cited paper is really a proof that the current system doesn't work). You can always refer to big theories, standards or whatever you want; if people don't use it, for whatever reason, you are stuck.
I am pretty sure your reference to the type–token distinction is not understandable for a certain proportion of contributors. WD is a collaborative project with hundreds of regular contributors and thousands of occasional contributors, so if you want contributors to use a specific system, it should be very, very simple. I don't want to speak for hours about that, because for me only the results count: the good system is the one which is used and allows us to produce the results we want. Snipre (talk) 12:50, 23 March 2016 (UTC)
@Snipre: Which result do we want ? A mess ? author  TomT0m / talk page 12:54, 23 March 2016 (UTC)
And please note that we can see all that as variations on things we already do and use, just a little better defined. And I really think anyone can understand those pretty simple principles, otherwise I would really not push them. There are a lot of nice features that make this understandable and beautifully regular, and this regularity makes all of it simpler. It answers questions for us. author  TomT0m / talk page 12:58, 23 March 2016 (UTC)
@TomT0m: The mess already exists, so please see the reality: you have promoted one system for a long time, with which results? A paper citing WD as a mess in terms of classification. Those are facts. You are not responsible for that; this is not a judgment. So what can be done to improve the situation?
We have several possibilities: continue with the current system or change it. My comment was just a proposal for change with a new perspective, not completely defined, I agree, but something different. My proposal is not accepted and even denigrated? Not a problem, I don't want to spend time on that. But now it is time to show that your proposal can improve the situation in WD in terms of classification. I don't care about what you think, I care about what people are doing: do contributors use the system described in Help:Classification in their contributions to WD? Do they understand how they can use it? Those are the only real questions. Snipre (talk) 13:23, 23 March 2016 (UTC)
Well, I propose something to make it cleaner, with strong and clear principles. I am willing to spend time explaining and trying to convince people it adds something. It takes time, but I'm not afraid of that. It's obvious newcomers don't understand this fully at first, but I don't think your definition helps them. Plus, you did not understand the concern about "if we adopt this, why should we have two properties?". Without any answer to this, I think you had better treat them as synonyms and not get in the way of people who try to make sense out of it; this would be a good compromise and smoother for everybody. It would not remove anything. author  TomT0m / talk page 13:31, 23 March 2016 (UTC)

@Snipre: You might find this paper useful: Instances of Instances Modeled via Higher-Order Classes, Douglas Foxvog. Cheers, Bovlb (talk) 14:42, 26 March 2016 (UTC)

I tested anti-pattern 1 using the SPARQL:
select distinct * where { ?Z wdt:P31/wdt:P279* ?A . ?Z wdt:P279+ ?A . }
and I got 358,840 pairs. Here are the top few:
Bovlb (talk) 17:56, 26 March 2016 (UTC)
I tried again grouping by the ancestor:
 select (MIN(?Z) AS ?MINZ) (COUNT(*) AS ?COUNT) ?A where { ?Z wdt:P31/wdt:P279* ?A . ?Z wdt:P279+ ?A . } GROUP BY ?A ORDER BY DESC(?COUNT)
and got 2476 different values.
Example descendant Count Ancestor
winner (P1346) 1143217 entity (Q35120)
defendant (P1591) 781330 object (Q488383)
defendant (P1591) 418608 abstract object (Q7184903)
Economy of Djibouti (Q1000084) 406948 physical object (Q223557)
Amiga (Q100047) 371441 artificial entity (Q16686448)
Mandarin roll (Q1002439) 351902 work (Q386724)
Mandarin roll (Q1002439) 284272 goods (Q28877)
Mandarin roll (Q1002439) 284210 product (Q2424752)
water crisis in Iran (Q1000000) 283112 mental representation (Q2145290)
water crisis in Iran (Q1000000) 277544 concept (Q151885)
water crisis in Iran (Q1000000) 238642 point (Q44946)
water crisis in Iran (Q1000000) 238642 primitive notion (Q6453739)
water crisis in Iran (Q1000000) 196042 point in time (Q186408)
water crisis in Iran (Q1000000) 195706 occurrence (Q1190554)
Buddhism in Germany (Q1000850) 147904 process (Q3249551)
Buddhism in Germany (Q1000850) 129505 biological process (Q2996394)
Marshal of the Sejm (Q100172) 128843 behavior (Q9332)
Marshal of the Sejm (Q100172) 127007 activity (Q1914636)
China Railways HXN3 (Q1005425) 120327 mode of transport (Q334166)
China Railways HXN3 (Q1005425) 120298 finished good (Q3245975)
China Railways HXN3 (Q1005425) 120276 vehicle (Q42889)
Marshal of the Sejm (Q100172) 106677 human behaviour (Q3769299)
Marshal of the Sejm (Q100172) 102689 occupation (Q12737077)
carbon-11 (Q1014081) 73535 matter (Q35758)
Amiga (Q100047) 67926 artificial physical object (Q8205328)
Economy of Djibouti (Q1000084) 57328 manifestation (Q286583)
Amiga (Q100047) 47839 tool (Q39546)
ghat (Q1010155) 46454 location (Q17334923)
ghat (Q1010155) 46454 position (Q23008351)
ghat (Q1010155) 46255 geographic location (Q2221906)
ghat (Q1010155) 44255 geographical object (Q618123)
carbon-11 (Q1014081) 42923 isotope (Q25276)
carbon-11 (Q1014081) 42602 chemical substance (Q79529)
GFA League First Division (Q1007931) 41325 phenomenon (Q16722960)
GFA League First Division (Q1007931) 41293 phenomenon (Q483247)
GFA League First Division (Q1007931) 41172 social phenomenon (Q602884)
GFA League First Division (Q1007931) 40724 event (Q1656682)
Amiga (Q100047) 39677 device (Q1183543)
Saint-Pourçain AOC (Q10214) 38838 base material (Q214609)
Bürgl Hut (Q1021576) 38451 construction (Q811430)
Bürgl Hut (Q1021576) 37666 architectural structure (Q811979)
GFA League First Division (Q1007931) 36358 competition (Q476300)
carbon-11 (Q1014081) 34591 simple substance (Q2512777)
Buddhism in Germany (Q1000850) 33530 class (Q5127848)
Buddhism in Germany (Q1000850) 33238 class (Q16889133)
bucolic poetry (Q1003292) 33036 identifier (Q853614)
carbon-11 (Q1014081) 32984 chemical element (Q11344)
Economy of Djibouti (Q1000084) 32829 system (Q58778)
bucolic poetry (Q1003292) 32121 information (Q11028)
Saint-Pourçain AOC (Q10214) 29456 part (Q15989253)
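(For readers who want to see what the query above is catching, here is a toy illustration in plain Python rather than SPARQL, with made-up item names: the anti-pattern is an item Z that reaches the same ancestor A both through an instance path and through a pure subclass path.)

```python
# Toy illustration (made-up items, not real Wikidata data) of "anti-pattern 1":
# an item Z that reaches ancestor A both via wdt:P31/wdt:P279* (instance path)
# and via wdt:P279+ (subclass path).
p31 = {"Z": {"B"}}                 # Z instance of B
p279 = {"Z": {"A"}, "B": {"A"}}    # Z is *also* declared a subclass of A

def reachable(start, edges):
    """Nodes reachable from start via one or more edge steps."""
    seen, stack = set(), list(edges.get(start, ()))
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(edges.get(c, ()))
    return seen

def antipattern_pairs():
    """(item, ancestor) pairs matched by both path patterns at once."""
    pairs = set()
    for z, classes in p31.items():
        via_instance = set()
        for c in classes:                      # one P31 step, then P279*
            via_instance |= {c} | reachable(c, p279)
        via_subclass = reachable(z, p279)      # P279+ (one or more steps)
        pairs |= {(z, a) for a in via_instance & via_subclass}
    return pairs

print(sorted(antipattern_pairs()))  # [('Z', 'A')]
```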

Primary Sources Tool broken

I want to contribute with the Primary Sources Tool but it's not working properly at the moment. E.g. it offers the same birth date as is already set in Jacques Spiesser (Q1678111), but with different references. I know that the tool is not maintained anymore, but can anybody fix that? Queryzo (talk) 11:51, 28 March 2016 (UTC)

If you accept the references without accepting the claim, the new references will be added to the old claim and the PST claim will go away. --Izno (talk) 12:03, 28 March 2016 (UTC)
Yes, but it doesn't even work when there are no references: Marceline Day (Q782604). Queryzo (talk) 15:20, 1 April 2016 (UTC)

Köppen climate classification

Polar and alpine climates

I'm working on implementing the infrastructure for Köppen climate classification (P2564) and I have something of a problem on my hands, and I'm not sure how best to deal with it (though I have a stopgap now--someone can change how I've dealt with it).

The problem is the E-group of classifications. From what I can observe based on my reading of the en.Wikipedia pages, the entire group is polar and alpine climate (Q23662318). I created it to faithfully reproduce the classification system. This item has no links as a result.

I also have on hand polar climate (Q193372) and alpine climate (Q859654), both of which have many links. A reading of the en.WP article for "polar climate" implies it is the same as the whole group. And similarly for "alpine climate".

Should we set the classification up such that it uses the joint class at polar and alpine climate (Q23662318) or should we pick one or the other of polar climate (Q193372) and alpine climate (Q859654) to set as our representation of the classification? My concern with the latter option is that ice cap climate (Q5985406) and Q23662321 need to take the correct subclass relationship, which I believe to be the "joint" item at polar and alpine climate (Q23662318). --Izno (talk) 16:46, 30 March 2016 (UTC)

Tundra climate

I have created a Q23662321, with no links. However, the en.WP article at tundra (Q43262) seems to represent some of the climate, though is more about the biome as a whole (and it is represented as such in Wikidata, as a "biome"). I have currently listed the new climate item as the correct item to use. Do you agree with this decision? --Izno (talk) 16:46, 30 March 2016 (UTC)

You need to specify that a tundra climate is a class within the Köppen climate system, so a new class item is needed. Michiel1972 (talk) 10:19, 31 March 2016 (UTC)

Use of Köppen climate classification as instances

Another: I have used the system item, given its English name, to indicate that each of the classifications is an instance of (P31) the classification as a whole. Is this okay, or should the classification item be a separate "class/type" item? --Izno (talk) 16:54, 30 March 2016 (UTC)

General

@Izno: I urged creating the climate zone items before, but I wasn't heard: here.--Kopiersperre (talk) 10:35, 31 March 2016 (UTC)

@Kopiersperre: Right, and that's what I'm filling out on the property talk page now and how I bumped into this problem. Do you have a comment specific to my question? --Izno (talk) 11:07, 31 March 2016 (UTC)
  • I'm not really convinced by the approach of re-purposing existing items, whose articles might not even mention the classification, for use within a scheme. I think that was a problem already mentioned in the creation discussion. You might find better input at the relevant WikiProject.
    --- Jura 07:43, 2 April 2016 (UTC)

Rhodopsin (Q423107) and RHO (Q14859555)

seem to be related. Is it that all sitelinks for Rhodopsin show up at RHO, or are there more subtle twists? -- Gymel (talk) 12:08, 31 March 2016 (UTC)

From my understanding, the first one is about the protein and the second is the gene which encodes the protein. I think some WPs treat both in one article, and depending on the focus the article is linked either to the protein item or to the gene item. I can't read all languages to determine which is the more appropriate item. Snipre (talk) 14:21, 31 March 2016 (UTC)

Same with Insulin (Q39798) and INS (Q21163221). It seems @ProteinBoxBot: is systematically(?) moving sitelinks from proteins to the responsible genes. As a layman I find this a bit surprising. -- Gymel (talk) 21:54, 1 April 2016 (UTC)

Coordinates and OSM

I've made a list of mountains in the Balearic Islands: ca:Usuari:Paucabot/Muntanyes2. The problem is that the coordinates are not good enough, as they can be some meters away from the summit: zoom in here to see it. Is there any way to (easily) import and substitute OpenStreetMap coordinates? Or, much better, is anything planned for sharing OSM data? Paucabot (talk) 18:04, 29 March 2016 (UTC)

This is tricky. OSM is licensed under the Open Database License (ODbL) (thematically similar to CC-BY-SA), while Wikidata is licensed under CC0. Importing substantial portions of data from OSM into Wikidata is not allowed unless Wikidata is able to license these imported data under ODbL. —seav (talk) 19:39, 31 March 2016 (UTC)
Thanks, Seav (talkcontribslogs). By the way, I found this: Wikidata:OpenStreetMap. Maybe we should explain this there? Paucabot (talk) 08:57, 1 April 2016 (UTC)
@Seav, Paucabot: This is just about the coordinates of 89 summits. Does that small amount qualify as a "substantial portion of data from OSM"?--Pere prlpz (talk) 23:04, 2 April 2016 (UTC)
Please see http://wiki.osmfoundation.org/wiki/License/Community_Guidelines/Substantial_-_Guideline for what the OSM community considers "insubstantial". —seav (talk) 23:19, 2 April 2016 (UTC)

Call for ideas : wikipédias and wikidata references

Hi people, Wikidata data is (chaotically) used more and more by Wikipedias, and some long-standing problems could become more important in the near future.

One of them I have been thinking about for a long time: how to use Wikidata references in Wikipedia articles. I think we have a set of partial solutions that do not meet the higher quality standards people manage to apply in some carefully sourced Wikipedia articles: how to mix references that come from Wikidata in a consistent way with sources coming just from the article. I want to know whether other people have already encountered this kind of problem, and how, and to get some ideas from the community (and @Lea Lacroix (WMDE):?)

One of the main problems is reference deduplication: we have no real automated way to know whether a reference used in one part of the article is also used in another part, at several levels.

Problems sketch

For example, a URL/document may be used in several statements on the same item. Say we want to show the references in an infobox. We could have something like

  place of birth: New-York[example1 1]
  date of birth: date[example1 2]
  place of death: New-York[example1 3]
  date of death: date[example1 4]
  
  References
  1. http://example.com
  2. http://example.com
  3. http://example.com
  4. http://example.com
"recording reference" possible solution 
This first problem could be dealt with in a Lua-generated infobox, if the infobox is generated in a single Lua template call, because there is a way to record every URL used in the references, generate an ID for each of them, and emit the following code:
place of birth: New-York[example1bis 1]
date of birth: date[example1bis 1]
place of death: New-York[example1bis 1]
date of death: date[example1bis 1]
References
  1. 1.0 1.1 1.2 1.3 http://example.com
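(The "recording reference" idea above can be sketched outside Lua; a minimal Python illustration of the mechanism, where the registry class and names are mine, not an existing module's:)

```python
# Minimal sketch of the "recording reference" idea: within one rendering
# pass, every URL is registered once, and repeated uses reuse the same
# footnote ID instead of producing duplicate footnotes.
class RefRegistry:
    def __init__(self):
        self._ids = {}          # url -> footnote id, in first-use order

    def cite(self, url):
        """Return the footnote id for url, allocating one on first use."""
        if url not in self._ids:
            self._ids[url] = len(self._ids) + 1
        return self._ids[url]

    def references(self):
        """Footnote list in first-use order."""
        return [(n, url) for url, n in self._ids.items()]

reg = RefRegistry()
for field in ["place of birth", "date of birth", "place of death"]:
    print(f"{field}[{reg.cite('http://example.com')}]")
print(reg.references())  # a single shared footnote: [(1, 'http://example.com')]
```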
  But this is not a perfect technical world, and this cannot be extended to other parts of the article where the same URL is used. We actually cannot record the references across the whole article because of a technical requirement of the Visual Editor (it must be able to pre-render templates without side effects, so that its rendering cannot be invalidated by changes in other parts of the page).
    the "computed/conventional id" solution 
    This solution is used for things like so-called "Harvard references". In Harvard references, every book has an id conventionally made from the authors' names and the year. This id is not used in the "name" attribute of the ref tags, but to generate an anchor to the book in the bibliography.
    This allows decoupling the citation from the cited book, and citing specific pages of the book without duplicating the information about the book itself. It would be hazardous to try to use such an id in the "name" of references, because it could easily lead to ugly error messages if the content of the ref tags differs:
      place of birth: New-York[example1ter 1]
      date of birth: date[example1ter 1]
      place of death: New-York[example1ter 1]
      date of death: date[example1ter 1]
      
      References
    
    1. 1.0 1.1 1.2 1.3 http://example.com Cite error: Invalid <ref> tag; name "url1" defined multiple times with different content Cite error: Invalid <ref> tag; name "url1" defined multiple times with different content Cite error: Invalid <ref> tag; name "url1" defined multiple times with different content
    But although we have many identifiers for books or articles, from ISBNs to Qids, DOIs, etc., it seems hard to use them at community scale in a consistent way, and the Harvard ids may generate collisions, which may make it necessary to override the automatically generated ones ...
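(A sketch of the "computed/conventional id" idea, in Python for illustration; the function and the letter-suffix convention for collisions are assumptions on my part, mirroring how Harvard ids are usually disambiguated by hand:)

```python
# Sketch of "computed/conventional" citation ids (Harvard style): the id is
# derived from author + year, and collisions get a letter suffix, which is
# the manual "override the generated id" workaround described above.
def harvard_ids(works):
    """works: list of (author_surname, year) pairs, in bibliography order."""
    ids, used = [], {}
    for author, year in works:
        base = f"{author}{year}"
        n = used.get(base, 0)
        used[base] = n + 1
        # first occurrence keeps the bare id; later ones get a, b, c, ...
        # (more than ten collisions would need a longer suffix alphabet)
        ids.append(base if n == 0 else base + "abcdefghij"[n - 1])
    return ids

print(harvard_ids([("Smith", 2016), ("Smith", 2016), ("Doe", 2015)]))
# ['Smith2016', 'Smith2016a', 'Doe2015']
```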

    So it seems to me we don't really have a perfect solution with the current software, and that we might benefit from changes in the MediaWiki reference management system ... any thoughts? What could we do to make it easier to use Wikidata references in Wikipedias?


    PS: That said, I think I may have The solution: wrap every article in a Lua call, so that we can share the references used in the article and handle that with a custom parser and the "expandtemplate" Lua function. It would look like this:

    {{referenced article|
    a '''referenced article''' is an article in which references and bibliographies are deduplicated<ref name="fisher"> {{U'|TomTom}}, 1st of April</ref>.
    
    {{infobox examples}}
    
    
    {{deduplicated bibliography}}
    }}

     – The preceding unsigned comment was added by TomT0m (talk • contribs).

    Have you seen w:hy:Սթիվեն Հոքինգ? (especially ref N4 and N6)--ԱշոտՏՆՂ (talk) 18:10, 1 April 2016 (UTC)
    I'm working on references too, but I have not thought about integrating references from the module and from the Wikipedia text. A possible solution is maybe to store some reference names in Wikidata. This way it is possible to combine the references at least by hand, because this way I know the reference names used in the module. The reference names I'm creating are built from the item Q-number plus a counter. --Molarus 20:24, 1 April 2016 (UTC)
    Why can't we just use whole reference for reference name? Like this: <ref name="http://example.com">http://example.com</ref>. It works in hywiki (see W:hy:Մոդուլ:Sources especially line 543)--ԱշոտՏՆՂ (talk) 22:57, 1 April 2016 (UTC)
    On svwiki I use "hash". See line 315 in sv:Module:Wikidata2. If different parts of the page use references with the same hash, the parser of the page combines them into one reference on the page. I do not know how our developers have managed to do that. They have probably spent many years in Hogwarts to accomplish this!
    table.insert(reference, mw.getCurrentFrame():extensionTag( 'ref', s, {name = ref.hash} ) )
    
    -- Innocent bystander (talk) 07:35, 2 April 2016 (UTC)
    @Innocent bystander: I assume you mean reference.hash rather than ref.hash? What would be interesting is the code that computes the "reference" object. author  TomT0m / talk page 09:21, 2 April 2016 (UTC)
    @ԱշոտՏՆՂ: what you show is just using the URI, not the whole reference. This will break if the same URI is used in two references but some data is added in addition to it, because the content of the ref tags could then be different (say, same URI, but a different access date in the two cases), the same way as in my "exampleter". author  TomT0m / talk page 09:21, 2 April 2016 (UTC)
    Yes Tom, 'ref' here is a loop over every 'reference' in the 'references'. I am a little surprised that it works, but it actually does. The only time it breaks down is when the code that puts the reference together looks different in different parts of the page. If there are two infoboxes which use separate algorithms to put the references together and they both use the same "hash", it breaks down. But that has only happened to me when I have made a copy of the algorithms to test some improvements. -- Innocent bystander (talk) 09:39, 2 April 2016 (UTC)
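(The hash trick discussed above can be illustrated outside Lua; a hedged Python sketch of naming a ref after a hash of its full rendered content, so identical renderings merge and different ones never share a name — the function name and truncation length are my assumptions, not svwiki's actual code:)

```python
import hashlib

def ref_name(rendered_citation):
    """Derive a stable <ref name="..."> value from the citation's full
    rendered content. Identical content -> identical name (the parser
    merges the footnotes); any difference (e.g. another access date) ->
    a different name, so the 'defined multiple times with different
    content' cite error cannot occur."""
    return hashlib.sha1(rendered_citation.encode("utf-8")).hexdigest()[:10]

a = ref_name("http://example.com, accessed 2016-04-01")
b = ref_name("http://example.com, accessed 2016-04-02")
same = ref_name("http://example.com, accessed 2016-04-01")
print(a == same, a == b)  # True False
```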

    Don't forget that references can be to paper sources. See, for example, Galileo Galilei (Q307), the occupation property (astronomer), has a reference to the book The Hunt for Planet X (Q20888754). Also, because different articles have different citation styles, it will be virtually impossible to find one way to code an infobox that would allow it to be used in a wide variety of articles. I'm working on w:Template:Infobox zodiac/sandbox, but the only reason that one is manageable is because it will only be transcluded in 12 articles. Jc3s5h (talk) 12:15, 2 April 2016 (UTC)

    @Jc3s5h: Yes, I know. Actually, when the work has an item, the Qid can be very handy as an anchor to identify it in the bibliography of articles that use Harvard-style references - this is the default id used by fr:Modèle:Bibliographie. As this template itself uses the Qid, this makes total sense. This is usable in fr:Modèle:Référence_Harvard_sans_parenthèses or fr:Modèle:Référence Wikidata. But there are a lot of statements that have references without any dedicated item. author  TomT0m / talk page 13:46, 3 April 2016 (UTC)

    Line and Ligne

    In English wiki there are two articles: https://en.wikipedia.org/wiki/Line_(unit) and https://en.wikipedia.org/wiki/Ligne they are described by items Q649848 and Q1630774. They are for British Line Unit and French Line Unit. In Russian wiki there is only one article about all kind of Line units: https://ru.wikipedia.org/wiki/Линия_(единица_длины) but now it is associated only with Q649848, but should be associated with both of the items. And both English wiki pages Line_(unit) and Ligne should have link to this Russian article.

    But I am new to Wikidata, and I do not understand at all how to do this. (Not only technically; I think I am missing part of the general idea too.) Can you please explain this to me step by step, so I will be able to do it myself next time? Thank you! --Nataraj (talk) 08:27, 3 April 2016 (UTC)

    This is a general problem with the Wikidata model, which assumes a 1:1 relationship between concepts and Wikipedia articles. Each item in Wikidata can be associated with a maximum of one page on each Wikipedia, and each Wikipedia article can be associated with a maximum of one Wikidata item. I believe interwiki links can be worked around by adding them the old-fashioned way (subject to local wiki policy), but I don't know how (or even if it is possible) to do everything fully in this situation. Thryduulf (talk: local | en.wp | en.wikt) 10:38, 3 April 2016 (UTC)

    How many entities should be in Wikidata dumps?

    According to the main page, there are 17,209,354 items in Wikidata. I downloaded a Wikidata entities JSON dump a few days ago and expected to find the same number of items there. However, I counted about 20,568,199 lines, 20,565,957 of which are items (i.e. entities with an id starting with "Q").

    Although my count could be wrong, I think it isn't. Therefore, is there any reason for there to be more items in the dump than in the statistics?

    If it matters, the dump I used was latest-all.json.bz2 from https://dumps.wikimedia.org/wikidatawiki/entities/ . --Pere prlpz (talk) 23:35, 2 April 2016 (UTC)

    Redirects? - Brya (talk) 06:01, 3 April 2016 (UTC)
    Yes, the dump also contains redirects --ValterVB (talk) 07:16, 3 April 2016 (UTC)
    That makes sense, because redirects and merged items also account for the difference between the official number of items (17M) and the much higher Q numbers of recent items (23M). It also seems related to the way items are ordered in the dump, in two series of increasing order - although entity order is not supposed to be meaningful. I'll try to check.--Pere prlpz (talk) 09:49, 3 April 2016 (UTC)
    I haven't been able to find any duplicates, nor any redirect marked somehow as a redirect, but it's hard to search in such a big file (69 GB).
    This problem occurred a couple of years ago: https://phabricator.wikimedia.org/T74678 . I left a comment in phabricator just in case it is the same problem again.--Pere prlpz (talk) 21:57, 4 April 2016 (UTC)
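A quick way to check the redirect explanation is to count entity types while streaming the dump. The sketch below is a minimal Python illustration; in particular, the top-level `"redirect"` key is an assumption about how redirect entries appear in that dump version, so verify it against a few raw lines before trusting the counts.

```python
import json
from collections import Counter

def count_entities(lines):
    """Count items, properties, and redirects in a Wikidata JSON dump,
    where each line holds one entity (the dump wraps the entities in
    '[' ... ']' with trailing commas). The top-level "redirect" key is
    an assumption about the dump layout; check your dump version."""
    counts = Counter()
    for line in lines:
        line = line.strip().rstrip(",")
        if line in ("[", "]", ""):
            continue  # array delimiters / blank lines
        entity = json.loads(line)
        if "redirect" in entity:
            counts["redirect"] += 1
        elif entity.get("id", "").startswith("Q"):
            counts["item"] += 1
        elif entity.get("id", "").startswith("P"):
            counts["property"] += 1
    return counts

# The real file is ~69 GB, so stream it rather than loading it, e.g.:
#   import bz2
#   with bz2.open("latest-all.json.bz2", "rt", encoding="utf-8") as f:
#       print(count_entities(f))

# Small self-contained demonstration with fabricated lines:
sample = [
    '[',
    '{"id": "Q1", "type": "item"},',
    '{"id": "P31", "type": "property"},',
    '{"id": "Q2", "redirect": "Q1"}',
    ']',
]
counts = count_entities(sample)
```

If the item count minus the redirect count comes out near the official 17M figure, redirects indeed explain the gap discussed above.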

    Removing Wikidata bots from watchlists on other projects?

    I like having Wikidata on my Wikipedia watchlists, but the Wikidata bots just seem to clutter everything up. Is there a way to just remove the Wikidata bots while leaving the Wikipedia bots and the rest of Wikidata on my Wikipedia watchlists? Cheers, Irn (talk) 17:18, 3 April 2016 (UTC)

    @Lydia Pintscher (WMDE): Probably something for WD:Contact the development team, but I'll ping you here just to see if you have an opinion. --Izno (talk) 20:41, 3 April 2016 (UTC)
    @Irn: Without checking, it's probably possible to use CSS to hide the lines where it's a bot from Wikidata. --Izno (talk) 20:42, 3 April 2016 (UTC)
    I have sv:Användare:Larske/Testsida6 in my Watchlist. There I now see 7 different edits by User:Sarah Layton in 5 different lines. The edits affect 3 different items, Alexandrine of Mecklenburg-Schwerin (Q57264), Christian X of Denmark (Q156617) and Louise of Sweden (Q232402). Why do I have 5 lines and not only 3, one for the latest edit in each of the linked items? -- Innocent bystander (talk) 10:29, 4 April 2016 (UTC)
    I think we need phabricator:T51315 to solve this. --Lydia Pintscher (WMDE) (talk) 10:10, 5 April 2016 (UTC)

    @Izno: and @Lydia Pintscher (WMDE): Thanks for your responses, but I'm not really sure what to make of them. How do I go about using CSS to hide the lines from Wikidata bots? And I checked out the phabricator link, but I'm not really sure what to do there. Cheers, Irn (talk) 13:06, 5 April 2016 (UTC)

    Links to redirects

    To my knowledge it was once discussed here that links to redirects shall not be changed, and Help:Redirects#Links to redirects also states that. However, apparently at least one bot operator seems to have more knowledge, and his bot changes redirects, causing exactly what the help page warns about - lots of other items modified without any real need after a wrong page merge, and thus lots of additional cleanup work - see here @Ivan A. Krestinin:. Was there ever a discussion which changed that policy on how to handle redirects? And if so, why did nobody bother to update the help page? Ahoerstemeier (talk) 10:00, 4 April 2016 (UTC)

    Since I didn't know of that page (which is not a policy, but a policy proposal), I have been fixing redirects when I came across them. This has the background that labels of redirects are not rendered, which makes redirects less than useful on item and property pages. --Srittau (talk) 11:06, 4 April 2016 (UTC)
    Wouldn't it be better to fix the software, instead of changing the data to work around the problem? I don't know if there is already a bug report for this, as I am not really familiar with bug reporting here. Ahoerstemeier (talk) 09:07, 5 April 2016 (UTC)

    Wikidata weekly summary #203

    Adding Wikidata for Brandemix

    Hello Wikidata Admins,

    I am new to Wikidata and wanted to include the basic details of the business Brandemix. Deepaksachdeva123 (talk) 10:52, 5 April 2016 (UTC)

    See Srittau's note above. Mahir256 (talk) 16:12, 5 April 2016 (UTC)

    Upcoming coordinate import

    So a few days ago, I imported some coordinates from the German Wikipedia, but I was a bit ... indiscriminate about it. So now I am preparing a larger import, taken from all Wikipedias. Yup, that's right. Currently, there are >120K coordinates; it could be ~150K by the time it's finished. But this time, each and every item I add to has (a) no coordinates, and (b) instance of (P31):subclass of (P279):geographic location (Q2221906). So, it should be significantly higher quality. I will start adding them today or tomorrow.

    I am posting this not only to alert you to watch out for this (edits will be done by User:Reinheitsgebot), but also to ask a question. Apparently, on some wikis, bots have been adding pages; in this case, I am mostly looking at hi.wikipedia. There are lots of village-level entries, which is good in itself. However, many of these have the same coordinates; the top shared coordinate is used for 1285 articles. Should I include these? They may be somewhat off, but still OK-ish, AFAICT. Or should I try to filter out the commonly used ones? Which "usage level"? 5x total? or 5x by the same Wikipedia? Or should I just skip Hindi Wikipedia (Q722040) and Gujarati Wikipedia (Q3180306) altogether? --Magnus Manske (talk) 14:33, 31 March 2016 (UTC)

    No, I think if we are sure the coordinates are not correct, we should not add them.--Ymblanter (talk) 14:44, 31 March 2016 (UTC)
    Agreed. It's much easier to find items without coordinates than to find items with wrong coordinates. --Srittau (talk) 14:50, 31 March 2016 (UTC)
    BTW, could you not import coordinates from ruwiki. Apparently they have some issue with the DMS/decimal conversion.
    --- Jura 14:53, 31 March 2016 (UTC)
    I know, I should have answered at the botreq page, but as you're posting here... (with a little background, so this sounds logical) Isn't it possible to add the correct precision? See this import and the dewiki article. The precision, as you can see, is quite different. Why is it really important to have the correct precision? Well, because it is (or at least should be) correct :) Geo-people can most probably give you more serious reasons for having correct precision. And previously bots have been adding coordinates with correct (at least kind of correct) precision (that's why I'm asking about this precision thing). Or is the problem somewhere else? About Jura1's mentioned ruwiki problem, see also this. --Edgars2007 (talk) 15:01, 31 March 2016 (UTC)
    So I am using the geo_tags database table on the Wikipedias, which does not store precision. I will be using QuickStatements to add them, which uses the default precision of Wikidata (that is, it does not specify a precision).
    Also, suppressing hi, gu, and ru Wikipedias as sources, the statement count goes down by 50%. Pity. --Magnus Manske (talk) 15:10, 31 March 2016 (UTC)
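Since geo_tags stores no precision, one possible heuristic (a sketch only, not what QuickStatements or Reinheitsgebot does) is to infer a precision from the number of decimal places in the source coordinate string:

```python
def precision_from_decimal(value_str):
    """Guess a coordinate precision from the decimal places in its
    source string: one unit of the last digit, so "57.05" -> 0.01.

    This is a sketch only; geo_tags stores no precision, and Wikibase
    applies its own default when none is supplied."""
    if "." not in value_str:
        return 1.0  # whole degrees
    decimals = len(value_str.split(".")[1])
    return 10 ** -decimals
```

For example, a value written as "57.05" would be recorded with precision 0.01 of a degree, while a bare "57" would get a precision of a whole degree. Whether this matches the intent of the original editor is, of course, still a guess.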
    Maybe the ru thing isn't relevant (see prev. discussion) @Ivan A. Krestinin: could more be imported?
    --- Jura 15:37, 31 March 2016 (UTC)
    The ruwiki error rate does not look significantly different from that of other wikis. So coords from ruwiki can be imported. Some other bots lost precision during import. Please be more careful. Also please do not import coord templates without the "display=title" flag. — Ivan A. Krestinin (talk) 21:24, 31 March 2016 (UTC)
    Related possible import: Category:Articles with OS grid coordinates (Q6377660). Some of these articles don't have any coordinates or no statements at all (see Wikidata:Database reports/items without claims categories/cywiki).
    --- Jura 15:40, 31 March 2016 (UTC)
    I think it would be mostly correct to add coordinates for instances of event (Q10290214) as well, there are quite a few battles, train accidents, mass killings... that need coordinates. --Zolo (talk) 14:44, 1 April 2016 (UTC)
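One way to implement the "usage level" cut-off floated above is to count how many pages share each exact coordinate pair and drop the heavily shared ones. This is an illustrative sketch (threshold, data shape, and function name are all hypothetical, not what the actual import bot does):

```python
from collections import Counter

def filter_shared_coordinates(pages, max_shared=5):
    """Drop coordinates shared by `max_shared` or more pages - a sign
    of bot-copied, village-level placeholder coordinates.

    `pages` is a list of (title, (lat, lon)) tuples; the 5x threshold
    is the one floated in the discussion above, purely illustrative."""
    freq = Counter(coord for _title, coord in pages)
    return [(title, coord) for title, coord in pages
            if freq[coord] < max_shared]

# Six villages sharing one coordinate get filtered; a unique one stays.
pages = [("Village %d" % i, (28.6, 77.2)) for i in range(6)]
pages.append(("Mumbai", (19.1, 72.9)))
kept = filter_shared_coordinates(pages)
```

Counting per source wiki instead of in total would just mean keying the `Counter` on `(wiki, coord)` pairs; the trade-off between the two variants is exactly the question raised above.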

    Update: I have begun an import of ~30K coordinates. These have been filtered "conservatively". If there are significant issues, please block User:Reinheitsgebot. The coordinate gathering script has not yet completed, so there is a huge batch of coordinates from Swedish Wikipedia to be imported; I hope that one doesn't have the "Russian issues". Plus, all languages that come alphabetically after "sv" are still to be done. --Magnus Manske (talk) 15:47, 1 April 2016 (UTC)

    @Magnus Manske: I am not aware of any "Russian issues" on svwiki, but many coordinates come from GeoNames, and they are often rounded to minutes, and are therefore sometimes wrong. Lsjbot tries to compensate for that; see my note below. -- Innocent bystander (talk) 05:58, 4 April 2016 (UTC)

    @Magnus Manske: found one issue; I will call it the "Chuvash issue". You can search here for "0°0'0.000"N, 0°0'0.000"E". Probably those are not the only items involved. And/or your script could take into account coordinates which are at the center. There are not so many such objects :) --Edgars2007 (talk) 06:51, 5 April 2016 (UTC)

    Update 2: Now importing half a million (!) coordinates from Swedish Wikipedia. Looks reasonable so far, ping me (Twitter, email, here) if I need to cancel it. --Magnus Manske (talk) 15:02, 5 April 2016 (UTC)

    So most coordinates have been imported now. There are still ~76K left, mostly from these wikis:

    • ruwiki: 30,674 pages with coordinates
    • guwiki: 18,153 pages with coordinates
    • ltwiki: 13,624 pages with coordinates
    • hiwiki: 8,536 pages with coordinates
    • glwiki: 1,991 pages with coordinates

    But above I hear the ru/guwiki ones might be substandard. What to do? --Magnus Manske (talk) 14:05, 7 April 2016 (UTC)

    GeoNames

    FYI: There are a lot of articles with coordinates in Lsj's latest project. It has used the coordinates from GeoNames. One problem we have detected with GeoNames is that they have introduced many rounding errors. The bot has tried to adjust the coordinates when such mistakes have been detected, with the help of an algorithm in the bot code. The pages with such adjustments can be found in sv:Category:Artiklar med robotjusterad position. -- Innocent bystander (talk) 08:06, 1 April 2016 (UTC)
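A simple way to flag the GeoNames rounding described above is to test whether a decimal coordinate is an exact multiple of one arc-minute. This is a heuristic sketch under that assumption, not Lsjbot's actual adjustment algorithm:

```python
def looks_rounded_to_minutes(degrees, tol=1e-6):
    """Heuristic for the GeoNames problem described above: a decimal
    degree value that is an exact multiple of one arc-minute (1/60 of
    a degree) was probably rounded at the source.

    A sketch only, not Lsjbot's algorithm; genuine locations can also
    land on a whole minute by coincidence."""
    minutes = degrees * 60
    return abs(minutes - round(minutes)) < tol

# 59.35 degrees is exactly 59 deg 21 min, so it looks rounded;
# 59.3472 does not fall on a whole minute.
```

Flagged coordinates could then be cross-checked against another source rather than imported as-is.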

    Guide for importing data into Wikidata

    Hi all

    Myself, User:Jens_Ohlig_(WMDE) and some others are working on a how-to guide for people wanting to import data into Wikidata. The audience is existing Wikidata editors with a non-technical background (i.e. not programmers), and it would be a kind of partner page for the Wikidata:Data_donation page I rewrote. At the moment it is just a collection of existing tools and ideas; any help would be appreciated.

    Many thanks

    John Cummings (talk) 09:28, 7 April 2016 (UTC)

    Nontrivial request: Recovering unclear pdf document

    (also cross-posted to Wikipedia talk:Graphics Lab)

    Hello, I didn't even know Wikidata existed until 5 minutes ago, and I don't know if this is the correct forum etc. so apologies.

    I'm working on a complete rewrite of w:Bengal famine of 1943 in my personal sandbox. The central document for this topic (though it's biased) is the Woodhead Commission famine report. It's available in PDF format here. I can save that into .txt format (hurray!), and have written a little Python program that finds keywords in a large number of similar text files and stores quotes into separate files. However, the scan quality of the Famine Commission report is so poor that extended stretches are simply gobbledygook.

    This is a nontrivial request: Is there a PhotoShop guru (or similar) who could sharpen the MANY pages into significantly better & more scannable pdfs? Not all pages could be fixed, because some show the curvature of the book pages etc., but I think many many could be improved.

    I have downloaded an evaluation copy of Photoshop and tried to use Sharpen and Levels or Layers or whatever to make each page more machine-readable, but I don't know how to do it for an entire (large!) report, and I don't know how to scan the pages or save them to text instead of images (printing every page and scanning each manually is obviously much too much work). I also have a family life and work etc., and learning how to do all these things would just take too much time.

    Does anyone have suggestions?

    In theory, this service might be valuable for other old documents scanned to PDF, but I dunno how much demand there would be for such a service.

    Thanks. Lingzhi (talk) 04:51, 8 April 2016 (UTC)
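The kind of keyword-and-quote script mentioned above might look something like the following Python sketch (a hypothetical reconstruction, not the poster's actual program): it scans a text for keywords and collects a window of surrounding context for each hit.

```python
import re

def extract_quotes(text, keywords, window=200):
    """Collect a window of characters around each keyword hit, one
    snippet per match.

    A sketch of the kind of script described above, not the poster's
    actual program; keywords are matched case-insensitively."""
    snippets = []
    for keyword in keywords:
        for match in re.finditer(re.escape(keyword), text, re.IGNORECASE):
            start = max(0, match.start() - window)
            snippets.append(text[start:match.end() + window])
    return snippets

snippets = extract_quotes(
    "The rice harvest failed, and famine followed in Bengal.",
    ["famine"], window=20)
```

On OCR'd text this only works as well as the scan quality allows, which is exactly the problem with the garbled stretches described above.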

    @Lingzhi: This is not relevant for Wikidata. You may get more help at commons:Commons:Village pump or at wikisource:Wikisource:Scriptorium/Help. Good luck! Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:41, 8 April 2016 (UTC)
    thanks! Lingzhi (talk) 13:35, 8 April 2016 (UTC)

    P2037 Github username

    hi, should GitHub username (P2037) be added as a property or as an identifier? If it's an identifier, my follow-up question is: how do I search for identifiers in the query tool? Example query: tinyurl.com slash h8vragg Ambrosiani (talk) 17:34, 8 April 2016 (UTC)

    @Ambrosiani: hmm, I assume you meant qualifier instead of identifier, right? If so, then it should be used as a property. Some piece about querying qualifiers in SPARQL: Wikidata:SPARQL query service/queries#Working with qualifiers. --Edgars2007 (talk) 17:55, 8 April 2016 (UTC)
    Sorry, I mean "Statement" or "Identifier" (the two different places where I can add properties). Ambrosiani (talk) 18:33, 8 April 2016 (UTC)
    Oh, that sounds more logical :) AFAIK, you don't have to worry about that. At least, putting everything in the statement part works completely fine. The system will put it in the right place. Querying it in SPARQL is the same. For the record, the GitHub username goes to the "Identifier" section, as it has the "External identifier" datatype (you can see it on the property page). --Edgars2007 (talk) 18:47, 8 April 2016 (UTC)
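To make the "querying is the same" point concrete: SPARQL reaches P2037 through the `wdt:` prefix regardless of which section the UI displays it in. A minimal Python sketch that builds such a query and the corresponding query-service URL (no network call is made here; paste the query at query.wikidata.org or fetch the URL yourself):

```python
from urllib.parse import urlencode

# Whichever section the UI shows P2037 in, SPARQL reaches the
# statement value the same way, through the wdt: prefix.
QUERY = """
SELECT ?item ?username WHERE {
  ?item wdt:P2037 ?username .
}
LIMIT 10
"""

# Paste QUERY at query.wikidata.org, or fetch JSON results from:
url = ("https://query.wikidata.org/sparql?"
       + urlencode({"query": QUERY, "format": "json"}))
```

The same pattern works for any external-identifier property; only the property number changes.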

    Splitting {{Property proposal}} from {{Property documentation}}?

    Currently, {{Property documentation}} is used on both property discussion pages and on property proposal pages. This made sense, but I think the time has come to split this template in two, since their uses have diverged quite a bit since their creation. The overlap of fields to copy when creating a new property is very low now, usually only one or two fields, since more and more of the fields have been moved into property statements (where they belong). On the other hand, there are now several code paths in the template that check whether it's being used on a property talk page or elsewhere, and do different things based on that.

    Therefore, I suggest creating a separate template for property proposals that can be separately maintained. Advantages from my point of view:

    • The use of {{Property documentation}} and all its fields, which are partly targeted at property discussion pages, confused me when I proposed my first property. A separate template with a separate documentation can fix that.
    • Clearing up the confusion that exists, because fields are used differently, depending on the use of the template.
    • Removal of unnecessary (and therefore also confusing) fields from either template.
    • Better error detection, for example missing or spurious fields on each template.
    • Easier code paths/easier to maintain templates.
    • Easier way to evolve and improve the templates in the future.

    Disadvantages:

    • You have to replace the word "proposal" with "documentation" when creating a property. Considering the cleanup required anyway, I consider this a minor point.
    • The translations of both properties have to be maintained separately. I think the overlap of both templates will become very small over time, though.

    Please note that there already is a template {{Property proposal}}, which I consider ill-named anyway and which would need to be renamed. Since its use is not very widespread (about 60 uses), I don't think this would be a problem.

    Suggested course of action:

    1. Rename {{Property proposal}} to {{Property proposal link}}. ✓ Done --Srittau (talk) 09:34, 6 April 2016 (UTC)
    2. Duplicate the existing {{Property documentation}} as {{Property proposal}}. ✓ Done --Srittau (talk) 20:27, 6 April 2016 (UTC)
    3. Replace {{Property documentation}} with {{Property proposal}} on the proposal pages and the archive.
      • Partially done. I replaced the uses on the current proposal pages and in the current archive as well as the templates for new proposals. Someone needs to allow the latter to be translated, though. Also missing are the other 46 archive pages. If a bot operator could help here, it would be much appreciated. (Replace "{{Property documentation" with "{{Property proposal".) --Srittau (talk) 21:01, 6 April 2016 (UTC)
      • Now ✓ Done, thanks to User:Pasleim. --Srittau (talk) 20:05, 7 April 2016 (UTC)
    4. Clean up the LUA code of both templates.

    When the templates have been refactored and un-entwined like this, we can start to discuss cleanups and improvements to those templates. --Srittau (talk) 13:35, 3 April 2016 (UTC)

    Do it. --Izno (talk) 20:40, 3 April 2016 (UTC)
    A near-identical proposal is already being discussed, at Wikidata talk:Property creators#Property Template change?. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits
    Thanks for mentioning that, I should have done so in the proposal above. I repeated it here again for a larger audience and with a concrete plan of action. --Srittau (talk) 18:08, 4 April 2016 (UTC)
    A pointer is sufficient; please don't split discussions. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:51, 5 April 2016 (UTC)
    Support I agree this is the right thing to do now. ArthurPSmith (talk) 17:32, 4 April 2016 (UTC)

    Good to see you working on documentation. You might want to use the (quite hidden) footer feature. Take for example OMIM ID (P492). Would be nice to have the docs at the bottom too so you don't have to click on the talk page. You can set it at Property talk:P492/footer. Probably it's just Module:Property documentation needing a small modification so it loads the information. Multichill (talk) 19:35, 7 April 2016 (UTC)

    ✓ Done, further discussion can take place on Template talk:Property proposal and Template talk:Property documentation. --Srittau (talk) 16:27, 14 April 2016 (UTC)

    This section was archived on a request by: Srittau (talk) 16:27, 14 April 2016 (UTC)

    Any guideline about taxon names?

    Many taxa, such as families with only one genus, are merged into one article on most Wikipedia projects. Should we keep them as one item or as separate ones? If the answer is the latter, which one should we use for links?

    An example is Q3866060 and Q229084. On most Wikipedias they are described in the same article (with a few exceptions, like ar.wiki). But apparently the links are now in different items despite describing the same thing. This kind of situation is super common. I assume there must be some guideline about this, but I cannot find it. --Fireattack (talk) 08:54, 8 April 2016 (UTC)

    From a taxonomic point of view, they are not necessarily the same, so they should be separate. Thanks, GerardM (talk) 10:32, 8 April 2016 (UTC)
    In taxonomy there are often multiple points of view: when one taxonomist considers a family to have only one genus, there often is another taxonomist who considers that family to have more than one genus. It is important to document any taxonomic position with good taxonomic references. Given that there are multiple projects that have pages on both Moschus and MOSCHIDAE, it is a good example of a non-issue. But, anyway, pretty much by definition, a page on a family is different from a page on a genus. - Brya (talk) 10:53, 8 April 2016 (UTC)
    In taxonomy there is a clear taxonomy of the publications. It includes dates and sources. The notion that we have multiple projects that are not aware of this is a non-issue. It is important that Wikidata is correct in this. Thanks, GerardM (talk) 13:31, 8 April 2016 (UTC)
    Maybe it would be of your interest: Wikidata:WikiProject Taxonomy. Paucabot (talk) 11:38, 8 April 2016 (UTC)
    Specifically Wikidata:WikiProject Taxonomy/Tutorial#Taxonomy changes. Paucabot (talk) 11:41, 8 April 2016 (UTC)
    @GerardM, Brya, Paucabot: Guys, thanks for your answers. I totally agree that they should be separate. However, I think you didn't address my real concern: which one should be used to store all the SITELINKS for projects (mainly Wikipedias) that merged these two items? Status quo: some of them are linked to one, while some others are linked to the other. Obviously we could do better. Maybe always link to the one with the higher rank? (This is just an example.) I am here to seek guidelines for dealing with this kind of situation. I also read the pages Paucabot provided, but I didn't find info about this very topic. --Fireattack (talk) 19:08, 8 April 2016 (UTC)
    Why the „higher rank”, Fireattack, and not the lowest rank? You'll need a species name to establish a genus name. --Succu (talk) 20:47, 8 April 2016 (UTC)
    No particular reason; I said it's just an example...--Fireattack (talk) 06:14, 9 April 2016 (UTC)
    It is often difficult to determine what the real concern of a poster is. As to the placement of sitelinks, the practical course is to accept these at face value: if the title of the page is MOSCHIDAE, then put the page in the item MOSCHIDAE. Hopefully, the opening sentence and the taxobox will be in agreement with the title. If one was going to determine, on a case by case basis, what the gist of the page is, one would hit two obstacles: 1) others might not agree on what the gist is, and 2) pages are dynamic, and would have to be re-evaluated after each substantial change. - Brya (talk) 03:57, 9 April 2016 (UTC)
    Well, following the title of the page is a solution which I guess is OK-ish from the point of view of Wikidata (though it is still not the ultimate solution: where should articles like w:en:Musk deer go?). But on Wikipedia, I don't think it's good practice not to link (essentially) the same inter-language pages together.--Fireattack (talk) 06:18, 9 April 2016 (UTC)
    Well, in w:en:Musk deer both the opening sentence (sort of) and the taxobox claim it is about the genus, which is supported by the list of species. So, the obvious placement is in the item Moschus. If the opening sentence is taken at its most literal, it belongs in neither, but in a "Musk deer, instance of common name", which is impractical.
            And connecting everything that would be desirable from the perspective of each individual Wikipedia will have to wait for an appropriate software solution which allows customizing. - Brya (talk) 06:55, 9 April 2016 (UTC)

    Who added a betawikiversity link?

    Which is affected by phab:T54971? --Liuxinyu970226 (talk) 03:34, 9 April 2016 (UTC)

    The item is Q23685749. The link isn't displayed, although it's there in the JSON. - Nikki (talk) 10:30, 9 April 2016 (UTC)

    WikiCite applications closing soon


    A reminder that applications to attend WikiCite 2016 – an event that should be of interest to Wikidatans active on source-related work – close this Monday, April 11. We have a limited number of travel grants to support qualified participants. If you wish to join us in Berlin to participate in either the data modeling or the engineering effort, please consider submitting an application. --DarTar (talk) 16:09, 9 April 2016 (UTC)

    Deleting descriptions

    Can we have some tool/API call or whatever to remove all descriptions from an item? Basically, I'm talking about former disambigs that were transformed into articles at (all, of course) wikis, like Josh Cobb (Q6288763). It is painful to remove them manually (even in the list of headers tool). I know that I'm not the only one who would like to have such a tool. --Edgars2007 (talk) 17:43, 3 April 2016 (UTC)

    MediaWiki:Gadget-dataDrainer.js is a great tool for that. --Stryn (talk) 18:31, 3 April 2016 (UTC)
    Hah, I'm not in needed user groups :D Then maybe somebody could delete descriptions for linked item? --Edgars2007 (talk) 18:45, 3 April 2016 (UTC)
    Ah, I see. Earlier it worked for autopatrolled users as well, but as this user group doesn't exist anymore, meh... then maybe you should request adminship ;) --Stryn (talk) 18:51, 3 April 2016 (UTC)
    My rationale would be lame :) But if seriously, then probably not now. --Edgars2007 (talk) 19:57, 3 April 2016 (UTC)
    You can also use the gadget if you're a rollbacker, though. Sjoerd de Bruin (talk) 11:37, 10 April 2016 (UTC)

    Anyway, if somebody wants to have some fun, here is the list of items (308 currently) that are not instance of (P31)=Wikimedia disambiguation page (Q4167410), but have the English description "Wikimedia disambiguation page". Of course, double-checking is needed. --Edgars2007 (talk) 09:17, 5 April 2016 (UTC)
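The double check described above can be automated over entity JSON (from the API or a dump). The following Python sketch flags items whose English description still reads "Wikimedia disambiguation page" while their P31 claims no longer include Q4167410; the function name is hypothetical, and the claim traversal assumes the standard Wikibase entity-JSON layout:

```python
def is_stale_disambig_description(entity):
    """True when an item's English description still reads "Wikimedia
    disambiguation page" but its P31 claims no longer include Q4167410.

    A sketch over the standard Wikibase entity-JSON layout (as returned
    by wbgetentities or found in the dumps)."""
    desc = entity.get("descriptions", {}).get("en", {}).get("value", "")
    if desc != "Wikimedia disambiguation page":
        return False
    for claim in entity.get("claims", {}).get("P31", []):
        value = (claim.get("mainsnak", {})
                      .get("datavalue", {})
                      .get("value", {}))
        if value.get("id") == "Q4167410":
            return False  # still a genuine disambiguation page
    return True
```

Items flagged this way are candidates for description removal, but, as noted above, a human double check is still needed before editing anything.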

    'Changed data type from string to external-id'

    Hi all, I'm running a bot to move some authority control identifiers into Wikidata (namely this one: P1607). However, as pywikibot does not support 'external-id', I'm not able to find any straightforward way to do it. May I ask to revert the data type change on P1607 for some hours so that I can update the items? There are no more than three hundred items. Any other solution or suggestion is also appreciated. Best regards --Discasto (talk) 23:32, 8 April 2016 (UTC)

    Forget it, I just tweaked the pywikibot core code. Thanks --Discasto (talk) 09:12, 9 April 2016 (UTC)
    Has this been merged into pywikibot core? I guess other people will also run into this issue if it has somehow been forgotten by the pywikibot team! ·addshore· talk to me! 09:45, 9 April 2016 (UTC)
    @Discasto: @Addshore: I wonder what you ran into. As far as I know Pywikibot has full support for this new datatype and I haven't run into any problems myself. Did you run the latest (git) version? What was the problem exactly? How did you fix it? Multichill (talk) 14:24, 9 April 2016 (UTC)
    Hi all, thank you for your questions. Initially, I thought it was just a question of an old pywikibot distribution. So yesterday I removed the pywikibot core code and downloaded the latest version (this one), which is said to be 2.0rc3 (version.py says the version is 2.0b3; I assume both strings, 2.0b3 and 2.0 release candidate 3, refer to the same release). I got the same issues with the following code:
    DIALNET_PROPERTY = u"P1607"
    
    # Fragment from inside a loop over pages; repo, page_id, dialnet_id
    # and importedfrom are defined earlier in the script.
    item = pb.ItemPage(repo, title=page_id)
    #print type(dialnet_id)
    try:
        item.get()
    except:
        continue
    
    if DIALNET_PROPERTY not in item.claims:
        try:
            dialnet_claim = pb.Claim(repo, DIALNET_PROPERTY, datatype='external-id')
            dialnet_claim.setTarget(dialnet_id)
            pb.output('Adding %s --> %s' % (dialnet_claim.getID(), dialnet_claim.getTarget()))
            item.addClaim(dialnet_claim)
            dialnet_claim.addSource(importedfrom)
        except:
            print "Couldn't add claim"
    

    I don't remember the exception text (after removing the try/except statements), but it mentioned 'external-id' not being supported.

    So I edited page.py and added a line and a half of code. I added line 3802 (within class Property):

                 'external-id': basestring,
    

    And modified line 4317 (formerly 4316). From

            elif self.type in ('string', 'url'):
    

    To:

            elif self.type in ('string', 'external-id', 'url'):
    

    Once that was done, the bot was able to accurately insert any Dialnet identifier I asked it to.

    Hope this clarifies the issue and helps the pywikibot developers (unless I was doing something wrong). Best regards --Discasto (talk) 21:47, 9 April 2016 (UTC)

    I guess you're referring to this line, Discasto. The pywikibot stable version doesn't include the latest features. I recommend you switch to the git version. You can do this with the command $ git clone --recursive https://gerrit.wikimedia.org/r/pywikibot/core.git pywikibot-core assuming you already have git on your system. Multichill (talk) 20:48, 10 April 2016 (UTC)
    Thanks for the info. I don't have git on my system, so I will have to live with my "stable" version. Odd to notice that such a stable version has such significant flaws :-( --Discasto (talk) 21:31, 10 April 2016 (UTC)

    Please merge

    Hi, please help me to merge these items: Q8341916 and Q772399 (duplicates). I can't understand how to merge items here.--176.15.165.250 09:35, 10 April 2016 (UTC)

    ✓ It's done, thanks. The second one ended in zero (Category:Cannibalised people (Q7723990)). Strakhov (talk) 10:03, 10 April 2016 (UTC)

    [1] and [2] should be the same person. Unfortunately, two different Wikidata items. --Jobu0101 (talk) 15:41, 10 April 2016 (UTC)

    Those first need to be merged at Wikipedia: 1 WD item per 1 Wikipedia article. --Edgars2007 (talk) 18:58, 10 April 2016 (UTC)
    I've turned Q20564129 into a Wikimedia duplicated page on Wikidata. Someone will need to merge the Wikipedia articles on azwiki, which looks fairly straightforward for an Azerbaijani speaker, as they are virtually identical. Silverfish (talk) 00:38, 11 April 2016 (UTC)

    WikiProject Museum

    Hello everybody,

    I want to create a project for museums, with recommended properties and rules (like the difference between buildings and museums). But I do not have enough time and skills to lead this project. If you are interested… --Tubezlob (🙋) 18:09, 10 April 2016 (UTC)

    I've made that: Wikidata:WikiProject Museums --Tubezlob (🙋) 13:42, 11 April 2016 (UTC)

    seleccion chilena

    alguien modifico los seudonimos de la seleccion chilena y tambien chile esta en el rankinf fifa 3 no 5 alguien que lo arregle gracias  – The preceding unsigned comment was added by Sergio12341234 (talk • contribs) at 14:42, 11 April 2016‎ (UTC).

    Google translates this as "someone modify the pseudonyms of the Chilean national team and also Chile is in the fifa rankinf 3 5 someone to fix it thanks". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:12, 11 April 2016 (UTC)

    Panama papers

    What property to use to indicate that someone is mentioned in the Panama papers ? Thanks, GerardM (talk) 06:53, 5 April 2016 (UTC)

    I can't find anything really suitable. We have participant of (P1344), part of (P361), and present in work (P1441) which are all close but probably wrong. There was a proposal for a "mentioned in" property that was not done, but that would have had the domain of fictional entities only.
    In the short term you could probably do something with significant event (P793) but unless I'm missing something a new property to denote this sort of thing (mentioned in reports, leaks, etc) will be the best solution going forwards. Thryduulf (talk: local | en.wp | en.wikt) 16:02, 6 April 2016 (UTC)
    Ok, this is what I have done for now; it is easy to update as it now exists as data. Thanks, GerardM (talk) 10:31, 8 April 2016 (UTC)
    From the news I've heard that Anders Wall (Q6230054) is also mentioned in these papers. But I am not sure it qualifies as a "significant event". -- Innocent bystander (talk) 09:24, 10 April 2016 (UTC)
    If I had to choose between significant event (P793) and participant of (P1344)/part of (P361)/present in work (P1441), I personally would use present in work (P1441), which doesn't say how big a deal it is for the particular person. For some big deals we could use significant event (P793). --Edgars2007 (talk) 09:29, 10 April 2016 (UTC)
    Note that at present present in work (P1441) is intended to be used for fictional entities only. I'll leave a message on the property talk page so those watching it are aware of the suggestion to broaden it. Thryduulf (talk: local | en.wp | en.wikt) 00:17, 12 April 2016 (UTC)

    Importing from Japanese Wikipedia the name in kana

    The Property:P1814 (name in kana) records the Japanese pronunciation of a name. This information is available in the Japanese Wikipedia pages, so it could be automatically extracted and batch-imported into Wikidata. I wrote a small program to do the extraction (more details below), but I don't know whether the import step would be technically possible (and whether this kind of batch import is generally acceptable). If it is possible, then I could open a discussion in the Japanese forum to check with native speakers that my extraction method gives accurate results and how it could be improved.

    About the extraction: I wrote a Java program (to be cleaned up, improved a bit and published) that processes the Japanese Wikipedia pages from a database dump. At the beginning of most pages, the name in kanji (Chinese characters) appears in bold, followed by its pronunciation in kana characters. Example for Tokyo: "東京都(とうきょうと)は".

    • For 33% of the pages, the process has very good confidence that the kana text was found (the brackets contain only kana characters and spaces).
    • For a further 15% of the pages, the brackets yield kana characters after stripping templates, spaces and dashes at the end (ex: '''鶴田洋久'''(つるた ひろひさ、[[1964年]] - )は).
    • For 18%, I could not parse the content of the brackets with confidence (ex: for Japan, it can contain both nihon and nippon). Maybe it would be possible to build an interface so a human selects the good parts.
    • For 3%, the title contains non-Japanese characters (like digits), and the brackets contain these characters plus kana characters, so the confidence is quite good.
    • For the remaining 28%, no kana is available.

    The program produces a CSV text file (page title;kana text) for each category above.
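    The high-confidence case described above (a bracketed kana-only reading right after the bold title) can be sketched in a few lines. This is a hypothetical simplification of the Java program described, with the character ranges and pattern chosen by me rather than taken from the actual code:

```python
import re

# Hiragana, katakana, the prolonged sound mark, and (ideographic) spaces
KANA_RUN = r"[\u3041-\u309F\u30A0-\u30FF\u30FC\u3000\u0020]+"

# Bold wikitext name followed by a reading in (possibly full-width)
# parentheses, e.g. '''東京都'''(とうきょうと)は…
PATTERN = re.compile(r"'''(.+?)'''[(（](" + KANA_RUN + r")[)）]")

def extract_kana(wikitext):
    """Return (name, kana) for the first high-confidence match, else None."""
    m = PATTERN.search(wikitext)
    if not m:
        return None
    return m.group(1), m.group(2).strip()

print(extract_kana("'''東京都'''(とうきょうと)は、日本の首都。"))
```

A real implementation would additionally strip templates and trailing dates/dashes to recover the 15% medium-confidence cases.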

    From these text files, it would be necessary to find the Wikidata item associated with each Japanese Wikipedia page title, then create the statement if it does not exist yet, and set the "imported from" reference to "Japanese Wikipedia".

    So, what do you think about this idea? Edit: Do you have any idea how it could be technically put in place? (ex: upload the output of my program to a Wikimedia server, write a program/script that would run on it and perform the process described above) — Fabimaru (talk) 21:24, 8 April 2016 (UTC)

    I've considered doing something similar, so I'm in favour of it. :) - Nikki (talk) 10:42, 9 April 2016 (UTC)
    Hi. As you have probably seen, recently I have been doing exactly the same thing. However, there are a few things I always or never do:
    1. I import P1814 only for people.
    2. I import P1814 only for people connected with Japan. For this I'm using a number of rules which I combine in different ways, like ?item wdt:P27 wd:Q17, ?item wdt:P19/wdt:P131* wd:Q17, ?item wdt:P106/wdt:P279* wd:Q226008, ?item wdt:P69 ?uni . ?uni wdt:P17 wd:Q17, ?item wdt:P735 ?name, ?item wdt:P734 ?sname. For names and surnames I use a heuristic, where I import only items with a single Japanese sitelink.
    3. I import only kana with length <= 16 and consisting of a clearly identifiable name and surname, separated by a space or space-like character. Other information is useless for Wikidata; there is no segmentation algorithm for kana.
    4. I auto-transliterate imported names into Russian using the Polivanov system (which is accepted by the ruwiki community and actually works much better than human transliteration). I guess one could use the Hepburn romanization system for English. Maybe @Haplology: could explain how he created English labels for Japanese people a few years ago.
    The code is gone but as I recall I used a similar method of conservative regex to isolate hiragana readings of names of people on pages without enwiki sitelinks (other pages may have romanized titles, but romanization styles vary slightly on other wikis) and with the further restriction of only modern-day people, that is, people born after the Meiji Restoration plus a few years to be safe. I skipped katakana names, either in the name itself or in its reading. There was also a lot of eyeballing and checking for errors. The transliteration method was the revised Hepburn romanization in use in WP and WT. --Haplology (talk) 01:21, 11 April 2016 (UTC)
    As for me, I don't see any use for P1814 on non-Japanese items. In particular, I don't see a reason for importing reverse-transliterated names in katakana: these could always be autogenerated with a user script from the English labels.
    --Lockal (talk) 11:31, 9 April 2016 (UTC)
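    As a side note, the graph patterns listed in point 2 above could be combined into a single query for the Wikidata Query Service, for example like this (a sketch under my own assumptions; the actual combinations Lockal uses may differ):

```python
# Graph patterns for "connected with Japan" (Q17), taken from the rules above
RULES = [
    "?item wdt:P27 wd:Q17 .",                      # citizen of Japan
    "?item wdt:P19/wdt:P131* wd:Q17 .",            # born somewhere in Japan
    "?item wdt:P69 ?uni . ?uni wdt:P17 wd:Q17 .",  # educated at a Japanese institution
]

def build_query(rules):
    """UNION the rule patterns into one SPARQL query string."""
    union = " UNION ".join("{ %s }" % r for r in rules)
    return "SELECT DISTINCT ?item WHERE { %s }" % union

print(build_query(RULES))
```

The resulting string could then be posted to the query service endpoint, or the rules could equally be run as separate queries and the results intersected, depending on how strict the "connected with Japan" filter should be.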
    Hi. Thank you very much for your answer. I noticed you added P1814 to the entries of a large number of people, but I did not know your method for importing them (it could have been manual).
    In addition to people, I think it could also be useful to have places (cities, train stations…) and institutions (schools, companies…), but I guess that it would be a challenge to be sure that the Wikidata entries/Japanese Wikipedia pages belong to these categories (ex: for the item for a temple, we may not have any property set in Wikidata that lets us identify it as a temple).
    Before your answer, I thought about importing indiscriminately all the Wikipedia titles containing kanji (starting from the Japanese Wikipedia, not from the Wikidata entries). Do you see any reason not to do so? At first I did not see any, but now I can think of two cases that are arguable:
    • Wikipedia page titles that are not proper nouns. For example, 認知症 / dementia. I don't think it would be good for Wikidata because the Wikidata item is about the concept, not about the noun itself (if the label is changed, the kana name will be out of sync).
    • Chinese names (which obviously contain Chinese characters). For example, is it useful for Japanese people to know that "Xi Jinping" is pronounced "しゅう きんぺい" in Japanese? (I would tend to think it is.)
    So maybe for the types of pages I mentioned, some heuristics should also be used to detect the category (ex for train stations: whether the name ends with "駅" = station).
    If other types of names were to be imported, do you think it would be better to extend your tool, or that I complete mine (which currently only has the Wikipedia parsing part)? — Fabimaru (talk) 18:10, 9 April 2016 (UTC)
    I agree that limiting it to proper nouns makes sense for the reason you gave. Don't forget that there are also Korean names where the Japanese name uses Chinese characters (e.g. Kim Dae-jung (Q45785), South Gyeongsang Province (Q41151)). - Nikki (talk) 14:08, 10 April 2016 (UTC)
    If the Japanese label is already in katakana, there's no point duplicating it in a P1814 statement, so that should already exclude most non-Japanese things. The main exceptions would be some Chinese and Korean names. Also, if someone does decide to try to automatically generate katakana versions, please bear in mind that the katakana version depends on the pronunciation of the name, not the spelling (e.g. ja:マイケル・ジャクソン, ja:ミカエル・ラウドルップ, ja:ミヒャエル・バラック all have an English label starting with "Michael"). - Nikki (talk) 14:08, 10 April 2016 (UTC)
    I would suggest that you not import non-segmented kana names. I'm not sure what to do with generic nouns (maybe generate the kana name with MeCab/neologd and, if the generated value matches the value from Wikipedia, import it), but for humans you could at least check that the kana name consists of multiple parts. Non-segmented sequences of kana characters are useless in terms of further processing. --Lockal (talk) 07:28, 11 April 2016 (UTC)
    Thanks. After trying to understand which names/nouns should be segmented, I now realize that my approach was too naive and unsatisfying. In fact the kana inside the brackets does not always match the kanji in Wikipedia. For example, for Mitsubishi Motors, the kanji ends with "Corporation", but the kana does not contain it. So I guess that if I want to import something, I should restrict myself to entries for which:
    • the kanji name definitely matches the kana;
    • the kana names don't need to be segmented. My feeling is that train stations, cities and temples do not need it (I think that I can start with these three).
    So I need to narrow and refine my plan, and see how to use Widar (or an alternative). Fabimaru (talk) 20:25, 11 April 2016 (UTC)
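    For person names, the acceptance criteria discussed in this thread (kana only, length ≤ 16, segmented into surname and given name) might look something like the following filter. The exact thresholds and the requirement of exactly two segments are only my reading of the thread, not an agreed specification:

```python
import re

# Hiragana/katakana plus the prolonged sound mark
KANA_CHARS = re.compile(r"^[\u3041-\u309F\u30A0-\u30FF\u30FC]+$")
# ASCII space or ideographic space as segment separator
SEPARATORS = re.compile(r"[\u0020\u3000]")

def acceptable_person_kana(kana):
    """Accept only kana readings of <= 16 characters that split into
    exactly two kana-only segments (assumed: surname + given name)."""
    if len(kana) > 16:
        return False
    parts = SEPARATORS.split(kana)
    return len(parts) == 2 and all(KANA_CHARS.match(p) for p in parts)

print(acceptable_person_kana("つるた ひろひさ"))  # segmented person name
print(acceptable_person_kana("とうきょうと"))      # unsegmented place reading
```

Such a check would reject both unsegmented readings and readings polluted by leftover dates or templates, at the cost of skipping names with unusual separators.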

    IEG grant submission for Wikimedia-powered e-books - feedback welcome

    Hi everyone! I have just submitted an Individual Engagement Grant request to research how one could create e-books that mix content from various Wikimedia projects. The reason for requesting this grant is that I'm working on a hybrid print and digital (Wiki)book on Dutch digital arts, for which I'd like to do such a thing, and I want to work on a more 'universal' solution from the start. You can find the grant submission here. I explicitly want to include the possibility of working with data from Wikidata in such e-books (think timelines, infoboxes, and multilinguality where possible and useful). Feedback, questions, use cases, and of course endorsements are more than welcome there. Thanks! Spinster (talk) 09:50, 12 April 2016 (UTC)

    Wikidata weekly summary #204

    Input requested for Individual Engagement Grant: creating a database for the public domain

    Hello everyone! We're presenting a proposal for building a database of Argentine authors in the public domain. Our proposal is based on a project already carried out in Uruguay, called AutoresUy, which is already accepted as a unique identifier in Wikidata. We think it's necessary and desirable to have more projects of this kind. In order to achieve this, we have set up a plan that includes workshops, edit-a-thons and lots of hard work to build, improve, and expand this database and help Wikimedia projects. Please leave your input on the talk page, or directly add an endorsement if you think it's worth it.

    IEG proposal for AutoresAr. --Scanno (talk) 14:55, 12 April 2016 (UTC)

    A big question mark

    author  TomT0m / talk page Mbch331 (talk) Jobu0101 (talk) Tubezlob (🙋) Flycatchr Koxinga (talk) Fuzheado (talk) Mfchris84 (talk)

    Notified participants of WikiProject Award. I think we need to talk about this all together.

    hi, I was bold and implemented an idea talked of in a discussion about awards (see WikiProject Award), because the discussion had stalled:

    • An award is the class of all the times someone received it. This means that an award like the Nobel Prize is a subclass of "award", not an instance.
    • Almost all awards are given every year. I created a metaclass for this kind of award to be an instance of. This metaclass regroups all awards whose ceremony takes place repeatedly every year. So far so good. I just saw this edit from Abc82 (talkcontribslogs) https://www.wikidata.org/w/index.php?title=Q987393&type=revision&diff=320519413&oldid=320265328 : WTF, are you again putting almost nonsensical statements on your own? What is this? It's not the first time... author  TomT0m / talk page 13:18, 10 April 2016 (UTC)
    This does not sound right to me. An award is an individual recognition of one or more persons or organizations. Therefore, "the Oscars", "the Oscars 2016", or "the Oscar for cinematography" are classes, whose instances are those individual recognitions. --Srittau (talk) 15:26, 10 April 2016 (UTC)
    We probably need to distinguish between those things:
    • Award type ("Oscars")
    • Award years ("Oscars 2016")
    • Award categories ("Oscar for cinematography")
    • Individual recognitions ("Oscar for cinematography in 2016")
    I would consider the former three to be classes, while the last one is not a class. --Srittau (talk) 15:30, 10 April 2016 (UTC)
    More or less. I'd agree that the last one is an instance, more precisely an instance of Academy Award for Best Cinematography (Q131520). I'd also agree that "Oscars" is a type of award, considering it's the set of awards given by a single organisation (same conferred by (P1027) for all Oscars, the Academy). I actually wanted to implement a metaclass with
    < Award type > has quality (P1552) < organization that offers the award >
    but I did not actually find the right item. I wonder if periodic price (Q23755142) qualifies as this "award type" item. In most cases, I guess. author  TomT0m / talk page 19:23, 10 April 2016 (UTC)
    For the statement that records who won the "Academy Award for Best Cinematography in 2016", instead of creating an item for the 2016 award, why not say the person was awarded an "Academy Award for Best Cinematography", qualified with the year awarded? —seav (talk) 00:22, 13 April 2016 (UTC)

    Cell lines

    I want to know how I can contribute to the Wikidata project by sharing the data on cell lines that is actively curated in the Cellosaurus:

    http://web.expasy.org/cellosaurus/

     – The preceding unsigned comment was added by Amb sib (talk • contribs) at 08:49, 12 April 2016‎ (UTC).

    Thank you! Check out Wikidata:Data donation. --Denny (talk) 17:10, 13 April 2016 (UTC)

    old names of a sport club ?

    Hi everyone. I would like to add the old names of sports clubs. Should I use official name (P1448), which seems to be dedicated to places, or the newer name (P2561), or another one I didn't find? Thanks. --H4stings (talk) 08:58, 13 April 2016 (UTC)

    Use official name (P1448) with date qualifiers. Sjoerd de Bruin (talk) 11:16, 13 April 2016 (UTC)
    Ok thanks. --H4stings (talk) 14:22, 13 April 2016 (UTC)

    Linking a wikidata item to a subsection of a wikipedia page

    How does one do this? My example is a wikidata item: Gene Ontology Consortium (Q23809253) which I want to link to a tab on this wikipedia page: https://en.wikipedia.org/wiki/Gene_ontology#Consortium

    A) is this a bad idea and; B) if not how might it be implemented?

    Thanks

    --Mhaendel (talk) 15:33, 13 April 2016 (UTC)

    Mhaendel It is not possible to link a page with two items --ديفيد عادل وهبة خليل 2 (talk) 15:44, 13 April 2016 (UTC)
    This was discussed previously as a justification for linking to redirects. Currently, the way you would have to do that (using this case as an example) is to create a stub page on enwiki for "Gene ontology consortium", link to that, then change that stub into a redirect to where you want. This works, but some people object to it... ArthurPSmith (talk) 15:59, 13 April 2016 (UTC)
    A) Yes, it's a bad idea, and B) it may not be implemented. The workaround described above is a bug. There are several reasons why this is not possible. One is the fact that Wikidata has a 1-1 relationship with Wikipedia articles because of the problems with page moves on Wikipedia to improve article titles. On Wikidata only one field has alternate aliases, and that is the label field. The titles of Wikipedia articles do not change often, but their subsections change more often. This is the same reason that on Wikipedia there is a link to "Cite this page" but no "Cite this paragraph". This is tracked in Phabricator. --Jane023 (talk) 16:31, 13 April 2016 (UTC)

    Go to en.wikipedia page for a Qid

    Hi - if I have a Wikidata Q number for an item, is there a URL I can use that will redirect me to the equivalent Wikipedia page, if it exists (e.g. using the stored sitelinks information)? Something like http://www.wikidata.org/redirect/enwiki/Q1234 HYanWong (talk) 14:04, 11 April 2016 (UTC)

    Hello! We have this: see Special:GoToLinkedPage. Sjoerd de Bruin (talk) 14:12, 11 April 2016 (UTC)
    Perfect. Thanks! I couldn't find it documented anywhere obvious, so perhaps might be worth flagging up somehow. But it might just be my poor search strategy. HYanWong (talk) 15:01, 11 April 2016 (UTC)
    p.s. it might be helpful to give an example on that page (e.g. http://www.wikidata.org/wiki/Special:GoToLinkedPage?site=enwiki&itemid=Q513). I can't edit the text there, otherwise I'd have added it. HYanWong (talk) 15:07, 11 April 2016 (UTC)
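    For scripted use, such a URL can simply be assembled from the site code and item ID, along the lines of the example above:

```python
from urllib.parse import urlencode

def linked_page_url(itemid, site="enwiki"):
    """Build a Special:GoToLinkedPage URL that redirects to the
    sitelinked article for the given Wikidata item, if one exists."""
    base = "https://www.wikidata.org/wiki/Special:GoToLinkedPage"
    return base + "?" + urlencode({"site": site, "itemid": itemid})

print(linked_page_url("Q513"))
```

Passing a different site code (e.g. "dewiki") would redirect to that wiki's article instead; if no sitelink exists, the special page itself handles the failure.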
    You could use Reasonator. It is included and easy to use. GerardM (talk) 06:35, 14 April 2016 (UTC)

    Keeping track of things

    Keeping track of things on Wikidata is difficult. The broadness of the project's scope, granularity of its content, and diversity of its languages create a situation where normal monitoring tools like watchlists and recent changes are increasingly insufficient.

    Changes, both large and small, should be monitorable without difficulty. Users should be able to easily follow the areas they are interested in, see what changes others are making, and easily interact with the edits and editors involved. This is critical to having a functioning wiki.

    I've put together a list of the most pressing problems in this area, along with my proposals for solutions and implementations of those solutions. Please comment, and feel free to propose alternative solutions and/or implementations.

    Keeping track of mass-edits

    • Problem: Automated or semi-automated mass-edits are hard to keep track of or undo.
      • Solution: A central, trackable log of all mass-edits of over 100 edits. This includes both bot runs and large groups of edits done using tools like Autolist. Users should be able to follow the log via watchlist, understand the parameters of the edit groups, and easily undo individual groups of changes. Finding which run an edit belongs to should be simple.
        • Implementation: A log as a wiki page, with each group of edits having a listing, probably using a template. Ideally, tools like Autolist would add log entries automatically, filling in information such as precisely what source is used for determining which items to edit (SPARQL, WDQ, Category, etc), and what edits are made, a timestamp of when the run started and ended, and links to relevant discussions if applicable. If automatic updates are not an option, then the log could be filled manually using either preloaded templates or the form wizard. An undo button next to each run links to a preloaded filled-out form for User:RollBot, which simultaneously pings the changes' author. RollBot's undos should have relevant information in the edit summaries. The log is in chronological order and split by year (or maybe month?), and all edits within the group would ideally link to the log entry in the edit summary. --Yair rand (talk) 22:10, 11 April 2016 (UTC)

    Recent Changes flooding

    • Problem: The Recent Changes are frequently clogged by mass-edits.
      • Solution: Encourage more use of the Flood flag. Requesting the flag should be simpler. Users about to do mass-edits or in the process of doing them without a flood flag should be notified, and provided with a helpful link.
        • Implementation: Tools should warn users about to do mass edits if they don't have the flag set, and provide the necessary link. A talk page template for unauthorized bots and unflagged flooders should be created. Perhaps a bot could automatically warn all users doing too many edits too fast. WD:FLOOD should prominently link either to the correct noticeboard for requesting the flag, or even directly to the form. A flood flag request template should include a direct link for bureaucrats to enable the flag, and for the requester to remove it afterwards. Perhaps the template should also ping all current bureaucrats, in order to achieve faster response times.
          (It may be worth it to have a [default|rights=flood] gadget add a button to the sidebar or the template to simultaneously remove the flood flag and close the current mass-edit run log entry, if both proposals are implemented.) --Yair rand (talk) 22:10, 11 April 2016 (UTC)
          • I think if we really want to encourage people to use the flood flag, it would need to be more automated, e.g. QuickStatements is really useful and there's plenty of people using it. It would be easier for QuickStatements to set/remove the flag itself instead of trying to convince every single person who uses it to request the flag manually every single time they want to use it. - Nikki (talk) 06:46, 12 April 2016 (UTC)
            • @Nikki: Non-admins can't add the flag to their own account. --Yair rand (talk) 15:32, 12 April 2016 (UTC)
              • I know. We're talking about ideas of what we could do/change to improve things though, right? :) Another idea: Improve tag filtering. It seems to be possible to show things with a certain tag, but not to filter out things with one of a list of tags. The vast majority of flooding I see comes from QuickStatements which are already tagged (although not uniquely since they're lumped under Widar, but perhaps that could be changed). - Nikki (talk) 17:59, 12 April 2016 (UTC)
    @Nikki: Below I introduce the idea of a group of "Self-promoters". -- Innocent bystander (talk) 14:57, 13 April 2016 (UTC)

    Never heard of the flood flag, and I am a heavy user of quick statements. It might help to inform the people what you are proposing exactly, because I still don't understand after reading this. --Jane023 (talk) 07:37, 12 April 2016 (UTC)

    The Flood flag is a kind of "Bot flag light". Your edits are hidden in RC, but you do not have the other rights bots have. It can be granted in a more simple way and you can remove the flag yourself. -- Innocent bystander (talk) 10:00, 12 April 2016 (UTC)
    Thanks for the explanation. I can see that it is something that has no benefit to the task at hand, so the motivation to use it would be low. I asked for a bot flag so I could use quick statements in a way that was faster for me, but so far I have the difficulties of logging in and out with a different user account and no significant time gains. Your idea of building this in to quick statements is probably the best way forward. I think we had that in the beginning but people were confused about it and didn't trust those edits. I am not sure this will help that problem. --Jane023 (talk) 18:06, 12 April 2016 (UTC)
    @Jane023: I applied as sysop to be able to set and reset this flag myself, without asking our crats every time. -- Innocent bystander (talk) 18:24, 12 April 2016 (UTC)

    Self-promoters?

    @All: Starting a new user group of "Self-promoters" who, like sysops, are trusted to set and unset their own flood flag would maybe be an alternative? It is maybe easy enough to gain the trust needed to become a sysop here, but not all of us feel comfortable with the extra tools associated with that group. I have myself, with the help of some CSS magic, removed the rollback links, which are easy to hit by accident. Without that option I would probably never have applied as sysop (again). -- Innocent bystander (talk) 18:24, 12 April 2016 (UTC)
    I like this idea. --Yair rand (talk) 20:30, 12 April 2016 (UTC)
    +1. Yellowcard (talk) 15:15, 13 April 2016 (UTC)
    +1. This seems like a very good use of our permissions system. —seav (talk) 03:00, 14 April 2016 (UTC)
    I have no opinion on its desirability, but please call it something else: self-flaggers, auto-flaggers, floodflaggers, etc. A self-promoter is something to be combatted. - Brya (talk) 05:45, 14 April 2016 (UTC)
    @Brya: "Self-promoters" is a description of the potential new group, not a proposed name for it. I think the idea already has enough support to start an RFC about it. Decisions like this, which affect the whole project, cannot be taken in a <h4/> sub-thread of the English Project Chat. -- Innocent bystander (talk) 05:56, 14 April 2016 (UTC)

    Change list difficulties

    • Problem: Change lists, such as Recent Changes and watchlists, are too hard to read through and contain too many edits clearly outside of one's area of interest. As a result, relevant changes are missed. Furthermore, on client-wiki watchlists (Wikipedia watchlists, etc), the lists are so full of irrelevant changes, and the changes listings contain so little information, that users tend to either disable showing Wikidata changes or ignore them.
      • Solution: Change lists should allow for filtering, and users should be able to see all the details of an edit without clicking "diff" on every one of them.
        • Implementation: DiffLists.js offers a basic version of this, with limited by-language and by-property filtering, and shows a smaller version of the edit diff instead of the regular autogenerated edit summaries. However, as a result of being entirely on the front-end, the script has substantial limitations. A proper back-end version would have many advantages, such as more reasonable performance, potential for widespread use without smashing the servers, and the ability to display a stable number of results, allowing for more usable property and language filtering.
          (Other useful potential features: A better property-selecting mechanism and language selector, support for history and contributions pages, and defaulting language settings to babel box contents.)
          Client-wiki watchlists, by default, should hide all identifier statements and foreign-language labels/descriptions/aliases, and possibly even interwikis to languages the user does not speak.
          phab:T121361 is the task for showing diffs on the change lists. Afaik, there isn't one open yet for filtering options. --Yair rand (talk) 22:10, 11 April 2016 (UTC)
    +1, I love the DiffLists tool. I don't remember how I managed to use the recent changes list before, and I also think it would benefit greatly from being implemented within Wikibase.
    Additionally, I'd like to see more people using the patrol feature (which is totally independent from DiffLists, just to clarify this). When looking for vandalism, it is plainly impossible to watch all changes on Wikidata, you can always only look at a very small excerpt. At the moment it's not even really possible to check all unpatrolled edits for a prolonged period of time, even with the DiffLists tool, which significantly reduces the effort you have to take. So if most people would mark most edits they have checked (and kept or reverted) as patrolled, we would win a lot already. --YMS (talk) 10:37, 13 April 2016 (UTC)

    Keeping track of topic areas

    • Problem: Topics spread across many items are hard to follow. Certain areas are divided into enough items that it's difficult to watch them all, or keep a watchlist updated with new "sub-items". The recent changes feed is too broad, watchlists not broad enough, and normal use of "Related pages" too limited.
      • Solution: Individual Wikiprojects could have separate change feeds, based on lists of relevant items and properties.
        • Implementation: Much of this would be best based off of lists automatically bot-updated from SPARQL queries. The feed could come from Related Changes of the list, and be prominently linked to from Wikiproject pages. Some more filtering of the feed could be done by a server-side version of DiffLists, as described above. (Show only relevant statements and such.) The details of the filtering could be included within the link ("?rc-filters=", etc), and filled in automatically. Some manual maintaining of the list of relevant items may also be necessary. For the query-generated parts, there could be an "expiry" system for removal, as opposed to immediate removal, so that statement removals/changes that would ordinarily remove the item from the list would still show up in the feed. --Yair rand (talk) 22:10, 11 April 2016 (UTC)

    Discussion dispersion

    • Problem: Discussion areas are far too scattered, causing people to be unaware of discussions they would be interested in. Every individual item has a talk page, and it's often hard to find a more general discussion page for a topic, or otherwise notify everyone who would be interested in participating.

    General discussion

    Illegal language error

    I am trying to add title (P1476): "Binny und der Geist" to Binny and the Ghost (Q17457631). For the language, I typed German and chose "German (de)" from the dropdown. When saving, it says "Illegal language: German". nyuszika7h (talk) 15:28, 11 April 2016 (UTC)

    It's as if it ignores what I choose from the dropdown. Manually typing "de" worked. nyuszika7h (talk) 15:31, 11 April 2016 (UTC)
    Firefox 45 on Windows 10, if that matters. nyuszika7h (talk) 15:31, 11 April 2016 (UTC)
    @Lydia Pintscher (WMDE): This seems like something you might want to look into :) It sounds quite similar to what I wrote about in my last comment on phab:T110043 - Nikki (talk) 08:47, 13 April 2016 (UTC)
    It seems like selecting a suggestion doesn't update the input, so "German" stays "German" instead of "de". Sjoerd de Bruin (talk) 08:53, 13 April 2016 (UTC)
    Thanks! Looking into it. --Lydia Pintscher (WMDE) (talk) 14:01, 14 April 2016 (UTC)

    How could we ...

    Following the previous thread by Yair Rand, I'll try a mini-RFC.

    First round of questions for now (both choices can be "yesed" if you want all three things to be aligned):

    See Wikidata:Requests_for_comment/Reforming_the_property_creation_process#Reorganization.3F for some earlier discussion and related proposals on this. How does an RFC get concluded anyway? TomT0m it might be good to list all the specific reorganizations that have been proposed so far here? ArthurPSmith (talk) 20:21, 12 April 2016 (UTC)

    Alignment of proposal categories on class hierarchies

    1. Should we try to align the property proposal categories with the class hierarchy? This would concretely mean that each section of WD:PP would map to a top class of our class hierarchy.

    yes

    Support --Jane023 (talk) 12:45, 13 April 2016 (UTC)

    no

    neutral

    discuss

    Alignment of property proposal structure to wikiprojects

    2. Should we try to align the property proposal categories with the wikiproject structure? I have no concrete idea of how we could implement that. A property proposal section in each wikiproject? And WD:PP would link to/transclude these pages?

    yes

    no

    neutral

    discuss

    You lost me on this one. No idea what you mean at all. --Jane023 (talk) 12:46, 13 April 2016 (UTC)

    Documentation of classes

    1. Should we focus more on documenting classes instead of properties? Every class has a set of natural properties that can be applied to it (including those of its parent classes), can have a showcase instance, and so on. This would mean we could document several properties on the same page instead of on all the individual property pages, which leads to dispersion of discussion and information.

    I think we should probably have documentation on both classes and properties. There would be some duplication, but I think it would be worth it. --Yair rand (talk) 20:28, 12 April 2016 (UTC)

    yes

    no

    neutral

    discuss

    Property proposals organized by class

    2. Should we try to associate a property proposal with a class? This might be implemented as follows: one property proposal page for each class, with all these pages indexed in WD:PP or one of its subpages. There could be a path, for example "wd:pp" -> "abstract object" -> "creative work class" -> "live performance class" -> "opera" -> existing properties for an opera (or its parent class). The user could then find what they are looking for (or not) and propose something on a class's property proposal page. There would also be an "all proposals" page where all proposals have to be listed, with just a minimal description and not all the details of the proposal.

    yes

    no

    neutral

    discuss

    The above section was added by TomT0m, 14:00, April 12, 2016‎

    I would really like it if each proposal were its own subpage. Those could be included on whichever pages we want and easily moved between pages without losing any history. The current pages are too large for me to keep track of and since I can't add individual proposals to my watchlist, I often end up missing discussions that I'd like to follow. - Nikki (talk) 20:56, 12 April 2016 (UTC)
    +1 to Nikki's suggestion. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:23, 13 April 2016 (UTC)
    I wanted to propose this at the last RfC. One thing to think about is naming. At enwiki's AfD this is easy - simply the page title. Not sure what could be used here at PP. A random identifier - not good; the property name - they can change quite a lot, and not everybody adds titles in English (they should, but as this is not "English Wikidata", this can't be obligatory). But OK, naming is a pretty minor thing to come up with. --Edgars2007 (talk) 07:14, 14 April 2016 (UTC)
    +1. Another advantage of this is that we could transclude a proposal to multiple overview pages. That means we could have pages sorting the proposal by classes, and other pages sorting the proposal by age or status. --Pasleim (talk) 13:57, 14 April 2016 (UTC)
    +1 from me too. This is how it's done in the Requests for Permissions pages here, for example Wikidata:Requests for permissions/Bot. It seems to work well there. I would just use the initial property name (in any convenient language) as the page name; if that is a duplicate of a previous entry then add a number. If the proposed property name changes during the course of discussion I think it would be fine to still have the discussion at the page pointed to by the original name. It would never need to move to another page; after closing the discussion it should be auto-archived the way those bot requests are. This probably should be a full RFC though, not just a chat topic. ArthurPSmith (talk) 18:20, 14 April 2016 (UTC)

    Help translate Listeria bot on local Wikipedia projects

    Hi all, I am working with User:Pigsonthewing on the Wikidata:TED data donation. I have built lists of speakers and talks and would like to build these lists on those local Wikipedia projects for which we have translated titles. See the overview here: outreach:TED conferences. I am missing Listeria bot however for the Wikipedia languages ja/ko/zh/tr/vi/fa. Is there anyone here that can help out? A local template needs to be created and then bot access for ListeriaBot needs to be applied for and granted. Andy, Magnus and I don't speak any of those languages, so it would be great if someone here could help out in those projects who 1) understand what Listeria bot is and 2) understand how to apply for a bot flag on their local wiki. Thanks! --Jane023 (talk) 12:22, 13 April 2016 (UTC)

    To clarify, the list templates Template:Wikidata list and Template:Wikidata list end must be created on the local project (preferably with redirects so non-local users like myself can still navigate with them). You can see the template in use for many things if you click "What links here" from either of the templates, but for example I also just used it here: Talk:Q19507487. --Jane023 (talk) 08:37, 14 April 2016 (UTC)
    And now you can see an overview of what User:ListeriaBot does here Botstatus. --Jane023 (talk) 17:30, 14 April 2016 (UTC)

    Please merge

    Q9483060 and Q8475387 are now one and the same, following the outcome of Wikipedia:Categories_for_discussion/Log/2016_February_9#Category:Functionaries_of_the_Stalinist_regime_in_Poland. I don't know how to fix the issue of making sure the interwiki links work (i.e. pl:Kategoria:Funkcjonariusze Ministerstwa Bezpieczeństwa Publicznego should now be part of the en:Category:Ministry of Public Security (Poland) officials level). --Piotrus (talk) 08:15, 14 April 2016 (UTC)

    I have moved the link for you. I'm pretty sure we have a help page for this...? --Izno (talk) 13:24, 14 April 2016 (UTC)

    Copy reference

    Hi. Is the copy reference tool working or not? I can only copy a reference, not add it. Xaris333 (talk) 17:46, 25 March 2016 (UTC)

    Can confirm, the following JavaScript error is given: TypeError: null is not an object (evaluating 'this._qualifiers.value'). @Bene*: can you take a look? Sjoerd de Bruin (talk) 17:49, 25 March 2016 (UTC)
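    For readers wondering what an error like this usually means: a minimal, hypothetical sketch (not the actual gadget code; the function and object shapes here are invented for illustration) of the kind of null guard that fixes a `TypeError: null is not an object` when reading a nested field:

    ```javascript
    // Hypothetical sketch: guard against a missing _qualifiers object
    // before reading .value, instead of assuming it is always set.
    function getQualifierValue(statement) {
      // statement._qualifiers may be null for a freshly copied reference,
      // so check before dereferencing it.
      if (statement && statement._qualifiers != null) {
        return statement._qualifiers.value;
      }
      return null; // nothing to read yet
    }

    // Plain objects standing in for gadget state:
    console.log(getQualifierValue({ _qualifiers: { value: "P143" } })); // "P143"
    console.log(getQualifierValue({ _qualifiers: null }));              // null
    ```

    The point is only that the code path Sjoerd quotes dereferences `this._qualifiers.value` without such a check.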

    Can anyone help? Xaris333 (talk) 12:25, 31 March 2016 (UTC)

    I tried to fix it; does the gadget work again for you? Adrian Heine (WMDE) (talk) 06:39, 7 April 2016 (UTC)

    Adrian Heine (WMDE) Just tried it. Still not working. Xaris333 (talk) 18:39, 7 April 2016 (UTC)

    I posted a hopefully correct fix on the gadget's talk page, but no sysop has responded yet. Adrian Heine (WMDE) (talk) 11:49, 12 April 2016 (UTC)

    Let's hope that it will fix it. We really need that gadget. Xaris333 (talk) 12:01, 12 April 2016 (UTC)

    @Xaris333: It should be fixed now if you want to test it. - Nikki (talk) 20:00, 14 April 2016 (UTC)
    Nikki, still not working. Xaris333 (talk) 22:25, 14 April 2016 (UTC)
    @Xaris333: It's working fine for me - could you provide some more information? e.g. which browser? what happens? if you know how to get to the error console, are there any errors there? have you tried a forced refresh? - Nikki (talk) 12:25, 15 April 2016 (UTC)
    Nikki Please try to use it on one page and then on another page. Yesterday it worked for me when I tried it (after you told me to test it), but only for one page. Then it stopped again. I have tried refreshing. Chrome. I don't know about the error console. Xaris333 (talk) 13:55, 15 April 2016 (UTC)
    I have used it on multiple pages and it still works for me :/ I'm not sure what to suggest... @Adrian Heine (WMDE): any ideas? - Nikki (talk) 14:05, 15 April 2016 (UTC)

    It's working now! Xaris333 (talk) 17:40, 15 April 2016 (UTC)

    Emoji 😀

    Emojis are now available on Wikidata 😋 --Tubezlob (🙋) 20:59, 12 April 2016 (UTC)

    Emoji palette

    ☺
    {{emoji|263A}}

    😀
    {{emoji|1F600}}

    😁
    {{emoji|1F601}}

    😂
    {{emoji|1F602}}

    😃
    {{emoji|1F603}}

    😄
    {{emoji|1F604}}

    😅
    {{emoji|1F605}}

    😆
    {{emoji|1F606}}

    😉
    {{emoji|1F609}}

    😊
    {{emoji|1F60A}}

    😋
    {{emoji|1F60B}}

    😌
    {{emoji|1F60C}}

    😍
    {{emoji|1F60D}}

    😎
    {{emoji|1F60E}}

    😏
    {{emoji|1F60F}}

    😐
    {{emoji|1F610}}

    😑
    {{emoji|1F611}}

    😒
    {{emoji|1F612}}

    😓
    {{emoji|1F613}}

    😔
    {{emoji|1F614}}

    😕
    {{emoji|1F615}}

    😖
    {{emoji|1F616}}

    😗
    {{emoji|1F617}}

    😘
    {{emoji|1F618}}

    😙
    {{emoji|1F619}}

    😚
    {{emoji|1F61A}}

    😛
    {{emoji|1F61B}}

    😜
    {{emoji|1F61C}}

    😝
    {{emoji|1F61D}}

    😞
    {{emoji|1F61E}}

    😟
    {{emoji|1F61F}}

    😠
    {{emoji|1F620}}

    😡
    {{emoji|1F621}}

    😢
    {{emoji|1F622}}

    😣
    {{emoji|1F623}}

    😤
    {{emoji|1F624}}

    😥
    {{emoji|1F625}}

    😦
    {{emoji|1F626}}

    😧
    {{emoji|1F627}}

    😨
    {{emoji|1F628}}

    😩
    {{emoji|1F629}}

    😪
    {{emoji|1F62A}}

    😫
    {{emoji|1F62B}}

    😬
    {{emoji|1F62C}}

    😭
    {{emoji|1F62D}}

    😮
    {{emoji|1F62E}}

    😯
    {{emoji|1F62F}}

    😰
    {{emoji|1F630}}

    😱
    {{emoji|1F631}}

    😲
    {{emoji|1F632}}

    😳
    {{emoji|1F633}}

    😴
    {{emoji|1F634}}

    😵
    {{emoji|1F635}}

    😶
    {{emoji|1F636}}

    😷
    {{emoji|1F637}}

    😇
    {{emoji|1F607}}

    😈
    {{emoji|1F608}}

    👿
    {{emoji|1F47F}}

    👦
    {{emoji|1F466}}

    👧
    {{emoji|1F467}}

    👨
    {{emoji|1F468}}

    👩
    {{emoji|1F469}}

    👴
    {{emoji|1F474}}

    👵
    {{emoji|1F475}}

    👶
    {{emoji|1F476}}

    👱
    {{emoji|1F471}}

    👮
    {{emoji|1F46E}}

    👲
    {{emoji|1F472}}

    👳
    {{emoji|1F473}}

    👷
    {{emoji|1F477}}

    👸
    {{emoji|1F478}}

    💂
    {{emoji|1F482}}

    🎅
    {{emoji|1F385}}

    👼
    {{emoji|1F47C}}

    👯
    {{emoji|1F46F}}

    💆
    {{emoji|1F486}}

    💇
    {{emoji|1F487}}

    👰
    {{emoji|1F470}}

    🙍
    {{emoji|1F64D}}

    🙎
    {{emoji|1F64E}}

    🙅
    {{emoji|1F645}}

    🙆
    {{emoji|1F646}}

    💁
    {{emoji|1F481}}

    🙋
    {{emoji|1F64B}}

    🙇
    {{emoji|1F647}}

    🙌
    {{emoji|1F64C}}

    🙏
    {{emoji|1F64F}}

    👤
    {{emoji|1F464}}

    👥
    {{emoji|1F465}}

    🚶
    {{emoji|1F6B6}}

    🏃
    {{emoji|1F3C3}}

    💃
    {{emoji|1F483}}

    💏
    {{emoji|1F48F}}

    💑
    {{emoji|1F491}}

    👪
    {{emoji|1F46A}}

    👫
    {{emoji|1F46B}}

    👬
    {{emoji|1F46C}}

    👭
    {{emoji|1F46D}}

    👈
    {{emoji|1F448}}

    👉
    {{emoji|1F449}}

    ☝
    {{emoji|261D}}

    👆
    {{emoji|1F446}}

    👇
    {{emoji|1F447}}

    ✊
    {{emoji|270A}}

    ✋
    {{emoji|270B}}

    ✌
    {{emoji|270C}}

    ✍
    {{emoji|270D}}

    👊
    {{emoji|1F44A}}

    👋
    {{emoji|1F44B}}

    👌
    {{emoji|1F44C}}

    👍
    {{emoji|1F44D}}

    👎
    {{emoji|1F44E}}

    👏
    {{emoji|1F44F}}

    👐
    {{emoji|1F450}}

    💅
    {{emoji|1F485}}

    💪
    {{emoji|1F4AA}}

    👣
    {{emoji|1F463}}

    👀
    {{emoji|1F440}}

    👂
    {{emoji|1F442}}

    👃
    {{emoji|1F443}}

    👅
    {{emoji|1F445}}

    💀
    {{emoji|1F480}}

    💋
    {{emoji|1F48B}}

    👄
    {{emoji|1F444}}

    💘
    {{emoji|1F498}}

    ❤
    {{emoji|2764}}

    💓
    {{emoji|1F493}}

    💔
    {{emoji|1F494}}

    💕
    {{emoji|1F495}}

    💖
    {{emoji|1F496}}

    💗
    {{emoji|1F497}}

    💙
    {{emoji|1F499}}

    💚
    {{emoji|1F49A}}

    💛
    {{emoji|1F49B}}

    💜
    {{emoji|1F49C}}

    💝
    {{emoji|1F49D}}

    💞
    {{emoji|1F49E}}

    💟
    {{emoji|1F49F}}

    💌
    {{emoji|1F48C}}

    💧
    {{emoji|1F4A7}}

    💤
    {{emoji|1F4A4}}

    💢
    {{emoji|1F4A2}}

    💣
    {{emoji|1F4A3}}

    💥
    {{emoji|1F4A5}}

    💦
    {{emoji|1F4A6}}

    💨
    {{emoji|1F4A8}}

    💫
    {{emoji|1F4AB}}

    💬
    {{emoji|1F4AC}}

    💭
    {{emoji|1F4AD}}

    👓
    {{emoji|1F453}}

    👔
    {{emoji|1F454}}

    👕
    {{emoji|1F455}}

    👖
    {{emoji|1F456}}

    👗
    {{emoji|1F457}}

    👘
    {{emoji|1F458}}

    👙
    {{emoji|1F459}}

    👚
    {{emoji|1F45A}}

    👛
    {{emoji|1F45B}}

    👜
    {{emoji|1F45C}}

    👝
    {{emoji|1F45D}}

    👞
    {{emoji|1F45E}}

    👟
    {{emoji|1F45F}}

    👠
    {{emoji|1F460}}

    👡
    {{emoji|1F461}}

    👢
    {{emoji|1F462}}

    👑
    {{emoji|1F451}}

    👒
    {{emoji|1F452}}

    🎩
    {{emoji|1F3A9}}

    🐵
    {{emoji|1F435}}

    🙈
    {{emoji|1F648}}

    🙉
    {{emoji|1F649}}

    🙊
    {{emoji|1F64A}}

    🐒
    {{emoji|1F412}}

    🐶
    {{emoji|1F436}}

    🐕
    {{emoji|1F415}}

    🐩
    {{emoji|1F429}}

    🐺
    {{emoji|1F43A}}

    🐱
    {{emoji|1F431}}

    😸
    {{emoji|1F638}}

    😹
    {{emoji|1F639}}

    😺
    {{emoji|1F63A}}

    😻
    {{emoji|1F63B}}

    😼
    {{emoji|1F63C}}

    😽
    {{emoji|1F63D}}

    😾
    {{emoji|1F63E}}

    😿
    {{emoji|1F63F}}

    🙀
    {{emoji|1F640}}

    🐈
    {{emoji|1F408}}

    🐯
    {{emoji|1F42F}}

    🐅
    {{emoji|1F405}}

    🐆
    {{emoji|1F406}}

    🐴
    {{emoji|1F434}}

    🐎
    {{emoji|1F40E}}

    🐮
    {{emoji|1F42E}}

    🐂
    {{emoji|1F402}}

    🐃
    {{emoji|1F403}}

    🐄
    {{emoji|1F404}}

    🐷
    {{emoji|1F437}}

    🐖
    {{emoji|1F416}}

    🐗
    {{emoji|1F417}}

    🐽
    {{emoji|1F43D}}

    🐏
    {{emoji|1F40F}}

    🐑
    {{emoji|1F411}}

    🐐
    {{emoji|1F410}}

    🐪
    {{emoji|1F42A}}

    🐫
    {{emoji|1F42B}}

    🐘
    {{emoji|1F418}}

    🐭
    {{emoji|1F42D}}

    🐁
    {{emoji|1F401}}

    🐀
    {{emoji|1F400}}

    🐹
    {{emoji|1F439}}

    🐰
    {{emoji|1F430}}

    🐇
    {{emoji|1F407}}

    🐻
    {{emoji|1F43B}}

    🐨
    {{emoji|1F428}}

    🐼
    {{emoji|1F43C}}

    🐾
    {{emoji|1F43E}}

    🐔
    {{emoji|1F414}}

    🐓
    {{emoji|1F413}}

    🐣
    {{emoji|1F423}}

    🐤
    {{emoji|1F424}}

    🐥
    {{emoji|1F425}}

    🐦
    {{emoji|1F426}}

    🐧
    {{emoji|1F427}}

    🐸
    {{emoji|1F438}}

    🐊
    {{emoji|1F40A}}

    🐍
    {{emoji|1F40D}}

    🐢
    {{emoji|1F422}}

    🐲
    {{emoji|1F432}}

    🐉
    {{emoji|1F409}}

    🐳
    {{emoji|1F433}}

    🐋
    {{emoji|1F40B}}

    🐬
    {{emoji|1F42C}}

    🐟
    {{emoji|1F41F}}

    🐠
    {{emoji|1F420}}

    🐡
    {{emoji|1F421}}

    🐙
    {{emoji|1F419}}

    🐚
    {{emoji|1F41A}}

    🐌
    {{emoji|1F40C}}

    🐛
    {{emoji|1F41B}}

    🐜
    {{emoji|1F41C}}

    🐝
    {{emoji|1F41D}}

    🐞
    {{emoji|1F41E}}

    💩
    {{emoji|1F4A9}}

    👹
    {{emoji|1F479}}

    👺
    {{emoji|1F47A}}

    👻
    {{emoji|1F47B}}

    👽
    {{emoji|1F47D}}

    👾
    {{emoji|1F47E}}

    💐
    {{emoji|1F490}}

    🌸
    {{emoji|1F338}}

    💮
    {{emoji|1F4AE}}

    🌹
    {{emoji|1F339}}

    🌺
    {{emoji|1F33A}}

    🌻
    {{emoji|1F33B}}

    🌼
    {{emoji|1F33C}}

    🌷
    {{emoji|1F337}}

    🌱
    {{emoji|1F331}}

    🌲
    {{emoji|1F332}}

    🌳
    {{emoji|1F333}}

    🌴
    {{emoji|1F334}}

    🌵
    {{emoji|1F335}}

    🌾
    {{emoji|1F33E}}

    🌿
    {{emoji|1F33F}}

    🍀
    {{emoji|1F340}}

    🍁
    {{emoji|1F341}}

    🍂
    {{emoji|1F342}}

    🍃
    {{emoji|1F343}}

    🍇
    {{emoji|1F347}}

    🍈
    {{emoji|1F348}}

    🍉
    {{emoji|1F349}}

    🍊
    {{emoji|1F34A}}

    🍋
    {{emoji|1F34B}}

    🍌
    {{emoji|1F34C}}

    🍍
    {{emoji|1F34D}}

    🍎
    {{emoji|1F34E}}

    🍏
    {{emoji|1F34F}}

    🍐
    {{emoji|1F350}}

    🍑
    {{emoji|1F351}}

    🍒
    {{emoji|1F352}}

    🍓
    {{emoji|1F353}}

    🍅
    {{emoji|1F345}}

    🍆
    {{emoji|1F346}}

    🌽
    {{emoji|1F33D}}

    🍄
    {{emoji|1F344}}

    🌰
    {{emoji|1F330}}

    🍞
    {{emoji|1F35E}}

    🍖
    {{emoji|1F356}}

    🍗
    {{emoji|1F357}}

    🍔
    {{emoji|1F354}}

    🍟
    {{emoji|1F35F}}

    🍕
    {{emoji|1F355}}

    🍲
    {{emoji|1F372}}

    🍱
    {{emoji|1F371}}

    🍘
    {{emoji|1F358}}

    🍙
    {{emoji|1F359}}

    🍚
    {{emoji|1F35A}}

    🍛
    {{emoji|1F35B}}

    🍜
    {{emoji|1F35C}}

    🍝
    {{emoji|1F35D}}

    🍠
    {{emoji|1F360}}

    🍢
    {{emoji|1F362}}

    🍣
    {{emoji|1F363}}

    🍤
    {{emoji|1F364}}

    🍥
    {{emoji|1F365}}

    🍡
    {{emoji|1F361}}

    🍦
    {{emoji|1F366}}

    🍧
    {{emoji|1F367}}

    🍨
    {{emoji|1F368}}

    🍩
    {{emoji|1F369}}

    🍪
    {{emoji|1F36A}}

    🎂
    {{emoji|1F382}}

    🍰
    {{emoji|1F370}}

    🍫
    {{emoji|1F36B}}

    🍬
    {{emoji|1F36C}}

    🍭
    {{emoji|1F36D}}

    🍮
    {{emoji|1F36E}}

    🍯
    {{emoji|1F36F}}

    ☕
    {{emoji|2615}}

    🍵
    {{emoji|1F375}}

    🍶
    {{emoji|1F376}}

    🍷
    {{emoji|1F377}}

    🍸
    {{emoji|1F378}}

    🍹
    {{emoji|1F379}}

    🍺
    {{emoji|1F37A}}

    🍻
    {{emoji|1F37B}}

    🍼
    {{emoji|1F37C}}

    🍴
    {{emoji|1F374}}

    🍳
    {{emoji|1F373}}

    🌍
    {{emoji|1F30D}}

    🌎
    {{emoji|1F30E}}

    🌏
    {{emoji|1F30F}}

    🌐
    {{emoji|1F310}}

    🌋
    {{emoji|1F30B}}

    🗻
    {{emoji|1F5FB}}

    🏠
    {{emoji|1F3E0}}

    🏡
    {{emoji|1F3E1}}

    ⛪
    {{emoji|26EA}}

    🏢
    {{emoji|1F3E2}}

    🏣
    {{emoji|1F3E3}}

    🏤
    {{emoji|1F3E4}}

    🏥
    {{emoji|1F3E5}}

    🏦
    {{emoji|1F3E6}}

    🏨
    {{emoji|1F3E8}}

    🏩
    {{emoji|1F3E9}}

    🏪
    {{emoji|1F3EA}}

    🏫
    {{emoji|1F3EB}}

    🏬
    {{emoji|1F3EC}}

    🏭
    {{emoji|1F3ED}}

    🏯
    {{emoji|1F3EF}}

    🏰
    {{emoji|1F3F0}}

    💒
    {{emoji|1F492}}

    🗼
    {{emoji|1F5FC}}

    🗽
    {{emoji|1F5FD}}

    🗾
    {{emoji|1F5FE}}

    ⛲
    {{emoji|26F2}}

    ⛺
    {{emoji|26FA}}

    🌁
    {{emoji|1F301}}

    🌃
    {{emoji|1F303}}

    🌄
    {{emoji|1F304}}

    🌅
    {{emoji|1F305}}

    🌆
    {{emoji|1F306}}

    🌇
    {{emoji|1F307}}

    🌉
    {{emoji|1F309}}

    🌊
    {{emoji|1F30A}}

    ♨
    {{emoji|2668}}

    🗿
    {{emoji|1F5FF}}

    🌌
    {{emoji|1F30C}}

    🎠
    {{emoji|1F3A0}}

    🎡
    {{emoji|1F3A1}}

    🎢
    {{emoji|1F3A2}}

    💈
    {{emoji|1F488}}

    🎪
    {{emoji|1F3AA}}

    🎫
    {{emoji|1F3AB}}

    🎭
    {{emoji|1F3AD}}

    🎰
    {{emoji|1F3B0}}

    🚂
    {{emoji|1F682}}

    🚃
    {{emoji|1F683}}

    🚄
    {{emoji|1F684}}

    🚅
    {{emoji|1F685}}

    🚆
    {{emoji|1F686}}

    🚇
    {{emoji|1F687}}

    🚈
    {{emoji|1F688}}

    🚉
    {{emoji|1F689}}

    🚊
    {{emoji|1F68A}}

    🚝
    {{emoji|1F69D}}

    🚞
    {{emoji|1F69E}}

    🚋
    {{emoji|1F68B}}

    🚌
    {{emoji|1F68C}}

    🚍
    {{emoji|1F68D}}

    🚎
    {{emoji|1F68E}}

    🚏
    {{emoji|1F68F}}

    🚐
    {{emoji|1F690}}

    🚑
    {{emoji|1F691}}

    🚒
    {{emoji|1F692}}

    🚓
    {{emoji|1F693}}

    🚔
    {{emoji|1F694}}

    🚕
    {{emoji|1F695}}

    🚖
    {{emoji|1F696}}

    🚗
    {{emoji|1F697}}

    🚘
    {{emoji|1F698}}

    🚙
    {{emoji|1F699}}

    🚚
    {{emoji|1F69A}}

    🚛
    {{emoji|1F69B}}

    🚜
    {{emoji|1F69C}}

    🚲
    {{emoji|1F6B2}}

    🚳
    {{emoji|1F6B3}}

    🚴
    {{emoji|1F6B4}}

    🚵
    {{emoji|1F6B5}}

    ⛽
    {{emoji|26FD}}

    🚨
    {{emoji|1F6A8}}

    ⚓
    {{emoji|2693}}

    ⛵
    {{emoji|26F5}}

    🚣
    {{emoji|1F6A3}}

    🚤
    {{emoji|1F6A4}}

    🚢
    {{emoji|1F6A2}}

    ✈
    {{emoji|2708}}

    💺
    {{emoji|1F4BA}}

    🚁
    {{emoji|1F681}}

    🚟
    {{emoji|1F69F}}

    🚠
    {{emoji|1F6A0}}

    🚡
    {{emoji|1F6A1}}

    🚀
    {{emoji|1F680}}

    🏧
    {{emoji|1F3E7}}

    🚮
    {{emoji|1F6AE}}

    🚥
    {{emoji|1F6A5}}

    🚦
    {{emoji|1F6A6}}

    🚧
    {{emoji|1F6A7}}

    🚫
    {{emoji|1F6AB}}

    🚭
    {{emoji|1F6AD}}

    🚯
    {{emoji|1F6AF}}

    🚰
    {{emoji|1F6B0}}

    🚱
    {{emoji|1F6B1}}

    🚷
    {{emoji|1F6B7}}

    🚸
    {{emoji|1F6B8}}

    ♿
    {{emoji|267F}}

    🚹
    {{emoji|1F6B9}}

    🚺
    {{emoji|1F6BA}}

    🚻
    {{emoji|1F6BB}}

    🚼
    {{emoji|1F6BC}}

    🚾
    {{emoji|1F6BE}}

    🛂
    {{emoji|1F6C2}}

    🛃
    {{emoji|1F6C3}}

    🛄
    {{emoji|1F6C4}}

    🛅
    {{emoji|1F6C5}}

    ⚠
    {{emoji|26A0}}

    ⛔
    {{emoji|26D4}}

    🚪
    {{emoji|1F6AA}}

    🚽
    {{emoji|1F6BD}}

    🚿
    {{emoji|1F6BF}}

    🛀
    {{emoji|1F6C0}}

    🛁
    {{emoji|1F6C1}}

    ⌛
    {{emoji|231B}}

    ⏳
    {{emoji|23F3}}

    ⌚
    {{emoji|231A}}

    ⏰
    {{emoji|23F0}}

    🕛
    {{emoji|1F55B}}

    🕧
    {{emoji|1F567}}

    🕐
    {{emoji|1F550}}

    🕜
    {{emoji|1F55C}}

    🕑
    {{emoji|1F551}}

    🕝
    {{emoji|1F55D}}

    🕒
    {{emoji|1F552}}

    🕞
    {{emoji|1F55E}}

    🕓
    {{emoji|1F553}}

    🕟
    {{emoji|1F55F}}

    🕔
    {{emoji|1F554}}

    🕠
    {{emoji|1F560}}

    🕕
    {{emoji|1F555}}

    🕡
    {{emoji|1F561}}

    🕖
    {{emoji|1F556}}

    🕢
    {{emoji|1F562}}

    🕗
    {{emoji|1F557}}

    🕣
    {{emoji|1F563}}

    🕘
    {{emoji|1F558}}

    🕤
    {{emoji|1F564}}

    🕙
    {{emoji|1F559}}

    🕥
    {{emoji|1F565}}

    🕚
    {{emoji|1F55A}}

    🕦
    {{emoji|1F566}}

    ♈
    {{emoji|2648}}

    ♉
    {{emoji|2649}}

    ♊
    {{emoji|264A}}

    ♋
    {{emoji|264B}}

    ♌
    {{emoji|264C}}

    ♍
    {{emoji|264D}}

    ♎
    {{emoji|264E}}

    ♏
    {{emoji|264F}}

    ♐
    {{emoji|2650}}

    ♑
    {{emoji|2651}}

    ♒
    {{emoji|2652}}

    ♓
    {{emoji|2653}}

    ⛎
    {{emoji|26CE}}

    🌑
    {{emoji|1F311}}

    🌒
    {{emoji|1F312}}

    🌓
    {{emoji|1F313}}

    🌔
    {{emoji|1F314}}

    🌕
    {{emoji|1F315}}

    🌖
    {{emoji|1F316}}

    🌗
    {{emoji|1F317}}

    🌘
    {{emoji|1F318}}

    🌙
    {{emoji|1F319}}

    🌚
    {{emoji|1F31A}}

    🌛
    {{emoji|1F31B}}

    🌜
    {{emoji|1F31C}}

    ☀
    {{emoji|2600}}

    🌝
    {{emoji|1F31D}}

    🌞
    {{emoji|1F31E}}

    ☁
    {{emoji|2601}}

    ⛅
    {{emoji|26C5}}

    🌀
    {{emoji|1F300}}

    🌈
    {{emoji|1F308}}

    🌂
    {{emoji|1F302}}

    ☔
    {{emoji|2614}}

    ❄
    {{emoji|2744}}

    ⛄
    {{emoji|26C4}}

    🌟
    {{emoji|1F31F}}

    🌠
    {{emoji|1F320}}

    🎲
    {{emoji|1F3B2}}

    ♠
    {{emoji|2660}}

    ♥
    {{emoji|2665}}

    ♦
    {{emoji|2666}}

    ♣
    {{emoji|2663}}

    🃏
    {{emoji|1F0CF}}

    🀄
    {{emoji|1F004}}

    🎮
    {{emoji|1F3AE}}

    ⚽
    {{emoji|26BD}}

    ⚾
    {{emoji|26BE}}

    🏀
    {{emoji|1F3C0}}

    🏈
    {{emoji|1F3C8}}

    🏉
    {{emoji|1F3C9}}

    🎾
    {{emoji|1F3BE}}

    🎱
    {{emoji|1F3B1}}

    🎳
    {{emoji|1F3B3}}

    ⛳
    {{emoji|26F3}}

    🎣
    {{emoji|1F3A3}}

    🎽
    {{emoji|1F3BD}}

    🎿
    {{emoji|1F3BF}}

    🏂
    {{emoji|1F3C2}}

    🏄
    {{emoji|1F3C4}}

    🏇
    {{emoji|1F3C7}}

    🏊
    {{emoji|1F3CA}}

    🏆
    {{emoji|1F3C6}}

    🔇
    {{emoji|1F507}}

    🔈
    {{emoji|1F508}}

    🔉
    {{emoji|1F509}}

    🔊
    {{emoji|1F50A}}

    📢
    {{emoji|1F4E2}}

    📣
    {{emoji|1F4E3}}

    📯
    {{emoji|1F4EF}}

    🔔
    {{emoji|1F514}}

    🔕
    {{emoji|1F515}}

    🔀
    {{emoji|1F500}}

    🔁
    {{emoji|1F501}}

    🔂
    {{emoji|1F502}}

    ▶
    {{emoji|25B6}}

    ⏩
    {{emoji|23E9}}

    ◀
    {{emoji|25C0}}

    ⏪
    {{emoji|23EA}}

    🔼
    {{emoji|1F53C}}

    ⏫
    {{emoji|23EB}}

    🔽
    {{emoji|1F53D}}

    ⏬
    {{emoji|23EC}}

    🎼
    {{emoji|1F3BC}}

    🎵
    {{emoji|1F3B5}}

    🎶
    {{emoji|1F3B6}}

    🎤
    {{emoji|1F3A4}}

    🎧
    {{emoji|1F3A7}}

    🎷
    {{emoji|1F3B7}}

    🎸
    {{emoji|1F3B8}}

    🎹
    {{emoji|1F3B9}}

    🎺
    {{emoji|1F3BA}}

    🎻
    {{emoji|1F3BB}}

    📻
    {{emoji|1F4FB}}

    📱
    {{emoji|1F4F1}}

    📳
    {{emoji|1F4F3}}

    📴
    {{emoji|1F4F4}}

    📲
    {{emoji|1F4F2}}

    📵
    {{emoji|1F4F5}}

    ☎
    {{emoji|260E}}

    📞
    {{emoji|1F4DE}}

    🔟
    {{emoji|1F51F}}

    📶
    {{emoji|1F4F6}}

    📟
    {{emoji|1F4DF}}

    📠
    {{emoji|1F4E0}}

    🎥
    {{emoji|1F3A5}}

    🎦
    {{emoji|1F3A6}}

    🎬
    {{emoji|1F3AC}}

    📺
    {{emoji|1F4FA}}

    📷
    {{emoji|1F4F7}}

    📹
    {{emoji|1F4F9}}

    📼
    {{emoji|1F4FC}}

    🔅
    {{emoji|1F505}}

    🔆
    {{emoji|1F506}}

    🔍
    {{emoji|1F50D}}

    🔎
    {{emoji|1F50E}}

    🔬
    {{emoji|1F52C}}

    🔭
    {{emoji|1F52D}}

    🔥
    {{emoji|1F525}}

    💡
    {{emoji|1F4A1}}

    🔦
    {{emoji|1F526}}

    🏮
    {{emoji|1F3EE}}

    🎃
    {{emoji|1F383}}

    🎄
    {{emoji|1F384}}

    🎆
    {{emoji|1F386}}

    🎇
    {{emoji|1F387}}

    ✨
    {{emoji|2728}}

    🎈
    {{emoji|1F388}}

    🎉
    {{emoji|1F389}}

    🎊
    {{emoji|1F38A}}

    🎋
    {{emoji|1F38B}}

    🎌
    {{emoji|1F38C}}

    🎍
    {{emoji|1F38D}}

    🎎
    {{emoji|1F38E}}

    🎏
    {{emoji|1F38F}}

    🎐
    {{emoji|1F390}}

    🎑
    {{emoji|1F391}}

    🎓
    {{emoji|1F393}}

    🎨
    {{emoji|1F3A8}}

    🎯
    {{emoji|1F3AF}}

    🎴
    {{emoji|1F3B4}}

    🎀
    {{emoji|1F380}}

    🎁
    {{emoji|1F381}}

    📔
    {{emoji|1F4D4}}

    📕
    {{emoji|1F4D5}}

    📖
    {{emoji|1F4D6}}

    📗
    {{emoji|1F4D7}}

    📘
    {{emoji|1F4D8}}

    📙
    {{emoji|1F4D9}}

    📚
    {{emoji|1F4DA}}

    📓
    {{emoji|1F4D3}}

    📒
    {{emoji|1F4D2}}

    📃
    {{emoji|1F4C3}}

    📜
    {{emoji|1F4DC}}

    📄
    {{emoji|1F4C4}}

    📰
    {{emoji|1F4F0}}

    📑
    {{emoji|1F4D1}}

    🔖
    {{emoji|1F516}}

    💰
    {{emoji|1F4B0}}

    💴
    {{emoji|1F4B4}}

    💵
    {{emoji|1F4B5}}

    💶
    {{emoji|1F4B6}}

    💷
    {{emoji|1F4B7}}

    💸
    {{emoji|1F4B8}}

    💱
    {{emoji|1F4B1}}

    💲
    {{emoji|1F4B2}}

    💳
    {{emoji|1F4B3}}

    💹
    {{emoji|1F4B9}}

    ✉
    {{emoji|2709}}

    📧
    {{emoji|1F4E7}}

    📨
    {{emoji|1F4E8}}

    📩
    {{emoji|1F4E9}}

    📤
    {{emoji|1F4E4}}

    📥
    {{emoji|1F4E5}}

    📦
    {{emoji|1F4E6}}

    📫
    {{emoji|1F4EB}}

    📪
    {{emoji|1F4EA}}

    📬
    {{emoji|1F4EC}}

    📭
    {{emoji|1F4ED}}

    📮
    {{emoji|1F4EE}}

    ✏
    {{emoji|270F}}

    ✒
    {{emoji|2712}}

    📝
    {{emoji|1F4DD}}

    💻
    {{emoji|1F4BB}}

    💽
    {{emoji|1F4BD}}

    💾
    {{emoji|1F4BE}}

    💿
    {{emoji|1F4BF}}

    📀
    {{emoji|1F4C0}}

    💼
    {{emoji|1F4BC}}

    📁
    {{emoji|1F4C1}}

    📂
    {{emoji|1F4C2}}

    📅
    {{emoji|1F4C5}}

    📆
    {{emoji|1F4C6}}

    📇
    {{emoji|1F4C7}}

    📈
    {{emoji|1F4C8}}

    📉
    {{emoji|1F4C9}}

    📊
    {{emoji|1F4CA}}

    📋
    {{emoji|1F4CB}}

    📌
    {{emoji|1F4CC}}

    📍
    {{emoji|1F4CD}}

    📎
    {{emoji|1F4CE}}

    📏
    {{emoji|1F4CF}}

    📐
    {{emoji|1F4D0}}

    📛
    {{emoji|1F4DB}}

    ✂
    {{emoji|2702}}

    🔒
    {{emoji|1F512}}

    🔓
    {{emoji|1F513}}

    🔏
    {{emoji|1F50F}}

    🔐
    {{emoji|1F510}}

    🔑
    {{emoji|1F511}}

    🔨
    {{emoji|1F528}}

    🔧
    {{emoji|1F527}}

    🔩
    {{emoji|1F529}}

    💉
    {{emoji|1F489}}

    🔪
    {{emoji|1F52A}}

    🔫
    {{emoji|1F52B}}

    🎒
    {{emoji|1F392}}

    🔋
    {{emoji|1F50B}}

    🔌
    {{emoji|1F50C}}

    🔗
    {{emoji|1F517}}

    🚬
    {{emoji|1F6AC}}

    💄
    {{emoji|1F484}}

    💍
    {{emoji|1F48D}}

    💎
    {{emoji|1F48E}}

    🔮
    {{emoji|1F52E}}

    🔯
    {{emoji|1F52F}}

    🔱
    {{emoji|1F531}}

    💊
    {{emoji|1F48A}}

    🔰
    {{emoji|1F530}}

    💯
    {{emoji|1F4AF}}

    📡
    {{emoji|1F4E1}}

    🏁
    {{emoji|1F3C1}}

    🚩
    {{emoji|1F6A9}}

    ⬆
    {{emoji|2B06}}

    ↗
    {{emoji|2197}}

    ➡
    {{emoji|27A1}}

    ↘
    {{emoji|2198}}

    ⬇
    {{emoji|2B07}}

    ↙
    {{emoji|2199}}

    ⬅
    {{emoji|2B05}}

    ↖
    {{emoji|2196}}

    ↕
    {{emoji|2195}}

    ↔
    {{emoji|2194}}

    ↩
    {{emoji|21A9}}

    ↪
    {{emoji|21AA}}

    ⤴
    {{emoji|2934}}

    ⤵
    {{emoji|2935}}

    🔃
    {{emoji|1F503}}

    🔄
    {{emoji|1F504}}

    🔙
    {{emoji|1F519}}

    🔚
    {{emoji|1F51A}}

    🔛
    {{emoji|1F51B}}

    🔜
    {{emoji|1F51C}}

    🔝
    {{emoji|1F51D}}

    ♻
    {{emoji|267B}}

    ⚡
    {{emoji|26A1}}

    ⭐
    {{emoji|2B50}}

    ⭕
    {{emoji|2B55}}

    ✅
    {{emoji|2705}}

    ☑
    {{emoji|2611}}

    ✔
    {{emoji|2714}}

    ✖
    {{emoji|2716}}

    ❌
    {{emoji|274C}}

    ❎
    {{emoji|274E}}

    ➕
    {{emoji|2795}}

    ➖
    {{emoji|2796}}

    ➗
    {{emoji|2797}}

    ➰
    {{emoji|27B0}}

    ➿
    {{emoji|27BF}}

    〽
    {{emoji|303D}}

    ✳
    {{emoji|2733}}

    ✴
    {{emoji|2734}}

    ❇
    {{emoji|2747}}

    ▪
    {{emoji|25AA}}

    ▫
    {{emoji|25AB}}

    ◻
    {{emoji|25FB}}

    ◼
    {{emoji|25FC}}

    ◽
    {{emoji|25FD}}

    ◾
    {{emoji|25FE}}

    ⬛
    {{emoji|2B1B}}

    ⬜
    {{emoji|2B1C}}

    🔶
    {{emoji|1F536}}

    🔷
    {{emoji|1F537}}

    🔸
    {{emoji|1F538}}

    🔹
    {{emoji|1F539}}

    🔺
    {{emoji|1F53A}}

    🔻
    {{emoji|1F53B}}

    💠
    {{emoji|1F4A0}}

    🔘
    {{emoji|1F518}}

    🔲
    {{emoji|1F532}}

    🔳
    {{emoji|1F533}}

    ⚪
    {{emoji|26AA}}

    ⚫
    {{emoji|26AB}}

    🔴
    {{emoji|1F534}}

    🔵
    {{emoji|1F535}}

    ‼
    {{emoji|203C}}

    ⁉
    {{emoji|2049}}

    ❓
    {{emoji|2753}}

    ❔
    {{emoji|2754}}

    ❕
    {{emoji|2755}}

    ❗
    {{emoji|2757}}

    〰
    {{emoji|3030}}

    🔞
    {{emoji|1F51E}}

    ©
    {{emoji|00A9}}

    ®
    {{emoji|00AE}}

    🔠
    {{emoji|1F520}}

    🔡
    {{emoji|1F521}}

    🔢
    {{emoji|1F522}}

    🔣
    {{emoji|1F523}}

    🔤
    {{emoji|1F524}}

    🅰
    {{emoji|1F170}}

    🆎
    {{emoji|1F18E}}

    🅱
    {{emoji|1F171}}

    🆑
    {{emoji|1F191}}

    🆒
    {{emoji|1F192}}

    🆓
    {{emoji|1F193}}

    ℹ
    {{emoji|2139}}

    🆔
    {{emoji|1F194}}

    Ⓜ
    {{emoji|24C2}}

    🆕
    {{emoji|1F195}}

    🆖
    {{emoji|1F196}}

    🅾
    {{emoji|1F17E}}

    🆗
    {{emoji|1F197}}

    🅿
    {{emoji|1F17F}}

    🆘
    {{emoji|1F198}}

    ™
    {{emoji|2122}}

    🆙
    {{emoji|1F199}}

    🆚
    {{emoji|1F19A}}

    🈁
    {{emoji|1F201}}

    🈂
    {{emoji|1F202}}

    🈹
    {{emoji|1F239}}

    🉑
    {{emoji|1F251}}

    🈴
    {{emoji|1F234}}

    🈺
    {{emoji|1F23A}}

    🉐
    {{emoji|1F250}}

    🈯
    {{emoji|1F22F}}

    🈷
    {{emoji|1F237}}

    🈶
    {{emoji|1F236}}

    🈵
    {{emoji|1F235}}

    🈚
    {{emoji|1F21A}}

    🈸
    {{emoji|1F238}}

    ㊗
    {{emoji|3297}}

    🈲
    {{emoji|1F232}}

    ㊙
    {{emoji|3299}}

    🈳
    {{emoji|1F233}}

    Themes: |theme=one |theme=noto |theme=twitter |theme=fx |theme=none. Sizes: |size=16 |size=24 |size=36.
    See also: the {{Emoji}} template & the emoji list on Commons.
    ^.^bBe..anyone 💩 08:54, 15 April 2016 (UTC)

    Tinyurl blacklisted

    hi, perhaps tinyurl.com shouldn't be a blacklisted domain, as it's used by the Query tool to generate short links? Linking to an example query is more complicated than it needs to be. Ambrosiani (talk) 15:53, 14 April 2016 (UTC)

    A better place to discuss that is probably at Meta, since all blacklisting is done from there. -- Innocent bystander (talk) 18:32, 14 April 2016 (UTC)

    Ok, thanks! Ambrosiani (talk) 19:18, 14 April 2016 (UTC)
    Actually, that page asks me to propose changing a project-specific whitelist, so perhaps this is the correct place after all.
    "Proposed removals
    Please check our list of requests which repeatedly get declined. Typically, we do not remove domains from the spam blacklist in response to site-owners' requests. Instead, we de-blacklist sites when trusted, high-volume editors request the use of blacklisted links because of their value in support of our projects. Please consider whether requesting whitelisting on a specific wiki for a specific use is more appropriate - that is very often the case."Ambrosiani (talk) 19:23, 14 April 2016 (UTC)
    Local whitelisting here on Wikidata is not a good idea at all. It may result in blacklisted links being transferred to Wikipedia et al. -- Innocent bystander (talk) 06:59, 15 April 2016 (UTC)
    Whitelisting generic URL shorteners is a bad idea, since it lets all spam filters be overridden (and links to malware pages etc. can easily be posted without the user having a real chance to notice before clicking them). Instead, why doesn't the tool use its own short URLs? --YMS (talk) 19:46, 14 April 2016 (UTC)
    Maybe because that would be more work. A generic WMF shortener for WMF sites could be helpful. 91.9.100.81 05:37, 15 April 2016 (UTC)
    To link from Wikidata to an example query, no URL shortener is needed. Just copy the full URL from the address bar and paste it here. This even requires fewer clicks than first creating a short URL and then copy-pasting that one. --Pasleim (talk) 07:50, 15 April 2016 (UTC)
    phabricator:T112715 is tracking switching to another shortener - ideally Wikimedia's own shortener. --Lydia Pintscher (WMDE) (talk) 09:33, 15 April 2016 (UTC)
    A wikimedia-specific shortener would be the perfect choice here I think. Thx everyone for enlightening me on the issue (short-term, I copied the full links). Ambrosiani (talk) 11:19, 15 April 2016 (UTC)

    Forking an item

    There are, for example, two versions of the painting The Last of England (Q958245); the one represented by that item, and another, in a different collection, for which there is no item yet. Are we near to having a tool which will make a copy of the existing item, so that an editor can then change the two or three differing parameter values? Is there an external tool that can do this? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:48, 14 April 2016 (UTC)

    In the mean time, see Help:Split an item. -- LaddΩ chat ;) 21:30, 14 April 2016 (UTC)
    Create subclasses! 91.9.100.81 05:39, 15 April 2016 (UTC)

    Magnus has kindly provided a script: User:Magnus Manske/duplicate item.js. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:25, 15 April 2016 (UTC)

    UI - identifiers at random places

    Why are some identifiers since weeks in section "identifiers", while many others are not? 91.9.118.86 20:57, 14 April 2016 (UTC)

    The discussion of whether a property qualifies as an identifier is taking place at User:Addshore/Identifiers and its subpages. --Pasleim (talk) 07:45, 15 April 2016 (UTC)
    Take a look at the responses I got, when I raised the issue, recently. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:27, 15 April 2016 (UTC)
    Since then, the number of properties with string data type dropped from 604 to 273! It might now be a good time to implement your proposal. Where would be a good place to do it? --Pasleim (talk) 12:16, 15 April 2016 (UTC)
    Thanks for the pointers. I would not like to edit in userspace, and Andy's proposal looks fine - but: Special:ListProperties/string, if expanded to show up to 500, shows 261 items. Looking at the very first, ISNI - does anyone really want to create a page with sections for cases like that? The sister ISIN is also listed there. There are 39 hits for " code", 57 for " id" and 24 for " number". Maybe these can be processed first / get a grouped approval? I created Wikidata:Conversion of datatype to external-id. Andy - would you agree that some bulk cleaning is done and then the page is used in the way you proposed? 91.9.100.81 15:44, 15 April 2016 (UTC)

    double interwiki link to EN-Wikibooks

    Hello, today I tried to make a double interwiki link from a Wikibooks page in Dutch to the English version of Wikibooks. In this case, there are two pages on en wikibooks which can both be linked. See https://nl.wikibooks.org/w/index.php?title=Engels/Grammatica/Voornaamwoorden&curid=29947&diff=303366&oldid=303365.

    However, only one interwiki link is visible on the page in Dutch. Could this be changed in some way? Thanks. Regards De Wikischim (talk) 08:31, 15 April 2016 (UTC)

    Well, it is technically possible, but you have to add some JS/CSS hacks to your wiki to allow more than one interwiki link per project. And you have to use methods other than (only) Wikidata sitelinks to get interwikis, since a Wikidata item can only provide one link to each project. -- Innocent bystander (talk) 12:02, 15 April 2016 (UTC)

    Foundational Model of Anatomy and Wikidata

    On the homepage of the project they write: "The Foundational Model of Anatomy ontology (FMA) is OPEN SOURCE and available for general use." The Foundational Model of Anatomy ontology contains valuable data about how different anatomical concepts are related to each other (holonyms, meronyms etc.). Given that the FMA IDs for anatomical concepts are widely available for Wikipedia, there would be high value in integrating the dataset into Wikidata.

    Unfortunately they don't specify a license on their website, so getting the data legally into Wikidata would mean asking the project owner for permission. Given that I'm a new user of Wikidata, I don't think I'm in a good position to send such a request.  – The preceding unsigned comment was added by ChristianKl (talk • contribs) at 11:38, 15 April 2016‎ (UTC).

    Apparently refers to [3]. The page at [4] includes a CC-by 3.0 licence statement. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:19, 15 April 2016 (UTC)

    How to donate an RDF file with etymological relationships to Wikidata

    Hello I am working on an IGE Wikimedia grant proposal (here) to create a database of etymological relationships between words (using data extracted from Wiktionary). I am using Dbnary as a tool to extract data from Wiktionary.

    Dbnary generates an RDF file (see below for the first lines of the output of Dbnary when applied to a sample of Wiktionary data), and I am wondering: how can I "donate" this data to Wikidata? This information will be helpful in structuring my grant proposal.

    Thanks!

    @prefix olia:  <http://purl.org/olia/olia.owl#> .
    @prefix lemon: <http://lemon-model.net/lemon#> .
    @prefix dbnary: <http://kaiko.getalp.org/dbnary#> .
    @prefix non-eng: <http://kaiko.getalp.org/dbnary/non/eng/> .
    @prefix rdf:   <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
    @prefix dcterms: <http://purl.org/dc/terms/> .
    @prefix lexvo: <http://lexvo.org/id/iso639-3/> .
    @prefix rdfs:  <http://www.w3.org/2000/01/rdf-schema#> .
    @prefix grc-eng: <http://kaiko.getalp.org/dbnary/grc/eng/> .
    @prefix lexinfo: <http://www.lexinfo.net/ontology/2.0/lexinfo#> .
    @prefix lat-eng: <http://kaiko.getalp.org/dbnary/lat/eng/> .
    @prefix eng:   <http://kaiko.getalp.org/dbnary/eng/> .
    
    eng:link  a              dbnary:Vocable ;
            dbnary:refersTo  eng:link__Verb__2 , eng:link__Noun__2 , eng:link__Verb__1 , eng:link__Noun__1 .
    
    eng:__ws_1_link__Noun__1
            a                   lemon:LexicalSense ;
            dbnary:senseNumber  "1"^^<http://www.w3.org/2001/XMLSchema#string> ;
            lemon:definition    [ lemon:value  "A connection between places, people, events, things, or ideas."@en ] .
    
    eng:link__Verb__1  a          lemon:LexicalEntry ;
            dbnary:partOfSpeech   "Verb" ;
            lemon:canonicalForm   [ lemon:writtenRep       "link"@en ;
                                    lexinfo:pronunciation  "/lɪŋk/"@en-fonipa
                                  ] ;
            lemon:language        "en" ;
            lemon:sense           eng:__ws_12_link__Verb__1 , eng:__ws_16_link__Verb__1 , eng:__ws_14_link__Verb__1 , eng:__ws_13_link__Verb__1 , eng:__ws_15_link__Verb__1 ;
            dcterms:language      lexvo:eng ;
            lexinfo:partOfSpeech  lexinfo:verb .
    Wikidata isn't prepared to accept dictionary relationships yet; please see WD:Wiktionary. --Izno (talk) 16:33, 15 April 2016 (UTC)
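    As an aside for readers unfamiliar with the Turtle syntax in the sample above: each @prefix line declares an abbreviation, so a prefixed name like eng:link expands to a full IRI. A minimal illustrative sketch of that expansion (plain Python, deliberately not a real Turtle parser - use a library such as rdflib for actual RDF processing):

    ```python
    # Sketch of Turtle prefix expansion, using prefixes declared in the
    # Dbnary sample above. Illustration only, not a full Turtle parser.
    PREFIXES = {
        "dbnary": "http://kaiko.getalp.org/dbnary#",
        "eng": "http://kaiko.getalp.org/dbnary/eng/",
        "lemon": "http://lemon-model.net/lemon#",
    }

    def expand(curie):
        """Expand a prefixed name like 'eng:link' to its full IRI."""
        prefix, _, local = curie.partition(":")
        return PREFIXES[prefix] + local

    print(expand("eng:link"))        # http://kaiko.getalp.org/dbnary/eng/link
    print(expand("dbnary:Vocable"))  # http://kaiko.getalp.org/dbnary#Vocable
    ```

    This is also why the sample can use both eng:link and the long form interchangeably: the prefix map is purely notational.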

    Europeana Art History Challenge

    As some of you might already know, there's a major WP translation and WD improvement competition that launched today - with the support of Wikiproject "Sum of all Paintings" and also CEE Spring 2016. [thank you!].

    The Europeana Art History Challenge!

    This is associated with the launch of Europeana's new art history channel, and consists of 10 high-quality artworks submitted by the Ministries of Culture of 30 countries across Europe. 300 artworks x 40 languages = a lot of Wikipedia articles about notable artworks! Personally, I find the diversity of the selections really interesting - there's everything from contemporary Irish video art to prehistoric Spanish cave art, with everything in between. Importantly, this project "lives" on Wikidata, and improving the quality and language coverage of the associated Wikidata items is equally part of the challenge. Points are awarded, prizes to be won, datasets to query and (hopefully) visualisations to produce!
    I believe this is actually the largest GLAM-related or Wikidata-based Wikimedia competition ever run, targeting over 10,000 language-artwork pairs. Head over to the homepage to investigate the works, the datasets, the participating countries, and the FAQ page, and sign up yourself!
    Yours in art-history, Wittylama (talk) 16:12, 15 April 2016 (UTC)

    Adding qualifiers for quantity statement with quickstatements

    I tried to add a quantity statement with date and source qualifiers using QuickStatements, but the qualifiers were not added (QuickStatements added just the rating and did not report any error). The goal was to add an Elo rating (P1087) of 3627 to player Q18653975, as of today, sourced from Go Ratings (Q23058744). The syntax I used is:

    Q18653975 P1087 3627 P585 +2014-04-15T00:00:00Z/11 S248 Q23058744

    Did I make a mistake in the syntax or is it a similar problem to this? Thank you for your help. --Wesalius (talk) 13:30, 15 April 2016 (UTC)

    It is related to the problem you linked and it's partly fixed - if you include a + before the number, qualifiers should work (see Topic:Sz6qqjgee86vqv1w). - Nikki (talk) 13:57, 15 April 2016 (UTC)
    Thanks
    Q18653975 P1087 +3627 P585 +2014-04-15T00:00:00Z/11 S248 Q23058744
    does the job. Is it okay to use stated in (P248) for the source or should I use imported from Wikimedia project (P143)? Thank you. --Wesalius (talk) 14:04, 15 April 2016 (UTC)
    @Magnus Manske: Could you add a note to the documentation about the use of "+", as described by Nikki, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:17, 15 April 2016 (UTC)
    The documentation does state that a quantity has the form [+-]d*, but it would be good if it literally said that the "+" or "-" sign is obligatory. --Wesalius (talk) 16:40, 15 April 2016 (UTC)
    Not everyone is fluent in regex... Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:36, 15 April 2016 (UTC)
    On a similar note, how would one add units to those quantities in QuickStatements? Mahir256 (talk) 21:02, 15 April 2016 (UTC)
    As far as I know, QuickStatements doesn't support adding units yet. --Wesalius (talk) 21:04, 15 April 2016 (UTC)
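    For anyone scripting batches like the one above, the sign rule can be encoded in small helpers. This is a sketch only - the function names are made up for illustration and are not part of QuickStatements; it merely reproduces the syntax discussed in this thread (explicit "+"/"-" on quantities, dates as +YYYY-MM-DDT00:00:00Z/precision, with precision 11 meaning day):

    ```python
    # Hypothetical helpers (not part of QuickStatements) encoding the syntax
    # discussed above: quantities need an explicit sign, dates use the
    # +YYYY-MM-DDT00:00:00Z/precision form (11 = day precision).
    def qs_quantity(value):
        return "%+g" % value  # "%+g" always emits the leading sign

    def qs_date(iso_date, precision=11):
        return "+%sT00:00:00Z/%d" % (iso_date, precision)

    line = " ".join(["Q18653975", "P1087", qs_quantity(3627),
                     "P585", qs_date("2014-04-15"),
                     "S248", "Q23058744"])
    print(line)
    # → Q18653975 P1087 +3627 P585 +2014-04-15T00:00:00Z/11 S248 Q23058744
    ```

    Generating the line programmatically avoids the silent failure Wesalius ran into when the sign is omitted.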

    Talk pages of constraint reports

    Just to let you know: Wikidata:Bot requests#Talk pages of constraint reports. --Succu (talk) 21:06, 15 April 2016 (UTC)

    Dealing with Spam

    Is there any sort of project to deal with spam items? By that I mean items that have no sitelinks and are meant to promote a person, business, charity, etc. I've seen quite a few when cleaning up constraint violations, but it seems like it would be better to identify spam items directly. It seems the items to look for are ones created with no sitelinks, with no uses as property values, or used only by other spam items (an item may be created for a company and its founder, say). I also suspect a lot are created by new editors as their first contribution. It seems that bots could be useful to identify likely spam items to check. Silverfish (talk) 23:06, 15 April 2016 (UTC)

    Some of them (although not all, I'm not sure what the exact criteria are) get added to User:Pasleim/notability. I'm not aware of anything else. - Nikki (talk) 06:40, 16 April 2016 (UTC)
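    A bot along the lines Silverfish describes could start from a simple heuristic. The sketch below is hypothetical - the item representation, field names and two-pass approach are made up for illustration - but it makes the thread's criteria concrete: flag items with no sitelinks that are referenced by nothing, then items referenced only by already-flagged items:

    ```python
    # Hypothetical spam heuristic mirroring the criteria in the thread.
    # `item` is an invented dict shape, not a real Wikidata API structure.
    def looks_like_spam(item, flagged_ids=frozenset()):
        if item["sitelinks"]:
            return False  # items with sitelinks are out of scope
        backlinks = set(item["referenced_by"]) - flagged_ids
        return not backlinks  # only referenced by spam (or by nothing)

    items = {
        "Q1001": {"sitelinks": [], "referenced_by": []},          # isolated
        "Q1002": {"sitelinks": [], "referenced_by": ["Q1001"]},   # used only by spam
        "Q1003": {"sitelinks": ["enwiki"], "referenced_by": []},  # has sitelink
    }

    flagged = {qid for qid, it in items.items() if looks_like_spam(it)}
    # second pass: items whose only references come from flagged items
    flagged |= {qid for qid, it in items.items() if looks_like_spam(it, flagged)}
    print(sorted(flagged))  # ['Q1001', 'Q1002']
    ```

    Anything flagged would of course only be a candidate list for human review, not for automatic deletion.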

    RateLimit

    Per this change, it is planned to introduce an edit rate limit for non-bot users on all Wikimedia projects. The background for this is that the bot flag once was introduced to allow some special accounts (namely the bots) to exceed this limit, though it apparently never actually existed. While this follow-up change already increases the number of allowed edits per time on Wikidata (10 edits per minute instead of 50 edits per 10 minutes), I seriously doubt this is enough given the nature of editing Wikidata. I just checked the edits of six randomly selected human users plus myself, and of those, everybody (!) exceeded that limit at least once in their latest 100 edits alone. If the change goes live like this, a large amount of users would need either sysop or higher status or the noratelimit right, or they would have to slow down their work. Personally I don't actually see the problem that is to be solved with these changes, but if they have to come, I believe the rate limit has to be much higher than proposed for Wikidata to still allow editing it in a reasonable way. Again pinging User:Lydia Pintscher (WMDE) in this issue, and this time also User:Jdforrester (WMF). --YMS (talk) 19:51, 13 April 2016 (UTC)

    While that limit is too low, I think it's quite reasonable for a rate limit to exist for non-bot/flood users. Perhaps something like 150 edits per 10 minutes would work. Anyone going quicker than that, at sufficient volume, should really be using the flood/bot flag in any case. --Yair rand (talk) 20:07, 13 April 2016 (UTC)
    (Checks ListGroupRights.) Apparently, flooders are not exempt from rate limits. This should be fixed, imo. Or at least the rate limits should be much higher for flooders. See also relevant discussion #Recent Changes flooding. Maybe the error message for going over the limits could include a link to the flood flag request form. --Yair rand (talk) 20:22, 13 April 2016 (UTC)
    That is less than what it currently is. What is it that you achieve in this way? Drive people away?? Thanks, GerardM (talk) 06:33, 14 April 2016 (UTC)
    150 edits every ten minutes is not less than 10 edits every minute. (Abandons thread in response to multiple accusations of bad faith within this thread.) --Yair rand (talk) 17:17, 14 April 2016 (UTC)
    This artificial limit is clear sabotage of Wikidata's growth. This should not be done until either
    1. All users would be able to assign flooder rights to themselves at any moment
      or
    2. There is a special page for bulk edits (by bulk edits I mean something like native QuickStatements, which allows to submit multiple edits as a single one).
    Are you seriously going to throttle thousands of active Wikidata users just because you hope to watch RecentChanges manually? What kind of censorship is this?
    I would like to confirm that registering as many new accounts as one needs for comfortable and productive Wikidata editing does not break the Wikidata:Alternate accounts rule. Would you like to see Lockal1, Lockal2, Lockal3, ..., Lockal100? I respect maxlag limitations, but the other limitations are a pure developer fiasco. There are hundreds of ways to improve Special:RecentChanges: grouping by user, grouping by edit type, excluding specific users, excluding edits to specific properties by user, adding custom AbuseFilter-like filters, excluding edits made by a specific OAuth tool, including/excluding edits with custom #hashtags, adding specific users to a watchlist, etc. But what do we have? Artificial rate limits? Seriously? --Lockal (talk) 09:01, 14 April 2016 (UTC)
    Please don't be hyperbolic. --Izno (talk) 13:21, 14 April 2016 (UTC)
    Wikidata seems like something of a special case, especially given our large use of semi-automated tools, as well as the fact that we cannot make a "batch edit". Is there a Phabricator task where we can raise our concern as an issue? --Izno (talk) 13:21, 14 April 2016 (UTC)
    Izno - YMS pointed to the phabricator tasks in the links posted at the start of this section. There is already a special case for wikidata, the question is whether the parameters being put forward (10 edits per minute) are the right ones or not. ArthurPSmith (talk) 13:32, 14 April 2016 (UTC)
    @ArthurPSmith: no, he pointed to Gerrit. Where is/are the Phab tasks documenting the change, if they even exist? --Izno (talk) 13:34, 14 April 2016 (UTC)
    Yes, I realized that right after I posted and was about to correct myself. Looks like it hasn't been discussed on phabricator at all as far as I can see. ArthurPSmith (talk) 13:37, 14 April 2016 (UTC)
    If I'm reading the code right, the limit on Wikidata would be 10 edits per second (!) and on all other wikis 50 edits per 10 minutes. With that I don't see a problem on Wikidata, but I'm concerned about other wikis, because it's possible for an experienced user on Wikipedia to reach this limit. --Pasleim (talk) 13:43, 14 April 2016 (UTC)
    Yeah, someone trying to make a small change to a number of pages could easily reach 50 edits in 10 minutes on any project, especially if they're using any tools designed to help fix things. Looking at the recent changes on Commons, there were several users there just in the last few minutes alone who would be over the limit. I don't edit much on Commons but even I would have gone over the limit recently when I was fixing a spelling mistake in a category (142 edits in 10 minutes) and I came close just categorising images (39 edits in 4 minutes). - Nikki (talk) 17:50, 14 April 2016 (UTC)
    I wonder what that means for Commons, where I often do mass changes of categories, for example. (I wonder how this interacts with cat-a-lot.) This change seems to be actively harmful, if the limits are that low. --Srittau (talk) 21:36, 14 April 2016 (UTC)
    I discussed this with James at the hackathon in Israel and my understanding from that discussion is that it should not affect any regular editor here. As Pasleim says 10/s. If we actually run into issues let's look into them at that point. If there are valid cases where we want to raise the limit we can do that. --Lydia Pintscher (WMDE) (talk) 13:59, 14 April 2016 (UTC)
    Okay, seems I misread the change. 10 edits per second is not much of a limit. It's probably hard to reach even with tools, and then they are somewhat likely to make inefficient usage of the API. I still don't see the need for this in itself, but if it would prevent Wikidata from being affected from any rate limits set for other projects, then it's good. --YMS (talk) 14:19, 14 April 2016 (UTC)
    I'm surprised we are still discussing this nonsense, as that is what it is to me. I did close to one million edits last year without a bot, by using tools like AutoList and Creator. I have suffered quite a lot from the limits that were set on AutoList, which made me start AutoList in 30 browser tabs to get some jobs done within a reasonable time. Slowing me down to one edit per minute is not getting Wikidata filled, and keeping Wikidata like a blank sheet is not going to help Wikimedia. What problem are we trying to solve, other than "the recent changes are getting filled"? They get filled anyway, and there is no (I repeat NO) sensible information that can be seen in the recent changes view. I also did about 1 million edits per month recently with my bot bit, but not everyone has the knowledge and tools to use Python or PHP, and people who abuse tools like AutoList to make a mess will do so anyway, also with a limit on their edits.
    On the other hand, I see Lydia talking about 10 edits per second; if tools like PetScan/AutoList/etc. are allowed to make 10 edits per second, we might actually be doing Wikidata a great favour. It's something that should have been done two years ago already, in my opinion. Edoderoo (talk) 18:23, 16 April 2016 (UTC)
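    On the tooling side, a client can stay under whatever limit is chosen with a simple sliding-window throttle, rather than hitting server-side errors. A minimal sketch (the numbers are examples, not the actual configured limits, and submit() is a hypothetical API call):

    ```python
    import time

    class RateLimiter:
        """Allow at most `limit` actions per `window` seconds (sliding window)."""
        def __init__(self, limit, window):
            self.limit = limit
            self.window = window
            self.stamps = []

        def wait(self):
            now = time.monotonic()
            # drop timestamps that have fallen out of the window
            self.stamps = [t for t in self.stamps if now - t < self.window]
            if len(self.stamps) >= self.limit:
                # sleep until the oldest timestamp leaves the window
                time.sleep(self.window - (now - self.stamps[0]))
            self.stamps.append(time.monotonic())

    # e.g. to keep a batch job under 10 edits per second:
    # limiter = RateLimiter(limit=10, window=1.0)
    # for edit in edits:
    #     limiter.wait()
    #     submit(edit)  # hypothetical API call
    ```

    Tools like QuickStatements or PetScan presumably do something similar internally; the point is only that a rate limit does not have to mean manual pacing by the user.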

    Simplified process for identifiers

    Instead of completing a proposal form, one could directly create an item for the identifier. Once it's ready, its content could just be copied to the property namespace.
    --- Jura 15:07, 16 April 2016 (UTC)

    How is this simpler? It would fork discussion yet again, and pollute wikidata with unreal and temporary items. I don't think this is a good suggestion at all. ArthurPSmith (talk) 01:02, 17 April 2016 (UTC)

    Please review item about Hungarian short film

    Can someone who speaks Hungarian please review Blood and High Heels (Q19766009)? I’m fairly certain the inscription (P1684) qualifier on Blood and High Heels (Q19766009)original language of work (P364)  English (Q1860) is incorrect, and the original language of work (P364) and title (P1476) statements seem suspicious as well. —Galaktos (talk) 12:51, 17 April 2016 (UTC)

    The huwiki article says it's English-speaking with Hungarian subtitles. – Máté (talk) 13:10, 17 April 2016 (UTC)
    The value given for inscription (P1684), 'magyar' is Hungarian for 'Hungarian'. Inscription and subtitle (on a film) are both 'felirat' in Hungarian. That must have caused the confusion. – Máté (talk) 13:14, 17 April 2016 (UTC)
    Ah, that explains it. Thanks a lot! —Galaktos (talk) 13:38, 17 April 2016 (UTC)

    Server switch 2016

    The Wikimedia Foundation will be testing its newest data center in Dallas. This will make sure Wikipedia and the other Wikimedia wikis can stay online even after a disaster. To make sure everything is working, the Wikimedia Technology department needs to conduct a planned test. This test will show whether they can reliably switch from one data center to the other. It requires many teams to prepare for the test and to be available to fix any unexpected problems.

    They will switch all traffic to the new data center on Tuesday, 19 April.
    On Thursday, 21 April, they will switch back to the primary data center.

    Unfortunately, because of some limitations in MediaWiki, all editing must stop during those two switches. We apologize for this disruption, and we are working to minimize it in the future.

    You will be able to read, but not edit, all wikis for a short period of time.

    • You will not be able to edit for approximately 15 to 30 minutes on Tuesday, 19 April and Thursday, 21 April, starting at 14:00 UTC (15:00 BST, 16:00 CEST, 10:00 EDT, 07:00 PDT).

    If you try to edit or save during these times, you will see an error message. We hope that no edits will be lost during these minutes, but we can't guarantee it. If you see the error message, then please wait until everything is back to normal. Then you should be able to save your edit. But, we recommend that you make a copy of your changes first, just in case.

    Other effects:

    • Background jobs will be slower and some may be dropped.

    Red links might not be updated as quickly as normal. If you create an article that is already linked somewhere else, the link will stay red longer than usual. Some long-running scripts will have to be stopped.

    • There will be a code freeze for the week of 18 April.

    No non-essential code deployments will take place.

    This test was originally planned to take place on March 22. April 19th and 21st are the new dates. You can read the schedule at wikitech.wikimedia.org. They will post any changes on that schedule. There will be more notifications about this. Please share this information with your community. /User:Whatamidoing (WMF) (talk) 21:07, 17 April 2016 (UTC)

    Again: Qualifier to determine "last update"

    I'm referring to this discussion I brought up in March: Qualifier for "last update".

    In a nutshell, I want to use a qualifier to express when a statement was true and was checked last time. More concretely, I'm talking about the number of games and goals a soccer player has made/scored for a certain team; see Q6451050#P54 where point in time (P585) is used for that. In the linked discussion, it seemed to me that this usage of point in time (P585) is considered appropriate.

    However, now I'm deliberating whether retrieved (P813) would be the better property to use for this purpose? Or is retrieved (P813) for sources only? Yellowcard (talk) 09:29, 18 April 2016 (UTC)

    You may access data at a time (P813) when it is no longer true; however, it was true at another point in time (P585) (and is still worth keeping). (And its constraints do indeed indicate that P813 is a source-only property.) – Máté (talk) 10:12, 18 April 2016 (UTC)
    @Máté: Did you have a look at Q6451050#P54 (member of sports team (P54)Montreal Impact (Q167615))? I'm talking about data for a player with respect to a team he still plays for. The number of games and goals increases week by week, and the old number is for sure not worth keeping :) Having a look at the constraints of the property is a good idea. Yellowcard (talk) 10:22, 18 April 2016 (UTC)
    @Yellowcard: No, not in this case, and not in many others, but e.g. for population and many others it is. And I think it's better to use the same qualifier in both instances. – Máté (talk) 10:33, 18 April 2016 (UTC)
    Regarding population I totally agree with you, data like that must not be deleted. So you support using point in time (P585) for the soccer data case as well? Yellowcard (talk) 12:40, 18 April 2016 (UTC)
    Yes, I'd support that for football too. – Máté (talk) 13:12, 18 April 2016 (UTC)
    This use is perfectly fine. point in time (P585) hasn't been added to the list of the accepted qualifiers for member of sports team (P54) mainly because some users keep mistaking it for start time (P580) or end time (P582).--Casper Tinan (talk) 14:03, 18 April 2016 (UTC)
    Mmmm, that's an awful constraint. This would contradict general Wikidata rules, and a local project should not be able to contradict the semantics of a qualifier that is available in any case. Like Denny (talkcontribslogs) said, the constraint system should not be an occasion to mimic very strong restrictions on what we can do. author  TomT0m / talk page 16:34, 18 April 2016 (UTC)

    Wikidata weekly summary #205

    wiki whatsapp group

    I stumbled on a Wikipedia project that facilitates interaction via a WhatsApp group that serves as a search engine: it gives you a response as you type in your question with a wiki keyword. But to my surprise, it seems the server for that facility is down now. What happened?  – The preceding unsigned comment was added by 197.211.53.12 (talk • contribs). 17:38, 10 April 2016‎ (UTC)

    Wikispecies relation between human and category for proposed taxa

    Hello everyone,

    do we have a property/system for linking Charles Darwin (Q1035) (species:Charles Robert Darwin) and species:Category:Charles Darwin taxa? Something similar to category for people born here (P1464) or category for recipients of this award (P2517). And we would also need something to link the category item to the human. -- T.seppelt (talk) 04:55, 19 April 2016 (UTC)

    No. It should be possible to put in a query for "taxon author = Charles Darwin" and generate a list. - Brya (talk) 06:00, 19 April 2016 (UTC)
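    For reference, Brya's suggested query can be written for the Wikidata Query Service. The sketch below assumes taxon author (P405) is modelled as a qualifier on the taxon name statement (P225) - verify the modelling before relying on it. It is shown as a Python string so it can be pasted into https://query.wikidata.org/; it is not executed here:

    ```python
    # Sketch of a WDQS query for taxa authored by Charles Darwin (Q1035),
    # assuming taxon author (P405) qualifies taxon name (P225) statements.
    # Not executed here; paste the string into https://query.wikidata.org/.
    QUERY = """
    SELECT ?taxon ?taxonLabel WHERE {
      ?taxon p:P225 ?statement .
      ?statement pq:P405 wd:Q1035 .
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    """
    ```

    A query like this would generate the list without any category linking, which is the point Brya is making.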
    For linking categories back to normal items, we have category combines topics (P971). Looking at the examples there, I'd expect something like category combines topics (P971): taxon author, Charles Darwin (Q1035). Not sure if there's an item for taxon author yet, I can't find one. On a related note, should Category:Taxa by author (Q4155659) and Q9295491 be merged or are they not quite the same? - Nikki (talk) 06:41, 19 April 2016 (UTC)
    Yes. merged. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:48, 19 April 2016 (UTC)
    We have Commons category (P373); we should have an equivalent "Wikispecies category" property. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:48, 19 April 2016 (UTC)
    We should very strongly resist any further categorizing or strong linking of categories. --Izno (talk) 11:31, 19 April 2016 (UTC)

    I mistakenly added properties to "Identifiers" instead of "Statements"

    I have added "image" properties to a hundred items and now I realize that I have put them in the wrong section: I guess "image" properties belong to "Statements" but I just added them at the bottom of the list, which often belongs to the "Identifiers" section instead. I guess many other contributors have made this kind of mistake. Is there a bot in charge of fixing this kind of mistakes? Or should I go fix them manually? Thanks! Syced (talk) 08:42, 19 April 2016 (UTC)

    Just reload one of the pages and you will notice that it appears in the correct part. It doesn't matter which link you use.
    BTW, WikiData Free Image Search Tool (Q21283216) is a convenient tool to find images and add them to Wikidata. (Sample link: Try it!)
    --- Jura 08:46, 19 April 2016 (UTC)
    However, this is a regression in the user interface. I as a user now have to decide which link to use, not knowing that it actually doesn't matter. And if the statement then ends up being shown in the wrong section after adding it, I have to think about what I did wrong and how to correct it, like Syced just did. There should be no unnecessary "Uh, I don't know where to click, maybe I'd better keep my hands off this" or "Oh my god, what did I break now?" moments when editing Wikidata. --YMS (talk) 08:52, 19 April 2016 (UTC)
    True, but I'm not quite sure what could be done to change that. Re-ordering the page after every addition would be highly annoying. As most of the time one just needs a link, there is a suggestion to place an additional link at the top of the page as well.
    --- Jura 09:03, 19 April 2016 (UTC)
    Jura: Wonderful tool! Would you happen to know if I can limit it to items whose latitude/longitude are in a particular area? Without using the "location" property, which is missing from many items. The "Discuss" link there is broken. Thanks a lot! Syced (talk) 09:49, 19 April 2016 (UTC)
    The "Wikidata query" "create a query" link has a sample for items 15km around Cambridge: AROUND[625,52.205,0.119,15] . Discussion is mostly done on User talk:Magnus Manske.
    --- Jura 10:12, 19 April 2016 (UTC)

    What property to use for external databases?

    When we want a link to external databases we usually create a new property. But sometimes it seems not worth it. For instance,

    1. Neurosynth database http://neurosynth.org uses PubMed ID (P698) as an identifier, see, e.g., http://neurosynth.org/studies/16516496/ To link to the Neurosynth entry I have used external data available at (P1325) with the URL. Would there be other more appropriate ways to make a connection to Neurosynth?
    2. My article Mining the posterior cingulate: Segregation between memory and pain components (Q21012713) is also available in the university database [5], apparently with the identifier c5052ad6-7618-48b6-9132-7b2704a7b0b7. It seems a bit of an overkill to have a property for each university repository.

    Finn Årup Nielsen (fnielsen) (talk) 13:53, 19 April 2016 (UTC)

    For #2 you can use full work available at (P953). --Succu (talk) 18:04, 19 April 2016 (UTC)

    "Official" vs. "national" languages of Switzerland

    As I'm talking about Switzerland, I remember another longstanding issue... the "official" and "national" languages. Officially, Switzerland has three official languages (German, French, Italian) and four national languages (German, French, Italian, Romansh). The difference being that Romansh (a small minority language), though officially designated a national language of Switzerland (and is used on the banknotes, for example), isn't a regular language in official publications and for correspondence with federal agencies, except for native Romansh speakers of the canton of Graubünden. Well, at Q39, there has been something like a very slow edit war over the years, and Romansh was added and removed as an "official language" (Property:P37) several times. Particularly Ludo29 was adamant in removing Romansh as an official language, and the topic has come up several times on the discussion page of Q39. I understand the reasoning of not listing Romansh as an "official language". However, it is a (so to speak: officially designated) national language, and Q39 contains also the name of the country in Romansh. I think that in some way, a relation between Switzerland and Q13199 (Romansh) should be maintained. But in what way, if not as an "official language"? Gestumblindi (talk) 22:09, 8 April 2016 (UTC)

    The Philippines also makes the same distinction between official and national languages. The official languages are English and Filipino, while Filipino is the sole national language. Although we don't have the edit war that CH's Wikidata item is having, I'm still interested in this discussion. If a different property is too much, maybe a qualifier would be better instead? —seav (talk) 02:30, 9 April 2016 (UTC)
    When Wikidata was young I added a number of languages to Q34#P37. The use of P31 as a qualifier is probably not right, but it can give you an idea of how it might work. I think Swedish sign language also has some kind of official status, but I never added that. -- Innocent bystander (talk) 06:14, 9 April 2016 (UTC)
    I can see two approaches to this dilemma: Either introduce a new property and tighten the description of official language (P37), or use official language (P37) with P794 (P794) as qualifier. Personally, I always prefer separate properties for separate concepts, but the latter solution could be used until such a property exists. --Srittau (talk) 10:28, 9 April 2016 (UTC)
    « I always prefer separate properties for separate concepts » - I agree with that. If a country says « Our official languages are DE, IT, FR », it's not Wikidata's job to say otherwise.
    Romansh has a special status; it's not an official language. Creating a specific property? Probably the best solution. Ludo29 (talk) 17:32, 9 April 2016 (UTC)
    A problem may be the definition of Property:P37. Its English description is "language designated as official by this item", but by "official", various things may be meant. The German label says "Amtssprache", which, more literally than "official language", could be translated as "language of the office" - an "Amt" being a (government) office, agency, or bureau. Romansh is not an official language in that sense (except for the native Romansh speakers), as e.g. a person from German-speaking Switzerland can't insist on having official correspondence in Romansh. But it is a national language, that means: officially recognized as a national language of Switzerland, together with the other three. That is why it also appears on banknotes and there is a Romansh version of the Swiss Government's website. Now, I also think there are two possible approaches, both involving a clearer definition of Property:P37. Either say: P37 is strictly for "Amtssprache", that is, languages that are approved for official correspondence, translations of laws etc., and create a new property "national language". Or say: P37 is actually broader and applies to any kind of officially recognized language - then use it for Romansh after all. Gestumblindi (talk) 18:17, 9 April 2016 (UTC)
    I do not think it works to add a new property for every system of "official language". We would then end up with 190 properties for national/official languages, one for each country.
    In Sweden we have "official minority languages". Some municipalities have to support Finnish, some have to support Meänkieli, and some have to support Sami. No municipality has to support Yiddish or Romani, but they are still official minority languages.
    In Norway they have different official written languages. -- Innocent bystander (talk) 20:02, 9 April 2016 (UTC)
    So, if we choose Gestumblindi's second proposal, we have to choose a new name for P37. It can't stay "official". Ludo29 (talk) 08:52, 10 April 2016 (UTC)
    Could we call it "officially recognized language", which would encompass "official" and "national" languages? Gestumblindi (talk) 23:28, 10 April 2016 (UTC)
    Imho, "official" has a very strong meaning.
    We don't build Wikidata just for ourselves; we build it also for external re-use. I think that if someone sees "official" on a property of a state, they'll think: OK, it's an official language.
    So, if we wish to list the languages of a state - and not only the official languages - we have to choose a word other than "official".
    So, not "officially recognized language", but "recognized language". Ludo29 (talk) 08:07, 11 April 2016 (UTC)
    Well, but recognized by whom? After all, Romansh is recognized by the Swiss constitution, it doesn't get much more official - but it happens to be not an "official language" in the sense of an Amtssprache; it seems that we're a bit stuck in that dilemma. For you, every wording containing "official" sounds too much like "language of the (federal) offices", but it seems to me that is not really the only meaning of official. Another idea if we stick with the notion of a new property: "Language officially recognized as a national language"? Gestumblindi (talk) 18:19, 13 April 2016 (UTC)
    By whom?
    Romansh is recognized by the Swiss constitution as a national language. Romansh is not recognized by the Swiss constitution as an official language. So we can't attach the qualifier "official" to Romansh in the context of the Swiss constitution. Ludo29 (talk) 10:41, 16 April 2016 (UTC)
    Well, as said before: only with a narrow definition that equates "official language" for Wikidata purposes with what the Swiss constitution means by an official language, that is, an Amtssprache. But maybe, as suggested by Innocent bystander ("I do not think it works to add a new property for every system of "official language". We would then end up with 190 properties for national/official language, one for each country"), the meaning of "official" in Property:P37 should be seen in a broader international context, including any kind of "officially" recognized language, among them the national languages of Switzerland - and maybe you could accept that too, if we clarify this with an appropriate description? Gestumblindi (talk) 22:32, 17 April 2016 (UTC)
    As I said before, Wikidata has external re-users, including Wikipedia. "Official" has a very strong meaning. We can't use "official" in Wikidata to mean something else. Ludo29 (talk) 09:37, 18 April 2016 (UTC)
    @Ludo29: In any case, I think we can agree that it is an unacceptable situation that there is currently no connection between the items for Switzerland and for Romansh, a national language of Switzerland recognized by the Swiss constitution. A connection needs to be created - using a broadly defined Property:P37 or a new property. As you're so strictly opposed to using P37, or indeed any property containing the word "official", we could create a property called "national language", but somehow I foresee confusion in an international context. Or we end up with a property that is only used for "national language" according to the definition of the Swiss constitution, not usable for any other country - a Switzerland-only property, also not quite an appealing prospect, wouldn't you say? Gestumblindi (talk) 22:18, 19 April 2016 (UTC)
    I think that the current situation is less bad than the earlier one, when Wikidata presented Romansh with the same status as German, French or Italian.
    You said "A connection needs to be created". Probably. But some other Wikidata contributors don't seem able to create a new property. Ludo29 (talk) 08:11, 20 April 2016 (UTC)

    Anthem

    anthem (P85) has similar problems. Sweden (Q34) has no anthem (or national hymn, as the label says in most languages). There is one song that is de facto used as the anthem, but it has no official recognition in any law. In contrast, the EU has an anthem, even though the EU isn't a nation. -- Innocent bystander (talk) 09:17, 11 April 2016 (UTC)

    Mass imports of data

    Is there a way that sets of property values can be set for many articles at once (after applying for approval, of course, and with pre-vetting by the community), perhaps by uploading a file containing those values in some format like CSV or JSONL, instead of poking them into the database one at a time with a bot? The sort of volume I'm thinking of is tens of thousands to hundreds of thousands of articles. -- The Anome (talk) 14:48, 19 April 2016 (UTC)

    I used QuickStatements https://tools.wmflabs.org/wikidata-todo/quick_statements.php to load up some hundreds of pages (Danish libraries). — Finn Årup Nielsen (fnielsen) (talk) 16:15, 19 April 2016 (UTC)
    I don't think that's possible; all the tools and the API can edit only one item at a time. But if the data are very interesting and useful, maybe the developers (@Lydia Pintscher (WMDE):) can invent something. --ValterVB (talk) 16:37, 19 April 2016 (UTC)
    @The Anome: I second the suggestion to use QuickStatements - and with a bot flag, which you should have no trouble obtaining, it works more quickly. If it's good enough for Magnus... Shout if you need help. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:39, 19 April 2016 (UTC)
    Thanks! Yes, QuickStatements + bot automation sounds like the way to go. -- The Anome (talk) 07:56, 20 April 2016 (UTC)
    Wikidata:Data donation has links and all for this. --Lydia Pintscher (WMDE) (talk) 09:03, 20 April 2016 (UTC)
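For context on how little tooling such an import needs: QuickStatements (version 1) accepts plain tab-separated item/property/value commands, so a CSV export can be converted with a few lines of script before pasting it into the tool. A minimal sketch, assuming hypothetical CSV column names (`qid`, `property`, `value`) and simple item/quantity values only:

```python
import csv
import io

def csv_to_quickstatements(csv_text):
    """Convert CSV rows with columns qid, property, value into
    QuickStatements V1 commands (one tab-separated line per statement)."""
    lines = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        lines.append(f"{row['qid']}\t{row['property']}\t{row['value']}")
    return "\n".join(lines)

# Hypothetical sample data, not a real donation set
sample = """qid,property,value
Q42,P31,Q5
Q64,P1082,3520031
"""
# prints two tab-separated lines: Q42 P31 Q5 / Q64 P1082 3520031
print(csv_to_quickstatements(sample))
```

String, date, and monolingual-text values need extra quoting in QuickStatements, so check the tool's documentation before running a real batch.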

    Navigation menu broken?

    In the last day or so (?) the navigation menu on the left-hand side of every Wikidata page seems to have broken - the "Main page" link goes to Main_Page instead of Wikidata:Main_Page, the query service link seems to have vanished, and I'm sure there was also a link to Project Chat. What happened? ArthurPSmith (talk) 18:26, 19 April 2016 (UTC)

    Ha - and right after I posted this comment it seems to have fixed itself ??? ArthurPSmith (talk) 18:28, 19 April 2016 (UTC)
    Reported at WD:DEV#Sidebar as well but I haven't seen any problems. Matěj Suchánek (talk) 19:01, 19 April 2016 (UTC)

    SQID as a tool for editors

    A main goal of the new SQID Wikidata Browser was to help editors find out about current property and class item usage, so I would like to announce it here as well. Besides satisfying your curiosity (e.g., to see which properties of type "string" are most common in qualifiers), you can use it to scan for errors or misunderstandings. Here are some examples:

    I am sure there are many other applications. Another helpful feature is the property pages, which show the qualifiers used with a property and several other statistics. For example, here is the property page for P969 (located at street address).

    Property usage numbers and class instance counts are updated once per hour (you may need to clear your browser cache sometimes). Individual entity pages show live data. First opening a page takes a few seconds to download the data, but browsing and filtering will be quite fast after that.

    --Markus Krötzsch (talk) 15:56, 20 April 2016 (UTC)

    Very nice, thanks! ArthurPSmith (talk) 20:09, 20 April 2016 (UTC)
    FYI, this tool makes connections to Google and JQuery. Dispenser (talk) 14:49, 21 April 2016 (UTC)

    Several updates and requests

    Hey all,

    1. With this commit in the Kian repo, we can run Kian models on new items. I run it weekly to add P31:Q5 for the English, German, Dutch, and French Wikipedias, and will add more wikis/statements later. Let me know if you want Kian running on a particular wiki/statement.
    2. I have been working on Wikilabels, the component that powers Wikidata:Edit labels. First, I solved some underlying performance issues. Now I have added the option of abandoning tasks from your workset; it needs some polishing, but it'll be there soon. I have two requests: 1. What do you want from Wikilabels? What is the most annoying bug or missing feature? 2. Right now, the ORES model for Wikidata is not the best. We need some human-labeled data, and we are pretty close to reaching our target (stats show 3483 out of 4283 edits have been labeled). Please check out Wikidata:Edit labels and label some edits.
    3. ORES is moving to the production cluster, which means it'll be more stable, perform better, etc. Most importantly, it will allow us to enable the ORES extension on wikis (including here). If you want to test the extension: go to the beta cluster, make an account (or log in), and enable it in beta features. Then you can check out recent changes and your watchlist. You can also change the ORES sensitivity in your preferences. We are still working on the move to production, but the datacenter switchover disrupted our work a little. I would also be more than happy to hear your comments, bug reports, and feature requests.

    Best Amir (talk) 00:22, 21 April 2016 (UTC)

    PHP OOP project using wikidata syntax on Wikipédia API

    Hello everybody and cheers from France. I would like to use Wikidata on my website for personal interest, and I would like to know how to get it working. For the moment I can query the API no problem, but I don't understand how to use the Wikidata syntax in my project. Thanks a lot, djos – The preceding unsigned comment was added by 128.78.173.38 (talk • contribs) at 23:12, 21 April 2016‎ (UTC).

    Hi djos, the JSON structure we use for our entities is documented at mw:Wikibase/DataModel/JSON. Cheers, Hoo man (talk) 06:27, 22 April 2016 (UTC)
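The entity JSON documented there keeps labels and descriptions keyed by language code, and statements ("claims") keyed by property ID; the same structure applies whatever language you parse it in (the asker mentions PHP; Python is used here for brevity). A minimal sketch of reading that shape, using an abbreviated hand-written sample rather than a full API response:

```python
import json

# Abbreviated sample in the shape documented at mw:Wikibase/DataModel/JSON
entity_json = """{
  "id": "Q42",
  "labels": {"en": {"language": "en", "value": "Douglas Adams"}},
  "claims": {
    "P31": [
      {"mainsnak": {"snaktype": "value", "property": "P31",
                    "datavalue": {"type": "wikibase-entityid",
                                  "value": {"entity-type": "item",
                                            "numeric-id": 5}}}}
    ]
  }
}"""

entity = json.loads(entity_json)
# Labels are keyed by language code; each entry repeats the language
label = entity["labels"]["en"]["value"]
# Claims are keyed by property; each statement wraps its value in a "mainsnak"
p31_targets = ["Q%d" % snak["mainsnak"]["datavalue"]["value"]["numeric-id"]
               for snak in entity["claims"]["P31"]]
print(label, p31_targets)  # Douglas Adams ['Q5']
```

A live response (e.g. from `action=wbgetentities`) additionally carries ranks, qualifiers, and references per statement, so guard real parsing code accordingly.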

    Error message when merging

    When trying to merge Q4315333 into Q7022760, Special:MergeItems generates an error warning: "Failed to merge items, please resolve any conflicts first. Error: Conflicting descriptions for language pl." --46.147.155.38 06:01, 22 April 2016 (UTC)

    You need to remove the Polish description from one of the items, then you can continue with the merge. You can access the Polish description by clicking on "More languages" below the box with the labels, description and aliases. Cheers, Hoo man (talk) 06:28, 22 April 2016 (UTC)
    Thank you, Hoo man --46.147.155.38 06:38, 22 April 2016 (UTC)
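A conflict of this kind can also be spotted programmatically before attempting the merge, by comparing the two items' descriptions per language. A hedged sketch - the items are shown as minimal dicts in the entity-JSON shape, and the Polish description texts are invented for illustration:

```python
def description_conflicts(item_a, item_b):
    """Return languages in which both items carry a description and the
    texts differ -- exactly the situation that blocks Special:MergeItems."""
    conflicts = []
    for lang, desc in item_a.get("descriptions", {}).items():
        other = item_b.get("descriptions", {}).get(lang)
        if other is not None and other["value"] != desc["value"]:
            conflicts.append(lang)
    return sorted(conflicts)

# Hypothetical description texts for the two items mentioned above
q4315333 = {"descriptions": {
    "pl": {"language": "pl", "value": "strona ujednoznaczniająca"}}}
q7022760 = {"descriptions": {
    "pl": {"language": "pl", "value": "strona ujednoznaczniająca Wikipedii"},
    "en": {"language": "en", "value": "Wikipedia disambiguation page"}}}

print(description_conflicts(q4315333, q7022760))  # ['pl']
```

Languages where only one item has a description (like "en" here) do not conflict; the merge simply keeps the existing one.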

    Historical place names

    You are welcome to participate in WikiProject Historical Place! The project will deal with using Wikidata as a historical gazetteer, describing places and place names over time.

    The first task is to find out about current practices. My very first question is: what properties are currently used to define historical place names, and with which qualifiers? Is name in native language (P1559) used only for people? --Susannaanas (talk) 13:39, 21 April 2016 (UTC)

    @Susannaanas: Use official name (P1448). Snipre (talk) 18:14, 21 April 2016 (UTC)
    @Snipre: Thank you for the recommendation! It helps halfway, but declaring unofficial names would also be needed. There is taxon common name (P1843), but that is limited to taxa. --Susannaanas (talk) 13:20, 22 April 2016 (UTC)
    official name (P1448) describes the official name in any language, and can be constrained by start time (P580) and end time (P582). native label (P1705) is used to describe the local name. This could be used to display the variety of naming conventions, but the purpose of the property does seem limited to one original local form. See the discussion about the property. If native label (P1705) cannot be used for different purposes, there is still a property missing. Susannaanas (talk) 08:21, 23 April 2016 (UTC)
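The pattern described above - official name (P1448) constrained by start time (P580) and end time (P582) - is a plain statement-with-qualifiers. A minimal sketch building QuickStatements-style command lines for it in Python; the qualifier syntax follows QuickStatements V1 as I understand it (check the tool's documentation), and the example years for "Christiania" are illustrative, not sourced:

```python
def name_statement(qid, name, lang, start=None, end=None):
    """Build a QuickStatements-style line: official name (P1448) as
    monolingual text, with optional start time (P580) / end time (P582)
    qualifiers at year precision."""
    parts = [qid, "P1448", f'{lang}:"{name}"']
    if start:
        parts += ["P580", f"+{start}-00-00T00:00:00Z/9"]  # /9 = year precision
    if end:
        parts += ["P582", f"+{end}-00-00T00:00:00Z/9"]
    return "\t".join(parts)

# Illustrative only: one of Oslo's historical names with approximate years
print(name_statement("Q585", "Christiania", "no", start="1624", end="1877"))
```

The same shape works for any "name over time" property the project settles on; only the property ID in the function would change.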

    Cities on the move?

    Found this diff, but I'm not sure if this is the right way to do it. I believe the correct way is to relate the city to the containing country, and then set up statements for that country. It is the country that has been part of a union at the state level, not the city (or county). There is also a question about this on the user's talk page. Jeblad (talk) 16:22, 21 April 2016 (UTC)

    I am also curious about the item Christiania (Q21711493), which seems to be the same thing under another name. Actually, Oslo (Q585) has changed its name several times: Viken, Áslo, Oslo, Christiania, Kristiania, and then Oslo again. I think all of this is a single thing with a long continuation, but with several names. I think that the item Christiania (Q21711493) should be deleted and the alternate names added to Oslo (Q585) with start time (P580) and end time (P582). Jeblad (talk) 16:52, 21 April 2016 (UTC)

    I have some doubts about using items like Union between Sweden and Norway (Q62589) for P17 at all. It should maybe rather be treated as a history of topic (P2184):from 1814-1905 in the item about Norway. This item is maybe not as obvious a case as Swedish Empire (Q215443), which is labelled "Swedish empire" in English but "The great power era" in Swedish. (Should that item therefore be split?) Using Sweden–Finland (Q3279296) for P17 would be even worse, since such a nation has never existed. It is a term used nowadays for an era when the midpoint of Sweden was located much further to the east than it is today. Finland was not even the name of the eastern part of the nation; it was the name of the area around Turku (Q38511).
    I have noticed the (K|Ch)ristiania item. An interesting note is made in the nowiki article: the city was on fire in 1624 and was rebuilt in another location and then renamed.
    In a similar way, Karl Johan city (Q10543698) was founded in 1812 but never became inhabited. Like Oslo, the city was moved and renamed, to Haparanda (Q588976). Using two items for the same city can therefore be an alternative.
    And maybe Åslo and Oslo should be treated as different ways to spell one name? Many Swedish names were changed in the late 19th and early 20th centuries, but that was more a change in the orthography of Swedish than a real change of the names. -- Innocent bystander (talk) 19:21, 21 April 2016 (UTC)
    Note that Medieval town of Oslo (Q11989332) and Old Town (Q4583418) are part of Gamle Oslo (Q1493178), which is part of Oslo (Q585). Jeblad (talk) 19:47, 21 April 2016 (UTC)
    Christiania (Q21711493) and Oslo (Q585) were the example I had in mind as well! I think that merging them into one would be the recommended way, from what I have been able to gather from the discussion around the topic. I will pick it as an example for the discussion. Cheers, Susannaanas (talk) 14:03, 22 April 2016 (UTC)

    Citizenship

    I'm wondering about the use of citizenship as a data field instead of either nationality or residency, because citizenship works differently in different parts of the world. It may be sufficient to reside in a country legally, while elsewhere there may be more requirements. Cheers, The Jolly Bard (talk) 21:27, 21 April 2016 (UTC)

    @The Jolly Bard: You don't need to understand anything, you just need sources. Snipre (talk) 15:22, 22 April 2016 (UTC)
    Use country of citizenship (P27) for that. Bear in mind that, for example, I am a citizen of the UK, I consider my nationality to be English, and I could reside anywhere, without that changing - so all three values can be different. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:39, 22 April 2016 (UTC)
    For residency we have residence (P551). --Pasleim (talk) 08:43, 23 April 2016 (UTC)

    gsw - Alemannic/Swiss German in labels and descriptions?

    Possibly Harry Canyon has already brought up this issue somewhere here, but as I couldn't find a discussion... Well, there is a recent discussion in German-language Wikipedia dealing with the meaningfulness of offering Wikidata labels/descriptions in Swiss German (gsw). Some background:

    • One of the four national languages of Switzerland is German.
    • There is a Swiss variant of Standard German, called Schweizer Hochdeutsch, Schriftdeutsch, or Swiss Standard German in English. The IETF language tag for that is de-CH. Its differences from the Standard German of Germany (de-DE) and that of Austria (de-AT) are comparable to the differences between British English and American English (slightly different vocabulary and spelling, probably most notably the absence of the letter ß, which is represented by ss).
    • There is also a variety of Alemannic German dialects in Switzerland that are grouped together as Schweizerdeutsch, Schwyzerdütsch, or Swiss German in English. The ISO 639-2 code for Swiss German is gsw, which, however, actually encompasses all Alemannic dialects, including the Alsatian dialect. As Swiss German is no single dialect, but a group of Alemannic dialects, there are notable differences e.g. between Bernese Swiss German and Basel Swiss German. There is no "Standard Swiss German", so the same words are written in totally different ways by different people depending on dialect. For example, a simple word such as "window" ("Fenster" in Standard German) might be written as Faischter, Fentschier, Finschter, Föischter and in many more variants, and all of them are Alemannic, or Swiss German (gsw).
    • There is an Alemannic Wikipedia. Its community has decided that authors can write in their own dialect. So, Alemannic Wikipedia contains articles e.g. in the Alemannic dialect of Southwest Germany, but also in Alsatian, and in many variants of Swiss German.

    Wikidata offers Swiss German (gsw) for labels and descriptions. Given the above, I (as a person from Switzerland) think that doesn't make much sense. There is currently no Swiss German label and description for Q35473 (window) - what should we add? Personally, based upon my Swiss German dialect, I would write Fänschter, and translate the English description transparent or translucent opening maybe as en Öffnig, wo duursichtig isch oder öppis duureschimmere lood. My neighbour most probably would write it completely differently. My opinion therefore is:

    • Don't offer labelling/describing in generic gsw. Writing in many dialects may work in a Wikipedia, but I think it's a nightmare for a database such as Wikidata, where a label in gsw would be in a random, unpredictable dialect - and if a user would prefer a different dialect of gsw, who says which one should be chosen?
    • It might, however, be possible and somewhat meaningful to offer Swiss Standard German (de-CH). For example, Fussball instead of Fußball, or Abwart instead of Hausmeister. But I'm not sure that it's really needed.

    Anyway, my proposal is to remove gsw from Wikidata for the time being. Is there some other, more official way than this page? Who should be notified? And, of course, I would love to read reasons for keeping it here, and ideas on how to proceed in that case, given the variety of dialects encompassed by gsw. To start, I will notify some users involved in the previous discussion on German-language Wikipedia. Gestumblindi (talk) 20:15, 8 April 2016 (UTC)

    Gestumblindi - Does the same issue apply to the 'als' language label (there is an 'als' wiki but no 'gsw' wiki, so are they the same thing or not quite?) One solution here without removing 'gsw' might be to use the "Also known as" labels for spellings in different dialects; that way searches will find the entries however the main label is spelled. If there could be a default (most common?) dialect selected for the main labels that would be nice but it sounds like that might not be an option here? ArthurPSmith (talk) 21:21, 8 April 2016 (UTC)
    ArthurPSmith Yes, they are the same thing, for all practical purposes. If I remember correctly, the Alemannic Wikipedia is using als instead of gsw as a label for historical reasons (because they started as an Alsatian Wikipedia) and probably because gsw, although officially encompassing all Alemannic dialects (including those from the neighbouring regions in Germany, France, and Austria), looks too much like "Switzerland-only" (German-Swiss). But Holder who is the expert for Alemannic Wikipedia can probably say more regarding that or may correct me (I already notified him of this thread). To complicate matters, als is actually the ISO 639-3 code for Tosk Albanian, but we don't have a Tosk Albanian Wikipedia as yet... - No, I don't see a "most common" dialect as an option here. There is no such Alemannic dialect. And apart from this, even if you stay with one dialect, people spell words differently. A person from Basel might feel inclined to spell "window" as Fänschter, but Fänster would also be possible, for example (the pronunciation in both cases being the same). So, it wouldn't help much to add "Also known as" for Basel German, Zürich German, Walliserdeutsch, Alsatian etc., because there is no fixed spelling for any of them (there are attempts at standardisation, but even e.g. the Alsatian Orthal isn't an attempt to create an unified Standard Alsatian, but a standardised way of describing regional variants). As I see it, the "Also known as" approach would result in a list of arbitrary spelling variants for a gsw entry (potentially dozens each) which still would be likely to not contain the actual spelling variant which a random user would choose to search for. Gestumblindi (talk) 21:46, 8 April 2016 (UTC)

    Dear all.

    Today the code gsw is the most important language code for Alemannic (there are three other codes for some special variants of Alemannic). The Alemannic Wikipedia was created at als.wikipedia.org instead of gsw.wikipedia.org for historical reasons, because there have been several revisions of the language code gsw. In the first revision, gsw was the code for Swiss German. In 2003, als.wikipedia.org was created as the Alsatian Wikipedia (Alsatian is part of Alemannic but not part of Swiss German), and in that year gsw didn't include Alsatian. But after some revisions of Ethnologue, gsw now also includes Alsatian and some other Alemannic variants (unfortunately not all of them).

    For about ten years there have been discussions about moving als.wikipedia.org to gsw.wikipedia.org. But as the Norwegian (Bokmål) Wikipedia also has to be moved, from no.wikipedia.org to nb.wikipedia.org, there are huge technical difficulties that haven't been solved. On translatewiki gsw was used for Alemannic from the beginning, and on Commons and Meta gsw is also used for Alemannic translations. And on Wikidata as well!

    So for about ten years we have had this contradiction: the als code in the URL of the Alemannic Wikipedia, and gsw in the software for all Alemannic translations. So please don't remove gsw from Wikidata!

    --Holder (talk) 04:06, 9 April 2016 (UTC)

    That a language does not have a well-defined or consistent orthography (Q43091) does not look like a good reason to exclude it from Wikidata. Such a rule could potentially exclude many languages from Wikidata. Even some languages with a well-defined orthography can have a set of words with more than one accepted spelling. Take, for example, a look at the Swedish (sv) spelling of "Support" in our templates here at Wikidata compared with those on Commons. Both are right, and we do not have to be consistent between two different projects. The Swedish language has two language academies, one in Sweden and one in Finland. I have proposed that Wikidata should also support the Finnish variety (sv-fi). I would not oppose support for Estonian and Ukrainian Swedish, but I have not come across a single user who speaks or writes those varieties. -- Innocent bystander (talk) 06:07, 9 April 2016 (UTC)
    Innocent bystander: Swedish orthography is well defined, I think - there are variants, but all of them are well defined, and one can choose one of them. Just like de-DE and de-CH - both the Standard German of Germany and Swiss Standard German are well defined; a person from Switzerland writes "Fussball" whereas German and Austrian people use the spelling "Fußball", no problem. The case of the Alemannic dialects is different. Actually, to support gsw, Wikidata would need something like a three- to four-tiered structure for it, for example again for "window" ("Fenster" in Standard German):
    • Alemannic (gsw)
      • Swiss German
        • Basel German
          • Basel German spelling variant 1: Fenschter
          • Basel German spelling variant 2: Fänschter
          • Basel German spelling variant 3: Fänster
          • Basel German spelling variant ...
        • Luzern German
          • Luzern German spelling variant 1: Fönschter
          • Luzern German spelling variant 2: Fünschter
          • Luzern German spelling variant 3: Faischter
          • Luzern German spelling variant 4: Feischter
          • Luzern German spelling variant ...
      • Alsatian
        • Mulhouse German
          • Mulhouse German spelling variant 1: Fënster [not sure about that, just as an example]
      • Vorarlberg German
    ... and so on. And you don't have an academy that defines spelling variants. Every Swiss person, when writing Swiss German, just writes as they think the word sounds in their dialect. Being from Switzerland myself, I don't think this is realistically compatible with Wikidata. And I would never actually look for anything here in Swiss German, as Swiss German is a widely spoken language that, however, is not usually written in a formal context (such as a knowledge database). The Alemannic Wikipedia is a project that didn't originate in Switzerland, and for a long time it was hard for me to even understand its purpose. I now see its value as a cultural project, especially for the regions of Alsace and southern Germany where the existence of the Alemannic dialects is threatened (in the German-speaking part of Switzerland, everyone speaks it every day, but uses it in writing only in informal contexts such as mobile phone messages or personal e-mails). But trying to use a language/dialect with all its sub-dialects and a myriad of possible spellings for each word for a database such as Wikidata... can that work? Gestumblindi (talk) 13:42, 9 April 2016 (UTC)
    You know Gestumblindi, that is the same problem as writing encyclopedic articles on Alemannic Wikipedia. We have different dialects with several possible spellings for every dialect. And have a look at alswp: that works! It's a wiki. Everybody can write his/her own dialect with his/her own spelling. Why shouldn't it also work on Wikidata? Of course there will be different dialects with different spellings (not myriads but some). But what's the problem with that? --Holder (talk) 15:17, 9 April 2016 (UTC)
    Holder Well, yes, I think it's different... Alemannic Wikipedia's headwords (lemmata) are in Standard German, after all. For example, the article about television is als:Fernsehen. The title is displayed in an Alemannic dialect as "Färnsee" using the template {{Titel|Färnsee}}, but the actual article headwords are all maintained in Standard German. So even the Alemannic Wikipedia uses Standard German for the "technical backend", so to speak. And what else is Wikidata? A more sophisticated technical backend, a collection of structured data used for generating and linking content, that needs to contain predictable data following clear criteria. Just as you decided to use Standard German for the headwords in the Alemannic Wikipedia - I suppose for practical reasons of feasibility - we should look at what is really feasible in Wikidata. As I tried to outline above, we have basically two choices here: either let gsw stand as it currently is (but, I think you would agree, rename it from Swiss German to Alemannic?) and let people just enter labels and descriptions in their respective dialect, arbitrarily. So, the "window" entry for gsw might contain either "Fenschter" or "Feischter" or whatever, and I assume we would simply stick with the first variant entered. I have to say that this approach doesn't really convince me. The other approach, with finely tuned attributions such as "Zürich German", "Alsatian" etc., would be very complex, and - at least it seems to me - of little practical use. Gestumblindi (talk) 15:56, 9 April 2016 (UTC)
    Gestumblindi, well, that's also different :-)). The headwords are written in Standard German because the articles are thus easier to find via the search function. But what actually would be the practical use of such descriptions?
    Another point is: would you exclude from Wikidata all languages with different variants and without a well-defined spelling because of this problem? Then most languages in the world would be excluded. --Holder (talk) 17:43, 9 April 2016 (UTC)
    Holder: I think (Wikidata experts, please correct me if necessary) that the purpose of Wikidata labels and descriptions is twofold: One use is helping users to find and identify items using their own language; the other, maybe, to generate "information units" such as infoboxes and the like in various projects, also outside of Wikimedia. For all these purposes, I think Alemannic is ill suited: For finding the item for "window" in Alemannic, we would need to add every likely variant from Fänschter to Föister. For identifying, we don't need any Alemannic at all, as there are no "Alemannic-only" speakers. Alemannic speakers read and write Standard German (or French) as well, so we can identify any item with a Standard German (or French) description. Finally, generating infoboxes etc. out of an arbitrary mix of Alemannic dialects, resulting in a "mixed dialect" that doesn't exist anywhere in reality, would be a bad idea too, IMHO. - Regarding your other point: I think we need to look at each language individually. For now, this discussion is about the specific circumstances of Alemannic and not about "all languages ... with different variants and without a well defined spelling". There might be other surrounding conditions that make it meaningful to use a language in Wikidata, but here I still don't really see the point, or simply: the feasibility. But, well - which approach would you support? How do you think that Alemannic should be used in Wikidata, if you think it's possible in a meaningful way? Rather a "stick with one variant" or "add many more variants" approach? Gestumblindi (talk) 18:03, 9 April 2016 (UTC)
    NB! The titles of the pages on alswiki can be found in the sitelinks, and you are free to use the de labels on Wikidata in your templates on alswiki if you like. Any label in any language can be reached with the help of a Lua module. The Lua modules we have on svwiki have fallback support that allows any language to be displayed, not only English and the other Scandinavian languages. If the only label available is in Urdu, the Urdu label is displayed. -- Innocent bystander (talk) 19:50, 9 April 2016 (UTC)
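    The fallback behaviour described above can be sketched as follows. This is a minimal illustration, not the actual svwiki Lua module; the label data and the fallback chain are invented:

```python
def get_label(labels, fallback_chain):
    """Return the first available label along a language fallback chain.

    `labels` maps language codes to label strings, as in a Wikidata entity;
    `fallback_chain` lists language codes in order of preference.
    """
    for code in fallback_chain:
        if code in labels:
            return labels[code]
    # Last resort: display whatever label exists at all,
    # e.g. the Urdu label if that is the only one available.
    return next(iter(labels.values()), None)

# A Swedish wiki might prefer sv, then the other Scandinavian
# languages, then English.
labels = {"ur": "example Urdu label"}
print(get_label(labels, ["sv", "da", "nb", "en"]))  # falls back to the Urdu label
```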
    Gestumblindi As I understood it, Wikidata was created to support small language projects. So if somebody wants to create infoboxes on the Alemannic Wikipedia, he can have a look at the descriptions used and correct them. But if Alemannic is removed from the languages here, there is no possibility to create such infoboxes on als:wp. I think we could discuss this again when there are real plans for such automatically created templates on als:wp.
    My impression is that very few people are creating descriptions in Alemannic (gsw) here on Wikidata. So I think that, in fact, there is no problem with different spellings. Perhaps such problems as you described could arise in the future, but should we implement such unsatisfying solutions just because there may be problems one day? --Holder (talk) 12:41, 10 April 2016 (UTC)
    Holder: I do not think that "people aren't really using it, and therefore we don't have any problems" is a really strong argument for keeping ;-) - it seems that, conversely, you'd concede that it would become problematic in its current form as soon as it would be more widely used? But the problems are already starting, every time one actually thinks of adding a "Swiss German" (Alemannic) label and description. After all, the starting point of this discussion was the actual question by Harry Canyon in the German-language WP discussion, where he asked for a translation for Wikidata: Wie schreibt man auf schweizerdeutsch „weiblicher Vorname“?, "How do you write female given name in Swiss German?" - and we can't answer that question. de:User:= (yes, = is his username ;-) ) replied, somewhat tongue-in-cheek: Schweizerdeutsch existiert nicht - "there is no Swiss German". Meaning that there is no unified Swiss German, of course. Harry could write wybliche Vorname, wiibliche Vorname, wibliche Vorname, wyblige Vorname, wyblige Voorname etc. Well, currently there still is no Swiss German label and description in Q11879590. Will you enter some? And then we just have to accept this as the given variant? Or have we to enter more? That's the question (one of the questions)... Gestumblindi (talk) 23:44, 10 April 2016 (UTC)
    Gestumblindi: Do you really think that discussions are a problem? I've now added an Alemannic description in Q11879590.
    The funny thing about this question is that if you ask Swiss people how to write something in Swiss German, they will always say "one can't write Swiss German", "Swiss German does not exist" or something like that. But nearly every German-speaking Swiss writes Swiss German in informal writing nearly every day ... --Holder (talk) 05:24, 11 April 2016 (UTC)
    Holder: Isn't that exactly the problem? "In informal writings", yes, but: Wikidata is formal. The most formal of all Wikimedia projects - structured data that needs to be predictable, I'd say. And no, of course I don't think that "discussions are a problem". What I tried to say was that this specific discussion shows us an example of the problem (there is no defined variant of spelling of wiibliche Vorname or the like, and the variant you entered is just one of lots of possible ones), and this is a problem that will continue with the present state of things. Gestumblindi (talk) 11:10, 11 April 2016 (UTC)
    Gestumblindi: You're confusing 'formal language' as a sociolinguistic register with w:en:Formal language. If you think that only languages with a well-defined standard should be used on Wikidata, you exclude about 95 % of all languages in the world. Why shouldn't we also try it with non-standard languages? What's the problem? On Wikipedia we discuss conflicting opinions and try to find a solution. Why shouldn't that work on Wikidata?
    Perhaps the real problem is that Wikidata tries to match natural language with formal language. But natural languages don't work that way. Even well defined standard languages are not fully "predictable". This applies only to constructed languages. --Holder (talk) 16:26, 11 April 2016 (UTC)
    Holder: Well, let's return to simple practicalities: Maybe Wikidata is indeed too inflexible by assuming that it's possible to use one well-defined variant for labels and descriptions in each language. But how to solve the problem? To repeat some questions, I would be interested in your answers: You have added the label "wyblige Vorname" and the description "e Vorname, wu Fraue gee wird" to Q11879590. Is your approach for future additions of gsw labels and descriptions "first come, first serve": The first variant entered stays, be it e.g. Walliserdütsch, Züridütsch, Badisch, or Alsatian? Or would you suggest some other approach? And if we keep gsw, should it continue to be called "Swiss German" here in Wikidata, or should it be changed to "Alemannic"? Gestumblindi (talk) 21:56, 11 April 2016 (UTC)
    Gestumblindi: "First come, first served" is indeed the approach I would prefer. And of course, 'gsw' should be renamed to "Alemannic". Thanks and best regards. --Holder (talk) 03:47, 12 April 2016 (UTC)
    As I said above, it is technically possible to create a code for each dialect and use them all in one wiki together with the main "de" labels and descriptions. They can all be used in the infoboxes if you like. -- Innocent bystander (talk) 06:42, 12 April 2016 (UTC)
    Holder, Innocent bystander: Well, to have some first productive outcome of this discussion, let's change Swiss German to Alemannic, I'd say. Where is this information stored? I notice that already, when I visit e.g. Q11879590 with my language set to Alemannic, gsw is displayed as Alemannisch. When I set my language to English, it's displayed as Swiss German. And then, of course Innocent bystander's approach would be interesting. Are there any (semi-)standardized codes that could be used for e.g. Baseldytsch, Züridütsch etc.? Gestumblindi (talk) 21:12, 12 April 2016 (UTC)
    @Innocent bystander:: Sorry for pinging you again, but before this gets archived.... I think at least the participants of this discussion can agree upon displaying gsw as "Alemannic" (which includes Swiss German and other Alemannic dialects) instead of Swiss German. I think you are the Wikidata expert in this discussion... so... can you help us with that? What has to be changed where? Gestumblindi (talk) 22:36, 17 April 2016 (UTC)
    @Gestumblindi: From what I know, gsw is an ISO 639-2 code. See en:List of ISO 639-2 codes. And I do not think that it is a good idea to change those definitions. And if our software isn't consistent, we probably have to report it to our developers. We have at least one other project with a similar problem: the language of nowiki is Bokmål, but the ISO code of Bokmål is nb. To make it even worse, the nowikisource project is not in Bokmål, it is in Norwegian, which has the code no.
    I do not know if there are any standardized codes for Baseldytsch, Züridütsch etc. I do not even know if there are any standardized codes for Sweden-Swedish either, but standardized software uses sv-se for that and sv-fi for Finland-Swedish. To take that further, I could probably propose sv-ee and sv-ua without too many objections for Estonia- and Ukraine-Swedish. Such codes are not within our software yet; you have to propose them to our developers. @Adrian Heine (WMDE):. -- Innocent bystander (talk) 07:11, 18 April 2016 (UTC)
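    Region-subtagged codes like sv-se and sv-fi follow the BCP 47 convention of truncating subtags from the right to find a fallback. A small sketch of that rule; this is an illustration, not MediaWiki's actual fallback logic:

```python
def fallback_chain(code):
    """Build a simple fallback chain for a region-subtagged language code,
    e.g. "sv-fi" -> ["sv-fi", "sv"], by truncating subtags from the right."""
    parts = code.split("-")
    return ["-".join(parts[:i]) for i in range(len(parts), 0, -1)]

print(fallback_chain("sv-fi"))  # ['sv-fi', 'sv']
print(fallback_chain("gsw"))    # ['gsw'] -- no region subtag, nothing to truncate
```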
    @Innocent bystander: Well, "Alemannic" is, apparently, one of the official (alternate) names for the ISO 639-2 code gsw, see this entry at SIL and this one at Ethnologue. As Holder pointed out, it's a systematically rather murky situation (gsw originally being a code only for "Swiss German" and then officially expanded to nearly all, but not quite all of Alemannic dialects, still including Swiss German). Anyway, at least we need some consistency here: Currently, if the Wikidata interface is set to English, gsw is displayed as Swiss German, but when set to German, it's displayed as "Alemannisch". So, either (which I would prefer) change the English name to "Alemannic", too - or change the German name from "Alemannisch" to "Schweizerdeutsch". Gestumblindi (talk) 10:39, 18 April 2016 (UTC)

    ──────────────────────────────────────────────────────────────────────────────────────────────────── I was expecting that the translations of language names come from mw:Extension:CLDR, which means from http://cldr.unicode.org/. However, on CLDR "Swiss German" gets correctly translated to "Schweizerdeutsch". Maybe Lydia Pintscher knows better where to fix translation mistakes? --Pasleim (talk) 11:58, 18 April 2016 (UTC)

    I've lost track of this discussion, sorry. Can one of you summarize what is wrong? Then I can go talk to Adrian to figure out what needs to change. --Lydia Pintscher (WMDE) (talk) 09:04, 20 April 2016 (UTC)
    On Wikidata the language code gsw is used inconsistently. In English it's written out as "Swiss German" and in German as "Alemannisch". There is no consensus on whether "Swiss German" or "Alemannic" is the right name for gsw, but because Alemannic (Q131339) is not exactly the same as Swiss German (Q387066), a decision should be taken and then used consistently, independent of the UI language. --Pasleim (talk) 09:44, 20 April 2016 (UTC)
    Ok, thanks. I just talked to Adrian about it. We take what MediaWiki gives us here. The settings are here: https://doc.wikimedia.org/mediawiki-core/master/php/Names_8php_source.html The reason for this seems to be that this is also used to display the language in the language list in the sidebar on Wikipedia, and there people wanted to have "Alemannisch". --Lydia Pintscher (WMDE) (talk) 14:50, 20 April 2016 (UTC)
    Then it should be Alemannic in English as well. Gestumblindi (talk) 19:40, 20 April 2016 (UTC)
    Also see a recent Commons VP thread about NRM != NRF to see what's going on. I was on the IANA language tag list years ago and contributed to the langtag BCPs, and the tags MUST NOT be moving targets. –Be..anyone 💩 13:51, 24 April 2016 (UTC)

    One page per property proposal?

    At the moment we have several pages that contain property proposals. When we archive proposals, the whole discussion gets copied and pasted over to the archive page, which is a bit cumbersome, error-prone, and loses the history. I would suggest that we move to a system similar to what e.g. Commons uses for deletion requests: each proposal (which can contain multiple related properties) gets its own subpage of Wikidata:Property proposal. These proposals then get transcluded on the proposal category pages.

    Advantages:

    • It is possible to selectively add proposals to the watchlist and ignore proposals you are not interested in.
    • That also makes it easier to notice when a proposal that you are interested in is (not) done, even if you are not pinged.
    • The history of a proposal is much easier to read, without looking at the history of several unrelated proposals.
    • The history stays intact when a proposal gets archived. This is especially important for changes to the {{Property proposal}} template.
    • Archiving is easier. Instead of moving a block of text (and hopefully not cutting too much or not enough), you just need to move a single line.
    • Archiving can be done immediately when a property was created or the status was set to "not done", since watchlist and ping notifications still work. This reduces bloat on the proposal category pages.
    • No need to change the "proposed by" field when archiving a proposal (often forgotten). (added later --Srittau (talk) 19:36, 16 April 2016 (UTC))
    • A proposal can be added to multiple categories, e.g. "Persons" and "Authority control".

    Disadvantages: To be honest, I can't see any.

    Any opinions? --Srittau (talk) 14:07, 16 April 2016 (UTC)
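    To make the archiving mechanics above concrete: under this scheme each category page would contain one transclusion line per open proposal, and archiving would just move that line between pages. A rough sketch, with pages modelled as lists of wikitext lines and invented page titles:

```python
def archive_proposal(open_page, archive_page, subpage):
    """Move the transclusion line for `subpage` from the open-proposals
    page to the archive page; the subpage itself (and its history) is
    untouched. Both pages are lists of wikitext lines, edited in place."""
    line = "{{%s}}" % subpage
    if line in open_page:
        open_page.remove(line)
        archive_page.append(line)
    return open_page, archive_page

open_page = ["{{Wikidata:Property proposal/Great new property}}",
             "{{Wikidata:Property proposal/Another property}}"]
archive_page = []
archive_proposal(open_page, archive_page,
                 "Wikidata:Property proposal/Great new property")
print(open_page)     # ['{{Wikidata:Property proposal/Another property}}']
print(archive_page)  # ['{{Wikidata:Property proposal/Great new property}}']
```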

    In principle I am in favour. - Brya (talk) 14:34, 16 April 2016 (UTC)
    • The main disadvantage is that one would need to bookmark every proposal to get an overview. Subpages lead to lost pages no one is aware of, and they need to be maintained. It might be easier if the process for identifiers were simplified and limited to one page. They seem to make up the bulk of the proposals and could easily be done more quickly.
      --- Jura 14:42, 16 April 2016 (UTC)
      • I see the disadvantage of needing to bookmark every proposal, although this only affects a few "power users", in my opinion. I think the majority of users (myself included) are not interested in changes to every proposal. I doubt that subpages will lead to lost pages and a maintenance problem. This hasn't been a problem on other projects. --Srittau (talk) 15:16, 16 April 2016 (UTC)
        • It's a problem with language requests at Meta.
          --- Jura 15:20, 16 April 2016 (UTC)
          • Isn't there a recent MediaWiki feature to follow all pages in a category? Or to follow all pages linked from a page? Then we just have to maintain the page of open proposals, and every change will be followable in a single watchlist. author  TomT0m / talk page 16:54, 16 April 2016 (UTC)
            • Mmm, no, it's phab:T3710 and it's still open. I was probably confused by the fact that we can now watch a category for the addition/removal of a page. But Special:RecentChangesLinked is still usable. A bot or the property proposers/closers could easily maintain a page with all open proposals. author  TomT0m / talk page 17:02, 16 April 2016 (UTC)

    ──────────────────────────────────────────────────────────────────────────────────────────────────── Actually there are two problems:

    • lost subpages (and subpages we don't know the status of)
      It's easy not to lose subpages by using a template for page creation which automatically adds the relevant category, like "open proposal", to the new subpage. An example of such a template is {{Participants}}. author  TomT0m / talk page 17:20, 17 April 2016 (UTC)
    • and comments on property proposal ..

    currently both are easy to monitor.
    --- Jura 23:05, 16 April 2016 (UTC)

    What would the naming system for the subpages look like? Matěj Suchánek (talk) 15:00, 16 April 2016 (UTC)
    I think something like Wikidata:Property proposal/2016/04/Great new property would work best. Maybe Wikidata:Property proposal/Great new property would suffice. This is what the German village pump uses. Great new property can be in any language, of course. In the end, the exact naming scheme is not super important, in my opinion. --Srittau (talk) 15:16, 16 April 2016 (UTC)
    Support Good idea, Srittau. Lymantria (talk) 17:08, 16 April 2016 (UTC)
    Support T.seppelt (talk) 20:02, 16 April 2016 (UTC)
    • I commented on the proposal above.
      --- Jura 23:05, 16 April 2016 (UTC)
      • Not very helpful. This makes the proposal much harder to read. --Srittau (talk) 23:36, 16 April 2016 (UTC)
        • I restored the original proposal. Your answers are now listed below. --Srittau (talk) 14:31, 17 April 2016 (UTC)
    Support Srittau, I agree completely; this would make for a much better process. However, this shouldn't only be presented on the English project chat, as some people will not see it and it may be lost from view too quickly. I recommend you formulate this with detailed proposals to be implemented as a Request for Comment at WD:RFC. ArthurPSmith (talk) 00:53, 17 April 2016 (UTC)
    @ArthurPSmith: That is a valid concern, but I had a look at a few recent RFCs. Of those, only one was translated. Seeing as there is currently lots of support and little opposition to this proposal, do you think it would be enough if we pointed people here from the most active village pumps, with a short translated version of the proposal? And if it turns out that opposition is higher than it is at the moment, start the RFC process? --Srittau (talk) 14:45, 17 April 2016 (UTC)
    @Srittau: whether or not an RFC is translated, it appears for users of every language who may visit the RFC page. But users with other default languages will not see the English Project chat page unless they specifically follow the link to it. We do get an awful lot of discussion on here compared to other languages so maybe it's sufficient, but I think an RfC would be a better way to handle a change like this. Note that WD:RFC also uses this process of creating a separate page for each proposed RfC, which is then linked from the table on that page. ArthurPSmith (talk) 13:50, 18 April 2016 (UTC)
    Just to be a little clearer also on details I think are still missing from what you've proposed:
    • Do we keep the current list of (16) property proposal categories (eg. People, Authority Control, Natural Science, etc), or change that also?
    • How would specific proposals be linked to these categories? You mention transclusion, but how does a category know a proposal belongs there?
    • For done or not-done proposals, could we use a monthly archive page similar to Wikidata:Requests_for_permissions/RfBot/February_2016?
    • Are some new templates needed for the overall proposal (and perhaps to indicate that discussion has concluded?), as it may include multiple individual properties?
    ArthurPSmith (talk) 14:03, 18 April 2016 (UTC)
    • I would keep the current list of 16 property proposal categories. We can change that at another time if desired.
    • The property proposer has to add it to one or more categories. If they forget to do so, a bot could add it to /unsorted or /new.
    A bot can do the transclusion based on a category link, e.g. Category:Property proposals/Event, which is automatically added when a user creates a new proposal from one of the category pages, e.g. Wikidata:Property proposal/Event. --Pasleim (talk) 17:34, 24 April 2016 (UTC)
    • It could be useful to reorganize the archives for a better overview and for archiving by bot. Also, the link from the property talk page to the right archive could be added by the Lua module.
    • no new templates are needed. Property creation process will stay the same as now.--Pasleim (talk) 17:14, 24 April 2016 (UTC)
    Support I don't think an RfC is needed for this change. --Pasleim (talk) 13:22, 17 April 2016 (UTC)
    Support Good idea. Overload of the proposal pages with closed proposals (probably caused by cumbersome archiving) is a real problem. Also, the ability to add a single proposal to the watchlist is a real advantage. --Jklamo (talk) 13:50, 17 April 2016 (UTC)

    Jura1 replied to some of the points above inline. Since this makes the proposal much harder to read, I have extracted that part below and restored the original proposal above. --Srittau (talk) 14:31, 17 April 2016 (UTC)

    • The history of a proposal is much easier to read, without looking at the history of several unrelated proposals.
      It tends to lose context.
      --- Jura 23:05, 16 April 2016 (UTC)
      I don't know what you mean by that. --Srittau (talk) 23:36, 16 April 2016 (UTC)
    • The history stays intact when a proposal gets archived. This is especially important for changes to the {{Property proposal}} template.
      This shouldn't be an issue with the users currently archiving.
      --- Jura 23:05, 16 April 2016 (UTC)
      I don't understand your point. --Srittau (talk) 23:36, 16 April 2016 (UTC)
    • Archiving is easier. Instead of moving a block of text (and hopefully not cutting too much or not enough), you just need to move a single line.
      You could just set up the archiving bot instead. Besides, all current users know what they are supposed to archive ..
      --- Jura 23:05, 16 April 2016 (UTC)
      That does not change the fact that archiving is cumbersome and error-prone. --Srittau (talk) 23:36, 16 April 2016 (UTC)
      What error could occur? I don't recall a problem with archiving at Wikidata:Bot requests. The manual version of Flow you proposed is much more complicated.
      --- Jura 07:24, 17 April 2016 (UTC)
    • Archiving can be done immediately when a property was created or the status was set to "not done", since watchlist and ping notifications still work. This reduces bloat on the proposal category pages.
      This is really strange. First you want to re-write the property creation process to not archive immediately (1) and then you start removing people's comments when they attempt to discuss your new arguments brought up in your (premature) closing statements (2) .. now you want to set up a bot to archive immediately (3) ..
      --- Jura 23:05, 16 April 2016 (UTC)
      Your statements are wrong. It seems to me your opposition is not with the proposal, but with the proposer. --Srittau (talk) 23:36, 16 April 2016 (UTC)
      That is just a way of not discussing the argument about your back-and-forth. If you need diffs for (1) and (2), I'm happy to provide.
      --- Jura 07:24, 17 April 2016 (UTC)
    • No need to change the "proposed by" field when archiving a proposal (often forgotten). (added later --Srittau (talk) 19:36, 16 April 2016 (UTC))
      It was found that we could attempt to calculate this.
      --- Jura 23:05, 16 April 2016 (UTC)
      I don't understand what you mean. --Srittau (talk) 23:36, 16 April 2016 (UTC)
      It's mentioned on the template documentation page.
      --- Jura 07:24, 17 April 2016 (UTC)
    • I'm working on a script that would add "watch" buttons to each subpage transcluded in this kind of system. If there are any other bits of JS that would be useful here, I'd be willing to write the code. --Yair rand (talk) 17:04, 17 April 2016 (UTC)

    IMDb links non functional

    The formatter URLs on IMDb ID (P345) don't work, and the links they generate give 404 errors. How should this be resolved? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:28, 20 April 2016 (UTC)

    Changed the default URL formatter to use the IMDB search feature. Works for most identifiers, except for events that use a year suffix (e.g. http://www.imdb.com/find?s=all&q=ev0000003/2015 fails, while the more specific http://www.imdb.com/event/ev0000003/2015 works). -- LaddΩ chat ;) 22:37, 20 April 2016 (UTC)
    Thank you. That's a clever work-around, but I wonder if there's a better solution. Maybe splitting the property? Better, complex, rules for FURLS? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:42, 22 April 2016 (UTC)
    Now the link doesn't work; the URL is http://www.imdb.com/find?s=all&q=name/nm0798083/, and that's wrong. The correct address is http://www.imdb.com/name/nm0798083/. I have undone the amendment, because at least the old version works. In this version it works properly, so what's the problem? --Harry Canyon (talk) 15:22, 24 April 2016 (UTC)
    Because name/nm0798083/ isn't a valid string for IMDb ID (P345). The correct string is nm0798083 which didn't work with the old link formatter. Dispenser (talk) 16:46, 24 April 2016 (UTC)
    It actually depends if you have the AuthorityControl gadget installed or not. This gadget is changing nm0798083 to name/nm0798083/ and then appending it to the URL formatter. To get a working link for everybody we either need the development team to allow us some regex code in url formatters (which also would fix the link for a handful of other properties) or we could change all claims from nmXXX to name/nmXXX etc. --Pasleim (talk) 16:55, 24 April 2016 (UTC)
    If we're going to do something drastic like change every single claim, we might as well just split the property like people keep suggesting. The only argument I've seen against splitting it is that it would affect people using the property, but changing the format would do that too. - Nikki (talk) 17:04, 24 April 2016 (UTC)
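    The regex-capable formatter Pasleim asks for could map each identifier prefix to its URL path. A sketch of that idea; the prefix-to-path table is inferred from the examples in this thread (nm for names, ev for events; tt for titles is an assumption), not from any IMDb documentation:

```python
import re

# Prefix -> URL path segment, inferred from the examples above.
PATHS = {"nm": "name", "tt": "title", "ev": "event"}

def imdb_url(identifier):
    """Format a P345 value such as "nm0798083" into a full IMDb URL."""
    m = re.match(r"([a-z]{2})\d", identifier)
    path = PATHS.get(m.group(1)) if m else None
    if path is None:
        # Fall back to the search URL, as the interim fix in this thread did.
        return "http://www.imdb.com/find?s=all&q=" + identifier
    return "http://www.imdb.com/%s/%s/" % (path, identifier)

print(imdb_url("nm0798083"))  # http://www.imdb.com/name/nm0798083/
```

    With such a rule in the formatter, the stored value could stay as the bare nm/tt/ev identifier, avoiding both the gadget mismatch and a mass change of claims.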

    Sheriffs and police chiefs

    How do we list the police chiefs of police departments and sheriffs (in American usage) of counties in items? --AmaryllisGardener talk 23:29, 22 April 2016 (UTC)

    Does director/manager (P1037) in Police departments/district-items make sense? I have never seen an item about a Police district, so I would be glad if you could give me a link to one of them. -- Innocent bystander (talk) 18:42, 23 April 2016 (UTC)
    @Innocent bystander: Like Los Angeles Police Department (Q214126), and then director/manager (P1037) = Charlie Beck (Q5084512)? Sounds logical. --AmaryllisGardener talk 18:56, 23 April 2016 (UTC)

    Adding Drug-Target Information to Wikidata

    Hi,

    I work with the Dumontier Lab in the Biomedical Informatics Research Program at Stanford University, and we are interested in incorporating drug-target information from DrugBank into Wikidata. Specifically, we would like to add to Andrew Su's work on importing drug/protein data. Also, because DrugBank contains information about drugs and their targets, we would be able to add relational data to both items. Best, Oliver

    @Crowegian: Contact the Wikidata:WikiProject Molecular biology and their active bot user:ProteinBoxBot. Snipre (talk) 21:32, 24 April 2016 (UTC)
    But before any mass importation from DrugBank, you should check your data: I found a dozen wrong identifiers in the DrugBank database. Snipre (talk) 21:35, 24 April 2016 (UTC)
    @Snipre: Thank you for your response. I've posted this question on Wikidata:WikiProject Molecular biology. Should I post on the user:ProteinBoxBot page as well? Also, in order to resolve incorrect identifiers, we should be able to go through DrugBank and report any links to UniProt that don't work to them. Hopefully they'd be open to fixing them; if not, then at least they know about them. Because DrugBank IDs are already in some Wikipedia infoboxes, we can make sure that the DrugBank UniProt IDs of proteins that interact with a given drug match those in Wikidata for the same protein.

    WikiProject Olympics

    I would like to announce that I've founded Wikidata:WikiProject Olympics, because of the Olympics being later this year. Let's improve Olympics-related items! --AmaryllisGardener talk 19:12, 24 April 2016 (UTC)

    About using P1435

    I need a help with filling heritage designation (P1435) for Latvian monuments.

    We have two characteristics:

    • monument value - national monument and local monument
    • monument type - archeology, architecture, history, industrial, art, urban, event

    How should we fill heritage designation (P1435), which approach is better:

    • create a tree of classes with leaves national/archeology, local/archeology, national/architecture etc. - 14 new items for classes and 9 new items for upper classes
    • or create 2 items for values and 7 items for types and use heritage designation (P1435) twice in the monument item

    What do you think? --Voll (talk) 17:55, 19 April 2016 (UTC)

    If there are monuments listed in more than one type category, the second approach would be hard to understand. -- Gymel (talk) 18:18, 19 April 2016 (UTC)
    I would use 'monument value' (national monument / local monument) to fill P1435. Probably the monument type (archeology, architecture, history, industrial, art, urban, event) can be derived from P31. Michiel1972 (talk) 18:29, 23 April 2016 (UTC)
    I see, but the type of a Latvian monument has official status in the heritage register (example), so we should definitely store it in the Wikidata item.
    Well, it seems that the first approach is better, because we can build a tree for any combination of parameters. --Voll (talk) 13:37, 25 April 2016 (UTC)

    Creating a multilingual list of missing Wikipedia articles for African World Heritage Day

    Hi all

    May 5th is African World Heritage Day and I would like to use Dynamic Wikidata Lists to create a multilingual list of articles that are missing for the World Heritage Sites in Africa. Unfortunately this requires me not to be terrible at Wikidata queries. What I'm looking for is a query that will give me a list of all the Wikidata items for World Heritage Sites in Africa; if anyone can help me out I'd really appreciate it.

    Thanks

    John Cummings (talk) 09:40, 25 April 2016 (UTC)

    Hm... this might work, as long as there aren't any in Ceuta or Melilla. - Nikki (talk) 09:57, 25 April 2016 (UTC)
    Including the coordinates (add ?item wdt:P625 ?coords .) will enable the map. I spotted Church of Nossa Senhora da Conceição da Muxima (Q5116934) which looks like it needs fixing, it's a mixture of somewhere in Costa Rica and somewhere in Angola. - Nikki (talk) 10:03, 25 April 2016 (UTC)


    Hi @Nikki:, thanks, it works in the query tool but just gives me an empty list in Dynamic Wikidata Lists. I did it for all the World Heritage sites using claim[1435:9259] but I don't know how to say only in Africa. John Cummings (talk) 10:15, 25 April 2016 (UTC)

    Ah, I didn't know that's what you were referring to. That tool seems to be broken right now; nothing I try works (@Magnus Manske:). The equivalent query using the WDQ syntax would be claim[1435:9259] and claim[17:( claim[30:15] and noclaim[30:46] )] I think (it works in AutoList at least :)).
    Template:Wikidata list can use either as input.
    --- Jura 10:41, 25 April 2016 (UTC)
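    Read as a boolean filter, the WDQ expression above selects items whose heritage designation (P1435) is World Heritage Site (Q9259) and whose country (P17) has continent (P30) Africa (Q15) but not also Europe (Q46), which is what excludes sites in Ceuta and Melilla. A toy evaluation over hand-made claim data; the site and country items here are invented:

```python
# Minimal claim data: item -> {property: [values]}
claims = {
    "Qsite1": {"P1435": ["Q9259"], "P17": ["QcountryA"]},
    "Qsite2": {"P1435": ["Q9259"], "P17": ["QcountryB"]},
    "QcountryA": {"P30": ["Q15"]},         # Africa only
    "QcountryB": {"P30": ["Q15", "Q46"]},  # Africa and Europe (e.g. Spain)
}

def has(item, prop, value):
    return value in claims.get(item, {}).get(prop, [])

def african_whs(item):
    """claim[1435:9259] and claim[17:( claim[30:15] and noclaim[30:46] )]"""
    if not has(item, "P1435", "Q9259"):
        return False
    return any(has(c, "P30", "Q15") and not has(c, "P30", "Q46")
               for c in claims.get(item, {}).get("P17", []))

print([q for q in ("Qsite1", "Qsite2") if african_whs(q)])  # ['Qsite1']
```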
    Thanks @Nikki: and @Jura1:, it's alive :) I'll keep adding languages, but I think it's a really nice example of generating a campaign using Wikidata queries. --John Cummings (talk) 10:43, 25 April 2016 (UTC)
    You have only 137 items in your list, so you can use such a nice table for presentation: meta:Wikimedia_CEE_Spring_2016/Structure/Austria --Voll (talk) 11:55, 25 April 2016 (UTC)

    Adding exact integer quantities

    Adding some basic properties to Falcon 9 Full Thrust (Q22808999), I wanted to specify the number of engines: 9 Merlin 1D (Q19923939) and 1 Merlin 1D Vacuum (Q18646644), by editing powerplant (P516). However when I add the qualifier "quantity" and type the number 9, the system displays "9±1". How can I tell Wikidata that this is an exact value, not a measurement subject to an error margin? Existing items such as Falcon 9 v1.1 (Q15215794) have the exact quantity displayed correctly (although in the less precise property has part (P527), which I would gladly change if I knew how to save exact values). — JFG talk 18:10, 23 April 2016 (UTC)

    Add "9±0" or "9+-0". This is a known problem which is going to be fixed soon. Matěj Suchánek (talk) 18:29, 23 April 2016 (UTC)
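    The ±0 workaround can be made concrete: the quantity input accepts an explicit uncertainty, so "9±0" (or the ASCII form "9+-0") pins the bounds to the value, while a bare "9" got a default uncertainty of 1 at the time. A small parser illustrating the input notation described here; this mimics the syntax from this thread, not Wikibase's actual parser:

```python
import re

def parse_quantity(text):
    """Parse "9", "9±1" or "9+-0" into (amount, uncertainty).

    A bare number gets a default uncertainty of 1, which is why typing
    "9" displayed as "9±1"; an explicit "±0" forces an exact value.
    """
    m = re.match(r"^\s*([+-]?\d+)\s*(?:(?:±|\+-)\s*(\d+))?\s*$", text)
    if not m:
        raise ValueError("not a quantity: %r" % text)
    amount = int(m.group(1))
    uncertainty = int(m.group(2)) if m.group(2) else 1  # UI default at the time
    return amount, uncertainty

print(parse_quantity("9+-0"))  # (9, 0)
```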
    Good to know, thanks for the tip! — JFG talk 06:48, 26 April 2016 (UTC)

    Wikidata weekly summary #206

    Other sites

    How can I add something with the 'Other sites' box? The save button does nothing. Cheers, The Jolly Bard (talk) 20:00, 21 April 2016 (UTC)

    What is the "other sites" box? Where is it? --Stryn (talk) 20:21, 21 April 2016 (UTC)
    At the bottom, under the Wikipedia sites. The Jolly Bard (talk) 20:56, 21 April 2016 (UTC)
    What are you trying to add there? - Nikki (talk) 06:53, 22 April 2016 (UTC)
    Non-Wikimedia sites. When I enter a link the save button remains greyed out. The Jolly Bard (talk) 12:17, 22 April 2016 (UTC)
    You can't add non-wikimedia sites there. For non-wikimedia sites you need a property and then add it to the statement section. --Pasleim (talk) 13:10, 22 April 2016 (UTC)
    What then is the purpose of this field? How does one get a property for a non-wikimedia site? The Jolly Bard (talk) 13:18, 22 April 2016 (UTC)
    The purpose of the field is to add links to commons, mediawiki, meta wiki, etc. To which site would you like to link? Maybe there exists already a property for it, see list of properties. New properties can be proposed on WD:PP. --Pasleim (talk) 13:30, 22 April 2016 (UTC)
    In this case, Wikisage. The Jolly Bard (talk) 14:16, 22 April 2016 (UTC)
    You might be able to use described at URL (P973). Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:36, 22 April 2016 (UTC)
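
    To illustrate Andy's suggestion: such a statement can be added through the action API with wbcreateclaim. A sketch of the request parameters follows; the item ID and URL are made up for illustration, but the parameter names are the standard wbcreateclaim ones, and values for string/URL datatypes must be passed JSON-encoded:

```python
import json

def describedat_params(item_id, url, csrf_token):
    """Parameters for a wbcreateclaim POST adding described at URL (P973).

    Values for string/URL datatypes must be JSON-encoded strings.
    """
    return {
        "action": "wbcreateclaim",
        "entity": item_id,
        "property": "P973",
        "snaktype": "value",
        "value": json.dumps(url),
        "token": csrf_token,
        "format": "json",
    }

# Hypothetical example: link an item to a Wikisage page.
params = describedat_params("Q4115189", "https://nl.wikisage.org/wiki/Example", "TOKEN")
print(params["value"])
```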

    Perhaps the form label should be changed to, say, "other Wikimedia sites" or "other sister projects"? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:34, 22 April 2016 (UTC)

    Seconded. I made this mistake as well when I started using Wikidata, and I think the new wording would be helpful. --Hardwigg (talk) 15:52, 24 April 2016 (UTC)
    @Lydia Pintscher (WMDE): Can this be done, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:56, 26 April 2016 (UTC)
    MediaWiki:Wikibase-sitelinks-special. Dispenser (talk) 14:26, 26 April 2016 (UTC)
    Yep, the change is best made there. I'd be hesitant to make this change in the software itself, because it is also used on other installations that have nothing to do with Wikimedia. So this is best done as a local change here. --Lydia Pintscher (WMDE) (talk) 15:35, 26 April 2016 (UTC)

    Can't create new items using PetScan

    I can no longer create new items based on Wikipedia categories using PetScan. It just hangs when I click on the green "process commands" button.

    I notice that not many new items have been created in the last few days, so I guess I am not alone. Danrok (talk) 11:45, 25 April 2016 (UTC)

    I had to update a series of bookmarks; there were some tweaks to the URL. Generally, it works better when you log into Widar before running the queries.
    --- Jura 12:12, 25 April 2016 (UTC)
    It seems that some databases are not in sync. I tend to get deleted or redirected items.
    --- Jura 12:14, 25 April 2016 (UTC)
    Maybe I'm simply lucky, but I was able to create items via petscan yesterday (evening, if it matters). Jura - maybe this is related. --Edgars2007 (talk) 12:16, 25 April 2016 (UTC)
    It worked for me too, but occasionally it hangs on a deleted article at Wikipedia. Obviously, it will end up hanging on the same article in the next run as well ;)
    --- Jura 12:22, 25 April 2016 (UTC)
    One additional note: sometimes it hangs for me too, but pressing "stop" (or whatever the red button is called) and then "process commands" usually helps. --Edgars2007 (talk) 12:56, 25 April 2016 (UTC)
    Well, I have got it to work now. The items it had selected were user pages (in the year/births categories), not articles. So, I set the namespace option to articles under the page properties tab. Danrok (talk) 14:14, 25 April 2016 (UTC)

    If this is still an issue, please post a non-working example to the bugtracker. --Magnus Manske (talk) 14:20, 26 April 2016 (UTC)

    Label, description, aliases as properties

    Why not have the labels, descriptions and aliases of an entity as properties/claims? It would make queries much easier.

    We didn't have support for all of the property datatypes which would have enabled this. This functionality would also enable referencing of names and aliases. --Izno (talk) 20:22, 25 April 2016 (UTC)
    I think there should be a "name" property that can be sourced, deprecated, etc., like any other statement. But a single label and description are needed beyond that name property in order to effectively choose a way to identify an item in the UI, i.e. to allow displaying it, etc. That is why labels and descriptions sit outside of statements: they exist mostly for the UI of Wikidata. Similarly for aliases, whose sole goal is to improve search. As said, it would make a lot of sense to have properties and statements for names in order to be able to source them, but giving up on the label/description pair would undermine how the system works. --Denny (talk) 17:53, 26 April 2016 (UTC)
    I personally fantasise about labels/descriptions/aliases being derived from statements as much as possible, so that we're not constantly updating the same information in multiple places (or, as is often the case, not updating it in all places and having it get out of sync). E.g. Have a way to mark a statement as the preferred label for a language (and automatically keep the label in sync with that property), generate descriptions from statements (and update them when statements change) instead of relying on an army of bots (which don't often keep them in sync with the statements), have a way to list which properties should count as aliases (strings would be aliases for all languages, monolingual text would probably only be an alias for the language of the statement). It wouldn't mean giving up on the label/description/alias system, it would just emphasise storing data as statements (which benefit every language, unlike labels/descriptions/aliases). It would all probably be quite a lot of work, but I can dream. :P - Nikki (talk) 07:57, 27 April 2016 (UTC)
    But remember that there are tricky languages where automatically generating a description (one that makes sense) for an item wouldn't be an easy task. But I would support the idea itself, at least for English and other big languages, so that items have at least one description in a central language, i.e. English (I know that's a very POVish statement, but I hope you agree that English is the central language). --Edgars2007 (talk) 09:17, 27 April 2016 (UTC)
    Yeah, that's part of what I mean by it being a lot of work. I think there is still a lot of potential though, because we have a lot of things which can all be described in the same way (e.g. over 38,000 American male politicians, over 60,000 villages in India). If we could make it work, even just partly, it could be amazing for smaller languages (many of which don't even have that many descriptions in total). I wouldn't expect automatic descriptions to completely replace manual ones either - there are certainly cases (even in English) where a custom description is more useful. - Nikki (talk) 10:59, 27 April 2016 (UTC)
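
    As a toy illustration of the idea Nikki describes, a description generator would map an item's claims to a short phrase. The label tables here are hypothetical stand-ins for what would really be looked up from the labels of the linked items:

```python
# Hypothetical label lookups; in a real implementation these would come
# from the labels of the items linked in the statements.
CITIZENSHIP_ADJ = {"Q30": "American"}          # country of citizenship (P27)
GENDER_LABELS = {"Q6581097": "male"}           # sex or gender (P21)
OCCUPATION_LABELS = {"Q82955": "politician"}   # occupation (P106)

def derive_description(claims):
    """claims: property id -> item id, e.g. extracted from wbgetclaims."""
    parts = [
        CITIZENSHIP_ADJ.get(claims.get("P27"), ""),
        GENDER_LABELS.get(claims.get("P21"), ""),
        OCCUPATION_LABELS.get(claims.get("P106"), ""),
    ]
    description = " ".join(p for p in parts if p)
    return description or None  # None: fall back to a manual description

print(derive_description({"P27": "Q30", "P21": "Q6581097", "P106": "Q82955"}))
# → American male politician
```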

    Tool for API call "replacement"

    Those who are active at WD:PP/AUTH may have noticed that I'm flooding it with boring sports-related properties for Template:Sports links (Q22674492) (Lua code). I have an idea that could be pretty useful for other use cases and users. The idea is to have a link in the template output that a user can press so that a statement (in my case, an external identifier with a value) gets added to the WD item. It's possible with an API call, but that requires a token, which as I understand can't be handled in Lua, right? So I was thinking about some tool (on Tool Labs) which could be used as a replacement for the API call, or which could do the token part. Why would it be useful? For users who are not very familiar with Wikidata. And it is easier than adding the statement directly (because statement validation, at least in my use case, would be done in Lua: is the value itself OK, is that property already at WD, etc.). Ideas? --Edgars2007 (talk) 09:10, 27 April 2016 (UTC)

    Something similar already exists with Wikipedia:WE-Framework (Q22946134). It's a gadget which allows you to change Wikidata claims directly from Wikipedia. --Pasleim (talk) 09:56, 27 April 2016 (UTC)
    Yes, I've been using it for a while. It is cool, of course, but not exactly what I could use in this case. --Edgars2007 (talk) 10:03, 27 April 2016 (UTC)
    Such tools (doing an edit with a single click, without tokens) are bad in terms of security and can easily be abused by zero-sized iframes or direct links. However, there is nothing wrong with two-click edit tools. Autolist2 is one of them: for example, go to https://tools.wmflabs.org/autolist/index.php?manual_list=Q4115189&statementlist=P31%3AQ5&run=Run and press "Process commands" to add P31:Q5 to the Wikidata Sandbox item. --Lockal (talk) 11:49, 27 April 2016 (UTC)
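
    For context on the token issue raised above: a Wikidata write is a two-step exchange, first fetching a CSRF token for the logged-in session, then posting the edit with it, which is why a Lua template on its own can't make the edit. A sketch of the two parameter sets such a tool would send (the function names and example values are illustrative; the API parameters are the standard action API ones):

```python
import json

def token_request_params():
    """Step 1: ask the action API for a CSRF token (needs a logged-in session)."""
    return {"action": "query", "meta": "tokens", "type": "csrf", "format": "json"}

def add_identifier_params(item_id, prop_id, value, csrf_token):
    """Step 2: add an external-identifier statement with wbcreateclaim."""
    return {
        "action": "wbcreateclaim",
        "entity": item_id,
        "property": prop_id,
        "snaktype": "value",
        "value": json.dumps(value),  # external-id values are JSON-encoded strings
        "token": csrf_token,
        "format": "json",
    }
```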

    gadget based on union of and disjoint union of

    After creating a module to generate queries to spot items that could be classified in a more specific class based on union of (P2737) View with SQID and disjoint union of (P2738) View with SQID, examples: