Shortcut: WD:PC

Wikidata:Project chat

Wikidata project chat
Place used to discuss any and all aspects of Wikidata: the project itself, policy and proposals, individual data items, technical issues, etc.
Please take a look at the frequently asked questions to see if your question has already been answered.
Also see status updates to keep up-to-date on important things around Wikidata.
Requests for deletions can be made here.
Merging instructions can be found here.

IRC channel: #wikidata
On this page, old discussions are archived. An overview of all archives can be found at this page's archive index. The current archive is located at 2015/11.

Category hierarchy

Hello, I need a category hierarchy of Wikipedia; more specifically, I need a category tree. Can I get that through Wikidata? If yes, could you please tell me how to get it?

Has making links just got harder?

I recently created a new page on en:Wikipedia and when I tried to link to the German page in the way I was used to, I was suddenly confronted with all sorts of interrogation pages. After spending a while puzzling over why I was looking at all this stuff, I eventually worked out how to make the link. But I must admit that I think it still takes longer than it used to. Did I wander into unfamiliar ground, or has the process been changed? Leutha (talk) 17:31, 16 November 2015 (UTC)

Both. It all depends on your personal definition of "recently". --Jane023 (talk) 09:38, 20 November 2015 (UTC)

Overlapping administrative territorial entities

Hi, I've noticed a worrying pattern in Australia, but I think it's probably one that people have run into all over the world. Hopefully someone can shed some light on the problem for me.

I've noticed that Melbourne is used as the value for “located in the administrative territorial entity” for ~100 items in the region. By contrast, the City of Melbourne is only used for a handful.

An “administrative territorial entity” is defined as a “territorial entity for administration purposes, with or without its own regional government”. But “Melbourne” as an entity isn't used by anyone as an administrative region: Australia is divided into states, and then into local government areas. But at the same time, Melbourne is an instance of city, which is a subclass of administrative territorial entity.

So we have overlapping entities for the same territory, even though in Australia there really isn't any sort of overlap when it comes to having responsibility for administering an area.

Has anyone dealt with this in other places? Is there a way to mark things inside the boundaries of Melbourne as located there geographically, but not administratively? --Atroceh (talk) 01:48, 17 November 2015 (UTC)

I guess “Melbourne” has man-made borders, and that is enough to describe it as an "administrative territorial entity" from the Wikidata perspective. To what extent those borders fulfil an administrative purpose is of less importance.
Contrast administrative entities with islands, which have borders created by God/Nature or however you prefer to describe it. We have other properties for those kinds of entities. -- Innocent bystander (talk) 07:48, 17 November 2015 (UTC)
I don't think Melbourne has man-made borders. It's a large settlement covering multiple administrative divisions while not being an administrative division as a whole. - Nikki (talk) 09:01, 17 November 2015 (UTC)
The borders are maybe not well defined, but my opinion is that they are still man-made. They are not a creation of nature. Ethnic regions like Sapmi also fail to have well-defined borders, but the area is (by man) more or less defined as the area where the semi-nomadic Sami people had their home.
That an entity covers "multiple administrative divisions" is also often true for a typical "administrative entity". In Q34 the municipalities are today divided into districts. The common rule is that they are inside one municipality only, but it is not difficult to find exceptions to that rule. Boteå district in the neighbouring municipality also covers one building and its surrounding area in the municipality where I live. -- Innocent bystander (talk) 10:59, 17 November 2015 (UTC)
There's location (P276) for locations in general. It seems wrong to me that city is a subclass of administrative territorial entity. The meaning of "city" varies from country to country (and even within countries) - that's why we have a number of more specific "city of (country name here)" items for the more formally defined types of cities. - Nikki (talk) 09:01, 17 November 2015 (UTC)
The word city in Swedish (stad) in itself has many different meanings. One is "populated place", and such places have very vaguely defined borders, like Melbourne above. A "municipality of stad-type" (of which there are several, but only in Sweden) definitely has administrative borders. "Urban areas" have man-made borders, but they serve only statistical purposes and are unknown to those who do not have access to the maps which define them. -- Innocent bystander (talk) 10:59, 17 November 2015 (UTC)

Some more data points:

As discussed at en:Glasgow and en:Greater Glasgow, the conurbation around Glasgow in some directions extends well beyond the boundaries of the council area. However, for the most part the council area does seem a useful delimitation of what people perhaps think of as the boundary of the city -- compare the two iterations of c:File:Glasgowareas.jpg, with the one restricted to the council area currently being used on the en-wiki article. (Or at least its current boundary: the administrative area has shrunk compared to earlier in the 20th century.) So it doesn't seem unreasonable to me to equate Glasgow with its council area, since essentially this is how people have been using P131 anyway.

Belfast is currently the opposite, with two different items, even though it seems people are mostly using the 'city' item Belfast (Q10686) as the target of P131s -- and the en-wiki article en:Subdivisions of Belfast although presented as about the city seems to identify this mostly with the council area. The situation may be complicated because the council area has recently (for the elections this year) been enlarged; I haven't checked whether that en-wiki article (and articles like en:Electoral wards of Belfast) reflect the old boundary or the new. We also have Belfast Metropolitan Area (Q769573) / en:Belfast_Metropolitan_Area for the wider metro area.

We seem quite happy (both here and at en-wiki) to identify Manchester with the borough council area -- even though the built-up area extends continuously around it (c:File:Map_of_Manchester.png) -- perhaps because Salford and Stockport either side of it are well-defined entities, and also because Greater Manchester (Q23099) exists, as a well-defined administrative area in its own right, for the wider area. (c:File:Greater_Manchester_County_(2).png)

As regards London, the striking thing to me is that 343 returns for P131 seems quite few. Yes, there are only 66 returns for the query CLAIM[131:23306] (Greater London); but compare that with 3538 returns for P131 being one of the London boroughs, CLAIM[131:(CLAIM[31:211690])]. That probably reflects that London is divided up into well-defined and well-known administrative subunits; and also that there is a distinction between London and Greater London, in that Greater London does include areas that are perceived as distinct from London itself.
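For reference, roughly equivalent queries can be run on the SPARQL query service at query.wikidata.org (a sketch only, assuming its standard wdt:/wd: prefixes; the counts may differ from the WDQ figures above):

  # items whose P131 is Greater London (Q23306)
  SELECT (COUNT(?item) AS ?count) WHERE { ?item wdt:P131 wd:Q23306 . }

  # items whose P131 is one of the London boroughs (instances of Q211690)
  SELECT (COUNT(?item) AS ?count) WHERE {
    ?borough wdt:P31 wd:Q211690 .
    ?item wdt:P131 ?borough .
  }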

Turning to smaller cities in the UK, currently we make no distinction between Cambridge (Q350) and the corresponding city council area -- the two are considered equivalent. On the other hand we do make a distinction between Carlisle (Q192896) and City of Carlisle (Q1094110) -- perhaps paradoxically (but very English), the first refers to the city proper; whereas the second is the name for a wider administrative area, including quite a lot of rural land well outside the built-up area.

So what to conclude? The above mostly reflects the distinctions that the totality of wikis have or have not made, which may not be a bad starting point for deciding which entities do or do not deserve distinct items. For myself, I think that, as far as possible, I would like to see the same item used for a city and for the territorial area corresponding to the city administration -- unless there is a clear mismatch in perception between the two extents (eg Carlisle). So, myself, I would prefer to see the territorial information moved from Belfast city council district (Q1140130) to Belfast (Q10686); with the former perhaps being re-purposed as an item relating purely to the City Council itself, as an administrative/legislative organisation.

But I do agree with whoever suggested that city (Q515) should not by default be a subclass of administrative territorial entity (Q56061). In countries where a city is by law a particular type of administrative entity, with particular powers and responsibilities, it makes sense to have an item "City of Country X". Otherwise, as per Manchester above, to me it makes sense to have the item be a separate instance of both city (Q515) and its precise administrative class, eg metropolitan borough (Q1002812) -- and only be a subclass of administrative territorial entity (Q56061) through the latter.

Jheald (talk) 12:05, 17 November 2015 (UTC)

en:Local government areas of Victoria tells me that there are many cities in "Greater Melbourne" and "City of Melbourne" is just one of them. "Greater Melbourne" is an alias for the article "Melbourne". Therefore, I would add to "Melbourne" P150 with all the cities (City of Melbourne, City of Port Phillip, City of Stonnington, ...) and to the cities P131 "Melbourne". I guess that below a city there are suburbs, of which Melbourne has ~400. If the "City of ..." items get the property P150 too, it will be possible to use this property for queries the same way as we are using P31. --Molarus 15:06, 17 November 2015 (UTC)
In my opinion very few things should be described as being <located in the administrative territorial entity (P131):Melbourne (Q3141)> because it is too big. The P131 statement should give the smaller local government area where the thing is located - the City of Melbourne for instance. In the meantime, saying stuff is <located in the administrative territorial entity (P131):Melbourne (Q3141)> is not actually wrong - the Melbourne (Q3141) region appears to be an actual administrative territorial entity with legally defined borders. There are cities, however, which sprawl over multiple administrative areas and have no defined borders or unitary authority, and so those cities are not 'administrative territorial entities' - in my opinion. Joe Filceolaire (talk) 19:34, 19 November 2015 (UTC)
It's not clear to me whether statistical concepts like urban area (Q702492) or metropolitan area (Q1907114), that may well correspond to no local governmental authority, should be considered 'administrative territorial entities', but we should maybe have properties to link to them, given that there are infoboxes that want to link to them and give their respective populations (for example en:Template:Infobox UK place on en:Derry). Jheald (talk) 20:14, 19 November 2015 (UTC)

@Jheald: I'm fairly new to WikiMedia projects. How do we go about changing city (Q515) so that it's not an administrative territorial entity (Q56061)? It seems like it'd be a large change with far-reaching effects. Atroceh (talk) 02:06, 25 November 2015 (UTC)

We maybe should not describe "local governmental authorities" as territories at all. It is not a part of Earth (Q2) that has a "governmental authority", it is a group of people. That you live within an area does not necessarily make you a "part of" that entity. That a road is located in an area does not always mean that the local government has any influence over that road. A group of immigrants lives not far from here; I am not sure that they are subject to the local authorities, even if they live here and are counted in the census. The governor and the county board in Q34 are not a local authority, they are a local branch of the central government in Stockholm. Their authority was from the beginning limited to each county. But in the last decades, each county board has specialised in some subjects. The county board of Jönköping, for example, is responsible for the agricultural business in the whole nation. The county councils here are a "local authority" but are very hard to describe as "territorial", since their authority is limited to only a few sections of daily life. If you go to a private doctor, you do not have to deal with them at all, except for the tax you pay.
It is maybe not the "statistical" entities we should be careful about; it is maybe rather the "administrative" ones. Historically in Q34, "administrative entities" also lack the "transitivity" you demand from them. A Swedish municipalsamhälle could be located in several municipalities and even several counties. The municipalsamhälle was responsible for the urban planning (Q69883), something the rural municipalities didn't have to bother with. You paid tax to the parish to pay for the graveyard, to the municipalsamhälle to pay for the urban planning, to the municipality to pay for the social aid (schools, elder care, poverty) and to the county council to pay for the health care. There are then four levels of local administrative entities, but they have not always been transitive. -- Innocent bystander (talk) 10:02, 25 November 2015 (UTC)
@Innocent bystander: For most administrative areas in England there are separate Wikipedia articles for the administrative territory and the administrative body -- compare en:Category:Non-metropolitan counties and en:Category:County councils of England. The Wikidata items for the administrative bodies mostly still need to be marked up with properties -- in most cases they don't even have a P31 yet. That's some work I still need to do. But something like North Yorkshire County Council (Q17017103) shows a start, linking to the corresponding administrative territory via applies to jurisdiction (P1001), which in turn links back to it using legislative body (P194); with the administrative body being an instance of county council (Q4321471), which in turn is a subclass of local government (Q6501447)
Away from England, eg in Scotland, there is a tendency (inherited from Wikipedia) to conflate the administrative territory and the administrative body onto the same item -- the item for the territory. But this is no different to so many other areas on Wikidata, where a single item (like a single WP article) quite often covers different aspects of the same thing, or closely-related things. That may even be inevitable -- there's only ever going to be a finite granularity with which we carve up all the concepts of the world into different items, there are only ever going to be a finite number of items. There is always going to be some point at which one says "these concepts are closely enough related that we are going to consider them together -- at least for now". So, for the most part, I have taken as I have found, and not created new items (eg for different (even purely territorial) aspects that an entity may have, sometimes even with different borders). There's enough work to do with the items we've already got. Besides, once we start hiving aspects of entities off into different items, that makes it quite hard, both for infoboxes to pull everything together again, and for sitelinks. So, for the most part, I have found enough other priorities needing work, without thinking about creating any systematic new sets of items. The "co-classes" links, and explicit list queries, on the various sub-pages of Wikidata:WikiProject UK and Ireland/adm show the range of multiple roles various items may be sharing -- or at least, those so far marked up on the items.
Regarding transitivity of P131: for England it almost holds, with just a couple of glitches. For Scotland things are more complicated, particularly between different types of (partly but not wholly) historical entities and the current primary divisions. Banffshire (Q806432) gives an example of what I have tried to do. The historical/ceremonial/land-registration county is split between two different top-level council areas, so I have given each of these as a P131 at normal rank with an applies to part (P518) qualifier. But I have also put in located in the administrative territorial entity (P131) Scotland (Q22) at preferred rank, because that is as much as can be reliably said (transitively) for people/scripts/bots eg using the wdt:P131 form of the P131 property, working without qualifiers. Jheald (talk) 12:30, 25 November 2015 (UTC)
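As a sketch of how such qualified statements can then be read back on the SPARQL query service (assuming its standard wd:/p:/ps:/pq: prefixes; wdt: would only return the truthy, preferred-rank value, ie Scotland (Q22)):

  # P131 statements on Banffshire (Q806432), with any 'applies to part' (P518) qualifier
  SELECT ?adminEntity ?appliesToPart WHERE {
    wd:Q806432 p:P131 ?statement .
    ?statement ps:P131 ?adminEntity .
    OPTIONAL { ?statement pq:P518 ?appliesToPart . }
  }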
@Jheald: Well, I have so far added both P131:Municipality and P131:Province, since there is no or very poor transitivity between municipalities and provinces. Hästhagen (Q3352829) is a good example. Nacka Municipality and Södermanland (province) are what it can be described as being located in today. Nacka city was dissolved as an entity in 1970 and it therefore has a "normal rank" and an end date. Setting P131:Södermanland to a "normal rank" would imply that it, like Nacka City, is a dissolved entity, but it isn't. Provinces have (since the 17th century) lost their administrative function. What remains of them are the ethnic identities, languages and the lines on the map. There are of course smaller entities than municipalities, like parishes, civil parishes and districts. There are no guarantees about transitivity between such entities and municipalities either. But larger urban areas are often larger than such entities. (A handful of urban areas are larger than municipalities, but they are the exceptions.) If I would like to know the language and ethnicity of somebody, then I would prefer to know the Province. But if I would like to know what political landscape (s)he lives in and what tax (s)he pays, I would like to know which Municipality. If I want to know the religious background, I would like to know the parish. And if I would like to know how the cultural heritage is treated in hir surroundings, I would like to know which County (s)he lives in. It is complex, and I think I would like us to forget the whole transitivity thing with P131, because it very seldom looks that simple in the items I edit. -- Innocent bystander (talk) 15:03, 25 November 2015 (UTC)
@Innocent bystander: What I've started trying to do in cases like that is to add the qualifier as (P794) to the P131 statement -- eg on Renfrew (Q7313155), where I've given the present-day administrative area unqualified as the preferred value, but also the historic administrative area at the time this district existed as a normal-rank value, qualified with as (P794) = Local government areas of Scotland 1975 to 1996 (Q383493), so that in time (with more work) somebody should be able to check for that qualifier and pull out the whole administrative hierarchy as it was at that time. I'm not sure whether this is considered an approved use of P794, but it seemed the best choice available.
PS: Is "hir" the possessive of hen (Q10521894) ? I only met the latter for the first time this week, in the first episode of the new series of The Bridge (Q1211796), when Saga Norén was queried on her use of it by her Danish colleague. Now impatient for Saturday and episodes 3 & 4. :-) Jheald (talk) 15:31, 25 November 2015 (UTC)
But I do think transitivity is important to try to preserve as far as we can with P131, because the P131 tree is really the only way for users to be able to localise places and events within a part of the country as a whole -- whether a province, a region, a municipality or whatever. Since each of those may include multi-level subdivisions, it's important to at least try to make such searches possible, without too many items leaking in from adjacent provinces/regions/municipalities etc if subdivisions in one system cross the borders of subdivisions in a different system.
In the Scottish context, I also thought it was important to try to record how alternative and historic divisions relate to the present primary set of council area (Q15060255) divisions. Jheald (talk) 15:47, 25 November 2015 (UTC)
One question: Is a Province in a County or is a County in a Province? Wikipedia tells me both! Sometimes the first is smaller, sometimes the latter is smaller. In two cases (today), there is an exact match. -- Innocent bystander (talk) 16:17, 25 November 2015 (UTC)
@Jheald: Using P794 for P131 probably works when the item has multiple purposes (both an urban area and a municipality, or both an island and a province). But when there is no doubt that "Rohan" is a "kingdom of Middle Earth", I cannot see such a need. That Rohan is a kingdom in Middle Earth is already stated in the item about Rohan. Your approach would probably work with the two Swedish provinces that the articles also describe as (islands|group of islands). -- Innocent bystander (talk) 09:59, 26 November 2015 (UTC)
@Atroceh: The actual edit is very simple -- just go to city (Q515), hit "edit" on the statement in question, then hit "remove".
But I suspect you're asking about a bit more than just the simple mechanics...
There are currently just over 16,000 items that are cities but not subclasses of administrative territorial entity (Q56061) in any other way:
Here is a breakdown by country:
The first issue is to determine in which of these countries "city" status does in itself imply a municipal / administrative organisation.
A second issue is to think how that should be indicated -- eg does it make sense to mark items as instance of (P31) both a city (Q515) and a municipality (Q15284) in territories where the former implies the latter? Alternatively, should one introduce a second "city" item, a subclass of Q515, for cities that also have municipal administrative authority? (But then could we trust users to consistently choose the right "city" item from a drop-down menu?)
Is the distinction between "city with administrative powers" and "city without administrative powers" in fact worth making, all for the sake of perhaps only a handful of cities (for example Inverness (Q160493)) that are not administrative units ?
The key question is how and where to get the attention of the people most interested in administrative areas, and the located in the administrative territorial entity (P131) hierarchy, to get a sense of their perspectives on the above two issues, and so how as a community as a whole we think we should go forward.
There is Wikidata talk:WikiProject Country subdivision, which may be a good place to start a discussion thread, though I am not sure how active that WikiProject is. Perhaps the way forward is to start a thread there, and then advertise it with a new section on this page, and on the sister pages of this page in other languages -- eg the Bistro page in French, the Forum page in German, etc -- to get the attention of more of the community, to discuss how to go forward. Jheald (talk) 09:26, 25 November 2015 (UTC)
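For reference, the set of cities mentioned above (instances of city (Q515) that are not otherwise under administrative territorial entity (Q56061)) can be listed and broken down by country (P17) on the SPARQL query service; a rough sketch only, not necessarily the exact query behind the figures above:

  SELECT ?country (COUNT(DISTINCT ?city) AS ?cities) WHERE {
    ?city wdt:P31 wd:Q515 .
    FILTER NOT EXISTS {
      ?city wdt:P31 ?class .
      FILTER(?class != wd:Q515)
      ?class wdt:P279* wd:Q56061 .
    }
    OPTIONAL { ?city wdt:P17 ?country . }
  }
  GROUP BY ?country
  ORDER BY DESC(?cities)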

Merging a painting by Gauguin

Please, double check whether Q19869305 and Q20381008 should be merged. Note that they have the same inventory number (P217), although with different wording, and are located in different collections (P195). I'm not sure how inventories are organized in Denmark. --Vriullop (talk) 09:57, 17 November 2015 (UTC)

It is one of 30 paintings that belong to Statens Museum for Kunst (Q671384) and are on long-term loan to Ny Carlsberg Glyptotek (Q1140507). --Steenth (talk) 12:24, 19 November 2015 (UTC)
Thanks, I understand. Merged. --Vriullop (talk) 09:36, 20 November 2015 (UTC)

Harassment consultation

Please help translate to your language

The Community Advocacy team of the Wikimedia Foundation has opened a consultation on the topic of harassment on Meta. The consultation period is intended to run for one month from today, November 16, and end on December 17. Please share your thoughts there on harassment-related issues facing our communities and potential solutions. (Note: this consultation is not intended to evaluate specific cases of harassment, but rather to discuss the problem of harassment itself.)

Regards, Community Advocacy, Wikimedia Foundation



I just wanted to add an interwiki from Commons to cs.wp. This time the traditional dialog box didn't appear; instead I was moved here to fill in the property. So I have filled in the property: Lšelín (Q21503551), and then I wanted to add the interwiki. There was an error, because there is already a property existing. Could you fix it please and link w:cs:Lšelín with c:category:Lšelín? I don't want to study it, nor debug it, nor spend my time on Wikidata. Sorry!--Juandev (talk) 07:13, 18 November 2015 (UTC)

And the same for w:cs:Ostromeč and c:category:Ostromeč. Why is this happening? I just want to add an interwiki. I don't want to create new items on d!--Juandev (talk) 08:17, 18 November 2015 (UTC)

It seems to me that there is no category for Lšelín on cs.wp to be linked with the category on Commons, and there is no gallery page on Commons to be linked with the article. Technically, you probably still can add interwikis by editing the source code, but it wouldn't be systematic and I'm afraid some other editor would remove them soon.--Shlomo (talk) 11:22, 19 November 2015 (UTC)

parishes turning deprecated as of 2016

You who know how to run queries against the databases here: do you know if there are any statements like:

located in the administrative territorial entity (P131):QX where QX has statement instance of (P31):Parishes of the Church of Sweden (Q615980)?

We should put an end date on such statements (if we have any). From 2016, such statements are only legitimate within the Church of Sweden (Q749243). Technically such statements became deprecated already in 2000, when the Swedish Lutheran church finally lost its status as national church. But the parishes have not been replaced with districts until now. -- Innocent bystander (talk) 10:55, 18 November 2015 (UTC)
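A statement pattern like that can be listed with the SPARQL query service (a minimal sketch, assuming its standard wdt:/wd: prefixes):

  # items whose P131 value is an instance of Parishes of the Church of Sweden (Q615980)
  SELECT ?item ?parish WHERE {
    ?parish wdt:P31 wd:Q615980 .
    ?item wdt:P131 ?parish .
  }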

Lots of things are located in administrative territorial entities which are not themselves administrative territorial entities - every informal village, for instance. I believe that pretty much every road, museum, monument, church and parish should have a 'located in the administrative territorial entity (P131)' statement. As well as this, parishes should also have a 'location (P276)' statement linking them to their diocese (Q665487) - i.e. the next larger religious administrative entity.
In the case of Parishes of the Church of Sweden (Q615980) there is a problem in that these were both instance of (P31):civil parish (Q4976993) and instance of (P31):parish (Q102496) (religious) until some date when their legal status (but not their location) was changed. Maybe we should just add a statement <instance of:civil parish of Sweden>, with an end date, to each of these parishes? Joe Filceolaire (talk) 19:16, 19 November 2015 (UTC)
Detail (There are levels between Parish and diocese (Q665487) in the Swedish church.)
To make it more complicated: these parishes are in fact the "religious parishes". That they kept some parts of the civil administration until today was because it was, in the 19th century, regarded as a religious responsibility to educate children, keep records of births/deaths and draft soldiers. The civil parishes (socken) have today lost almost every part of their civil administration but are still regarded as a territorial entity used in registration of the cultural heritage.
A very large part of these parishes have since 1999 been merged. We who are not employed by the Swedish church (I am not even a member) have very little interest in how the Swedish church is organised today, and the notability of new parishes in the Swedish church is an unsolved issue on svwp. -- Innocent bystander (talk) 08:32, 20 November 2015 (UTC)

Does a screenwriter or director have to be an individual or can it also be a group of persons?

Currently a lot of director (P57) and screenwriter (P58) claims are added for a group of people (mostly because Wikipedias use the groups instead of individuals). I recently asked for a bot to change that to the individual members of a group (for property P57). The result was that the participants in that discussion said that a group of people could also be a director (Q3455803). The same would then apply to screenwriter (Q28389). P57 requires P106 to be one of film director (Q2526255), director (Q3455803), television director (Q2059704), theatre director (Q3387717) to not violate that constraint; P58 requires P106 to be screenwriter (Q28389). The problem however is that if we apply P106 to non-individuals they violate the constraints of P106. So what is the best solution in this case? Allowing groups for P57 or P58? Cleaning up P57 and P58 to individuals (probably using a bot)? Allowing groups to have P106 too? Something completely different? Mbch331 (talk) 12:14, 18 November 2015 (UTC)

I think our constraint system isn't flexible enough. The perfect solution would be:
Checking Claim = P57/P58:Item
If Item has claim P31:Q5 then
 Item has claim: P106 is in {{Q|2526255}}, {{Q|3455803}}, {{Q|2059704}},{{Q|3387717}}, {{Q|28389}}
 Item has claim: P21 is in male, female, etc.
ELSEIF Item has claim P31 subclass of group of people THEN
 Item has claim {{P|527}}:Items
  Items of P527 have claim P31:Q5, P106 is in same list as above, P21 is in same list as above
ELSE Claim violates constraint
But I don't think the current constraint system (or the new one) can handle this. Mbch331 (talk) 15:05, 18 November 2015 (UTC)
That would be a great improvement over how the constraint system is now, but it is probably hard to do in an easy-to-use but still machine-readable way. Also I don't know how it can be represented using properties, which is the way in which it seems that constraints will work in the short/medium term. At least that is what I suppose we are going to do with the property constraint (P2302) property created a couple of days ago. -- Agabi10 (talk) 20:16, 18 November 2015 (UTC)
I don't know when we're actually going to switch to property-based constraints, but the plan is to switch to that system. The properties like P2302 have been created so it can be tested how it works. When we're going to switch I don't know. Also there are 2 types of constraint reports that need to be adapted before we can fully switch from template-based to statement-based. (The daily reports by KrBot and the reports behind Special:ConstraintReport). Mbch331 (talk) 21:08, 18 November 2015 (UTC)
I think it wouldn't be too difficult to extend the current syntax to allow some conditional constraints while still being easy to use and machine readable, but I don't know how it could be done with properties/statements. For this example, I think adding support for a "when" parameter to constraints might be enough (e.g. {{Constraint:Target required claim|property=P21|when=P31:Q5}}, which would mean that the constraint applies only when the item has P31:Q5). Something like that would be very useful for postal code (P281) too (postcode formats vary from country to country so being able to apply different format constraints based on the value of country (P17) would let us make much better constraints). - Nikki (talk) 07:55, 19 November 2015 (UTC)
I wonder - how about a SPARQL-based constraint system? Such a thing would be easier to express in SPARQL. --Laboramus (talk) 00:25, 21 November 2015 (UTC)
SPARQL- or Lua-based are good options (I would prefer PHP-style, but that's only because I know that language better). But this whole discussion about flexible constraints isn't of much use if only humans can be a director (P57) or a screenwriter (P58). And I haven't heard any opinions on that part of the discussion. Mbch331 (talk) 10:20, 22 November 2015 (UTC)
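As an illustration, a SPARQL-expressed check for the P57 case above could look something like this (a sketch only, not an agreed constraint): list director (P57) values that are neither humans nor groups whose listed parts (P527) are humans:

  SELECT DISTINCT ?work ?director WHERE {
    ?work wdt:P57 ?director .
    FILTER NOT EXISTS { ?director wdt:P31 wd:Q5 . }
    FILTER NOT EXISTS {
      ?director wdt:P527 ?member .
      ?member wdt:P31 wd:Q5 .
    }
  }
  LIMIT 100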

Reliable bot imports from Wikipedias?

In a Wikipedia discussion I came by chance across a link to the following discussion:

Not seeing that this was already an archived discussion, I accidentally posted some commentary there, so I have now deleted it there and reposted it here:

To provide an outside perspective as a Wikipedian (and a potential user of WD in the future): I wholeheartedly agree with Snipre; in fact "bots running wild" and the uncontrolled import of data/information from Wikipedias is one of the main reasons for some Wikipedias developing an increasingly hostile attitude towards WD and its usage in Wikipedias. If WD is ever to function as a central data storage for various Wikimedia projects and in particular Wikipedia as well (in analogy to Commons), then quality has to take the driver's seat over quantity. A central storage needs much better data integrity than the projects using it, because one mistake in its data will multiply throughout the projects relying on WD, which may cause all sorts of problems. For a crude comparison, think of a virus placed on a central server rather than on a single client. The consequences are much more severe, and nobody in their right mind would run the server with even less protection/restrictions than the client.

Another thing is that if you envision users of other Wikimedia projects such as Wikipedia, or even 3rd-party external projects, eventually helping with data maintenance when they start using WD, then you might find them rather unwilling to do so if not enough attention is paid to quality; instead they will probably just dump WD from their projects.

In general, all the advantages of the central data storage depend on the quality (reliability) of the data. If that is not given to a reasonably high degree, there is no point in having central data storage at all. All the great applications become useless if they operate on false data.--Kmhkmh (talk) 12:00, 19 November 2015 (UTC)

I agree with you that the quality is important, but I think that we'll always need bots to import the data. The quantity of "relevant" information generated each day is too big for the people adding data to do it manually, and what we need is not preventing the inclusion of that data. What we need is to continue improving the constraint reports to detect more errors and inconsistencies, and also improving the tools we have to solve the detected errors. Also we can automate the correction of some errors that are easily correctable.
The problem we have is not that there are bots importing information to Wikidata from the different Wikipedias; the problem is that not all the people adding the information check the constraint violation reports, and that there are bots that don't make all the corrections they should.
I think that saying we should stop adding information with bots from the different Wikipedias is the same as saying we should not allow IPs to edit entities because most of the vandalism is generated by IPs. It goes against what I think Wikidata is. As it is said on the Main Page, "Wikidata is a free linked database that can be read and edited by both humans and machines", and I don't like the idea of putting most machines out of the equation. -- Agabi10 (talk) 12:57, 19 November 2015 (UTC)
  • Well, I don't agree with your analogy as it ignores my main point above. Wikipedia is not a central data storage, hence IP vandalism does not cause any central damage (aside from the fact that WP is using a lot of resources, human and software (various bots, flagged revisions), to combat IP vandalism). For a central storage such as WD, however, you need a different, even stricter paradigm.
Also, I made no arguments against bots as such (you can't reasonably run WD without any), but rather that the use of writing bots should be handled more restrictively and, more importantly, that Wikipedia should not be treated as a reliable source. It would be much better to restrict writing bots to importing data from scientific and federal databases that are considered to be reliable. Or to put it this way: import from Wikipedias' sources rather than from Wikipedia, and use stricter rules than Wikipedia for data integrity.
Of course it also depends on what you ultimately envision WD to be. Is it supposed to be only "a free linked database that can be read and edited by both humans and machines", or is it supposed to be a reliable central data storage for other Wikimedia projects (but also external projects) as well? If WD is just the former and not the latter, you might be right, but if it is supposed to be the latter as well, then imho Snipre's criticism is spot on and the current practice isn't really working.--Kmhkmh (talk) 13:23, 19 November 2015 (UTC)
  • @Agabi10: Please read the comments. Nobody is requesting a stop to all data imports, only to data import from Wikipedias. If you want to use the big sentences, just take account of the definition of WD in Wikidata:Introduction: "A secondary database. Wikidata can record not just statements, but also their sources, thus reflecting the diversity of knowledge available and supporting the notion of verifiability".
If adding data is a kind of right, adding sourced and verified data is a duty. We agreed to import quite a lot of data from WPs at the beginning because we needed data, but now that people are working on and curating the raw imported data, bots can't continue to add unverified data (data from WPs are not sourced, because WPs are not a source).
We can't continue to curate the data using constraint reports (it took me several weeks to do that job for chemicals) and encourage people from the different WPs to correct and add data when a bot can destroy all the work in a few hours (I experienced that situation and now I prefer to stop correcting items until I get an assurance we can work without risk). The main problem with imports from Wikipedias is the articles connected to the wrong items. As people most of the time can't correct the sitelinks due to language barriers, they just correct the Wikidata data and assume that Wikipedians will correct their sitelinks when they don't find the data they want to see in their articles.
We finished the first phase, which focused on populating the items; now we have to focus on data curation and on quality improvement. So only data imports from external databases recognized as references should be approved. Snipre (talk) 14:12, 19 November 2015 (UTC)
It would be so wrong to do this. Wrong on several levels. First of all, you conflate two issues: the existence of statements and the existence of sitelinks. When sitelinks are wrong, they need to be aggressively remedied. This means not only removing the wrong links but also ensuring that items exist that relate to the old link. In this way we prevent imports that are wrong. There is no helping it. It needs to happen, and the incorrect import is EXACTLY how we know that these sitelinks are wrong.
When statements are wrong, when Wikipedia is wrong, it needs attention. By importing data, such errors get exposed. It is the only way whereby we improve quality, not only in Wikipedia but also in Wikidata. There is no helping it, and not importing certainly means ignoring the facts, without any other mechanism that brings solace.
If anything, we NEED to concentrate on comparison of our data to other sources. Not only to Wikipedia but to any source. When we have lists that indicate issues, we can concentrate on where the problems are. We can signal to the Wikipedias, to other sources that issues exist. This is a much better way of dealing with issues. Wikidata is a tool, it is a mechanism that allows us to make a difference. Only storing data is so little of what we can do... Thanks, GerardM (talk) 16:08, 19 November 2015 (UTC)
  • We have no way of communicating back to the Wikipedias. This list is two years old. Nothing happened. --Succu (talk) 16:32, 19 November 2015 (UTC)
  • GerardM Managing sitelinks is not the work of Wikidata: we have no way to force some WPs to link articles to some items. Your aggressive remediation would be a kind of attack on the freedom of the WPs. Then, even when you inform the WPs and the corresponding projects about errors, no answer is provided, because some WPs and some projects don't have enough contributors to do it, or contributors with the skills to judge how they have to manage their articles and sitelinks. Then you have the famous problem of some articles mixing different concepts into one article even when several items exist for them. Data should be split into different items, but no bot is doing that. All data is just imported into the linked item. And you can't force WPs to split their articles. So Wikidata should focus on data, and good data. Snipre (talk) 10:42, 20 November 2015 (UTC)
I concur. Communication between Wikipedias and Wikidata is a key point. On frwiki a lot of users consider Wikidata a totally outside project, and it's very hard to communicate about Wikidata on Wikipedia. For example, a simple thing such as translating the Wikidata Weekly on a project talk page fr:Discussion Projet:Wikidata is hard, and some people keep ... asking for this to stop because they don't understand how it can help to use Wikidata's data on Wikipedias. This does not work :/ We certainly MUST tie the projects more closely to Wikidata, by helping information go back and forth, and make Wikidata a home for Wikipedians. author  TomT0m / talk page 16:41, 19 November 2015 (UTC)
  • @Snipre: If the main problem is the incorrectly linked articles, the problem will continue to exist regardless of whether we import the data or not. What we have to do in that situation is create ways to solve that problem. Importing data from Wikipedia helps detect this kind of problem, even if we have to clean up items afterwards; the problem is if we only clean the wrong statements. If we only clean the wrong statements our work is worthless; the incorrect sitelinks should also be corrected. Yes, the person who corrects the data can face language barriers in understanding the content of the Wikipedia page, but the good part is that they can easily get in contact with people who can understand it. We have a Project Chat in lots of languages here on Wikidata, and also the Village Pumps of the different Wikipedias. Babel also does a good job with the categorization of users depending on the languages they know. When the incorrect statements have imported from (P143) added, it's easy to know which language we have to go to to ask for help.
@Succu: Having that link on Wikidata is completely useless; most Wikipedians don't care about what happens on Wikidata, and they probably don't even know about the existence of that list if you don't tell them. Have you tried adding that link to the local Wikipedia and asking for help in the local Village Pump? For most people Wikidata is no more than a tool that the Foundation wants to force them to use, and we are the ones who have to make the difference, not the Wikipedias. Wikipedians are used to the way they have worked until now and they don't want to change that. So we just can't expect that the ones who edit Wikipedia will be editing Wikidata, not at this point at least. -- Agabi10 (talk) 16:52, 19 November 2015 (UTC)
  • Sure we did, Agabi10. Most of the problematic cases in taxonomy are caused by bot-generated content. --Succu (talk) 17:11, 19 November 2015 (UTC)
  • The problem of sitelinks is not a problem of Wikidata but of the WPs: something that is a real choice of some WPs. Example: the case of drug articles, which mix commercial data, medical data and chemical data. In Wikidata we have two items for that case, one for the commercial product and one for the active molecule. Depending on the choice of the WPs, you can import chemical data into the commercial item or the inverse. Snipre (talk) 10:42, 20 November 2015 (UTC)
For Wikipedians Wikidata is a tool (with a central storage aspect across all Wikipedias); they will use it when it is convenient and helpful, and otherwise they won't. However, the lower the data integrity/correctness is (and the less restrictive the procedures to create such data are), the less inclined Wikipedians will be to use it and to maintain/service WD entries on the side. Or to put it this way: the current practice makes it less likely for Wikipedians to help out.--Kmhkmh (talk) 17:31, 19 November 2015 (UTC)
And if there are no Wikipedians who use Wikidata data, except in a few cases where there is a good open dataset, there won't be any good data on Wikidata because there won't be anybody to work on it. Deadlock. author  TomT0m / talk page 17:59, 19 November 2015 (UTC)
Can anyone write a bot that runs through all Wikidata edits ever, and undoes all bot reverts of humans? (And maybe notifies the bot author if it happens a lot?) Is that even possible?
I agree that we need to stop, or at least slow down, the uncontrolled bot imports from Wikipedias. There's a lot of problematic data being added. Wikipedia imports, where done, must be supervised by a human. If there are sources on Wikipedia, importing those is important. In many situations, it's important to get another user to look over the job before running the import. Also, it would help if, after doing a full (careful!) import from a Wikipedia in an area, the Wikipedia had its local data cleared out, to prevent the kind of problem where Wikidata gets something fixed and the fix gets overridden by the old data being pulled back from the Wikipedia again. Or, if another import doesn't happen, and the data is left duplicated on a Wikipedia, then editors will continue editing the local versions, and Wikidata will become out of date.
There's a lot going on, and the speed of imperfect data imports is more than we can handle right now. We need more human manual editors curating data, and their work is becoming near impossible with the bots running wild. Curating existing data is more important than adding new data which won't be usable until sufficiently curated anyway. --Yair rand (talk) 20:09, 19 November 2015 (UTC)
  • Running a bot through all Wikidata edits ever is not impossible, but it should not be done on the live data, because it would generate a lot of unnecessary load on the servers; instead it should be done with a database dump.
As for the removal of the imported data from the local Wikipedias, that is hard to do, because even if the objective is being able to use templates without local parameters, this is something that, at least for now, is impossible for us (the people from eswiki). Not all the needed properties have been created, nor are all of the current values of the infoboxes available, in most of the "Wikidata-friendly" templates. In that situation what I think we need is even more information here, because even if we try to teach users that empty templates can also show information, this is not always true, and it is never true for all the parameters (I don't think that we have any infobox with all the parameters connected to Wikidata yet).
I don't agree that we have to stop the import of data from Wikipedia, but I agree that the person importing the data should at least check that the imports they made don't add new errors to the constraint reports, and if violations are added they should be responsible for checking and solving the problems. The thing is that we can't force them to do that. -- Agabi10 (talk) 21:04, 19 November 2015 (UTC)
  • Removing local data after import sounds nice, but isn't workable. There are wikis that currently don't use data from Wikidata for their infoboxes or other templates (nlwiki is such a wiki). I know that the Dutch community sees using data from Wikidata as a restriction on the possibilities to edit Wikipedia. Mbch331 (talk) 21:30, 19 November 2015 (UTC)
    I can understand the Dutch WP community well. Even if a Wikipedian cares to learn about WD and edits a statement there, his work is often soon destroyed by some bot or WiDaR mass action. As of now, there is no way to prevent it or to protect your work; the complaints to admins are being thoroughly discussed and then archived without any action. In this situation, the Wikipedians (Wikisourcians, Wikiquotians, etc.) are completely right that using data from Wikidata means losing control over it.--Shlomo (talk) 08:59, 20 November 2015 (UTC)

Off topic: Stop changing the title, I am tired of entering the topic only to see that the only change is the title. -- Agabi10 (talk) 23:22, 19 November 2015 (UTC)

If we added a property to an item as soon as the item is used in a wiki (<item used in wiki> <wikipedia x>), this could tell bots and Wikidata editors to be very careful about changing the data. This would give Wikipedia editors some control over the data they will see in the infoboxes. --Molarus 01:39, 20 November 2015 (UTC)

Remarkably to me, the intro suspects that bots are the root cause of all problems, whereas my experience is that bots are operated by users who have very valuable knowledge of Wikidata, and that most issues are created by (often one-time) users who have no idea what they have to do, nor that what they did is causing issues. On my home wiki we have several articles every day that lose all their interwiki links, because users are renaming/moving around articles, and I've never seen this being caused by a bot. For me it's a bit obnoxious to blame bots without proving that bots are the cause of the exposed headache. Edoderoo (talk) 06:41, 20 November 2015 (UTC)
Not the bots, but irresponsible bot users. I don't think there are many of them here; the problem is that one or two of them can destroy or devalue the work of hundreds.--Shlomo (talk) 08:59, 20 November 2015 (UTC)
That would make it simple. Every bot has an owner, and when the owner is irresponsible, action needs to be taken. But no specific users have been named, yet all the bots have been blamed. To me this sounds like bot-o-phobia that is not solving the problem. On the other hand, interwiki links have been named as an issue, and before there was Wikidata, interwiki links were also an issue, as for some items it's not always possible to get a true 1-on-1 link across *all* the languages, especially not when one does not speak the languages well enough to see whether it's about exactly the same item or not. Edoderoo (talk) 09:45, 20 November 2015 (UTC)
Often the owner is not responsible for bad data imports: someone comes and just asks him to perform that action and he does it in good faith. We don't say that bots are the problem, but that bot work is a source of problems. This is not a problem with bots; this is a problem with people who extract data from WPs with bots. Snipre (talk) 11:04, 20 November 2015 (UTC)
@Edoderoo: As soon as your "action needs to be taken" changes to "action is taken", we can have a new situation here, and we can start working on improving the bots' performance and afterwards trying to convince Wikipedians that cooperation makes sense. Trying it the other way round is a waste of time. Specific users have been named in specific discussions of specific issues; no need to break down this general discussion into single issues.--Shlomo (talk) 11:32, 20 November 2015 (UTC)

At one point, it will be a self-reliant system. ;)

Wikidata - Wikipedia relationship.

Mustard of the sunshine (talk) 09:08, 20 November 2015 (UTC)

We can't ask WPs to delete data after import because not all WPs agree to use Wikidata. We have to stop thinking about a close relationship between WP and WD, because it doesn't exist. WPs want to keep their freedom and be able to overwrite the data coming from WD with their chosen data. We can't expect to find solutions for WD by making modifications in the WPs. WPs won't accept that. Snipre (talk) 11:04, 20 November 2015 (UTC)
  • Are there fields where people think WD is particularly unreliable? This given that most of Wikidata consists of P31 statements, identifiers, basic infobox data about persons, taxonomy, and location and country statements? --- Jura 11:17, 20 November 2015 (UTC)
One often contentious piece of data that comes to mind is dates of birth, I assume. They are often unsourced or based on unreliable sources (such as IMDb or external wikis) and as a result they often get removed, modified, or at some later point properly sourced. Other infobox data is often outdated (sourced or unsourced) or sometimes plainly false (unsourced), for example, for geographic objects, data on size, length or inhabitants. With taxoboxes there are, as far as I know, issues with different methodologies in different Wikipedias rather than unreliable data as such.
And then again there is the general storage problem, meaning that as long as it is as easy or even easier to intentionally or accidentally vandalize data in WD, Wikipedias will be unlikely to outsource their "live" data to WD. Even more so since local data in (local) Wikipedias is increasingly sourced (by reliable external sources); switching to (possibly or even likely unsourced) data (sourced here meaning reliable external sources, not Wikipedia itself) seems like a step back for them.--Kmhkmh (talk) 13:23, 20 November 2015 (UTC)
According to my experience with persons: citizenship, ethnicity, religion, academic degree, occupation, employer.--Shlomo (talk) 14:20, 20 November 2015 (UTC)
As the Project chat is not the best place to keep a discussion for a long time, I started an RfC about bot policy in order to see what the trend in the Wikidata community is. See Wikidata:Requests for comment/Improve bot policy for data import and data modification. This is not a decision, only a survey of the opinion of the community. Snipre (talk) 13:46, 20 November 2015 (UTC)
Maybe we should try to define how the various elements mentioned by Kmhkmh should be addressed. The approach for supplanting local infoboxes isn't necessarily the same as the one for providing infoboxes (or fields) to articles that don't have them yet. I don't think we have that much "size, length or inhabitants" data yet. Location data consists mainly of P131 and coordinates. --- Jura 14:38, 20 November 2015 (UTC)
Snipre: What about users relying upon tools based on OAuth (Q743238)? I know some users who never reacted on their talk page. --Succu (talk) 22:26, 20 November 2015 (UTC)

Quality and quantity

This whole point has been talked about often enough. Ask yourself: does Wikidata have sufficient statements for all the items it holds? The answer is obvious: no. With 50.13% of all items having two or fewer statements, we do not have a viable set of information. Yes, it is improving rapidly, but that is only because bots are being operated. There are many reasons why added statements may be wrong; this has been covered extensively above, but it is important to understand the issue. The issue is that Wikidata is incomplete and immature. That is not a problem; the problem is to wish this away.

As data is added to Wikidata, a large percentage is wrong. The quality of the imports by bot is, however, improving. Kian, for instance, calculates the likelihood that something is correct and has humans make the decision if it is not certain. That is a clear win. Comparison of data between Wikidata and other sources, including Wikipedias, is how we can find data that has issues; it has been promised to us for a long time. When it finally becomes available, and when it comes with a workable user interface and workflow, it will drive quality in Wikipedia, the sources and Wikidata alike.

What people do not appreciate is that bots typically are much better at importing data than people are. Once it has been determined that something is correct, it makes no difference if 1, 10 or 1000 statements are imported. The consequence is that as a percentage a bot makes fewer mistakes than a human. There are mistakes but the reason why is likely to be found in the underlying data not in the algorithm.

It does not help that Wikipedias insist on doing things their way and expect Wikidata to follow. It cannot, and it should not, particularly when, from a data point of view, the Wikipedia way is inconsistent. If this means that Wikidata is not good enough at this point, so be it. We should have time to grow up.

At the same time, there are increasingly collections of data that are good enough: things like links to external sources, and person data like date of death. There are plenty of people who are happy to maintain those, but to make it shine we should have ways to report issues with other sources. Again, this has been promised for quite some time. It is, however, the best way to make our work relevant, and it will drive quality at both ends. This in turn will have an effect on the whole of Wikidata.

In conclusion, Wikidata bot operators add so much data that they cannot remedy every possible error. Their work is very much required because there is too little data to work with. When you compare data, no data in Wikidata means no comparison OR you add the data. When we focus on quality by determining what is likely correct and likely incorrect, we will actually achieve something. By looking at Wikidata from the perspective of set theory our work will have the biggest impact and we will achieve more in less time. Thanks, GerardM (talk) 08:19, 22 November 2015 (UTC)

While I agree with much of what is said above, in particular with regard to the need for bots, it nevertheless misses two points of the discussion.
There is no dispute about the usefulness of and need for bots; the dispute is about the management/authorization/supervision of those bots.
From the Wikipedias' perspective, they don't expect WD to do it their way; in fact, in some regards they don't care at all how WD does it, but they do care about the quality of the result if it is to be used in the Wikipedias.
With regard to your "so be it": in my perception there are many Wikidatans who seem to push use cases and usage scenarios (which as such is not a bad thing), but for many/some of exactly those, the data quality needs to be better. Or to put it this way: "en:You can't have your cake and eat it".
--Kmhkmh (talk) 15:19, 22 November 2015 (UTC)
When people work towards particular use cases, they do it within an environment, i.e. Wikidata. Everything is connected, and when their assumptions do not jibe with what others assume, it will fail. I have my pet projects, and typically my assumptions are fine because they fit in with what we do. When this is not the case, those people are out of luck. As long as Wikidata is this immature, we need at least 500% more statements, and it has to start with "instance of" and "subclass of". The biggest hurdle is that many items have no single subject; this will have to improve dramatically. Thanks, GerardM (talk) 19:59, 22 November 2015 (UTC)

504 Gateway Time-out[edit]

API is down. --Jobu0101 (talk) 07:14, 20 November 2015 (UTC)

It has been down since Thursday night (CET). Actually I have no idea where to report it, or where to trace the current status. Edoderoo (talk) 08:57, 20 November 2015 (UTC)
Phabricator or the development team page would be a good start. Mbch331 (talk) 09:17, 20 November 2015 (UTC)
I thought Phabricator was for bugs; is it also for incidents like this? And the "development team page" could indeed be a good place, but I had trouble finding it. Sooner or later someone will fix it, I hope. Edoderoo (talk) 09:37, 20 November 2015 (UTC)
If something that should work doesn't work, that's a bug. Mbch331 (talk) 09:44, 20 November 2015 (UTC)
Hey :) This is Magnus' tool so please do reach out to him. The contact page for the development team is at Wikidata:Contact the development team‎. --Lydia Pintscher (WMDE) (talk) 10:18, 20 November 2015 (UTC)
It's working fine now ... I guess a 20 hour down time, time to catch up ;-) Edoderoo (talk) 20:13, 20 November 2015 (UTC)

What's in Wikidata? New pie chart[edit]

Statements by property gives another pie chart, this time by number of statements for a group of properties. --- Jura 11:55, 20 November 2015 (UTC)

Great, thanks for sharing--Ymblanter (talk) 20:14, 20 November 2015 (UTC)
Jura, what is subsumed as bibliographic fields? --Succu (talk) 23:00, 20 November 2015 (UTC)
Good question. It's mainly "publication date (P577), collection (P195), record label (P264), published in (P1433)", but also "title (P1476), publisher (P123), title (P357), volume (P478), page (P304), issue (P433), short author name (P2093), subtitle (P1680)". Not entirely optimal. --- Jura 08:27, 21 November 2015 (UTC)

Copy references[edit]

Hello. I enabled the DuplicateReferences gadget ("Adds a link to copy references and add them to other statements on the same item"), but it is not working. Am I doing something wrong? Xaris333 (talk) 14:44, 20 November 2015 (UTC)

I did the same yesterday, but although I had the "paste reference" links, I did not manage to "copy" a reference ... Is a link to do that supposed to be shown next to already created references ? author  TomT0m / talk page 14:47, 20 November 2015 (UTC)
A few days ago I was using User:Bene*/DuplicateReferences.js (see User:Xaris333/common.js). After refreshing the item page, a copy option was shown next to every reference. By selecting copy, the insert reference option was enabled. Xaris333 (talk) 14:54, 20 November 2015 (UTC)
See Wikidata:Contact the development team#Adding reference. -- Innocent bystander (talk) 15:39, 20 November 2015 (UTC)

Preferred and normal rank[edit]

What is the proper use of "preferred rank"?

Is it appropriate to use it, if a property has many values for an item, to choose which ones should be included in an infobox (and which should not be shown)?

Or, should somebody writing a query be able to assume that e.g. a SPARQL search using the wdt:... version of a property (and corresponding 'simple' RDF dumps) will usually include all values for that property that are 'true without qualification'?


The issue has been brought into focus by attempts to make the standard place-infobox template on es-wiki draw the value of instance of (P31) directly, in order to state the nature of the place.

However, for an item like Frankfurt (Q1794), which currently has seven different active values for P31, this leads to a pile-up of descriptions, as seen in the infobox at es:Fráncfort_del_Meno in the section between the name and the picture.

After some discussion on es-wiki's equivalent of Village Pump, the plan was therefore developed to mark the value city (Q515) as 'preferred' on all items for which it occurs (and presumably, similar action would follow for other types of settlements etc). [pinging: @Greenny, Agabi10, Shadowxfox, Metrónomo: ]

However, it is not clear whether this is always correct. For Rennes (Q647), as User:VIGNERON has raised, there may be other values -- e.g. commune of France (Q484170), and/or regional capital -- that may be more significant. (Also, different wikis may disagree about which value is the 'most important' to show.)

The other issue about marking some values as preferred is that it makes other values disappear not just from infoboxes, but also from 'simple' RDF dumps, and from SPARQL searches using wdt:... versions of properties.

So for example, making city (Q515) the preferred value of instance of (P31) for Cardiff (Q10690) means that the value principal area of Wales (Q15979307) is no longer visible to simple searches, as in e.g. the search queries in column 2 and column 4 of the following template, which now see only 21 principal areas of Wales instead of 22:


It is possible to work round this in SPARQL by always using the two-part expression p:P31/ps:P31 for the property instead of wdt:P31; but this will introduce an extra join every time the query is run, which may make it significantly slower (and may confuse the query optimiser, potentially making the query a lot slower, or requiring it to be hand-optimised).

Having to use this syntax also has the effect of including all deprecated statements, and statements which may be true only subject to qualifiers.

From a query-writing point of view, it would be nice if "preferred rank" could be used principally to distinguish statements that are true without having to consult qualifiers from statements that are true, but only if qualifiers are taken into account. Of course, it is possible (in principle) to then filter these out, but that is then more time and more complexity -- potentially for every property used in every query.
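To make the alternatives concrete, here is a rough sketch of the three query patterns discussed, using the Wales example above (assuming the standard Wikidata Query Service prefixes; an illustration, not a definitive recipe):

# 1. "Simple" form: only best-rank ("truthy") statements are visible, so Cardiff
#    drops out once city (Q515) is marked as preferred on its item
SELECT ?area WHERE { ?area wdt:P31 wd:Q15979307 . }

# 2. Statement-node form: sees every statement, including deprecated ones
SELECT ?area WHERE { ?area p:P31/ps:P31 wd:Q15979307 . }

# 3. Statement-node form with an explicit filter to drop deprecated statements
SELECT ?area WHERE {
  ?area p:P31 ?st .
  ?st ps:P31 wd:Q15979307 .
  FILTER NOT EXISTS { ?st wikibase:rank wikibase:DeprecatedRank }
}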

As an example of this, consider the located in the administrative territorial entity (P131) statements for the Scottish ceremonial county of Banffshire (Q806432), which (as marked up on the item) is partially located in Aberdeenshire (Q189912) and partially in Moray (Q211106). For people using the version of the RDF dump without qualifiers, it is useful to give the P131 as just Scotland (Q22), because this is true without qualification. Using the 'preferred rank' setting for Scotland does this -- it hides the other two values from simple searches and simple dumps, so that a simple recursive search for places in Moray does not see Banffshire as being in Moray and then go on to include all the places in Banffshire that are not in Moray but are actually in Aberdeenshire.
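To illustrate, a sketch of such a recursive search as it might be written on the query service (again assuming the standard prefixes):

# All places located, directly or indirectly, in Moray (Q211106).
# With Scotland (Q22) set as the preferred P131 value on Banffshire, the wdt: path
# does not pass through Banffshire, so places actually in Aberdeenshire are not
# pulled in by mistake.
SELECT ?place WHERE {
  ?place wdt:P131+ wd:Q211106 .
}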

So from a search point of view, this can be a really useful effect of "preferred rank".

On the other hand, for something like Cardiff (Q10690), one would like to be able to retrieve using just a simple search that it is a principal area of Wales (Q15979307), to include it in the list of principal areas, and not have this hidden at the simple level.


SO: Given these two different potential objectives, what are the right circumstances in which to use or not to use "preferred rank"? Help:Ranking isn't particularly clear on this point, so wider input would be useful. Jheald (talk) 23:08, 20 November 2015 (UTC)

My English is very bad, so I apologize in advance.
We have several cases to take into account for this:
In the case of Frankfurt (Q1794), the features "big city, city with over a million inhabitants, imperial city and college town" can be summarized as simply "city", which is certainly the best item encompassing all the features described above.
For other cases, not all items should be removed since they reflect a different or higher territorial order in several instances. For example, a city can also be a state or a district, like Bremen (Q24879).--Shadowxfox (talk) 01:00, 21 November 2015 (UTC)
First of all, and before starting to analyze other alternatives: would it be possible to ask the developers for an alternative to wdt:... which includes all the statements that do not have "deprecated rank"? This would solve the problem of deprecated statements showing up. We would still have the problem of having to get the values that are true only subject to qualifiers, as you described, but in this case both options would be easily usable, and it would be the job of the person creating the query to choose the one that fits their scenario better. -- Agabi10 (talk) 01:22, 21 November 2015 (UTC)
Currently "preferred rank" is set by bot for some properties to define values without end-date (incumbents) or most recent data (population, if I recall correctly).
What should get preferred rank probably varies depending on the type of property. P31 is different from most other properties. --- Jura 00:44, 22 November 2015 (UTC)
No, we can't use rank to select preferred data in queries in the case of P31. P31 is used in the structure of WD classification and not like other data. It is like WP categorization, meaning this is a definition chosen by WD. Snipre (talk) 08:00, 23 November 2015 (UTC)
Actually we usually refer to real-world definitions, such as commune of France (Q484170) (View with Reasonator), which is indeed used by WD but defined by France. author  TomT0m / talk page 11:23, 23 November 2015 (UTC)
But this implies a cultural or linguistic definition. Suppose X is defined as commune de France by WP:fr, town by WP:en and ciudad by WP:es? Who is right? Snipre (talk) 21:14, 23 November 2015 (UTC)
It's not cultural or linguistic. French communes are defined by the French administration and state; that is true in all languages. Of course "French commune" cannot apply to a German city. Maybe WP:es does not have special treatment for every kind of municipality in the world; then, as the class is a subclass of "municipalities in the world", it can perhaps find the right label that way. But of course Wikidata is here to reflect cultural differences; it's an NPOV project, and NPOV is not achieved by losing information, but by tying together several aspects of reality. author  TomT0m / talk page 21:53, 23 November 2015 (UTC)
@TomT0m: The question is not so much which statements are true (many of them may be), but how e.g. es-wiki can select which one(s) out of those statements to present in the sub-title section of its infobox. To step aside from the ville/city/commune discussion, consider Glasgow (Q4093). Which of the different P31 classes that Q4093 is an instance of should be displayed at the top of a Spanish infobox? Is the answer the same for the top of a Scottish infobox? And what is the best mechanism to achieve this? Jheald (talk) 22:30, 23 November 2015 (UTC)
@Jheald, Snipre: I guess that, as every Spanish-speaking person knows the terms used in Spain and/or other Spanish-speaking countries, there is no real problem using the specific classes. When referring to localities outside of those regions, it might be necessary to fall back to more generic concepts such as "commune du monde". The infobox can be the right place to present the information properly. I don't think there is a generic solution we can apply here that would not require some treatment of how to present the information, and obviously Wikidata is not the place to do that. author  TomT0m / talk page 11:47, 24 November 2015 (UTC)
TomT0m Concerning your first comment: you assume that only the French definition can be applied to communes of France, and this is typically a cultural bias. WP:es can consider the communes of France in the same way as their municipalities in Spain, and the French administration or WP:fr can't do anything to prevent that.
It's not a cultural bias to say that communes as defined by the French administration, that is, "settlements in French territory that are eligible to have a mayor" and so on, are ... French communes. This is just an objective transcription of the facts; there is no bias at all. It happens that other states have similar regulations and that we can regroup settlements into a broader class. It's irrelevant to invoke a principle of cultural bias in this case. We need to reflect everything, not simplify everything and lose a lot of information. You don't seem to acknowledge that we can do both; why? What's wrong with my proposition? You escalate to broad and vague principles and don't answer my proposition ... author  TomT0m / talk page 12:55, 24 November 2015 (UTC)
Jheald No, WP:es can't select the "instance of" they want to see in their infobox by using the rank, simply because WP:fr will do the same, WP:de too, not to mention WP:en. So in the end we will get several "instance of" values with preferred rank and nothing will be solved.
The problem you present is the result of the accumulation of data coming from different WPs without any global view. We have to define a structure to avoid the accumulation of instance of values by 1) creating a class structure and 2) transferring some information into specific properties. A class of items is the set of items sharing some special properties, so we can always create new properties to avoid class creation. For Frankfurt (Q1794), we have instance of financial centre (Q1066984); we can transform this instance of into a property like "activity":"finance". Then big city (Q1549591) can be deleted and the data can be retrieved by doing a query with two constraints, instance of (P31)=urban district of Germany (Q22865) and population (P1082)>100'000 (see the sketch below). We have to create a new system of classification for cities which can be applied to all cities around the world and which is independent of cultural systems and definitions.
So instead of commune de France we could have something like "instance of": "smallest administrative unit" and "located in": "France", and by doing the correct query we can retrieve the same items as with a class system. Snipre (talk) 12:35, 24 November 2015 (UTC)
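A sketch of the kind of query described above, written for the SPARQL endpoint with the property and item numbers given in the comment (an illustration, not a worked-out proposal):

# German urban districts (Q22865) with a population (P1082) above 100,000
SELECT ?city ?population WHERE {
  ?city wdt:P31 wd:Q22865 ;
        wdt:P1082 ?population .
  FILTER(?population > 100000)
}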
"smallest administrative unit" => Good luck to define this. We have administrations who defines all this for them, not all their definitions are consistent from one state to another, it's far more simple just to reflect them in classes defined the same way administrations define them and to regroup similar definitions ourself. Plus there is regions defined for statistical purpose like "urban unit" that might not reflect the administative definitions. Your scheme is unapplicable and biased because it just loses a lot of informations, overall a mess because you just push the classification complexity in the specific properties that will not be usable with generic tools because they will all have their own instructions. What we need to focus on is the general scheme and make it able to be able to express local peculiarities while beeing able to express the information in less specific classes everybodu can retrieve that going up the class tree, and document those more generic concepts. author  TomT0m / talk page 12:55, 24 November 2015 (UTC)
Also, please don't forget the possibility of using metaclasses to classify classes. If different cultures/countries have different concepts and definitions for the smallest kind of municipality, they can all be tagged with instance of (P31), which is also a way to reconcile Snipre's view with more specific class modelling. author  TomT0m / talk page 13:15, 24 November 2015 (UTC)

Wiktionary support[edit]

Is there an update to the plans for Wiktionary? I just noticed that most supporting properties suggested in preparation at Wikidata:Property_proposal/Sister_projects#Wiktionary got closed as "not done". --- Jura 10:13, 21 November 2015 (UTC)

The most recent proposal is Wikidata:Wiktionary/Development/Proposals/2015-05. It's not being worked on yet. I think it's far too early to propose Wiktionary properties (it's still only a proposal) so I think it would make sense to collect a list of things Wiktionary can store (and whether there's an existing property that might be suitable) somewhere else first (maybe at Wikidata:Wiktionary). - Nikki (talk) 13:19, 21 November 2015 (UTC)
I thought contributors were encouraged to formulate proposals even for non-existent datatypes and not yet integrated sister projects. This way, they are ready when needed. --- Jura 00:03, 22 November 2015 (UTC)
I didn't say that proposals have to wait until Wiktionary support is finished. I just think it's too early to start deciding how we're going to store the data in Wikidata when we don't even know for sure how the Wiktionary support will work. - Nikki (talk) 10:08, 22 November 2015 (UTC)
Part of the proposal process is that this is being sorted out. --- Jura 11:59, 22 November 2015 (UTC)

delete aspect of history (Q17524420) (View with Reasonator) ?[edit]

These articles are about sequences of events, as we discussed earlier, so I think we should delete this item. What do you think? author  TomT0m / talk page 13:03, 21 November 2015 (UTC)

There is also history of a country or state (Q17544377). --- Jura 00:21, 22 November 2015 (UTC)
@TomT0m: aspect of history (Q17524420) was relabeled in English to "aspect of history" after the discussion back in September. Several other languages still have the old "Wikimedia history article" label, which is problematic. I don't know exactly how history items should be handled, but I don't think deleting this item outright would make sense now. --Yair rand (talk) 10:24, 22 November 2015 (UTC)
@Yair rand: Why not just ... story? If a story is a sequence/set of related events, then of course a story is an aspect of story ... it's the story itself. author  TomT0m / talk page 12:46, 22 November 2015 (UTC)
The purpose is history, not stories. --La femme de menage (talk) 13:08, 22 November 2015 (UTC)
Both are sequences of real events anyway, unless fictional. News articles are called "stories" in English, for example (of course I'm not speaking of fiction here). author  TomT0m / talk page 13:11, 22 November 2015 (UTC)

Link class of event / body part[edit]

I'm thinking of brain damage (Q720026) (View with Reasonator). Do we have an appropriate property to express that a brain injury occurs in, or damages, the brain? author  TomT0m / talk page 13:58, 21 November 2015 (UTC)

I suppose you could use the property of (P642) = brain (Q1073) as a qualifier to the statement subclass of (P279) = lesion (Q827023); but it might be useful to have something made specifically for the purpose. Jheald (talk) 08:40, 22 November 2015 (UTC)
Indeed, subclass of is not naturally "parameterized". We don't have any way to express that damage to a body, in general, usually applies to parts of that body. Hence the "of" qualifier is not really linked to anything. author  TomT0m / talk page 12:49, 22 November 2015 (UTC)
I basically agree with you.
But it is not unheard of for P279 to be "parametrised" using P642 -- indeed, this query for properties that P642 is most commonly used to qualify currently reports 649 cases of it being used to qualify P279 -- and 8248 of it being used to qualify P31. Possibly some of these uses also deserve investigation, and (perhaps) some more specific properties to be created. Jheald (talk) 13:28, 22 November 2015 (UTC)
Amended query, showing the values as well for those properties that are qualified by P642. Jheald (talk) 13:34, 22 November 2015 (UTC)
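For reference, a query of this kind can be written roughly as follows on the SPARQL endpoint (a sketch, not necessarily the exact query linked above):

# Count how often P642 is used as a qualifier, grouped by the property it qualifies
SELECT ?property (COUNT(?statement) AS ?uses) WHERE {
  ?statement pq:P642 ?qualifierValue .
  ?item ?p ?statement .
  ?property wikibase:claim ?p .
}
GROUP BY ?property
ORDER BY DESC(?uses)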
I have added afflicts (P689) brain (Q1073). Not sure if that is what you meant, though. --Andreasm háblame / just talk to me 18:01, 22 November 2015 (UTC)
Looking at the breakdown of values that other uses of P689 have, it looks perfect.
(This query is now available as a link from the top right of the property box on every property talk page. Big thanks to whoever added it to the template.) Jheald (talk) 18:38, 22 November 2015 (UTC)
Thanks Andreasmperu, that's what I was looking for. author  TomT0m / talk page 11:42, 24 November 2015 (UTC)

Please delete a links page[edit]

Wikidata weekly summary #185[edit]


New Wikibase datatype - GeoShapeValue[edit]

GeoShapeValue is a long expected and as yet unimplemented data type in Wikibase. Wikipedia needs geographic data attributes for its articles, but it's being held up by this. Since it seems there's little momentum, I've proposed WKT be used. Talk:Wikibase/DataModel#Use_WKT_for_GeoShapeValue. esbranson (talk) 21:24, 22 November 2015 (UTC)
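For readers unfamiliar with the notation, a Well-Known Text value for a simple polygon looks like the following (an illustrative literal only, not actual Wikidata data):

POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))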

An alternative would be to link to a value stored somewhere else (maybe even in a specialist repo on Wikimedia's own servers), via a URL-valued property -- quite a semantic web approach. Jheald (talk) 23:06, 22 November 2015 (UTC)
The challenge of a GUI for calendar and quantity still needs addressing.
Obviously, if you are merely interested in storing strings, you could try string-datatype. --- Jura 18:12, 23 November 2015 (UTC)

WDQ not working(?)[edit]

I'm trying to use WDQ in Autolist to find items that feature both of two claims, e.g. redundant ones. Now if I put a query of the form "claim1 AND claim2" into Autolist, it finds none or only some of the matching items, although I know there definitely exist some that it doesn't turn up.

Example: I want to find software that is redundantly stated to be under GPL and under GPL version 3. So I search with the following query:

claim[275:7603] AND claim[275:10513445]

It returns only item Q1252773, which is an expected match. But for example Q2737776 is missing, although it should be there.

What's up? Is WDQ broken? Or Autolist? Should I file a bug report or am I missing something?--Frysch (talk) 20:55, 23 November 2015 (UTC)
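For comparison, a roughly equivalent query against the SPARQL endpoint, built from the same property and item numbers as the WDQ query above (a sketch only):

# Items stated to be licensed both under GPL (Q7603) and GPL version 3 (Q10513445)
SELECT ?item WHERE {
  ?item wdt:P275 wd:Q7603 ;
        wdt:P275 wd:Q10513445 .
}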

Something's wrong. I have noticed that Wikidata:Database reports/Recent deaths is not showing any deaths for the last couple of days.--Racklever (talk) 23:18, 23 November 2015 (UTC)

Could it be that problem? That one doesn't look like it's going to resolve itself like they seem to hope...--Frysch (talk) 00:46, 24 November 2015 (UTC)

As I said in about 50 other places, WDQ is currently stuck around the 17th of November. Not sure what's wrong. Debugging will take a while. --Magnus Manske (talk) 09:50, 24 November 2015 (UTC)
Fixed it quicker than anticipated. ~1 day behind now, catching up fast. --Magnus Manske (talk) 17:53, 24 November 2015 (UTC)

Confirming it seems to be fixed.--Frysch (talk) 23:13, 24 November 2015 (UTC)

Property creation backlog[edit]

It would help if one or another administrator, property creator or other experienced editor could help us out: several properties currently proposed at Wikidata:Property proposal/Unsorted can be created.

Experienced editors (who have already proposed and regularly use various properties) can request the necessary access at Wikidata:Requests_for_permissions/Other_rights#Property_creator after reading Wikidata:Property creators. --- Jura 11:48, 24 November 2015 (UTC)

Need help splitting item[edit]

I need help sorting out the sitelinks to Category:Literature of Greece (Q7316804) (literature from Greece) and Category:Greek literature (Q10004556) (literature written in Greek). /ℇsquilo 13:37, 24 November 2015 (UTC)

Deaths at hospitals being deleted at Wikipedia[edit]

If you have an opinion one way or the other, please express it here: w:Talk:Mount Sinai Beth Israel. Here at Wikidata we list the place of death as the exact location, not just the city, but we are about to lose that information from Wikipedia before it gets migrated here. Each article on a major hospital lists the notable deaths. The majority opinion is that it is trivial at Wikipedia. --Richard Arthur Norton (1958- ) (talk) 14:54, 24 November 2015 (UTC)

Maybe we should think about a way to make place of death (P20) state either the location of the hospital (New York City) or the hospital itself. Jonathan Groß (talk) 09:02, 25 November 2015 (UTC)
To make it even more complicated: official Swedish sources normally tell where somebody lived when they died, not the location of the event. -- Innocent bystander (talk) 10:25, 25 November 2015 (UTC)

Q18900373 empty[edit]

I've noticed that Q18900373 (which has no English label) is completely empty, apart from a Wikisource link in another language. Can someone who knows the language translate it to English?  – The preceding unsigned comment was added by SadisticSouls (talk • contribs).

The language is Chinese (zh). Jonathan Groß (talk) 09:06, 25 November 2015 (UTC)
I could suggest the description "page to read once I have learned zh" --- Jura 13:32, 25 November 2015 (UTC)

Adding sources for every statement when using identifier to external database[edit]

Hi everyone! I'm in the middle of editing Joseph Bradley (Q21546271), using information from an entry in Australian Dictionary of Biography (Q672680).

I'm wondering: do I have to attach references to each of the statements I make? And do I list the source as the ADB in general, or as the specific volume of the ADB in which this person is described?

Atroceh (talk) 02:19, 25 November 2015 (UTC)

  • To start with: Are you aware of the DuplicateReferences gadget, which allows you to copy and paste references, at least within a single item? -- Innocent bystander (talk) 07:24, 25 November 2015 (UTC)
  • Please read Help:Sources. Good practice is yes, for several reasons. Some databases are closed systems and no open URL is available to check in the database whether the identifier is correct; in that case we need the references. A typical example is the CAS registry number, which is a widely used identifier from a non-free organization. Also, if you can't create a URL link to the database because the URL doesn't contain the identifier, or the URL is not stable, we need the reference. So instead of spending time judging whether the reference is necessary, add it, and you avoid plenty of problems for the other contributors. Snipre (talk) 08:45, 25 November 2015 (UTC)

Offline Dictionary for different languages[edit]

Why is there no offline dictionary available for the different languages? After searching, I found no positive result for an offline version.

Erasmus prize awarded today, thanks in part to you, the Wikidata community![edit]

Hi all, as part of the activities surrounding the Erasmus prize I attended several presentations from the Dutch academic world in Amsterdam yesterday, and Wikidata featured in many of them. I realize now that Wikidata has been a force of change in my life as a Wikipedian, and that it is a game-changer for others who study Wikipedia projects as well. From here let me extend my thanks to this community for your part in making my Wikidata experience so positive and enjoyable. I hope to see some of you at the prize ceremony this afternoon. --Jane023 (talk) 11:04, 25 November 2015 (UTC)

Wikimania 2016[edit]

Only this week left for comments: Wikidata:Wikimania 2016 (Thank you for translating this message). --Tobias1984 (talk) 11:07, 25 November 2015 (UTC)

Mix'n'match daily report[edit]

For those of you interested in Mix'n'match and external catalog reconciliation, I now produce a daily report of things that might require manual attention. Some of that is Mix'n'match specific, but other parts may be of general interest (e.g. multiple items using the same external ID). --Magnus Manske (talk) 13:50, 25 November 2015 (UTC)

@Magnus Manske: Very useful. Thank you! Would it be possible to generate URLs for the identifiers in the "unrecognised external ID" part, to make them link to the database? Jheald (talk) 14:57, 25 November 2015 (UTC)
Could disambiguations be included as well? By the way, I don't get any results after clicking on "Disambiguation links", it just says it's loading and should take ~30 sec. It used to work, maybe some change disabled it. Matěj Suchánek (talk) 15:40, 25 November 2015 (UTC)

Items with administrative unit set to a disambiguation page[edit]

Over 1200 items have located in the administrative territorial entity (P131) set to an item with instance of (P31) set to Wikimedia disambiguation page (Q4167410). Some of them (maybe a few dozen) are my mistake and I'm trying to find and fix those, but we should find a way to fix them all.--Pere prlpz (talk) 14:15, 25 November 2015 (UTC)
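For reference, a minimal SPARQL sketch that lists such statements (an illustration only, not necessarily how the figure above was obtained):

# Items whose P131 value is itself a Wikimedia disambiguation page (Q4167410)
SELECT ?item ?dab WHERE {
  ?item wdt:P131 ?dab .
  ?dab wdt:P31 wd:Q4167410 .
}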

I guess what you're asking for is whether we could make a "dab solver" that suggests items of the right class with a similar label, to replace property values that are disambiguation pages.
Interesting challenge.
I think, even if a list were created, one would still wish for manual supervision of the proposed changes, or at least manual checking + quick statements, rather than full automation. Jheald (talk) 15:05, 25 November 2015 (UTC)
Some of those items already have a correct value of located in the administrative territorial entity (P131) alongside the wrong one. Just removing the wrong ones while leaving the correct ones that are already there would be a good improvement -- my first thought was to use Autolist2 "en masse" to remove P131 from those items, but as far as I know that would delete the correct values too.
Having a dab solver to fix all links to disambiguation pages in Wikidata would be even more useful. I found 1200 wrong links because I just checked P131, but there are likely a lot of other properties set to disambiguation pages.--Pere prlpz (talk) 15:59, 25 November 2015 (UTC)
Yeah, "remove statements" in Autolist2 was my first thought too, when you were talking about right statements alongside wrong ones -- but in this case the Autolist search is only giving you part of the wrong statement, not the whole wrong statement to remove. Given some scripting and the SPARQL search, it would be easy enough to construct a list of superfluous apparently wrong statements to remove -- but I'm not sure whether Magnus provides a "Quick statement removal" tool to accompany his "Quick statements" tool. If he did, that might be a way to go; but I'm not sure if he does. Jheald (talk) 16:23, 25 November 2015 (UTC)
This should appear in the "value type"-violation section of Wikidata:Database_reports/Constraint_violations/P131, but it's buried within 14000 total (for 2.4 million items). --- Jura 16:29, 25 November 2015 (UTC)
I am fixing these frequently, as they also appear in the first report of Wikidata:Database reports/Constraint violations/P625. Mostly they are products of bad merges, and sometimes they are not easy to fix, mainly if the bad merge is older and bots have already started to import "non-disambiguation" properties into the disambiguation item, or disambiguation labels into the non-disambiguation item. --Jklamo (talk) 18:24, 25 November 2015 (UTC)
  • @Pere prlpz, Jura1, Jklamo: Here's a query to try and get a sense of what might or might not be possible:
In the first pair of columns is the item; in the last (fourth) pair of columns is the current value of its P131 (a disambiguation item).
In the third pair of columns is any item with a matching English-language label, that is considered an administrative territory, and therefore might be a possible dab-solution.
In the second pair of columns is any value that the item already has for P131 that is considered an administrative territory.
The sample is slightly limited, in that I have restricted it to cases where both the item and the disambiguation item have English labels, and also to items where there is an administrative territory with an exactly matching English-language label. (So I'm ignoring near but non-exact matches, and matches in other languages; and probably almost all of User:Jklamo's bad merges.) This cuts things right down, to results for only about 100 items.
Each item can have several rows, however, if the search finds multiple possible matches for the dab; and/or multiple existing P131s that are adms. The two multiply, so if there are 2 valid P131s and three proposed dab matches, that will give six rows.
Despite being very much cut down from Pere prlpz's initial 1200 items with problems (and despite almost certainly missing many of the actual dab solutions), I think at least a couple of things are evident:
(i) there can very often be several candidate solutions for a particular dab.
(ii) in most cases, the value needing to be disambiguated is on a different administrative level than the value that is 'good' -- so it is probably not a good idea to just delete such dab-item values blindly, as that would lose information; rather, it is worth trying to solve them -- even if that is probably substantially going to need to be done by hand. Jheald (talk) 21:53, 25 November 2015 (UTC)
Having looked at the results of this a bit more, I find that the correct dab solution is very likely not to be on the list in the above query (often because it may include a suffix) -- and even for those that are on the list, the list presentation is not the most helpful, because you still have to click through to see the description.
So, unfortunately, the best approach seems still to be to open up the item in one of the larger Wikipedias, also open up the dab page, and try to spot the right dab-solution from the Wiki page.
A better tool might present the Q-numbers for all of the dabs from all of the wikipages, together with their descriptions (falling back to the Wiki page if the description on Wikidata was blank). But a pop-up tool like that is well beyond my capabilities. Jheald (talk) 22:37, 25 November 2015 (UTC)

Query people of certain age[edit]

How do I query with WDQ people born between 1984 and 1986?--Kopiersperre (talk) 20:23, 25 November 2015 (UTC)

@Kopiersperre: Query: BETWEEN[569,1984,1986-12-31] Jheald (talk) 22:01, 25 November 2015 (UTC)
@Jheald: Thanks.--Kopiersperre (talk) 22:52, 25 November 2015 (UTC)
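For those using the SPARQL endpoint instead of WDQ, a roughly equivalent query might look like this (a sketch; P569 is date of birth, Q5 is human, and the LIMIT is only there to keep the example small):

# People born between 1984 and 1986 (inclusive)
SELECT ?person ?dob WHERE {
  ?person wdt:P31 wd:Q5 ;
          wdt:P569 ?dob .
  FILTER(YEAR(?dob) >= 1984 && YEAR(?dob) <= 1986)
}
LIMIT 100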

Your input requested on the proposed #FreeBassel banner campaign[edit]

This is a message regarding the proposed 2015 Free Bassel banner. Translations are available.

Hi everyone,

This is to inform all Wikimedia contributors that a straw poll seeking your involvement has just been started on Meta-Wiki.

As some of you might be aware, a small group of Wikimedia volunteers have proposed a banner campaign informing Wikipedia readers about the urgent situation of our fellow Wikipedian, open source software developer and Creative Commons activist, Bassel Khartabil. An example banner and an explanatory page have now been prepared, and translated into about half a dozen languages by volunteer translators.

We are seeking your involvement to decide if the global Wikimedia community approves starting a banner campaign asking Wikipedia readers to call on the Syrian government to release Bassel from prison. We understand that a campaign like this would be unprecedented in Wikipedia's history, which is why we're seeking the widest possible consensus among the community.

Given Bassel's urgent situation and the resulting tight schedule, we ask everyone to get involved with the poll and the discussion to the widest possible extent, and to promote it among your communities as soon as possible.

(Apologies for writing in English; please kindly translate this message into your own language.)

Thank you for your participation!

Posted by the MediaWiki message delivery 21:47, 25 November 2015 (UTC) • Translate • Get help