User talk:Dhx1


About this board

Previous discussion was archived at User talk:Dhx1/Archive 1 on 2017-07-21.



Liuxinyu970226 (talkcontribs)

It seems that this section currently discusses things that may or may not relate to your former edit (you added the Wikidata usage instructions (P2559) value "List subjects which the article covers. Do not use P527 "has part", as the parts of an article are its title, sections and paragraphs."). Would you like to join this thread?

Dhx1 (talkcontribs)
Reply to "Wikidata:Project_chat#Wikipedia article covering multiple topics (Q21484471)"
Lofhi (talkcontribs)

Hello again. I have one question: should we edit Q81068910 to differentiate between P585 and P577? There is a time lag of one day, since the data published on day n are an analysis of the previous 24 hours (day n−1).

Lofhi (talkcontribs)

Okay, never mind. It's only report no. 58 that's odd: that report is based on data received by midnight CET, so it lags the other reports by a day; not much we can do.

Dhx1 (talkcontribs)

@Lofhi I think point in time (P585) is probably best at the moment, as it indicates when the WHO compiled data from dozens of countries into that global statistic. It's not going to be overly accurate, since some countries create their daily reports in timezones ~20 hours apart, and the WHO then compiles whatever it has access to at a particular point in time (P585).

Reply to "WHO COVID-19 data"

Would you like to join a WikiProject COVID-19?

TiagoLubiana (talkcontribs)

Hello,

I saw that you contributed to the item 2019–20 COVID-19 outbreak by country and territory (Q83741704) and I was wondering if you would be interested in helping to create a Wikidata WikiProject COVID-19.

The goals would be initially (of course, they can be changed):

  • curate the Wikidata items relevant for describing the outbreaks and the virus itself.
  • think about and develop ways to process these items to improve access to information (for example, via automated articles in languages that currently lack pages about country-specific outbreaks).

Would you like to participate in this effort?

I am trying to gather the Wikidata editors actively involved in the topic. I believe that if we act together, we can have a shot at aiding the global effort to contain the pandemic.

Thanks!


Dhx1 (talkcontribs)

@TiagoLubiana. Yes, a good idea! The project could also expand to look into other aspects too, for example:

  • changes to legislation/regulations
  • media coverage
  • effects--panic buying, quarantine measures between countries/regions, financial impacts
TiagoLubiana (talkcontribs)

Great suggestions!

The project has already been created, but it is still under active construction. Feel free to add any suggestions to Wikidata:WikiProject COVID-19.



TiagoLubiana (talkcontribs)

I have added your suggestions as open tasks to be created in the context of the project. If you have some time, feel free to elaborate there on the ideas you have for the topic.

Reply to "Would you like to join a WikiProject COVID-19?"
Jeblad (talkcontribs)

It could be wise to test scripts on items that have rather low visibility, or even to do a test run in the sandbox. Just a friendly reminder: I have messed up items myself, and it is not fun when people at several wikis start yelling at you.

Dhx1 (talkcontribs)

Data was added manually, not by script--but yes, I did make a lot of changes in the process after finding a better way to model the 2019-nCoV data. It looks like @Salgo60 has also started importing the same data from the WHO each day, but using QuickStatements to do so (much more efficient).

Salgo60 (talkcontribs)

@Dhx1 I use OpenRefine and generate QuickStatements, but it gets messy: I feel you can't set rank, it only works if the value is unique, etc. It looks like @Larske has started cleaning it up.
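For context, a QuickStatements (v1) batch line for a daily case count might look like the sketch below; the item, property, and numbers are illustrative, not taken from an actual batch. Columns are tab-separated: item, property, value, then qualifier property/value pairs (dates carry a precision suffix, /11 = day).

```
Q81068910	P1603	2798	P585	+2020-01-27T00:00:00Z/11
```

Note there is no column for rank, which is exactly the limitation being discussed.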


Jeblad (talkcontribs)

Another thing, at some point the spread will probably be so massive the numbers will stop making sense. I'm not sure how that should be modelled. It will then go from a confirmed number to an estimated number.

Dhx1 (talkcontribs)
Jeblad (talkcontribs)
Salgo60 (talkcontribs)

@Larske @Jeblad @Dhx1


Lessons learned so far from having 2019-nCoV data in Wikidata:

1) Wikidata, with its query lag, is not optimal for data like this

2) the current way of changing rank to show the latest value doesn't work well with QuickStatements, which can't set rank

3) when the same value as was entered before is added, QuickStatements attaches the new qualifiers to the already-created value

4) in the latest report the reporting changed, also including information about

  • Total (new) cases with travel history to China
  • Total (new) cases with possible or confirmed transmission outside of China
  • Total (new) cases with site of transmission under investigation

How should we handle that? Is Wikidata a good place for data like this, where Wikidata has limited support for properties derived from other properties, transactional data, etc.?


Example query:
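The query itself did not survive here; the following is a minimal sketch of the kind of WDQS query discussed, assuming the case counts sit on Q81068910 as number of cases (P1603) statements with point in time (P585) qualifiers:

```sparql
# Fetch case counts over time; wd:/p:/ps:/pq: are the standard WDQS prefixes.
SELECT ?cases ?time WHERE {
  wd:Q81068910 p:P1603 ?statement .
  ?statement ps:P1603 ?cases ;
             pq:P585 ?time .
}
ORDER BY ?time
```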

Salgo60 (talkcontribs)

The suggestion from the "Wikidata" Telegram group is to use a designated Wikibase.

Jeblad (talkcontribs)

I've been wondering about statistics, and whether there are cases where the dynamics are such that we need an alternate way to update and query the values. In particular, statistics is somehow not a first-order member of one item or even several items; rather, it is related to several of them as a born-digital entity itself, perhaps even related to items of heterogeneous type. It is not metadata about a single item; it is statistics about several items.

Salgo60 (talkcontribs)
Reply to "Q81068910"
99of9 (talkcontribs)
Dhx1 (talkcontribs)

Thanks for the invite @99of9. I'm Melbourne based, but hope it's a great meetup and opportunity to catch up amongst the Sydney Wikifolk. Do you mention these events in the weekly Wikidata news updates before or after the event? It'd be great to get a quick summary there to raise awareness of these meetups and projects including those on the agenda for the workshop--AustLII and SLNSW.

99of9 (talkcontribs)

Good idea, done.

Reply to "WD meetup in Sydney"
MediaWiki message delivery (talkcontribs)

RMaung (WMF) 17:37, 10 September 2019 (UTC)

Adam Harangozó (talkcontribs)

Hi, I created a new page for collecting sites that could be added to Mix'n'match and I plan to expand it with the ones that already have scrapers by category. Feel free to expand, use for property creation. Best, Adam Harangozó (talk) 22:47, 3 November 2019 (UTC)

Reply to "New page for catalogues"

Madeleine Ball (Q28649100)'s Hacker News I.D.

Arlo Barnes (talkcontribs)

Why remove it?

Dhx1 (talkcontribs)
D-Kuru (talkcontribs)

Is it really Wikidata's purpose to collect everything down to the single CPU?

Dhx1 (talkcontribs)
AMDmi3 (talkcontribs)

Hi!


Thanks for filling in Repology names all over Wikidata!


After some fixes I've made here (removed some incorrect properties and split some entries which referred to multiple projects at once) and in Repology (added some missing project merges), it looks like Repology can be switched to using P:P6931 instead of Arch/AUR package names to match Wikidata entries to its projects. Currently there are also more entries with P:P6931 filled in (1999) than with Arch/AUR packages (1271), so it'll also increase coverage. However, not all entries with packages have Repology names filled in yet, so switching now would cause some entries to disappear. Since you're adding P6931s, you'd probably be interested in filling in the remaining ones. If you need any data for that, feel free to ask me. I've created Wikidata:Partnerships and data imports to discuss this, but haven't got any feedback yet.
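A quick way to see which items already have the identifier filled in is a WDQS query over P6931; a sketch (only the property ID comes from this thread):

```sparql
# List items carrying a Repology project name (P6931).
SELECT ?item ?itemLabel ?repology WHERE {
  ?item wdt:P6931 ?repology .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
```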