User talk:Magnus Manske


About this board

Previous discussion was archived at User talk:Magnus Manske/Archive 9 on 2015-08-10.

Thierry Caro (talkcontribs)

Hello. Is there a tool that would help move image (P18) statements to smaller image properties such as image of interior (P5775)? I guess there are a number of images stored through P18 that have words such as 'interior', 'winter', 'map' within their title. Those are serious candidates for a move and I guess we can probably have a Wikidata game based on that or something.

Magnus Manske (talkcontribs)

Interesting, but the pool of candidate items would be all items with a P18, which seems a bit much (2,582,067 items at the moment). Unless we can automatically identify some candidate subset, a game would be very boring indeed (just false positives).
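One way to identify such a candidate subset would be a keyword filter over P18 filenames before building any game. A minimal sketch in Python, purely illustrative: only the 'interior' → image of interior (P5775) mapping comes from this thread, and `candidate_property` is a hypothetical helper, not an existing tool:

```python
from typing import Optional

# Sketch: suggest a narrower image property based on keywords in a P18 filename.
# Only the 'interior' -> P5775 mapping is taken from the discussion above;
# mappings for 'winter', 'map', etc. would need to be agreed on first.
KEYWORD_TO_PROPERTY = {
    "interior": "P5775",  # image of interior
}

def candidate_property(filename: str) -> Optional[str]:
    """Return a narrower image property ID if the filename suggests one."""
    lowered = filename.lower()
    for keyword, prop in KEYWORD_TO_PROPERTY.items():
        if keyword in lowered:
            return prop
    return None  # no keyword match: leave the statement on P18
```

Filtering the 2.5 million P18 items down to only those whose filenames match a keyword would give a much smaller, higher-precision pool for manual review.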

Thierry Caro (talkcontribs)

OK. We can probably do this through traditional requests then but I'm not qualified enough to bring it to life. If you ever come up with a solution that would make checking the images and then moving some to smaller properties very easy, please do let me know. It's clearly not an emergency though.

Reply to "Images"
Pixeldomain (talkcontribs)

Hi Magnus, I imported catalog 1760 "ITU-T Recommendation" into {{Q|28054658}} but then realised the format regex I used for the identifier value was wrong. Could you please remove "ITU-T" from the identifier values. Values for {{P|5688}} should not start with "ITU-T" according to the formatter regex on the property page. Thanks!

Magnus Manske (talkcontribs)

Done.

Mix'n'match sync creating constraint violations

Nikki (talkcontribs)

I've reverted all the edits in https://tools.wmflabs.org/editgroups/b/QSv2T/1536896161032/ and https://tools.wmflabs.org/editgroups/b/QSv2T/1536919505199/. In the first batch, the IDs were added using the wrong property (and already existed using the right property). In the second, the ID was added to a disambiguation page.

In both cases, the edits created constraint violations, since GNIS ID (P590) has a constraint saying it shouldn't be used with GNIS Antarctica ID (P804) and IMDb ID (P345) has a constraint saying it shouldn't be used with instance of (P31) Wikimedia disambiguation page (Q4167410). It would be good if the tool would not add IDs which create constraint violations like that.
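A pre-flight check along those lines could be sketched as follows; the constraint data is hardcoded from the two cases in this thread, whereas a real implementation would read the conflicts-with and item-requires-statement constraints from the property pages:

```python
# Sketch: refuse to add an ID that would trip the two constraint types
# mentioned above. Constraint data is hardcoded from this discussion;
# a real tool would fetch it from the properties' constraint statements.
CONFLICTS_WITH = {
    "P590": {"P804"},          # GNIS ID conflicts with GNIS Antarctica ID
}
NO_DISAMBIGUATION = {"P345"}   # IMDb ID must not be used on disambiguation pages

def can_add(prop: str, item_claims: dict) -> bool:
    """item_claims maps property IDs to lists of values, e.g. {'P31': ['Q4167410']}."""
    for conflicting in CONFLICTS_WITH.get(prop, set()):
        if conflicting in item_claims:
            return False
    if prop in NO_DISAMBIGUATION and "Q4167410" in item_claims.get("P31", []):
        return False
    return True
```

Running such a check before each QuickStatements command would have skipped both batches instead of creating the violations.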

Magnus Manske (talkcontribs)

Thanks, this is very odd! I have no idea why that script would use P590; the catalog says P5688, which seems correct. @Pixeldomain: You imported that catalog, any ideas?

Magnus Manske (talkcontribs)

Update: Might be the wrong catalog number in the edit comment, checking...

Pixeldomain (talkcontribs)

@Magnus Manske: I imported ITU-T Recommendation, catalog 1760. It looks like @Nikki is referring to edits done by a different catalog though.

Reply to "Mix'n'match sync creating constraint violations"
Jura1 (talkcontribs)

@Nono314: maybe I should be asking you.

Nono314 (talkcontribs)

Sorry, why would I know anything about them???

Jura1 (talkcontribs)
Nono314 (talkcontribs)

Obviously not. The catalog you are pointing to is about HAL author ID (P4450), which has nothing to do with National Library of Wales ID (P2966). The relevant catalogue is much more likely this one, to which I'm absolutely not linked.

I have no idea what happened with Reinheitsgebot and why it mentions the other one in the comment, but it's quite easy to see there is no link between them, mine being about French researchers still alive. I don't know about Wales and leave that to people knowledgeable on that context.

Jura1 (talkcontribs)

Sorry then.

You probably noted that the edit summary on https://www.wikidata.org/w/index.php?title=Q56187531&action=history reads:

Reinheitsgebot (talk | contribs)‎ . . (1,051 bytes) (+1,051)‎ . . (‎Created a new item: #quickstatements; #temporary_batch_1534660093342; invoked by mixnmatch:microsync from catalog 1674)

Magnus, would you check?

Nono314 (talkcontribs)

I think it's probably that sync, creating items for everything that had previously been marked as "No WD", like the ones there, so you were doubly asking the right person in the first place ;-).

You can see here several properties sharing the same supposed catalogue id, probably the last one existing at that time, given I imported it on August 15th.

And if your stomach is well enough fastened, you can have a look here and see you have only uncovered the tip of the iceberg... :(

Jura1 (talkcontribs)

I did notice that it wasn't a single occurrence, otherwise I wouldn't have bothered Magnus about it ..

Jura1 (talkcontribs)

Potentially that means that all edit summaries by the bot reference incorrect catalogs ..

Nono314 (talkcontribs)

This was the microsync script, and I think it has now been fixed to no longer reference the latest catalog when performing a global sync.

92.226.160.13 (talkcontribs)

And all lived 1809-1855? @Nono314 what are these things?

Nono314 (talkcontribs)

@92.226.160.13, whoever you are, ask that of someone at least remotely linked to them!

Magnus Manske (talkcontribs)

OK, it was the microsync script, creating items for entries previously marked as "not on Wikidata".

That functionality is now deactivated.

Jura1 (talkcontribs)

I'm not sure which catalog this was. Apparently it's not 1674 and maybe not the 851 listed on National Library of Wales ID (P2966).

Anyways, to find more problematic entries:

BTW, there is (or was) a Wikimedian in residence. Maybe he can help sort it out.

Azertus (talkcontribs)
Jura1 (talkcontribs)

I'd expect Magnus to fix them.

Jura1 (talkcontribs)

It's unclear who added the catalogues .. maybe it wasn't actually Magnus.

I guess I can attempt to fix these batches by merging some of the items.

Reply to "Jones and Jones, catalog <s>1674</s>"
Jura1 (talkcontribs)

Can you stop this batch/revert it? There is a discussion on the property talk page.

Magnus Manske (talkcontribs)

If you click on "Last", you can see that the first half of the commands remove the old IDs, and the second half add the new ones. No panic.

Jura1 (talkcontribs)

I disagree with that change. It makes us lose the previous identifier while adding one we already have in another property.

Magnus Manske (talkcontribs)

But the old ones don't work anymore. What is the practical purpose of keeping them, besides sentimentality?

Jura1 (talkcontribs)

It's information that people who used that identifier can use to match with Wikidata items, like any other identifier.

The new ones don't have any practical purpose as we already have the identifier.

Magnus Manske (talkcontribs)

I don't agree that keeping broken identifiers serves any actual purpose. And the purpose of the new ones is that there is actually a page on that website, even if other websites use the same ID.

Jura1 (talkcontribs)

You can bring that up in the discussion for the deletion of the property/proposal of a new one.

Jura1 (talkcontribs)

In the meantime, I guess I can fix these batches.

Reply to "id deletion"
GZWDer (talkcontribs)

Please fix.

Magnus Manske (talkcontribs)

Works for me. Yay, self-healing code!!!

Reply to "Mix'n'match API returns 500 error"

What should a SPARQL query emit in PetScan?

Azertus (talkcontribs)

Hi Magnus

What should a SPARQL query emit when using it directly in PetScan? When I paste the result of this query in the "Manual list" of PetScan, I get the intended result. It should return some pages in enwikisource.

However, I haven't found in the documentation what SPARQL should return to get the same result. I've tried Wikidata QIDs, Wikisource URLs and Wikisource page titles.

Thanks!

Nikki (talkcontribs)

In my experience, you need to select the Wikidata item. If I change your query to select ?author (i.e. the Wikidata item) and paste it into Petscan without changing any other options, I get 312 results.
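In other words, the query's SELECT clause should expose the Wikidata item variable itself. A hypothetical shape (the actual query behind the link above isn't shown in this thread, so the triple pattern is a placeholder):

```python
# Sketch of a PetScan-compatible SPARQL query: PetScan expects the Wikidata
# item variable (here ?author) in the SELECT clause, not sitelinks or titles.
# The WHERE pattern below is a placeholder, not the query from the thread.
PETSCAN_QUERY = """
SELECT ?author WHERE {
  ?author wdt:P31 wd:Q5 .  # placeholder condition: instance of human
}
"""

def selected_variables(query: str) -> list:
    """Crudely extract the variables listed in the SELECT clause."""
    select_line = query.strip().splitlines()[0]
    return [tok for tok in select_line.split() if tok.startswith("?")]
```

Selecting sitelink URLs or page titles instead of the item variable is what yields zero results in PetScan.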

Azertus (talkcontribs)

The issue there is that it is returning the Wikidata items and I'm looking for the Wikisource pages instead, so I can filter on categories or templates there.

It might be related to the "Use wiki" setting on the "Other sources" tab, but regardless of where I tried to set enwikisource (probably haven't tried every possible place), I either got 0 results or the Wikidata items.

Magnus Manske (talkcontribs)

Truth is, there is not really a mechanism to do that. Yet.

Jura1 (talkcontribs)

Maybe with the "Uses items/props" option. Works with categories at Wikipedia.

Reply to "What should a SPARQL query emit in PetScan?"
Succu (talkcontribs)

I'm a little bit tired of fixing cases like Q56033120.

Magnus Manske (talkcontribs)

Do you mean fixing the name? It's hard for my bot to tell which one to use; usually, stuff in () is secondary.

Or do you mean adding the other properties? Wikispecies is not exactly making it easy for me to get those.

The alternative is not to create items for Wikispecies pages. Personally, I think that would be worse.

Succu (talkcontribs)

OK. The problem is a little broader. I will add this query to my todo list.

Reply to "Subgenera"
Edgars2007 (talkcontribs)

Hi!

I used your oauth class, but can't seem to get it working properly. When I use "getConsumerRights", I get something weird. Authorization works fine, so it can read the oauth.ini file properly. Ideas?

Reply to "Some help needed"
Jura1 (talkcontribs)
Reply to "Associated acts"