Hello MisterSynergy, I don't know what the data item contained, but the article is in the RAT, and it also has a GND number etc. Could you please restore it for me? Thanks, K@rl (Diskussion) 13:26, 18 September 2024 (UTC)
MisterSynergy
Joined 1 December 2012
Hi MisterSynergy,
please restore the item for John F. Remondi (https://web.archive.org/web/20200316111602/https://en.wikipedia.org/wiki/John_F._Remondi), who is notable as CEO of Navient (Q19903490).
Thank you in advance
Do we have actual sources for this person? The archived but since-deleted Wikipedia article is not enough.
There is press coverage on him. If certain Wikidatans work on them, all mayors of large cities, all recipients of notable science prizes, and all CEOs of major companies get listed at Wikidata. Such serial data becomes useful only if complete.
Inclusion here at Wikidata depends on the availability of serious sources, not on a position held by the person.
Unfortunately, I cannot read the NYT article due to a paywall, and it is otherwise somewhat difficult to obtain a clear overview of Remondi's activities based on an Internet search. Do you have something else? This does not look like a clear case to restore at this point, but I'd be willing to reconsider in case you bring the evidence.
A complete curriculum vitae is available at https://nmefoundation.org/nme_team/jack-remondi/
All CEOs of the SLM Corporation (Q1569558) (as well as of its "twin" Freddie Mac (Q935969)) should be relevant.
It is restored, please expand the items with references.
As I said, the availability of serious sources is key to inclusion at Wikidata, not the mere fact that someone held a certain position. Without sources (and Wikipedia sitelinks), it is often impossible to understand what an item is about if you are not an insider already.
This item meets the inclusion criterion "It refers to an instance of a clearly identifiable conceptual or material entity that can be described using serious and publicly available references". Can it be restored?
Do we have independent sources for this company?
Yes:
- https://www.standard.co.uk/hp/front/gaming-boss-jailed-for-14-years-for-murdering-wife-as-she-slept-6472958.html
- https://opencorporates.com/companies/gg/1-49289
- http://web.archive.org/web/20081217125049/http://www.gemlifestyle.com/company.asp
- https://gemlifestylebiz.wordpress.com/
- https://martin271.wordpress.com/
Questionable whether this is sufficient, but I have restored the item.
Hello. Please undelete Q128210473 as it is meant to support the addition of architect:wikidata
tags to buildings in OpenStreetMap. This architect's existence is documented in print and on the Lisbon Municipal Archive's website, where a list of his works is available. Further statements will be added to the item as research on offline documents is finalized.
Done. But it is in serious danger of being deleted again in this condition. Please add some basic, sourced information to the item quickly.
Duly noted. I must have let it slip while editing other elements. It has more data now. Thanks!
Hi. Please undelete Q113573313. It's about a third-league Romanian team. We maintain wikidata items for these teams mostly for structural reasons. For instance, this is the current team where Q30238685 plays.
Done
Hi MisterSynergy, I was having a look at Wikidata:WikiProject sum of all paintings/Duplicate paintings. Only the "Paintings with same image" query currently works, but it is full of suggestions like Sunflowers (Q157541) & Sunflowers (Q21948567) that shouldn't be in the list. Looking at the source, can you change the query to match only direct instances of painting, not all subclasses? I needed to add a limit to prevent it from timing out.
https://w.wiki/A$wi for the inventory number query completes just in time, and I had to remove the sort from https://w.wiki/A$wo to complete the catalog query. Maybe you can update the code based on that?
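The suggested restriction can be sketched as a WDQS query assembled in Python. The identifiers used here are standard Wikidata ones (P31 = instance of, P18 = image, Q3305213 = painting), but this is only an illustration, not the actual Deltabot query:

```python
def build_same_image_query(limit=1000):
    """Build a WDQS query for pairs of paintings sharing the same image."""
    # Using plain wdt:P31 (instead of a subclass path like wdt:P31/wdt:P279*)
    # keeps only direct instances of painting (Q3305213), so series items
    # such as the Sunflowers group no longer show up as false positives.
    return f"""
SELECT ?item ?item2 ?image WHERE {{
  ?item  wdt:P31 wd:Q3305213 ; wdt:P18 ?image .
  ?item2 wdt:P31 wd:Q3305213 ; wdt:P18 ?image .
  FILTER(STR(?item) < STR(?item2))  # report each pair only once
}}
LIMIT {limit}
"""

query = build_same_image_query(limit=500)
print(query)
```

The `LIMIT` parameter reflects the suggestion above to cap the result size so the query does not time out.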
Hey Multichill, this looks like a very reasonable suggestion, so I have removed all instances of subclasses from the "Paintings with same image" list. The list is now much shorter, and it seems there are quite a lot of actual duplicates on it.
As for the other two lists, I think I need to have a closer look at the situation, as "just completes in time" is not really a good state either. Interestingly, the "Items with same inventory number of the same collection" list does sometimes contain results, based on the revision history of the report page, so it is not completely off scale anyway.
That said, a solution that does not rely on WDQS that much and does the heavy lifting elsewhere (e.g. via Python/pandas) could be much more robust, but this would not be in scope of Deltabot anymore.
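A minimal sketch of that offline approach with pandas, using made-up toy rows (the real input would be an extract of P217 values with their P195 collection qualifiers, taken from a dump or a sliced WDQS result):

```python
import pandas as pd

# Toy input: each row is one inventory-number statement on a painting item.
# The QIDs and inventory numbers below are illustrative only.
rows = [
    {"item": "Q1", "inventory_number": "1513 (MK)", "collection": "Q679527"},
    {"item": "Q2", "inventory_number": "1513 (MK)", "collection": "Q679527"},
    {"item": "Q3", "inventory_number": "A-42",      "collection": "Q190804"},
]
df = pd.DataFrame(rows)

def find_duplicates(df):
    """Return groups of items sharing inventory number + collection."""
    grouped = df.groupby(["inventory_number", "collection"])["item"].apply(list)
    # Keep only groups with more than one item: these are duplicate candidates.
    return {key: items for key, items in grouped.items() if len(items) > 1}

dupes = find_duplicates(df)
# dupes → {("1513 (MK)", "Q679527"): ["Q1", "Q2"]}
```

The grouping does the heavy lifting locally, so WDQS only needs to deliver the raw statements rather than compute the self-join itself.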
Thanks for implementing it. I see plenty of items now that need merging or other work (like the usage of File:Hendrik Willem Mesdag - De Noordzee bij storm - 1513 (MK) - Museum Boijmans Van Beuningen.jpg, oh no!).
I wonder if the fork of the query service will actually improve performance. For now I don't have any plans to use another source than the query service. I wonder if we can set a smart constraint for inventory number (P217) and catalog code (P528) that triggers if another item with the same combination exists. Then we could probably just use that data?
I don't expect that the upcoming graph split is going to make things easier for end users. It is much more an effort to keep this thing administrable on the backend.
In the past, I have used the WDQS slicing service a couple of times, and mostly with code fragments from other tasks I can quickly offer something like this:
It compiles a wikitable of P217 (main value) + P195 (qualifier) or P528 (main value) + P972 (qualifier) duplicates, limited to instances of the painting item Q3305213. Could easily be written to a wiki page and automated, of course, or formatted differently, or filtered further if necessary.
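The wikitable output described here could be assembled along these lines (a hedged sketch, not the actual script; the input format, a mapping from (identifier, qualifier) pairs to item lists, is an assumption):

```python
def to_wikitable(dupes):
    """Render duplicate groups as a sortable MediaWiki wikitable.

    dupes: dict mapping (inventory number, collection QID) -> list of QIDs.
    """
    lines = ['{| class="wikitable sortable"',
             "! inventory number !! collection !! items"]
    for (inv, coll), items in sorted(dupes.items()):
        links = ", ".join(f"[[{q}]]" for q in items)
        lines.append("|-")
        lines.append(f"| {inv} || [[{coll}]] || {links}")
    lines.append("|}")
    return "\n".join(lines)

wikitext = to_wikitable({("1513 (MK)", "Q679527"): ["Q1", "Q2"]})
```

The same function would work unchanged for P528 + P972 groups, since only the dictionary keys differ.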
I see Q126685212 was created with no data and was deleted accordingly. It's a regional development agency in Poland and I think it meets N2, and would also be useful in a Name Suggestion Index entry. Could you undelete it so I can add the appropriate data?
Sure, done. Please expand it to a useful form.
Thanks!
Not sure what’s done on Wikidata with users of this nature: user:Ressiboy
Appears to be here to promote a company and talk about sexual fantasies (note the 40% breastmilk comment on the user page).
Yeah, this is out of scope here, and partially also just blatant vandalism. Blocked indefinitely.
Hi MisterSynergy,
I have recently created an item on Rudolph William Papperitz (Q126368485), which has been deleted by you as far as I can see. As I am new on wikidata, could you just let me know, why this was happening? I have used the wikidata ID already for linking this person to other data, so it would be useful if the link to wikidata would still exist. Many thanks!
Hello SarahTWagner, the data item has been created, but left empty (meaning: it did not have any claims, references, or sitelinks). I also cannot find any backlinks to it from within Wikidata or Wikimedia Commons.
We can restore it, but I would like to ask you to add information about the person *within the item* and *including external references* right away. It is best practice to create new items in a shape that makes them notable by themselves, independent of eventual backlinks (usage of the data item in other places). Would that be possible for you?
Thanks a lot for your comments! I will be able to add more information about that person. Thanks a lot for restoring the data item :-)
How do I see whether more items created by me in the past might have been deleted due to the same issue? I just noticed the deletion of Rudolph William Papperitz (Q126368485) by chance...
Pages created by you are listed here: https://xtools.wmcloud.org/pages/www.wikidata.org/SarahTWagner/all There are currently no deleted pages.
Q126368485 is meanwhile restored, please work on it in order to prevent it from being deleted again.
I updated the title by adding a space between Top and 100. Will the bot recognise this? Btw, is it possible to create a list of the top 1000? ~~~~
The bot will not recognize this page move, as the old title is hard-coded into the bot script. It would be nice if you could leave such pages where they are, or notify the involved bot operators in advance.
Top 1000 (or even a full list) are also possible, but I do not see what the use case is. Do you have anything in mind here?
Top 100 is kinda short, top 1000 would be more useful.