Wikimedia Meta-Wiki

Tech

From Meta, a Wikimedia project coordination wiki
This is an archived version of this page, as edited by Sarri.greek (talk | contribs) at 23:39, 5 June 2020 (→‎Lua help to small wiki: anyone?). It may differ significantly from the current version.

Latest comment: 4 years ago by Sarri.greek in topic Lua help to small wiki
Tech-related questions about third-party wikis should be asked at mw:Project:Support desk.
For faster (real-time) interaction, join the #wikimedia-tech IRC channel.

A place to talk about tech related to a Wikimedia wiki.

Have a technical wiki question? Ask here. This can include, for example:

  • requests for new tools, scripts, and bots;
  • help with CSS or JavaScript;
  • API help;
  • and data collection help (including database queries).

Please sign by entering four tildes (~~~~) or by clicking on the signature icon; this will automatically produce your name and the date.

Large bot-generated lists

Latest comment: 4 years ago • 1 comment • 1 person in discussion

On Wikipedia, we can make bots that check articles (e.g. via CatScan). The bots produce a formatted list and write it to a page outside the main namespace.

  • the result can be long, approaching one of the technical limits for a page. For example, a page can become so large that it can no longer be transcluded in another page.
  • the result can be updated once a day, once every few days, or once a week.
  • this can be done for many wiki-projects or for monitoring the status of a specific list of articles

Only the current data is needed; previous revisions are not. Often several lists are made on separate pages and transcluded onto one page, so only the current lists are visible (similar to the recent changes special page), and the history of those separate pages is not needed.

Hence a couple of questions:

  • How does the Wikimedia Foundation view the fact that its resources are spent not on articles but on storing unnecessary bot-generated data? For example, a new revision of a page close to the maximum size (or of several small lists) is generated every day, and this is done for several wikiprojects, every day, indefinitely. Or can we freely create revisions for hundreds of lists every day without hesitation? (On the one hand this touches on the "pillars of Wikipedia" and "what Wikipedia is not"; on the other hand, if there were explicitly declared permission, or confidence that this does not create unnecessary load, we would deploy such lists more widely — more lists per wikiproject, more frequent updates, more statistics — and for more wikiprojects with fewer participants.)
  • Is it possible not to save old revisions somehow? If this benefits server maintenance and is convenient for editors: could there be a namespace where old revisions are not saved, or where only the last 10 are kept? Could a page be switched to such a mode (Special:ChangeContentModel)? Could marked pages be excluded from dumps, since we can regenerate them? Or, if retention is worth keeping, could we maintain a list of pages/revisions that may be permanently deleted once community agreement is reached on them? The lists could even be written to another wiki (for example, data on a test wiki can be deleted) if they could then be transcluded into Wikipedia pages (outside the main namespace), like files/datasets included from Commons.
  • Are there other ways to make it easier for servers to work with bot-lists? Are there bottlenecks (perhaps in the long run?) for servers when working with bot-lists?
  • Are there any guidelines, instructions, wishes from the WMF regarding such renewable bot-generated lists?
  • Can it somehow help if old revisions of these pages are deleted from time to time? (With the current tool, revisions are not deleted, only hidden.)
  • Can creating and updating such lists in a separate (new?) namespace somehow help?

--Sunpriat (talk) 06:28, 7 May 2020 (UTC) Reply

Anti-vandalism bot

Latest comment: 4 years ago • 5 comments • 2 people in discussion

Hi, I'm a sysop at sc.wiki, and I have been having a problem with an incredibly annoying vandal for a while. Since Sardinian has a few different written standards, our pages can be written using any of them, and the community decided that the standard chosen by others can't be changed unless a few specific conditions are met. But this guy is targeting the pages written in the sc:LSC standard and the Logudorese one, marked with Template:LSC or with Template:Variant. Since I can't watch the changes 24/7, and since if I ban him he changes IP and just starts again every time, is there a way to create a bot that reverts any change that deletes or modifies those templates (unless it deletes Template:Variant but adds Template:LSC, which allows multiple standards) unless I approve it? Also, is there any other tool that I can use to stop him?--L2212 (talk) 12:43, 7 May 2020 (UTC) Reply

Hi L2212. Are you familiar with the AbuseFilter extension? It may fit your needs. --MZMcBride (talk) 18:56, 17 May 2020 (UTC) Reply
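If it helps, a filter for the pattern described in the question might look roughly like the following. This is a hypothetical sketch, not a tested filter: the template names, the namespace check, and the edit-count threshold are assumptions to adapt locally, though old_wikitext, new_wikitext, page_namespace, and user_editcount are standard AbuseFilter variables. It fires when an edit by a low-edit-count user removes both templates, so the allowed Variant→LSC change still passes.

```
page_namespace == 0 &
user_editcount < 50 &
old_wikitext rlike "\{\{(LSC|Variant)" &
!(new_wikitext rlike "\{\{(LSC|Variant)")
```

A matching filter can be set to warn, tag, or disallow the edit; see mw:Extension:AbuseFilter/Rules format for the full syntax.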
Thank you very much MZMcBride, it looks exactly like what we needed! I have another question, though. Studying the manuals and looking at the rights of the users, it looks like the autoconfirmed and extendedconfirmed rights are given on sc.wiki differently from how they are given on en.wiki. Where can I look at our local settings and, if necessary, change them?--L2212 (talk) 16:22, 19 May 2020 (UTC) Reply
Hi L2212. There is a reference copy of the Wikimedia wikis configuration available at <https://noc.wikimedia.org/conf/InitialiseSettings.php.txt>. You can file a Phabricator Maniphest task to request a wiki configuration change once there's local consensus. --MZMcBride (talk) 19:08, 19 May 2020 (UTC) Reply
Ok, thank you very much again!--L2212 (talk) 22:58, 22 May 2020 (UTC) Reply

How could I get bigdata?

Latest comment: 4 years ago • 1 comment • 1 person in discussion

Hello! I need to get the person's name, date of birth, date of death, and image URL for each person in Category:Born_on_day_month, for each day of the year (about 1000 persons per day). When I tried to get a list via https://ru.wikipedia.org/w/api.php?action=query&format=xml&list=categorymembers&cmlimit=100000000&cmtitle=Category:родившиеся_1_января it all worked. But when I tried to get the Wikidata ID for each person via https://ru.wikipedia.org/w/api.php?format=xml&action=query&prop=pageprops&titles=Суворов,_Александр_Васильевич and then https://www.wikidata.org/w/api.php?format=xml&action=wbgetclaims&entity=Q154232 I got an error after 5-10 requests. One request takes about 0.5 s. I can see this is a very inefficient approach and hope to find a better one. What is the best way to do this (maybe one big request for each sub-task)? Thanks
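One common way to cut the request count: the prop=pageprops call accepts up to 50 titles joined by "|" per request, so the per-person lookups can be batched. The sketch below only builds the batched query URLs (it does not hit the live API; the 50-title limit is the standard MediaWiki API limit for unprivileged clients, and `ppprop=wikibase_item` restricts the response to the Wikidata ID — verify both against the API sandbox before relying on them):

```python
from urllib.parse import urlencode

API = "https://ru.wikipedia.org/w/api.php"

def chunked(seq, size=50):
    """Split a list of titles into API-sized batches."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def pageprops_url(titles):
    """Build one query URL covering a whole batch of titles."""
    params = {
        "action": "query",
        "format": "json",
        "prop": "pageprops",
        "ppprop": "wikibase_item",  # only the Wikidata ID is needed
        "titles": "|".join(titles),
    }
    return API + "?" + urlencode(params)

batches = chunked(["Page %d" % i for i in range(120)])
urls = [pageprops_url(b) for b in batches]
```

The Wikidata side can be batched the same way with action=wbgetentities, which also takes up to 50 ids at once. For the whole dataset, a single SPARQL query against the Wikidata Query Service may be simpler still than any per-page crawling.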

@IoanGum: Hi (please sign your posts), what is the exact error? --AKlapper (WMF) (talk) 16:55, 7 May 2020 (UTC) Reply

3 Questions

Latest comment: 4 years ago • 2 comments • 2 people in discussion

I feel that the new MediaWiki makes the source editor harder to use.

  1. Where should I write this impression?
  2. Where were the specifications for this revision considered?
  3. Where can we write the request for the next revision?

--HaussmannSaintLazare (talk) 19:14, 9 May 2020 (UTC) Reply

@HaussmannSaintLazare: Hi, 1) You just wrote it here (though we do not know what your impression is based on). 2) What "revision" of what exactly? 3) What "request" and what "next revision"? I'm afraid this thread currently needs way more context. --AKlapper (WMF) (talk) 19:20, 9 May 2020 (UTC) Reply

thema

Latest comment: 4 years ago • 1 comment • 1 person in discussion

I want my profile to come up when someone searches my name on Google — my picture, my career, my date of birth, all of it. How can I do that? Because I don't know Wikipedia at all, I'd like you to point me in the right direction.

Every article on Wikipedia, whether in the English edition, the Greek one, or any other, must be about a subject that merits a place in an encyclopedia (see w:el:Βικιπαίδεια:Εγκυκλοπαιδικότητα or w:en:Wikipedia:Notability). In addition, it is considered a conflict of interest for someone to write information in an article about themselves; see w:el:Βικιπαίδεια:Σύγκρουση_κινήτρων or w:en:Wikipedia:Conflict_of_interest. If, however, you are interested in helping to expand Wikipedia in general, you can take a look here: w:en:Wikipedia:Contributing_to_Wikipedia. Good luck! -- ArielGlenn (talk) 13:42, 17 May 2020 (UTC) Reply

New Page and subjects

Latest comment: 4 years ago • 1 comment • 1 person in discussion

How exactly do I start a new page? And what can the page be about? — The preceding unsigned comment was added by Normal Person exploring this world (talk)

Hello, see Wikimedia projects to see what kind of pages they contain. See mw:Help:Starting a new page on how to create a page. Nemo 15:45, 19 May 2020 (UTC) Reply

What to use as client_id for OAuth2

Latest comment: 4 years ago • 1 comment • 1 person in discussion

I'm trying to make a React-based client, to be hosted on gitlab.io (static hosting), using the PKCE flow (more info here).

Documentation relating to OAuth2 is here: [1].

When I register an OAuth2 consumer at https://meta.wikimedia.org/wiki/Special:OAuthConsumerRegistration/list I get 3 values, labeled as:

  • "Client application key"
  • "Client application secret"
  • "Access token"

I have tried all 3 of the values I got as client_id and none work. I navigate to https://meta.wikimedia.org/w/rest.php/oauth2/authorize?client_id=...&redirect_uri=...&response_type=code&scope=openid&state=,..&code_challenge=...&code_challenge_method=S256&response_mode=query

But this page tells me "Application Connection Error: Client authentication failed (e.g., unknown client, no client authentication included, or unsupported authentication method)". So I guess I'm doing something wrong, first step would be to verify that I am indeed using the correct thing for "client_id".

The documentation says to use the "client token" as "client_id", but I don't know what that is. I guessed it was the "Client application key", but as that does not work, I guess it is not.

Any ideas on what I'm doing wrong?

I'm using hueshika000. There is a related phabricator issue here.

If someone has an example client that does "PKCE flow" that I can have a look at it will be great. Iwan Aucamp (talk) 13:17, 23 May 2020 (UTC) Reply
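For what it's worth, the S256 code-challenge derivation itself is fixed by RFC 7636 (base64url-encoded SHA-256 of the verifier, without padding), so that part of a PKCE client can be checked locally, independent of which registration value turns out to be the right client_id. A minimal sketch follows; the authorize endpoint is copied from the question above, while the client key and redirect URI are placeholders, not confirmed values:

```python
import base64
import hashlib
import secrets
from urllib.parse import urlencode

def make_pkce_pair():
    """Generate an RFC 7636 code_verifier and its S256 code_challenge."""
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

def authorize_url(client_id, redirect_uri, state, challenge):
    """Build the /oauth2/authorize URL for the PKCE authorization request."""
    params = {
        "client_id": client_id,  # which registration value this should be is the open question here
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "state": state,
        "code_challenge": challenge,
        "code_challenge_method": "S256",
    }
    return "https://meta.wikimedia.org/w/rest.php/oauth2/authorize?" + urlencode(params)

verifier, challenge = make_pkce_pair()
url = authorize_url("YOUR_CLIENT_KEY", "https://example.gitlab.io/cb", "xyz", challenge)
```

The code_verifier must be kept and sent later with the token request, and the redirect_uri must exactly match the one registered for the consumer.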

MediaWiki forum

Latest comment: 4 years ago • 1 comment • 1 person in discussion

I am proposing a MediaWiki Stack Exchange site. Anyone who wants to contribute can and should. 72.74.131.76 14:59, 23 May 2020 (UTC) The proposal is up at https://area51.stackexchange.com/proposals/124244/mediawiki 65.96.125.113 11:08, 26 May 2020 (UTC) Reply

Starting new wikis

Latest comment: 4 years ago • 3 comments • 3 people in discussion

I was told this website allows you to create wikis. How could I do that?

Please sign with four tildes. You need a web server. Install the software and read the docs. 65.96.125.113 11:11, 26 May 2020 (UTC) Reply
You probably mean Fandom. Ruslik (talk) 20:54, 26 May 2020 (UTC) Reply
Miraheze lets you create wikis. --Rob Kam (talk) 07:20, 27 May 2020 (UTC) Reply

Some CSS for Vector has been simplified

Latest comment: 4 years ago • 2 comments • 2 people in discussion

Hello! I'd like to make a double-check about a change that was announced in Tech/News/2020/21.

Over-qualified CSS selectors have been simplified. div#p-personal, div#p-navigation, div#p-interaction, div#p-tb, div#p-lang, div#p-namespaces, and div#p-variants have all had the div qualifier removed; for example, div#p-personal is now just #p-personal. This was done so the skins can use HTML5 elements. If your gadgets or user styles use these selectors, you will have to update them. This only impacts the Vector skin.

On this wiki, this impacted or still impacts the following pages:

How to proceed now? Just visit all these pages and remove the div before these CSS selectors, if it hasn't been removed already. Thank you! SGrabarczuk (WMF) (talk) 11:25, 25 May 2020 (UTC) Reply
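The mechanical part of that edit can be illustrated with a small script (a sketch: the portlet IDs are the seven named in the announcement, and any real gadget stylesheet should still be re-checked by hand after such a rewrite):

```python
import re

# The seven portlet IDs named in the announcement.
PORTLETS = "personal|navigation|interaction|tb|lang|namespaces|variants"

def strip_div_qualifier(css):
    """Rewrite over-qualified selectors like 'div#p-tb' to '#p-tb'."""
    return re.sub(r"\bdiv(#p-(?:%s)\b)" % PORTLETS, r"\1", css)

print(strip_div_qualifier("div#p-personal li { display: inline; }"))
# → #p-personal li { display: inline; }
```

Selectors for other IDs (e.g. a wiki-specific div#p-custom) are deliberately left untouched, since the announcement only covers the listed portlets.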

My one cent of unhappiness: the WMF did not give us enough (if any?) time to prepare for the change. — regards, Revi 12:00, 25 May 2020 (UTC) Reply

Restricting the use of Content translation tool

Latest comment: 4 years ago • 2 comments • 2 people in discussion

If I have understood correctly, the English Wikipedia has restricted the use of the Content translation tool by 1) requiring 500 edits (extended confirmed user group) before the editor is able to publish a translated article straight to mainspace and 2) disabling machine translation completely. Does any other Wikipedia have similar or other restrictions with the tool? And would it be technically possible to limit the total use of the tool to a certain user group? So not only limiting the publishing feature of the tool like enwiki does, but hiding or otherwise restricting the tool completely from editors who are not part of the user group? -kyykaarme (talk) 22:22, 28 May 2020 (UTC) Reply

You can find the current configuration in InitialiseSettings.php. As far as I can see, only enwiki has set such limitations. Ruslik (talk) 18:29, 2 June 2020 (UTC) Reply

Lua help to small wiki

Latest comment: 4 years ago • 2 comments • 1 person in discussion

Dear sirs, hello from el.wiktionary. I know nothing about Lua. I need to make a module for some 40 placenames for my wiktionary.
<s>I have trouble with a simple version of it: I tried all kinds of combinations, but I keep getting an 'error' and I just cannot do it.
Could someone help? I know it is insultingly easy... But there is no one around anymore who would know how to do it. Thank you</s>
--Sarri.greek (talk) 00:26, 4 June 2020 (UTC) Reply
Done.
Little things like these would make the difference for small wikis: 'how-to' guides for small modules. For example, applying the commands to a non-Latin language (it took me one week to solve). Or how to make a Category from a word. Or how to link a word. Or...
bad argument #1 to 'sub' (string expected, got table).

local export = {}

-- The error "bad argument #1 to 'sub' (string expected, got table)" occurs
-- because a function called via {{#invoke:...|test|...}} receives a frame
-- object (a table), not a string; the string must be read from frame.args.
function export.test(frame)
	local text = type(frame) == 'table' and frame.args[1] or frame
	return mw.getContentLanguage():ucfirst(mw.ustring.sub(text, 1, 1)) .. mw.ustring.sub(text, 2)
end

return export

Sarri.greek (talk) 23:38, 5 June 2020 (UTC) Reply
