
Steward requests/Global permissions/2016-11

From Meta, a Wikimedia project coordination wiki
Warning! Please do not post any new comments on this page. This is a discussion archive first created in November 2016, although the comments contained were likely posted before and after this date. See current discussion.

Requests for global rollback permissions

Latest comment: 8 years ago · 18 comments · 15 people in discussion

Global rollback for Grind24

Status:  Not done
Not ending before 12 November 2016 16:18 UTC

Hi all, I'd like to reapply for global rollback rights. Most of my cross-wiki vandalism fighting dates to last year; however, this is due to an extended wikibreak I took at the beginning of the year. I have reverted vandalism cross-wiki on several projects. The rollback tool is more efficient than manually undoing edits or using Twinkle, so this would make it much easier to revert nonsense/spam edits. I certainly wouldn't abuse the trust. Thanks for the consideration Grind24 (talk) 16:18, 7 November 2016 (UTC)

Not done No consensus, sorry. Concern: "activity is very recent". ~ Nahid Talk 17:31, 13 November 2016 (UTC)
I honestly feel that it will be a big help for the small wikis (I haven't been active recently due to real-life issues). Thank you for your concern.--Grind24 (talk) 19:58, 13 November 2016 (UTC)
The following request is closed. Please do not modify it. No further edits should be made to this discussion.

Global rollback for Mr. Fulano

Status:  Not done
Not ending before 20 November 2016 00:17 UTC

Hi, I'm Mr. Fulano, an eliminator and rollbacker on pt.wiki, but I undo vandalism on many wikis, including in languages that I don't know. Usually I check the global filters to find vandalism to undo. With this tool, I could undo vandalism quickly on many wikis, including wikis with little traffic and editing activity. Mr. Fulano! Talk 00:17, 15 November 2016 (UTC)

Hi. We like to see users be active on many wikis for a few weeks before being granted this user right. I'd recommend being active for a few more weeks, then re-applying. Thanks for helping :-). Matiia (talk) 01:12, 15 November 2016 (UTC)
Closing as Not done due to perceived lack of experience. For gaining global rollback, "users must be demonstrably active in cross-wiki countervandalism or anti-spam activities (...), and make heavy use of revert on many wikis". Please, feel free to reapply later when you gain more cross-wiki experience. Thanks for volunteering. :) RadiX 03:08, 21 November 2016 (UTC)

Requests for global sysop permissions

Latest comment: 8 years ago · 22 comments · 19 people in discussion

Global sysop for Eurodyne

The following request is closed. Please do not modify it. No further edits should be made to this discussion.
Status:  Not done
Not ending before 4 November 2016 20:36 UTC

Hello all. I'd like to apply for the global sysop permission. I've been an active member of the SWMT for almost 2 years now and have a fair amount of experience with tagging pages for deletion and reverting bad edits. I'd like to be able to help out with small wiki cleanup and other maintenance tasks. Additionally, I'm an administrator on Wikidata and MediaWiki. I'm usually on IRC most of the day and can be found under the nick eurodyne. Thank you for your consideration. eurodyne (talk) 20:36, 21 October 2016 (UTC)

  • Support I think he does a good job. Alvaro Molina (Let's Talk ) 21:05, 21 October 2016 (UTC)
  • Support. Good user; he has done good work on the SWMT. --Ks-M9 [disc.] 21:27, 21 October 2016 (UTC).
  • (edit conflict) Oppose I still do not trust this user, in light of his previous history with hat-collecting and obscure renames / multiple accounts. (See also previous requests: 1, 2, 3, ...). --MF-W 21:42, 21 October 2016 (UTC)
  • Support Active and constructive user, would make good use of the tools. --Druddigon (talk contributions) 21:50, 21 October 2016 (UTC)
  • Support -FASTILY 22:30, 21 October 2016 (UTC)
  • Support without question. I've known and collaborated with Eurodyne for quite some time. My trust in Eurodyne's care, integrity, honesty, commitment, and willingness to go the extra mile to volunteer his time and talent towards improving the overall project - has never come into question. This user right will be put into good hands, and will certainly be beneficial. ~Oshwah~ (talk) (contribs) 22:55, 21 October 2016 (UTC)
    The "never" part referring to "care, integrity, honesty, []" is quite impressive, considering the massive abuse of sockpuppets by this user in the past. Whether that, after a few years, is still a concern is a different question, but this statement which approves faking identities, massive hat collecting, and other things which happened in the past is shocking me. --Vogone (talk) 23:33, 21 October 2016 (UTC)
  • Support --HakanIST (talk) 04:40, 22 October 2016 (UTC)
  • Still thinking about this one. I want to support, but global sysop can do a bit more damage than Wikidata adminship due to the lack of scrutiny, and the past issues give me pause. I would prefer to see a longer period of time given the past. --Rschen7754 16:04, 22 October 2016 (UTC)
  • Support per above. KGirlTrucker81 (talk) 17:08, 23 October 2016 (UTC)
  • Oppose I needed a little time to consider this request, but I came to the conclusion access should not be granted. This is a case where a user who has abused almost all of the trust which can be put into an editor has been given a second chance. I believe in these second chances, and think we should allow users to continue to contribute to the projects, provided they don't repeat their abusive behaviour. Now the question is whether this second chance should apply to all kinds of contributions. And this is the point where I see a need to differentiate. In a well-monitored environment, like on Wikidata, I myself trusted Eurodyne with the local admin privileges. However, in this case access to far more wikis would be granted, which requires even more, and "blind", trust. I don't trust this user "blindly", given the actions in the past. I think this can be compared to a real-life example: you would not trust a convicted criminal to be a good judge either. I appreciate Eurodyne's contribution to the SWMT, but would prefer it if he continued it without this user right. Kind regards, --Vogone (talk) 21:27, 23 October 2016 (UTC)
  • Oppose Unfortunately, per Vogone. --Steinsplitter (talk) 16:16, 24 October 2016 (UTC)
  • Support Despite the past issues, it seems as though Eurodyne is qualified and has not fallen off the wagon and gone back to previous habits (as far as I can tell). Cameron11598 (Converse) 16:25, 24 October 2016 (UTC)
  • Oppose Has been quite inactive as of late in his role as a WD admin, and I am still not convinced about his maturity. Also, per Vogone.--Jasper Deng (talk) 19:13, 24 October 2016 (UTC)
  • Oppose - per Jasper Deng. Good vandal fighter, but I have not seen him active at all over the last 6 months. We already have too many inactive GS; we do not need more. --Stemoc 22:21, 25 October 2016 (UTC)
  • Neutral TBhagat (talk) 05:34, 26 October 2016 (UTC)
  • Oppose Mostly per Jasper Deng and Stemoc. Looking at Eurodyne's logs at WD [1], 100 actions brings me all the way back to December 2015. INeverCry 09:02, 30 October 2016 (UTC)
  • Neutral I want to support, but the concerns pointed out by Jasper Deng, Stemoc and INeverCry lead me to hesitate. From my interactions with Eurodyne, I know that he is a keen and enthusiastic user who is willing to help. I also know that once he obtains tools, he would definitely do his best to utilise them. However, his inactivity on Wikidata leads me to doubt whether he would stay active or not as a GS. I have nothing against Eurodyne personally. What I actually want to see is him becoming active in CVU work before requesting GS. I think this is a bit too soon. Will consider supporting in the upcoming months. Jianhui67 talk contribs 08:24, 31 October 2016 (UTC)

Requests for global IP block exemption

Latest comment: 8 years ago · 10 comments · 8 people in discussion

Global IP block exempt for JuniorX2

Status:  Done

Hi! I'm an active user of newiki, maiwiki, Meta, etc. Please grant me global IP block exemption because I don't want my account to be blocked when I have to use Tor or an anonymous proxy abroad. Thanks! —JuniorX2 Chat Hello! 12:16, 19 November 2016 (UTC)

Done Ruslik (talk) 19:28, 20 November 2016 (UTC)

Global IP block exempt for SeniorStar

Status:  Done

Hi! I'm an active user of newiki. Please grant me global IP block exemption because I don't want my account to be blocked when I have to use Tor or an anonymous proxy abroad. Thanks! —SeniorStar (talk) 13:52, 21 November 2016 (UTC)

@SeniorStar: GIPBE is for globally active users (those who are active on multiple wikis). You're active on newiki; you can request IP block exemption there at the local request page. Regards, TBhagat (talk) 08:28, 23 November 2016 (UTC)
@Tulsi Bhagat: Yes, I am active on newiki, but I use other wikis too. I use Meta-Wiki, and I'm planning to edit the Hindi Wikipedia as well, so I think this right is needed for me. Regards and good hope from all. SeniorStar (talk) 11:20, 23 November 2016 (UTC)
Done Ruslik (talk) 15:57, 23 November 2016 (UTC)

Global IP block exempt for Arunasp

Status:  Done

Hello. The private server I am using, on the IP address 176.9.61.81, falls into a blocked IP range. Please exclude this IP address from the blacklist. Thanks! --Arunasp (talk) 11:29, 24 November 2016 (UTC)

Done--Vituzzu (talk) 11:35, 24 November 2016 (UTC)

Global IP block exempt for Wikid77

Status:  Done

First request. I edit pages daily on enwiki, dewiki, or eswiki edit-previews using an AT&T mobile phone, in Desktop view, but every few days I get View Source on all wikis (edit refused) due to frequent global IP-range blocks. Lately, many blocks of ".0/24" ranges (256 IPs) made by User:MF-Warburg have killed edit-previews. See the complaint "User_talk:MF-Warburg#Periodic blocks of AT&T phone IP range" (dif542), which suggested I request IP block exemption. Not sure if such IP blocks should be banned for AT&T cell phones, or if I alone get exemption. Thanks, -Wikid77 (talk) 16:22/16:37, 26 November 2016 (UTC)

Done Granted for 1 year, to expire on 27 November 2017. User in good standing in need of this. Please come back a week or so before the permissions expire so we can renew it in case you still require it. Best regards. —Marco Aurelio 13:30, 27 November 2016 (UTC)

Requests for global rename permissions

Latest comment: 8 years ago · 27 comments · 21 people in discussion

Global rename for Tulsi Bhagat

Status:  Not done
Not ending before 1 December 2016 16:44 UTC

Hello. I have a past withdrawn request. I would like to apply for global renamer to assist the communities in further work. I would like to help Nepali, Maithili, Hindi, and Bhojpuri speaking users here and also at local request pages. I observed that there are only a few global renamers from newiki, maiwiki, hiwiki, and bhwiki. I'm an admin on both maiwiki and newiki, but not a crat anywhere (verify). Also, I'm helping with rename-related work here and on newiki and maiwiki. I'm aware of all relevant rename policies as well as their scope. One can connect with me (nickname: Tulsi) on the related IRC channel #wikimedia-rename. Have a good day/evening ahead. Thanks for the consideration. Regards — TBhagat (talk) 16:44, 17 November 2016 (UTC)


Requests for other global permissions

Latest comment: 8 years ago · 128 comments · 35 people in discussion

add global OTRS member for The Polish

Status:  Done

thanks, --Krd 13:01, 3 November 2016 (UTC)

Done, QuiteUnusual (talk) 14:47, 3 November 2016 (UTC)

Global editinterface for タチコマ robot

Status:  Not done

I previously filed a global sysop request; after an IRC discussion, I feel interface editor rights are more appropriate for the task, as global sysop would indeed be overkill.

We have a large number of wikis with double-redirect issues. I operate a bot that fixes these on practically every wiki. Over the years these smaller wikis have accumulated double redirects bots cannot fix. This may be because the page is protected or because the double redirect is formed by circular or self redirects.

The protected pages can be due to admin protections or user .js or .css files that are leftover as redirects if the user account is moved more than once. I fixed plenty of these on Commons wiki using my admin privileges. An admin helped me with the task on en.wikipedia. If the bot (User:タチコマ robot) is given global sysop (even briefly, for a day or two) it can fix the protected double redirect pages on the smaller wikis that accumulated in over a decade.

The bot will not use any admin privileges aside from editing protected pages.

After that there will still be a need to process what's left manually. I tend to recommend speedy deletion of circular redirects since they lead nowhere to begin with. They are a navigational hazard.

-- とある白い猫 chi? 20:01, 22 October 2016 (UTC)
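For readers unfamiliar with the task: the fix described above reduces to following a redirect chain until it either ends at a real page or loops. A minimal sketch of that logic in Python (illustrative only; the actual bot runs pywikibot's redirect.py, and the helper names here are hypothetical stand-ins for wiki I/O):

    # Sketch of the double-redirect fix described in this request.
    # get_target(page) returns the redirect target of `page`, or None
    # if `page` is not a redirect; set_target(page, target) saves an edit.

    def resolve_chain(get_target, start):
        """Follow redirects from `start`; return the final non-redirect
        page, or None if the chain loops back on itself (circular)."""
        seen = {start}
        page = start
        while True:
            target = get_target(page)
            if target is None:
                return page        # reached a normal page: chain resolved
            if target in seen:
                return None        # circular redirect: left for humans
            seen.add(target)
            page = target

    def fix_double_redirect(get_target, set_target, redirect):
        """Retarget `redirect` at the end of its chain, so A->B->Page
        becomes A->Page. Anything that is not a double redirect is ignored."""
        first_hop = get_target(redirect)
        if first_hop is None:
            return                 # not a redirect at all: ignore
        final = resolve_chain(get_target, first_hop)
        if final is not None and final != first_hop:
            set_target(redirect, final)   # the only edit the bot would make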

  • Support Task is clear, trusted user. --HakanIST (talk) 20:09, 22 October 2016 (UTC)
  • Neutral I still have some concerns. Alvaro Molina (Let's Talk ) 20:18, 22 October 2016 (UTC)
    • Care to share? --MF-W 21:19, 22 October 2016 (UTC)
      • Comment I am not entirely comfortable with a bot being able to edit protected pages, such as the interface or the MediaWiki namespace, on all the projects this permission covers. I understand that it is to fix double redirects, but I am still somewhat mistrustful. That said, I wish you luck, and I think this is more sensible than the other, more delicate permission. Alvaro Molina (Let's Talk ) 02:01, 23 October 2016 (UTC)
        • I would agree with you if this request came from a new unknown user who would request the rights for an indefinite time. Though even then it would still be possible to closely monitor the activities of the bot and step in when something abusive happens. --MF-W 07:48, 23 October 2016 (UTC)
        • @AlvaroMolina: Bot ignores every page not listed on Special:DoubleRedirects. All it will do is edit pages that are redirects that point to redirects and make them point to the page at the end of the redirect chain. If such a page can't be found, it will ignore it. If the page it edits is not a redirect, it will again ignore it. It is performing this on about 700 wikis. -- とある白い猫 chi? 13:18, 28 October 2016 (UTC)
  • I am sympathetic towards this request. I wonder if we could get a list or at least an estimate of the number of these protected broken redirects. I also expect that probably a good number of the affected pages on small wikis were protected a long time ago and could very well be unprotected now. --MF-W 21:19, 22 October 2016 (UTC)
  • Support Useful task; operator knows what they're doing. -FASTILY 01:43, 23 October 2016 (UTC)
  • Oppose I still see a mixture of two requests here, editing protected pages and deleting circular redirects. Both are different things IMO. I'd say:
    • Circular redirects in namespace 0 unlikely affect protected pages and don't require the editinterface right.
    • Circular redirects in namespace 1 likely can and shall be resolved based on the situation of the related ns-0 pages.
    • Circular redirects in namespace 2, even if at user .js or .css files, are not a navigational hazard as stated by the requestor. Likely nobody will ever see that.
    • The same applies more or less for all other namespaces: It should be resolved, or it doesn't really matter. Please prove me wrong.
    As a person who appreciates any cleanup, I'd support this generally, but the approach "speedy deletion since they lead nowhere to begin with" appears a bit too simple to me. I also suggest building maintenance lists first and trying to have a native speaker of the individual related language take a look. --Krd 06:34, 23 October 2016 (UTC)
    He doesn't request deletion rights anymore, so there is no danger of any deletions. --MF-W 07:48, 23 October 2016 (UTC)
    I'm not really sure about that, but anyway I wonder how many relevant protected double-redirect pages there could be. I think it isn't asking too much, regarding a critical thing like editinterface, to do one's homework first and prepare some numbers. --Krd 08:08, 23 October 2016 (UTC)
    Sorry, I don't understand how deletions should happen if a user doesn't have the possibility to delete pages. --MF-W 11:51, 23 October 2016 (UTC)
    Cite "The bot will not use any admin privileges aside from editing protected pages." and following. I'm not sure if he thinks he will have deletion rights.
    But even if not, simply tagging affected pages, protected or not, for speedy deletion will IMO not be the best approach, regardless of whether it is done in bulk or with a slow-start method. Such things are usually done with lists of cases, per wiki, per namespace, where the respective admin can do both: get an overview of the whole situation and resolve individual cases one by one. I don't see documentation of the intended workflow here, which I consider mandatory before doing this at full scale. --Krd 12:17, 23 October 2016 (UTC)
    Just to clarify, the idea here is to edit protected pages, not delete them. A protected or unprotected double redirect gets the same fix from the bot, conversion to a regular single redirect, as the bot has done for years. The bot operates on hundreds of wikis. Even with global sysop I would not delete anything. I do not have any code (tested or otherwise) to deal with circular redirects, nor do I have any intention to deal with them automatically or through a bot. I would not recommend automated deletion of circular redirects, as that can be easily abused. -- とある白い猫 chi? 12:24, 23 October 2016 (UTC)
    How many protected double redirects are there in total, or for some example wikis? Does this bot account only run tasks related to double redirects, or which other jobs are done by this account on any wiki? --Krd 12:35, 23 October 2016 (UTC)
    • @Krd: The short answer is that it depends on local activity. The long answer is... sorry, this will be a really long reply. I want to detail the operation as well as my rationale as much as possible. :)
    • The bot only handles double redirects on all wikis except Commons and en.wikipedia, where it handles separate tasks when requested. Reviewing the bot's global activity may give a better idea of its operation. You will immediately notice large gaps for some wikis. This has to do with the rate of double-redirect generation on them. Some wikis do not have a vibrant enough community to generate them, some wikis do not move pages at all so double redirects are hardly an issue, and some wikis simply got locked. You will notice there are very few entries; you will typically observe one or two in a year. Typically such redirects are protected against vandalism or other persistent long-term abuse. I just don't think this is something humans should spend too much time looking at if a bot can handle it perfectly fine.
    • As for numbers, on the largest wiki (en.wikipedia) it isn't surprising to observe 300-700 double redirects per cycle (whenever en:Special:DoubleRedirects is generated, probably daily); smaller wikis do not typically see more than 10 double redirects generated per week. It isn't unusual for the smallest wikis not to generate a single double redirect in a month. These wikis are run by few users, who would better spend their time on any other task on the wiki. Collectively, 5,000-10,000 double-redirect edits a week Wikimedia-wide isn't unusual, with the vast majority on the largest wikis. This is merely my guesstimate.
    • The problem is that individually, smaller wikis do not have that many problems. Collectively it would become a chore, since we have hundreds of wikis. Let me try to find a few wikis that exemplify the issue. Consider scn:Spiciali:RedirectDoppi and map-bms:Astamiwa:Pengalihan ganda, which list protected redirects; ba:Special:DoubleRedirects as an example of circular redirects; and bar:Spezial:Doppelte Weiterleitungen as an example of both. Note that an entry with a strikethrough is typically "fixed".
    -- とある白い猫 chi? 13:28, 23 October 2016 (UTC)
    Ok, I think that is reasonable. Changing to support. --Krd 15:17, 23 October 2016 (UTC)
    Switching back to oppose. Sorry, the numbers presented below do not at all support your arguments, and having the real numbers now, I agree with Savh that fixing the actual problem is even less manual work for a steward than evaluating this request. --Krd 14:10, 30 October 2016 (UTC)
    @Krd: As you wish. The numbers presented by him are the same as mine, though. Bear in mind he is talking about protected double redirects, not all double redirects. If that is the case, why has a steward or global sysop not resolved the issue already? For example, Savh himself (a steward) could fix the issue on every wiki instead of arguing about it. He could have done so since 2012. I am curious how many protected double redirects he has resolved to date. It would void my request entirely and save me a lot of time discussing. I do not care about the flag as long as the issue is resolved. The entire task for the steward here is to grant the bot the flag for something like 20 minutes (I mentioned 3 days for good measure in case of issues with the API etc. that happened recently) and then remove it. Each action should take about 10-20 seconds, so half a minute total (literally twice the amount of time it had taken me to read your message above). It naturally needs to be coordinated with me so that I make sure the bot is running within that ~20-minute window. The other alternative is doing it by hand, which would take hours to days if conducted through the local communities, as this would probably require hundreds of local discussions. The steward would not be allowed to use any kind of script, as per the whole point of this request. The end result will be the same, though humans may make mistakes. I am all for inefficiency if someone else is going to process it. -- とある白い猫 chi? 12:17, 31 October 2016 (UTC)
    The issue is even prevalent here on Meta: Special:DoubleRedirects. -- とある白い猫 chi? 12:32, 31 October 2016 (UTC)
    Why haven't I done it? Because it isn't a problem that needs solving. With regard to Meta, there is not a single one your bot would fix using editinterface that should be fixed. Savh 12:47, 31 October 2016 (UTC)
  • Wikimédia France/Gouvernance/Plan d'action 2017-2019 (struck)
  • User:Golgaris Schwinge (edit) → User:* Golgari * → User:-Golgari- (not protected)
  • User:Golgaris Schwinge/EditCounterGlobalOptIn.js (edit) → User:* Golgari */EditCounterGlobalOptIn.js → User:-Golgari-/EditCounterGlobalOptIn.js (should be deleted, not fixed)
  • User:Golgaris Schwinge/global.css (edit) → User:* Golgari */global.css → User:-Golgari-/global.css (should be deleted, not fixed)
  • User:* Golgari * (edit) → User:-Golgari- → de:User:Tol'biacMG (not protected)
  • User:XOXOXO/EditCounterGlobalOptIn.js (edit) → User:AzorAhai/EditCounterGlobalOptIn.js → User:Sunfyre/EditCounterGlobalOptIn.js (should be deleted, not fixed)
  • User:XOXOXO/global.js (edit) → User:AzorAhai/global.js → User:Sunfyre/global.js (should be deleted, not fixed)
  • User:HZI/global.css (edit) → User:Golgaris Schwinge/global.css → User:* Golgari */global.css (should be deleted, not fixed)
  • User:T0taku/global.js (edit) → User:Hrum-Hrum/global.js → User:Sunpriat/global.js (should be deleted, not fixed)
  • User:InformationvsInjustice (edit) → User:InformationvsInjustice → User:InformationvsInjustice (not protected)
  • User talk:Evan.mc.doyle (struck)
  • Help:Redirect/Redirect 3 (edit) → Help:Redirect/Redirect 1 → Help:Redirect/Redirect 2 (should not be fixed; this is an example of double redirects)
  • Help:Redirect/Redirect 1 (edit) → Help:Redirect/Redirect 2 → Help:Redirect/Redirect 3 (should not be fixed; this is an example of double redirects)
  • Help:Redirect/Redirect 2 (edit) → Help:Redirect/Redirect 3 → Help:Redirect/Redirect 1 (should not be fixed; this is an example of double redirects)
  • Help:Redirect/double redirect (edit) → Help:Redirect/single redirect → Help:Redirect/Target (should not be fixed; this is an example of double redirects)
What is the difference between relinking those .css/.js pages and deleting them? You say "it should be deleted" and yet you don't. You claim double redirects are not an issue. Then why do we have examples of them? Why do we have pages explaining them in detail? Why even generate the Special page? Why even deal with any problem if something trivial is fine to leave as is? Why do we even have Redirect.py in pywikibot core?
What you are saying is: 1) the portrayed issue exists, 2) you do not want the bot fixing the issue, 3) you will not fix it yourself, and 4) you do not think it is an issue.
-- とある白い猫 chi? 19:56, 31 October 2016 (UTC)
I will repeat myself again: "Why haven't I done it? Because it isn't a problem that needs solving". The existence of these double redirects does not cause any harm; no one really cares about an abandoned .js/.css page, and at this time I would have to, according to you, not only explain that they are better deleted but also delete them - doubling the required waste of time. For your next replies, please find the answer in any of my previous comments, since I'm tired of repeating myself. Savh 21:42, 31 October 2016 (UTC)
@Savh: Since I started editing Wikipedia in 2005, we have always had backlogs. It was always the case that people worked towards lowering backlogs to zero. That has always been the gold standard. You want 0 pending copyright reviews, you want 0 pending deletion requests, you want 0 pending permissions requests, you want an empty OTRS queue. Never in my experience on any Wikimedia project have I seen SUCH pushback on a non-controversial maintenance task.
Never have I heard of a steward suggesting that backlogs with straightforward solutions should be kept until the end of time because, essentially, they said so. The entire basis of your argument is that this backlog issue is not an issue and should be ignored. As a steward and as a Meta-Wiki admin, your role includes, among other tasks, facilitating the maintenance of Wikimedia sites. Not only are you not fulfilling your role, you are arguing that others shouldn't either.
The presence of protected double redirects obscures the other types of problems in the double-redirect log: circular redirects and interwiki redirects. I want to have the protected double-redirect issue resolved so that I can then focus on those other two issues.
-- とある白い猫 chi? 00:09, 1 November 2016 (UTC)
  • Except, ironically, one of your bot's old user pages (which should really be unprotected, not fixed), none of the lists of double redirects you link includes a redirect that is somehow protected. Can you give a clear answer: how many fully protected double redirects are there that need fixing? Savh 00:42, 28 October 2016 (UTC)
  • Oppose Any explanation for this? P.S. And I urgently recommend reading Requests for comment/-jkb-. Regards -jkb- 10:55, 23 October 2016 (UTC)
  • Oppose I don't think that bots should have any elevated global permissions in general. Fortiori if it is not disallowable their editing on local wiki (they just edit without bot flag then).
    Danny B. 11:37, 23 October 2016 (UTC)
    What does the second sentence of your comment mean? --MF-W 11:51, 23 October 2016 (UTC)
    AFAIK, you can not disallow (apart from blocking account by account, of course) editing of global bots on the wiki. Such account still technically can edit on the wiki, edits are just not marked with "b" flag. However, that's just an add-on, my major point was the first sentence.
    Danny B. 12:18, 23 October 2016 (UTC)
    The request here is an exception. To my knowledge there is only one bot that deals with double redirects on almost all wikis, this one.
    In its operation the bot has never incorrectly corrected a double redirect. The code isn't even mine. It is pywikibot's redirect.py, which hasn't been modified in eons, so I would call it very stable. Its task is a very simple one. If a redirect leads to another redirect (a double redirect), it follows the redirect chain until it either hits one of the pass-through redirects again (hence circular) or it converts the redirect to point to the actual page. I.e. if A->B->Page, A is edited to redirect directly to Page.
    I did run into difficulties with some communities that demanded I apply for a bot flag as the bot got noisy. This is only an issue once wikis grow large enough; the smallest wikis do not even get a single double redirect in a week, so pretty much no edits happen. A few wikis (three IIRC) want to handle double redirects manually for whatever reason, so I do not operate my bot on those.
    -- とある白い猫 chi? 12:40, 23 October 2016 (UTC)
  • Support, I believe this is a useful task and since the way the bot works remains unaffected by this I see no real danger of something going wrong either. However, for how much time shall the user right be granted? (this permission is temporary by default) --Vogone (talk) 12:50, 23 October 2016 (UTC)
    Giving permission for a single cycle should be sufficient. The main problem here is double redirects that have accumulated over a decade. If granted now, probably for a few days for good measure. After that point it would be a separate matter. It can be re-granted yearly, but in that case any protected double redirects would have to wait until this second granting. Also, after the first run, these may be managed by humans, since there shouldn't be too many new entries a year. -- とある白い猫 chi? 13:32, 23 October 2016 (UTC)
  • Support Reasonable request, trusted user who knows what he is doing. I do not see why not. Jianhui67 talk contribs 15:55, 23 October 2016 (UTC)
  • Support. Using this global permission to resolve this task is OK. --Ks-M9 [disc.] 17:31, 23 October 2016 (UTC).
  • Oppose This request is more within scope, but I don't like the attitude displayed in the last request; it smells too much like "I'm a valuable bot operator doing an invaluable task so I get to treat communities that object however I want". --Rschen7754 21:49, 23 October 2016 (UTC)
    • @Rschen7754: I am sorry, but I would not characterize myself as a "valuable bot operator". There are people who spend days to months writing complicated code to take care of tasks who are far, far more valuable than I am. I take care of a task so that humans (editors, developers, etc.) can worry about pretty much everything else. Bear in mind that if any wiki objects to the bot's operation, I do not run it. The bot currently operates on 700 wikis (see User:とある白い猫#Bots for an outdated list; most of those are local flags based on local consensus).
    -- とある白い猫 chi? 22:54, 23 October 2016 (UTC)
  • Support Valid rationale, and he can be trusted with the bit. — TBhagat (talk) 05:27, 26 October 2016 (UTC)
  • Oppose, per my oppose to this exact same proposal in 2012. Double redirects that are fully protected should, most likely, either be deleted or unprotected, and both of those actions fall outside the scope of this tool. User renames no longer generate a redirect for .js/.css pages, and the number of people having been renamed twice really is minimal. This right gives a great deal of access, and granting it for such a useless reason is, in my opinion, a waste of everyone's time - be it wasting it on this request (for the second time!) or on reviewing the correct use of the tools. Savh 22:43, 27 October 2016 (UTC)
    • @Savh: There is no consensus for your claim that protected redirects should either be deleted or unprotected. That is just your opinion. Since 2012 the problem has not gotten better, and humans have not undertaken this task. It doesn't appear likely they will do so in the future either. It is a really trivial problem that becomes not so trivial when you try to manage it on 700+ wikis. On en.wikipedia alone, 54 of these .js/.css pages were manually fixed by an admin as they accumulated over time. On Commons an admin fixed this issue as well, with 8 edits. A bot could have made all of those edits without an issue.
    • Are you claiming that granting this access for a limited period of time (even 1-3 days) would cause problems? This code has worked fine for over a decade now and I operate it on almost 700 wikis; there isn't much to verify. It already makes thousands of identical edits to non-protected pages every week. The page being protected makes no difference as far as the code is concerned. A redirect is a redirect. A redirect pointing to a redirect is a double redirect.
    • Or do you object on the basis that I cannot be trusted with this level of access? I have been entrusted by this community in different areas. I do not know what you expect me to do if I am granted the right to edit protected pages through my bot. I feel your fears are unwarranted in both areas. I am not here to wreck the project.
    -- とある白い猫 chi? 13:04, 28 October 2016 (UTC)
    First of all, I find it surprising you invoke certain invented fears I might have to oppose your request; I would greatly appreciate it if we could keep your fantasies out of this discussion. A thing I do fear is that this request creates a precedent, and more people start wasting our time on insignificant, trivial, absurdist solutions to nonexistent problems, like this one, which has only bothered a handful of people since 2012. I feel this fear to be totally warranted, and you are free to feel whatever you like about my fear, but that unfortunately is not really convincing.
    With regard to "There is no consensus for your claim that protected redirects should either be deleted or unprotected", please note I am not suggesting that you should delete or unprotect them, but that they most likely don't need fixing and can remain protected - and where is the consensus for that?
    I, with regard to protected double redirects, distinguish between user .js/.css subpage double redirects and normal protected redirects:
    1. User .js/.css redirects should be deleted as is currently done automatically when moving such a page - and therefore do not require fixing.
    2. Normal protected redirects are usually fully protected for a reason, and editing a fully protected page that is not part of the site's interface is outside this tool's scope. On top of that, from the single example you link, I believe (and I don't say there is or is no consensus) the local communities would probably benefit more from unprotecting them and allowing any user to fix them than from having them relinked. I therefore suggest, repeating a suggestion by Quentinv57 in 2012, that you simply notify the communities and/or global sysops with a generated list of the protected double redirects if it is such a problem.
    The consensus to grant you access to this right so you could edit those protected pages would presumably have to emerge from this request, by granting you a right which, per the relevant policy, "can seriously disrupt Wikimedia wikis if used incorrectly", and is therefore "only assigned to users who have a strong track record in maintaining code and scripts". Do you have a strong track record in maintaining code and scripts?
    Additionally, you have been consistently vague, now and in 2012, about the number of fully protected double redirects, linking to a single case of an old user page of your bot which, as far as I see, has no valid reason for being protected (@Melos?). Can you also give an estimated number of fully protected double redirects that need fixing, linking at least a few different cases that do not need either unprotection or deletion? Savh 16:50, 28 October 2016 (UTC)
    @Savh: I honestly expected such trust concerns. I am not too involved with interwiki issues aside from double redirects and Commons, so I think it would be only normal for people to have such fears. And as you said, setting a precedent also requires a level of trust.
    There is consensus that double redirects should be fixed. Even the global bot policy mentions double redirects. If not fixed, the Special:DoubleRedirects log would eventually become useless, as it has a finite limit of 5,000 entries (as with most WMF logs) and anything beyond that is pruned. This list is generated in cycles, which I think is daily; it seems to be semi-irregular, as double-redirect log generation is an expensive query on the WMF end.
    1. There are plenty of existing .css/.js pages that have accumulated over a decade. Mind that they still form: fr.wikipedia has one formed 9 days ago, en.wikipedia has one formed 6 days ago, and this wiki (Meta) has 6 such pages. Bear in mind en.wikipedia had 54 of these that were recently cleaned up by a human (en:User:Oshwah), which is part of why I decided to seek the ability to edit protected pages: it is clear the accumulation keeps happening and I do not want to bother humans with such a mundane task.
    2. I have notified local communities in the past, as well as global sysops and stewards, for months now. Not only did none of them fix any of these pages, none bothered to reply to my inquiry. A bit rightfully so, since the problem individually is trivial. A human fixing this problem requires a significantly larger period of time and actual attention. More on this later. That single case is my bot. I did not protect that page nor request its protection. Regardless, it shows up in the log, so it is a problem that needs to be fixed.
      • Most protected redirects are protected over issues like disruption from vandalism and POV disputes (enforced by community consensus). If a protected redirect leads to a page that is moved per community consensus, the bot would simply be performing the after-move cleanup. There are positively no logical reasons to keep double redirects, as they offer us nothing.
      • There are a few exceptions to this. There are two sample pages on en.wikipedia that demonstrate double redirects, and there is a Meta page that demonstrates double redirects. These are the only examples I am aware of. I question why such examples are needed anymore, since fixing double redirects is more or less an exclusively bot domain. This is an issue I am mindful of. Should an interface flag be granted, I will stop running the bot on these two wikis until either the flag is removed or the matter is resolved locally.
    I have been a software engineer for over a decade. I have been a Wikimedian for over a decade. I have coded things on-wiki such as cascading templates, as well as IRC bots for vandalism detection and other such scripts. I hold a bot flag on many wikis locally, as well as a global one. Pywikibot's redirect.py has been among the most stable of the core pywikibot files for over a decade now; I do not believe its code has changed at all in the past 5-8 years aside from language additions. In my operation of the code, fixing hundreds of thousands of double redirects, I have never encountered a situation where it failed its specifications in any way. Also bear in mind the bot will ignore the entire wiki aside from pages listed under Special:DoubleRedirects. If a page is listed under Special:DoubleRedirects but is NOT a redirect, it will be ignored. If the redirect chain does not lead to a page (circular or broken redirects), it will be ignored.
    I do not know how many protected double redirects there are; I run into a few of them every now and then. They pass by in the bot's log. Individually they are but a blip. Gathering such information is a bit nontrivial, since I really do not want to parse this useless information. I expect very few cases to exist per wiki, if any at all. Checking each wiki would take exponentially more time than fixing the issue itself with the bot; the protection makes no difference to the bot. I got the above examples by randomly clicking through about 12 wikis on the Meta Wikipedia list. More on this below.
    The meat of the problem is that the task means dealing with 700+ wikis individually for a specific maintenance chore. That is all double redirects are: a maintenance task. It is trivial and non-controversial. It is also a continuous task, so constant monitoring will be required over time. So here are the questions I want to ask you, to emphasize the amount of time NOT letting a bot handle this would waste.
    1. How much time would it take to review 700+ wikis to identify protected double redirects? Bear in mind, most wikis will have an empty special page so just checking it will be a waste of your time.
    2. How much time would it take to notify the wikis on which a problem is detected?
    3. How much time would it take for the local community to understand the problem? (It isn't surprising to run across an on-wiki admin who does not know anything about double redirects.)
    4. How much time would it take to constantly process the above three items? How frequently should we ask global sysops, stewards, and local communities to drop everything they are doing to reprocess this?
    It is indeed a problem no one cares about, because people like me have been mitigating the impact of double redirects with bots. If I appear to be boasting, I am not. This is a trivial maintenance task. By letting a bot deal with it, we free up time for the local communities so they can spend it on anything else instead of wasting it on mundane maintenance with a known automatic solution.
    -- とある白い猫 chi? 00:34, 29 October 2016 (UTC)
    Both those recently created redirects have no value at all, and should probably be deleted - fixing them only solves the "they appear on the DoubleRedirects page" issue. Rather than checking all 700+ wikis for how many fully protected double redirects there are, I have taken a sample: checking all Wikipedias whose language codes are within the aa-lx range (skipping en.wiki; 137 wikis), there are only 6 with one or more protected double redirects: arwiki (1), cawiki (35, all subpages of their main page), frwiki (1, the one you mention, which points to a dead link - and should probably be deleted), gotwiki (1), hiwiki (6, all .js subpages of a single user), kywiki (1). If, in over 130 wikis, there are protected double redirects on only 6, do you really think it's worth requesting this global right? Twice? Savh 01:05, 29 October 2016 (UTC)
    @Savh: Sure, they can be fixed by deletion, but why bother? Like you said, the pages typically aren't very useful to begin with (.css/.js). Whether they are processed by the bot (in under a second) or by a human (easily about a minute) makes no difference at the end of the day. I feel fixing them regularly by editing is more fail-safe than deletion, since I do not speak every language. The idea here is to save on human time.
    I do feel the global right would save time, so as not to review 700 wikis constantly for a trivial issue. Running it for something like 3 days, as I suggested above, would take care of the decade's worth of backlog (which could be one or two protected pages for every nth wiki; reviewing is more trouble than it is worth). Beyond that, should this be used for sustained maintenance? Honestly, I do not see the harm. If it becomes evident that this is not needed, it can be lifted at any point in the future. This is not a human, it's a bot. It won't complain if its access is suddenly removed. :) I have no need for "Global editinterface" myself, after all.
    -- とある白い猫 chi? 01:39, 29 October 2016 (UTC)
    You're still wasting our time with a request that a) does not comply with the policy, b) does not solve much, and c) could better be done by notifying the local communities (as you were already told 4 years ago).
    Granting your bot the right to edit all interface and protected pages across all wikis just so it can edit a handful of redirects is total overkill. I have completed a look at all Wikipedias' listed protected double redirects: of the 264 Wikipedias, 21 have one or more protected double redirects: arwiki (1), cawiki (35), frwiki (1), gotwiki (1), hiwiki (6), kywiki (1), mrwiki (1), newiki (1), orwiki (2), pamwiki (1), quwiki (1), sdwiki (1), simplewiki (2), sqwiki (3), suwiki (1), tetwiki (2), ukwiki (19), vecwiki (1), viwiki (2), zhwiki (2), zhyuewiki (1). None come close to the dreaded "5000 pages in the list" problem, and many still don't have a valid reason to either be protected or not deleted. So, unless there is a real (within-scope) reason for your bot to have editinterface access, I see no real benefit from granting access to your bot. Savh 08:40, 29 October 2016 (UTC)
    @Savh: What I was told back in 2012 does not hold water. Since then, local communities have not addressed the issue, nor have stewards or global sysops. Why? Because doing so is a complete waste of their time. You have not stated a single reason why humans are more suited for this task than bots.
    No one here said any wiki was anywhere close to the "5000 pages in the list" problem. Thank you for confirming the intuitive numbers I mentioned prior. As you can see in my response to Krd, I said that I was expecting one or two problem pages per wiki, if any at all. The premise of your entire argument is what this request claimed from the beginning: few edits (1 or 2 most likely) would happen on many wikis. Summing your numbers for just 264 wikis: 1+35+1+1+6+1+1+1+2+1+1+1+2+3+1+2+19+1+2+2+1 = 85 double redirects (0.322 double redirects per wiki). It would take a human about 85 minutes if each correction takes a minute, 42.5 if they take half a minute. It will likely take a lot more time, since the human needs to either be granted a flag themselves or ask the community and discuss the issue - and will possibly also make mistakes, given how monotonous the task is. If we take this 85/264 rate as the average, that would be about 225 problems for 700 wikis. How much time did it take you to review 264 wikis to generate the numbers above? Why do you want people to regularly spend 700/264 = 2.65 times that time on this?
    -- とある白い猫 chi? 11:00, 29 October 2016 (UTC)
    1. "No one here said any wiki was anywhere close to the "5000 pages in the list" problem". Indeed, no one said such a thing (and note I don't say you said that), but you do mention that "If not fixed, the Special:DoubleRedirects log would eventually become useless, as it has a finite limit of 5,000 entries (as with most WMF logs) and anything beyond that is pruned".
    2. "Thank you for confirming the intuitive numbers I mentioned prior", but I see nowhere that you suggested there would be such cases on only about 10% of all wikis, and in most cases only one or two of them. Only 5 Wikipedias have 3 or more protected double redirects.
    3. "You have not stated a single reason why humans are more suited for this task than bots". I have, if you understand that humans are, in this case, local community members instead of a global bot. I have repeatedly stated humans/local community members can evaluate the very few cases on their wiki (if any, see #2) better, and decide whether fixing the redirect is really the solution they need, instead of either deleting it or unprotecting the page.
    4. "It would take a human about 85 minutes if each correction takes a minute". There have been 57 edits to this (and the GS) request now. There have been 19 edits to both your 2012 requests. Assuming those edits take the same time (which obviously isn't the case, as correcting a double redirect is usually more straightforward), this request indeed consumes nearly as much time as allowing the local community members to fix it.
    As I've mentioned before, granting rights for such tasks creates a precedent for an unwanted waste of time from which nobody benefits. Savh 15:42, 30 October 2016 (UTC)
    @Savh: Alright, if you can find a way for the local communities to handle this issue on all wikis in under 20 minutes (struck: "an hour"), I will withdraw my request. That is the amount of time it would take the bot to process them. Dealing with double redirects is indeed straightforward, which is why bots deal with it. If dealing with the process takes any longer than the arbitrary time I specified, it will waste the time of everyone involved. I have been trying to answer the questions you kept asking. I am sorry you see it as a waste of your time. -- とある白い猫 chi? 20:58, 30 October 2016 (UTC)
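As a side note on reproducibility: a per-wiki count like the sampling in this thread can be scripted, since Special:DoubleRedirects is exposed through the MediaWiki API's querypage list module and protection status through prop=info. A rough sketch under those assumptions (one wiki at a time; error handling and rate limiting omitted):

    import requests

    def protected_double_redirects(api_url):
        """List fully protected entries on one wiki's Special:DoubleRedirects."""
        s = requests.Session()
        data = s.get(api_url, params={
            "action": "query", "list": "querypage",
            "qppage": "DoubleRedirects", "qplimit": "max", "format": "json",
        }).json()
        titles = [r["title"] for r in data["query"]["querypage"]["results"]]
        hits = []
        for i in range(0, len(titles), 50):   # prop=info accepts 50 titles per query
            info = s.get(api_url, params={
                "action": "query", "prop": "info", "inprop": "protection",
                "titles": "|".join(titles[i:i + 50]), "format": "json",
            }).json()
            for page in info["query"]["pages"].values():
                if any(p["type"] == "edit" and p["level"] == "sysop"
                       for p in page.get("protection", [])):
                    hits.append(page["title"])
        return hits

    # Example: one of the wikis from the sample discussed above.
    print(protected_double_redirects("https://ca.wikipedia.org/w/api.php"))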
  • Support This is a request for a temporary flag, to allow completing a non-controversial maintenance task using 'standard' software (not even written by the operator) that has completed many thousands of edits on hundreds of wikis with no problems. The only real reason to oppose this is if the operator is expected to abuse the ability to edit interface pages, or if some wikis desire such work not to be done automatically. Having done this for years now, ToAruShiroiNeko obviously knows which ones to exclude, and as an OTRS member, a sysop on multiple wikis, and the operator of a bot that is locally flagged on over 175 wikis (as well as globally), he's obviously trusted by the vast majority of the community. Arguing about this is silly, IMO. Revent (talk) 00:54, 29 October 2016 (UTC)
    Arguing about this is as silly as mentioning that arguing about this is silly. Savh 01:06, 29 October 2016 (UTC)
@Savh: Probably. Do you really think he's going to set the world on fire by running redirect.py across protected pages? It might be pointless in particular cases, there might be a better solution in particular cases, but it's quite unlikely to break anything, and I hardly think he's going to use his bot account to do anything especially 'world destroying' in the few days of access requested. It's a well defined, and limited, task, and will be generally useful simply to clean up a decade of accumulated cruft. Revent (talk) 02:37, 29 October 2016 (UTC)
I don't think he's going to set the world on fire. I have mentioned that:
  1. this request is outside the tools' scope.
  2. it proposes to solve a nonexistent problem, since by far most double redirects are unprotected, and the few that are protected are usually not a problem - except for them sitting around in the DoubleRedirects list (which isn't a problem either)
  3. it creates a precedent for people to ask for global rights to solve some small irrelevant issues.
And I really would like to encourage you not to state that these are silly reasons without a proper reply. Savh 08:40, 29 October 2016 (UTC)
@Savh: Interface editors: "This permission is enabled on every public Wikimedia wiki that shares access via CentralAuth and SUL, and is only to be used for non-controversial maintenance [...]"
Did you even read the scope? This is non-controversial maintenance. Double redirects are marked as such.
-- とある白い猫 chi? 20:04, 31 October 2016 (UTC)
Selective reading does not help; "They maintain templates and the site's JavaScript (*.js) and Cascading Style Sheets (*.css) resources". Redirects are neither. Savh 21:33, 31 October 2016 (UTC)
That is easy then; we can simply add redirects to the exact wording. The idea here is to make an edit that removes useless .js and .css pages from the double-redirect log instead of permanently keeping them there, which is also unhelpful. -- とある白い猫 chi? 00:35, 1 November 2016 (UTC)

Global editinterface for Nirmos

Status:  Not done

As I've updated JavaScript across various projects, I've replaced wgFoo with mw.config.values.wgFoo. This is good in that the script no longer depends on the global variables that will be removed. However, now the mw.config.values.wgFoo form is deprecated too in favor of mw.config.get( 'wgFoo' ).

What I'm asking permission for, is to change from the values form to the get form.

The relevant Phabricator task is phab:T146432 and the relevant gerrit change is gerrit:312557.

Pinging Catrope and Krinkle. Nirmos (talk) 15:06, 28 October 2016 (UTC)
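The change being requested is mechanical in form. A minimal sketch of the substitution in Python (illustrative only; as the comments below note, each interface edit still needs individual review before saving, since it deploys immediately to end-users):

    import re

    # Rewrites the deprecated `mw.config.values.wgFoo` accessor to the
    # supported `mw.config.get( 'wgFoo' )` form in JavaScript source text.
    DEPRECATED = re.compile(r"mw\.config\.values\.(wg\w+)")

    def modernize(js_source):
        return DEPRECATED.sub(r"mw.config.get( '\1' )", js_source)

    print(modernize("var ns = mw.config.values.wgNamespaceNumber;"))
    # -> var ns = mw.config.get( 'wgNamespaceNumber' );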

Note previous requests here and here. --Rschen7754 18:28, 28 October 2016 (UTC)
  • Neutral Per previous request. I don't think editinterface is appropriate for one-time find-and-replace actions. These are trivially done with a semi-automated bot by existing editinterface users (tourbot, mwgrep, [2]). Even so, these are not trivial find-and-replace operations: each edit requires careful review and understanding of the code, as it is immediately deployed to end-users. I appreciate Nirmos' enthusiasm and willingness to help. I also note that his understanding of JavaScript and MediaWiki JS is getting better. I am, though, a little surprised Nirmos was able to work around the opposition of previous editinterface requests and successfully requested temporary sysop on many individual projects where he wanted to perform the changes. --Krinkle (talk) 23:54, 31 October 2016 (UTC)
  • Neutral Per Krinkle. Alvaro Molina (Let's Talk ) 00:30, 1 November 2016 (UTC)
  • Oppose per my comment/vote on the last two requests. --Steinsplitter (talk) 09:41, 1 November 2016 (UTC)
  • Oppose Similar to my comments on the request above: a user claiming that they are performing an "invaluable" task and that this justifies them having such a high-powered right, while demonstrating issues with communication that would cause problems on wikis with established communities when those wikis object to such a thing. --Rschen7754 01:32, 4 November 2016 (UTC)
  • Oppose per the previous two requests and the fact that I simply don't trust Nirmos with this bit. Their insistence on getting it makes me concerned that they will not abide by the wishes of local communities, instead just pushing through their changes without any discussion. --Majora (talk) 22:43, 5 November 2016 (UTC)
  • Not done - no consensus. – Ajraddatz (talk) 07:58, 13 November 2016 (UTC)

Blocked in English Wikipedia for Anass Badou

Status:  Not done

Blocked for no reason. --Anass Badou (talk) 13:28, 13 November 2016 (UTC)

  • "Blocked in English Wikipedia" is not a global permission. If you are blocked on enwiki, you can ask local administrators for an unblock. This page is for making requests for global permissions, not for unblocks. --Ks-M9 [disc.] 14:15, 13 November 2016 (UTC).
Not done per above. By the way, your account isn't blocked on the English Wikipedia; it's probably your IP address. Anyway, please contact English Wikipedia administrators about that. ~ Nahid Talk 17:23, 13 November 2016 (UTC)

add global OTRS member for Jan Kovář BK

Status:  Done

thanks, --Krd 07:43, 13 November 2016 (UTC)

Done. – Ajraddatz (talk) 07:58, 13 November 2016 (UTC)

remove global OTRS member for Teemeah

Status:  Done

thanks, --Krd 08:23, 16 November 2016 (UTC)

Done. – Ajraddatz (talk) 08:27, 16 November 2016 (UTC)

Oathauth test for Jbhunley

Status:  Done

In a discussion at en.wp's Administrators' noticeboard [3], it was mentioned that it might be possible for me to be included in the 2FA test group by making a request here. Thanks, --Jbh Talk 01:57, 17 November 2016 (UTC)

Done. --MF-W 02:22, 17 November 2016 (UTC)

remove global OTRS member for Melos

Status:  Done

Thx. --Krd 11:25, 17 November 2016 (UTC)

Done. —Marco Aurelio 11:34, 17 November 2016 (UTC)

Oathauth test for Luke081515

Status:  Not done
This request is to be declined; the user already has local adminship. --Vogone (talk) 21:31, 17 November 2016 (UTC)
Sorry, my fault, I forgot that. Luke081515 21:37, 17 November 2016 (UTC)

add global OTRS member for Mike1901

Status:  Done

--Krd 07:15, 18 November 2016 (UTC)

Done--Shanmugamp7 (talk) 07:20, 18 November 2016 (UTC)

add global OTRS member for Melos

Status:  Done

--Krd 07:37, 20 November 2016 (UTC)

Done, Linedwell (talk) 09:43, 20 November 2016 (UTC)

add global OTRS member for Mz7

Status:  Done

--Krd 06:54, 21 November 2016 (UTC)

Done--Shanmugamp7 (talk) 06:57, 21 November 2016 (UTC)

remove global OTRS member for Amada44

Status:  Done

--Krd 07:15, 21 November 2016 (UTC)

Done. – Ajraddatz (talk) 08:07, 21 November 2016 (UTC)

remove global OTRS member for Microchip08

Status:  Done

--Krd 07:15, 21 November 2016 (UTC)

Done. – Ajraddatz (talk) 08:07, 21 November 2016 (UTC)

add global OTRS member for Putnik

Status:  Done

--Krd 13:32, 22 November 2016 (UTC)

Done. --Stryn (talk) 14:46, 22 November 2016 (UTC)

add global OTRS member for 4nn1l2

Status:  Done

--Krd 13:33, 24 November 2016 (UTC)

The user has a long history of blocks and socks on fawiki; I have zero trust in him. Mardetanha talk 13:59, 24 November 2016 (UTC)
Agreed. Two unsuccessful RfAs, not to mention stuff like this, which makes me worried about giving out non-public information to this person (Americophile = 4nn1l2). Amir (talk) 14:39, 24 November 2016 (UTC)
Although this is the wrong page to discuss this and the application was visible for a week at OTRS/volunteering, could you please advise why the user is not blocked or where the connection between this account and the sock farm is visible? --Krd 14:45, 24 November 2016 (UTC)
As I said in my application, I had a clean start 5 years ago and I have retained a clean block log since then. I certainly regained the trust of the community when I was elected to the Persian Wikipedia SupCom (akin to EN WP ArbCom) with the highest vote count (the election was held on votewiki and the results were certified by two stewards). I do not understand why you cannot just drop the stick and move on. I do not understand why you try to make drama out of what happened 5 years ago. Wikimedia projects are not battlegrounds, and I am here to do good work. 4nn1l2 (talk) 15:00, 24 November 2016 (UTC)
The image is irrelevant. As a regular user of Commons and a lover of photojournalism, I risked my life to take some free pictures of a rare incident. That's it. 4nn1l2 (talk) 15:25, 24 November 2016 (UTC)

Done --MF-W 17:26, 24 November 2016 (UTC)

While we are talking: what sort of "The request will be approved if consensus to do so exists after a short period of consideration" was this? How could you determine such consensus? It was very rude and wrong behavior for a steward. Mardetanha talk 05:50, 25 November 2016 (UTC)
OTRS-member is a group without permissions which is assigned on request from OTRS admins to acknowledge publicly that a user has access to certain queues. Protesting against the user's OTRS access doesn't take it away. --MF-W 09:38, 25 November 2016 (UTC)
Mardetanha, I understand that you have objections to this user, but blocking a request like this from being processed comes dangerously close to breaking the steward policy. When executing a simple task like this, it isn't our role to block the result of something that we disagree with. There is no consensus required for this right to be assigned, and indeed it has nothing to do with their OTRS access. Therefore, this isn't the appropriate place to make a stand either - the proper venue would be contacting the OTRS admins. – Ajraddatz (talk) 09:47, 25 November 2016 (UTC)
While I agree with you, as you can see here, the OTRS admin was asking us a question, and granting access without giving us proper time to respond to his question is unacceptable. Mardetanha talk 10:26, 25 November 2016 (UTC)
You can still reply to him. --MF-W 10:50, 25 November 2016 (UTC)
For what, to mock ourselves over an already granted request? At least have the decency to accept that it was wrong of you to do this. Mardetanha talk 10:57, 25 November 2016 (UTC)
@Mardetanha: please review Requests for comment/Creation of a global OTRS-permissions user group and Special:GlobalGroupPermissions. It is concerning that you are accusing other stewards of wrongdoing when you are clearly unaware of how this user group works. --Rschen7754 22:24, 25 November 2016 (UTC)
I wonder why you don't get such a simple thing. As you can see: "The request will be approved if consensus to do so exists after a short period of consideration". We were asked a question and should have been given enough time to respond. Mardetanha talk 05:08, 26 November 2016 (UTC)
That does not, and has never applied to these requests... – Ajraddatz (talk) 05:19, 26 November 2016 (UTC)
Then at least remove that wording, if it is only there for fun. Mardetanha talk 05:34, 26 November 2016 (UTC)
I agree the situation is not perfect, and I think it should somehow (technically) be made possible for these requests to be handled directly by the OTRS admins. But this is not the only confusing thing about SRGP (e.g. the local "global-rename" permission is assigned here, while one would expect it to be on Meta:RFA, and removal requests for OTRS-member are placed here while all other global group membership removals are requested on SRP). We definitely failed to give this page a structure; also, removing the additional sections and making a single "other" out of them was, in my opinion, a mistake. But these kinds of discussions probably belong on the talk page, where we can summarise the inconsequential parts of this page which lead to confusion and situations like these, and think about solutions. Regards, --Vogone (talk) 07:53, 26 November 2016 (UTC)

add global OTRS member for Diego Queiroz

Status:  Done

--Krd 06:51, 27 November 2016 (UTC)

Done, Linedwell (talk) 08:25, 27 November 2016 (UTC)

add global OTRS member for 1989

Status:  Done

--Krd 11:01, 29 November 2016 (UTC)

done ~ Nahid Talk 12:17, 29 November 2016 (UTC)

remove global OTRS member for Laurentius

Status:  Done

--Krd 11:11, 29 November 2016 (UTC)

done ~ Nahid Talk 12:17, 29 November 2016 (UTC)

remove global OTRS member for Matanya

Status:  Done

--Krd 11:11, 29 November 2016 (UTC)

done ~ Nahid Talk 12:17, 29 November 2016 (UTC)

Add global OTRS member for タチコマ robot

Status:  Not done

I am handling some bulk cases where I want to process the pages using the bot. The bot will not be reviewing the actual tickets. :p -- とある白い猫 chi? 13:40, 29 November 2016 (UTC)

If you want to apply to become an OTRS member, you should follow the procedures at OTRS/Volunteering. Stewards will only accept requests here which come from OTRS administrators. Regards, TBhagat (talk) 13:51, 29 November 2016 (UTC)
@Tulsi Bhagat: I am an OTRS member already. This is my bot. -- とある白い猫 chi? 13:55, 29 November 2016 (UTC)
@Krd and Matthewrbowker: Do you support such a request, where a user wants OTRS member access for their bot? Trijnstel talk 14:05, 29 November 2016 (UTC)
Just to clarify, I have just closed a ticket with 40 files. I do not want to tag 40 files individually. You can observe the problem here. -- とある白い猫 chi? 14:22, 29 November 2016 (UTC)
Negative. Please use your main account for such edits, or if it is _really_ required, change the related abuse filter at Commons. --Krd 14:51, 29 November 2016 (UTC)
PS: Adding the OTRS permission template is likely a thing that should appear on watchlists, so one should do this without a bot flag in any case. --Krd 14:53, 29 November 2016 (UTC)
I agree with Krd on this one. @とある白い猫: If you need to make bulk changes to files, maybe use VisualFileChange? ~ Matthewrbowker Drop me a note 17:42, 29 November 2016 (UTC)
Not done per Krd. ~ Nahid Talk 14:54, 29 November 2016 (UTC)
