Wikipedia:Village pump (technical)

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Mr. Stradivarius (talk | contribs) at 00:06, 12 February 2014 (→‎Template:Edit protected script errors..: comment). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

The technical section of the village pump is used to discuss technical issues about Wikipedia. Bugs and feature requests should be made at Bugzilla (see how to report a bug). Bugs with security implications should be reported to security@wikimedia.org or filed under the "Security" product in Bugzilla.

Newcomers to the technical village pump are encouraged to read these guidelines prior to posting here. Questions about MediaWiki in general should be posted at the MediaWiki support desk.


How should external protocol-relative links be implemented?

At Wikipedia:Village pump (policy)/Archive 111#As WP uses HTTPS.2C should .28some.29 external links.2C_too.3F it was recently decided that Wikipedia should "use HTTPS links for HTTPS only sites, protocol relative links for sites that support both HTTP and HTTPS, and HTTP links for sites that don't support HTTPS at all". The closer of that discussion noted that "the discussion [didn't] concern the implementation of this proposal, and therefore a new one should be initiated regarding this". I am therefore opening such a discussion on how this can best be implemented.

Some editors in the previous discussion and elsewhere have suggested that the consensus can be implemented simply by removing the protocol from the URLs of sites which support both HTTP and HTTPS. For example, [https://www.youtube.com/ YouTube] could be changed to [//www.youtube.com/ YouTube]. However, I contend this may be a bad idea because it breaks such links when reading an offline (or otherwise non-HTTP[S]-hosted) version of Wikipedia. There are at least a couple of scenarios in which someone might be reading such a version:

  • Some Wikipedia Apps for mobile devices and offline readers for computers download a complete dump of Wikipedia to local storage and access it from there. Such scenarios are particularly advantageous for users subject to low bandwidth, government surveillance, or government censorship, which were three groups singled out in the previous discussion.
  • Some users who normally read Wikipedia online may use their web browser to manually save an article to local storage for future reference.

When viewing these offline copies the protocol used to view the article is likely file:, which means that the previous example would link to the non-existent file://www.youtube.com/.

In light of this I wonder if there is some technical workaround, be it via clever use of templates or changes to MediaWiki itself. —Psychonaut (talk) 20:49, 23 January 2014 (UTC)[reply]
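The failure mode described above follows directly from how relative references are resolved: a protocol-relative link inherits the scheme of the page it appears on. A minimal sketch using Python's standard library (the local file path is hypothetical):

```python
from urllib.parse import urljoin

pr_link = "//www.youtube.com/"

# Resolved against an online article, the link inherits the page's scheme:
print(urljoin("https://en.wikipedia.org/wiki/Example", pr_link))
# https://www.youtube.com/

# Resolved against a locally saved copy, it inherits file: and breaks:
print(urljoin("file:///home/user/Example.html", pr_link))
# file://www.youtube.com/
```

The second result is the non-existent address Psychonaut mentions: the host name is reinterpreted as part of a local file URL.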

IMO, it would be a bug in the application that stored the webpage to file:// if any protocol-relative links on the page were not changed to the protocol in use at the time the page was stored (unless the user selected an exact copy, or the linked resource has also been stored). The application is effectively re-authoring the pages and should be responsible for making such changes. My opinion on this does not change the fact that such issues may exist in some situations, may impact users, and are something to be considered in any implementation which we adopt.
Having just checked this, I find that Firefox 26.0 correctly translates PR links to the protocol in use at the time the page is saved when saving files for offline viewing.
I also just checked the Wikipedia App on Android, which properly translated a PR link to //www.youtube.com/ saved from a Wikipedia page to local storage into https://www.youtube.com/, the protocol in use at that time. I was able to open the link from a saved page without any problems. It automatically opened in a browser using the HTTPS protocol. So, the Wikipedia App is not an issue. Makyen (talk) 01:23, 24 January 2014 (UTC)[reply]
Does the WMF Wikipedia App really let you read offline? From the article it wasn't clear, though some of the non-WMF ones listed there certainly do. We should also consider popular offline readers for traditional computers, such as XOWA and Kiwix. —Psychonaut (talk) 10:03, 24 January 2014 (UTC)[reply]
If you have saved the page for offline viewing the Wikipedia App does enable viewing the page while completely disconnected from any network.
In addition, I checked IE. Like the others above, IE properly translated //www.youtube.com/ to https://www.youtube.com/ on a Wikipedia page when stored as a file for offline viewing.
For both Firefox and IE, I verified that the original page source served by Wikipedia contained the link as //www.youtube.com/ and that it was translated to https://www.youtube.com/ in the stored file. I did not do this verification for the Wikipedia App. Makyen (talk) 02:02, 24 January 2014 (UTC)[reply]
I tested Firefox and SeaMonkey myself. The links are correctly converted only if you use the "Web page, complete" save mode. Using "Web page, HTML only" they don't work. Some web clients, such as Links, don't handle protocol-relative URLs even when reading online—of course, that's a bug, though if it's one that affects several popular browsers, or even a small number used by a particular subset of our readers who have little or no choice (such as the blind and visually impaired), then that's something we need to consider. —Psychonaut (talk) 10:03, 24 January 2014 (UTC)[reply]
There are many other URLs in the HTML of a Wikipedia page that are always protocol-relative, whether we intend them to be or not. Some examples:
  • Images - the emitted HTML uses the <img /> tag, with attributes like src="//upload.wikimedia.org/wikipedia/commons/thumb/..." and srcset="//upload.wikimedia.org/wikipedia/commons/thumb/..."
  • Interlanguage links (whether in the sidebar or in the page's inline text) - the <a>...</a> element has an attribute like href="//fr.wikipedia.org/wiki/..."
  • Interwiki links (commons, meta, wikidata, wiktionary etc.) - similarly, the <a>...</a> element has an attribute like href="//commons.wikimedia.org/wiki/..."
So long as these work without a protocol, I don't think that we should worry about external links which also happen to be formatted using the protocol-relative syntax. --Redrose64 (talk) 13:11, 24 January 2014 (UTC)[reply]
Psychonaut: So, at least for Firefox, your issue is that protocol relative links don't work if the user chooses to translate the page first from being online to being offline, and then selects a non-default format which is guaranteed to break all links, other than those internal to the page, when viewed offline? Sorry, I don't really see that as an issue. As it currently stands, every single link to within Wikipedia is broken under those circumstances. I don't feel we should consider it a significant negative that the links are broken when someone chooses to store just the HTML page. Further, this discussion is supposed to be about how we implement the change, not whether we are to do so.
So far, the primary thing that I hear you saying is that you have concerns that PR links might be an issue under some unknown/unspecified circumstances (some browsers, etc.), or when the user has made choices which are guaranteed to break most/all links (at least those internal to Wikipedia). Your statements now imply that you do not have any actual cases where using PR links is a known problem (for situations where the content was actually intended to be viewed, which saving just the HTML page is not). I know that it is unreasonable to expect one person to test all cases for all browsers. Thus, I don't expect you to do so.
The reality is that for external sites which support both HTTP and HTTPS we can offer links that are A) HTTP, B) HTTPS, or C) protocol relative. The previous discussion ended with unanimous opposition to options A and B and unanimous support for option C, protocol relative links. The current discussion is about how we are to implement the change to protocol relative links for external sites which support both HTTP and HTTPS. We appear to be getting sidetracked on a subset of concerns about whether we should provide PR links under such circumstances, as opposed to how we do so. While considering the potential issues you have brought up about providing PR links, keep in mind that providing HTTP links or HTTPS links each has its own downside: the links are broken for some readers viewing the pages while online. Using PR links was, and is, the best of the three options (when both protocols are available from the external site).
So the question remains how are we going to implement providing such links. I will take a stab at the very rough basics:
For external sites providing both HTTP and HTTPS:
  • Change templates which provide links to such sites to providing PR links (e.g. {{YouTube}} ).
  • Change configuration files for AWB and other tools such that the change is made along with any other general fixes, typo fixes, etc.
  • Consider running a bot/bot task to make such changes more rapidly.
There are certainly enough pages where such links exist to make a bot an appropriate option. One question is: do we want to push such a change through wholesale with a bot, or let it be more of a gradual migration? Is now when we want to make such a change? How fast do we want to make the change? etc.
I am sure that there is more to it than just the above very rough list. Makyen (talk) 15:32, 24 January 2014 (UTC)[reply]
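The bot/AWB step in the list above could amount to a scheme-stripping rewrite limited to a whitelist of known dual-protocol sites. A minimal sketch (the function name and the two-domain whitelist are illustrative; a production bot would need far more care about context, e.g. archived URLs whose path embeds a scheme):

```python
import re

# Hypothetical whitelist of sites known to serve both HTTP and HTTPS.
PR_SAFE_DOMAINS = ("www.youtube.com", "web.archive.org")

def make_protocol_relative(wikitext: str) -> str:
    """Strip the scheme from external links to whitelisted domains,
    e.g. [https://www.youtube.com/ YouTube] -> [//www.youtube.com/ YouTube]."""
    pattern = re.compile(
        r"https?:(//(?:%s))" % "|".join(re.escape(d) for d in PR_SAFE_DOMAINS)
    )
    return pattern.sub(r"\1", wikitext)

print(make_protocol_relative("[https://www.youtube.com/ YouTube]"))
# [//www.youtube.com/ YouTube]
```

Links to domains outside the whitelist pass through unchanged, which matches the consensus: HTTPS-only and HTTP-only sites keep their explicit scheme.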
Before doing the above, I suggest that we create a page describing protocol relative links, so when someone asks why we removed the "http(s):" from a link, we could point them to protocol relative link or an appropriate page in the Wikipedia or Help namespace. Also, are there any other domains that could use PR links besides youtube.com and web.archive.org? Thanks! GoingBatty (talk) 15:25, 27 January 2014 (UTC)[reply]
OK, I'm glad to learn that the situation isn't as problematic as I suspected, and that we've already been using protocol-relative links for some time without any apparent objections. In the absence of any more substantiated suspicions then I'd support changing the necessary templates, and requesting a bot to convert existing and future hard-coded links. —Psychonaut (talk) 17:17, 29 January 2014 (UTC)[reply]
As to what links to change: I believe at least the following all can use PRLs:
  1. web.archive.org
  2. youtube.com
  3. myspace.com
  4. twitter.com
There are certainly others. I know that user:Bender235 has a list that has been used to implement a large number of PRL changes via AWB.
Any volunteers to take on creating Protocol relative link? Makyen (talk) 00:11, 4 February 2014 (UTC)[reply]
MySpace and Twitter use HTTPS by default. No need for protocol-relative links there. Wayback only keeps the existing HTTP links alive so that no link is broken. Anyone who enters their site now is redirected to HTTPS by default. Of those on your list, only YouTube has a true either/or strategy. --bender235 (talk) 00:35, 4 February 2014 (UTC)[reply]
When I enter "web.archive.org" into my browser, I'm not redirected to an HTTPS page. When I click on a link in Wikipedia that starts with "http://web.archive.org" (e.g. reference 5 in Linda McCartney), I'm not redirected to an HTTPS page. GoingBatty (talk) 02:18, 5 February 2014 (UTC)[reply]
OK, so our list is really only 2 sites? It would be nice to have a moderately comprehensive list prior to starting so we are not making a run for each site. Makyen (talk) 18:31, 9 February 2014 (UTC)[reply]
Hmm, you're right, Wayback doesn't redirect every site. Entering http://web.archive.org gives you the HTTP version. Only the "main entrance" http://archive.org is redirected. That's what they announced in October 2013. But the way I read their announcement, they encourage people to use HTTPS and HTTPS only.
@Makyen: it's not just two sites. Basically you can add all pages from the EFF's HTTPS Everywhere Atlas. But the main ones, from Wikipedia's perspective, are Wayback and YouTube, because those are (as far as I can tell) by far the most linked domains from Wikipedia (maybe apart from http://dx.doi.org). --bender235 (talk) 19:55, 11 February 2014 (UTC)[reply]

Soft redirects

There's been a longstanding system bug by which some soft redirects to other wikis (e.g. Wikisource or Wiktionary) get erroneously picked up as "uncategorized articles" by the various uncategorized page detection tools. Most commonly this occurs after someone has converted a Wiktionary redirect into a dicdef article and then somebody else has reverted it back to a soft redirect — however, with the recent creation of {{Wikisource redirect}} a few days ago, it's now beginning to also happen to many pages on which the new template has been added as a replacement for {{softredirect|wikisource}}.

The problem is that once this erroneous automated pickup has happened, the only way the page can ever be removed from the uncats list again is to be permanently added to the internal Category:Temporary maintenance holdings category; just in the few days since the new template's creation I've had to so "categorize" Allende's last radio message, As some day it may happen, Chinese proverb, Chinese proverbs, Declaration of the Breakdown of Chile’s Democracy, Executive Order 13026, Executive Order 13072, French proverbs, German proverbs, Hungarian proverbs, Icelandic proverbs, Indonesian proverbs, Korean Air Lines Flight 007 transcripts, List of misquotations, List of Polish proverbs, Phil the Fluther's Ball, Portuguese proverbs, The Interest of America in Sea Power, Present and Future and The River Merchant's Wife: A Letter. In other words, I've had to add twice as many articles to that bugtrap "category" in the past week alone as have ever had to be added to it in the entire previous three years combined, and the problem's only going to get worse as the usage of that new replacement wikisource template expands further.

I've asked before if there was anything that could be done to fix this, but it keeps happening nonetheless. Can anybody assist in figuring out how to ensure that soft redirects stop getting improperly detected as uncategorized pages? Bearcat (talk) 01:37, 31 January 2014 (UTC)[reply]

This link is the primary tool I've been looking at. So far, both Allende and "As some day it may happen" have already immediately reappeared on the list; the "proverbs" pages that you pulled out as part of the test haven't, but those were tagged a few days ago and thus might potentially reappear tomorrow when the new daily batch rolls over, and the "deaths" ones were never part of this issue at all (the 2012 one was a similar but not directly related issue over a year ago; the 2013 one appears, from what I can tell, to have been added only because the 2012 one was in there and so whoever converted the 2013 list to a redirect erroneously assumed that was standard process in all cases.)
And just for the record, that tool doesn't pick up most soft redirects as a rule; it successfully avoids the vast majority of them overall, and only specific quirks like the situations I've talked about here (dicdef reversions, conversion of wikisource redirects to this new template) seem to trip it up, which means that there's something about the pages' status in our database that isn't being correctly reported, rather than something in the toolserver coding. Bearcat (talk) 02:21, 31 January 2014 (UTC)[reply]
  • (edit conflict × 2) I think I know what is going on, but it will take a little more time for stuff to process through to be sure. Have you tried contacting that tool's operator, JaGa to find out if it is something in the tool itself?
What I'm guessing is that the tool is only doing one check on the page props that looks like:
API query response
<?xml version="1.0"?>
<api>
  <query>
    <pages>
      <page pageid="983737" ns="0" title="Allende&#039;s last radio message" />
      </pages>
    </query>
  </api>
The tools should be doing a second query that looks like:
API query response
<?xml version="1.0"?>
<api>
  <query>
    <pages>
      <page pageid="983737" ns="0" title="Allende&#039;s last radio message">
        <categories>
          <cl ns="14" title="Category:Redirects to Wikisource" />
        </categories>
      </page>
    </pages>
  </query>
</api>
The second query would pick up the fact that it is in a hidden category which it is currently missing. If the tool maintainer doesn't respond directly, I'll see if I can find the tool's code and find a way to submit a pull request to fix it. Technical 13 (talk) 02:51, 31 January 2014 (UTC)[reply]
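The distinction between the two responses quoted above can be sketched offline by parsing them: a page counts as categorized as soon as the response lists any category for it, hidden or not. A minimal illustration (the real tool would fetch this from the MediaWiki API with prop=categories; the function name here is hypothetical):

```python
import xml.etree.ElementTree as ET

# The second API response quoted above.
response = """<?xml version="1.0"?>
<api>
  <query>
    <pages>
      <page pageid="983737" ns="0" title="Allende's last radio message">
        <categories>
          <cl ns="14" title="Category:Redirects to Wikisource" />
        </categories>
      </page>
    </pages>
  </query>
</api>"""

def is_categorized(xml_text: str) -> bool:
    """A page counts as categorized if the response lists at least one
    <cl> (category link) element for it."""
    root = ET.fromstring(xml_text)
    return root.find(".//categories/cl") is not None

print(is_categorized(response))  # True: the soft redirect is not uncategorized
```

The first response, which lacks the categories element entirely, would return False here, which is presumably how the soft redirects end up on the uncategorized list.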
Okay, I'll ask him about that. But I suspect that the tool is already coded for that, because as I noted above it successfully avoided soft redirects to Wikisource using the old template, which wouldn't have been the case if the toolserver wasn't already coded that way. Bearcat (talk) 02:57, 31 January 2014 (UTC)[reply]
So, I guess we'll just have to wait for the tool operator to respond.  :) Technical 13 (talk) 03:02, 31 January 2014 (UTC)[reply]
Well, just for the record I think the missing hidden category that you identified and added was the problem. I thought not at first, because the pages didn't drop from the list right away after you added it, but then I remembered that there's been a problem lately with pages lagging in picking up changes to their transcluded templates — so I null-edited both of the pages again, and that succeeded in dropping them. So it's still worth seeing if JaGa has anything helpful to contribute, but I think you've already fixed the problem. So thanks for that :-) Bearcat (talk) 03:14, 31 January 2014 (UTC)[reply]
Never mind. They both dropped on first refresh but then returned again on a second one, so the problem's still active and we'll definitely need JaGa's input. Bearcat (talk) 03:16, 31 January 2014 (UTC)[reply]
I'm totally not familiar with this, but just out of curiosity: what would happen if instead of manually adding Category:Temporary maintenance holdings at the bottom [1] you added Category:Redirects to Wikisource -- the text that is supposed to be being added by the template? In other words, is it the transclusion that doesn't work, or the category it's put in? Wnt (talk) 03:40, 7 February 2014 (UTC)[reply]

Customizing user experience based on logged-in status

Hi. I'd like to do the following.

Add this block of code to MediaWiki:Common.css:

.display-for-user {
	display: none;
}

Add this block of code to MediaWiki:Group-user.css

.display-for-user {
	display: inline;
}

Then create a wrapper template similar to testwiki:Template:Hide from anons. This will have the effect of allowing us to show content only to logged-in users. Thoughts? --MZMcBride (talk) 07:02, 31 January 2014 (UTC)[reply]
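A wrapper template along these lines could be as small as a single span emitting the proposed class (a sketch; the template name and parameter are hypothetical, the class name is from the CSS above):

```wikitext
<span class="display-for-user">{{{1|}}}</span>
```

A page would then transclude it as, say, {{Display for user|Some logged-in-only note}}, and the Common.css / Group-user.css pair above toggles visibility by group.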

@MZMcBride: What is the use case for this? Wikipedia is not really set up to have truly private information that isn't visible to readers. Yes, we can noindex things and remove them from default search settings, but the wiki is public to all readers for good reason. I'm not sure we want people creating content that should be absolutely hidden from unregistered users... Steven Walling (WMF) • talk 17:40, 31 January 2014 (UTC)[reply]
S: Yes, I'm deeply familiar with how both Wikipedia and CSS operate. This isn't intended to be used with trade secrets or personally identifiable information. :-)
One suggested use-case was hiding the scary-looking links at the bottom of MediaWiki:Anontalkpagetext as they're off-putting and probably only intended for logged-in users.
I think having a generalized, trackable way of hiding information based on logged-in status would be useful. You make a good point that we would need to clearly document that the information is only hidden and not deleted, however. --MZMcBride (talk) 18:57, 31 January 2014 (UTC)[reply]
If that content is in a MediaWiki message, why don't we hide it by stripping out the IP inspection tools into a separate message and hiding that conditionally within MediaWiki proper? Similar tools are available on some wikis (en, es, fr) but not others. It might be appropriate to provide some default IP inspection tools, which people could customize. If the use case was merely in templates which are entirely specific to English Wikipedia, I'd support adding these classes. But hiding content from logged out users is the kind of thing that we usually do within core or an extension. Steven Walling (WMF) • talk 20:35, 31 January 2014 (UTC)[reply]
S: I don't think modifying MediaWiki core is needed here, but feel free to file tickets in Bugzilla as you see fit. As I understand it, MediaWiki hasn't had the ability to customize the user experience based purely on logged-in status (using CSS) until very recently. (Previously JavaScript could be used.) This is partially why traditionally MediaWiki extensions or modifications to MediaWiki core have been used instead. By using CSS, the content is still transferred to the client, but parser cache fragmentation is reduced and there's no JavaScript dependency. This seems like a reasonable trade-off to me. --MZMcBride (talk) 21:40, 31 January 2014 (UTC)[reply]
"By using CSS, the content is still transferred to the client, but parser cache fragmentation is reduced and there's no JavaScript dependency. This seems like a reasonable trade-off to me." Agreed, that's entirely reasonable. I guess I'm just a little worried about people abusing this whenever they want to hide content. Steven Walling (WMF) • talk 22:08, 31 January 2014 (UTC)[reply]
You're thinking about a censorship problem, like someone using it to hide "bad words" in articles, without any indication of their removal or anyway for the reader to override it? I don't mind people being able to hide disturbing material from themselves, but I do object to it being done to them with no notice and no way to override it.
In practice, though, I think we could easily have a policy against that, although a small number of abuses in low-traffic articles might get overlooked. It might also be possible to design it so that it did not work in the mainspace. WhatamIdoing (talk) 00:47, 1 February 2014 (UTC)[reply]
WhatamIdoing, Steven: The use of a wrapper template allows us to track (mis|ab)use. We can also add in namespace restrictions at the CSS level or the template level, as necessary and appropriate. I'd personally prefer to take a "wait and see" approach.
Broadly, the idea here is less prone to abuse than its opposite, I think. That is, if content could be hidden from only logged-in users, it might allow vandalism and other bad content to be exposed to anonymous users for a longer period of time. However, in the proposed scenario, the abuse vector seems much smaller. --MZMcBride (talk) 23:36, 3 February 2014 (UTC)[reply]
But if it exists, there's no way to prevent someone from calling it directly or substing the template, and thus not being able to track it (in our ample free time), right?
I don't actually feel that strongly about it. If it became a nightmare, it could (in theory) be removed. But I am a little nervous about it. WhatamIdoing (talk) 00:05, 4 February 2014 (UTC)[reply]
I don't like this idea at all. To hide any text from anons seems like a bad precedent, and to hide that text seems out and out dishonest. There's something incredibly creepy about the notion of not merely tracking the anons by IP address and using it to look up all this stuff about them, but not letting them know you're doing it. Secrecy, in-groups ... it's the beginning of a long, long road that ends somewhere around NSA headquarters. Let's not take that trip. Wnt (talk) 02:01, 9 February 2014 (UTC)[reply]
I feel similarly to User:Wnt. I wouldn't really mind hiding admin-only links, like delete, from us normal users, but I can't really think of anything to hide from IPs, unless, perhaps, there are some links causing them confusion... —PC-XT+ 04:15, 9 February 2014 (UTC)[reply]
PC-XT and Wnt: We already differentiate between logged-in and non-logged in users in various parts of the user interface (e.g. "view source" instead of "edit" or CentralNotice banners that only display to logged-in users). These CSS tweaks would just allow us to make a distinction at a more fine-grained level. Whether MediaWiki:Anontalkpagetext should be friendlier is an orthogonal point, I think. I think there are additional use-cases for implementing this capability. --MZMcBride (talk) 04:20, 9 February 2014 (UTC)[reply]
Showing "view source" instead of "edit" provides information specific to the class of user, not just IPs. (Most of us see "view source" on pages protected at a certain level or higher.) I don't mind that kind of differentiation. I also don't mind the CentralNotice banners, as far as I am aware. I'm not sure if the finer grain is needed, but if so, I would prefer that it differentiate between all classes, if that is possible, rather than only those logged in/out. Some text is only really useful to admins, like I said above, and other users don't really need to see it. Showing some of these things openly does seem to be a part of the Wikipedia scene, but a few areas feel a bit crowded with stuff I don't need, even though I am logged in. I wouldn't object, but possibly support, if the stuff to be hidden is useless to IPs and does not have anything to do with them. This is just my opinion. I'd like to hear what some IPs think... —PC-XT+ 04:49, 9 February 2014 (UTC)[reply]
Ideally articles are not generally protected, and "view source" is meant to be as close to "edit" as it can be without confusing the person into thinking he can edit. Admittedly there are other options which admins see differently from other users, but that really isn't desirable either. I mean sure, I suppose that showing every editor grayed-out block and delete buttons would use too much bandwidth resource to be financially justifiable, but given our choice we ought. The ideal should be that everything is out in the open. Wnt (talk) 06:28, 9 February 2014 (UTC)[reply]
Note: We already have admin-only text, using the sysop-show class. For example: You (the person reading this) are an administrator, though others reading this are not an administrator. View the source to see what I did there. – PartTimeGnome (talk | contribs) 21:46, 9 February 2014 (UTC)[reply]
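The pattern PartTimeGnome alludes to looks roughly like this in wikitext (a sketch; the sysop-show class is the one named in the comment, the sentence is illustrative, and the class only takes effect where per-group CSS reveals it):

```wikitext
<span class="sysop-show">This sentence is visible only to administrators.</span>
```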
Here are some other examples where we could differentiate between logged-in and logged-out users:
  • Anywhere we invite a user to create a page, this does not make sense for IP addresses because they cannot do so. We should instead invite them to request the page's creation (e.g. via AfC). MediaWiki already does this on the search results page (MediaWiki:searchmenu-new vs. MediaWiki:searchmenu-new-nocreate).
  • Any help pages that instruct someone to change their preferences or edit a page in their user space could include an extra instruction to register or log in first for IP address users.
However, both of the above require that we not only be able to hide information from IP address users, but have alternative messages for them that don't show to registered users. Unless some mechanism existed to tightly control who could write such hidden messages, this would present a high vandalism risk since vandalism could be hidden from registered editors. – PartTimeGnome (talk | contribs) 21:46, 9 February 2014 (UTC)[reply]
I agree that, ideally, there should be a way to see all of the text. Due to IPs not really having preferences for such things, it could be a touchy issue. (There could be a button in some corner of every page to display all hidden text, for that matter. [Some way of selecting the page view at a certain level may be better.] If I wanted to do that, I could make a userscript, but that wouldn't help IPs. It should be available by default, and preferences could turn it off.) User:PartTimeGnome gave the information I wanted from a balanced view. If this included a way for anyone to see all of the forms the page could take by tweaking some control on the page, (which in turn could be removed by a gadget or something, if desired,) I would support this. —PC-XT+ 02:10, 10 February 2014 (UTC)[reply]

Differences between Wiki ViewStats and Stats.grok.se

From what I can tell, German user Hedonil is the administrator of the new Wiki ViewStats pageview tool, which started in 2013. It seems to be more robust than Stats.grok.se, which started in 2007, in the sense of having generally higher pageview counts. However, I have noticed that on pages with the apostrophe character (') like Victoria's Secret or Sinéad O'Connor its totals are lower. Has anyone else noticed other problematic characters? Even the diacritic alone (as in just the word Sinéad) is not a problem for the tool, so I am not sure what the tool's problems are. Hedonil is ignoring me for some reason, so I am unable to communicate this problem. If anyone is able to make contact with that German user, please call his/her attention to this problem (which may be one of many).--TonyTheTiger (T / C / WP:FOUR / WP:CHICAGO / WP:WAWARD) 09:07, 31 January 2014 (UTC)[reply]

I should note that Stats.grok.se continues to have problems with the question mark (?) in articles like Who Dat?.--TonyTheTiger (T / C / WP:FOUR / WP:CHICAGO / WP:WAWARD) 15:00, 31 January 2014 (UTC)[reply]
With the changeover to a new month, I just noticed that Wiki ViewStats uses a rolling 90-day database. It does not retain older months as they age.--TonyTheTiger (T / C / WP:FOUR / WP:CHICAGO / WP:WAWARD) 07:43, 2 February 2014 (UTC)[reply]
  • I finally have gotten a reply from Hedonil. ViewStats does not currently aggregate variants. See this example of variants: stats.grok.se aggregates variants, while ViewStats just presents the primary result. As you can see in the case of apostrophes, the difference is rather large.--TonyTheTiger (T / C / WP:FOUR / WP:CHICAGO / WP:WAWARD) 05:31, 5 February 2014 (UTC)[reply]
Wiki ViewStats is pulling up a blank for me today. However, at some point in the future is this going to replace Henrik's tool site-wide? — Maile (talk) 23:16, 9 February 2014 (UTC)[reply]
I am guessing that they were testing a lot of things and are now redoing the data. Now Jan 1 to date is back up. However, I have just reported a problem with disambiguation pages like Frank Underwood (House of Cards) and House of Cards (season 2), which are reporting numbers lower than stats.grok.se. All pages currently appear to have lower totals than stats.grok.se.--TonyTheTiger (T / C / WP:FOUR / WP:CHICAGO / WP:WAWARD) 08:43, 11 February 2014 (UTC)[reply]
Pages now higher.--TonyTheTiger (T / C / WP:FOUR / WP:CHICAGO / WP:WAWARD) 22:46, 11 February 2014 (UTC)[reply]

Wasted space on right-hand side constraining width

What is the user preference that causes the big blank space down the right-hand side of this screenshot? This is in relation to Template talk:Multiple issues#Default to collapsed? where it causes lateral compression of a banner, and hence the banner expands vertically. I have never had that blank space, so it must be something that GliderMaven (the user concerned) has set for themselves - I thought that it might be one of the new "beta" things, but it's not listed there, and I can't find where else it could be set. --Redrose64 (talk) 09:54, 31 January 2014 (UTC)[reply]

Yes, it's caused by the mw:Typography refresh beta feature.--Salix alba (talk): 10:25, 31 January 2014 (UTC)[reply]
Any news on this? I get squeeze effects in places like Template:Periodic table, where content page width matters. If the page margins change, is there some place to get information? The mw:Typography refresh page mentioned by Salix alba did not explain this (unsurprisingly, given that topic). -DePiep (talk) 13:29, 1 February 2014 (UTC)[reply]
All I could see there was that the image captioned 2nd iteration - Since January 6th also shows the big blank space. No explanation of why it's there. --Redrose64 (talk) 14:56, 1 February 2014 (UTC)[reply]
Indeed, it appears the documentation does not mention this. The best I can refer you to is a previous announcement on this page and bugzilla:59815. – PartTimeGnome (talk | contribs) 16:49, 1 February 2014 (UTC)[reply]
PartTimeGnome gave the right links. That page also provides a link to mw:Talk:Typography_refresh#max-width:_715px for comments. (Content page width was set to a maximum of 715px.) Indeed part of the mw:Typography refresh Beta that Salix mentioned. I am very unhappy. -DePiep (talk) 18:23, 1 February 2014 (UTC)[reply]
There is a fix but it requires adding something to your vector.css (like I have here User:Spudgfsh/vector.css). That will override the maximum width => Spudgfsh (Text Me!) 18:35, 1 February 2014 (UTC)[reply]
... or switch off that beta option. -DePiep (talk) 07:40, 2 February 2014 (UTC)[reply]
  • Just a note, the tracked template reports it as "resolved", for clarification that is "RESOLVED WONTFIX". I'm curious why we can't apply a global fix to some MediaWiki:??.css or MediaWiki:??.js page... Technical 13 (talk) 18:56, 1 February 2014 (UTC)[reply]
It appears to have been a deliberate policy decision in the refresh. I thought it would be better if it could be made optional somehow (even if what they'd decided was the default option).=> Spudgfsh (Text Me!) 19:05, 1 February 2014 (UTC)[reply]
See the last post in the bug: "Won'tfix" because it is considered a "feature not a bug" (in a Beta test environment). They let the idea linger on in the Beta. See discussion at mw:Talk:Typography refresh. -DePiep (talk) 07:37, 2 February 2014 (UTC)[reply]
Something that squeezes Wikipedia:Featured_pictures/Places/Panorama and two-column WP:DIFFs should probably be called a misfeature. WhatamIdoing (talk) 23:52, 5 February 2014 (UTC)[reply]

Extension:DynamicPageList appears to have been rejected before, as it doesn't appear to scale. Would we like to use Extension:DynamicPageListEngine instead? It could be infinitely useful for managing WikiProjects work, such as to show fresh category members. --Gryllida (talk) 07:33, 2 February 2014 (UTC)[reply]

Related discussion can likely be found in existing bug reports. --AKlapper (WMF) (talk) 15:40, 3 February 2014 (UTC)[reply]
In short, no. Extension:DynamicPageListEngine is much better written than the real DPL extension; however, it still doesn't solve the underlying problem that doing category intersections via self-joins on the categorylinks table is inefficient. In order to make something that would scale to Wikipedia levels, it would have to use a different backend (such as Solr/Lucene search). Bawolff (talk) 01:08, 7 February 2014 (UTC)[reply]
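As an aside on the point above: for small cases, one workaround that avoids the categorylinks self-join entirely is to fetch each category's members via the API and intersect them client-side. The sketch below is purely illustrative (the helper name and the client-side approach are mine, not part of either extension), and the API fetching itself is not shown:

```javascript
// Intersect two lists of category members client-side, avoiding the
// categorylinks self-join. Inputs are arrays of page titles, as an
// API categorymembers query would return them.
function intersectMembers(catA, catB) {
    var seen = Object.create(null);
    catA.forEach(function (title) { seen[title] = true; });
    return catB.filter(function (title) { return seen[title] === true; });
}
```

This trades database load for network transfer, so it only makes sense for categories of modest size.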

Highlighting within an article from a list of regex expressions

There's some interest in a script that will draw a little red box (as User:Ucucha/duplinks.js does) around words and phrases of interest that come from a regex list. I know the regex bit, but I don't know how to adapt Ucucha's script. Any help? - Dank (push to talk) 18:47, 2 February 2014 (UTC)[reply]

  • Dank, I'd love to try and help you, but I'm afraid I'm going to need some more information.
  1. Do you want this script to run automagically or only if you click a link?
    Clicking a link in the left-hand column, as duplinks does. - Dank (push to talk)
  2. Which namespaces do you want it to run in? Surely there is no need for it in Talkspaces?
    Articlespace and userspace (for testing)
  3. Where is your ReGex list? If it's not already on a page, could you put it on User:Dank/Highlighter/list for me please?
    Done. Comma-space-delimited, because I want to optimize it for readability by, you know, humans, but I can massage it if you like.
I think knowing these things, we can start to write you a script to highlight stuffs.  :) — {{U|Technical 13}} (tec) 19:52, 2 February 2014 (UTC)[reply]
All done, thanks much. - Dank (push to talk) 20:43, 2 February 2014 (UTC)[reply]
  • Great! I'll work on this over the next couple days and leave a message on your talk page when it is done. (Library is closing for the day or I'd work on it tonight, should be able to do in less than a day...) :) — {{U|Technical 13}} (tec) 21:57, 2 February 2014 (UTC)[reply]
Note: it's regex, or regexp. Not "ReGex". — Scott talk 22:39, 2 February 2014 (UTC)[reply]
OK, I just spent half an hour putting together Module:Highlight. See Module talk:Highlight/highlight for a usage example. You can omit the page name if you start (or just preview) a page with the name of the page you want to mark up + "/highlight". No Javascript needed :) Is this good for you? Wnt (talk) 02:37, 9 February 2014 (UTC)[reply]
Just saw this, thanks so much, I'll have a look in the morning. I'm looking for one more thing ... if you're interested, see User_talk:Technical_13#One small tweak. The regex list might be long, on the order of the typo list at WP:RETF. - Dank (push to talk) 05:25, 9 February 2014 (UTC)[reply]
well, I'm thinking I could set up a parameter to let you choose a file of regexes to use, rather than submitting a regex directly, and those files could be individualized. I suppose adding an option of doing a find and replace with that file, then highlighting the replacements rather than the finds, is also feasible. And, of course, the module could output wiki source rather than processed text, so you could copy and paste it. But not tonight... Wnt (talk) 07:00, 9 February 2014 (UTC)[reply]
I should, however, note that Lua has some limitations that may make it unsuitable to replace the Javascript functions like AWB for comprehensive lists. First and foremost, it has limited execution time, and we'll have to see how many regexes * text it can do at once. Then there's the problem that the Javascript regex list from WP:RETF isn't going to work directly in Lua - there are deficiencies in alternation (|) and counts ({1, 3}). It would be possible to build a tool to autotranslate nearly all of the list, I think, but I'm already amid one autotranslate idea that I keep dragging out... And last but not least, there's the annoying inability of Wikitext (which Lua/Scribunto outputs) to access the mouseover property via inline CSS. There ought to be some way to set up custom .css/.js that gives Wikitext some access to mouseover capability without burdening normal page display, but it'll take some thought, and unless merged into the common. files, it wouldn't be usable by everyone. Wnt (talk) 15:01, 9 February 2014 (UTC)[reply]
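To illustrate the translation problem described above: simple character classes map straight across from JavaScript regexes to Lua patterns, while alternation (|) and bounded counts ({m,n}) have no Lua-pattern equivalent and would need to be expanded into multiple patterns. The function below is a hypothetical sketch of such an autotranslator, not anyone's actual tool:

```javascript
// Translate a small subset of JavaScript regex syntax into a Lua pattern,
// returning null for constructs Lua patterns cannot express directly.
function toLuaPattern(jsRegex) {
    // Alternation (|) and bounded counts ({m,n}) have no Lua equivalent;
    // a real translator would expand these into several patterns.
    if (/\|/.test(jsRegex) || /\{\d+,\d*\}/.test(jsRegex)) {
        return null;
    }
    return jsRegex
        .replace(/\\d/g, '%d')   // digits
        .replace(/\\w/g, '%w')   // word characters
        .replace(/\\s/g, '%s');  // whitespace
}
```

Quantifiers like ?, + and * pass through unchanged, since Lua patterns support those directly.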
i cannot see how a lua module makes sense here. i think the idea is to allow individual readers to highlight what they choose, no? as far as i understand, lua module is something the *editor* can use to highlight what she wants, but she does not really need a module for that, does she? she can control the appearance of the page directly.
regarding the JS thing: i hope you guys are aware of the fact that mw contains the "textHighlight" plugin, which makes it pretty easy (though i'm not sure it supports full regex - i did not fiddle with it much).
e.g.: this section contains the words "interest", "automagically" and "lua". to highlight those words on the page, hit F12, and type into the console:
mw.util.$content.highlightText('interest automagically lua');
and see the magic.
if nothing happens, you may need to run this line first:
mw.loader.load('jquery.highlightText');
but i do not think it's really needed - it works for me without loading the plugin explicitly. maybe one of the gadgets i'm using loads this plugin, but i tend to think that it loads for everyone on enwiki. you can control the actual appearance of the highlighted text. the default is plenty good enough for me. peace - קיפודנחש (aka kipod) (talk) 16:42, 9 February 2014 (UTC)[reply]
  • I am using Firefox 27, kipod, and even
mw.loader.load('jquery.highlightText');
mw.util.$content.highlightText('interest automagically lua');
does nothing for me (I go to the console directly with CTRL+⇧ Shift+K). I just need to have a half an hour to toy with it, and I should be able to have it quickly highlight any match (I'll need to reformat the list page a little to make it work with the extras added like popups). It won't be right now as I am too busy, but soon. — {{U|Technical 13}} (tec) 16:55, 9 February 2014 (UTC)[reply]
Since that functionality is already in MediaWiki, it's possible it will run faster and less buggily than hand-coded stuff, so I'm very interested. When I paste mw.loader.load('jquery.highlightText'); into the console, it returns "undefined". mw.util.$content.highlightText('interest automagically lua'); does make those 3 words bold ... that is visible enough for me, though others may want something more visible. You're right that the general goal is to allow users to highlight any words they want. Specifically, I want to make it easier for users to catch common mistakes. They probably won't need to know that "tyop" is a typo, but if I highlight "except" because it appears they actually meant "accept", and they don't get the difference, I want to make it easier for them to find out why the word "except" was highlighted ... a link to a section of a page that explains the difference would be nice, or maybe a script could generate a page with explanations for all of the words that were highlighted on their target page. Any idea how to do that? (And btw, does Kipod mean "hedgehog"?) - Dank (push to talk) 17:08, 9 February 2014 (UTC)[reply]
Dank: if you inspect the highlighted element, you'll notice that calling "highlightText" bracketed it in a span with class "highlight". you can then access those elements through $('.highlight'), to add more stuff (e.g. a tooltip. either by directly setting the "title" attribute or through the "tipsy" plugin. something like:
mw.loader.using(['jquery.highlightText', 'jquery.tipsy'], function() {
    mw.util.$content.highlightText('interest automagically lua');
    $('.highlight').tipsy({
        title: function() { 
            return 'the word ' + $(this).text() + ' is highlighted';
        }
    });
});
as to your "btw" question, the answer is yes. peace - קיפודנחש (aka kipod) (talk) 17:57, 9 February 2014 (UTC)[reply]
That does just what I want, except it has a bug: after I loaded it into my monobook.js, it deleted the word "interest" everywhere it appeared every time I saved an edit! Btw ... how do I get it to search for a two-word phrase? - Dank (push to talk) 20:28, 9 February 2014 (UTC)[reply]
@Dank: you probably want to gate the whole thing to reading a page, e.g., by preceding the "using" statement with something like
if (mw.config.get( 'wgAction' ) === 'view')
or somesuch, so it won't interfere with editing... peace - קיפודנחש (aka kipod) (talk) 21:56, 9 February 2014 (UTC)[reply]
@קיפודנחש: That worked. But I've tried everything I can think of, and can't get it to match any two-word phrase. (\ , \\ , \s, "two words", etc. aren't working for the space) - Dank (push to talk) 22:49, 9 February 2014 (UTC)[reply]
@Dank: i do not think patterns with spaces are possible with the highlightText plugin. as far as i could see, this plugin is home-grown (i.e., developed as part of mediawiki, rather than brought in from some known plugin), and the changes required to make it understand patterns with spaces and regexes are minimal, so you can ask for this on bugzilla if you have a good use-case. peace - קיפודנחש (aka kipod) (talk) 05:20, 10 February 2014 (UTC)[reply]
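For anyone following along, the limitation described above can be modeled in plain JavaScript: the highlight input is treated as a whitespace-separated word list, so a quoted phrase just becomes two terms with stray quote marks attached. This stand-in function only sketches the tokenizing step, under the assumption that it mirrors jquery.highlightText's behavior; it is not the plugin's actual code:

```javascript
// Model of how the highlight input is tokenized: split on whitespace,
// so multi-word phrases cannot survive as a single search term.
function splitHighlightTerms(input) {
    return input.split(/\s+/).filter(function (word) {
        return word.length > 0;
    });
}
```

For example, passing "two words" (with quotes) yields the terms '"two' and 'words"', which is why none of the quoting attempts matched the phrase.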
@קיפודנחש: That module is currently used by the search suggestions code, to do the bolding in suggestions lists, so it should be loaded for everyone – but, as always, this might change in the future – there are some plans to either load the module only after the user starts typing instead of on page load to increase page load performance, or even axe this feature client-side entirely and do the highlighting server-side, as the new CirrusSearch is smarter and can do this better. If you need some module, always do mw.loader.using(['jquery.highlightText'], function(){ /* code to run after the module is loaded */ }). Matma Rex talk 17:30, 9 February 2014 (UTC)[reply]
To be clear, I had (and still have) little idea of what this regex highlighter would be used for; when I made the module I was picturing ongoing collaborative tasks, such as highlighting sections at the Science Refdesk that only contain one signature, i.e. no answers. (though it's not really pretty for that purpose as of yet, not sure how to make it neatly circle a paragraph) Wnt (talk) 20:35, 9 February 2014 (UTC)[reply]

New extension: Flow

The new Flow extension is being deployed to enwiki today. It is being deployed to only two wikiproject pages as a test run to get real users trying out the new interface constructs so they can be tweaked or completely changed based on real world usage until we arrive at a discussion system that can serve the needs of wikipedians.

Because Flow is in such an early stage, with many things uncertain, the API modules it enables are a shim exposing the internals, sufficient only for the existing ajax calls. These will change without notice, and I encourage you not to build integrations with these APIs yet.

We have a regular integrated MediaWiki API in the works which bots and scripts will be able to integrate with; we expect to have this merged and deployed well before expanding from our initial test runs in the wikiproject space. Flow integrates with a number of MediaWiki constructs such as recent changes, watchlists, contributions, etc. Feel free to file bugs for anything those integrations might break that previously worked.

EBernhardson (WMF) (talk) 17:44, 3 February 2014 (UTC)[reply]

See tests: Use "WT:Flow/Developer test page" not the 2 live wp:WikiProjects. -Wikid77 22:30, 6 February 2014 (UTC)[reply]
What are the two pages? How are we supposed to comment on it if you won't tell us where we're meant to be looking? Mogism (talk) 17:50, 3 February 2014 (UTC)[reply]
(edit conflict) No mention of just which two wikiproject pages. One, I suspect, is WT:WikiProject Hampshire. --Redrose64 (talk) 17:52, 3 February 2014 (UTC)[reply]
The other page is WT:WikiProject Breakfast. I encourage any testing to be done on either mw:Talk:Sandbox or if testing within enwiki is necessary, WT:Flow/Developer test page. Note that the enwiki pages are not enabled yet, they will be in the next four hours. EBernhardson (WMF) (talk) 18:09, 3 February 2014 (UTC)[reply]
So does this mean that Flow-enabled pages will basically be non-editable by bots and user scripts until the API is done? Documentation on this is all very sparse and poorly organized. Nor is it very clear where the community should provide feedback. Mr.Z-man 18:35, 3 February 2014 (UTC)[reply]
Yes. The initial project pages flow is being released to were chosen in part because they see very little, if any, bot activity. There is no documentation for the API because we do not want anyone developing against the Flow API at this time. The API that has been exposed is not an API in the classic mediawiki sense, it is a shim that exposes the internals of Flow's implementation. It requires knowledge of these internals to construct a proper request currently, and is only intended for use with the AJAX requests used by the web interface.
If you wish to provide feedback on Flow in general, I can suggest mw:Talk:Sandbox for testing and mw:Talk:Flow for feedback. The release to enwiki is incredibly limited and targeted; if you are not a member of the two wikiprojects we are testing with, you won't have a chance to use Flow on enwiki. We kindly ask that users do not disturb the wikiprojects' normal workflows by going off on a tangent about Flow in their project pages. If you are a member of a wikiproject that was not chosen and would like to propose the project for conversion to Flow in the future, User:Quiddity is our community point person, but I'm not sure we will be expanding from the current group of two pages any time soon. We expect to find many things, based on this initial test group, that will need to change before it's worthwhile for anyone to build a bot or user script integrating with Flow.
EBernhardson (WMF) (talk) 19:42, 3 February 2014 (UTC)[reply]
  • Looks good. It'll be easier for new editors. Older editors may need to change habits if this is installed on all talk pages. When I tried it in the MediaWiki sandbox, my text was wrapped in nowiki. TitoDutta 19:52, 3 February 2014 (UTC)[reply]
  • Hmm! URL is broken above when I tried to link using [full-url text] TitoDutta 19:53, 3 February 2014 (UTC)[reply]
Known issue, we have a fix in code review right now that will remove the square brackets from urls. EBernhardson (WMF) (talk) 20:01, 3 February 2014 (UTC)[reply]
See also Help:Link#Disallowed characters and percent-encoding. --Redrose64 (talk) 20:11, 3 February 2014 (UTC)[reply]
Everything delivered to a user is properly encoded; unfortunately some web browsers try to be helpful and decode the links for you on copy/paste. EBernhardson (WMF) (talk) 21:13, 3 February 2014 (UTC)[reply]
Is there a description somewhere of how the proposed API might work or where bot/script authors can provide feedback on that specifically? I'm just wanting to make sure that this won't be like WikiEditor, where documentation to help us rewrite our tools for it lagged behind deployment. Mr.Z-man 21:48, 3 February 2014 (UTC)[reply]
bugzilla:57659 and bugzilla:58361 cover the API and its documentation. Quiddity (WMF) (talk) 01:51, 4 February 2014 (UTC)[reply]
How is it supposed to work in the watchlist? Since the old talk pages were moved to the archives, I see 13 entries for Wikipedia talk:WikiProject Hampshire, and 7 for Wikipedia talk:WikiProject Breakfast. The familiar "diff" and "hist" links have mostly gone, to be replaced in some cases by "topic" and "post" (in two cases "history") links. These links ("history" excepted) don't tell you what the actual change was, they just take you to the current version of the page, so I cannot see what the change was. The edit summaries are mostly the non-intuitive "added a comment" - yes, but what was the comment? Some of the entries have no links at all on the left - the edit summary is either "created the board header" or "edited the board header", but again, there is no way of finding out what the exact change was. Only two of those (timed at 21:17 and 21:19) have the familiar "diff" link - and it doesn't work as expected, it just does the same as those "topic" links. Only the "history" links seem to do anything different - but how can a page with 13 edits have only three entries in the history? --Redrose64 (talk) 22:35, 3 February 2014 (UTC)[reply]
The entries for RC/watchlists/contribs are being overhauled very soon. You can see the target layout at mingle card 631. This will include a slightly clearer automatic edit-summary. A few people have suggested adding automated excerpts as additional edit-summary context (also per the editsummary guideline's recommendation). (Further feedback on that is welcome, but ideally at WT:Flow where everyone interested is watchlisting. :)
The Board-history and the Topic-history are currently separated (A vs B); iirc, there are plans to merge their display (to prevent this confusion), but I'll have to check how far along that is. If it will be long, I'll see if we can get a note added to the page-history header, asap.
HTH. Quiddity (WMF) (talk) 01:51, 4 February 2014 (UTC)[reply]
Why reintroduce old LQT bugs? Helder 13:10, 4 February 2014 (UTC)[reply]

Which API call(s) can be used to check for redlink status?

In a script, I need to check if titles have articles. That is, I need to determine for each link in an article whether that link is a redlink or not.

How can this be done? The Transhumanist 16:51, 4 February 2014 (UTC)[reply]

I don't know if scripts support parser functions, but if they do, the #ifexist parser function could help. SiBr4 (talk) 18:09, 4 February 2014 (UTC)[reply]
Can't you just see whether the link has class "new"? Jackmcbarn (talk) 18:19, 4 February 2014 (UTC)[reply]
 – Jackmcbarn (talk) 04:08, 9 February 2014 (UTC)[reply]
Where would I look for that in order to see it? The Transhumanist 03:54, 9 February 2014 (UTC)[reply]
See T13's reply below. Jackmcbarn (talk) 04:08, 9 February 2014 (UTC)[reply]
See mw:API:Query#Missing and invalid titles. Helder 18:47, 4 February 2014 (UTC)[reply]
Thank you. This looks promising. The Transhumanist 03:06, 9 February 2014 (UTC)[reply]
Okay, I looked it over and tried some calls. Making such a call for every link on a page, when there are hundreds of links and then scraping the results pages of each, would be rather cumbersome. I think it would be simpler and faster to get the whole page in some format with redlink=1 codes on it and search for all instances of links followed by that code. Thoughts? The Transhumanist 05:40, 9 February 2014 (UTC)[reply]
  • Doesn't work that way... Your code might look something like:
$('a').each(function(){
    if($(this).hasClass('new') === true || $(this).is("[href$='&redlink=1']") === true ){
        $(this).css("border", "1px solid #F00");
    }
});
This should put a red border around RedLinks but not Blue ones (try it in your console ;)). — {{U|Technical 13}} (tec) 06:30, 9 February 2014 (UTC)[reply]
You don't have to make a separate API call for every link to check. You can supply up to 50 titles per request, separated by %7C. You don't need to "scrape" API results, since they are designed to be machine-readable. E.g. the JSON format (format=json) is natively readable by JavaScript. Other formats are available depending on your needs. Though as others have said, if you only need to check links on the current page, you are better scraping the current page since that doesn't involve any extra server requests. – PartTimeGnome (talk | contribs) 22:02, 9 February 2014 (UTC)[reply]
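A sketch of the batching described above, assuming the standard /w/api.php endpoint; the helper name is mine, not part of any library:

```javascript
// Build API query URLs that check page existence for up to 50 titles
// per request, joining the titles with the encoded pipe (%7C).
function buildExistenceQueries(titles, batchSize) {
    batchSize = batchSize || 50;
    var urls = [];
    for (var i = 0; i < titles.length; i += batchSize) {
        var batch = titles.slice(i, i + batchSize);
        urls.push('/w/api.php?action=query&format=json&titles=' +
            batch.map(encodeURIComponent).join('%7C'));
    }
    return urls;
}
```

In the JSON result, nonexistent pages are flagged as missing (see mw:API:Query#Missing and invalid titles, linked earlier in this thread), so no scraping is needed.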
  • If there are links to the pages in question on the page that you want to check from, I usually look for typeof($(this, '.new')) === 'undefined' like Jack suggested. I've noticed there are some odd instances where they are redlinks but are not using .new; in those cases I look for typeof($(this, "[href$='&redlink=1']")) === 'undefined'. If you are checking pages not linked to from the current page, you'll have to use the API (which of course is much slower). Happy editing and good luck! — {{U|Technical 13}} (tec) 03:49, 9 February 2014 (UTC)[reply]
  • Do you mean scrape the html source? I don't see any typeof($(this, '.new')) or typeof($(this, "[href$='&redlink=1']")) strings in there. Instead, there's something like this:

<li><a class="new" title="Geology of Afghanistan (page does not exist)">Geology of Afghanistan</a></li>

I look forward to your reply. The Transhumanist 04:13, 9 February 2014 (UTC)[reply]
  • See:
Somewhat of a crash course there on JavaScript and jQuery, but I think that is all you need to know for this question. The crash course for accessing the information through the API is somewhat more complex (which is part of the reason it is slower). Let me know if you have any further questions. — {{U|Technical 13}} (tec) 04:51, 9 February 2014 (UTC)[reply]

mobile watchlist grayed out...or greyed out if you're european..

on my tablet, (motorola xoom) the watchlist page , although completely functional... has recently been grayed out...it's the only page on wikipedia that behaves this way... i was wondering if this is a known issue, or something I did that i need to fix..Nickmxp (talk) 01:01, 5 February 2014 (UTC)[reply]

Might be something to bring up on mw:Mobile/Feedback. --AKlapper (WMF) (talk) 15:32, 5 February 2014 (UTC)[reply]
@Nickmxp: Looks OK to me (although I'm not in Europe). What exactly do you mean by "grayed out"? Are you in beta mode? Kaldari (talk) 18:27, 5 February 2014 (UTC)[reply]
Note that in the 'modified' sorting, the mobile watchlist does show mostly gray and black text, and it might not be clear what's clickable... but it works on my Galaxy Tab 10.1, which is running some Android 4.0.something. Is your Xoom running 4.0 as well? Can you switch between alpha & mobile sorting? Does tapping on an individual entry show you a diff, or does it do nothing? --brion (talk) 18:55, 5 February 2014 (UTC)[reply]

Mine's running 4.1.2. I'm using the desktop browsing setting. I'm pretty sure it's something I did, as it was working fine originally.....Nickmxp (talk) 01:44, 6 February 2014 (UTC)[reply]

And oddly, if I put the search on templates it apparently shows all my pages, and they are not grayed out... but if I switch back to all, it goes gray again. Nickmxp (talk) 01:54, 6 February 2014 (UTC)[reply]

and it's not really gray as in color.. it's just the colors are washed out.... the only thing that doesn't gray out is the search box...everything is functional... and if I do selective searches the page looks normal..but the default page ya get when ya touch watchlist is gray. Nickmxp (talk) 03:19, 6 February 2014 (UTC)[reply]

Don't know why , but the watch list is back to normal! Nickmxp (talk) 02:25, 8 February 2014 (UTC)[reply]

Bad Gateway error

Getting the following message when trying to access "Contributors" on the History page of any article.— Maile (talk) 01:40, 5 February 2014 (UTC)[reply]

Bad Gateway

An error occurred while communicating with another application or an upstream server.

There may be more information about this error in the server's error logs.

If you have any queries about this error, please e-mail [email protected].

Back to toolserver.org homepage [ Powered by Zeus Web Server ]

Exact steps to reproduce welcome, especially as this sounds like Toolserver territory instead. --AKlapper (WMF) (talk) 15:32, 5 February 2014 (UTC)[reply]
Corrected now, and I also believe it was a Toolserver error. But the steps were: Pull up any article. Page/History/Contributors— Maile (talk) 18:05, 5 February 2014 (UTC)[reply]

Notifications broken

The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


The notifications system appears to have broken. I see a red box indicating that I have 2 notifications, but when I click on it Special:Notifications just displays: "Error: Could not find the requested workflow". --BrownHairedGirl (talk) • (contribs) 13:05, 5 February 2014 (UTC)[reply]

When I click the red letter 1 on my messages it won't clear and keeps coming up with

"Error From Wikipedia, the free encyclopedia

Could not find the requested workflow.

Return to Main Page."

I'm out of here at the moment, hope it's fixed by the time I return!♦ Dr. Blofeld 14:00, 5 February 2014 (UTC)[reply]

@BrownHairedGirl and Dr. Blofeld: I've combined the "Notifications broken" and "Message" sections as they are about the same problem. (For me the notifications display fine, though I don't have any new ones.) SiBr4 (talk) 14:10, 5 February 2014 (UTC)[reply]
The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

When will the Reflinks tool be moved to the stable server?

It's the tool I use more often than I use all other automated tools combined. The fact that it is still hosted on the old toolserver is very frustrating. It is in fact the only tool stranded on that old server that I use regularly. What is required to make moving it to wmflabs a high priority task for the server staff? Roger (Dodger67) (talk) 13:36, 5 February 2014 (UTC)[reply]

It is already listed here: http://tools.wmflabs.org/ so you might want to ask its maintainer. --AKlapper (WMF) (talk) 15:59, 5 February 2014 (UTC)[reply]
The link in my "Tools" menu still goes to the old server - how do I fix that? Who is the "maintainer"? Roger (Dodger67) (talk) 15:53, 9 February 2014 (UTC)[reply]
  • Roger, you are the maintainer of that link actually. Go to your common.js and replace:
// Add [[WP:Reflinks]] launcher in the toolbox on left
addOnloadHook(function () {
 addPortletLink(
  "p-tb",     // toolbox portlet
  "http://toolserver.org/~dispenser/cgi-bin/webreflinks.py/" + wgPageName
   + "?client=script&citeweb=on&overwrite=&limit=20&lang=" + wgContentLanguage,
  "Reflinks"  // link label
)});
with:
// Add [[WP:Reflinks]] launcher in the toolbox on left
addOnloadHook(function () {
 addPortletLink(
  "p-tb",     // toolbox portlet
  "https://tools.wmflabs.org/dispenser/cgi-bin/webreflinks.py/" + wgPageName
   + "?client=script&citeweb=on&overwrite=&limit=20&lang=" + wgContentLanguage,
  "Reflinks"  // link label
)});
Would be my guess. (I can't seem to get anything to load from tools.wmflabs.org today, not sure why, so I can't confirm. AKlapper, any ideas on what's up with that?) — {{U|Technical 13}} (tec) 16:41, 9 February 2014 (UTC)[reply]
Thanks, I've updated the script. (Note to self: Create a list on my User page of all of my "custom" scripts. I can't keep track of such stuff I did years ago.) Roger (Dodger67) (talk) 17:19, 9 February 2014 (UTC)[reply]

@Dispenser: (t) Josve05a (c) 20:08, 9 February 2014 (UTC)[reply]

24 terabytes. That's 24 Flickr accounts or $1,000 in hard drives ($10,000 with infrastructure [2]). That's equivalent to 2-4 weeks' opportunity cost of an American worker. There are other demands, but considering the waiving of FLOSS requirements and that security is borked, it shouldn't be an issue. Even if I do not fully use it, other tools are also disk-space strapped (e.g. offline copies of enwiki to annotate).
I am willing to entertain offers from other players (Other Toolservers, Amazon's AWS, Google's Compute Engine, etc.) with acceptable terms. If that doesn't work out, I may need to solicit donations. — Dispenser 00:54, 10 February 2014 (UTC)[reply]
Since I changed to use the wmflabs version it doesn't work at all! The toolserver version at least worked some of the time. Roger (Dodger67) (talk) 13:54, 11 February 2014 (UTC)[reply]

When I edit a section and then "Cancel" I'm taken to the top of the page instead of staying at that section

When I edit a section of an article and then "Cancel" I'm taken to the top of the page.

When I edit a section of an article and then "Save" I stay at the section that was edited.

The latter is the more logical behavior in my opinion and it should also be what happens when you "Cancel".

Contact Basemetal here 15:55, 5 February 2014 (UTC)[reply]

This is true. But I never use the "Cancel" link; instead I use the "back one page" facility of my browser. Depending upon browser, this may be one or more of: a button at upper left showing a left-pointing arrow; the ← Backspace key; the sequence Alt+Left arrow. --Redrose64 (talk) 19:53, 5 February 2014 (UTC)[reply]

Hiding admin tools

Hi,

Is there a simple way to hide the admin tools, or a subset?

The one I most want to hide is delete/undelete of revisions. This adds clutter to the page history and I will hardly use it.

Obviously, if this hiding was relatively easy to toggle on and off that would be good, to allow me to use the tools when they are needed.

Yaris678 (talk) 00:47, 6 February 2014 (UTC)[reply]

Do you have those links in page histories? I have them in user contributions but not page histories. They can be removed from contributions, and possibly from other places, with this in Special:MyPage/common.css:
.mw-revdelundel-link {display: none;}
PrimeHunter (talk) 01:17, 6 February 2014 (UTC)[reply]
In page histories, I get check boxes for each revision and a grey button marked "del/undel selected revisions". That's what I want to hide. I don't get anything like that in user contributions, but I do get a link to "deleted user contributions". I've added the bit of code to User:Yaris678/common.css but it hasn't hidden any of the above. Yaris678 (talk) 11:35, 6 February 2014 (UTC)[reply]
  • You can get rid of the button with .historysubmit .mw-history-revisiondelete-button{ display: none; } in User:Yaris678/common.css, but you have to use JavaScript/jQuery to get rid of the checkboxes; $('input[type="checkbox"][name^="ids"]').css("display", "none"); in your User:Yaris678/common.js should do it. — {{U|Technical 13}} (tec) 12:40, 6 February 2014 (UTC)[reply]
Why do it in javascript? CSS can be used to hide the checkboxes:
input[name^="ids"][type="checkbox"] { display: none; }
This of course assumes that your browser respects the substring matching attribute selectors introduced with CSS 3. --Redrose64 (talk) 14:30, 6 February 2014 (UTC)[reply]
Thanks guys. The check boxes are now gone but the button is still there. Yaris678 (talk) 21:58, 6 February 2014 (UTC)[reply]
Try .mw-history-revisiondelete-button {display: none;}. My earlier code was for big "(del/undel)" link to the left of the time stamp for each entry on contributions pages. You shouldn't see it after adding my code but I'm surprised if you didn't see it before. PrimeHunter (talk) 22:36, 6 February 2014 (UTC)[reply]
Technical 13's suggestion of
.historysubmit .mw-history-revisiondelete-button { display: none; }
should have worked to remove the button.
Was there a problem with that? --Redrose64 (talk) 22:54, 6 February 2014 (UTC) No, it wouldn't have worked, because elements with class mw-history-revisiondelete-button are not children of elements with class historysubmit --Redrose64 (talk) 23:01, 6 February 2014 (UTC)[reply]
Ah, I've worked out what Technical 13 was trying to do.
.historysubmit.mw-history-revisiondelete-button { display: none; }
works to remove the button: the only significant difference is the absence of a space between the two class selectors. --Redrose64 (talk) 09:05, 7 February 2014 (UTC)[reply]
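Pulling the working selectors from this thread together, a complete Special:MyPage/common.css sketch would look like the following. The class names are those current as of this discussion and may change in future MediaWiki releases:

```css
/* Hide the "(del/undel)" links on contributions pages */
.mw-revdelundel-link { display: none; }

/* Hide the revision-delete checkboxes in page histories
   (uses a CSS 3 substring-matching attribute selector) */
input[name^="ids"][type="checkbox"] { display: none; }

/* Hide the "del/undel selected revisions" button in page histories */
.historysubmit.mw-history-revisiondelete-button { display: none; }
```

Note the last rule has no space between the two class selectors: it matches a single element carrying both classes, rather than a descendant.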

Thanks guys. It's all working now in the normal interface.

It doesn't seem to be working in the mobile interface. Specifically, I get del/undel links on my contribs in mobile mode. I've used tools to check and the links are in the same class in mobile mode.

Does custom css not work in mobile mode?

Yaris678 (talk) 10:09, 7 February 2014 (UTC)[reply]

Wikipedia:Village pump (technical)/Archive 111#Applying custom user common.css to mobile site (m.wikipedia.org) from May 2013 says custom css in mobile mode is not possible. I think this is still the case. There is a request at bugzilla:46247. PrimeHunter (talk) 12:06, 7 February 2014 (UTC)[reply]
OK. I guess I'll just have to put up with the clutter in mobile mode. At least the normal interface is clutter free.
Thank you everyone!
Yaris678 (talk) 12:17, 7 February 2014 (UTC)[reply]

User contributions for new accounts + view 500 per page = database error. Known bug?

When trying to view User contributions for new accounts with 500 edits per page, this keeps leading to a database error on both Firefox 27.0 and Chrome 32.0. The exact text given is "A database query error has occurred. This may indicate a bug in the software. Function: IndexPager::buildQueryInfo (contributions page unfiltered) Error: 0".

Is this a known bug, or should I report it at bugzilla.wikimedia.org? I have done a quick search at bugzilla and did not find anything, but it may well have been filed in a way I didn't think to search for. AddWittyNameHere (talk) 10:19, 6 February 2014 (UTC)[reply]

It's been happening for weeks, see e.g. Wikipedia:Village pump (technical)/Archive 122#Database error in filtered new users' contributions search. Don't file a new bugzilla, but feel free to add to the existing one with any additional relevant details. --Redrose64 (talk) 10:43, 6 February 2014 (UTC)[reply]

API query to list available protection levels for a wiki?

Hi, is there a single API query for a given Wikipedia wiki that will list the available protection levels for that wiki (e.g. sysop, autoconfirmed)? I ask because I'm told that different language Wikipedias can have custom protection levels different from en-wp. I already know that I can check the protection level of a given page; that is not what I need. I checked the MediaWiki API documentation but could not see how. Thanks Rjwilmsi 12:47, 6 February 2014 (UTC)[reply]

@Rjwilmsi: It would appear there isn't one, or at least I can't find any – it should probably be added. In the meantime you can see the settings for all wikis at https://noc.wikimedia.org/conf/highlight.php?file=InitialiseSettings.php, under the key of wgRestrictionLevels (reproduced below for posterity, as the settings might change). Matma Rex talk 15:06, 6 February 2014 (UTC)[reply]
'wgRestrictionLevels' => array(
	'default' => array( '', 'autoconfirmed', 'sysop' ), // semi-protection level on
	'arwiki' => array( '', 'autoconfirmed', 'autoreview', 'sysop' ), // bug 52109
	'ckbwiki' => array( '', 'autoconfirmed', 'autopatrol', 'sysop' ), // bug 52533
	'enwiki' => array( '', 'autoconfirmed', 'templateeditor', 'sysop' ), // bug 55432
	'hewiki' => array( '', 'autoconfirmed', 'autopatrol', 'sysop'), //bug 58207
	'plwiki' => array( '', 'autoconfirmed', 'editor', 'sysop' ), // bug 46990
	'ptwiki' => array( '', 'autoconfirmed', 'autoreviewer', 'sysop' ), // bug 39652
	'testwiki' => array( '', 'autoconfirmed', 'templateeditor', 'sysop' ), // bug 59084
),
I have submitted a patch for review to add this to the API, as action=query&meta=siteinfo&prop=restrictions: [3]. Comments welcome. Matma Rex talk 15:36, 6 February 2014 (UTC)[reply]
And it's merged already, thanks Anomie! You should see this feature live here on 20 February 2014, per the schedule: mw:MediaWiki 1.23/Roadmap. Matma Rex talk 16:00, 6 February 2014 (UTC)[reply]
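Once the patch is deployed, the new data should be reachable like any other siteinfo property. A sketch of building the request (the parameter name siprop=restrictions is an assumption based on how other meta=siteinfo properties are selected; check the live API docs after deployment):

```javascript
// Sketch: construct the API request for the new siteinfo "restrictions"
// property added by the patch above. Request construction only; the
// actual call is a plain HTTP GET to the returned URL.
function restrictionLevelsUrl(wikiHost) {
  const params = new URLSearchParams({
    action: 'query',
    meta: 'siteinfo',
    siprop: 'restrictions', // assumed parameter name, see lead-in
    format: 'json'
  });
  return 'https://' + wikiHost + '/w/api.php?' + params.toString();
}

console.log(restrictionLevelsUrl('en.wikipedia.org'));
```

The response would then carry the same per-wiki array as wgRestrictionLevels above, without having to scrape InitialiseSettings.php.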

Yikes! Science Refdesk is dropping PHP errors?

[4] has just given me "PHP fatal error in /usr/local/apache/common-local/php-1.23wmf12/extensions/Math/Math.hooks.php line 50: Call to undefined method ParserOptions::getMath() ". Repeatably, four times, in between loading other pages OK. Looks like somebody broke something... Wnt (talk) 22:15, 6 February 2014 (UTC)[reply]

Just started working now though, before I even had a chance to go look... Wnt (talk) 22:17, 6 February 2014 (UTC)[reply]

This is happening all over the place, including articles like Earth. It's due to the Math extension. Steven Walling (WMF) • talk 22:18, 6 February 2014 (UTC)[reply]
Hmmm, it looks like this was fixed on January 2 in a branch that I thought was live... but I guess I'm still a little hazy about which version really is live. Wnt (talk) 22:40, 6 February 2014 (UTC)[reply]
This problem was also mentioned below. Graham87 04:23, 7 February 2014 (UTC)[reply]
That's not the same problem. I was getting the PHP error with the Blue Page Of Death "Wikipedia has a problem", not just a red message where the formula is supposed to be. Wnt (talk) 05:36, 7 February 2014 (UTC)[reply]
Latest Math version seems to have a few problems, see the list under "Depends on:" on bugzilla:60997. The "Wikipedia has a problem" was logged under bugzilla:60970. --AKlapper (WMF) (talk) 11:40, 7 February 2014 (UTC)[reply]
Ah, I see that the latter thread has a comment on 2-6 that "everything has been reverted back to wmf11". That explains why line 50 appeared again. Perhaps I saw the error during mid-revert, because whatever it referenced hadn't been reverted yet. Wnt (talk) 17:19, 7 February 2014 (UTC)[reply]
Sorry about this folks. There were some incompatibilities between the Math extension and MediaWiki core, and I deployed broken Math code yesterday in the process of trying to fix some downtime we were dealing with. Things should all be back to normal now, again sorry! ^demon[omg plz] 17:25, 7 February 2014 (UTC)[reply]
Thanks! In possibly-related news, do you know anything about the "Math aligned environments failing to parse" issue below? Melchoir (talk) 19:01, 7 February 2014 (UTC)[reply]

Changes to button colors/styles on some forms

Hey all, this is announced as part of the normal Tech News newsletter (see Tech News: 2014-06 above), but I wanted to give folks an extra notice that some important forms will have a change of button color and style. This includes login, account creation, search, and some other forms. We did this as part of UI standardization work across teams, so that for instance, we can move closer to a place where all buttons (at least in Vector, at first) will look the same across forms, Flow, guided tours, mobile, etc. Steven Walling (WMF) • talk 22:17, 6 February 2014 (UTC)[reply]

Chronology of category-insertion and category-removal events

There are, in fact, people tougher than Chuck Norris...

See the orthogonal teahouse thread here.[5] See the now-closed tangentially-related AN/I thread here.[6]

   Question: Is there a way to see the history of a category's contents, i.e. what articles were in the category a year ago, for instance? Can we find the removal-events, and the insertion-events, and who was responsible for each? I realize that one can visit Chuck Norris (may he live ten thousand years) and see that the master is currently in Category:American martial artists.

  But consider the horrific possibility, that some internet heretic might one day dare create Category:people tougher than Chuck Norris. This person will be known, from the edit-history of the category itself. But what if others have added names like Bruce Lee to this blasphemous category? Obviously, those edits will quickly be reverted. If there *is* ever Category:people tougher than Chuck Norris, it will be an empty set, if friends of Chuck Norris have anything to say about it.

  But what of the Bruce Lee heretics? They may escape punishment, if we cannot get a list of exactly when an article was added to a category, by whom. Additionally, we should reward the defenders of the honour of Chuck Norris! Therefore we need to know exactly when an article was removed from a category, and by whom. See insertion-example below. Along the same lines, ColinFine mentioned that it would also be nice to be able to get the links to the articles, which are being inserted to a category, at the time of insertion. Same applies for a link to the version of the article that was removed from a category, at the time of removal. So for example:

"Chuck Norris was inserted[7] into Category:American practitioners of Brazilian jiu-jitsu by Jackmcbarn as of 22:57, 27 August 2013."

  Note that the diff shows what the Chuck Norris article looked like when he was inserted. Is there a Special:category-content-history-page, which shows a bunch of rows like this, for a given category? (As opposed to, for a given article.) Plus possibly, optionally, all the subcategories thereof? (Chuck Norris may never have been inserted into the toplevel Category:Martial Arts but he belongs in any insertion-and-removal-history of that category methinks.)

  Is there a way, currently, to accomplish this sort of functionality? With some series of API calls perhaps, or some view-changes-to-category-page that I do not realize already exists? If not, can this feature be implemented? It is commonly problematic with musical genres as applied/unapplied to BLPs, and also with political categories (politician BLPs and also nation-labelling). I thank you for your input on this important matter, and for improving wikipedia.  :-)

  p.s. And, in case they may still care about category-histories, ping the good editors PrimeHunter, Lightbreather, Mike_Searson, Gaijin42, Drmies, and Lukeno94. Methinks that Liz may also have an interest in this categorization question. — 74.192.84.101 (talk) 22:39, 6 February 2014 (UTC)[reply]

  • As far as I am aware, no, there isn't. It isn't unfeasible to write a script to go and hunt for the historical additions and removals of a given category, but it would be an incredibly slow script, I would guess, and even more so if it began going through deleted edits. Lukeno94 (tell Luke off here) 22:46, 6 February 2014 (UTC)[reply]
  • It sounds like you are looking for a technical solution, 74.192.84.101, and I can't help you there. The only way I know of to see the past contents of a category is if one or a few editors in particular are working on organizing it, and you look in their contributions to see which articles they have added categories to or removed them from. This can be a guessing game, but editors often focus on specific areas, so, for example, you might know who would create and populate that Category:American martial artists.
Otherwise, as far as I know, the only other record of category additions or subtractions is on each article's page history. The assignment of an article to a category is not noted on the category's page history although, I agree, that would be useful information to have. Liz Read! Talk! 22:54, 6 February 2014 (UTC)[reply]

implementation tech

From the bugzilla entry, which was first opened in 2005 (re-opened 2006 / 2007 / 2012 and now 2014 also)...

That kind of information is not stored in the database, and is not likely to be added. The membership set changes based on edits to other pages, not edits to the category, and has no independent history of its own. —Brion VIBBER, 2005-12-24 00:17:49 UTC
...What I suggest is that every category has a log attached to it. It would only require Number_of_category additional pages to the database (my guess is that it is just a fraction of the overall size, isn't it?) When an edit is made to an article that add category C1 and removes categories C2 C3, Mediawiki would log this information in the logfile of the categories C1, C2, and C3. ... —Jmfayard 2006-04-11 09:19:54 UTC

There is already a logfile for e.g. AbuseFilter actions,[8] which are regex-based detectors as I understand it. Can an edit-filter be written which detects the insertion of a category (not sure the deletion of a category can be detected that way) and triggers an entry into the CategoryEventLog? Or, does somebody have a better way to build such a thing? 74.192.84.101 (talk) 00:11, 7 February 2014 (UTC)[reply]
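For the wikitext-only cases, the pattern such a filter would need is straightforward: match [[Category:...]] markup in the added lines of an edit. A sketch of the matching logic (plain JavaScript for illustration; real edit filters are written in the AbuseFilter condition language, so this is only the pattern, not a working filter):

```javascript
// Sketch: detect explicit category insertions in the added lines of an
// edit, mirroring what an AbuseFilter condition on added_lines could
// match. This will NOT catch categories applied indirectly by templates.
function addedCategories(addedLines) {
  // [[Category:Foo]] or [[Category:Foo|sortkey]], case-insensitive,
  // tolerating optional whitespace around the colon.
  const re = /\[\[\s*Category\s*:\s*([^\]|]+)/gi;
  const names = [];
  let m;
  while ((m = re.exec(addedLines)) !== null) {
    names.push(m[1].trim());
  }
  return names;
}

console.log(addedCategories('[[Category:American martial artists]]'));
// → [ 'American martial artists' ]
```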

Not all pages are placed in categories by adding e.g. [[Category:Foo]] to the page. A lot of templates will categorise a page; the two main groups that spring to mind are maintenance templates and stub templates. For example, {{fact|date=February 2014}} will place the page into both Category:All articles with unsourced statements and Category:Articles with unsourced statements from February 2014; {{Laos-footy-bio-stub}} will place the page into Category:Southeast Asian football biography stubs, Category:Laotian people stubs and Category:Laotian sport stubs. An edit filter would also need to detect such usage. Categorisation by template gets pretty subtle; for example, Category:Articles with incorrect citation syntax has a number of subcategories, the membership of which is triggered by certain combinations of circumstances when the Citation Style 1 templates are used. A change to any of these templates can cause articles to move in or out of categories without the article itself needing to be touched; even a simple thing like the calendar ticking to the next day can cause category membership to change, see for example the membership of Category:Expired proposed deletions. --Redrose64 (talk) 07:59, 7 February 2014 (UTC)[reply]
So, you are saying that edit-filters cannot see the parsed content of a final page? In which case, we would need a mediawiki extension, right? See also the CategoryWatch thing, below. 74.192.84.101 (talk) 16:18, 7 February 2014 (UTC)[reply]
What I'm saying is that edit filters check the wikitext of the page being edited. If an edit to an article adds a template, the edit filter does not expand that template to see what's inside it. If an edit to a template changes the categorisation of those articles which transclude that template, there is no way for an edit filter to spot that change in article categorisation. --Redrose64 (talk) 16:40, 7 February 2014 (UTC)[reply]
So, for a non-kludge solution, the edit-filter isn't sufficient. (We could use an edit-filter to catch a large chunk of cases i.e. the ones that were simply "Category:$cat" wiki-text changes... and that might be valuable... but it would be a kludge.) To do better, we either need an extension that uses PageContentSaveComplete hook (3/4ths answer), or we need a tool that polls periodically (2/4ths answer). But to get the full 4/4ths solution, we'd need some kind of DB-level upgrade, to start tracking category-insertion-and-removal-events in a new log-table, it seems. How does the Category:People generate a list of pages-currently-in-this-category, on the fly? What SQL does it use, specifically, if somebody can link to the codebase? 74.192.84.101 (talk) 17:01, 7 February 2014 (UTC)[reply]
The code that performs the database query is in CategoryViewer::doCategoryQuery(). It is a join over the categorylinks, category and page tables. The categorylinks table is the one that lists current category members. It is normally updated whenever an article is edited.
If categories are added or removed by editing a transcluded template, categorylinks is not immediately updated. Instead, a job is added to the job queue for later processing. It typically takes a day or two for categorylinks to update after a template edit, though it has been known to take weeks. Editors can force the category membership for a single article to be immediately updated by editing the article (including null edits). Bots can also use the purge API's forcelinkupdate parameter for the same effect.
Where a template changes the categories it adds based on the date, MediaWiki does not update the categorylinks table. We instead have Joe's Null Bot, which makes null edits to pages in time-sensitive categories at appropriate times, forcing the table to update.
It probably wouldn't be difficult to keep a log of changes to the categorylinks table, but such a log might not be that useful. MediaWiki couldn't meaningfully link to an edit or show who caused the change – one person could edit a transcluded template, but the category link update could be triggered by someone else editing the transcluding page. Thanks to the job queue processing updates, the time of the addition or removal could be a long time after the edit that caused it. – PartTimeGnome (talk | contribs) 01:06, 8 February 2014 (UTC)[reply]
Interesting. If a log were kept, would it be possible to log the revision id of the category being added/removed? That is all the info needed to link to that particular edit. Edokter (talk) — 01:49, 8 February 2014 (UTC)[reply]
The revision ID of the category page? That would remain the same for most additions to and removals from a category – a category's revision ID only changes when someone edits the header text at the top of the category. If you meant the revision ID of the page being added or removed, that could be a completely unrelated edit if the addition or removal was caused by an edit to a template or the passage of time. – PartTimeGnome (talk | contribs) 17:28, 8 February 2014 (UTC)[reply]
Sorry I meant the latter. Would there be no way to know if the added category originated from a template edit? Edokter (talk) — 00:24, 9 February 2014 (UTC)[reply]
Well, all category updates via the job queue are due to template (or Lua module) edits. However, if someone edits a page affected by a template change before the job queue gets there, the category update occurs as part of that edit, even though the latter edit didn't touch categories (it could even be a null edit). MediaWiki won't be aware the category change is due to a template edit in this case; it just knows the categories for the submitted text don't match those in the categorylinks table, so the table is updated. – PartTimeGnome (talk | contribs) 21:12, 9 February 2014 (UTC)[reply]
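For anyone who wants to force the categorylinks refresh described above rather than wait for the job queue, the purge API's forcelinkupdate parameter mentioned earlier is the hook. A sketch of constructing that request (request construction only; the actual call is an HTTP POST to the returned URL):

```javascript
// Sketch: build the purge request that forces MediaWiki to refresh a
// page's rows in the categorylinks table, via the forcelinkupdate
// parameter discussed above. Equivalent in effect to a null edit.
function purgeUrl(wikiHost, pageTitle) {
  return 'https://' + wikiHost + '/w/api.php' +
    '?action=purge' +
    '&forcelinkupdate=1' +
    '&titles=' + encodeURIComponent(pageTitle) +
    '&format=json';
}

console.log(purgeUrl('en.wikipedia.org', 'Chuck Norris'));
// → https://en.wikipedia.org/w/api.php?action=purge&forcelinkupdate=1&titles=Chuck%20Norris&format=json
```

This is what Joe's Null Bot effectively does in bulk for the time-sensitive categories mentioned above.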

half answer

i do not know of a way to detect removal of articles from a category. however, the API *does* provide the addition-to-category timestamp. if this is of interest, you can look at he:מדיה ויקי:סקריפטים/71. it is not too long, and the one language-dependent function there is "ago" which translates the timestamp to more convenient strings such as "3 minutes ago", "5 weeks ago" or "2 years ago". just ignore this function, and you have the skeleton for a gadget/script that tells you when existing category members were added to the category. if addition is significantly less interesting than removal, feel free to change this section title to "quarter answer". peace - קיפודנחש (aka kipod) (talk) 00:57, 7 February 2014 (UTC)[reply]

This is useful code; it makes an API call to https://www.mediawiki.org/wiki/API:Lists/All#Categorymembers , and then gets the timestamp-property of when each article was added to the category. Using it to track insertions and removals would require an external tool, which polled every category of interest periodically (every 24 hours or so maybe... categories that were added-but-then-quickly-removed would be missed by this polling-driven rather than event-driven approach). 74.192.84.101 (talk) 16:18, 7 February 2014 (UTC)[reply]
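The polling approach amounts to snapshotting the member list (with the timestamp property, as the script above does) and diffing successive snapshots. A minimal sketch of the diff step, assuming each snapshot is an object mapping member titles to their addition timestamps as returned by list=categorymembers:

```javascript
// Sketch of the diff step for a polling-based category watcher:
// compare two snapshots of a category's membership and report what was
// inserted and removed between polls. Items added and then removed
// entirely between two polls are missed, which is the limitation of
// the polling-driven rather than event-driven approach noted above.
function diffSnapshots(previous, current) {
  const inserted = Object.keys(current).filter(t => !(t in previous));
  const removed = Object.keys(previous).filter(t => !(t in current));
  return { inserted: inserted, removed: removed };
}

const yesterday = {
  'Chuck Norris': '2013-08-27T22:57:00Z',
  'Bruce Lee': '2014-02-06T00:00:00Z'
};
const today = { 'Chuck Norris': '2013-08-27T22:57:00Z' };
console.log(diffSnapshots(yesterday, today));
// → { inserted: [], removed: [ 'Bruce Lee' ] }
```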

three-quarters answer

https://www.mediawiki.org/wiki/Extension:CategoryWatch "Extends watchlist functionality to include notification about membership changes of watched categories." Which is not quite as good as a log-page, but would still be a big improvement.

  Known to work with versions of MediaWiki from 1.11 thru 1.21, at which point the ArticleSaveComplete hook[9] was renamed PageContentSaveComplete.[10] Code is GPL, last updated 2011. Some people are using it as of 2014,[11] but it is not used by the WMF. (We *do* use plenty of extensions,[12] including e.g. CategoryTree[13] which started life as an external wiki-tool.) The extension-author who created CategoryWatch in 2008 just posted in December 2013,[14] so they are still around.

  Is it possible to enable this extension (prolly after somebody swaps the deprecated call for the newer one), and take CategoryWatch from 3rd-party status to third-party-which-is-used-by-the-WMF status? Alternatively, can the techniques used by CategoryWatch be used to create the CategoryEventLog stuff, which was the original suggestion? 74.192.84.101 (talk) 16:18, 7 February 2014 (UTC)[reply]

Rotten Tomatoes glitch

Can someone tell me why The Lego Movie is showing "Failed to retrieve Rotten Tomatoes information. Please contact Theopolisme." on the Critical Reception header? Theopolisme is very inactive, so I don't think I'd get a response from him for months, so I think it'd be better to ask here. Ten Pound Hammer(What did I screw up now?) 23:38, 6 February 2014 (UTC)[reply]

Theopolisme discussed the template only four days ago, so it seems odd to give up on contacting him and post here without notifying or pinging him. The message is caused by an edit [15] by User:Technical 13 to the template. A bot operated by Theopolisme is supposed to create a data page for a movie after the template has been added to an article. If the bot hasn't done this within an hour of the latest edit to the article, then Technical 13's edit displays the error message you saw in the article. I don't think articles should tell the reader to go to a named editor's talk page. The template has been removed from the article, so the message isn't displayed currently. The message is never displayed in preview because {{REVISIONTIMESTAMP}} returns the current time in preview, but if the data page is not created at Template:Rotten Tomatoes score/1490017 then the message can be seen in old revisions of the article such as [16]. PrimeHunter (talk) 01:56, 7 February 2014 (UTC)[reply]
  • I've forced a current result as a placeholder for this movie until Theo can figure out why the movie hasn't been found on the API. There are apparently a few little glitches to work out in the template, and I monitor both the template's talk page and Theo's for anyone who asks a question about the template or bot. (I'm also watching here at VPT.) I can change the error message to say to leave a message on the template's talk page, if others think that would be more appropriate. I just used the wording that the bot uses if the movie isn't in the API by the IMDb number. Anyways, I'm going back to bed. I'll be looking for responses here in the morning. :) Happy editing. — {{U|Technical 13}} (tec) 03:47, 7 February 2014 (UTC)[reply]

Yeah, I'm hardly "very inactive" -- although I may not be editing frequently, I check in pretty much daily. :) It looks like Wikimedia Labs, which the bot runs on, was having some trouble, so I just manually prodded it and Template:Rotten Tomatoes score/1490017 now exists. No problems with the bot or the API AFAICS, just the infrastructure it runs on, which is out of my control. Theopolisme (talk) 02:49, 8 February 2014 (UTC)[reply]
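The one-hour grace period PrimeHunter describes works because {{REVISIONTIMESTAMP}} yields the page's last edit time in YYYYMMDDHHMMSS form (UTC). A sketch of the comparison the template is effectively making (plain JavaScript for illustration; the real check is done with wikitext parser functions, and the function name here is invented):

```javascript
// Sketch: decide whether the "failed to retrieve" message should show,
// i.e. the bot's data page still doesn't exist AND the article's last
// revision is more than an hour old. revisionTimestamp is MediaWiki's
// YYYYMMDDHHMMSS format, in UTC.
function shouldShowError(revisionTimestamp, dataPageExists, nowMs) {
  if (dataPageExists) return false;
  const y = revisionTimestamp.slice(0, 4),
        mo = revisionTimestamp.slice(4, 6),
        d = revisionTimestamp.slice(6, 8),
        h = revisionTimestamp.slice(8, 10),
        mi = revisionTimestamp.slice(10, 12),
        s = revisionTimestamp.slice(12, 14);
  const editedMs = Date.parse(
    y + '-' + mo + '-' + d + 'T' + h + ':' + mi + ':' + s + 'Z');
  return nowMs - editedMs > 60 * 60 * 1000;
}

// Edited at 23:45, checked at 01:00 the next day: over an hour old.
console.log(shouldShowError('20140206234500', false,
  Date.parse('2014-02-07T01:00:00Z')));
// → true
```

This also shows why the message never appears in preview: there {{REVISIONTIMESTAMP}} is the current time, so the one-hour threshold is never crossed.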

Merge the Page Curation and Patrol logs?

There are currently two logs activated by activities related to newpage patrols: curation and patrol. This can make reviewing these logs difficult - for example, when I tagged an article yesterday and inadvertently marked it as reviewed, the review and unreview were marked on different logs: [17], [18].

Is there any reason the functionality of these two logs cannot be merged? VQuakr (talk) 00:30, 7 February 2014 (UTC)[reply]

It would, to my knowledge, create a lot of duplication; patrolling something in the page curation interface also patrols it from a Special:NewPages point of view (and vice versa), but they're different actions, and so one patrol appears in both logs. I also don't know if we have any energy on the engineering side for this at the moment: people are working on quite a few things. Ironholds (talk) 17:33, 7 February 2014 (UTC)[reply]
Thanks for the reply. It seems to me that Special:NewPages and Curation could (and should) be different interfaces to the same back-end database, and it should be the back end that generates the log entries. If there is no bandwidth available to streamline this right now, maybe it can go in a "solve someday" queue? VQuakr (talk) 18:06, 7 February 2014 (UTC)[reply]

Math aligned environments failing to parse

There's a bug being discussed at Wikipedia talk:WikiProject Mathematics#Problem with multiline equations. The "aligned" and "alignedat" environments are failing to parse. Affected articles include Spherical trigonometry#Polar triangles and 1 + 2 + 3 + 4 + ⋯#Heuristics. Here's an example:

I see the output Failed to parse(unknown function '\begin'): {\begin{alignedat}{3}A'&=\pi -a,&\qquad B'&=\pi -b,&\qquad C'&=\pi -c,\\a'&=\pi -A,&b'&=\pi -B,&c'&=\pi -C.\end{alignedat}}

Can anyone here help? Thanks, Melchoir (talk) 02:05, 7 February 2014 (UTC)[reply]

Oh, and I should mention that this was working yesterday! Melchoir (talk) 02:14, 7 February 2014 (UTC)[reply]

I am seeing this at Triple product#Using geometric algebra, which was working when I edited it on 21 January. Similarly it's complaining about the align directive:
Failed to parse(unknown function '\begin'): {\begin{aligned}{\mathbf ...
--JohnBlackburnewordsdeeds 02:30, 7 February 2014 (UTC)[reply]
See also a similar question above. Graham87 04:20, 7 February 2014 (UTC)[reply]

There are some more examples, which helpfully list the TeX source, at Help:Displaying a formula. I also experienced a few timeout/"too many people accessing the page" errors accessing that page and Noether's theorem, another problem page.--JohnBlackburnewordsdeeds 14:17, 7 February 2014 (UTC)[reply]

This has also been reported to OTRS, see Steradian for example.--ukexpat (talk) 15:19, 7 February 2014 (UTC)[reply]
Sorry, I'm not really sure what that means... Is anyone working on this issue? Melchoir (talk) 19:00, 7 February 2014 (UTC)[reply]

This seriously undermines the functionality of Wikipedia's mathematics articles. It needs to be fixed right away, even if that means undoing the recent upgrade. Sławomir Biały (talk) 20:08, 7 February 2014 (UTC)[reply]

I see red everywhere I go on WP today - and also botched attempts to fix the affected formulae. Where can I go to usefully complain about this? --catslash (talk) 23:23, 7 February 2014 (UTC)[reply]

This week's update of the Math extension had some weirdness, but it was reverted - which means the Math extension here should be running the same version as last week. bugzilla:60997 is the tracking bug for the issues. Legoktm (talk) 23:51, 7 February 2014 (UTC)[reply]

I'm unclear what you mean. You mean it was broken but was reverted so should be back to normal/how it was a week ago? It's still broken on the many pages linked here and in the example above. Or do we have to wait for this to be rolled out/propagated?--JohnBlackburnewordsdeeds 00:12, 8 February 2014 (UTC)[reply]

This has to be fixed quickly. Almost all of the articles that display mathematical equations have been showing this error. Formulas and equations are definitely the most important things that people are looking for in articles about Math and science.  [ Derek Leung | LM ] 00:17, 8 February 2014 (UTC)[reply]

  • Use {array} columns or colon-indent to align: Hopefully, the source of the problems with "{alignedat}" can be pinpointed and corrected, but meanwhile, the "{array}" alignment works (such as with a six-element "{llllll}" for 6 columns separated by "&"), as follows:
      :: <math> 
      \begin{array}{llllll}
      A' &= \pi - a ,  \qquad & B' &= \pi - b ,\qquad& C' &= \pi - c ,\\
      a' &= \pi - A ,         & b' &= \pi - B ,      & c' &= \pi - C .
      \end{array}
      </math>
That math-tag will show alignment into 6 columns:
Each "\qquad" spacer should be followed by an "&" column separator; with that, the various math articles can be fixed, and other issues copy-edited, to format correctly. -Wikid77 (talk) 03:42, 8 February 2014 (UTC)[reply]
That's a very bad idea. Align works as it's an easy and natural addition to a block of formulae: the directives at beginning and end, and then '&=' and '\\' where you want things aligned and lines broken within. Using arrays like that is overkill. But more importantly, the articles aren't broken: the MediaWiki software is. The solution is to get that fixed as soon as possible, not edit articles just to revert them hours or days later. Anyone who needs to work with such formulae in the (hopefully very short) interim can enable MathJax, which doesn't have this problem.--JohnBlackburnewordsdeeds 03:54, 8 February 2014 (UTC)[reply]
Using the "{array}" alignment is already done for other equations in those math articles, and is not "overkill" by any means. Telling users here to "enable MathJax" does not fix the red-error parser messages which hundreds of users see in major math articles such as "Integral". Also, there is no need to revert use of "{array}" alignment as it is already used in many articles. -Wikid77 07:17, 8 February 2014 (UTC)[reply]
@Wikid77: As long as you continue to damage the formatting on mathematics articles in this way I will continue to revert you. Ozob (talk) 06:08, 8 February 2014 (UTC)[reply]
Calling the copy-editing of math articles "damage" still does not permit a violation of wp:3RR by reverting the re-aligned formulas to, once again, display parser errors such as the glaring "Failed to parse(unknown function '\begin'): {\begin{alignedat}{3}". The copy-editing of those articles should not be reverted to emphasize a wp:POINT about problems with math-tag formatting. Let other users edit those math pages to improve the formats. -Wikid77 07:17, 8 February 2014 (UTC)
As of yet none of the editors in this discussion have violated WP:3RR. Ozob made two reverts each to Integral and Spherical trigonometry. The 3RR rule only counts reverts within the same article. – PartTimeGnome (talk | contribs) 18:01, 8 February 2014 (UTC)[reply]
I didn't realise that Wikid77 was going ahead with this. Seconded. I've reverted other editors already who've tried 'fixing' articles not realising the problem was with Mediawiki. But anyone who's read the thread here or at the maths project should be perfectly clear where the problem lies and so should not be 'fixing' the articles which aren't broken.--JohnBlackburnewordsdeeds 06:27, 8 February 2014 (UTC)[reply]
The main goal is to fix the equation-formatting problems wherever they are displayed, rather than blame the math-tag software as an excuse to leave broken equations in major articles for 3 days. -Wikid77 07:17, 8 February, 12:17, 10 February 2014 (UTC)[reply]
They will fix themselves once the fix in MediaWiki has been backported. It is futile to fix all errors by hand. Edokter (talk) — 10:57, 8 February 2014 (UTC)[reply]
Wikid77 has raised this issue at Wikipedia:Administrators' noticeboard#Reverting fixes of equations. – PartTimeGnome (talk | contribs) 18:01, 8 February 2014 (UTC)[reply]

While some choose to bicker over the short-term management of this problem, I would prefer to look at the underlying issue. How is it that a change to the software that supports mathematics articles can be made which breaks a significant number of them without that fact being noticed, or fixed before it is deployed? Why is it that mathematics editors at such places as Wikipedia:WikiProject Mathematics, who may be assumed to have between them a considerable degree of expertise and experience in these matters, are not consulted, or even informed about these changes? There appear to have been failures at a number of levels over this matter. Deltahedron (talk) 12:06, 8 February 2014 (UTC)[reply]

Many math experts used MathJax format and did not see the error messages viewed by thousands of people. -Wikid77 12:17, 10 February 2014 (UTC)[reply]
You raise a good point. I don't know the details, but expect that there is some set of automated tests the programmers use to test changes to the Mediawiki software. It is apparent that the automated latex test suite is not comprehensive. Part of the problem is that this is only broken for folks not using MathJax--perhaps not all branches are tested automatically? This particular change was meant to be transparent to the user, which is likely why it was not mentioned at WP Math. It is not the first time in the past year that we have had latex rendering issues, so it seems a problem area for Mediawiki. --Mark viking (talk) 17:32, 8 February 2014 (UTC)[reply]
All MediaWiki seems to be "problem areas" and rebroken many times; the only reason why #switch or #ifeq still work is they remain "non-improved" but there is talk to "rewrite the parser" as NewPP is likely a hodge-podge of software subsystems. Rather than blame the developers, we need to know "Top 100 math articles" to watch/rescue, and then notice how Polynomial, Calculus, Derivative, Integral, Sine, Complex number (etc.) were all broken, to be quick-fixed (same-day) not bicker for 3 days whether altered alignment is worse than a glaring red-error message. Meanwhile "wp:Equation hoarding" has overcomplicated the major math pages, where most of the botched equations should have been in minor subpages, not excessive wp:UNDUE detail in major pages viewed 2x per minute. The page "Sine" should be kept minimal with links to subpages such as "Polynomial equivalents of sine" and then put Taylor series formulas (+Lagrange polynomials +others) on the subpages. -Wikid77 12:17, 10 February 2014 (UTC)[reply]
See Wikipedia:WikiProject_Mathematics/Popular_pages for the "Top 100 math articles" to watch/rescue.--Salix alba (talk): 12:57, 10 February 2014 (UTC)[reply]

Problems with math rendering

As of 7-Feb-2014 there seems to be a problem with math rendering.

Referring to Preferences, Appearance, Math, options "Always render PNG", and "MathJax (experimental; best for most browsers)", the following used to work in PNG and in MathJax. Now it doesn't work in PNG anymore, producing an error "Failed to parse(unknown function '\begin'): {\begin{aligned}...", but it still works in MathJax, although the formulae are now centered on the page:

In article Complex number:

In article Polynomial:

Attempts were made to "correct" the failing alignments; [19], [20], [21], [22].

Other changes were made, for instance this one to Maxwell's equations.

The following render correctly (no problems in PNG) but in MathJax some equations get centered on the page, whereas other remain left aligned and are properly indented:

When text is added after the math tags, there is no centering:

(text)

(text)

(text)
(text)
(text)
(text)

What's up? - DVdm (talk) 10:49, 8 February 2014 (UTC)[reply]

The first problem is being discussed above, at #Math aligned environments failing to parse. I am not sure if the other alignment inconsistencies are related or not. —PC-XT+ 11:18, 8 February 2014 (UTC)[reply]
They are; all math related problems started to happen when they took out a large chunk of math code from core on the assumption that the Math extension could handle it. Turns out it can't, because the core code did a lot of converting before feeding it to the extension. Without this converting, the extension is now being fed invalid code, hence the errors. Edokter (talk) — 11:26, 8 February 2014 (UTC)[reply]
Thanks for the replies. Let's hope this gets fixed before too many math articles get—sort of—damaged by well-meaning authors. Shouldn't some kind of watchlist announcement be put in place? - DVdm (talk) 12:10, 8 February 2014 (UTC)[reply]
@Edokter: That's not a correct assessment of the situation. The Math extension has been handling the rendering since about 2011. There were several issues with the most recent update; the code removed from core did not cause the issue with rendering various formulas (it did cause some site performance issues). The problem behind the rendering is that when the "transformation and validation" code was separated from the "render PNG" code so that validation could be used for MathJax as well (Template:Bug), it was overlooked that the output of the validator isn't accepted as input to the renderer (which still contains all the same transformation and validation code). Anomie 14:35, 8 February 2014 (UTC)[reply]
Oh well, that was my quick assessment of the situation browsing through the bugs. BTW, why is bug 49169 hidden? Edokter (talk) — 14:46, 8 February 2014 (UTC)[reply]

Math parsing Error

At Real_projective_line#Motivation_for_arithmetic_operations I get this error:

Failed to parse(unknown function '\begin'): {\begin{aligned}\\a+\infty =\infty +a&=\infty ,&a\in {\mathbb {R}}\\a-\infty =\infty -a&=\infty ,&a\in {\mathbb {R}}\\a\cdot \infty =\infty \cdot a&=\infty ,&a\in {\mathbb {R}},a\neq 0\\\infty \cdot \infty &=\infty \\{\frac {a}{\infty }}&=0,&a\in {\mathbb {R}}\\{\frac {\infty }{a}}&=\infty ,&a\in {\mathbb {R}},a\neq 0\\{\frac {a}{0}}&=\infty ,&a\in {\mathbb {R}},a\neq 0\end{aligned}}

Can someone fix it, please? I am not sure what else I should tell you, so feel free to ask. TheKing44 (talk) 19:27, 8 February 2014 (UTC)[reply]

Yes, see all over the place. I made the same mistake, creating a section about something already well under discussion - DVdm (talk) 19:16, 8 February 2014 (UTC)[reply]
I've backported and deployed a Math syntax check fix. It should be less broken now (which seems to be the case). Aaron Schulz 21:08, 8 February 2014 (UTC)[reply]
Yes, it seems to be fixed. Thanks!

The strange indenting/centering behaviour in MathJax remains though—see two threads higher at #Problems with math rendering. Any idea about that? DVdm (talk) 21:31, 8 February 2014 (UTC)[reply]

This is discussed at Wikipedia talk:WikiProject Mathematics#Displayed equations are centered?. It might be due to a change in the MathJax from version 2.2 to version 2.3. There is a bug at Template:Bug.--Salix alba (talk): 00:42, 9 February 2014 (UTC)[reply]

Broken math markup at Great-circle navigation

The article Great-circle navigation includes some mathematical formulae, which are displayed using <math>...</math> tags.

However, something is broken, and the display just consists of red error messages. Can anyone fix it? --BrownHairedGirl (talk) • (contribs) 01:14, 9 February 2014 (UTC)[reply]

That was probably related to the issues reported above. A purge fixed it. Matma Rex talk 01:18, 9 February 2014 (UTC)[reply]
So are 0.999... and every single article that contains \begin{align}...  [ Derek Leung | LM ] 01:28, 9 February 2014 (UTC)[reply]
 Done Thanks, looks good now. --BrownHairedGirl (talk) • (contribs) 04:43, 9 February 2014 (UTC)[reply]

Fixed but slow

Although it's now rendering properly it's so slow it's causing serious problems. See Wikipedia talk:WikiProject Mathematics#Performance problems.--JohnBlackburnewordsdeeds 07:06, 9 February 2014 (UTC)[reply]

There's still at least one page that still isn't working. Jestingrabbit (talk) 07:25, 9 February 2014 (UTC)[reply]
It loaded for me (though it took 33 seconds) and looks fine.--JohnBlackburnewordsdeeds 08:02, 9 February 2014 (UTC)[reply]
Editing is extremely slow. Just made this edit. It took a few minutes between Save page and a "504 Gateway Time-out". The same thing happened with the preceding edit earlier. I estimate that it took as much as 5 minutes. Math editing has become virtually impossible. - DVdm (talk) 11:24, 9 February 2014 (UTC)[reply]
@DVdm: I think it's a known issue, Aaron and Physikerwelt were investigating it on IRC yesterday. Stay tuned :) Matma Rex talk 11:38, 9 February 2014 (UTC)[reply]

Failure to Parse Mathematical Formulas

The \begin{aligned} command seems to fail in many pages, such as these:

Maximum entropy classifier

Positive-definite matrix

The matter is of some urgency as there are many pages with matrix algebra and these seem affected. Limit-theorem (talk) 14:04, 9 February 2014 (UTC)[reply]

Please see previous threads. --Redrose64 (talk) 14:06, 9 February 2014 (UTC)[reply]
Both pages are now fine. I guess someone purged them. The underlying software bug is fixed, so any pages still showing this error are from cache and can be fixed with a purge. Per the first point in the Village pump (technical) FAQ, if something looks wrong, a purge is the first thing to try. – PartTimeGnome (talk | contribs) 22:31, 9 February 2014 (UTC)[reply]

Purge-refresh major math pages

People should continue to wp:purge-refresh the major math pages (with "?action=purge"), which have lagged in reformatting after the 3-day math-tag glitch, even though the math-tag '{align}' problem was fixed over 24 hours earlier, on Saturday c. 21:00, 8 February 2014. I had to purge the following: Calculus, Derivative, Integral, Sine, Fast Fourier transform, Completing the square, System of linear equations, Quadratic integral, etc. Others have purged: Complex number, Polynomial, Quadratic formula, Great-circle navigation, Maximum entropy classifier, etc. Many had pageviews of 1,000-3,000 per day. The typical page uses 'begin{aligned}' to force equations at '=' into alignment. I posted a similar note at User_talk:Jimbo_Wales, viewed 1,000 times per day. -Wikid77 00:58/12:17, 10 February 2014 (UTC)[reply]
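For editors scripting this clean-up, the "?action=purge" URL pattern above can be generated mechanically. A minimal sketch (purgeUrl is a hypothetical helper, assuming the standard English Wikipedia URL layout):

```javascript
// Build an "?action=purge" URL for a given article title.
// purgeUrl is a hypothetical helper, shown for illustration only.
function purgeUrl(title) {
  // Wikipedia article URLs use underscores in place of spaces.
  var page = encodeURIComponent(title.replace(/ /g, '_'));
  return 'https://en.wikipedia.org/wiki/' + page + '?action=purge';
}

console.log(purgeUrl('Complex number'));
// → https://en.wikipedia.org/wiki/Complex_number?action=purge
```

Visiting such a URL while logged in shows a purge confirmation and then re-renders the page from wikitext, bypassing the stale cached copy.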

I just purged Level set method which was showing "Failed to parse(unknown error)" for the <math>\varphi</math> in "The boundary of the shape is then the zero level set of " and for the same markup later in the section at " is represented as the zero level set of " - in both cases without the align environment (if I've understood things correctly). I don't know if this is significant, but this seems a vaguely appropriate place to report it. --Mrow84 (talk) 18:13, 10 February 2014 (UTC)[reply]
That's fine... what we'd really like to know about is pages where a WP:PURGE didn't fix the problem. --Redrose64 (talk) 19:40, 10 February 2014 (UTC)[reply]
Example? — {{U|Technical 13}} (tec) 20:27, 10 February 2014 (UTC)[reply]
I don't know of any, that's the point. Are there any left out there? --Redrose64 (talk) 22:12, 10 February 2014 (UTC)[reply]

Side question

Does mediawiki throw pages with these types of errors into a tracking category? Werieth (talk) 21:06, 10 February 2014 (UTC)[reply]

Removed preferences

Show table of contents (for pages with more than 3 headings)

I used to use the checkbox for always hiding tables of contents (in Monobook) but Tech news 2014-06 and bugzilla inform us that this feature has been deemed to be too unimportant to continue cluttering up the code base, and removed. They also inform us that it can easily be simulated by custom CSS, but they don't inform us what the CSS is. Has anyone else taken the effort to figure this out yet, and if so how does one go about making tables of contents always invisible? —David Eppstein (talk) 03:10, 7 February 2014 (UTC)[reply]

Do you mean that you don't want to display the TOC at all? If so, add
#toc {display:none;}
to your common.css. If you want to have the "hide" option selected on load, add
$('.toc').addClass('tochidden');
to your common.js. (I haven't tested this out, so I don't know if it works.) ~HueSatLum 03:18, 7 February 2014 (UTC)[reply]
I meant the first one. Thanks! —David Eppstein (talk) 05:00, 7 February 2014 (UTC)[reply]

Contents switch disable?

I used to have the "Contents" section turned off, but now it's showing up in articles and I can't find the switch to turn it off. Was this feature removed? Maury Markowitz (talk) 12:58, 7 February 2014 (UTC)[reply]

I see above that it was indeed removed. Nice that they didn't tell anyone through channels that mere mortals would ever see (Bugzilla, seriously?). I also see that I now have to edit my CSS files to fix this, another ability well beyond the average user. Was it really too much to ask first? Maury Markowitz (talk) 13:02, 7 February 2014 (UTC)[reply]
Quoting from mw:Requests for comment/Core user preferences#New dataset:

The ability to disable the Table of Contents feature (used by 86 users).

Helder 13:13, 7 February 2014 (UTC)[reply]
I can understand why these options were removed; they were turning into development hell. Each new feature would have to be tested against all the possible rendering scenarios that were available, while their user base was negligible. Going for a more consistent default user experience makes sense. Edokter (talk) — 13:17, 7 February 2014 (UTC)[reply]
See also the List of user preferences in MediaWiki to be removed. Helder 13:19, 7 February 2014 (UTC)[reply]

Justify paragraphs

Last night, out of the blue, text justification stopped working (about 14 hours ago) and all articles suddenly appeared with staggered text. I don't like it. I seem to recall that justification was a personal preference, but no such option shows up in Preferences (any longer...?). Since I'm fed-up with searching for a solution and I have work to do, I'm asking here: Is someone fooling around with skins again or what? André Kritzinger (talk) 10:35, 7 February 2014 (UTC)[reply]

It does seem that option has been removed; it used to be under the Appearance tab. Edokter (talk) — 11:00, 7 February 2014 (UTC)[reply]
  • I've reopened the ticket on Bugzilla (tracked ⇒) as this was supposed to have the option still available as a gadget before it was removed from core. — {{U|Technical 13}} (tec) 11:08, 7 February 2014 (UTC)[reply]
    Several options formerly at Preferences → Appearance → Advanced options have gone west in the last few months, some quite recently. These include: Format links to non-existent pages like this (alternative: like this?); Show table of contents (for pages with more than 3 headings); Disable browser page caching; Enable "jump to" accessibility links; Justify paragraphs; Enable collapsing of items in the sidebar in Vector skin; Exclude me from feature experiments; Enable font embedding (Web fonts). --Redrose64 (talk) 11:48, 7 February 2014 (UTC)[reply]
    Gadgets are added by local administrators, not developers. Bugzilla is not the right place to ask for them. You can requests new gadgets for this wiki at Wikipedia:Gadgets/proposals. – PartTimeGnome (talk | contribs) 00:25, 8 February 2014 (UTC)[reply]
While the fiddlers are fiddling, my opinion for what it's worth. Justified paragraphs should be the default, not the option. Newspapers, magazines, books, you name it, have used justified paragraphs since long before the grandparents of anyone alive today were born. It simply looks neater. Aligned left went extinct with the old Remington typewriter. Make aligned left, centered or aligned right the optional preference for those readers who prefer it that way. I'm pretty sure they are not the majority. André Kritzinger (talk) 12:29, 7 February 2014 (UTC)[reply]
See also:
Helder 13:19, 7 February 2014 (UTC)[reply]
The CSS is
#article, #bodyContent, #mw_content { text-align: justify; }
just put that in Special:MyPage/common.css. --Redrose64 (talk) 14:05, 7 February 2014 (UTC)[reply]
Thanks, Redrose64 and all, but that's not good enough. It is a preference option and still belongs in Preferences → Appearance → Advanced options, where it used to be until it was fixed into being broken. Just how many users do you think are aware of Special:MyPage/common.css? I wasn't, and when I click on it I get to "Wikipedia does not have a user page with this exact name". So now I and countless others who are unaware of its existence need to create a new page to replace an option that used to require ticking a box. Plus, setting personal preferences must now be done in multiple locations. That's not progress. I just love the "wham, bam, thank you mam" attitude displayed about this matter at Bugzilla. André Kritzinger (talk) 15:14, 7 February 2014 (UTC)[reply]
Thank you for that Redrose, but I think it's ridiculous that this option has been removed from preferences. Justified text looks so much more professional and I can't believe it isn't the default on WP, let alone not even an option any more... --Loeba (talk) 19:49, 7 February 2014 (UTC)[reply]
Justified text is actually a poor choice for web readability as it creates "gutter" effects for readers who have dyslexia and certain forms of macular degeneration. Accordingly, it is not set by default here. I should imagine that the preference was removed because maintaining preference bloat is something we don't really want to deal with.--Jorm (WMF) (talk) 00:32, 8 February 2014 (UTC)[reply]
  • I agree. It should be the default, and it should certainly be an available preference. But more importantly, this kind of interface change should not be made without visible, public discussion and clear consensus. This change should be reverted. DES (talk) 23:51, 7 February 2014 (UTC)[reply]
I fear we are heading for another Visual Editor-esque debacle with these typography changes. There needs to be an easily locatable place, widely broadcast (not stuck away in a dark corner of mediawiki), where they can be discussed and user preference actually taken into account. I and many others like justified text, and I and many others like full page width text, so these should be available as preferences if not the default.--ukexpat (talk) 02:48, 8 February 2014 (UTC)[reply]
I seem to remember someone saying only 86 users enabled this option, but I lost where. In any case: Oh look here. Edokter (talk) — 11:21, 8 February 2014 (UTC)[reply]

There's something that's being lost sight of here. The no-longer-existing options were available to registered users, therefore mainly contributors/editors. So now the situation exists where some contributors/editors (definitely not all) are aware of a gadget/snippet that will allow them to see the fruits of their labour displayed in a professional-looking justified format. In the meantime, the many millions of users who use Wikipedia but will never register (the customers) can only see a primitive non-justified version. We're not doing all this work just for our own benefit, are we? André Kritzinger (talk) 12:10, 8 February 2014 (UTC)[reply]

Justified text is common in printed texts where it can be controlled by the publication, but not in websites where it's controlled by the reader's browser, usually with poor results. Here is a url for Wikipedia with the "Justify Paragraphs" gadget at Special:Preferences#mw-prefsection-gadgets: https://en.wikipedia.org/wiki/Wikipedia?withCSS=MediaWiki:Gadget-JustifyParagraphs.css. I tried five browsers and it looked bad in all of them with large spaces between words, especially on lines which are shortened due to images, tables or columns. PrimeHunter (talk) 04:18, 9 February 2014 (UTC)[reply]
  • The spacing between words is part of what I like about justified paragraphs. It makes the text easier to read because it is less of a wall of text. Those spaces make it easier for me to keep my attention on what I'm reading and not lose which line I am on. It's about accessibility to me. — {{U|Technical 13}} (tec) 04:54, 9 February 2014 (UTC)[reply]
p {word-spacing:3px;}

Portal:Current events displaying "Invalid time"

Resolved
 – Redrose64 (talk) 07:31, 7 February 2014 (UTC)[reply]

Portal:Current events is currently showing "February -7, 2014 (Error: Invalid time.)" I'm guessing the "invalid time" is related to the (invalid) date of "February -7" but I don't know how to fix it. Can someone knowledgeable please fix it? Thanks. DH85868993 (talk) 06:50, 7 February 2014 (UTC)[reply]

This edit fixed it at source; I did a WP:PURGE of all transcluding pages so that it displays correctly. --Redrose64 (talk) 07:31, 7 February 2014 (UTC)[reply]
Thanks. DH85868993 (talk) 08:41, 7 February 2014 (UTC)[reply]

CSS bug in IE 8 and 9

Core CSS has been migrated to use LESS, and the Vector skin has had some updates, i.e. it now uses SVGs for the small icons (watch, arrow, user icon). It also uses a linear-gradient for the page background, but they forgot to use a PNG fallback for browsers that do not support gradients (IE8/9). That is why the background behind the tabs may look funky. I have submitted a bug. Edokter (talk) — 11:59, 7 February 2014 (UTC)[reply]

IE does support some form of linear gradient, but still not using the beta-standardized method. There are workarounds. But for now linear gradients should not be essential to the presentation of MediaWiki; if they are not rendered, a plain background color should be enough. Linear gradients are fun, but not required. You should avoid them for coloring large buttons; they should only be used to add pseudo-3D lighting effects to an existing background color (for example, a grey button should look OK even if it's plain grey, without the gradient shades of grey that create the 3D lighting effect). verdy_p (talk) 12:55, 10 February 2014 (UTC)[reply]
The move is towards making the Vector skin completely independent of background images, so eventually the page background and tabs will be rendered solely using CSS gradients. Still brewing on a suitable fallback, perhaps using IE filters, but I think it will be CSS-only in the future. Edokter (talk) — 15:57, 10 February 2014 (UTC)[reply]
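One common fallback technique relies on the CSS cascade itself: a browser that does not understand linear-gradient() drops that declaration and keeps the earlier one. A sketch only (not the actual Vector stylesheet; the selector and image URL are hypothetical):

```css
/* Sketch of a gradient fallback chain (hypothetical selector and image). */
.mw-head-background {
    background-color: #f6f6f6;                            /* last-resort plain color */
    background-image: url(page-fade.png);                 /* PNG fallback for IE8/9 */
    background-image: linear-gradient(#ffffff, #f6f6f6);  /* wins where supported */
}
```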

User Contributions: Accept Revision

Does Wikipedia already have a way to display a list of "Accept Revision" approvals given by Reviewer? If not, this feature is needed. I need it because I believe I have approved revisions in violation of WP:DOY and I wish to remove some additions to some pages that I should not have approved. By the way, am I posting this question and request to the right place? If not, where should I have posted it? —Anomalocaris (talk) 18:10, 7 February 2014 (UTC)[reply]

You can use the review log. You can "unaccept" a revision, but unless there are no newer reviewed revisions, it's kind of pointless. You don't have to unaccept an edit to revert it. Mr.Z-man 18:31, 7 February 2014 (UTC)[reply]
Thanks. I reviewed my changes and did what I needed to do. —Anomalocaris (talk) 21:17, 7 February 2014 (UTC)[reply]

New problem: I go to Special:PendingChanges and open review links in a new window or the same window. Usually everything is fine, but sometimes, the Accept Revision button is grayed out and can't be clicked. One might think that this is because by the time I get to the page, someone has already approved the revision. Not so. The revision remains awaiting approval. Right now, I am on Ricky Rubio: Difference between revisions and the Approve Changes button is grayed out, but Ricky Rubio Revision history reveals that there are two revisions pending review. What is going on? —Anomalocaris (talk) 23:41, 11 February 2014 (UTC)[reply]

Watchlist question

Is it me, or did the watchlist display how many pages had changed during the duration you have it set for? For example, mine currently states "Below are the changes in the last 72 hours, as of 8 February 2014, 13:44." I'm sure it used to say x number of pages over the last y number of hours. Lugnuts Dick Laurent is dead 13:46, 8 February 2014 (UTC)[reply]

Yes, that's a change; instead of MediaWiki:Wlnote we're getting MediaWiki:Wlnote2. -- John of Reading (talk) 14:45, 8 February 2014 (UTC)[reply]
Thanks John. Is there a way to change it back? Lugnuts Dick Laurent is dead 14:58, 8 February 2014 (UTC)[reply]
@Lugnuts: No, it was a change in the software. Were you using that number for something other than satisfying your curiosity? Maybe there's a better way to do whatever that was :) Matma Rex talk 15:02, 8 February 2014 (UTC)[reply]
I just found it useful to see that I had 96 changes, or whatever the number was, over the given time. Lugnuts Dick Laurent is dead 15:10, 8 February 2014 (UTC)[reply]
I miss that number too. What was the point in removing it? Was it consuming too much cpu? HandsomeFella (talk) 20:19, 9 February 2014 (UTC)[reply]
  • Matma, I also happened to like having that number. What was the purpose of removing it? I'm more curious than anything because I can easily tweak something up in JavaScript to give me the number like:
var changes = 0;
$('.mw-enhanced-rc-nested').each(function(){
    changes++
});
alert(changes);
which tells me how many changes are on the page (yes, I use enhanced-rc, and it could be modified to work for those who don't). — {{U|Technical 13}} (tec) 21:33, 9 February 2014 (UTC)[reply]
as you noted, this works "on the page", i.e., as long as your watchlist changes all fit on one page. i do not know how mw behaves when not all changes fit on one page (i never had such a large watchlist) - maybe there is no such thing as "does not fit in one page" for watchlists. presumably, the old count came from the DB and was not dependent on the page. BTW, there is no need to use "each" and increment the counter if all you want is the count - you can simply ask
$('.mw-enhanced-rc-nested').length
peace - קיפודנחש (aka kipod) (talk) 22:14, 9 February 2014 (UTC)[reply]
Is this going to be fixed soon?
HandsomeFella (talk) 11:29, 10 February 2014 (UTC)[reply]
I suspect that removing it was the "fix". Template:Bug is still open, but this simplification was suggested there. There have been a number of changes recently aimed at simplifying the UI and slightly improving performance, and this feels like that kind of change. WhatamIdoing (talk) 19:34, 10 February 2014 (UTC)[reply]
When refreshing my watchlist just now, I noticed the text <wlnote> in the position that the text in question used to be. So I've undeleted MediaWiki:Wlnote and the status quo ante has returned. --Redrose64 (talk) 19:55, 11 February 2014 (UTC)[reply]
@Redrose64: Is it still using that for you, or was it a temporary glitch? I'm still seeing wlnote2. Jackmcbarn (talk) 20:57, 11 February 2014 (UTC)[reply]
Oh dear. It has gone back to MediaWiki:Wlnote2. But just in case it reverts again, let's keep MediaWiki:Wlnote for a while longer. --Redrose64 (talk) 22:33, 11 February 2014 (UTC)[reply]

Template mess

I updated two articles, Girona railway station and AVE, with the new international train services. This is a mix of AVE and TGV services. The templates do not provide for this. Unfortunately there is no easy way to bypass the templates without rewriting whole articles. Can someone look at this? Smiley.toerist (talk) 12:27, 9 February 2014 (UTC)[reply]

I created Template:S-line/AVE right/, giving it the same contents as Template:S-line/TGV right/. Hopefully that helps. I did not look closely at the details of how these templates work. Wbm1058 (talk) 13:54, 9 February 2014 (UTC)[reply]

Help requested with using Module:String

Given the string

{{Orphan/sandbox|date=August 2013}} {{Unreferenced|date=January 2009}} which is passed into a template as parameter {{{1}}}, I want to replace that string with
{{Orphan/sandbox|multi=multi|date=August 2013}} {{Unreferenced|date=January 2009}}, i.e., add another parameter |multi=multi.

My attempt to do this with

{{Replace|{{{1|}}}|Orphan/sandbox|Orphan/sandbox{{!}}multi=multi}} did not work, as it seems that {{!}} breaks the desired template parameter construction so that the syntax isn't recognized. Of course I need to escape the pipe character with a template so that it isn't confused for the terminus of the string parameter in the {{replace}} template.

Digging deeper, I see that

{{#invoke:String|replace|{{{1|}}}|Orphan/sandbox|Orphan/sandbox{{!}}multi=multi}} is equivalent to the above template usage, as module:String is the underlying module that is used, and I get the same results with that syntax.

The documentation at Module:String#replace offers some hope, in that there is a plain parameter which is a "Boolean flag indicating that pattern should be understood as plain text and not as a Lua-style regular expression." It defaults to plain-text. Can someone provide either a plain-text or Lua-style regular expression solution for me? Looking at the module:String source code might offer some additional documentation or insight into how it works. I only have a bare-bones understanding of how php regex works, and no knowledge of Lua or how Lua regex differs from php regex. Thanks, Wbm1058 (talk) 14:35, 9 February 2014 (UTC)[reply]

Can't be done, unfortunately. The issue is that in template call {{foo|{{bar}} }}, the bar template would be expanded before being passed to foo. You'd have to do a modification of the expanded content of the template to do what you're trying to do. Jackmcbarn (talk) 17:04, 9 February 2014 (UTC)[reply]
Now I get it. Thanks for pointing me in the right direction :) Wbm1058 (talk) 19:46, 9 February 2014 (UTC)[reply]
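Outside the parser, the substitution itself is trivial string work; the obstacle is purely the expansion order Jackmcbarn describes. A sketch in plain JavaScript (illustrative only, not parser or module code; addMultiParam is a hypothetical name) of the replacement applied to raw, unexpanded wikitext:

```javascript
// Insert |multi=multi after the template name, as a plain-text
// (first-occurrence) replacement on raw wikitext.
function addMultiParam(wikitext) {
  return wikitext.replace('Orphan/sandbox', 'Orphan/sandbox|multi=multi');
}

var input = '{{Orphan/sandbox|date=August 2013}} {{Unreferenced|date=January 2009}}';
console.log(addMultiParam(input));
// → {{Orphan/sandbox|multi=multi|date=August 2013}} {{Unreferenced|date=January 2009}}
```

Inside a template, the same replacement fails because {{{1}}} arrives already expanded, so the inserted pipe is never reparsed as a parameter separator.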

@GTBacchus, Isaacl, Jayen466, Obiwankenobi, and Sam and anyone else who might be interested:

Discussion on category intersections seems to have ground to a halt last May. What do we have to do to get this effort moving again? — Scott talk 14:36, 9 February 2014 (UTC)[reply]


requesting support for collapsible template at mr-wikisource

Hi, season's greetings. On mr-wikisource we intend to use a hide/show collapsible template, s:mr:Template:इतरभाषीउतारा, to hide content that is in a language other than Marathi (the template content is translated below on the same wikisource article page). When the reader clicks "show", it is supposed to display the content inside. We are coming across three difficulties.


The difficulties we are coming across are as follows:

1) When I am not signed in, the template remains closed, but when I click "show" it does not display the content; it remains closed.

2) When I sign in, the template remains open and does not close. Whether signed in or not, what we need is for the template content to be hidden by default and to show up when we click "show".

  • A similar template based on the same concept works well on mr-wikipedia. I use Windows 7 and Firefox with the latest auto-updates.

3) In the same template I want to use 5 more sets similar to |group2 = इतरमजकूर |list2 = {{{इतरमजकूर}}} |group3 = मजकूर |list3 = {{{मजकूर}}}; the required headings which did not work are shown on the talk page of the template.

We request that someone help us out, since the mr-wikisource community is small and further trial and error on our side would take more time to accomplish the task.

Thanks and warm regards

Mahitgar (talk) 15:20, 9 February 2014 (UTC)[reply]

If I enter {{Template:इतरभाषीउतारा}} at s:mr:Special:ExpandTemplates and copy the produced code to the English Wikipedia then the navbox becomes collapsible. This makes me suspect the problem is somewhere in the code for collapsible tables in s:mr:MediaWiki:Common.js versus MediaWiki:Common.js, but I haven't compared the code. PrimeHunter (talk) 02:28, 10 February 2014 (UTC)[reply]
I believe I have fixed all three issues. I used jQuery.makeCollapsible instead of the code in Common.js, so that should no longer be the issue here. I will probably take a look at this again tomorrow; right now I am a little sleepy.--Snaevar (talk) 02:55, 10 February 2014 (UTC)[reply]


Hi, thanks. The collapsible issue is now sorted out, so points 1 and 2 are addressed. For the third point, I copy-pasted Snaevar's improvements from the template talk page to the main template page, s:mr:Template:इतरभाषीउतारा. The template behaviour is a bit improved, but on the article namespace page the content filled in tabulated form is still not visible. Both of you are taking so much effort; it's really nice of you.

warm regards

Mahitgar (talk) 06:45, 10 February 2014 (UTC)[reply]

I don't know the mr script, and that makes it hard to examine, but some of your parameter names are not identical between the template call and the template's own code. I don't know which version is correct, so I'm not fixing it, but if you copy-paste from one to the other then you should get identical names and avoid issues like a zero-width non-joiner or spaces versus underscores. PrimeHunter (talk) 14:02, 10 February 2014 (UTC)[reply]

Reviewing broken by early edit?

Hey there. Yesterday, I made a new article on an asteroid called 2013 JX28, and was waiting for it to be reviewed, when another user edited it, oblivious to the fact that it was pending review. Typically, I think that it takes 10-30 minutes for a person to review an article, but because he'd edited but not reviewed it, I think it threw the whole process off. Now the article seems 'broken', stuck in an intermediate stage between creation and review. I've contacted the person who made the edit and we're trying to fix the problem, but meanwhile I don't have much more than a basic knowledge of programming/coding, and don't know if there's a protocol for this sort of thing. Can anyone help here, or give suggestions on what to do to fix this sort of problem? Thanks for your help. exoplanetaryscience (talk) 18:04, 9 February 2014 (UTC)[reply]

I don't think there's a problem. -- Ypnypn (talk) 19:54, 9 February 2014 (UTC)[reply]
I guess you're talking about the new pages patrol; reviewing is something else. I don't see anything that would stop your article being patrolled. It is still marked as awaiting patrol in the new pages list. What did you think was broken?
If it's just that the page is still un-patrolled, remember that patrollers are volunteers, each with their own interests and ways of working. This means that the time it takes for a new article to be patrolled can vary by a lot. I can see articles from 11 January that are still un-patrolled, while one created 2 minutes ago has already been patrolled. – PartTimeGnome (talk | contribs) 23:01, 9 February 2014 (UTC)[reply]
Ha! While I was typing that, User:Huon patrolled the page! – PartTimeGnome (talk | contribs) 23:06, 9 February 2014 (UTC)[reply]
Some pages that are apparently patrolled very quickly might not have actually been independently patrolled at all. If a user with the autopatrolled right creates a new page, it's still entered into the list at Special:NewPages but does not have the yellow background which denotes an unpatrolled page. The way to distinguish it from a new page that was patrolled quickly is by looking in Special:Log/patrol, to see if it's also listed there as "automatically marked revision x of page Foo patrolled"; it will also credit that action to the page's creator. --Redrose64 (talk) 09:55, 10 February 2014 (UTC)[reply]
Good point, I forgot about that. I can't remember which article was "created 2 minutes ago", so I can't go back to check it. Ignoring auto-patrolled pages, I currently have to go back 24 minutes to find a page that was manually patrolled. – PartTimeGnome (talk | contribs) 01:28, 11 February 2014 (UTC)[reply]

Odd post-deletion behaviour

Please check MediaWiki talk:Deletedtext#Weird_glitch?. I don't know if the weird behaviour I encountered is related to parser functions or to transcluding an overly template-heavy talk page, but I would like to know what happened there. —Kusma (t·c) 19:23, 9 February 2014 (UTC)[reply]

TOR, proxies and whatever

This is a simple curiosity; I'm not sure this question belongs here, but anyway. If Wikipedia is able to block any given proxy, Tor (not sure about this) and other methods, how is it that China, Iran and other countries cannot? Hóseás (talk) 22:01, 9 February 2014 (UTC)[reply]

I don't know how TOR blocks work, but we have a bunch of volunteers who check for open proxies. I'm pretty certain there are a lot of open proxies we don't block because we don't know about them. Such unknown proxies will probably be discovered and blocked as we start getting suspicious edits from them, or if they appear on a public list of open proxies that one of our volunteers checks. – PartTimeGnome (talk | contribs) 22:13, 9 February 2014 (UTC)[reply]
  • With Tor you have two kinds of nodes: access nodes and exit nodes. When you connect to Tor you use one of the access nodes, and are then routed through several other nodes until you reach an exit node. Identifying and blocking editing from exit nodes is fairly straightforward on our end. But governments and other entities that want to block access to Tor would need to know all the access nodes and block access to those; that information is not easily obtained, while exit node information tends to be more public. Werieth (talk) 22:18, 9 February 2014 (UTC)[reply]
I don't think that Tor is an issue if we only block editing from the IPs of its exit nodes. But we will want registered users to be able to connect to their accounts, which protects them from non-public tracking by China, Iran or Syria, and allows contributors from these countries to benefit from freedom of use of Wikimedia sites.
So it's OK to block Tor proxies as long as users can register an account from Tor and then use it, making it a bit more certain that their account will not be traced by dictatorships back to the time when they registered their publicly visible account. Of course, these users won't be able to contribute anonymously under the IP of a Tor exit node. But there's nothing wrong with them logging on via Tor to their account, and then using that account to read/write content in confidence within Wikimedia sites. We will still monitor these users like any other registered user, with their effective IP hidden and kept private (nobody should know publicly that this user was contributing to or visiting Wikimedia via Tor, just as nobody gets to know the IP used by regular logged-in users).
Tor is good for Wikimedia projects in my opinion, and we should even promote it for users residing in countries in severe political trouble, or who fear social/political/religious harassment (e.g. LGBT users and independent reporters in Russia or Iran, defenders of Falun Gong or Tibetan supporters in China, or women interested in education resources in Islamic countries, such as birth control or even basic sexual education for young people, or people simply reading news from foreign sources not censored by their government).
Using Tor with a registered account does not mean that the logged-in user can do anything they like on Wikimedia. They have to follow the same policies as any other registered user, because there's strictly no difference.
However, Tor may be used by regular users to create sock puppets. But checkuser admins can identify them (it's not up to the normal community to check this). If there are clear signs that sock puppets are being created and used via Tor by some user that has been blocked on Wikimedia (this does not concern a lot of people), the registered account created and used via Tor will be blocked like any other account. verdy_p (talk) 12:48, 10 February 2014 (UTC)[reply]
The problem of course is that if the sockpuppeteer is using Tor, then there is no way to block them from creating more accounts. This is why we currently prevent all editing from Tor, not just anonymous editing; at least on the English Wikipedia. Users with a legitimate reason for using Tor can get IP block exempt on their account. This has been discussed quite a bit on the mailing list recently. The problem is that to sufficiently mitigate the potential for abuse, we need to be able to associate Tor user accounts with some form of unique-ish identifying information other than an IP, but doing that kind of defeats the purpose of using Tor in the first place. Mr.Z-man 14:40, 10 February 2014 (UTC)[reply]
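The asymmetry Werieth describes above (exit-node lists are public, access nodes are harder to enumerate) is why blocking edits from Tor is straightforward on the wiki side. A minimal sketch of that kind of exit-node check, using made-up placeholder addresses rather than a real exit list:

```python
# Sketch of exit-node blocking as described in this thread: Tor's
# exit nodes are publicly listed, so a site can refuse *edits* from
# those IPs while leaving reading open. The addresses below are
# documentation-range placeholders, not real Tor exit nodes.

def load_exit_nodes():
    # In practice this would be fetched from Tor's published exit
    # list and refreshed periodically; here it is a fixed sample.
    return {"192.0.2.10", "198.51.100.7", "203.0.113.99"}

def may_edit(ip, exit_nodes):
    # Reading stays open for everyone; only editing from a known
    # exit node is refused.
    return ip not in exit_nodes

nodes = load_exit_nodes()
print(may_edit("192.0.2.10", nodes))   # known exit node -> edit blocked
print(may_edit("192.0.2.55", nodes))   # ordinary address -> edit allowed
```

A real deployment would also have to cope with the list going stale between refreshes, which is one reason exit-node blocks are never perfectly airtight.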

Which TCP ports does Wikipedia use?

If I wish to configure a firewall to allow only the TCP ports needed to access Wikipedia, Wikimedia, Wiktionary, etc., how many ports do I need to leave open? Obviously I need to allow port 443 (HTTPS) and port 80 so HTTP traffic can be redirected to HTTPS, but are there any other ports in use?

I did a port scan (not definitive, especially on large systems with load balancing and caching servers), and according to the scan, en.wikipedia.org appears to use:
TCP Port 80 (HTTP)
TCP Port 179 (Border Gateway Protocol)
TCP Port 443 (HTTPS)
TCP Port 8649 (Ganglia)
I don't think that an ordinary user has any need to access 179 or 8649. --Guy Macon (talk) 02:46, 10 February 2014 (UTC)[reply]

Correct. You only need 80 and 443. Edokter (talk) — 10:04, 10 February 2014 (UTC)[reply]
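The resulting firewall policy is simple enough to state as an allowlist. A minimal sketch of that decision logic, assuming a firewall that filters per destination port (the actual enforcement happens in the kernel or on the router, not in a script like this):

```python
# Models the client-side firewall rule discussed above: permit
# outbound TCP to ports 80 and 443 only, and drop everything else,
# including 179 (BGP) and 8649 (Ganglia), which ordinary readers
# never need to reach.

ALLOWED_TCP_PORTS = {80, 443}   # HTTP (for the redirect) and HTTPS

def allow_outbound(port):
    return port in ALLOWED_TCP_PORTS

for port in (80, 179, 443, 8649):
    verdict = "allow" if allow_outbound(port) else "drop"
    print(f"tcp/{port}: {verdict}")
```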

09:30, 10 February 2014 (UTC)

Please participate in the discussion. Thanks. --Gryllida (talk) 22:34, 10 February 2014 (UTC)[reply]

Bizarre array of errors

Over the past 30 minutes or so, I have been barraged with a bizarre array of errors, including "site down" (due to too many connections), "service unavailable (HTTP 503)", an internal PHP error, and an internal database error. Is the site undergoing some form of attack? Or is someone just testing code patches on the live site? WikiDan61ChatMe!ReadMe!! 22:38, 10 February 2014 (UTC)[reply]

I got all those messages also. Happens now and then over the last few months. — Maile (talk) 22:40, 10 February 2014 (UTC)[reply]
Yes, I got some WMF servers not available messages just several minutes ago. And some odd database errors. ~ J. Johnson (JJ) (talk) 22:53, 10 February 2014 (UTC)[reply]
According to the Operations team there was a database lockup around 2014-02-10 22:10 UTC - the root cause is still being investigated. Sorry for the inconvenience and thanks for reporting it here! --AKlapper (WMF) (talk) 09:25, 11 February 2014 (UTC)[reply]

Forced "https" on wikis

I generally run Wikipedia over "http" because I'm secure about my connection, but over the past few days it seems like Wikipedia is now FORCING users to use "https". I even checked my preferences, thinking "Always use a secure connection when logged in" may have been enabled by mistake, but it wasn't. Any reason why Wikimedia is forcing people to use the https version? Technically the http version loads faster, so I always preferred using that. Is there any way to get back to using http? This seems to be happening on all Wikimedia wikis.--Stemoc (talk) 00:22, 11 February 2014 (UTC)[reply]

  • For many usernames, https/secure protocol has already been forced because http-mode does not retain the username in some browsers. Otherwise, IP-address users (21% of edits) have for months been able to run and edit in http-mode. Also, some high-security browsers lose the prior edit-buffers due to security restrictions, so a mistake in editing could lose all changes unless they were copied to an external file beforehand. Also, many people fail to omit the "https:" prefix in diff-links (as just "//en.wikipedia"), and that is another reason users keep getting forced back into https/secure protocol. I opposed the whole https obsession at the outset and noted that even the U.S. Library of Congress did not support https mode. I keep suggesting that we improve important problems instead, such as auto-merging wp:edit-conflicts to adjacent lines in diff3.c, or LIFO-stack replies, or increasing the wp:expansion depth limit to 60 or 90, or increasing the Lua-timeout limit from 10 seconds to 30, but instead we get https-mode confusion, or increased mobile-phone support where almost no one edits from cellphones (and people really should not be encouraged to phone-edit WP while driving).... -Wikid77 00:59, 11 February 2014 (UTC)[reply]
So they broke it even more instead of fixing it? So there's no workaround? HTTPS is quite useful for email, online banking and social networking sites, as that's where you share private information, but seriously, not Wikipedia; 90% of users are already incognito. Wikipedia should be the last place to have this, as it's one of those sites which runs on overdrive; it can get millions of views in an hour sometimes, and that would be even harder for the servers to handle if they ran over a secure connection. By the way, Wikipedia server admins: Wikipedia is made for an INTERNATIONAL audience, which means it's being used by people from all over the world and not just First World/developed countries where internet speed is not slow and data is not limited. So please think of the poor kids in underdeveloped countries who have no idea why their much-beloved wiki is running slow and is constantly getting hit by "Wikimedia errors". Most sites I'm part of have moved on to the "more secure" https, but those are "commercial" sites; Wikipedia is NOT, so let's keep it that way, free and fast.--Stemoc (talk) 01:30, 11 February 2014 (UTC)[reply]
There is one piece of private information sent with every request you make to Wikipedia while logged in: your login cookie. If someone else intercepts this, they can impersonate you on Wikipedia and make edits using your account (which could lead to you being blocked). If you are aware of this and don't mind that risk, then fair enough. I just want to make sure you are aware of the risks of using an unencrypted connection while logged in. – PartTimeGnome (talk | contribs) 01:52, 11 February 2014 (UTC)[reply]
(edit conflict)Er, even if you feel secure about your connection (to your ISP), how can you feel secure about all the connections between your ISP and Wikimedia? Even if you have convinced yourself that those links are all free from any risk of eavesdropping, the route from your ISP to Wikimedia can change without notice. You can only have a secure connection to Wikipedia by being sat inside a Wikimedia data centre, or by using some form of end-to-end encryption such as HTTPS or a VPN tunnel. (Personally I don't trust the connection to my ISP. I've seen too many roadside telecom cabinets with their doors wide open...) – PartTimeGnome (talk | contribs) 01:52, 11 February 2014 (UTC)[reply]
I have had this discussion before. It hops around 21 IP addresses, and I live in a country where no one wants to hack your Wikipedia account, lol. I'm not on some insecure wifi, and there is already an option in Special:Preferences for those wanting to use https. I don't use it, and yet I'm forced onto https. I'm aware of the dangers, but it's the same as crossing the road; should I just stop crossing the road? Not only is https slow, it can also cause errors because of that slowness, like it did when I posted this topic twice. Maybe they should just enforce https for the mobile platform (m.en.wikipedia.org) ONLY, as it's by far MORE INSECURE than via PC...--Stemoc (talk) 02:16, 11 February 2014 (UTC)[reply]
According to meta:HTTPS you are forced to use HTTPS for login only, not for anything else. The disadvantage of this behavior is currently unclear to me. --AKlapper (WMF) (talk) 13:19, 11 February 2014 (UTC)[reply]
Meta is just wrong. When logging in via http you don't get logged into other wikis, and if you dare to log in to those wikis a forceHTTPS cookie is set for all Wikimedia domains, overriding your preferences. At that point your preferences are useless; the only way to fix it is to manually go into your cookies and delete all of the forceHTTPS cookies, which then logs you out of the other wikis and you start back where you began. Werieth (talk) 13:32, 11 February 2014 (UTC)[reply]
I see. Am I correct that this covered by bugzilla:61048? --AKlapper (WMF) (talk) 13:48, 11 February 2014 (UTC)[reply]
Close, but not 100%, because a forceHTTPS cookie is set. I think it may have to do with the inability to globally disable HTTPS connections, and with CentralAuth no longer logging you in via unsecured connections. Werieth (talk) 13:52, 11 February 2014 (UTC)[reply]
I found that at least one of the CentralNotice banners (I forget which; I think that one of them was to do with wmuk:) contains a link that isn't a straight Wikilink, but goes through some intermediate layer that sets the forceHTTPS cookies. But whether you've been clicking those links or not, if you want to use http: for all projects, you also need to disable "Always use a secure connection when logged in" individually at every site that you are likely to visit, such as meta, Commons, etc. --Redrose64 (talk) 17:37, 11 February 2014 (UTC)[reply]

Confirmed http works when https gets 504 Gateway Time-out

Indeed, I have also confirmed that http-mode edit-preview will continue (although very slowly) for large pages when the https/secure protocol hits a fatal wp:504_Gateway_Time-out. Thanks again to Stemoc for the reminder above. By running 25 tests of edit-preview over 2 hours, alternating http versus https, I confirmed that http-mode always worked, while https/secure always cratered with "504 Gateway Time-out" when editing the large page "Nash equilibrium" (re: John Forbes Nash, Jr., A Beautiful Mind). In all cases, http-mode worked within 75-95 seconds realtime (2-3 CPU seconds), but the https/secure edit-preview session always failed, although a direct save of changes would sometimes work. Hence, users can edit-preview in http-mode, then redo the edit over https/secure and save to get their username in the history log. -Wikid77 (talk) 09:04, 11 February 2014 (UTC)[reply]

Files not recognized as such

I realize that this is a small error, but I'd like to get it fixed: there are files showing up at Category:NA-Class Bacon articles when they should be showing up at Category:File-Class Bacon articles. I seem to recall having this problem in the past, but I don't recall what the solution was. Northern Antarctica (talk) Previously known as AutomaticStrikeout 04:03, 11 February 2014 (UTC)[reply]

We have a WikiProject for bacon? {{WikiProject Bacon/class}} has file=no. I don't know the purpose of this but removing it should make the category change you want. PrimeHunter (talk) 04:47, 11 February 2014 (UTC)[reply]
Well, I've made the change, but it still doesn't seem to be working. Maybe it doesn't take effect immediately. Northern Antarctica (talk) Previously known as AutomaticStrikeout 04:55, 11 February 2014 (UTC)[reply]
Actually, I think the problem is fixed. Thanks for your help. Northern Antarctica (talk) Previously known as AutomaticStrikeout 05:00, 11 February 2014 (UTC)[reply]

Switching template colours

Hey. Could someone help me with {{Infobox power station/sandbox}} (sample usage)? The template has 6 options (the six types of power stations). I am trying to use an IF function to do this: For example, if the "Wind farm" section is filled, the template colour (and thus the template header and footer sections) would switch to the colour of the "Wind farm" section (blue). I understand that we would then have to put in the name of the colour in two places (which won't be an issue). Thanks! Rehman 13:36, 11 February 2014 (UTC)[reply]

Can you help in putting that in? I'm not really familiar with that... Rehman 14:13, 11 February 2014 (UTC)[reply]
{{Infobox power station}} shows a lot of parameters but I don't see one for type. Is the template supposed to say "If at least one parameter in the wind farm section is used then it's a wind farm", and so on? It appears so from the current code where I see stuff like {{#if:{{{wind_turbines|}}}{{{turbine_manu|}}}{{{turbine_model|}}}{{{hub_height|}}}{{{rotor_diameter|}}}{{{rated_wind_speed|}}}. And that's only to determine one of the types. It looks like it would require a giant test in two places to choose the color. Maybe it would be simpler and more stable to make a subtemplate and pass all parameters there, plus one extra parameter which deduces the type. Then the subtemplate could also use that parameter instead of its own type determining code. But it seems the template should have been designed with a type parameter from the start. PrimeHunter (talk) 14:17, 11 February 2014 (UTC)[reply]
Exactly, it's the case of "If at least one parameter...". The series of code you quoted is basically an IF situation where the section label gets generated if one of those is filled. Is it possible for the colour function to be derived directly from these labels instead? Hope I make sense. Rehman 14:43, 11 February 2014 (UTC)[reply]
  • Okay, now that I've actually had a moment to look the template over, why not add a |station_type= which would allow you the ability to simply change the header/footer color to {{#switch:{{{station_type}}}|wind=this color|solar=this color|...=...|#default=orange}}? — {{U|Technical 13}} (tec) 16:06, 11 February 2014 (UTC)[reply]
There should be no pipe right after #switch:. The challenge is the template already has 1874 transclusions. We would need a bot (or very patient editor) to go through them and add the right type parameter. PrimeHunter (talk) 16:29, 11 February 2014 (UTC)[reply]
  • I have AWB and no issue with updating the transclusions one at a time once the new code is tested. — {{U|Technical 13}} (tec) 19:24, 11 February 2014 (UTC)[reply]
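The {{#switch:}} Technical 13 proposes above is essentially a lookup table with a default case. The same dispatch expressed in Python, with made-up colour values for illustration (the real colours would come from the infobox's existing section headers):

```python
# Models the proposed {{#switch:{{{station_type}}}|wind=...|#default=orange}}
# logic: map a station type to a header colour, falling back to a
# default for unknown or empty types. Colour values are hypothetical.

STATION_COLOURS = {
    "wind": "#87ceeb",    # e.g. the blue of the wind-farm section
    "solar": "#ffdf00",
    "hydro": "#66cdaa",
}

def header_colour(station_type, default="orange"):
    # dict.get plays the role of #default in the parser function
    return STATION_COLOURS.get(station_type, default)

print(header_colour("wind"))      # -> #87ceeb
print(header_colour("nuclear"))   # -> orange (falls through to default)
```

As with the wikitext version, the hard part is not the switch itself but populating |station_type= across the existing transclusions.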

RFC: Should display equations be centered?

The recent update to the MathJax code means that MathJax display equations are now centered, as opposed to Texvc equations and previous MathJax versions, which were left-aligned. Should display equations be left-aligned, centered, or configurable using the displaystyle feature? A WP:RFC has been started at Wikipedia talk:WikiProject Mathematics#Should display equations be centered?.--Salix alba (talk): 14:28, 11 February 2014 (UTC)[reply]

Images

For some reason every image I upload is now on my watchlist. I've removed them but can somebody tell me how to disable it, I can't seem to find the box in my preferences.♦ Dr. Blofeld 14:33, 11 February 2014 (UTC)[reply]

@Dr. Blofeld: Special:Preferences#mw-prefsection-watchlist, uncheck "Add pages I create and files I upload to my watchlist". Matma Rex talk 14:57, 11 February 2014 (UTC)[reply]
The odd thing is that none of the boxes are ticked yet the images I uploaded today all went on my watchlist..♦ Dr. Blofeld 15:14, 11 February 2014 (UTC)[reply]

Template:Edit protected script errors..

Template:Edit protected/testcases

@Jackmcbarn and Mr. Stradivarius: since I know you've both worked on this project. For some reason, most transclusions of this template are returning script errors. I tried reverting the most recent edit to the module but it either hasn't gone through the job queue yet or didn't fix it. Could you guys, or any other skilled Lua coder look into this please? Thanks. — {{U|Technical 13}} (tec) 19:49, 11 February 2014 (UTC)[reply]

I don't see any script errors, so I've reverted the revert for now. Where did you see them at? Also, I've un-transcluded /testcases here, as it caused this page to be miscategorized. Jackmcbarn (talk) 20:32, 11 February 2014 (UTC)[reply]
  • The /testcases page I transcluded was showing all script errors except for two instances, so maybe my undo fixed it once it made it through the job queue. We should know for sure if the errors show back up after your revert. I'll just screenshot it next time. Otherwise I've no idea what caused it, how to replicate it, or how to fix it.  :) — {{U|Technical 13}} (tec) 20:38, 11 February 2014 (UTC)[reply]
    If you do see it somewhere else, make sure you click on "Script error" and include the information that comes up in the screenshot. Jackmcbarn (talk) 20:45, 11 February 2014 (UTC)[reply]
    To see the effect of a change to a script or template without waiting for the job queue, purge the test cases page. (I have done so, and it still looks fine to me.) You can also preview the effect of a script/template change without even saving the change, using the "Preview page with this template (what's this?)" options beneath the edit area. – PartTimeGnome (talk | contribs) 21:34, 11 February 2014 (UTC)[reply]
@Jackmcbarn: The only one that I saw was at about 20:12 today, on Talk:Cat Creek, Montana#Edit request - it showed Script error where the {{Edit template-protected|ans=n}} template should have been displayed. I didn't click it; but I did notice that it had put the page into Category:Pages with script errors. A WP:NULLEDIT fixed it. --Redrose64 (talk) 22:23, 11 February 2014 (UTC)[reply]
  • I see a lot of talk pages transcluding Module:Protected edit request in Category:Pages with script errors, but it looks like the script errors have all disappeared now. The edit that Technical 13 reverted was made too early (7 Feb) to have been the cause. The script has definitely been working since then, as I have been answering edit requests without seeing any script errors. It's most likely to have been caused by a server-side change, but I can't see anything likely in the roadmap either. — Mr. Stradivarius ♪ talk ♪ 00:06, 12 February 2014 (UTC)[reply]

Edit Protecting

I'm new to Wikipedia. How do I protect my personal page so it can't be vandalized? — Preceding unsigned comment added by Justmephotography (talkcontribs) 22:36, 11 February 2014 (UTC)[reply]

@Justmephotography: You can't. What you can do is file a request at WP:RFPP, but unless you can demonstrate that there is a genuine need for the page to be protected, I suspect that they will decline the request. --Redrose64 (talk) 23:24, 11 February 2014 (UTC)[reply]
Actually, user pages are usually protected on request by the user; you don't need a special reason. User talk pages, on the other hand, are almost never protected. The relevant policy is at WP:UPROT if anyone wants to see it. — Mr. Stradivarius ♪ talk ♪ 23:34, 11 February 2014 (UTC)[reply]