
Wikipedia:Reference desk/Archives/Computing/2012 September 24

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


September 24[edit]

Google book text[edit]

I can't view text using Firefox, but can with Chrome. Is this due to some setting, or is it part of Google's plans for world domination? Clarityfiend (talk) 07:43, 24 September 2012 (UTC)[reply]

I viewed text in books on Google Books using Firefox 5 seconds ago. Without knowing anything about what specifically you mean by 'I can't view text using Firefox', there's no real way we can say what your specific problem is, although the latter is unlikely. Nil Einne (talk) 09:08, 24 September 2012 (UTC)[reply]

Reports of similar issues on various support websites suggest it is a problem with JavaScript not being interpreted properly. The solution seems to be to create a new profile in Firefox. AvrillirvA (talk) 10:45, 24 September 2012 (UTC)[reply]

Resetting worked. Thanks. Clarityfiend (talk) 22:22, 25 September 2012 (UTC)[reply]

Sending large data files[edit]

Thanks to everyone involved. Good effort. Globalistcontributor (talk) 05:10, 26 September 2012 (UTC)[reply]

I have huge files that I want to send, but Gmail caps attachments at 25 MB. I tried compressing them with WinRAR, but they're still 2 gigs. What's the easiest way for me to transfer the files? (Assume very low computer literacy on my part.) Thanks, reference desk. Globalistcontributor (talk) 15:24, 24 September 2012 (UTC)[reply]

You'll probably have to upload it to some site intended for purposes like this. A (rather old, but still relevant) Guardian article that discusses this is here. -- Finlay McWalterTalk 15:30, 24 September 2012 (UTC)[reply]
We have a list of such services at Category:Email attachment replacements. Most of these are intended for people without a high level of technical knowledge, so they should be fairly straightforward to use. -- Finlay McWalterTalk 15:33, 24 September 2012 (UTC)[reply]

Mediafire. You would need to split the archive into 195MB parts as Mediafire has a 200MB per file limit (195MB to be on the safe side). WinRAR has this option under the "Split to volumes" section of the compression dialogue box. Type 195 into the box and change "B" to "MB". It will output ~11 parts which can then be uploaded to Mediafire, then downloaded and recombined with WinRAR at the other end. AvrillirvA (talk) 15:40, 24 September 2012 (UTC)[reply]
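The split-and-recombine step described above isn't WinRAR-specific; any tool that cuts a file into fixed-size chunks and concatenates them back in order will do. A minimal Python sketch of the idea (the file names and the chunk size are purely illustrative, not part of anyone's actual setup):

```python
import os

def split_file(path, chunk_bytes):
    """Cut the file at `path` into numbered part files of at most chunk_bytes each."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(chunk_bytes)
            if not chunk:
                break  # end of input reached
            part_name = f"{path}.part{index:03d}"
            with open(part_name, "wb") as dst:
                dst.write(chunk)
            parts.append(part_name)
            index += 1
    return parts

def join_files(parts, out_path):
    """Concatenate the part files, in order, back into a single file."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())
```

WinRAR's "Split to volumes" automates exactly this, with the added benefit of per-volume CRC checks so a corrupted part is detected before recombination.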

  • Well, it really depends on where you want to send it. If the target site supports FTP it can be done directly -- or if you have two Windows machines and you can enable file sharing. Looie496 (talk) 15:55, 24 September 2012 (UTC)[reply]
    • Seconding Looie496's advice. If you need to send large files then it's best to use the right tools for the job. FTP is well worth becoming familiar with. As you're a self-proclaimed low-computer-literacy user there is no point in me saying how easy it is to do on Linux. However, there must by now be some Windows guides to using FTP clients and servers. Discuss this with the person or organisation you're downloading from. After all, they're offering these big files, so some of their target audience must experience some of the same problems you encounter. They may be able to offer step-by-step advice (like, can we stick it on a DVD and post it to you - a godsend for some).--Aspro (talk) 16:14, 24 September 2012 (UTC)[reply]
I would use a cloud storage site that lets you share folders with other people: Google Drive lets you send files up to 5GB in size for free. DropBox does up to 2GB for free. SkyDrive is 7GB. They are all very easy to use -- install software, copy attachment to special "sharing" folder, use interface to share the folder with someone else. --Mr.98 (talk) 17:37, 24 September 2012 (UTC)[reply]
A very simple option is also to put the file on a USB stick and send it via sneakernet. --Stephan Schulz (talk) 18:06, 24 September 2012 (UTC)[reply]

2GB is quite a large file even by today's standards. On a typical home ADSL connection it might take 10+ hours to upload. USB stick or DVD burner might well be a more practical approach. 69.228.171.70 (talk) 19:52, 24 September 2012 (UTC)[reply]

I second the above two posts. I work in IT for a company with a global WAN and we still quite frequently burn DVDs and post them when we want to send gigs of data. Vespine (talk) 00:24, 25 September 2012 (UTC)[reply]

Honestly, it's sad but true that a USB thumbdrive and the postal service is still the best solution for sending very large files. HominidMachinae (talk) 01:08, 25 September 2012 (UTC)[reply]

I'm still not sure I agree. Dropbox and Google Drive are pretty straightforward. Yeah, if you have a slow connection, that's as fast as it's going to be. But it'll get there. On my home cable internet connection 2GB takes a couple of hours and absolutely zero labor on my part with Dropbox. Putting stuff in the mail takes time, postage, drives, disks, whatever, and days of transit. --Mr.98 (talk) 01:47, 25 September 2012 (UTC)[reply]
Skydrive used to have a per-file size limit. Does it still? Shadowjams (talk) 03:10, 25 September 2012 (UTC)[reply]
That's a good question; I haven't used it before, though I've used the others. --Mr.98 (talk) 17:07, 25 September 2012 (UTC)[reply]
Remember, ADSL is slower uploading than downloading, often ten times or more slower. A 16 Mbit/s down, 1 Mbit/s up connection will download the file in under 20 minutes, but take closer to 270 minutes (about four and a half hours) to upload. It may also (for technical reasons) make downloading anything difficult at the same time. CS Miller (talk) 03:13, 25 September 2012 (UTC)[reply]
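The arithmetic behind these transfer-time estimates is easy to check. Treating 2 GB as 2048 MB and ignoring protocol overhead (real transfers run somewhat slower than the line rate), a quick sanity check:

```python
def transfer_minutes(size_gb, rate_mbit_per_s):
    """Minutes to move size_gb gigabytes over a link of rate_mbit_per_s megabits/second."""
    megabits = size_gb * 1024 * 8  # GB -> MB -> megabits
    return megabits / rate_mbit_per_s / 60

# 2 GB over a 16 Mbit/s downlink vs. a 1 Mbit/s uplink
down = transfer_minutes(2, 16)  # roughly 17 minutes
up = transfer_minutes(2, 1)     # roughly 273 minutes, about 4.5 hours
```

These are best-case figures; TCP/IP and ATM framing overhead on ADSL typically shave another 10-15% off the usable rate.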
A popular solution in NL (but available internationally) is http://wetransfer.com/ - it will handle files up to 2GB. (Just be sure to zip them, even if they're in an almost incompressible format like PSD or MP4. I don't know if it's server or user error, but sometimes we receive truncated files, and the CRC32 checks in Zip help to inform you that it's the file that's broken, not just your local application being caught in some weird version/codec mismatch.) Unilynx (talk) 17:28, 25 September 2012 (UTC)[reply]

Fixing history errors in svn repository[edit]

Is there a way to fix history errors in SVN? I am in the process of changing source control system from RCS to SVN. The process involves using a shell script to build a temporary file containing a list of all the changes made under RCS, and another shell script that checks out each version from RCS and adds it to SVN, thereby keeping the RCS revision history. The process has been largely successful and over 30,000 revisions have been moved to SVN. Unfortunately, a few revisions have been skipped due to various minor errors, such as SVN reporting that it could not be checked in for some reason. So for example, file xyz.c had revisions 1.1, 1.2, 1.3 and 1.4 under RCS, but under SVN the check-in of 1.2 failed and the script simply continued with revision 1.3. That resulted in the SVN history combining revisions 1.2 and 1.3 into a single change. Is there a way to now insert revision 1.2 in its correct place in the history, without deleting the SVN repository and starting again? (The process is pretty darn slow, so I would really prefer not to have to start all over again.) Astronaut (talk) 17:41, 24 September 2012 (UTC)[reply]
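For readers unfamiliar with this kind of migration, the first half of the process described above (building the list of RCS revisions to replay) can be sketched by parsing `rlog` output. This is only an illustrative sketch, not the actual script in question, and the replay commands in the comments are the standard RCS/SVN command-line tools:

```python
import re

def parse_rlog_revisions(rlog_text):
    """Extract revision numbers, oldest first, from RCS `rlog` output for one file."""
    # rlog prints each revision on a line beginning "revision 1.2", newest first
    revs = re.findall(r"^revision (\d+(?:\.\d+)+)", rlog_text, re.MULTILINE)
    return list(reversed(revs))

# Each revision would then be replayed with something like:
#   co -p -r<rev> xyz.c,v > xyz.c     (check that revision out of RCS to stdout)
#   svn commit -m "RCS revision <rev>" xyz.c
# A revision silently skipped in this loop is exactly what produces the
# combined-change gap described above.
```

Logging each successful commit as it happens makes it possible to diff the replay log against the `rlog` list afterwards and spot skipped revisions before the repository goes live.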

Somebody else may be able to give you a better answer, but my opinion is that you're getting into some deep voodoo here. You can try to read the relevant chapter of the SVN book (http://www.visualsvn.com/support/svnbook/reposadmin/), but it probably won't help. SVN is built around a Berkeley DB, and trying to make low-level changes to a database is generally best avoided unless it is absolutely necessary. Looie496 (talk) 18:43, 24 September 2012 (UTC)[reply]
I apologise for diverting your question, but I earnestly hope to save you as much suffering as possible. Subversion really isn't something anyone who has a choice is migrating into. Mercurial will let you convert straight from RCS/CVS format (details), as will Git (details), Bitkeeper (details), Perforce (details), and TeamWare (details). Subversion may seem modern compared to RCS, but you're missing out on 15 years of real improvements and all but guaranteeing yourself another complicated conversion in a few years. 84.93.190.89 (talk) —Preceding undated comment added 02:22, 25 September 2012 (UTC)[reply]
Unfortunately, it's a corporate decision which is out of my hands. Astronaut (talk) 09:02, 25 September 2012 (UTC)[reply]
For what it is worth, we did a professional evaluation of several version control systems about 2 years ago and ultimately chose SVN as the best modern option for our project. Of course, it is probably worth noting that we had unusual performance requirements (GBs of text) that led to popular options like Git and Mercurial being ruled out due to poor scaling at very large sizes (at least at the time of our evaluation). Anyway, I just want to note that SVN might actually be the best tool for some jobs, depending on your requirements. Dragons flight (talk) 20:04, 25 September 2012 (UTC)[reply]

Changing a value in Excel in the same cell[edit]

I am trying to input data that uses different units of measurement. I know that I can convert that data in a different cell using a formula (for example, if I wanted to input 12 inches into cell A1, I could have cell A2 say =(A1*2.54) to give me the appropriate value in centimeters). I was wondering if there was a way to put 12 into cell A1 and have cell A1 spit back out 30.48, without having to use a formula that references other cells. Livewireo (talk) 19:51, 24 September 2012 (UTC)[reply]

That may not be a good idea, as you are likely to lose precision in some conversions, and that behavior might be confusing, if you don't recall how it's set up when reusing the spreadsheet later. StuRat (talk) 20:37, 24 September 2012 (UTC)[reply]
It's a short-term project; I don't really care if I don't remember what needs to change in 6 months. Just wanted to see if it was an option. Livewireo (talk) 20:46, 24 September 2012 (UTC)[reply]
My guess is this might not be possible. Formulas start with =. I don't think you can enter "data" into the same cell without breaking the formula. What's the problem with just using a second cell as the input? Vespine (talk) 00:20, 25 September 2012 (UTC)[reply]
You could do this, but IMO it would be more trouble than it's worth. What you'd need to do is use VBA to have the worksheet check whether given cells are updated, then run the formula on them, and then replace the value of the cell in question. I haven't tried this myself, but it's probably the Worksheet_Change event that you want. See here for some examples. What you'd want to do is first check: is Target the right cell? And if so, what should it be changed to (the formula)? And then have it just change the value of the Target with the new one. --Mr.98 (talk) 01:52, 25 September 2012 (UTC)[reply]
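An untested sketch of the Worksheet_Change approach described above, assuming the inches value is typed into cell A1 (the cell address and conversion factor are just for illustration). Note the EnableEvents guard: writing back into the cell would otherwise re-fire the event and recurse.

```vba
' Goes in the worksheet's own code module (right-click the sheet tab, "View Code").
' Sketch only - adapt the range and formula to your data.
Private Sub Worksheet_Change(ByVal Target As Range)
    If Not Intersect(Target, Me.Range("A1")) Is Nothing Then
        Application.EnableEvents = False   ' stop our own write re-triggering this event
        Target.Value = Target.Value * 2.54 ' inches -> centimetres, in place
        Application.EnableEvents = True
    End If
End Sub
```

One caveat with any in-place approach: the original input is destroyed, so a mistyped entry can't be corrected by re-reading the source cell, which is part of why a two-cell layout is usually less trouble.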

No internet access[edit]

The little yellow triangle with the exclamation mark has just popped up on the network icon telling me I have no network access. However, as I am posting this I obviously still have access, as do the other computers on the network. Bringing up the "Network and sharing centre" shows an X between the network and the Internet. Any ideas as to why this would be? I've tried several sites and have no problem getting to them. Using W7. CambridgeBayWeather (talk) 20:07, 24 September 2012 (UTC)[reply]

Are you connected to a wireless router instead perhaps? ¦ Reisio (talk) 20:17, 24 September 2012 (UTC)[reply]
(edit conflict) Sounds like you have two network interfaces, maybe a wireless and wired connection. One is the one you're using, the other is the one that's complaining. Check out the network interfaces in the control panel. --NorwegianBlue talk 20:22, 24 September 2012 (UTC)[reply]
Windows periodically connects to a site (part of microsoft.com) to verify whether the local network you're on has connectivity to the internet as a whole. If you're connected to a local network, but it can't reach that microsoft site, windows shows that yellow triangle. If, as you say, you can see the internet as a whole, but you're getting the yellow triangle, that's a false report. The most common cause for that is some security software (on your PC or on a router), which has mistaken the automatic periodic connections to Microsoft for something bad, and has blocked them. -- Finlay McWalterTalk 20:26, 24 September 2012 (UTC)[reply]
Microsoft calls this feature Network Connectivity Status Indicator; the associated machines are www.msftncsi.com and ncsi.glbdns.microsoft.com. -- Finlay McWalterTalk 21:08, 24 September 2012 (UTC)[reply]
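The probe described above can be reproduced by hand: Windows fetches a small text file from the NCSI host and expects the exact body "Microsoft NCSI". A sketch of the same check in Python (the URL and expected body match what Microsoft documented for NCSI at the time; they are not guaranteed to stay the same):

```python
from urllib.request import urlopen

EXPECTED_BODY = "Microsoft NCSI"

def looks_connected(body):
    """NCSI-style verdict: the probe file must match exactly, or something
    (a captive portal, a proxy, security software) is interfering."""
    return body.strip() == EXPECTED_BODY

def probe(url="http://www.msftncsi.com/ncsi.txt"):
    """Fetch the probe file and apply the check; any failure means 'no internet'."""
    try:
        body = urlopen(url, timeout=5).read().decode("ascii", "replace")
    except OSError:
        return False
    return looks_connected(body)
```

This is also why the false-positive case in the thread arises: a firewall that blocks or rewrites just this one request makes Windows show the yellow triangle even though general connectivity is fine.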
  • (ec) A bit of googling indicates that this is a rather common problem with W7 -- basically what is happening is that Windows is unable to detect the network even though the network is present. An answer on a Microsoft support site suggests that the first thing to try is powering off your router and then (after a pause) powering it up again. An answer on a different site suggests that the problem can sometimes be caused by antivirus software. Looie496 (talk) 20:29, 24 September 2012 (UTC)[reply]
Seems to have sorted itself out now. There are no wireless networks within 2.5 km (1.6 mi), so I can't pick any up. One thing I forgot to mention is that the other two computers are using XP. I suspect that the answer lies in either Finlay McWalter's or Looie496's answers. Thanks all. CambridgeBayWeather (talk) 20:57, 24 September 2012 (UTC)[reply]

In Windows (7), how do I ensure that I always have enough resources to pull up the task manager and kill a process?[edit]

It seems that the operating system designers were too short-sighted to impose any sanity checks on resource management. Priority control doesn't solve my problem, and I would have to adjust it for each process anyway. Are there any mods I can use to ensure that the operating system or window manager always has 5% CPU power and some leftover memory to kill any renegade process? It always takes absurdly long if some process goes out of control and consumes 100% CPU power (like Adobe Flash). 71.207.151.227 (talk) 22:33, 24 September 2012 (UTC)[reply]

Wow, they still haven't solved this problem yet ? I don't think I will upgrade to a new Windows O/S until they do. StuRat (talk) 02:06, 25 September 2012 (UTC)[reply]
I guess it's kind of a relative thing... For my work I prefer the system to use all its resources, and no less, until it finishes the work. I find myself far more often hating to wait for a render while seeing CPU use at 80% than waiting while seeing CPU use at 100%, in spite of the 'crashing risks' to my system...
It's something like: if the homework isn't done yet, where is the other 20% of the CPU? Fooling around?
If your particular problem is with some program from Adobe, you can configure it in the preferences menu; in the "performance" options you can select the amount of resources that Adobe application will be able to use — Preceding unsigned comment added by Iskander HFC (talkcontribs) 14:31, 25 September 2012 (UTC)[reply]
Windows has a built-in user-interface to set task priority: Windows Task Manager. Full instructions can be found from Microsoft. On Unix, we use nice for this purpose; there are, of course, many alternative methods. Nimur (talk) 00:10, 26 September 2012 (UTC)[reply]
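The `nice` mechanism mentioned above is also exposed programmatically, so a long-running job can lower its own priority and leave the interactive session responsive. A small sketch (Unix-only; `os.nice` is not available on Windows, whose closest equivalent is the priority class that Task Manager manipulates):

```python
import os

def be_nicer(increment=5):
    """Raise this process's niceness (i.e. lower its CPU scheduling priority).

    Returns the new niceness. Unprivileged processes can only increase
    niceness, never decrease it, so a runaway job can't undo this.
    """
    return os.nice(increment)
```

A batch script might call `be_nicer()` as its first action, which addresses the 100%-CPU half of the original complaint without any per-process fiddling in Task Manager.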
However, my impression is that those priorities only apply when a program is "playing nicely". A program with a memory leak or some other problem just seems to ignore the priority system and lock up the entire computer. StuRat (talk) 00:27, 26 September 2012 (UTC)[reply]
Windows NT has always been a preemptive multitasking operating system so your 'impression' is mistaken when it comes to CPU usage. Memory usage is of course more complicated. Of course since IIRC you were using Windows 9x based OSes up until 2 or 3 years ago, and these do use cooperative multitasking this may explain your confusion, but this hasn't been something most of the world was doing. Nil Einne (talk) 08:17, 26 September 2012 (UTC)[reply]
My Windows XP computer still regularly locks up, with no hope of bringing up the Task Manager. Are you saying this no longer happens in Win 7 ? StuRat (talk) 08:23, 26 September 2012 (UTC)[reply]
My comment was solely a reply to the comment I indented it to, i.e. where you said:
However, my impression is that those priorities only apply when a program is "playing nicely". A program with a memory leak or some other problem just seems to ignore the priority system and lock up the entire computer.
It's not possible for a program to 'ignore' the priority system, since it's enforced by the kernel; the priority does not rely on programs playing nicely (well, except for the possibility of programs taking advantage of bugs in the kernel, and the fact that a program with sufficient permissions can change its own and other processes' priorities). That concept was abandoned many, many years ago.
Nil Einne (talk) 09:19, 26 September 2012 (UTC)[reply]
BTW as for the problem you refer to, I have experienced that sort of thing in the past, but not for a long time, and it seems to be something quite difficult to search for. So while I can't give a definite answer, I think it's something like this... On Windows 7, Task Manager is started with high priority when you use Ctrl Alt Delete, and IIRC it was the same for Windows XP. Most programs start with normal priority, but of course, as I mentioned, it's possible for an application to start with a different priority if it has sufficient permissions (or maybe you changed it yourself). In the case of another high-priority application, let alone a realtime one, the problem isn't surprising, since Task Manager has to compete with an equivalent- or higher-priority application for CPU time. It's also a bit unclear to me whether every step in the process of going from pressing Ctrl Alt Delete to starting the Task Manager is given high priority; [1] seems to suggest that at least on Windows Vista and 7, Explorer is used, and I presume this is the default explorer process, which is normal priority, meaning the problem is not surprising even if the problematic program is normal priority (let alone higher). In that case, using Ctrl Shift Escape to start Task Manager could be better.
The other possible problem is RAM. Windows, like most modern operating systems, isn't designed to waste RAM, so it doesn't keep RAM free for no reason; unless you either screw up your Windows config or have a very odd computer, most RAM will be in use, at least for cache. If a program is using a lot of memory, then when another program like Task Manager needs to start, Windows needs to find RAM for it, either by dumping cache (which is fast, but the cache may have already been mostly dumped) or by paging out what's used by some other program, and I'm not sure Windows even necessarily gives that much higher priority to dumping cache instead of paging. As I hinted at earlier, there is of course the possibility the problem is neither memory nor CPU but something else; I had problems in the past where faulty video drivers interacting with certain programs caused the computer to become very unresponsive.
However, from my experience on a decent computer (i.e. if you aren't borderline on RAM and don't have very dodgy drivers), CPU is the most common problem, hence why, as I indicated, the problem isn't something I experience much anymore. On a multicore computer, provided the problematic program isn't maxing out all the cores your computer provides, there should still be something available for Task Manager and whatever it takes to start it. It may also be that the interrupt handling for Ctrl Alt Delete has improved in Windows Vista and 7, but I did find things improved quite tremendously even on Windows XP x64 with a multicore computer compared to a single-core one. Just to emphasise, this doesn't mean that Windows preemptive multitasking isn't working properly. If you do actually get to Task Manager and reduce the priority of the problematic program to low, you may see how much of a difference it makes.
Nil Einne (talk) 10:15, 26 September 2012 (UTC) [reply]
The thing is, process priority controls only one thing: if all else is equal, and all processes in consideration are runnable, the one with the highest priority gets to run. But that scheme is easily (even if not intentionally) subverted: a process doing a lot of reads will easily evict data from other processes out of the cache, which has dramatic consequences on any non-realtime OS: even if your task manager is high priority, if its data is not in memory by the time it runs it will have to wait, giving time for lower-priority processes to run and cause all kinds of mayhem (especially if they're in a memory-allocation loop, and keep allocating swap, occupying the I/O system and still destroying available memory). Task Manager doesn't bother to 'lock' its memory pages to stay resident and avoid this issue, and even if it did, the GUI doesn't and would still be affected. It's the primary reason why a runaway Linux or OSX system is easily fixed through ssh - the shell's fractional memory pressure and limited dependence on other processes make it easy to use to get the system's attention and kill the runaway processes. (Unfortunately, this is hard to reference - priority inversion describes part of the problem, but not why a runaway process not directly involved in the system overload can still manage to grind the system to a halt.) Unilynx (talk) 20:14, 26 September 2012 (UTC)[reply]
Whoops, looks like I'm mistaken about Windows 9x, as per our articles. It does use preemptive multitasking for 32-bit processes; 16-bit processes do use cooperative multitasking. Windows NT variants that support 16-bit processes also generally run them as one process with cooperative multitasking, but it's possible to run them as separate processes. Nil Einne (talk) 09:13, 26 September 2012 (UTC)[reply]
Thanks for the info, Nil. It's quite apparent that the switch to preemptive multitasking does not, in itself, solve the problem, for many of the reasons you and Unilynx listed, although this switch may be necessary (but not sufficient) to stop a PC from locking up as a result of a "rogue process". StuRat (talk) 01:17, 27 September 2012 (UTC)[reply]
To be more precise, all implementations of Win32 are preemptively multitasked (going back to Win32s on Windows 3.1) and all implementations of Win16 are cooperatively multitasked. It's effectively baked into the specification. All Win16 applications run in the same address space; they're more like Win32 DLLs than like Win32 EXEs. But you can run more than one Win16 instance, each with its own address space, and preemptively schedule them. -- BenRG (talk) 15:56, 27 September 2012 (UTC)[reply]
I don't know if this will help, but you can use the AutoHotkey script from this thread to kill rogue applications more quickly. -- BenRG (talk) 15:56, 27 September 2012 (UTC)[reply]

Compaq Evo Thin Client T30[edit]

There's a Craigslist ad in my area of a guy selling a Compaq Evo Thin Client for $2. What is this thing and what can I do with it? Are there any modifications I can do on it to turn it into something useful for a residential home environment? Acceptable (talk) 22:37, 24 September 2012 (UTC)[reply]

Assuming it's the later T30 (not the T20), people have managed to get Linux on it [2]. But it was about as underpowered as a machine could be while still credibly doing anything when it was built, 7 years ago, so this is strictly the kind of project people do for its own sake. -- Finlay McWalterTalk 22:47, 24 September 2012 (UTC)[reply]
As to what it is: it's a thin client with a Geode processor and a pittance of RAM and flash memory. It was just enough to run a Citrix Metaframe client. In its native condition it's useless now. -- Finlay McWalterTalk 23:08, 24 September 2012 (UTC)[reply]