
Wikipedia:Reference desk/Archives/Computing/2018 November 28

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


November 28

Looking for an online quote generator

I've been googling for about 90 minutes by now to find an online quote generator where you can upload your own photo and input your own quote, to no avail. It's because I'm looking for pretty much this authoritative look: [1] or [2] But all I can find are online generators for ugly modern Instagram equivalents of kitschy postcards like this: [3] The closest to that above design that I'm looking for are motivational poster generators that make this: [4], but that's not really too close either.

So, is there any online quote generator for the simple design I'm looking for above? Or do I really have to use a desktop application like Poopshop? --46.93.158.170 (talk) 00:11, 28 November 2018 (UTC)[reply]

1. Open Microsoft Paint, a simple raster graphics editor that has been included with all versions of Microsoft Windows under "Accessories".
2. Click File --> Open... --> <Filename of your photograph>
3. Click Image --> Attributes... Note the present Width of your picture, then replace the Width value with double that value.
4. Click OK
5. Click on the text button (the large "A")
6. Move the cursor and click in the blank area created at the right of your picture.
7. Enter the text of your quote.
8. Click File --> Save As... Choose a File name and Folder to save your edited image.
  • The Image tab also has functions for Rotating, Resizing and Cropping the image if needed. DroneB (talk) 13:47, 28 November 2018 (UTC)[reply]
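For anyone comfortable with a little scripting, the same layout - photo on the left, quote in the blank space on the right - can also be generated programmatically. A minimal sketch in Python that builds an SVG document as a string; the filename, dimensions and fonts are placeholder assumptions, not requirements:

```python
# Build a quote image as SVG text: photo on the left half, quote centred in
# the right half, mirroring the Paint steps above.

def quote_svg(photo_href, quote, author, width=600, height=400):
    # The canvas is twice the photo's width, like doubling the Width
    # attribute in Paint; the right half stays blank for the text.
    mid_x = width + width // 2
    return f"""<svg xmlns="http://www.w3.org/2000/svg" width="{width * 2}" height="{height}">
  <rect width="100%" height="100%" fill="white"/>
  <image href="{photo_href}" x="0" y="0" width="{width}" height="{height}"/>
  <text x="{mid_x}" y="{height // 2}" text-anchor="middle"
        font-family="serif" font-size="24">{quote}</text>
  <text x="{mid_x}" y="{height // 2 + 40}" text-anchor="middle"
        font-family="serif" font-size="16">{author}</text>
</svg>"""

svg = quote_svg("portrait.jpg", "An example quote.", "A. N. Author")
```

Saving the returned string as a .svg file gives something any modern browser can display or export; note that the bare `href` on `<image>` assumes an SVG 2 capable viewer (older renderers expect `xlink:href`).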

Cure of hemiparesis

What is the state of research on treatment of hemiparesis by stem cell therapy or surgery? If a person has had congenital left hemiparesis for, say, over 33 years, and can walk, run and even move things, leading a normal life such as completing an education and holding a job, is it possible, given current trends of research, that he will get a complete cure by medication/surgery/stem cell therapy/neurogenesis in his lifetime? Is research progressing in that direction, given the state of research in 2018? Wrogh456 (talk) 14:49, 28 November 2018 (UTC)[reply]

@Wrogh456: IMVHO almost everything is possible - but my supposition is in no way better or more accurate than yours. Anyway, even if it is possible in general, it doesn't necessarily mean it will be possible (or available) to a specific person. And even if it is possible for a specific person in some predictable time, that by no means guarantees it will actually happen. So any opinion you might get here is essentially useless. I strongly advise you to see a specialist, who has enough experience and established knowledge to give you at least an educated guess. Wikipedia is not the right place for getting medical advice or even predictions. --CiaPan (talk) 16:13, 28 November 2018 (UTC)[reply]

Didn't the person who wrote the world's first compiler have to, well, compile it somehow? Did he compile it at all, and if he did, how did he do that?

This was blanked from the Misc desk. Andy Dingley (talk) 16:16, 28 November 2018 (UTC)[reply]

Didn't the person who wrote the world's first compiler have to, well, compile it somehow? Did he compile it at all, and if he did, how did he do that? 46.238.248.116 (talk) 09:06, 28 November 2018 (UTC)[reply]

You have a misconception. The first thing called a compiler was written by Grace Hopper, a woman. It was also a loader, required to get the program into the computer from whatever medium it was stored on (card, tape, drum). Initial software would have been written in machine code or assembly language. Graeme Bartlett (talk) 11:00, 28 November 2018 (UTC)[reply]
  • Compilers pre-date Grace Hopper's work by about 5 years. One of the first of the tools of relevance here would be Autocode (1952 or 1954, Manchester). It's arguable as to whether it's truly a compiler or not (probably not), but I think it's closer to the question you're looking for.
To write a compiler, you need at least two things: first, a design for the source language that humans will (in future) write in; and second, a set of software tools with which you can program that compiler and make it work.
The language for the first Autocodes was a crude model of the hardware itself - a text-based representation of the computer's own machine code (not even assembly language). This didn't represent much of an invention, as it was little more than a transcription - and it was ferociously difficult to work with, as these first computers were so damned awkward in their operation, built to save every valve and circuit. This language (at Manchester, at least) pre-dates the compilers, as it's little more than the "annotated machine code" which had first been written in pencil in notebooks.
The Autocode compiler was written in much the same language as it would eventually compile. This is a bizarrely early example of "bootstrapping" (there's a whole article on your core question). The first version of it was hand-translated into the literal codes to work the machine, but after that was done, the translator program itself could be used to translate them again, by itself.
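That bootstrap step can be sketched schematically. In this toy illustration the mnemonics and opcode values are invented and the "translated translator" is compressed down to its lookup table (which is all this toy translator really consists of); real bootstrapping is far messier, but the checkable property is the same:

```python
# A schematic sketch of bootstrapping: hand-translate the translator once,
# then let the machine redo the same translation and compare.

OPCODES = {"LOAD": "01", "ADD": "02", "STORE": "03", "HALT": "00"}

def hand_translate(source):
    """Stage 0: a human translates the translator's source, exactly once."""
    return [OPCODES[word] for word in source.split()]

def machine_translate(table, source):
    """Later stages: the machine runs the already-translated translator
    (here reduced to its opcode table) over the same source."""
    return [table[word] for word in source.split()]

# Pretend this listing is the translator's own source code.
TRANSLATOR_SOURCE = "LOAD ADD STORE HALT"

stage0 = hand_translate(TRANSLATOR_SOURCE)              # done by hand, once
stage1 = machine_translate(OPCODES, TRANSLATOR_SOURCE)  # done by the machine
assert stage0 == stage1  # the classic bootstrap check: both must agree
```

The final check is what makes bootstrapping trustworthy: once the machine's own translation of the source matches the hand translation, the hand-made stage can be discarded.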
Autocode (1952) invented one of the lasting principles of software development, which continues to this day: software tools that are obscure and are only ever used by their sole inventor. It was simply too awkward to use, and offered very little advantage over translating the program by hand. Tony Brooker's 1954 Autocode 1 (another tradition, of geeks counting from 0) was much more useful and widely used. It introduced the idea (in a very simple way) of hardware abstraction: rather than being a language that exactly represented the hardware, it offered a simpler and more consistent abstract representation of the processor. This made programming easier and so this language was both adopted and seen as an inspiration for where future work should go.
At this time, Hopper had been busy with systems in the A-0 series. These were early and they were important, but they were more like subroutine management tools (linkers and loaders, in later terms) and weren't compilers as such - arguments over this definition will, of course, be infinite, especially as the word "compiler" was coined by Hopper's team.
Efforts now shifted to the tools we'd recognise as assemblers. These represent some abstracted version of the processor in their source code, but it's a very low-level one, like Autocodes'. They aren't much "easier" to use, for turning an abstract problem into a program, but they're far less error prone. Their function is mostly to remove rote work and to hide the annoying vagaries of a processor, such as bit flags within the processor hardware changing their meaning according to the operation performed. As multiple computers appeared, they could also be used to hide the development differences between them. Assemblers also developed macro and subroutine features, making programming more productive.
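The rote work that assemblers took over can be made concrete. A minimal two-pass assembler sketch for an invented four-instruction machine (none of these opcodes correspond to any real instruction set): pass one records the address of every label, pass two emits code with those labels resolved - the bookkeeping a programmer would otherwise redo by hand after every edit:

```python
# A minimal two-pass assembler for a made-up machine. Output is a list of
# (opcode, operand) pairs; labels end in ":" and occupy no address.

OPCODES = {"LOAD": 1, "ADD": 2, "JUMP": 3, "HALT": 4}

def assemble(lines):
    labels, code, addr = {}, [], 0
    # Pass 1: record the address each label refers to.
    for line in lines:
        line = line.strip()
        if line.endswith(":"):
            labels[line[:-1]] = addr
        elif line:
            addr += 1
    # Pass 2: emit instructions, resolving labels to numeric addresses.
    for line in lines:
        line = line.strip()
        if not line or line.endswith(":"):
            continue
        op, *rest = line.split()
        operand = rest[0] if rest else "0"
        value = labels[operand] if operand in labels else int(operand)
        code.append((OPCODES[op], value))
    return code

program = [
    "start:",
    "LOAD 7",
    "ADD 1",
    "JUMP start",  # the assembler, not the human, works out that start = 0
    "HALT",
]
```

Here `assemble(program)` yields `[(1, 7), (2, 1), (3, 0), (4, 0)]` - the point being that inserting a new instruction renumbers every jump target automatically.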
There's now a split in language development. The hardware people want to get the most from their computers (but without making errors), so they focus on assemblers. A new group appears, the programming language designers. They want to make high-level languages, which will make programming fundamentally more abstract. That has several virtues: it's easier, it's closer to the problem domain, it's more portable to convert the same program to run on totally different machines and it avoids the need to even know how the target machine works, or how long its machine words are. We now see the first real compilers and compiled languages that are "high-level languages" and clearly distinct from assemblers. Hopper works mostly with this group. Backus at IBM invents FORTRAN, Hopper's work with CODASYL produces COBOL and the first databases.
Bootstrapping has fallen from favour. Compiler writers write in assembler, because assembler is efficient and can best exploit the new hardware. "Languages" are giant things, invented by committees, and one day there might be as many as half a dozen different ones! Language development stagnates a bit: compiler writers mostly work for hardware companies and are trying to make more efficient hardware implementations of the same languages. PL/I appears from IBM as a software emulation of the Ford Edsel, and is intended to solve everyone's programming problems, forever. Over in Europe, ALGOL happens. Language development for practical or commercial applications really stagnates through the early '60s. Anyone interested is fiddling with Lisp, and no-one outside that world knows what they're doing. The real money is going into making mainframes able to run businesses, by building better COBOL implementations which can access larger magnetic disc and tape hardware. The US military-aerospace world is using hand-tweaked assembler and the beginnings of JOVIAL to fly around Vietnam and the Moon.
Around the turn of the 1970s, C appears. This grows out of cheaper minicomputer hardware and university work with BCPL (UK) and B (USA). Bootstrapping is back! The two teams of assembler-based, hardware-funded compiler writers merge with the language designers. We see a series of low-level programming languages appear (unfortunately one of them is PL/M). The idea of these is to provide a simple programming language where the programmer writes for a simple abstract machine that is very low-level (machine words and pointers are explicitly visible artefacts for the programmer) but this machine is also 'virtual': it works the same on every computer and it hides the memory-management limitations of the real hardware. So although the first compiler for a language might still need to be written in assembler, very soon the compiler can be used to bootstrap itself. After that, the compiler bootstraps itself again for each new improved language version. Your COBOL compiler will also be written in C - a tradition which is still pervasive in Unix distros today.
In the mid-1970s, we see another focus on hardware development. LSI and then VLSI electronics make computer development easier and so there's a flurry of new minicomputer and then microcomputer makers. Bootstrapping C onto them is a universal task (everything starts to run C) but there's also interest in cross-compiling, especially for the microcomputers with limited storage. A big computer that already has a compiler has its code generator modified slightly (same input language) and then makes programs which can be run elsewhere.
By now it's easy to make compilers. Everyone studies the Dragon Book at college. We even see domain-specific languages making an impression. Before long (late '80s) there are Too Many Languages (all nearly identical in structure and abilities) and the problem becomes how to find time to learn each new one, as each new manufacturer foists a new one upon developers, usually for very little advantage. It has calmed down a bit since, but it is a periodic fashion that "everyone needs their own new reinterpretation of the big new idea". Andy Dingley (talk) 16:17, 28 November 2018 (UTC)[reply]

See History of compiler construction. Bubba73 You talkin' to me? 16:34, 28 November 2018 (UTC)[reply]

That seems to be more about metacompilers though (aka compiler-compilers), rather than the "executable code" problem that's the root here. Andy Dingley (talk) 16:47, 28 November 2018 (UTC)[reply]
Hardware
The very first electrical computers (there were earlier, purely mechanical computers, and "computer" was once a job title for a human) were "programmed" by tearing apart the existing computer and putting the parts together in a different way to do something new.
The next improvement was to use switches and plugs, which made it easier to put the parts together in different ways.
Then came data, and later programs, stored on paper tapes and punched cards, which were already in use (without computers) to control looms, player pianos, and for storing census data in the late 1890s.
The first electronic computers that ran a stored program were in the late 1940s. As time went on different ways of storing programs and data were invented from the early drum memories to modern SD cards.
Software
As I described above, the first "programs" were actually hardware. Eventually the programmer was able to write each instruction as a number that the hardware could execute. This is known as "machine language". An improvement was the use of a human readable list of instructions that were converted to machine language (SUB -- for subtract -- instead of "00101110"). This was known as "assembly language", and I still write programs in it today.
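That mnemonic-to-bits mapping can be shown as a round trip. A small sketch using the "SUB" = 00101110 pairing from the paragraph above; the other encodings are invented for illustration and don't belong to any real processor:

```python
# Assembly <-> machine language as a table lookup, one byte per instruction.
# Only the SUB encoding comes from the text; ADD and MOV are made up.

MNEMONIC_TO_BITS = {"SUB": "00101110", "ADD": "00101100", "MOV": "01000000"}
BITS_TO_MNEMONIC = {bits: name for name, bits in MNEMONIC_TO_BITS.items()}

def assemble(mnemonics):
    """Assembly -> machine language: look each mnemonic up in the table."""
    return bytes(int(MNEMONIC_TO_BITS[m], 2) for m in mnemonics)

def disassemble(machine_code):
    """Machine language -> assembly: the reverse lookup, byte by byte."""
    return [BITS_TO_MNEMONIC[format(byte, "08b")] for byte in machine_code]

code = assemble(["MOV", "ADD", "SUB"])
```

A real assembler also handles operands, addressing modes and labels, but the core job is exactly this substitution in both directions.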
The first compilers and interpreters -- programming languages where one instruction could create multiple machine language instructions -- were written in the 1950s in assembly language. The very first compiler was developed in 1952 for the Mark 1 computer at the University of Manchester.
The first self-hosting compiler (excluding assemblers) was a Lisp compiler written in Lisp in 1962. Basically they had to write a Lisp compiler in assembly language (with testing done using an existing Lisp interpreter), then mostly in assembly with a small part in Lisp, and so on, building it up until it could compile its own source code. --Guy Macon (talk) 17:50, 28 November 2018 (UTC)[reply]
Humans were termed "computers", rather than "calculators", because the word is from the Latin computare, CUM ("with") + PUTARE ("think"). "Calculate" derives from calculus, "a small stone". 81.139.245.109 (talk) 08:40, 30 November 2018 (UTC)[reply]
A few corrections. Zuse's Z3 (computer) from 1941 was programmed using holes in tape, and the Z4 (computer) had a 'program construction unit' for putting programs onto a tape. The first compiler was for FORTRAN in 1957, and there were some pretty powerful autocodes before then. Dmcq (talk) 21:26, 5 December 2018 (UTC)[reply]