I'm making a small game where you have to guess whether the answer is cp1 or cp2, and I want to give the user some sort of hint. One way of doing that, I think, is by flashing the answer to the user for a nanosecond or millisecond, i.e. a new window would briefly open showing the answer (cp1 or cp2). Any ideas as to how to do this?
[...] flashing the answer for a nano or millisecond to user [...] in a new window [...]
A millisecond is too short (both for the human player, read about the persistence of vision, and for the Python interpreter); your screen is probably refreshed at 60 Hz anyway. Consider flashing for at least one tenth of a second, and probably more than that; you'll need to experiment, and you might make the flash duration configurable. How to do that depends upon the widget toolkit you are using.
If you are using something built on top of GTK, you'll need to find the Python binding for g_timeout_add.
If you use something built on top of libSDL (e.g. pygame_sdl2), you'll need its timer facilities.
There are many other widget or graphical frameworks usable from Python, and you need to choose one (look also into PyQt). Each of them has its own way to deal with timing, delays, windows, graphical display of text inside a window, etc.
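For instance, here is a minimal sketch using the standard-library tkinter (just one possible toolkit, not necessarily the one you'll end up choosing); the 300 ms flash duration is an arbitrary starting point you would tune:

import tkinter as tk

FLASH_MS = 300  # how long the hint stays visible, in milliseconds; tune experimentally

def flash_hint(answer):
    # Open a small window showing the answer, then destroy it after FLASH_MS.
    win = tk.Tk()
    win.title("Hint")
    tk.Label(win, text=answer, font=("Helvetica", 32)).pack(padx=40, pady=40)
    win.after(FLASH_MS, win.destroy)  # schedule the window to close
    win.mainloop()

flash_hint("cp1")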
If your system is Linux, see also time(7) for a general overview of time-related facilities. Event loops (like those in graphics libraries) are built on top of a multiplexing system call such as poll(2) (or the older select, etc.).
You should spend several days reading more and choosing your graphical toolkit before writing a single line of code for your game (which might need more code than you imagine now).
I think the closest you can easily get to this effect is to print to the console without the newline (\n), using only the carriage return (\r). That way you can write over the text after a short delay. We do need to know the length of what we printed so that we can completely overwrite it with the next print.
The code for this would look something like:
import time

ms = 100   # how long the answer stays visible, in milliseconds
s = 'cp1'  # the answer to flash

print(s, end='\r', flush=True)  # no newline; flush so the text appears immediately
time.sleep(ms / 1000)           # wait for the flash duration
print(' ' * len(s))             # overwrite the flashed text with spaces
This only occurs when I am using vim on Linux (it's Kali Linux to be precise, though I haven't tested it on other distributions). I am using a standard German keyboard layout.
Sometimes when I type in vim (it often happens when I exit insert mode or when I use :w, maybe only on one of these since I often do one after the other), the cursor randomly jumps elsewhere, usually about 100 lines upwards (I don't have an exact number). At the same time, the next number in the line my cursor was in is decremented.
I suspect that this happens because I hit some sequence of keys too quickly, since on my Linux distribution that can cause special characters to be inserted when one key modifies the other. For example, if I type "yt" quickly with this keyboard, it becomes "yƧ" (with a second bar on the t).
This by itself is somewhat annoying, so if someone knows a way to turn that off on Linux while still retaining the basic keyboard layout, that would solve my problem; but telling me the exact command I accidentally executed, so I can avoid or remove it, would also help.
As far as I can remember, this problem has only occurred when I was editing .tex files, but that is also what I have been using vim for the most recently, so I wouldn't assume that it only happens there.
Still, I can post my list of plugins and my .vimrc if necessary. Just in case it has something to do with LaTeX files specifically: the only vim plugin I have for that is vimtex.
The command you are looking for is mapped to Ctrl+X by default (its counterpart, Ctrl+A, increments). In normal mode it decrements the first number at or after the cursor on the current line.
I have sailed the 1e100 (Google) and done the mandatory search before asking. I'll keep it short because, for most of you here, time IS money.
Leaving out the circumstances, I am left with 18 GB of data from a TestDisk recovery after total weirdness on all my PCs at once.
One of those files might be a MEGA recovery key, which I unfortunately stashed in two offline places and one online place, all of which are now gone (total weirdness, never mind that).
I can probably script this with some thinking, but my head is in explode mode, so I was wondering whether any super-guru here knows the one-liner to do it before I start scripting multiple lines in loops.
Anything that goes through umpteen directories recursively, listing all files that are exactly x characters in size, will do fine; that should narrow it down a lot.
If not, you can't blame me for asking, I hope. Thanks in advance for your time. (I'm using Linux Mint 18.1 mainly, or OpenWrt on that little VoCore, but I can switch to anything installable for this, I suppose.)
I see... (how to mark this as solved: accept the answer that solved it, but if there is none, write your own) OK.
The answer is simple, thanks to Mr. Benjamin W.
Go to the directory that contains all the folders you need to check (however you like; in Linux Mint, right-click and choose "Open Terminal", for instance).
Then there's one simple command (for a file exactly 22 characters in size in this case: 22 with no newline, which I don't think matters, since a newline is a character too, but see, that's why I'm not good at this, here I go again).
The command was "find -size 22c" (the "c" suffix means the size is counted in bytes; without a suffix, find counts in 512-byte blocks).
Gotta love Linux.
(Ah well, this assumes you have all the directories and files under one folder and have set the permissions, because TestDisk writes to locked folders, so you would have to change those. That's where I start elaborating, but I have found that assumptions are often incomplete, so I'm sorry for that. You also have to... turn the computer on and press play on tape, but let's assume that for once.)
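In case you do end up scripting it after all, a rough Python sketch equivalent to that one-liner (assuming the same 22-byte target size and searching from the current directory) could look like this:

import os

TARGET_SIZE = 22  # exact size in bytes, as in the find command above
ROOT = "."        # directory to start searching from

# Walk every subdirectory and print files whose size matches exactly.
for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            if os.path.getsize(path) == TARGET_SIZE:
                print(path)
        except OSError:
            pass  # skip files we cannot stat (e.g. permission problems)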
Nothing more, nothing less. Thanks a lot, and I think this site truly belongs in the top five or ten of the whole web.
Have a nice day.
I am implementing a software project using C++ on Debian. When I execute the standalone binary on a Debian box, the program runs fine for at least 15-20 minutes, but after a while the console output becomes corrupted. I see strange ASCII characters in place of most characters, though some characters display fine, so the output becomes almost unreadable. If I press Ctrl+C to stop the execution, whatever I type on the command line is also displayed as weird ASCII characters. If I reboot the box and start over, everything works fine for another 15-20 minutes, then the same thing happens. Does anybody have any idea what might be going on here? The Debian box has only command-line support, no GUI.
It sounds like you are printing some unwanted characters at some point. I think you may have a problem with how you manage the memory used for your strings. Try running your program under Valgrind. You can follow this tutorial. You should expect warnings about reading from uninitialized memory.
I don't think you're using "ASCII" properly here. Considering that ASCII is the range 0-127, there's not much "weird" stuff in that range. I've seen this happen before; it's usually due to escape characters being interpreted as terminal control codes. I'm a bit fuzzy on this -- I haven't done console work in a long while -- but I'm pretty sure it's related to raw output of bytes that are actually outside the ASCII range. If the terminal itself stays scrambled after you stop the program, running reset usually restores it.
When you write something in BASIC, you are required to use line numbers. Like:
10 PRINT "HOME"
20 PRINT "SWEET"
30 GOTO 10
But I wonder: who came up with the idea to use line numbers at all? It is such a nuisance, and left quite an "echo" in the developing (pun intended) world!
The idea back then was that you could easily add code anywhere in your program by using the appropriate line number. That's why everybody used line numbers 10, 20, 30... so there was room left:
10 PRINT "HOME"
20 PRINT "SWEET"
30 GOTO 10
25 PRINT "HOME"
On the first interfaces BASIC was available for, there was no shiny editor, not even something like vi or emacs (or DOS edit, heh). You could only print out your program on the console and then you would add new lines or replace them, by giving the appropriate line number first. You could not navigate through the "file" (the program was kept in memory, although you could save a copy on disk) with the cursor like you are used to nowadays.
Therefore the line numbers weren't only needed as labels for the infamous GOTO; they were also needed to tell the interpreter at what position in the program you were editing.
It has a long, long history.
Line numbering actually comes from Dartmouth BASIC, which was the original version of the BASIC programming language and an integral part of the so-called Dartmouth Time Sharing System (DTSS). The DTSS had a rudimentary IDE, which was nothing more than an interactive command line.
So every line typed inside this "IDE", and beginning with a line number, was added to the program, replacing any previously stored line with the same number; anything else was assumed to be a DTSS command and immediately executed.
Before there was such a thing as a VDT (video display terminal), we old-timers programmed on punch cards. Punch cards reserved columns 73-80 for sequence numbers: if you dropped your card deck and the cards got out of order, you could put the deck in a card sorter that would reorder the cards based on those sequence numbers. In many ways, BASIC's line numbers were similar to those sequence numbers.
Another advantage in the BASIC world is that, in the old days, BASIC was interpreted as it was run. Using labels rather than sequential line numbers for branches would require a first pass to pick up all the labels and their locations, whereas with line numbers the interpreter knows whether it needs to start scanning forwards or backwards for the destination.
Back in the day you didn't have a 2 dimensional editor like emacs or vi. All you had was the command line.
Your program was stored in memory and you would type in single line commands to edit single lines.
If you were a Unix god you could do it with ed or something, but for BASIC on a C-64, VIC-20, or TRS-80 you'd just overwrite the line.
So a session might look like:
$10 PRINT "Hellow World"
$20 GOTO 10
$10 PRINT "Hello World"
And now the program would work correctly.
Some older mainframes even had line terminals without a screen. Your whole session was printed on paper in ink!
The "Who?" would be the inventors, Kemeney and Kurtz.
After reading the replies, I checked the Wikipedia entry for "Dartmouth BASIC", and was surprised to learn
The first compiler was produced before the time-sharing system was ready. Known as CardBASIC, it was intended for the standard card-reader based batch processing system.
So, it looks like Paul Tomblin "gets the square".
Paul Tomblin's answer is the most comprehensive, but I'm surprised no one has mentioned that a big part of the BASIC project's initial goal was to provide a beginner-friendly interactive environment using timesharing. (Kurtz and Kemeny's vision for "universal access for all students" was far ahead of its time in this regard.)
The BASIC system that was developed to fulfill this goal featured Teletype ASR-33 (and later other) printing terminals. When connected to a timesharing-capable OS, these allowed editing and running BASIC programs in an interactive mode (unlike working with punched cards), but they are not cursor-addressable. Line numbers were a beginner-friendly way to both specify the order of program statements and allow unambiguous editing in the absence of a screen editor. The Wikipedia entry for "line editor" explains further, and anyone who's ever tried to use a line editor (such as the Un*x 'ed') can appreciate why Kurtz and Kemeny should be thanked for sparing the beginner having to learn the cryptic command sequences required for editing text in this manner.
They originated in FORTRAN, from which BASIC was derived. However, in FORTRAN only lines referenced by other lines (like GOTO targets) needed numbers. In BASIC they had a secondary use, which was to allow editing of specific lines.
Back in the fifties, when high-level programming languages were in their early beginnings, there were no terminals, no editors, no monitors (yes, no monitors), just card punches and readers (for writing cards and reading their contents into a computer's memory) and printers (for printing results, naturally).
Later, tape was introduced, but that's another story.
Each punch card had its own number. There were several reasons for that, from simply keeping the cards in order to determining the sequence of execution. Each card was one line of code (in today's terms). Since, at that time, there were no constructs like if..then..else, or any variant of the like, the sequence of execution had to be determined somehow, so GOTO statements were introduced. They were the basis of loops. The term "spaghetti code" also comes from that time period, since badly written code was relatively hard to follow, like spaghetti on a plate :)
I'd guess it comes from assembler, where each instruction has an address which may be jumped to by another instruction.
Additionally, the first computers didn't have much memory, and storing a line number only takes two bytes (if done properly). Writing a label takes more memory: first in the location where that label is defined, then in every jump command.
Finally, in the good old days there weren't any fancy editors. The only "editor" was a simple command-line interface, which treated everything starting with a number as part of the program and everything else as a command to be executed immediately. The most prominent example would be the Commodore 64.
Newer dialects of BASIC no longer need line numbers.
In BASIC, if you didn't have line numbers, how could you perform a
GOTO 10
That was a way to jump between lines, a good way that was found... more than 20 years ago!
Today, line numbers help us catch errors/exceptions, because debug engines report that we got an exception on line xxx and we can jump right to it!
Imagine a world without line numbers... how could a reporter be paid without lines?
"Now that you know the novel, you have to write a summary of no more than 50 lines."
Remember that? Even at school we learn about line numbers!
If they hadn't been invented, someone would have invented them again so we could use them nicely :)
Not all versions of BASIC required line numbers. QBasic, for instance, supported labels. You could then jump to those with GOTO (ignoring Dijkstra's "Go To Statement Considered Harmful," for the moment).
The answer is already above. Paul Tomblin wrote it (with a caveat to zabzonk). Actually, I would argue that any answer which does not mention "punch cards" is incomplete; if it mentions neither punch cards nor FORTRAN, it is wrong. I can say this with confidence because my parents both used punch cards on a regular basis (they started with FORTRAN 66 and 77), then migrated to BASIC and COBOL in the '80s.
In the early days, most programs were entered with punch cards. The punch cards were usually entered in sequence, usually one instruction per card, with labels (JMP/JSR targets) being a separate instruction card.
To edit your program, you replaced the card.
Later implementations added an optional sequence number on the right end of the line, so that when/if they got out of order, they could be resequenced by an automated reader.
FORTRAN used both numeric target labels on the left (columns 1-5) and a reserved block on the right (columns 73-80) for sequence numbers or comments.
When BASIC was initially written, it was decided to move the sequence numbers to the left, into FORTRAN's label field, and to allow overwriting prior cards' memory footprint... as an editing mode. This was intended for the interactive dev environment, but worked just as well with cards. And cards were used in some early implementations for a variety of reasons.
Keep in mind: many computers had only a card-reader-and-printer interface right through the late 1970s. Even though interactive-mode BASICs were available, card-punched BASIC programs were frequently used. Since many of them simply fed into the IDE, they worked exactly the same way, including needing a "RUN" card at the end. In such cases, one could simply tack on a correction card and another RUN card to rerun with a variation on some variable; likewise, in complex programs, simply adding a corrected line on a card before the RUN card was adequate to edit out problems without spending precious time finding the errant card itself.
I like the robot church in Futurama; on the walls was written stuff like
10 SIN
20 GOTO HELL
On the Speccy (ZX Spectrum) you couldn't edit a line without typing its line number.
I find them very helpful when pairing. I don't have to point at a line when my pair has the keyboard, I can just say, "on line 74, shouldn't that really be getMoreBeer()?"
The original editor for DOS was a wonderful utility called edlin, with which you could only edit a single line at a time. To make life even more interesting, in many versions of BASIC you could type lines out of order (line 10, 20, 30, 25, 5) and execution would follow the line numbers, not the order in which the lines were entered.