import sys
data = []
for line in sys.stdin:
    data.append(line)
I did not use .read() or .readline(),
but this code still reads the data. It reads the input line by line, with lines separated by '\n'. The data is typed by the user, much like with input().
My Question:
Is .read() or .readline() not necessary?
I wonder why a for loop works on sys.stdin, and how it reads the data line by line?
If you look at the documentation for sys, you'll see that sys.stdin (and sys.stdout / sys.stderr) are just file objects.
These streams are regular text files like those returned by the open() function.
The documentation about open() says:
The type of file object returned by the open() function depends on the mode. When open() is used to open a file in a text mode ('w', 'r', 'wt', 'rt', etc.), it returns a subclass of io.TextIOBase (specifically io.TextIOWrapper).
TextIOWrapper inherits from IOBase, which has an __iter__ method. This __iter__ method is what allows looping over the lines in a file. The io documentation (and the IOBase docstring in the source) puts it this way:
IOBase (and its subclasses) support the iterator protocol, meaning that an IOBase object can be iterated over yielding the lines in a stream.
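To make that concrete, here is a minimal sketch (not from the original post) showing what the for loop over sys.stdin is roughly equivalent to, using readline() explicitly:

import sys

# Equivalent to `for line in sys.stdin: data.append(line)` -- the for loop
# just uses the iterator protocol (__iter__/__next__) that IOBase provides,
# which returns one line at a time with the trailing newline kept.
data = []
while True:
    line = sys.stdin.readline()
    if line == '':          # readline() returns '' only at end of input
        break
    data.append(line)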
def update(login_info):
    stids = '001'  # a string, so it can be tested for membership in a line of text
    file = open('regis.txt', 'r+')
    for line in file:
        if stids in line:
            x = eval(line)
            print(x)
            c = input('what course you would like to update >> ')
            get = x.get(c)
            print('This is your current mark for the course', get)
            mark = input('What is the new mark? >>')
            g = mark.upper()
            x.update({c: g})
            file.write(str(x))
(Screenshots in the original post: the file before writing, the file after writing, and what happens in IDLE.)
As you can see, the updated dictionary is not written back over the original one in the file. How can I fix that? Please explain in detail. Thanks, all.
Python doesn't make that kind of connection for you. From Python's point of view, you are reading an ordinary text file and executing a statement built from the line you just read. That statement creates an object which has no relationship to the line it came from. Writing to the file should still work in principle, but by then you have moved past the line the data was on (you read it, so the position is now beyond it).
When you read a file, the current position in the file advances. Iterating over the file like this (i.e. for line in file:) implicitly calls next() on the file object, and because of read-ahead buffering the reported position no longer matches the line you are looking at (file.tell() will not tell you where that line started). So when you wrote, the text ended up appended towards the end of the file, and mixing the write into the iteration also disturbs the loop, even though you are logically still on the second line.
Reading and writing the same file at the same time like this is effectively undefined behaviour.
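One safer pattern is to read everything first, update the record in memory, and only then rewrite the file. The sketch below is not the original poster's code; it keeps the question's one-dict-per-line format (and its use of eval) only for illustration:

def update_record(stid, course, new_mark, path='regis.txt'):
    # Read all lines up front so we never read and write the same handle at once.
    with open(path, 'r') as f:
        lines = f.readlines()

    for i, line in enumerate(lines):
        if stid in line:
            record = eval(line)              # same dict-per-line format as the question
            record[course] = new_mark.upper()
            lines[i] = str(record) + '\n'    # replace the line in memory
            break

    # Rewrite the whole file with the updated record in place.
    with open(path, 'w') as f:
        f.writelines(lines)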
I'm working on a Python script that uses stdin and stdout. The default
behaviour is that the script writes to a file, so I have this function
that is called to handle writing to a file:
def run_loop(data, name):
    with open(name, 'w') as fo:
        fo.write("webstat output " + t)  # t is a date str created in global scope
        for x in data:
            fo.write(processer(x))
            fo.write('\n')
I use this function in a few places in the script, so it would be
inconvenient to rewrite it, or to write a variant of it, just to work
with stdout. When I try using stdout with open(), I see that sys.stdout
is of type _io.TextIOWrapper, and when I try to call
run_loop(same_data, sys.stdout) I get this error:
TypeError: expected str, bytes or os.PathLike object, not _io.TextIOWrapper
I'm wondering if there is a way to use open() with stdout? I would
guess that since stdout is file I/O, I should be able to write to it
like a file. Ideally there would be a way to turn _io.TextIOWrapper
into an object that open() can use, but I've been looking at the
methods inside sys.stdout and they don't seem to provide this
facility. I'm not sure why the fo object from open() couldn't simply be
set to stdout.
The behaviour I'm trying to achieve is
cat input-data.txt | myscript.py > out.txt
I've been able to get the stdin portion to work, but I'm still at a
loss for how to write to stdout with open().
I suppose I could forgo open() entirely, if I could keep compatibility with
the other places I call this function: data is a list, and name is
usually a string with the file name where the output should be saved.
Python 3.7.5rc1
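No answer to this question appears in the thread above, but one possible sketch (not from the original thread; the stand-ins for t and processer below are hypothetical placeholders for the globals described in the question) is to let run_loop accept either a file name or an already-open text stream such as sys.stdout:

import datetime
import sys

t = datetime.date.today().isoformat()   # stand-in for the question's global date string

def processer(x):                        # stand-in for the question's processer()
    return str(x)

def run_loop(data, name_or_file):
    # Accept either a path (open it ourselves) or an already-open text
    # stream such as sys.stdout (use it as-is and don't close it).
    if isinstance(name_or_file, str):
        fo = open(name_or_file, 'w')
        close_when_done = True
    else:
        fo = name_or_file
        close_when_done = False
    try:
        fo.write("webstat output " + t)
        for x in data:
            fo.write(processer(x))
            fo.write('\n')
    finally:
        if close_when_done:
            fo.close()

# Existing call sites keep working:  run_loop(same_data, 'out.txt')
# Writing to stdout instead:         run_loop(same_data, sys.stdout)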
I am trying to read a large data file (millions of rows, in a very specific format) using a pre-built routine written in C. I then want to yield the results, line by line, via a generator function.
I can read the file OK, but whereas just running:
<command> <filename>
directly on Linux prints the results line by line as it finds them, I've had no luck replicating this inside my generator function. It seems to return the entire output as a single string that I then need to split on newlines, and of course everything has to be read before I can yield line 1.
This code will read the file, no problem:
import subprocess
import config
file_cmd = '<command> <filename>'
for rec in (subprocess.check_output([file_cmd], shell=True).decode(config.ENCODING).split('\n')):
    yield rec
(ENCODING is set in config.py to iso-8859-1 - it's a Swedish site)
The code I have works, in that it gives me the data, but in doing so, it tries to hold the whole lot in memory. I have larger files than this to process which are likely to blow the available memory, so this isn't an option.
I've played around with bufsize on Popen, but not had any success (and also, I can't decode or split after the Popen, though I guess the fact I need to split right now is actually my problem!).
I think I have this working now, so will answer my own question in the event somebody else is looking for this later ...
import shlex  # in addition to the subprocess and config imports shown above

proc = subprocess.Popen(shlex.split(file_cmd), stdout=subprocess.PIPE)
while True:
    output = proc.stdout.readline()
    if output == b'' and proc.poll() is not None:
        break
    if output:
        yield output.decode(config.ENCODING).strip()
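A slightly shorter variant of the same idea is to iterate over proc.stdout directly, which also reads one line at a time. This is a sketch, not the original poster's code; it assumes the same file_cmd string and config module with ENCODING as above, wrapped in a hypothetical generator called stream_lines:

import shlex
import subprocess

import config  # provides ENCODING, as in the question

def stream_lines(file_cmd):
    proc = subprocess.Popen(shlex.split(file_cmd), stdout=subprocess.PIPE)
    for raw_line in proc.stdout:          # iterates the pipe one line at a time (bytes)
        yield raw_line.decode(config.ENCODING).strip()
    proc.wait()                           # reap the child once its output is exhausted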
I am new to Python.
Can anybody explain the difference between a string variable and io.StringIO? Both can store characters.
e.g.
String variable:
k = 'RAVI'
io.StringIO:
string_out = io.StringIO()
string_out.write('A sample string which we have to send to server as string data.')
string_out.getvalue()
If we print k or string_out.getvalue(), both will print the text:
print(k)
print(string_out.getvalue())
They are similar because both str and StringIO represent strings; they just do it in different ways:
str: Immutable
StringIO: Mutable, file-like interface, which stores strs
A text-mode file handle (as produced by open("somefile.txt")) is also very similar to StringIO (both are "Text I/O"), with the latter allowing you to avoid using an actual file for file-like operations.
You can use io.StringIO() to simulate files. Since Python is dynamic with variable types, anything that accepts a file object will usually also accept an io.StringIO(), meaning you can have a "file" in memory whose contents you control without actually writing any temporary files to disk.
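As a minimal sketch of that idea (not from the original answer), here StringIO is handed to code that expects a writable text file, csv.writer, and the result is pulled back out as an ordinary str:

import csv
import io

buffer = io.StringIO()               # an in-memory "file" of text
writer = csv.writer(buffer)          # csv.writer only needs a file-like object
writer.writerow(['name', 'mark'])
writer.writerow(['RAVI', 'A'])

csv_text = buffer.getvalue()         # plain str containing the CSV contents
print(csv_text)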
I've been trying to get the following code to create a .txt file, write a string to it, and then print a message if that string is in the file. This is merely a study for a more complex project, but even given its simplicity, it's still not working.
Code:
import io
file = open("C:\\Users\\...\\txt.txt", "w+") #"..." is the rest of the file destination
file.write('wololo')
if "wololo" in file.read():
    print("ok")
This code always skips the if, as if there were no "wololo" inside the file, even though I've checked several times and it was indeed in there.
I'm not exactly sure what the problem could be, and I've spent a great deal of time searching for a solution, all to no avail. What could be wrong in this simple code?
Oh, and if I were to search for a string in a much bigger .txt file, would it still be wise to use file.read()?
Thanks!
When you write to your file, the cursor is moved to the end of the file. If you want to read the data afterwards, you'll have to move the cursor back to the beginning of the file, for example:
file = open("txt.txt", "w+")
file.write('wololo')
file.seek(0)
if "wololo" in file.read():
    print("ok")
file.close() # Remember to close the file
If the file is big, you should consider iterating over the file line by line instead, which avoids holding the entire file in memory. Also consider using a context manager (the with keyword) so that you don't have to close the file explicitly yourself.
with open('bigdata.txt', 'rb') as ifile:  # use rb mode on Windows for reading
    for line in ifile:
        if b'wololo' in line:   # bytes pattern, since the file is opened in binary mode
            print('OK')
            break
    else:
        print('String not in file')