I need to download a 60 MB ZIP file and extract the single file it contains. I want to do both the download and the extraction using streams. How can I achieve this in Rust?
fn main() {
    let mut res = reqwest::get("myfile.zip").unwrap();
    // extract the response body to myfile.txt
}
In Node.js I would do something like this:
http.get('myfile.zip', response => {
  response.pipe(unzip.Parse())
    .on('entry', entry => {
      if (entry.path.endsWith('.txt')) {
        entry.pipe(fs.createWriteStream('myfile.txt'))
      }
    })
})
With reqwest you can get the .zip file:
reqwest::get("myfile.zip")
Since reqwest only retrieves the file, you can use ZipArchive from the zip crate to unpack it. However, it's not possible to stream the .zip file straight into ZipArchive: ZipArchive::new(reader: R) requires R to implement both Read (which reqwest's Response fulfills) and Seek, which Response does not implement.
As a workaround you may use a temporary file:
copy_to(&mut tmpfile)
As File implements both Seek and Read, zip can be used here:
zip::ZipArchive::new(tmpfile)
This is a working example of the described method:
extern crate reqwest;
extern crate tempfile;
extern crate zip;

fn main() {
    let mut tmpfile = tempfile::tempfile().unwrap();
    reqwest::get("myfile.zip").unwrap().copy_to(&mut tmpfile).unwrap();
    let mut zip = zip::ZipArchive::new(tmpfile).unwrap();
    println!("{:#?}", zip);
}
tempfile is a handy crate that creates a temporary file for you, so you don't have to think of a name.
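The example above only prints the archive's debug representation. To actually extract the single .txt entry into myfile.txt, as the question asks, a sketch along these lines should work with the same crates (the URL and file names are placeholders):

extern crate reqwest;
extern crate tempfile;
extern crate zip;

use std::io;

fn main() {
    let mut tmpfile = tempfile::tempfile().unwrap();
    reqwest::get("http://example.com/myfile.zip")
        .unwrap()
        .copy_to(&mut tmpfile)
        .unwrap();

    let mut zip = zip::ZipArchive::new(tmpfile).unwrap();
    // The archive is assumed to contain exactly one file, so take entry 0.
    let mut entry = zip.by_index(0).unwrap();
    let mut out = std::fs::File::create("myfile.txt").unwrap();
    // Stream the decompressed entry into the output file.
    io::copy(&mut entry, &mut out).unwrap();
}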
Here's how I'd read the file hello.txt, with content hello world, from the archive hello.zip served by a local server:
extern crate reqwest;
extern crate zip;

use std::io::Read;

fn main() {
    let mut res = reqwest::get("http://localhost:8000/hello.zip").unwrap();
    let mut buf: Vec<u8> = Vec::new();
    let _ = res.read_to_end(&mut buf);

    let reader = std::io::Cursor::new(buf);
    let mut zip = zip::ZipArchive::new(reader).unwrap();
    let mut file_zip = zip.by_name("hello.txt").unwrap();

    let mut file_buf: Vec<u8> = Vec::new();
    let _ = file_zip.read_to_end(&mut file_buf);
    let content = String::from_utf8(file_buf).unwrap();
    println!("{}", content);
}
This will output hello world
Async solution using Tokio
It's a bit convoluted, but you can do this using tokio, futures, tokio_util::compat and async_compression. The key is to create a futures::io::AsyncRead stream using .into_async_read() and then convert it into a tokio::io::AsyncRead using .compat().
For simplicity, it downloads a txt.gz file and prints it line by line.
use async_compression::tokio::bufread::GzipDecoder;
use futures::stream::TryStreamExt;
use tokio::io::AsyncBufReadExt;
use tokio_util::compat::FuturesAsyncReadCompatExt;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let url = "https://f001.backblazeb2.com/file/korteur/hello-world.txt.gz";
    let response = reqwest::get(url).await?;

    let stream = response
        .bytes_stream()
        .map_err(|e| futures::io::Error::new(futures::io::ErrorKind::Other, e))
        .into_async_read()
        .compat();
    let gzip_decoder = GzipDecoder::new(stream);

    // Print decompressed txt content
    let buf_reader = tokio::io::BufReader::new(gzip_decoder);
    let mut lines = buf_reader.lines();
    while let Some(line) = lines.next_line().await? {
        println!("{line}");
    }

    Ok(())
}
Credit to Benjamin Kay.
In Rust, using sha256 = "1.0.2" (or similar), how do I hash a binary file (i.e. a tar.gz archive)?
I'm trying to get the sha256 of that binary file.
This doesn't work:
fn hash() -> String {
    let file = "file.tar.gz";
    let computed_hash = sha256::digest_file(std::path::Path::new(file)).unwrap();
    computed_hash
}
the output is:
...
Error { kind: InvalidData, message: "stream did not contain valid UTF-8" }
The sha2 crate, on which sha256 depends, supports hashing objects that implement Read without needing to read the entire file into memory. See the example in the RustCrypto hashes readme.
use sha2::{Sha256, Digest};
use std::{io, fs};

fn main() -> io::Result<()> {
    let mut hasher = Sha256::new();
    let mut file = fs::File::open("file.tar.gz")?;
    // io::copy streams the file through the hasher without reading it all into memory.
    let _bytes_written = io::copy(&mut file, &mut hasher)?;
    let hash_bytes = hasher.finalize();
    println!("{:?}", hash_bytes);
    Ok(())
}
Edit: Upgrading to sha256 = "1.0.3" should fix this.
The issue is that digest_file is internally reading the file to a String, which requires that it contains valid UTF-8, which is obviously not what you want in this case.
Instead, you could read the file in as bytes and pass that into sha256::digest_bytes:
let bytes = std::fs::read(path).unwrap(); // Vec<u8>
let hash = sha256::digest_bytes(&bytes);
Here's an implementation using the sha2 crate that doesn't read the entire file into memory and doesn't depend on the ring crate. ring isn't pure Rust, which in my case leads to cross-compilation difficulties.
use data_encoding::HEXLOWER;
use sha2::{Digest, Sha256};
use std::fs::File;
use std::io::{BufReader, Read, Result};
use std::path::PathBuf;

/// Calculates the SHA-256 digest and returns it as a lowercase hex string.
fn sha256_digest(path: &PathBuf) -> Result<String> {
    let input = File::open(path)?;
    let mut reader = BufReader::new(input);

    let digest = {
        let mut hasher = Sha256::new();
        let mut buffer = [0; 1024];
        loop {
            let count = reader.read(&mut buffer)?;
            if count == 0 {
                break;
            }
            hasher.update(&buffer[..count]);
        }
        hasher.finalize()
    };

    Ok(HEXLOWER.encode(digest.as_ref()))
}
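For completeness, calling it could look like this (the file name is just a placeholder):

fn main() {
    let path = std::path::PathBuf::from("file.tar.gz");
    let digest = sha256_digest(&path).expect("unable to hash file");
    println!("SHA-256 digest is {}", digest);
}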
I'm creating a small application that explores variable lifetimes and threads. I want to load a file once, and then use its contents (in this case an audio file) from a separate thread. I am having issues with value lifetimes.
I'm almost certain the syntax is wrong for what I have so far (for creating a static variable), but I can't find any resources for File types and lifetimes. What I have thus far produces this error:
let file = &File::open("src/censor-beep-01.wav").unwrap();
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ creates a temporary which is freed while still in use
let x: &'static File = file;
------------- type annotation requires that borrow lasts for `'static`
The code I currently have is:
#![allow(dead_code)]
#![allow(unused_imports)]
#![allow(unused_must_use)]
#![allow(unused_variables)]

use std::io::{self, BufRead, BufReader, stdin, Read};
use std::sync::mpsc::{self, TryRecvError};
use std::thread;
use std::time::Duration;
use std::fs::File;
use std::rc::Rc;

use rodio::Source;

fn main() {
    let file = &File::open("src/censor-beep-01.wav").unwrap();
    let x: &'static File = file;

    loop {
        let (tx, rx) = mpsc::channel();
        thread::spawn(move || loop {
            let tmp = x;
            let (stream, stream_handle) = rodio::OutputStream::try_default().unwrap();
            let source = rodio::Decoder::new(BufReader::new(tmp)).unwrap();
            stream_handle.play_raw(source.convert_samples());
            match rx.try_recv() {
                Ok(_) | Err(TryRecvError::Disconnected) => {
                    break;
                }
                Err(TryRecvError::Empty) => {
                    println!("z");
                    thread::sleep(Duration::from_millis(1000));
                }
            }
        });

        let mut line = String::new();
        let stdin = io::stdin();
        let _ = stdin.lock().read_line(&mut line);
        let _ = tx.send(());
        return;
    }
}
You need to wrap the file in Arc and Mutex, like Arc::new(Mutex::new(file)), and then clone that Arc before passing it to the thread.
Arc provides reference counting, which is what lets you share the target object (in your case a file) across threads, and Mutex is needed so the target object is only accessed by one thread at a time.
Sample code (I have simplified your code to make it more understandable):
let file = Arc::new(Mutex::new(File::open("src/censor-beep-01.wav").unwrap()));
loop {
    let file = file.clone();
    thread::spawn(move || loop {
        let mut file_guard = match file.lock() {
            Ok(guard) => guard,
            Err(poison) => poison.into_inner(),
        };
        let file = file_guard.deref();
        // now you can pass the above file object to BufReader like "BufReader::new(file)"
    });
}
Reason for the "creates a temporary which is freed while still in use" error: only a reference to the file is stored, never the File object itself in a variable you own. The temporary File is freed at the end of the scope, while the &'static annotation demands a borrow that lives for the entire program.
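As a side note on the ownership angle (a sketch of an alternative, not the Arc/Mutex approach above): if the audio file fits in memory, you can read it into an owned Vec<u8> once and move a copy of the bytes into the thread. Owned bytes behind a Cursor implement Read + Seek and borrow nothing, so no 'static reference is needed:

use std::io::Cursor;
use std::thread;

fn main() {
    // Read the file once; the bytes are owned, so no temporary is borrowed.
    let bytes = std::fs::read("src/censor-beep-01.wav").unwrap();

    let thread_bytes = bytes.clone(); // give the thread its own copy
    let handle = thread::spawn(move || {
        // Cursor<Vec<u8>> is Read + Seek + Send + 'static, so it can be handed
        // to a decoder such as rodio::Decoder::new.
        let _reader = Cursor::new(thread_bytes);
    });
    handle.join().unwrap();
}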
I'm new to Rust and likely have a huge knowledge gap. Basically, I'm hoping to create a utility function that accepts either a regular text file or a ZIP file and returns a BufRead so the caller can process it line by line. It works well for non-ZIP files, but I don't understand how to achieve the same for ZIP files. The ZIP files will only contain a single file within the archive, which is why I'm only processing the first file in the ZipArchive.
I'm running into the following error.
error[E0515]: cannot return value referencing local variable `archive_contents`
--> src/file_reader.rs:30:9
|
27 | let archive_file: zip::read::ZipFile = archive_contents.by_index(0).unwrap();
| ---------------- `archive_contents` is borrowed here
...
30 | Ok(Box::new(BufReader::with_capacity(128 * 1024, archive_file)))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ returns a value referencing data owned by the current function
It seems the archive_contents is preventing the BufRead object from returning to the caller. I'm just not sure how to work around this.
file_reader.rs
use std::ffi::OsStr;
use std::fs::File;
use std::io::BufRead;
use std::io::BufReader;
use std::path::Path;

pub struct FileReader {
    pub file_reader: Result<Box<BufRead>, &'static str>,
}

pub fn file_reader(filename: &str) -> Result<Box<BufRead>, &'static str> {
    let path = Path::new(filename);
    let file = match File::open(&path) {
        Ok(file) => file,
        Err(why) => panic!(
            "ERROR: Could not open file, {}: {}",
            path.display(),
            why.to_string()
        ),
    };

    if path.extension() == Some(OsStr::new("zip")) {
        // Processing ZIP file.
        let mut archive_contents: zip::read::ZipArchive<std::fs::File> =
            zip::ZipArchive::new(file).unwrap();
        let archive_file: zip::read::ZipFile = archive_contents.by_index(0).unwrap();

        // ERRORS: returns a value referencing data owned by the current function
        Ok(Box::new(BufReader::with_capacity(128 * 1024, archive_file)))
    } else {
        // Processing non-ZIP file.
        Ok(Box::new(BufReader::with_capacity(128 * 1024, file)))
    }
}
main.rs
mod file_reader;

use std::io::BufRead;

fn main() {
    let mut files: Vec<String> = Vec::new();
    files.push("/tmp/text_file.txt".to_string());
    files.push("/tmp/zip_file.zip".to_string());

    for f in files {
        let mut fr = match file_reader::file_reader(&f) {
            Ok(fr) => fr,
            Err(e) => panic!("Error reading file."),
        };

        fr.lines().for_each(|l| match l {
            Ok(l) => {
                println!("{}", l);
            }
            Err(e) => {
                println!("ERROR: Failed to read line:\n {}", e);
            }
        });
    }
}
Any help is greatly appreciated!
It seems the archive_contents is preventing the BufRead object from returning to the caller. I'm just not sure how to work around this.
You have to restructure the code somehow. The issue here is that the archive data is part of the archive. So unlike file, archive_file is not an independent item; it is rather a pointer of sorts into the archive itself, which means the archive needs to live longer than archive_file for this code to be correct.
In a GC'd language this isn't an issue: archive_file holds a reference to archive and keeps it alive as long as it needs to. Not so for Rust.
A simple way to fix this would be to copy the data out of archive_file into an owned buffer you can return to the caller. Another option might be to return a wrapper around (archive_contents, item_index) that delegates the reading (this might be somewhat tricky, though). Yet another would be to not have a file_reader function at all.
Thanks to @Masklinn for the direction! Here's the working solution using their suggestion.
file_reader.rs
use std::ffi::OsStr;
use std::fs::File;
use std::io::BufRead;
use std::io::BufReader;
use std::io::Cursor;
use std::io::Error;
use std::io::Read;
use std::path::Path;
use zip::read::ZipArchive;

pub fn file_reader(filename: &str) -> Result<Box<dyn BufRead>, Error> {
    let path = Path::new(filename);
    let file = match File::open(&path) {
        Ok(file) => file,
        Err(why) => return Err(why),
    };

    if path.extension() == Some(OsStr::new("zip")) {
        let mut archive_contents = ZipArchive::new(file)?;
        let mut archive_file = archive_contents.by_index(0)?;

        // Read the contents of the file into a vec.
        let mut data = Vec::new();
        archive_file.read_to_end(&mut data)?;

        // Wrap vec in a std::io::Cursor.
        let cursor = Cursor::new(data);
        Ok(Box::new(cursor))
    } else {
        // Processing non-ZIP file.
        Ok(Box::new(BufReader::with_capacity(128 * 1024, file)))
    }
}
While the solution you have settled on does work, it has a few disadvantages. One is that when you read from a zip file, you have to read the contents of the file you want to process into memory before proceeding, which might be impractical for a large file. Another is that you have to heap allocate the BufReader in either case.
Another possibly more idiomatic solution is to restructure your code, such that the BufReader does not need to be returned from the function at all - rather, structure your code so that it has a function that opens the file, which in turn calls a function that processes the file:
use std::ffi::OsStr;
use std::fs::File;
use std::io::BufRead;
use std::io::BufReader;
use std::path::Path;

pub fn process_file(filename: &str) -> Result<usize, String> {
    let path = Path::new(filename);
    let file = match File::open(&path) {
        Ok(file) => file,
        Err(why) => {
            return Err(format!(
                "ERROR: Could not open file, {}: {}",
                path.display(),
                why.to_string()
            ))
        }
    };

    if path.extension() == Some(OsStr::new("zip")) {
        // Handling a zip file
        let mut archive_contents = zip::ZipArchive::new(file).unwrap();
        let mut buf_reader =
            BufReader::with_capacity(128 * 1024, archive_contents.by_index(0).unwrap());
        process_reader(&mut buf_reader)
    } else {
        // Handling a plain file.
        process_reader(&mut BufReader::with_capacity(128 * 1024, file))
    }
}

pub fn process_reader(reader: &mut dyn BufRead) -> Result<usize, String> {
    // Example: just count the number of lines
    Ok(reader.lines().count())
}

fn main() {
    let mut files: Vec<String> = Vec::new();
    files.push("/tmp/text_file.txt".to_string());
    files.push("/tmp/zip_file.zip".to_string());

    for f in files {
        match process_file(&f) {
            Ok(count) => println!("File {} Count: {}", &f, count),
            Err(e) => println!("Error reading file: {}", e),
        };
    }
}
This way, you don't need any Boxes and you don't need to read the file into memory before processing it.
A drawback to this solution would if you had multiple functions that need to be able to read from zip files. One way to handle that would be to define process_file to take a callback function to do the processing. First you would change the definition of process_file to be:
pub fn process_file<C>(filename: &str, process_reader: C) -> Result<usize, String>
where
    C: FnOnce(&mut dyn BufRead) -> Result<usize, String>,
The rest of the function body can be left unchanged. Now, process_reader can be passed into the function, like this:
process_file(&f, count_lines)
where count_lines would be the original simple function to count the lines, for instance.
This would also allow you to pass in a closure:
process_file(&f, |reader| Ok(reader.lines().count()))
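For concreteness, here is a minimal sketch of what the callback-taking version might look like (the body is the same as above, with the closure parameter added; the file-opening match is shortened to map_err here, and count_lines is the simple line-counting function mentioned earlier):

use std::ffi::OsStr;
use std::fs::File;
use std::io::{BufRead, BufReader};
use std::path::Path;

pub fn process_file<C>(filename: &str, process_reader: C) -> Result<usize, String>
where
    C: FnOnce(&mut dyn BufRead) -> Result<usize, String>,
{
    let path = Path::new(filename);
    let file = File::open(&path)
        .map_err(|why| format!("ERROR: Could not open file, {}: {}", path.display(), why))?;

    if path.extension() == Some(OsStr::new("zip")) {
        // Hand the decompressed first entry to the caller-supplied closure.
        let mut archive_contents = zip::ZipArchive::new(file).unwrap();
        let mut buf_reader =
            BufReader::with_capacity(128 * 1024, archive_contents.by_index(0).unwrap());
        process_reader(&mut buf_reader)
    } else {
        // Hand the plain file to the caller-supplied closure.
        process_reader(&mut BufReader::with_capacity(128 * 1024, file))
    }
}

fn count_lines(reader: &mut dyn BufRead) -> Result<usize, String> {
    Ok(reader.lines().count())
}

fn main() {
    // Both calling styles from the answer: a named function and a closure.
    println!("{:?}", process_file("/tmp/zip_file.zip", count_lines));
    println!("{:?}", process_file("/tmp/text_file.txt", |reader| Ok(reader.lines().count())));
}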
use std::env;
use std::fs::File;
use std::io::prelude::*;

fn main() {
    let args: Vec<String> = env::args().collect();
    let filename = &args[1];

    let mut f = File::open(filename).expect("file not found");
    let mut contents = String::new();
    f.read_to_string(&mut contents)
        .expect("something went wrong reading the file");

    println!("file content:\n{}", contents);
}
When I attempt to read a GBK encoded file, I get the following error:
thread 'main' panicked at 'something went wrong reading the file: Error { repr: Custom(Custom { kind: InvalidData, error: StringError("stream did not contain valid UTF-8") }) }', /checkout/src/libcore/result.rs:860
It says the stream must contain valid UTF-8. How can I read a GBK file?
I figured out how to read line by line from a GBK-encoded file.
extern crate encoding;

use std::env;
use std::fs::File;
use std::io::prelude::*;
use std::io::BufReader;

use encoding::all::GBK;
use encoding::{DecoderTrap, Encoding};

fn main() {
    let args: Vec<String> = env::args().collect();
    let filename = &args[1];

    let file = File::open(filename).expect("file not found");
    let reader = BufReader::new(&file);

    // Split on raw b'\n' bytes first, then decode each line from GBK to UTF-8.
    let lines = reader.split(b'\n').map(|l| l.unwrap());
    for line in lines {
        let decoded_string = GBK.decode(&line, DecoderTrap::Strict).unwrap();
        println!("{}", decoded_string);
    }
}
You likely want the encoding crate.
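For example, a minimal sketch of decoding a whole GBK file at once with the encoding crate (the path gbk.txt is a placeholder, and the file is assumed to fit in memory):

extern crate encoding;

use std::fs::File;
use std::io::Read;

use encoding::all::GBK;
use encoding::{DecoderTrap, Encoding};

fn main() {
    // Read the raw bytes first; only decode into a Rust String afterwards.
    let mut bytes = Vec::new();
    File::open("gbk.txt")
        .expect("file not found")
        .read_to_end(&mut bytes)
        .expect("something went wrong reading the file");

    let contents = GBK
        .decode(&bytes, DecoderTrap::Strict)
        .expect("file is not valid GBK");
    println!("file content:\n{}", contents);
}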
With Rust being comparatively new, I've seen far too many ways of reading and writing files. Many are extremely messy snippets someone came up with for their blog, and 99% of the examples I've found (even on Stack Overflow) are from unstable builds that no longer work. Now that Rust is stable, what is a simple, readable, non-panicking snippet for reading or writing files?
This is the closest I've gotten to something that works in terms of reading a text file, but it's still not compiling even though I'm fairly certain I've included everything I should have. This is based off of a snippet I found on Google+ of all places, and the only thing I've changed is that the old BufferedReader is now just BufReader:
use std::fs::File;
use std::io::BufReader;
use std::path::Path;

fn main() {
    let path = Path::new("./textfile");
    let mut file = BufReader::new(File::open(&path));
    for line in file.lines() {
        println!("{}", line);
    }
}
The compiler complains:
error: the trait bound `std::result::Result<std::fs::File, std::io::Error>: std::io::Read` is not satisfied [--explain E0277]
--> src/main.rs:7:20
|>
7 |> let mut file = BufReader::new(File::open(&path));
|> ^^^^^^^^^^^^^^
note: required by `std::io::BufReader::new`
error: no method named `lines` found for type `std::io::BufReader<std::result::Result<std::fs::File, std::io::Error>>` in the current scope
--> src/main.rs:8:22
|>
8 |> for line in file.lines() {
|> ^^^^^
To sum it up, what I'm looking for is:
- brevity
- readability
- covers all possible errors
- doesn't panic
None of the functions I show here panic on their own, but I am using expect because I don't know what kind of error handling will fit best into your application. Go read The Rust Programming Language's chapter on error handling to understand how to appropriately handle failure in your own program.
Rust 1.26 and onwards
If you don't want to care about the underlying details, there are one-line functions for reading and writing.
Read a file to a String
use std::fs;

fn main() {
    let data = fs::read_to_string("/etc/hosts").expect("Unable to read file");
    println!("{}", data);
}
Read a file as a Vec<u8>
use std::fs;

fn main() {
    let data = fs::read("/etc/hosts").expect("Unable to read file");
    println!("{}", data.len());
}
Write a file
use std::fs;

fn main() {
    let data = "Some data!";
    fs::write("/tmp/foo", data).expect("Unable to write file");
}
Rust 1.0 and onwards
These forms are slightly more verbose than the one-line functions that allocate a String or Vec for you, but are more powerful in that you can reuse allocated data or append to an existing object.
Reading data
Reading a file requires two core pieces: File and Read.
Read a file to a String
use std::fs::File;
use std::io::Read;

fn main() {
    let mut data = String::new();
    let mut f = File::open("/etc/hosts").expect("Unable to open file");
    f.read_to_string(&mut data).expect("Unable to read string");
    println!("{}", data);
}
Read a file as a Vec<u8>
use std::fs::File;
use std::io::Read;

fn main() {
    let mut data = Vec::new();
    let mut f = File::open("/etc/hosts").expect("Unable to open file");
    f.read_to_end(&mut data).expect("Unable to read data");
    println!("{}", data.len());
}
Write a file
Writing a file is similar, except we use the Write trait and we always write out bytes. You can convert a String / &str to bytes with as_bytes:
use std::fs::File;
use std::io::Write;

fn main() {
    let data = "Some data!";
    let mut f = File::create("/tmp/foo").expect("Unable to create file");
    f.write_all(data.as_bytes()).expect("Unable to write data");
}
Buffered I/O
I felt a bit of a push from the community to use BufReader and BufWriter instead of reading straight from a file
A buffered reader (or writer) uses a buffer to reduce the number of I/O requests. For example, it's much more efficient to access the disk once to read 256 bytes instead of accessing the disk 256 times.
That being said, I don't believe a buffered reader/writer will be useful when reading the entire file. read_to_end seems to copy data in somewhat large chunks, so the transfer may already be naturally coalesced into fewer I/O requests.
Here's an example of using it for reading:
use std::fs::File;
use std::io::{BufReader, Read};

fn main() {
    let mut data = String::new();
    let f = File::open("/etc/hosts").expect("Unable to open file");
    let mut br = BufReader::new(f);
    br.read_to_string(&mut data).expect("Unable to read string");
    println!("{}", data);
}
And for writing:
use std::fs::File;
use std::io::{BufWriter, Write};

fn main() {
    let data = "Some data!";
    let f = File::create("/tmp/foo").expect("Unable to create file");
    let mut f = BufWriter::new(f);
    f.write_all(data.as_bytes()).expect("Unable to write data");
}
A BufReader is more useful when you want to read line-by-line:
use std::fs::File;
use std::io::{BufRead, BufReader};

fn main() {
    let f = File::open("/etc/hosts").expect("Unable to open file");
    let f = BufReader::new(f);

    for line in f.lines() {
        let line = line.expect("Unable to read line");
        println!("Line: {}", line);
    }
}
For anybody who is writing to a file: the accepted answer is good, but if you need to append to the file, you have to use the OpenOptions struct instead:
use std::io::Write;
use std::fs::OpenOptions;

fn main() {
    let data = "Some data!\n";
    let mut f = OpenOptions::new()
        .append(true)
        .create(true) // Optionally create the file if it doesn't already exist
        .open("/tmp/foo")
        .expect("Unable to open file");
    f.write_all(data.as_bytes()).expect("Unable to write data");
}
Buffered writing still works the same way:
use std::io::{BufWriter, Write};
use std::fs::OpenOptions;

fn main() {
    let data = "Some data!\n";
    let f = OpenOptions::new()
        .append(true)
        .open("/tmp/foo")
        .expect("Unable to open file");
    let mut f = BufWriter::new(f);
    f.write_all(data.as_bytes()).expect("Unable to write data");
}
By using buffered I/O you can copy a file whose size is greater than the available memory:
use std::fs::{File, OpenOptions};
use std::io::{BufRead, BufReader, BufWriter, Write};

fn main() {
    let read = File::open(r#"E:\1.xls"#);
    let write = OpenOptions::new().write(true).create(true).open(r#"E:\2.xls"#);
    let mut reader = BufReader::new(read.unwrap());
    let mut writer = BufWriter::new(write.unwrap());

    let mut length = 1;
    while length > 0 {
        // fill_buf returns the currently buffered data, reading more from disk if needed.
        let buffer = reader.fill_buf().unwrap();
        writer.write_all(buffer).unwrap();
        length = buffer.len();
        // Mark the buffered bytes as consumed so the next fill_buf reads fresh data.
        reader.consume(length);
    }
    // Make sure everything buffered in the writer reaches the file.
    writer.flush().unwrap();
}
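For a plain copy, the standard library can do this chunked loop for you; a roughly equivalent sketch using std::io::copy (same placeholder paths as above):

use std::fs::{File, OpenOptions};
use std::io::{self, BufReader, BufWriter};

fn main() {
    let read = File::open(r#"E:\1.xls"#).unwrap();
    let write = OpenOptions::new()
        .write(true)
        .create(true)
        .open(r#"E:\2.xls"#)
        .unwrap();

    let mut reader = BufReader::new(read);
    let mut writer = BufWriter::new(write);
    // io::copy moves the data in buffer-sized chunks, so the whole file never
    // has to fit in memory at once.
    io::copy(&mut reader, &mut writer).unwrap();
}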
}