process::Command not running a program being provided - rust

I'm trying to run a program through Rust to make a sort of selection GUI. I'm 90% sure that my working directory is correct, but actually running the test command does nothing. I've followed the documentation as best I could, but I'm pretty new.
use std::process::Command;
use std::path::Path;
use std::env;

const ROM_DIRECTORY: &str = "C:/ROMS";

fn main() {
    let internal_directory = Path::new(ROM_DIRECTORY);
    assert!(env::set_current_dir(internal_directory).is_ok());
    println!("Successfully changed working directory to {}!", internal_directory.display());
    Command::new("Rockman.nes");
}

Creating a Command by itself does not do anything; it only builds a description of the process to run. You must explicitly execute it by calling a method such as output, spawn, or status.
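A minimal sketch of the fix, assuming the file in the current directory really is something the OS can execute directly (in practice you would more likely launch an emulator and pass the ROM path as an argument):

use std::process::Command;

fn main() {
    // status() spawns the process and waits for it to finish
    let status = Command::new("Rockman.nes")
        .status()
        .expect("failed to launch Rockman.nes");
    println!("process exited with: {status}");
}

spawn() would instead return a handle without waiting, and output() additionally captures stdout and stderr.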

Related

How to deal with assets in rust?

I have several JSON files which contain objects that need to be exported from the module to be used (read-only) in various places in the code base. Exporting a function that reads and parses the files, and invoking it every time the objects are needed, seems very wasteful. In Go I'd export a global variable and initialize it in an init function. So how do I go about doing this in Rust?
I guess you are using this for interface definitions between different system parts. This is a known and well-understood problem and is usually solved with a build script, as in the case of protobuf.
There is a very good tutorial about how to use a build script to generate files.
This is how it could look in code.
(All files are relative to the crate root directory)
shared_data.json:
{
    "example_data": 42
}
build.rs:
use std::{
    env,
    fs::File,
    io::{Read, Write},
    path::PathBuf,
};

fn main() {
    // OUT_DIR is automatically set by cargo and contains the build directory path
    let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
    // The path of the input file
    let data_path_in = "shared_data.json";
    // The path in the build directory that should contain the generated file
    let data_path_out = out_path.join("generated_shared_data.rs");

    // Tell cargo to re-run the build script whenever the input file changes
    println!("cargo:rerun-if-changed={data_path_in}");

    // The actual conversion
    let mut data_in = String::new();
    File::open(data_path_in)
        .unwrap()
        .read_to_string(&mut data_in)
        .unwrap();
    {
        let mut out_file = File::create(data_path_out).unwrap();
        writeln!(
            out_file,
            "::lazy_static::lazy_static! {{ static ref SHARED_DATA: ::serde_json::Value = ::serde_json::json!({}); }}",
            data_in
        )
        .unwrap();
    }
}
main.rs:
include!(concat!(env!("OUT_DIR"), "/generated_shared_data.rs"));

fn main() {
    let example_data = SHARED_DATA
        .as_object()
        .unwrap()
        .get("example_data")
        .unwrap()
        .as_u64()
        .unwrap();
    println!("{}", example_data);
}
Output:
42
Note that this still uses lazy_static, because I didn't realize that the json!() macro isn't const.
One could of course adjust the build script to work without lazy_static, but that would probably involve writing a custom serializer that serializes the json code inside the build script into executable Rust code.
EDIT: After further research, I came to the conclusion that it's impossible to create serde_json::Values in a const fashion. So I don't think there is a way around lazy_static.
And if you are using lazy_static, you might as well skip the entire build.rs step and use include_str!() instead:
use lazy_static::lazy_static;

lazy_static! {
    static ref SHARED_DATA: serde_json::Value =
        serde_json::from_str(include_str!("../shared_data.json")).unwrap();
}

fn main() {
    let example_data = SHARED_DATA
        .as_object()
        .unwrap()
        .get("example_data")
        .unwrap()
        .as_u64()
        .unwrap();
    println!("{}", example_data);
}
However, this will result in a runtime error if the JSON is broken, whereas with the build.rs approach and json!() it becomes a compile-time error.
The general way of solving this problem in Rust is to read the assets in a single place (main(), for example), and then pass a reference to the assets as needed. This pattern plays nicely with Rust's borrow checker and initialization rules.
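A minimal sketch of that pattern, assuming the same shared_data.json file as above and the serde_json crate:

use serde_json::Value;
use std::fs;

fn print_example(data: &Value) {
    // Callers borrow the parsed assets instead of reaching for a global
    println!("{}", data["example_data"]);
}

fn main() {
    // Read and parse the assets once, up front
    let raw = fs::read_to_string("shared_data.json").expect("asset file missing");
    let assets: Value = serde_json::from_str(&raw).expect("invalid JSON");
    print_example(&assets);
}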
However, if you insist on using global variables:
When we apply Rust's initialization and borrow-checker rules to global variables, one can see how the compiler might have a hard time proving (in general) that all accesses to a global variable are safe. Using a mutable global may therefore require the unsafe keyword at each access, in which case the programmer is asserting responsibility for manually verifying that every access happens in a safe way. Idiomatic Rust tries to build safe abstractions to minimize how often programmers need to do this. I wouldn't consider the lazy_static! macro a hack: it is an abstraction (and a very commonly used one) that transfers the responsibility for proving that the global access is safe from the programmer to the language.
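As a minimal illustration of that trade-off, using a simple counter rather than the JSON assets purely for the sake of example:

// A mutable global requires unsafe at every access site;
// the programmer must prove each access is sound.
static mut COUNTER: u32 = 0;

fn bump_unsafely() {
    unsafe {
        COUNTER += 1;
    }
}

// A safe abstraction such as lazy_static plus a Mutex moves that proof
// into the type system instead.
use lazy_static::lazy_static;
use std::sync::Mutex;

lazy_static! {
    static ref SAFE_COUNTER: Mutex<u32> = Mutex::new(0);
}

fn bump_safely() {
    *SAFE_COUNTER.lock().unwrap() += 1;
}

fn main() {
    bump_unsafely();
    bump_safely();
}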

Add items to crate root module from separate files without creating child modules

I have a small Rust project, and would like the simplicity of a flat module system, combined with the modularity of separate files. In essence, I would like this program:
src/main.rs
fn print_hello_world() {
    println!("Hello, world!");
}

fn main() {
    print_hello_world();
}
to be structured akin to this:
src/print_hello_world.rs
pub fn print_hello_world() {
    println!("Hello, world!");
}
src/main.rs
use crate::print_hello_world; // Error: no `print_hello_world` in the root

fn main() {
    print_hello_world();
}
Note that there is only a single module in this pseudo-project: the crate root module. No child modules exist. Is it possible to structure my project like this? If yes, how?
Solution attempts
It is easy to get this file structure by having this:
src/main.rs
mod print_hello_world;
use print_hello_world::print_hello_world;

fn main() {
    print_hello_world();
}
but this does not give me the desired module structure since it introduces a child module.
I have tried to find the answer to this question myself by studying various authoritative texts on the matter, most notably the relevant parts of
The Rust Book
The Rust Reference
The rustc book
My conclusion so far is that what I am asking is not possible; that there is no way to use code from other files without introducing child modules. But there is always the hope that I am missing something.
This is possible, but unidiomatic and not recommended, since it results in bad code isolation. Making changes in one file is likely to introduce compilation errors in other files. If you still want to do it, you can use the include!() macro, like this:
src/main.rs
include!("print_hello_world.rs");
fn main() {
print_hello_world();
}

How to use a crate only for a given platform?

I would like to use the nix crate in a project.
However, this project also has an acceptable alternative implementation for OSX and Windows, where I would like to use a different crate.
What is the current way of expressing that I only want nix in Linux platforms?
There are two steps needed to make a dependency completely target-specific.
First, you need to specify this in your Cargo.toml, like so:
[target.'cfg(target_os = "linux")'.dependencies]
nix = "0.5"
This will make Cargo include the dependency only when that configuration is active. However, on non-Linux operating systems you'll then get a compile error for every spot where you use nix in your code. To remedy this, annotate those usages with a cfg attribute, like so:
#[cfg(target_os = "linux")]
use nix::foo;
Of course that has ripple effects: other code that uses those items now fails to compile, because the import, function, module, or whatever it was doesn't exist on non-Linux targets. One common way to deal with that is to put all usages of nix into one function and use a no-op function on all other OSes. For example:
#[cfg(target_os = "linux")]
fn do_stuff() {
nix::do_something();
}
#[cfg(not(target_os = "linux"))]
fn do_stuff() {}
fn main() {
do_stuff();
}
With this, the function do_stuff exists and can be called on all platforms. Of course, you have to decide for yourself what the function should do on non-Linux systems.
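For the alternative crate on macOS and Windows mentioned in the question, the same Cargo.toml syntax works in the other direction; a sketch, with other-crate standing in as a hypothetical placeholder name:

[target.'cfg(target_os = "linux")'.dependencies]
nix = "0.5"

[target.'cfg(any(target_os = "macos", target_os = "windows"))'.dependencies]
other-crate = "1.0"  # hypothetical placeholder for the alternative implementation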

How can I get command stdout from a process using assert_cli crate?

I am using assert_cli crate to test a command line application. While it is very helpful with simple use cases (see some examples in this article), sometimes I want to get the raw output of the command I am testing as a String to do more sophisticated checks (regex, json or just more complex logic in the output).
For that I need to get a copy of the command output verbatim. Here is an example:
extern crate assert_cli;

fn main() {
    let a = assert_cli::Assert::command(&["echo", "foo-bar-foo"]);
    a.execute();
    println!("{:?}", a.expect_output);
}
Somewhat predictably it gives me the following error:
error[E0616]: field `expect_output` of struct `assert_cli::Assert` is private
--> src/main.rs:14:22
|
14 | println!("{:?}", a.expect_output);
| ^^^^^^^^^^^^^^^
It also has a .stdout() method, but that goes through OutputAssertionBuilder, where it is likewise not obvious how to access the actual contents of stdout; you can only do some simple checks using the predicates syntax.
assert_cli does internally capture the full output of the command during execute, as seen in the source code of assert.rs:
let output = spawned.wait_with_output()?;
All the internal Command and output variables seem to be private and are never exposed to retrieve the raw stdout. This functionality seems too basic to be omitted from the assert_cli library. I am probably missing something very obvious...
Q: Is there any way to get raw stdout back as contents of a variable?
This is what I want to achieve ideally:
extern crate assert_cli;

fn main() {
    // do simple checking with assert_cli
    let a = assert_cli::Assert::command(&["echo", "foo-bar-foo"])
        .stdout().contains("foo-bar-foo")
        .unwrap();

    // get raw stdout
    let cmd_stdout = a.get_raw_stdout(); // how to do it?

    // do some additional complex checking
    assert_eq!(cmd_stdout, "foo-bar-foo");
}
P.S.: I know I can use std::process::Command separately to achieve this. I wonder if I can still stick to assert_cli since I do 80% of the testing with it.
The library defines only three types, none of which allow you to access the output directly.
This functionality seems too basic to be omitted from the assert_cli library. I am probably missing something very obvious...
The library is called assert* and it has all the functions you need to assert things about the output of your command; getting the actual output is outside the domain of "assertions".
Other people have opened an issue on the repository asking for this exact feature. I suggest you go there and tell the author that this feature interests you.
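As the question's P.S. notes, the raw output can always be captured with std::process::Command directly; a minimal sketch of that fallback:

use std::process::Command;

fn main() {
    // Run the command and capture everything it writes
    let output = Command::new("echo")
        .arg("foo-bar-foo")
        .output()
        .expect("failed to run command");
    let stdout = String::from_utf8_lossy(&output.stdout);
    // Arbitrary checks (regex, JSON parsing, ...) can now be done on `stdout`
    assert!(stdout.contains("foo-bar-foo"));
}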

No method 'entries_mut' when unarchiving a file with the tar crate?

I am trying to use the flate2 and tar crates to iterate over the entries of a .tar.gz file, but am getting type errors, and I'm not sure why.
Here is my code (and yes, I know I shouldn't use .unwrap() everywhere, this is just POC code):
extern crate flate2; // version 0.2.11
extern crate tar; // version 0.3

use std::io::Read;
use std::fs::File;
use flate2::read::GzDecoder;
use tar::Archive;

fn main() {
    let file = File::open("/path/to/tarball.tar.gz").unwrap();
    let mut decompressed = GzDecoder::new(file).unwrap();
    let unarchived = Archive::new(decompressed);
    let entries_iter = unarchived.entries_mut();
}
This gives me the following error: no method named 'entries_mut' found for type 'tar::Archive<flate2::gz::DecoderReader<std::fs::File>>' in the current scope.
GzDecoder::new is returning a DecoderReader<R>, which implements Read as long as R implements Read, which File does, so that should be fine. Archive<O> has different methods depending on what kind of traits O implements, but in this case I am trying to use .entries_mut(), which only requires O to implement Read.
Obviously I am missing something here, could someone help shed some light on this?
Oh man, this is tricky. The published documentation and the code do not match. In tar version 0.3.2, the method is called files_mut:
extern crate flate2; // version 0.2.11
extern crate tar; // version 0.3

use std::fs::File;
use flate2::read::GzDecoder;
use tar::Archive;

fn main() {
    let file = File::open("/path/to/tarball.tar.gz").unwrap();
    let decompressed = GzDecoder::new(file).unwrap();
    let mut unarchived = Archive::new(decompressed);
    let _files_iter = unarchived.files_mut();
}
This commit changed the API.
This is a subtle but prevalent problem with self-hosted Rust documentation at the moment (my own crates have the same issue). We build the documentation on every push to the master branch, but people use stable releases. Sometimes these go out of sync.
The best thing you can do is to run cargo doc or cargo doc --open on your local project. This will build a set of documentation for the crates and versions you are using.
It turns out that the published documentation of tar-rs was for a different version than the one on crates.io, so I had to change .entries_mut to .files_mut, and let files = to let mut files =.
