I would like to write binary data to a file for an ancillary hash table operation and then read it back using stream.rawRead(). How would I go about converting a string to binary in D? I would prefer not to use any third-party libraries if I can.
The built-in module std.utf has methods to convert to and from the UTF encodings (with UTF-8 being compatible with ASCII).
If you want to use rawRead, you should write the length of the string first, so that when reading you know how many bytes the string is.
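For example, a rough sketch of that length-prefix approach (the file name and the fixed-width ulong prefix here are just illustrative choices, not anything prescribed):

import std.stdio;

void main()
{
    // write: a fixed-width length prefix, then the raw UTF-8 bytes
    string s = "Hello world";
    auto f = File("str.bin", "wb");
    ulong[1] len = [s.length];
    f.rawWrite(len[]);
    f.rawWrite(s);
    f.close();

    // read: length first, then exactly that many bytes
    f = File("str.bin", "rb");
    ulong[1] lenBuf;
    f.rawRead(lenBuf[]);
    auto buf = new char[cast(size_t) lenBuf[0]];
    f.rawRead(buf);
    writeln(buf); // Hello world
}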
Side note - if your strings are ASCII, it is pretty much straightforward:
import std.stdio, std.conv;

// the following will not work (non-ASCII text is not one byte per character):
// ubyte[] stringBytes = cast(ubyte[]) "Добар дан!".dup;
ubyte[] stringBytes = cast(ubyte[]) "Hello world".dup;
writeln(stringBytes);
char[] charr = cast(char[]) stringBytes;
writeln(charr);
string str = to!string(charr);
writeln(str);
Output:
[72, 101, 108, 108, 111, 32, 119, 111, 114, 108, 100]
Hello world
Hello world
As Ratched pointed out, you will need some sort of Unicode conversion...
Another option is representation:
import std.stdio, std.string;
void main() {
auto s = "March";
auto a = s.representation;
a.writeln; // [77, 97, 114, 99, 104]
}
https://dlang.org/library/std/string/representation.html
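For the reverse direction (bytes back to a string), std.string also has assumeUTF, which retypes the bytes without copying. Note that it does not validate, so this sketch only makes sense for data you already know is valid UTF-8:

import std.stdio, std.string;
void main() {
    immutable(ubyte)[] a = [77, 97, 114, 99, 104];
    auto s = a.assumeUTF; // typed back as a string, no copy, no validation
    s.writeln;            // March
}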
I would like to save a file name (at most 32 bytes) in a byte array, and then convert the bytes back to a String. Since there is a sequence of file names, the underlying array is designed to be of fixed size (i.e., 32 bytes).
// name is the file name `&str`
let mut arr = [0u8; 32];
arr[..name.len()].copy_from_slice(name.as_bytes());
But the problem is: is it possible to get the file name back from the 32-byte array (arr) without storing the length?
In C/C++, many built-in functions are available because the raw string is terminated with 0:
// store
memcpy(arr, name.c_str(), name.length() + 1);
// convert it back
char *raw_name = reinterpret_cast<char*>(arr);
So, what is the idiomatic way to do it in Rust? A possible way is to explicitly store the size using a few extra bits, but that does not seem like the best method.
I don't know exactly what reinterpret_cast<char*> does in C++, but I think you can do a similar thing with std::str::from_utf8:
let name ="hello";
let mut arr = [0u8; 32];
arr[..name.len()].copy_from_slice(name.as_bytes());
println!("{:?}",arr);
//[104, 101, 108, 108, 111, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
let recoverd_str = std::str::from_utf8(arr.as_slice()).unwrap();
println!("{:?}",recoverd_str);
//"hello\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"
println!("{}",recoverd_str);
//hello
However, recovered_str and name is not actually same... but you can trim the trailing null bytes! Check this answer.
Another way to convert a null-terminated [0u8; 32] to a &str would be through CStr, with the unstable from_bytes_until_nul:
#![feature(cstr_from_bytes_until_nul)]
CStr::from_bytes_until_nul(arr.as_slice()).unwrap().to_str().unwrap()
Playground
However, this requires that a 0 byte is included in the slice, so the maximum storable string length becomes 31 bytes. (Try reducing the array length to 5 in the playground to see what happens for 32-byte-long strings.)
If you want to be able to store 32-byte strings, you could use std::str::from_utf8(arr.split(|&c| c == 0).next().unwrap()).unwrap(), but I don't think that qualifies as idiomatic anymore…
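A quick sketch of that split-on-NUL variant, with a made-up 32-byte name to show the full-array case:

let name = "a-name-that-uses-all-32-bytes!!!"; // exactly 32 bytes, no 0 byte inside
let mut arr = [0u8; 32];
arr[..name.len()].copy_from_slice(name.as_bytes());

// take everything up to the first 0 byte (or the whole array if there is none)
let recovered = std::str::from_utf8(arr.split(|&c| c == 0).next().unwrap()).unwrap();
assert_eq!(recovered, name);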
I'm using init_color(); in ncurses in C to try to define new RGB color values. However, init_color(); does not take effect and change the default colors once I run the program.
I have tried moving the init_color(); statements around, before and after both the init_pair(); statements and start_color();, but have had no luck. I have also tried using different values (ASCII, and numbers from other sources) in place of, e.g., COLOR_MAGENTA for the first argument in one of the init_color(); statements, but also no luck. My start_color();, init_color();, and init_pair(); statements are all within the main function, before the rest of the program. My terminal (using cloud9/cs50) supports 256 colors (checked using terminal commands). Also, all the color-pair constants are defined above main.
int main(int argc, char *argv[])
{
    // ensure that the number of arguments is as expected
    if (argc != 1)
    {
        fprintf(stderr, "Usage: ./lemonade\n");
        return 1;
    }
    // start up ncurses
    if (!startup())
    {
        fprintf(stderr, "Error starting up ncurses\n");
        return 2;
    }
    // initialize colors
    start_color();
    // re-assign specific RGB values to colors
    init_color(COLOR_MAGENTA, 254, 160, 207);
    init_color(COLOR_GREEN, 37, 244, 82);
    init_color(COLOR_BLUE, 96, 82, 186);
    // used cyan for a different green
    init_color(COLOR_CYAN, 46, 243, 74);
    // used yellow for a grey
    init_color(COLOR_YELLOW, 156, 156, 156);
    // used red for a purple
    init_color(COLOR_RED, 208, 196, 253);
    // initialize color pairs
    init_pair(LOGO_PAIR, COLOR_MAGENTA, COLOR_GREEN);
    init_pair(DRAWBORDERSSPECIAL_PAIR, COLOR_BLACK, COLOR_GREEN);
    init_pair(BORDERS_PAIR, COLOR_WHITE, COLOR_BLACK);
    init_pair(SPECIALNEXT_PAIR, COLOR_BLACK, COLOR_GREEN);
    init_pair(SUNNYBLUE_PAIR, COLOR_WHITE, COLOR_BLUE);
    init_pair(WEATHERGREEN_PAIR, COLOR_WHITE, COLOR_CYAN);
    init_pair(CLOUDYGREY_PAIR, COLOR_WHITE, COLOR_YELLOW);
    init_pair(HOTPURPLE_PAIR, COLOR_WHITE, COLOR_RED);
    // clean
    clean(); // clean includes (refresh(); and clear();)
    // draw borders
    drawborders();
    // run screen 1
    screenone();
    // color support test
    mvprintw(6, 50, "My terminal supports %d colors.", COLORS);
    // has_colors(); test
    if (has_colors() == FALSE)
    {
        mvprintw(7, 50, "Your terminal does not support color \n");
    }
    // can_change_color(); test
    if (can_change_color() == FALSE)
    {
        mvprintw(8, 50, "Can_change_color is false \n");
    }
I expected the init_color(); statements to take effect and change the default colors (e.g. magenta, black, etc.) to the newly assigned RGB values, but they remain the same once the program runs.
I added checks for the number of colors supported, has_colors();, and can_change_color();. The number of colors supported comes back as 8, has_colors(); returns true, and can_change_color(); returns false. Thank you for suggesting has_colors(); and can_change_color();. This seems to be the issue, but I'm not sure where to go from here.
Hmm, I cannot see your calls to has_colors() and can_change_color(), which should be used to detect whether you're even allowed to do this on your system.
This is the first thing you should be checking. It may be that color changing is not permitted in your environment.
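A minimal, self-contained sketch of those checks (separate from the asker's program; the pair number and RGB values are just for illustration):

#include <stdio.h>
#include <ncurses.h>

int main(void)
{
    initscr();

    if (!has_colors())
    {
        endwin();
        fprintf(stderr, "This terminal does not support color at all.\n");
        return 1;
    }
    start_color();

    // if this is FALSE, init_color() fails (returns ERR) and the defaults stay
    if (!can_change_color())
    {
        endwin();
        fprintf(stderr, "This terminal cannot redefine colors.\n");
        return 2;
    }

    // note: init_color() expects components in the range 0-1000, not 0-255
    init_color(COLOR_MAGENTA, 996, 627, 812);
    init_pair(1, COLOR_MAGENTA, COLOR_BLACK);

    attron(COLOR_PAIR(1));
    printw("Redefined magenta\n");
    attroff(COLOR_PAIR(1));
    refresh();
    getch();
    endwin();
    return 0;
}

If can_change_color() keeps returning FALSE, the terminal's terminfo entry simply does not advertise color redefinition, and reordering the init_color()/init_pair() calls will not change that.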
For the application that I am currently working on, I am required to read UTF-8 encoded strings from a binary file. These strings are not null-terminated, but rather are prefaced with a byte specifying their length.
When I attempt to read in such a string, all multibyte UTF-8 characters become ?. Find below a sample:
public void main(string[] args) {
    File file = File.new_for_path("test.bin");
    DataInputStream instream = new DataInputStream(file.read());
    uint8[] chars = new uint8[instream.read_byte()];
    instream.read(chars);
    print(@"$((string) chars)\n");
}
This is, of course, a stripped-down sample. The actual binary files in question are encrypted, which is not reflected here. I use this with a sample file test.bin that contains the byte sequence 09 52 C3 AD 61 73 74 72 61 64, i.e. Ríastrad in UTF-8, prefaced with its byte length. The expected output is thus Ríastrad, but the actual output is R?astrad.
Might anyone be able to shed some light on the problem and, perhaps, a solution?
You need to add Intl.setlocale (); to your code:
public void main(string[] args) {
    Intl.setlocale ();
    File file = File.new_for_path("test.bin");
    DataInputStream instream = new DataInputStream(file.read());
    uint8[] chars = new uint8[instream.read_byte()];
    instream.read(chars);
    print(@"$((string) chars)\n");
}
The default locale for print () is the C locale, which is US-ASCII. Any character outside the US-ASCII range is presented as a ?. Calling Intl.setlocale (); sets the locale to that of the machine running the program.
I've finally gotten to the stage of getting a response from a UDP tracker.
Here's an example that I split into an array:
[ 1, 3765366842, 1908, 0, 2, 0 ]
Action, request ID, interval, leechers, seeders, peers.
No matter which torrent I choose, I get 1/2 seeders, which I'm assuming is the server tracking me, and no peers/leechers.
Am I not using the correct info hash?
This is how I retrieve it from a magnet link:
magnet:?xt=urn:btih:9f9165d9a281a9b8e782cd5176bbcc8256fd1871&dn=Ubuntu+16.04.1+LTS+Desktop+64-bit&tr=udp%3A%2F%2Ftracker.leechers-paradise.org%3A6969&tr=udp%3A%2F%2Fzer0day.ch%3A1337&tr=udp%3A%2F%2Fopen.demonii.com%3A1337&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Fexodus.desync.com%3A6969
...
h = 9f9165d9a281a9b8e782cd5176bbcc8256fd1871
Now I split this into chunks of two and parse them as hex bytes:
bytes = [];
for (var i = 0; i < h.length; i++) bytes.push(parseInt((h[i]) + h[i++], 16));
[153, 153, 102, 221, 170, 136, 170, 187, 238, 136, 204, 85, 119, 187, 204, 136, 85, 255, 17, 119]
There's no need to encode this, so I send it along with my request.
This is the only point causing trouble, yet it seems so simple...
http://xbtt.sourceforge.net/udp_tracker_protocol.html
As the8472 says, your decoding is incorrect:
for (var i = 0; i < h.length; i++) bytes.push(parseInt((h[i]) + h[i++], 16));
i and i++ will have the same value here. (One of the reasons to avoid clever inline stuff.) You can use i and ++i, or maybe expand it all to multiple lines for readability:
for (var i = 0; i < h.length; i += 2) {
    var hex = h.substr(i, 2);
    bytes.push(parseInt(hex, 16));
}
And if you’re using Node, just parse it into a Buffer, which can easily be converted to an array if necessary:
var bytes = Buffer.from(h, 'hex');
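For example, a quick sanity check in Node using the info hash from the question:

var h = '9f9165d9a281a9b8e782cd5176bbcc8256fd1871';
var bytes = Buffer.from(h, 'hex');
console.log(bytes[0], bytes[1]); // 159 145
console.log(bytes.length);       // 20 (an info_hash is always 20 bytes)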
9f91 should result in the first two bytes being 159, 145, so your hex-decoding is incorrect.
Beyond that, you should compare your implementation with a working one through Wireshark.
http://xbtt.sourceforge.net/udp_tracker_protocol.html
As was already mentioned in an answer to another question, the official and up-to-date specs reside at bittorrent.org; that includes the UDP tracker spec. The xbtt page is not maintained.
How do I read and write binary files in the D language? In C it would be:
FILE *fp = fopen("/home/peu/Desktop/bla.bin", "wb");
char x[4] = "RIFF";
fwrite(x, sizeof(char), 4, fp);
I found rawWrite in the D docs, but I don't know its usage, nor whether it does what I think. The docs refer to fread, which is from C:
T[] rawRead(T)(T[] buffer);
If the file is not opened, throws an exception. Otherwise, calls fread for the file handle and throws on error.
rawRead always reads in binary mode on Windows.
rawRead and rawWrite should behave exactly like fread and fwrite; they are just templates that take care of argument sizes and lengths.
e.g.
auto stream = File("filename","r+");
auto outstring = "abcd";
stream.rawWrite(outstring);
stream.rewind();
auto inbytes = new char[4];
stream.rawRead(inbytes);
assert(inbytes[3] == outstring[3]);
rawRead is implemented in terms of fread as
T[] rawRead(T)(T[] buffer)
{
    enforce(buffer.length, "rawRead must take a non-empty buffer");
    immutable result =
        .fread(buffer.ptr, T.sizeof, buffer.length, p.handle);
    errnoEnforce(!error);
    return result ? buffer[0 .. result] : null;
}
If you just want to read in a big buffer of values (say, ints), you can simply do:
int[] ints = cast(int[]) std.file.read("ints.bin", numInts * int.sizeof);
and
std.file.write("ints.bin", ints);
Of course, if you have more structured data, then Scott Wales' answer is more appropriate.