minicom TTY communication and hex conversion - Linux

I'm trying to use minicom to send data to a device over a TTY.
The odd thing is that when I enter, say, "a", I see "4f", although "a" in ASCII should be "61" hex, or "97" decimal. Other examples are:
b = 27
c = ce
1 = 67
2 = 33
3 = e6
They definitely do not correspond to their ASCII counterparts.
Obviously I am doing something wrong. I wonder: is there a way to figure out the "formula" for translating these characters to hex?
Please help!
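
For reference, the values one would normally expect on the wire are just the characters' ASCII codes; a quick Node.js check (assuming the device simply echoes bytes back unmodified) shows what the hex dump should look like:

// Print the ASCII hex code expected for each test character.
// If bytes came back unmodified, "a" would show as 61, not 4f.
for (const ch of ['a', 'b', 'c', '1', '2', '3']) {
  console.log(ch, '=', Buffer.from(ch, 'ascii').toString('hex'));
}
// a = 61, b = 62, c = 63, 1 = 31, 2 = 32, 3 = 33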

Related

What is the difference between "x", "-" and "0" in Table 9 of the ISO-7816-4 Section 5 documentation

I'm going through the ISO-7816 documentation and having trouble interpreting the CLA scheme under section 5.4.1, Table 9:
b4 b3 b2 b1    Meaning
x  x  -  -     Secure messaging (SM) format
0  x  -  -     No SM or SM not according to 1.6
0  0  -  -     No SM or no SM indication
0  1  -  -     Proprietary SM format
What I understand so far is that if CLA = 8X, then the table above shows the various patterns that the nibble "X" can take on. What do the symbols "x" (lowercase) and "-" imply in terms of the value of the bit at that position? More concretely, what would a CLA of "80" mean? How is 0000 different from xxxx or ----?
More concretely, what would a CLA of "80" mean?
CLA=80 corresponds to the proprietary class, because "bit b8 set to 1 indicates the proprietary class". The table in section 5.4.1 specifies the interindustry class, i.e. where bit b8 is set to 0.
What do the symbols "x" (lowercase) and "-" imply in terms of the value of the bit at that position?
You can treat the mark 'x' as a wildcard (any value), the mark '-' as "bit is not used in this case", and 0 and 1 as exact values for that bit position. So, xx-- is just a bit mask. It says that bit 4 and bit 3 indicate which SM format is used in the command, while bit 2 and bit 1 are used for something different.
0x-- can be 00-- or 01--.
Bits 2 and 1 are described by another row of the table.
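
To make the mask idea concrete, here is a minimal Node.js sketch (the helper name is mine, not from the spec) that extracts the SM bits b4/b3 from an interindustry CLA byte:

// Extract the secure-messaging bits b4 and b3 of a CLA byte.
// b2 and b1 (the "--" positions in the table) are deliberately ignored.
function smBits(cla) {
  return (cla >> 2) & 0b11; // 0b00 = no SM or no SM indication, 0b01 = proprietary SM
}
console.log(smBits(0x00)); // 0 -> pattern 00--
console.log(smBits(0x04)); // 1 -> pattern 01--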

Converting a string to base64

I'm trying to understand how Base64 works.
If you wanted to send !"# using Base64, what would it look like?
Here's my working out:
String: ! " #
Hex: 21 22 23
Binary: 00100001 00100010 00100011
Base64 conversion:
Hex: 8 12 8 23
Binary: 001000 010010 001000 100011
None of the final values, read as ASCII, correspond to any of the characters in the Base64 alphabet.
I've obviously misunderstood something here, if anyone can point me in the right direction with an example that would be great.
If I understand your question correctly, you are trying to re-interpret the Base64 values as ASCII characters (i.e. 0x08 would be the BS control character). However, you have to use the Base64 index table to convert the resulting numbers back to characters (note that the index values in that table are in decimal, not in hex).
Here, your values will be
Base64:
Hex: 8 12 8 23
Decimal: 8 18 8 35
String: I S I j
Does that make sense?
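
A minimal Node.js sketch of the same steps, using the standard Base64 index table (the built-in encoder is included only as a cross-check):

const ALPHABET =
  'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';
// Pack the three bytes of '!"#' into 24 bits, then regroup them as four 6-bit indices.
const bytes = Buffer.from('!"#', 'ascii'); // 21 22 23 hex
const bits = (bytes[0] << 16) | (bytes[1] << 8) | bytes[2];
const indices = [18, 12, 6, 0].map((shift) => (bits >> shift) & 0x3f);
console.log(indices); // [ 8, 18, 8, 35 ]
console.log(indices.map((i) => ALPHABET[i]).join('')); // "ISIj"
console.log(bytes.toString('base64')); // "ISIj"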

node.js: get byte length of the string "あいうえお"

I think I should be able to get the byte length of a string with:
Buffer.byteLength('äáöü') // returns 8 as I expect
Buffer.byteLength('あいうえお') // returns 15, expecting 10
However, when I get the byte length with a spreadsheet program (LibreOffice) using =LENB("あいうえお"), I get 10 (which I expect).
So, why do I get for 'あいうえお' a byte length of 15 rather than 10 using Buffer.byteLength?
PS.
Testing "あいうえお" on these two sites, I get two different results:
http://bytesizematters.com/ returns 10 bytes
https://mothereff.in/byte-counter returns 15 bytes
What is correct? What is going on?
node.js is correct. The UTF-8 representation of the string "あいうえお" is 15 bytes long:
E3 81 82 = U+3042 'あ'
E3 81 84 = U+3044 'い'
E3 81 86 = U+3046 'う'
E3 81 88 = U+3048 'え'
E3 81 8A = U+304A 'お'
The other string is 8 bytes long in UTF-8 because the Unicode characters it contains are below the U+0800 boundary and can each be represented with two bytes:
C3 A4 = U+E4 'ä'
C3 A1 = U+E1 'á'
C3 B6 = U+F6 'ö'
C3 BC = U+FC 'ü'
From what I can see in the documentation, LibreOffice's LENB() function is doing something different and confusing:
For strings which contain only ASCII characters, it returns the length of the string (which is also the number of bytes used to store it as ASCII).
For strings which contain non-ASCII characters, it returns the number of bytes required to store it in UTF-16, which uses two bytes for all characters under U+10000. (I'm not sure what it does with characters above that, or if it even supports them at all.)
It is not measuring the same thing as Buffer.byteLength, and should be ignored.
With regard to the other tools you're testing: Byte Size Matters is wrong. It's assuming that all Unicode characters up to U+FF can be represented using one byte, and all other characters can be represented using two bytes. This is not true of any character encoding. In fact, it's impossible: if you encode every character up to U+FF using one byte, you've used up all possible values for that byte, and you have no way left to represent anything else.
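
You can see both counts from the question directly in Node.js ('utf8' is the default encoding; 'utf16le' mirrors what LENB appears to count for non-ASCII strings):

console.log(Buffer.byteLength('あいうえお', 'utf8'));    // 15 (3 bytes per character)
console.log(Buffer.byteLength('あいうえお', 'utf16le')); // 10 (2 bytes per character)
console.log(Buffer.byteLength('äáöü', 'utf8'));          // 8  (2 bytes per character)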

ASCII text to Hexadecimal in Excel

I want to do this but I don't know how; the only function that seems useful is "DEC.TO.HEX".
This is the problem: I have this text in one cell:
1234
And in the next cell i want the hexadecimal value of each character, the expected result would be:
31323334
Each character must be represented by two hexadecimal characters. I have no idea how to solve this in Excel without writing a macro.
Regards!
Edit: Hexadecimal conversion
Text value    ASCII value (dec)    Hexadecimal value
1             49                   31
2             50                   32
3             51                   33
4             52                   34
Please try:
=DEC2HEX(CODE(MID(A1,1,1)))&DEC2HEX(CODE(MID(A1,2,1)))&DEC2HEX(CODE(MID(A1,3,1)))&DEC2HEX(CODE(MID(A1,4,1)))
In your version you might need the .s in the function (and perhaps ;s rather than ,s).
DEC2HEX may be of assistance. Use it as follows:
=DEC2HEX(A3)
First split 1234 into 1 2 3 4 by using MID(), then use CODE() on each character to get its ASCII code, and then concatenate again. Note that CODE() returns the decimal value, so wrap each one in DEC2HEX() to get hex. Below is the formula; Y21 is the cell in which 1234 is written:
=CONCATENATE(DEC2HEX(CODE(MID(Y21,1,1))),DEC2HEX(CODE(MID(Y21,2,1))),DEC2HEX(CODE(MID(Y21,3,1))),DEC2HEX(CODE(MID(Y21,4,1))))
1234 >> 31323334
Without the DEC2HEX() wrappers, the same formula concatenates the decimal codes instead: 1234 >> 49505152.
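
For comparison, the same split/encode/concatenate pipeline sketched in Node.js (the helper name is mine):

// Convert each character of a string to its two-digit ASCII hex code.
function asciiToHex(text) {
  return [...text]
    .map((ch) => ch.charCodeAt(0).toString(16).padStart(2, '0'))
    .join('');
}
console.log(asciiToHex('1234')); // "31323334"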

Representing and adding negative numbers in Easy68k Assembly

I'm trying to write a simple program in Easy68k that stores two negative values, adds them together, and then outputs the result in the console. I am having trouble figuring out how to represent the negative numbers. We are asked that they be in hex format and output in decimal. Everything seems correct except the values themselves. I used 2's complement and then converted the two numbers to hex.
First decimal number = -102
Second decimal number = -87
Using 2's complement I converted the two numbers to hex (though I'm not sure this is even correct):
-102 -> 1A
-87 -> 29
Here's my code so far:
addr    EQU     $7CE0
data1   EQU     $1A
data2   EQU     $29

        ORG     $1000
START:                          ; first instruction of program
* Put program code here
        MOVE    #data2,D1
        MOVEA.W #addr,A0
        ADD     #data1,D1
        MOVE    D1,(A0)
        MOVE.B  #3,D0
        TRAP    #15
* Variables and Strings
* Put variables and constants here
        END     START           ; last line of source
I even tried to just convert binary versions of the negative numbers straight to hex:
-102 -> 11100110 -> E6
-87 -> 11010111 -> D7
That didn't work either. I also tried storing the binary versions and adding them, but got the same result.
Here's the question:
Write a program in assembly to add the two numbers (-102 and -87). Inputs should be in hexadecimal format. Store the result in hexadecimal at address $7CE0. Print out the result in decimal. (Hint: use the trap function, task #3.) If an error happens, you should print out an error message as well.
I know I am misrepresenting the two negative numbers, but I just can't figure out how to do it right. I've looked everywhere and found nothing on how to store/add/output negative numbers in 68k. Any help is appreciated, this is for an assignment so I'm not expecting direct answers. Thanks!
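
For what it's worth, the two's-complement arithmetic itself is easy to sanity-check outside the assembler; here is a minimal Node.js sketch (the helper is mine, purely to verify 16-bit representations, not the assembly):

// 16-bit two's complement: add 2^16 to a negative number and keep the low 16 bits.
function toHex16(n) {
  return ((n + 0x10000) & 0xffff).toString(16).toUpperCase().padStart(4, '0');
}
console.log(toHex16(-102));       // "FF9A"
console.log(toHex16(-87));        // "FFA9"
console.log(toHex16(-102 + -87)); // "FF43" (= -189)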
