How to convert String^ to float? - string

I have taken text from a textbox txt1 in Visual Studio 2012 for Windows 8. When I use txt1->Text, it returns the text as a String^. How do I convert it to a float? I want this app to run on Windows 8.

Looks like you have a .NET System::String^. .NET provides some conversion functions, of which my favorite is TryParse.
float value;
String^ str = txt1->Text; // the text from the textbox
if (System::Single::TryParse(str, value)) {
/* ok, use value */
}
else {
/* problem : str isn't numeric */
}

Related

How to convert AnsiString to std::string in C++ Builder?

I would like to ask how can I get a text input from TEdit control and cast it to std::string (not AnsiString).
For example, if I have a TEdit control with the name User, I get the text from it with the User->Text command. What I want to do is to assign that value to a std::string, for example string my_str = User->Text;.
I would like to ask: how can I do this in C++ Builder? Is there some sort of ToString() method or something similar? I was not able to find one.
In C++Builder 2007 and earlier, TEdit::Text is an 8-bit AnsiString in the user's default ANSI locale. It is very straightforward to convert an AnsiString to a std::string - just use the AnsiString::c_str() method to get a null-terminated char* pointer to the AnsiString data, and then you can assign that to the std::string, eg:
std::string my_str = User->Text.c_str();
/* or:
System::AnsiString text = User->Text;
std::string my_str(text.c_str(), text.Length());
*/
If you want the std::string data to be in another character encoding, such as UTF-8, then you will have to convert the AnsiString data accordingly, such as with MultiByteToWideChar()/WideCharToMultiByte(), UTF8Encode(), etc, before assigning it to the std::string.
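For example, here is a minimal sketch of the ANSI-to-UTF-8 route using the Win32 conversion functions for the pre-2009 AnsiString case (the AnsiToUtf8 helper name is mine, error handling is omitted, and the source text is assumed to be in the default ANSI codepage):
#include <windows.h>
#include <string>
#include <vector>

std::string AnsiToUtf8(const AnsiString &ansi)
{
    // ANSI (CP_ACP) -> UTF-16
    int wlen = MultiByteToWideChar(CP_ACP, 0, ansi.c_str(), ansi.Length(), NULL, 0);
    std::vector<wchar_t> wbuf(wlen);
    MultiByteToWideChar(CP_ACP, 0, ansi.c_str(), ansi.Length(), &wbuf[0], wlen);
    // UTF-16 -> UTF-8
    int ulen = WideCharToMultiByte(CP_UTF8, 0, &wbuf[0], wlen, NULL, 0, NULL, NULL);
    std::string utf8(ulen, '\0');
    WideCharToMultiByte(CP_UTF8, 0, &wbuf[0], wlen, &utf8[0], ulen, NULL, NULL);
    return utf8;
}
// usage: std::string my_str = AnsiToUtf8(User->Text);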
In C++Builder 2009 and later, TEdit::Text is a 16-bit UnicodeString in UTF-16 format. The easiest way to convert a UnicodeString to a std::string is to first convert to an AnsiStringT<CP> (where CP is the desired ANSI codepage - AnsiString uses CP=0, UTF8String uses CP=65001, etc), and then convert that to std::string, eg:
std::string my_str = AnsiString(User->Text).c_str(); // or UTF8String, etc...
/* or:
System::AnsiString text = User->Text; // or UTF8String, etc...
std::string my_str(text.c_str(), text.Length());
*/
Alternatively, in C++11 and later, you can convert the UnicodeString to a std::wstring first, and then use std::wstring_convert, eg:
#include <locale>
#include <codecvt> // std::wstring_convert, std::codecvt_utf8_utf16
#include <string>
std::wstring my_wstr = User->Text.c_str();
/* or:
System::UnicodeString text = User->Text;
std::wstring my_wstr(text.c_str(), text.Length());
*/
// System::Char may be either wchar_t or char16_t, depending
// on which platform you are compiling for...
std::string my_str = std::wstring_convert<std::codecvt_utf8_utf16<System::Char>>{}.to_bytes(my_wstr);
I had a lot of those to migrate from Borland to Embarcadero Rio. So I created a method to do it.
#include <cwchar>    // std::wcslen
#include <windows.h> // WideCharToMultiByte
// STR_CONV_BUF_SIZE is a buffer-size constant defined elsewhere in the project
char* __fastcall AnsiOf(wchar_t* w)
{
static char c[STR_CONV_BUF_SIZE];
memset(c, 0, sizeof(c));
WideCharToMultiByte(CP_ACP, WC_NO_BEST_FIT_CHARS, w, std::wcslen(w), c, STR_CONV_BUF_SIZE, NULL, NULL);
return c;
}
std::string my_str = AnsiOf((User->Text).c_str());

Getting hexadecimal character escape while printing string in Xcode

I'm trying to run this C++ code in Xcode 8.1:
std::string str = "g[+g]g[-g]g[−g[+g]g]g[+g]g";
for (auto& c : str) {
printf("%c", c);
}
and I'm getting this as output:
g[+g]g[-g]g[\342\210\222g[+g]g]g[+g]g
Does anyone know why some characters are coming out as hexadecimal escape codes?
I have already tried printing it with c_str().

Changing the text of a C++ CLI label

I am trying to change the text of a label in a C++ CLI program. I need to take a value the user entered in a textbox, insert that into a short string, then change a label to that string. I have no problem constructing the string, but I am having trouble setting the label to the new string. Here is my code...
std::string v1str = "Phase A: ";
v1str.append(vt2); // vt2 is of type std::string
v1str.append(" Vac");
label->Text = v1str;
This is the error message that I'm getting...
Why am I not allowed to pass v1str as the label text setter? How can I pass the string I've constructed to the label text setter?
Label::Text has a type of System::String^, which is a managed .NET string object. You cannot assign a std::string to a System::String^ directly because they are different types.
You can convert a std::string to a System::String. However, you most likely just want to use the System::String type directly:
System::String^ v1str = "Phase A: ";
v1str += gcnew System::String(vt2.c_str()); // vt2 is a std::string, so convert it first
v1str += " Vac";
label->Text = v1str;
C++/CLI is not plain C++: you can't assign a std::string directly to a managed property. But you can use native C++ within C++/CLI, and convert std::string to and from System::String:
//In C++/CLI form:
#include <vcclr.h>
System::String^ clr_string = "clr_string";
//convert strings from CLI to C++
pin_ptr<const wchar_t> cpp_string = PtrToStringChars(clr_string);
//convert strings from C++ to CLI
System::String^ str = gcnew System::String(cpp_string);
//or
std::string std_string = "std_string";
System::String^ str2 = gcnew System::String(std_string.c_str());
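As a side note beyond the original answer: if the msclr marshaling headers are available, the String^ to std::string direction can also be done in one call with marshal_as; a small sketch:
#include <msclr/marshal_cppstd.h>
#include <string>

System::String^ clr_string = "clr_string";
// marshal_as handles the UTF-16 -> narrow conversion internally
std::string std_str = msclr::interop::marshal_as<std::string>(clr_string);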

Convert hex to int

I've seen lots of answers to this, but I cannot seem to get any to work. I think I'm getting confused between variable types. I have an input from NetworkStream that puts a hex code into a String^. I need to take part of this string, convert it to a number (presumably an int) so I can do some arithmetic, then output the result on the form. The code I have so far:
String^ msg; // gets filled later, e.g. with "A55A6B0550000000FFFBDE0030C8"
String^ test;
//I have selected the relevant part of the string, e.g. 5A
test = msg->Substring(2, 2);
//I have tried many different routes to extract the numerical value of the
//substring. Below are some of them:
std::stringstream ss;
int hexInt = 0;
//Works if test is string, not String^ but then I can't output it later.
ss << sscanf(test.c_str(), "%x", &hexInt);
//--------
sprintf(&hexInt, "%d", test);
//--------
//And a few others that I've deleted after they don't work at all.
//Output:
this->textBox1->AppendText("Display numerical value after a bit of math");
Any help with this would be greatly appreciated.
Chris
Does this help?
String^ hex = L"5A";
int converted = System::Convert::ToInt32(hex, 16);
The documentation for the Convert static method used is on the MSDN.
You need to stop thinking about using the standard C++ library with managed types. The .Net BCL is really very good...
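To tie this back to the code in the question, a sketch (reusing msg and textBox1 from the original snippet; the arithmetic here is just a placeholder):
String^ msg = "A55A6B0550000000FFFBDE0030C8";
String^ test = msg->Substring(2, 2);            // "5A"
int value = System::Convert::ToInt32(test, 16); // 90
value = value * 2;                              // whatever math you need
this->textBox1->AppendText(value.ToString());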
Hope this helps:
/*
the method demonstrates converting hexadecimal values,
which are broken into low and high bytes.
*/
#include <stdio.h> //printf
#include <conio.h> //getch
int main(){
//character buffer (needs two elements: low byte and high byte)
unsigned char buf[2];
buf[0]= 0x06; //low byte initialized to some hex value
buf[1]= 0xAE; //high byte initialized to some hex value
int number=0;
//number generated by binary shift of high byte and its OR with low byte
number = 0xFFFF&((buf[1]<<8)|buf[0]);
printf("%x\n",number); //this prints ae06
printf("%d\n",number); //this prints the integer equivalent
getch();
return 0;
}

Compare TCHAR with String value in VC++?

How to Compare TCHAR with String value in VC++ ?
My project is not Unicode.
I am doing like this :
TCHAR achValue[16523] = NULL;
if(achValue == _T("NDSPATH"))
{
return FALSE;
}
Even when achValue is "NDSPATH", this condition is not satisfied.
Any help is appreciated.
A TCHAR array, like any C string, is accessed through a pointer to its first character. What you're comparing with == is the pointer, not the string contents. Also, you're initializing the array with NULL, which is nonsensical.
Use the Win32 variations of strcmp. If you use the _tcscmp macro, it will expand to the correct function for multibyte/Unicode at compile time.
#define MAX_STRING 16523
TCHAR achValue[MAX_STRING];
ZeroMemory(achValue, sizeof(TCHAR) * MAX_STRING);
_tcscpy_s(achValue, MAX_STRING, _T("NDSPATH")); // fill the buffer for the test
if (!_tcscmp(achValue, _T("NDSPATH")))
{
// strings are equal when result is 0
}
