converting String to double in vc++ - visual-c++

Can anyone help me with a way of converting String to double in vc++?
I can't use atof, since it converts a char* to double, not a String. So I am using istringstream:
std::istringstream stm;
double d;
String name = "32.67";
stm.str(name);
stm >>d;
It will give a compilation error:
error C2664: 'void std::basic_istringstream::str(const std::basic_string &)' :
cannot convert parameter 1 from 'System::String ^' to 'const std::basic_string &'
Please suggest a different solution or a correction.

std::istringstream::str() accepts a std::string as an argument. You're passing it a System::String^, wherever that comes from. Given the funky ^ symbol, you must be using C++/CLI with .NET strings.
Use std::string unless you are for some reason required to use the .NET library, in which case you need to either use the .NET conversion functions, or convert to a std::string (or to a char* C-string and feed it in with the << operator).
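If you do need to keep the System::String^, one option (a minimal sketch, assuming a C++/CLI build with the msclr marshaling headers available; the helper name is just for illustration) is to marshal it to a std::string first and keep the stream code unchanged:
#include <msclr/marshal_cppstd.h>
#include <sstream>
#include <string>

using namespace System;

double ParseWithStream(String ^name)
{
    // copy the managed string into a native std::string
    std::string narrow = msclr::interop::marshal_as<std::string>(name);
    std::istringstream stm(narrow);
    double d = 0.0;
    stm >> d;   // d holds 32.67 for the example input
    return d;
}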

As the other responder has suggested, you are probably using C++/CLI. In that case:
String ^ name = "32.67";
double d;
d = Double::Parse(name);
Note that if the string cannot be parsed into a double, an exception will be thrown. Use Double::TryParse if you want to avoid that (it returns false if the string cannot be parsed).
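For example (a minimal sketch; like Parse, TryParse uses the current culture):
String ^name = "32.67";
double d;
if (Double::TryParse(name, d))
{
    // d now holds the parsed value
}
else
{
    // the string was not a valid double; no exception is thrown
}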

I think it is very simple in vc++/CLR programming.
String ^name = "32.56";
String ^no = "56";
Double number_double = Convert::ToDouble(name); // convert String to double
int number_int = Convert::ToInt32(no); // convert String to int

Related

C++ override quotes

Ok, so I'm using C++ to make a library that'd help me to print lines into a console.
So, I want to overload "" (the quote operator) to create a std::string instead of a string literal, to make it easier to append other data types to the string I want to output.
I've seen this done before in the wxWidgets with their wxString, but I have no idea how I can do that myself.
Is that possible and how would I go about doing it?
I've already tried using this code, but with no luck:
class PString{
std::string operator""(const char* text, std::size_t len) {
return std::string(text, len);
}
};
I get this error:
error: expected suffix identifier
std::string operator""(const char* text, std::size_t len) {
^~
which, I'd assume, wants me to add a suffix after the "", but I don't want that. I want to use only "" (quotes).
Thanks!
You can't overload a plain "". A string literal by itself is a built-in type (const char*, or a wide variant with a prefix like L"", u"", U"", u8"", R"()"); only a literal followed by a suffix (""s, ""sv, ...) goes through an operator"" that you can overload.
The way that wxString works is to define an implicit constructor wxString::wxString(const char*), so that when you pass "some string" into a function it is essentially the same as wxString("some string").
Overloading operator""_X gives you user-defined string literals with a suffix, as the other answer explains.
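A minimal sketch of both approaches, reusing the PString name from the question (the _ps suffix is just an illustrative choice; user-defined suffixes must start with an underscore):
#include <cstddef>
#include <iostream>
#include <string>

class PString {
public:
    std::string value;
    PString(const char* text) : value(text) {}   // implicit, wxString-style
};

// user-defined literal: a suffix is mandatory, here a hypothetical _ps
std::string operator"" _ps(const char* text, std::size_t len) {
    return std::string(text, len);
}

void Print(const PString& s) { std::cout << s.value << '\n'; }

int main() {
    Print("implicit conversion");            // "..." becomes PString via the constructor
    std::string s = "suffixed literal"_ps;   // "..."_ps becomes std::string
    std::cout << s << '\n';
}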

C++ Convert System::String^ to LPCOLESTR

I am writing in mixed mode (C++/CLI) and I cannot resolve this problem:
String ^progID = "Matrikon.OPC.Server";
CLSID clsid;
HRESULT result = CLSIDFromProgID(progID, &clsid);
error C2664: 'CLSIDFromProgID' : cannot convert parameter 1 from 'System::String ^' to 'LPCOLESTR'
How can I convert String^ to LPCOLESTR ?
Thanks!
I found another way:
// 1.
pin_ptr<const WCHAR> str = PtrToStringChars(progID);
LPCOLESTR coleString = (LPWSTR)str;
I have found that a pin_ptr is released when it goes out of scope (see "Define the Scope of Pinning Pointers" and "pin_ptr (C++/CLI)").
This code works well for me:
// 2. this is the same as (1.)
String ^progID2 = "Matrikon.OPC.Simulation.1"; // this is an example of a dynamic string
pin_ptr<const WCHAR> PINprogID2 = PtrToStringChars(progID2);
CLSID clsid2;
HRESULT result2 = CLSIDFromProgID(PINprogID2, &clsid2); //(LPCOLESTR, &CLSID)
Another example:
// 3.
pin_ptr<const WCHAR> sclsid3 = PtrToStringChars("{63D5F432-CFE4-11d1-B2C8-0060083BA1FB}");
CLSID clsid3;
CLSIDFromString((WCHAR*)sclsid3, &clsid3); //(LPOLESTR, &CLSID)
I am not very experienced and I am not sure whether there are any memory leaks, but I think this code is correct.
Avoid using the hammer for every nail. C++/CLI lets you just as easily use native types. So it is simply:
LPCOLESTR progid = L"Matrikon.OPC.Server";
// etc..
Non-zero odds (always say why you need the CLSID) that you can simply use Type::GetTypeFromProgID() instead.
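For example (a minimal sketch of that late-bound route; the ProgID is the one from the question):
using namespace System;

void CreateServer()
{
    Type ^type = Type::GetTypeFromProgID("Matrikon.OPC.Server");
    if (type != nullptr)
    {
        Object ^server = Activator::CreateInstance(type);   // creates the COM object
        // drive it via reflection or an interop assembly
    }
}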
First, let's convert the System::String to a char*:
IntPtr p = Marshal::StringToHGlobalAnsi(progID);
char *pNewCharStr = static_cast<char*>(p.ToPointer());
Second, cast the char* to LPCOLESTR using the ATL conversion macro:
LPCOLESTR converted_string = A2COLE(pNewCharStr);
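Note that the classic ATL macros need a USES_CONVERSION declaration in scope, and the round trip through ANSI can lose characters. A minimal sketch that stays wide the whole way, assuming the msclr marshaling headers are available (the helper name is just for illustration):
#include <windows.h>
#include <msclr/marshal.h>

using namespace System;
using namespace msclr::interop;

void GetClsidFromProgId(String ^progID)
{
    marshal_context ctx;   // owns the converted buffer
    LPCOLESTR wide = ctx.marshal_as<const wchar_t*>(progID);

    CLSID clsid;
    HRESULT hr = CLSIDFromProgID(wide, &clsid);
    // 'wide' stays valid until 'ctx' goes out of scope
}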

How to convert BSTR string to Unsigned Char (Using com technology in the appln)

I am writing a small application which uses COM. I want to convert a BSTR string to an unsigned char array. To do this, I used the W2A() macro to convert from BSTR to std::string and then copied std::string::c_str() into an unsigned char array. The code snippet is as follows:
void Send(BSTR *packet, int length)
{
std::string strPacket = W2A(*packet);
unsigned char * pBuffer = new unsigned char [strPacket.length()+1];
memset(pBuffer,0,strPacket.length()+1);
memcpy(pBuffer,strPacket.c_str(),strPacket.length()+1);
}
This works fine when the packet contains a normal string. But if the packet contains a NUL character, the problem occurs: unknown characters appear after that NUL in pBuffer, i.e. after the conversion.
Can anyone please let me know how to avoid that? Or is there any other way to do it correctly?
A BSTR is a Windows API type and must be managed with API macros or functions. If you cannot use the W2A macro because your string may have NUL characters inside, you will have to use a function such as WideCharToMultiByte, which converts the wide characters of a BSTR to the narrow characters of a char* and takes an explicit length (a BSTR carries its length in a prefix, retrievable with SysStringLen). Be sure to have the SDK documentation at hand. Alternatively, you could make your program use WCHARs throughout.
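A minimal sketch of that approach, assuming the caller owns the BSTR; it uses the length prefix rather than a terminating NUL, so embedded NULs survive (the helper name is just for illustration):
#include <windows.h>
#include <oleauto.h>
#include <vector>

std::vector<unsigned char> BstrToBytes(BSTR packet)
{
    UINT wideLen = SysStringLen(packet);   // character count, embedded NULs included

    // first call: ask how many narrow bytes are needed
    int narrowLen = WideCharToMultiByte(CP_ACP, 0, packet, (int)wideLen,
                                        NULL, 0, NULL, NULL);

    std::vector<unsigned char> buffer(narrowLen);
    WideCharToMultiByte(CP_ACP, 0, packet, (int)wideLen,
                        reinterpret_cast<char*>(buffer.data()), narrowLen,
                        NULL, NULL);
    return buffer;   // embedded NULs preserved; the length is buffer.size()
}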

How to convert unsigned char to LPCTSTR in visual c++?

BYTE name[1000];
In my Visual C++ project there is a variable name defined with the BYTE data type. If I am not wrong, BYTE is equivalent to unsigned char. Now I want to convert this unsigned char* to LPCTSTR.
How should I do that?
LPCTSTR is defined as either char const* or wchar_t const* based on whether UNICODE is defined or not.
If UNICODE is defined, then you need to convert the multi-byte string to a wide-char string using MultiByteToWideChar.
If UNICODE is not defined, a simple cast will suffice: reinterpret_cast<char const*>(name).
This assumes that name is a null-terminated C-string, in which case defining it as BYTE makes little sense. You should use CHAR or TCHAR, based on how you operate on name.
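A minimal sketch of the UNICODE case, assuming name holds a NUL-terminated ANSI string (the function name is just for illustration):
#include <windows.h>
#include <string>
#include <vector>

std::wstring ByteNameToWide(const BYTE *name)
{
    const char *ansi = reinterpret_cast<const char*>(name);

    // first call: how many wide characters, including the terminating NUL?
    int len = MultiByteToWideChar(CP_ACP, 0, ansi, -1, NULL, 0);
    if (len <= 0)
        return std::wstring();

    std::vector<wchar_t> buffer(len);
    MultiByteToWideChar(CP_ACP, 0, ansi, -1, buffer.data(), len);
    return std::wstring(buffer.data());   // c_str() is then usable as LPCWSTR
}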
You can also assign the name variable to a CString object directly, like:
CString strName = name;
Then you can call CString's GetBuffer() or, preferably, its GetString() method to get an LPCTSTR. The advantage is that the CString class performs any required conversions automatically for you, so there is no need to worry about the Unicode settings.
LPCTSTR pszName = strName.GetString();

LPCSTR, TCHAR, String

I use the following string types: LPCSTR, TCHAR, String. I want to convert:
from TCHAR to LPCSTR
from String to char
I convert from TCHAR to LPCSTR with this code:
RunPath = TEXT("C:\\1");
LPCSTR Path = (LPCSTR)RunPath;
I convert from String to char with this code:
SaveFileDialog^ saveFileDialog1 = gcnew SaveFileDialog;
saveFileDialog1->Title = "Saving the settings file";
saveFileDialog1->Filter = "bck files (*.bck)|*.bck";
saveFileDialog1->RestoreDirectory = true;
pin_ptr<const wchar_t> wch = TEXT("");
if ( saveFileDialog1->ShowDialog() == System::Windows::Forms::DialogResult::OK ) {
wch = PtrToStringChars(saveFileDialog1->FileName);
} else return;
ofstream os(wch, ios::binary);
My problem is that when I set Configuration Properties -> General -> Character Set to "Use Multi-Byte Character Set", the first part of the code works correctly, but the second part returns error C2440. When I set Character Set to "Use Unicode Character Set", the second part works correctly, but the first part returns only the first character of the TCHAR string as the LPCSTR.
I'd suggest you need to be using Unicode the whole way through.
LPCSTR is a "long pointer to a C-style string". That's typically not what you want when you're dealing with .NET methods; the Char type in .NET is 16 bits wide.
You also should not use the TEXT("") macro unless you're planning multiple builds using various character encodings. Try using wide string literals (the L"" prefix) instead, and a pure Unicode build if you can.
See if that helps.
PS. std::wstring is very handy in your scenario.
EDIT
You see only one character because the string is now Unicode but you read it as a regular string. The Unicode characters in the ASCII range have the same value as in ASCII but have the second of their 2 bytes set to zero. So when a UTF-16 string is read as a C-string you see only the first character, because C-strings are null (zero) terminated. The easy (and wrong) way to deal with this is to take a std::wstring, cast it to a std::string, and then pull the C-string out of that. This is not a safe approach, because Unicode has a much larger character space than your standard encoding.
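A minimal sketch of the all-Unicode route for the second part, assuming a Unicode build and the msclr marshaling headers (MSVC's ofstream accepts a wide-character path as an extension):
#include <msclr/marshal_cppstd.h>
#include <fstream>
#include <string>

using namespace System;
using namespace System::Windows::Forms;

void SaveSettings(SaveFileDialog ^dlg)
{
    if (dlg->ShowDialog() != System::Windows::Forms::DialogResult::OK)
        return;

    // copy the managed file name into a native wide string
    std::wstring path = msclr::interop::marshal_as<std::wstring>(dlg->FileName);

    std::ofstream os(path.c_str(), std::ios::binary);
    // ... write the settings ...
}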
