Visual C++ run-time error - visual-c++

All cases run in the Visual C++ 2005 environment.
Function definition:
char *PQgetvalue(const PGresult *res, int tup_num, int field_num);
Case 1.
CString dd;
dd=(LPCTSTR) PQgetvalue(res,i,0);
The above compiles with NO ERROR, but dd stores garbage data like: 〱〰71〱1 몭몭몭몭몭몭ꮫꮫꮫꮫ
Case 2.
CString dd;
dd= PQgetvalue(res,i,0);
No compilation error, and it produces the correct output.
Question: How do I convert a char* to a CString?
Case 3.
CString dd;
dd= PQgetvalue(res,i,0);
CString dd = PQgetvalue(res,i,0);
There is NO difference between the two lines above, but the second one generates a compilation error:
error C2440: 'initializing' :
cannot convert from 'char *' to 'ATL::CStringT<BaseType,StringTraits>'
Can anyone please clarify?

For the first case: I guess you compiled your project with the Unicode character set (to verify, open the project settings dialog: "Configuration Properties" - "General" - "Character Set"). Casting a char* to a const wchar_t* therefore gives you garbage, i.e. TCHAR is wchar_t when you compile with Unicode.
On the C2440: please clarify whether the code in Case 3 is correct (i.e. you supply all three lines to the compiler exactly as shown here). I'd also suggest searching MSDN for this error.
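If the build really is Unicode, the cast only reinterprets the narrow bytes as wide characters; an actual conversion is needed. A minimal sketch, assuming PQgetvalue() returns a NUL-terminated narrow string and that <atlconv.h> is available for the CA2T helper (the variable names are illustrative):
char *value = PQgetvalue(res, i, 0);   // narrow, NUL-terminated string from libpq
CString dd(value);                     // direct initialization: CString converts char* to TCHAR itself
CString dd2(CA2T(value));              // or make the ANSI-to-TCHAR conversion explicit (ATL, <atlconv.h>)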

Related

How to convert a BSTR string to unsigned char (using COM technology in the application)

I am writing a small application which uses COM technology. I want to convert a BSTR string to an unsigned char array. To do this, I used the W2A() macro to convert from BSTR to std::string and then copied the result of c_str() into an unsigned char array. The code snippet is as follows:
void Send(BSTR *packet, int length)
{
    USES_CONVERSION;   // required by the ATL W2A() macro
    std::string strPacket = W2A(*packet);
    unsigned char *pBuffer = new unsigned char[strPacket.length() + 1];
    memset(pBuffer, 0, strPacket.length() + 1);
    memcpy(pBuffer, strPacket.c_str(), strPacket.length() + 1);
}
This works fine when the packet contains a normal string, but if the packet contains a NUL character, a problem occurs: some unknown characters appear after that NUL in pBuffer, i.e. after the conversion.
Can anyone please let me know how to avoid that? Or is there any other way to do it correctly?
A BSTR is a Windows API type and must be managed with API macros or functions. If you cannot use the W2A macro because your string may have NUL characters inside, you will have to use a function such as WideCharToMultiByte, which can convert the wide characters of a BSTR to narrow characters for a char*. Be sure to have the SDK documentation at hand. Alternatively, you could make your program use WCHARs throughout.
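A minimal sketch of that approach, assuming the packet may contain embedded NULs; the helper name BstrToBytes is illustrative. SysStringLen() returns the real character count stored in the BSTR, so nothing is lost at the first NUL:
#include <windows.h>
#include <vector>

std::vector<unsigned char> BstrToBytes(BSTR packet)
{
    const UINT wideLen = SysStringLen(packet);     // length in WCHARs, not just up to the first NUL
    if (wideLen == 0)
        return std::vector<unsigned char>();

    // First call asks how many narrow bytes are needed for the whole buffer.
    const int narrowLen = WideCharToMultiByte(CP_ACP, 0, packet, (int)wideLen, NULL, 0, NULL, NULL);
    std::vector<unsigned char> buffer(narrowLen);

    // Second call performs the actual conversion, embedded NULs included.
    WideCharToMultiByte(CP_ACP, 0, packet, (int)wideLen, (char*)&buffer[0], narrowLen, NULL, NULL);
    return buffer;
}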

How to convert unsigned char to LPCTSTR in Visual C++?

BYTE name[1000];
In my Visual C++ project there is a variable named name, defined with the BYTE data type. If I am not wrong, BYTE is equivalent to unsigned char. Now I want to convert this unsigned char* to LPCTSTR.
How should I do that?
LPCTSTR is defined as either char const* or wchar_t const* based on whether UNICODE is defined or not.
If UNICODE is defined, then you need to convert the multi-byte string to a wide-char string using MultiByteToWideChar.
If UNICODE is not defined, a simple cast will suffice: reinterpret_cast< char const* >( name ).
This assumes that name is a NUL-terminated C string, in which case declaring it as BYTE makes little sense; you should use CHAR or TCHAR, based on how you are operating on name.
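A minimal sketch of the UNICODE branch, assuming name really holds a NUL-terminated ANSI string; the helper name WideFromAnsi is illustrative:
#include <windows.h>
#include <string>

std::wstring WideFromAnsi(const BYTE *name)
{
    const char *src = (const char*)name;

    // First call asks for the required length in wide characters (includes the terminating NUL).
    const int wideLen = MultiByteToWideChar(CP_ACP, 0, src, -1, NULL, 0);
    if (wideLen == 0)
        return std::wstring();

    std::wstring wide(wideLen, L'\0');
    MultiByteToWideChar(CP_ACP, 0, src, -1, &wide[0], wideLen);   // second call converts
    wide.resize(wideLen - 1);                                     // drop the extra NUL written by the call
    return wide;                                                  // wide.c_str() is usable as LPCWSTR
}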
You can also assign the name variable to a CString object directly:
CString strName = name;
Then you can call CString's GetBuffer(), or preferably GetString(), to obtain an LPCTSTR. The advantage is that the CString class performs any required conversion for you automatically, so there is no need to worry about the Unicode setting.
LPCTSTR pszName = strName.GetString();

How to throw an exception in Visual C++ using a wchar_t string?

I have legacy code which I am incrementally porting to Unicode characters (wchar_t) in Visual C++. I've encountered this bit of code that I'd like to convert:
char tmp[256];
sprintf(tmp, "stuff");
throw exception(tmp);
I want to change it to something like this (this gives me a compile error on exception):
wchar_t tmp[256];
swprintf(tmp, L"stuff");
throw exception(tmp);
So far I haven't found any documentation giving me a Unicode equivalent for the thrown exception; can anyone help me?
Of course I could convert tmp back into a char string, but it just seems silly to have to do that.
std::exception does not support wchar_t strings, so you will have to either convert your wchar_t buffer into a separate char buffer, or not switch to a wchar_t buffer to begin with, since sprintf() does support formatting wide-character input via its %S and %ls format specifiers, e.g.:
char tmp[256];
sprintf(tmp, "%ls", wchar_t data here);
throw exception(tmp);
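A minimal end-to-end sketch of that approach: format the message wide, narrow it with %ls, and pass the char buffer to the MSVC-specific std::exception(const char*) constructor. The function name and the detail parameter are illustrative:
#include <cstdio>
#include <cwchar>
#include <exception>

void ThrowFormatted(const wchar_t *detail)
{
    wchar_t wide[256];
    swprintf(wide, 256, L"stuff: %ls", detail);   // build the message as wide text

    char tmp[512];
    sprintf_s(tmp, sizeof(tmp), "%ls", wide);     // narrow it via the %ls specifier
    throw std::exception(tmp);                    // MSVC extension: exception(const char*)
}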

How to prevent C6284 when using CString::Format?

The following code generates warning C6284 when compiled with /analyze on MSVC 2008: "object passed as parameter '%s' when string is required in call to function".
CString strTmp, str;
str = L"aaa.txt"
strTmp.Format (L"File: %s", str);
I'm looking for a nice solution to this that would not require a static_cast.
Microsoft describes the usage of CString with variable argument functions here:
CString kindOfFruit = "bananas";
int howmany = 25;
printf_s( "You have %d %s\n", howmany, (LPCTSTR)kindOfFruit );
As an alternative, you can also use the method PCXSTR CString::GetString() const; to try to fix the warning:
CString strTmp, str;
str = L"aaa.txt"
strTmp.Format (L"File: %s", str.GetString());
One of CString's design flaws, err, features is that it provides an implicit conversion to LPCTSTR, which makes the warning not that meaningful IMHO. But anyway, if you look at the Microsoft documentation, they actually use casts in their own example. I don't really see the need to avoid a static_cast here; in fact I would welcome it, as it makes the implicit conversion explicit and thus easier to spot.
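For completeness, the static_cast variant referred to above also silences C6284, because the analyzer then sees a plain LPCTSTR argument:
CString strTmp, str;
str = L"aaa.txt";
strTmp.Format(L"File: %s", static_cast<LPCTSTR>(str));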

LPCSTR, TCHAR, String

I use the following string types: LPCSTR, TCHAR, String. I want to convert:
from TCHAR to LPCSTR
from String to char
I convert from TCHAR to LPCSTR with this code:
RunPath = TEXT("C:\\1");
LPCSTR Path = (LPCSTR)RunPath;
From String to char I convert with this code:
SaveFileDialog^ saveFileDialog1 = gcnew SaveFileDialog;
saveFileDialog1->Title = "Save settings file";
saveFileDialog1->Filter = "bck files (*.bck)|*.bck";
saveFileDialog1->RestoreDirectory = true;
pin_ptr<const wchar_t> wch = TEXT("");
if ( saveFileDialog1->ShowDialog() == System::Windows::Forms::DialogResult::OK ) {
wch = PtrToStringChars(saveFileDialog1->FileName);
} else return;
ofstream os(wch, ios::binary);
My problem is that when I set "Configuration Properties -> General -> Character Set" to "Use Multi-Byte Character Set", the first part of the code works correctly, but the second part returns error C2440. When I set "Configuration Properties -> General -> Character Set" to "Use Unicode", the second part of the code works correctly, but the first part returns only the first character of the TCHAR string when converted to LPCSTR.
I'd suggest you need to be using Unicode the whole way through.
LPCSTR is a "long pointer to a constant string". That's typically not what you want when you're dealing with .NET methods; the char type in .NET is 16 bits wide.
You also should not use the TEXT("") macro unless you're planning multiple builds using various character encodings. Try prefixing your string literals with L"" instead, and use a pure Unicode build if you can.
See if that helps.
PS. std::wstring is very handy in your scenario.
EDIT
You see only one character because the string is now Unicode, but you cast it as a regular string. Most Unicode characters in the ASCII range have the same code value as in ASCII, but the second of their two bytes is zero. So when a Unicode string is read as a C string you only see the first character, because C strings are null (zero) terminated. The easy (and wrong) way to deal with this is to narrow the std::wstring into a std::string and then pull the C string out of that. This is not a safe approach, because Unicode has a much larger character space than your standard encoding.
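A minimal sketch of staying Unicode the whole way through, assuming MSVC's std::ofstream overload that accepts a wide (const wchar_t*) file name; the SaveTo wrapper is illustrative:
#include <fstream>
#include <string>
#include <vcclr.h>   // PtrToStringChars

void SaveTo(System::String^ fileName)
{
    pin_ptr<const wchar_t> wch = PtrToStringChars(fileName);
    std::wstring path(wch);                                // keep the name as a wide string

    std::ofstream os(path.c_str(), std::ios::binary);      // MSVC accepts const wchar_t* here
    // ... write the binary data ...
}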
