Converting string to TCHAR in VC++

How can I convert a std::string to a TCHAR array in VC++?
string internetprotocol="127.4.5.6";
TCHAR szProxyAddr[16];
I want to set:
szProxyAddr=internetprotocol;
How can I do it?

#include <atlstr.h>
string internetprotocol="127.4.5.6";
TCHAR szProxyAddr[16];
_tcscpy_s(szProxyAddr, CA2T(internetprotocol.c_str()));
_tcscpy_s is the generic version of strcpy_s and works in both the Unicode and Multi-Byte character set configurations. CA2T converts the const char* to a TCHAR*, matching the type of szProxyAddr.
Be careful with the length of the destination buffer.
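For completeness, here is a minimal self-contained sketch of the same conversion (variable names follow the question; this assumes ATL is available for CA2T):
#include <string>
#include <tchar.h>
#include <atlstr.h>   // CA2T (requires ATL)

int _tmain()
{
    std::string internetprotocol = "127.4.5.6";
    TCHAR szProxyAddr[16];

    // CA2T builds a temporary TCHAR copy of the char data; _tcscpy_s then
    // copies it into szProxyAddr and fails if the destination is too small.
    _tcscpy_s(szProxyAddr, CA2T(internetprotocol.c_str()));
    return 0;
}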


Related

Unicode to char* in C++11

I want to know if there is any way to convert a Unicode code point to a string or char in C++11.
I've been trying with the extended Latin Unicode letter Á (as an example), which has this encoding:
letter: Á
Unicode: 0x00C1
UTF-8 literal: \xc3\x81
I've been able to do it if it's hardcoded as:
const char* c = u8"\u00C1";
But if I get the code point as a short, how can I do the equivalent to get the char* or std::string 'Á'?
EDIT, SOLUTION:
I was finally able to do it; here is the solution if anyone needs it:
#include <codecvt>
#include <locale>
#include <string>

std::wstring ws;
for (short input : inputList)   // inputList holds the UTF-16 code units
{
    wchar_t wc(input);
    ws += wc;
}
// convert the wide string to a UTF-8 encoded std::string
std::wstring_convert<std::codecvt_utf8<wchar_t>> cv;
std::string str = cv.to_bytes(ws);
Thanks for the comments, they were very helpful.
The C++11 standard contains codecvt_utf8, which converts between some internal character type (try char16_t if your compiler has it, otherwise wchar_t) and UTF-8 encoding.
The problem is that char is only one byte long, while Unicode characters may require two or more bytes.
You can still treat the result as a char*, but you must remember that you are not dealing with an ASCII string (there may be embedded zero bytes).
You may have to switch to wchar_t.
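As a sketch of the char16_t route mentioned above (inputList and the example value come from the question; std::codecvt_utf8_utf16 also handles surrogate pairs, though it was later deprecated in C++17):
#include <codecvt>
#include <locale>
#include <string>
#include <vector>

int main()
{
    std::vector<short> inputList = { 0x00C1 };   // 'Á' as a UTF-16 code unit

    std::u16string u16;
    for (short input : inputList)
        u16 += static_cast<char16_t>(input);

    // convert UTF-16 code units to a UTF-8 encoded std::string
    std::wstring_convert<std::codecvt_utf8_utf16<char16_t>, char16_t> cv;
    std::string utf8 = cv.to_bytes(u16);   // utf8 == "\xc3\x81"
    return 0;
}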

How to convert string to LPSTR for a WinAPI function which stores its output in the string

I am trying to store some output into a std::string variable by passing it as a parameter to various Windows API functions that accept a char* argument.
For example, my code is:
std::string myString;
GetCurrentDirectoryA( MAX_PATH, myString );
Now how do I convert the string variable to LPSTR in this case?
Please note, this function does not take the contents of the string as input; it stores some contents into the string variable after execution. So myString.c_str() is ruled out.
Edit: I have a workaround of dropping std::string and replacing it with something like
char myString[ MAX_PATH ];
but that is not my objective. I want to make use of string. Is there any way possible?
Also casting like
GetCurrentDirectoryA( MAX_PATH, ( LPSTR ) myString );
is not working.
Thanks in advance for the help.
Usually, people rewrite the Windows functions they need to be std::string friendly, like this:
std::string GetCurrentDirectoryA()
{
    char buffer[MAX_PATH];
    GetCurrentDirectoryA( MAX_PATH, buffer );
    return std::string(buffer);
}
or this for wide char support:
std::wstring GetCurrentDirectoryW()
{
    wchar_t buffer[MAX_PATH];
    GetCurrentDirectoryW( MAX_PATH, buffer );
    return std::wstring(buffer);
}
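Calling code then stays entirely in std::string / std::wstring terms; the zero-argument calls resolve to the wrappers rather than to the Win32 functions:
std::string dir = GetCurrentDirectoryA();    // calls the wrapper above
std::wstring wdir = GetCurrentDirectoryW();  // calls the wide wrapper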
LPTSTR is defined as TCHAR*, so it is just an ordinary C string, BUT whether the characters are char or wchar_t depends on whether you are building for ANSI or for Unicode. So do
LPTSTR lpStr = new TCHAR[256];
ZeroMemory(lpStr, 256 * sizeof(TCHAR));   // zero the whole buffer, not just 256 bytes
// fill the string, e.g. with _tcscpy_s; CA2T (from <atlstr.h>) converts the
// char* from the std::string to a TCHAR* so this compiles in both builds
_tcscpy_s(lpStr, 256, CA2T(myString.c_str()));
// use lpStr, then delete[] it when you are done
See here for a reference on _tcscpy and this thread.
Typically, I would read the data into a TCHAR buffer and then copy it into my std::string. That's the simplest way.
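A minimal sketch of that approach for the ANSI variant used in the question (the char buffer is filled by the API and then copied into the std::string, which manages its own memory from that point on):
#include <windows.h>
#include <string>

int main()
{
    std::string myString;
    char buffer[MAX_PATH] = {};

    if (GetCurrentDirectoryA(MAX_PATH, buffer) != 0)
        myString = buffer;   // copy the C string into the std::string
    return 0;
}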

How to convert unsigned char to LPCTSTR in Visual C++?

BYTE name[1000];
In my Visual C++ project there is a variable named name with the BYTE data type. If I am not wrong, BYTE is equivalent to unsigned char. Now I want to convert this unsigned char* to LPCTSTR.
How should I do that?
LPCTSTR is defined as either char const* or wchar_t const* based on whether UNICODE is defined or not.
If UNICODE is defined, then you need to convert the multi-byte string to a wide-char string using MultiByteToWideChar.
If UNICODE is not defined, a cast will suffice: reinterpret_cast< char const* >( name ) (a static_cast is not allowed between unsigned char* and char*).
This assumes that name is a null-terminated C string, in which case defining it as BYTE would make no sense. You should use CHAR or TCHAR, depending on how you are operating on name.
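A hedged sketch of the Unicode branch using MultiByteToWideChar, assuming name really does hold a null-terminated ANSI string:
#include <windows.h>

BYTE name[1000];   // assume this was filled with a null-terminated ANSI string

void Convert()
{
    // first call asks for the required size in wide characters (including
    // the terminator, because the length argument is -1)
    int len = MultiByteToWideChar(CP_ACP, 0,
                                  reinterpret_cast<const char*>(name), -1,
                                  nullptr, 0);
    wchar_t* wide = new wchar_t[len];
    MultiByteToWideChar(CP_ACP, 0,
                        reinterpret_cast<const char*>(name), -1,
                        wide, len);

    LPCWSTR pszName = wide;   // what LPCTSTR is in a UNICODE build
    // ... use pszName ...
    delete[] wide;
}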
You can also assign the name variable to a CString object directly:
CString strName = name;
Then call CString's GetBuffer(), or preferably GetString(), to obtain an LPCTSTR. The advantage is that the CString class performs any required conversions for you automatically; there is no need to worry about the Unicode setting.
LPCTSTR pszName = strName.GetString();

Confused about all the different string types and how to use them properly in Visual C++

Years ago, I used to do some basic programming in C. Now I am attempting to relearn what I have forgotten as well as learn Visual C++. I am confused, though, by all the string options and by the extra layer of trying to make my programs Unicode compatible. I have been reading Beginning Visual C++ 2010 as well as online material to learn this.
As an exercise I am writing a very basic program that asks a user to input some text and then display that text in the form of a messagebox. The program works, but my way of getting it to work was more through guesswork and looking at other examples than truly understanding why I need to convert the various strings into different types.
The code is:
#include "stdafx.h"
#include <iostream>
#include <string>
#include "Windows.h"
using std::wcin;
using std::wcout;
using std::wstring;
int _tmain(int argc, _TCHAR* argv[])
{
    wstring myInput;
    wcout << "Enter a string: ";
    getline(wcin, myInput);
    MessageBoxW(NULL, myInput.c_str(), _T("Test MessageBox"), 64);
    return 0;
}
The MessageBox syntax is:
int WINAPI MessageBox(
    __in_opt HWND hWnd,
    __in_opt LPCTSTR lpText,
    __in_opt LPCTSTR lpCaption,
    __in UINT uType
);
On the other hand, if I just use the command line argument as the text of the messagebox, I do not need to convert the string at all and I am not sure why.
#include "stdafx.h"
#include <iostream>
#include <string>
#include "Windows.h"
using std::wcout;
int _tmain(int argc, _TCHAR* argv[])
{
    MessageBoxW(NULL, argv[1], _T("Test MessageBox"), 64);
    return 0;
}
My confusion is:
Why do I need to use the c_str() for argument 2 to MessageBoxW and why do I need to use the _T() macro (?) in argument 3?
Why did the program work in the second code example without doing some sort of conversion?
What exactly does LPCTSTR mean? I see another variant in MSDN functions called LPTSTR.
Thanks!
1) .c_str() is the standard C++ way to get a C-style string from a C++ string object. _tmain, _T('x'), _T("text") and _TCHAR are (somewhat ugly) Microsoft macros that make your program compile in either Unicode or non-Unicode mode. There is a global setting in the project options that sets some macros to configure your project for one of these two modes.
If you are in non-unicode mode (referred to as ANSI mode in MS's documentation) the macros expand to:
main, 'x', "text", char
If you are in unicode mode, the macros expand to
wmain, L'x', L"text", wchar_t
2) and 3) Windows headers are full of typedefs and macros like that. Sometimes they make code more obscure than it needs to be. In general, LP means pointer (long pointer, I guess, but it has been a while since we needed to distinguish between near and far pointers), C means "const", T means it will be either char or wchar_t depending on project settings, and STR is obviously "string". After all, it's a plain C type; that's why you can pass C strings to these parameters without conversion.
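Roughly, the typedefs involved behave like this (simplified; the real definitions in the Windows headers go through CHAR/WCHAR and TCHAR):
#ifdef UNICODE
typedef wchar_t        TCHAR;
typedef const wchar_t *LPCTSTR;   // "long pointer to a const TCHAR string"
typedef wchar_t       *LPTSTR;    // same, but the characters are writable
#else
typedef char           TCHAR;
typedef const char    *LPCTSTR;
typedef char          *LPTSTR;
#endif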
The MessageBoxW function expects a C-style wide-character string (WCHAR*). The _T() macro (or the L prefix) turns your literal into a wide string so that it is Unicode compatible (WCHAR* instead of char*).
argv[] doesn't contain C++ string objects, so you're already getting a plain WCHAR pointer out of it.
LPCTSTR is basically a WINAPI typedef for const char * or const WCHAR*, depending on whether you are building as UNICODE. Also see this post: LPCSTR, LPCTSTR and LPTSTR
In short, your main function is being passed WCHAR* strings and MessageBoxW expects WCHAR* strings.
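To make those two points concrete, here is the first program written with explicit wide strings and no TCHAR macros (a sketch, assuming the project targets Unicode):
#include <iostream>
#include <string>
#include <Windows.h>

int wmain()
{
    std::wstring myInput;
    std::wcout << L"Enter a string: ";
    std::getline(std::wcin, myInput);

    // MessageBoxW wants const wchar_t*: .c_str() provides that for the
    // wstring, and the L prefix makes the caption a wide literal
    // (MB_ICONINFORMATION is the named form of the magic number 64).
    MessageBoxW(NULL, myInput.c_str(), L"Test MessageBox", MB_ICONINFORMATION);
    return 0;
}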

When do I need to use FreeHGlobal()?

If I use Marshal::StringToHGlobalAnsi as follows:
char *src = (char *)Marshal::StringToHGlobalAnsi(this->Textbox1->Text).ToPointer();
Do I need to call Marshal::FreeHGlobal()? And if so, what parameter should I give it?
According to MSDN, yes, you need to call FreeHGlobal (http://msdn.microsoft.com/en-us/library/system.runtime.interopservices.marshal.stringtohglobalansi%28v=VS.100%29.aspx):
"Because this method allocates the unmanaged memory required for a string, always free the memory by calling FreeHGlobal."
The C# string conversion functions are absolutely horrible by C++ standards.
C++/CLI has its own string conversion helpers, which follow the rules of RAII to automatically clean up temporary buffers. Just use:
#include <stdlib.h>
#include <string.h>
#include <msclr\marshal.h>
using namespace msclr::interop;
marshal_context converter;
const char *src = converter.marshal_as<const char*>(Textbox1->Text);
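For context, a minimal sketch of the same helper inside a function scope; the converted pointer is only valid while the marshal_context object is alive, so the data is copied into a std::string before the context is destroyed (ToStdString is a hypothetical helper name):
#include <msclr\marshal.h>
#include <string>

using namespace msclr::interop;
using namespace System;

std::string ToStdString(String^ managed)
{
    marshal_context context;
    const char* p = context.marshal_as<const char*>(managed);
    return std::string(p);   // copy before 'context' (and p) go out of scope
}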
Attached are my two practice snippets for Marshal::FreeHGlobal.
Note that the argument passed to Marshal::FreeHGlobal() is different in each case!
// Snippet 1: copy into a std::string, keeping the IntPtr so the same
// allocation can be freed afterwards (freeing a second StringToHGlobalAnsi
// call would leak the original buffer)
string CPlusPlusString;
String ^VisualString;
VisualString = textBox1->Text;
IntPtr ansi = Marshal::StringToHGlobalAnsi(VisualString);
CPlusPlusString = (char *)ansi.ToPointer();
Marshal::FreeHGlobal(ansi);

// Snippet 2: keep the raw char* and rebuild the IntPtr when freeing
char *CString;
String ^VisualString;
VisualString = textBox1->Text;
CString = (char*) Marshal::StringToHGlobalAnsi(VisualString).ToPointer();
Marshal::FreeHGlobal(IntPtr(CString));
