If I use Marshal::StringToHGlobalAnsi as follows:
char *src = (char *)Marshal::StringToHGlobalAnsi(this->Textbox1->Text).ToPointer();
Do I need to use Marshal::FreeHGlobal()? And if so, what parameter should I give?
According to MSDN - yes, you need to call FreeHGlobal. http://msdn.microsoft.com/en-us/library/system.runtime.interopservices.marshal.stringtohglobalansi%28v=VS.100%29.aspx:
Because this method allocates the unmanaged memory required for a
string, always free the memory by calling FreeHGlobal
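For example, a minimal sketch assuming the Textbox1 control from your question: keep the IntPtr returned by StringToHGlobalAnsi and pass that same IntPtr back to FreeHGlobal once you are done with the buffer.
IntPtr buffer = Marshal::StringToHGlobalAnsi(this->Textbox1->Text);
char *src = (char *)buffer.ToPointer();
// ... use src ...
Marshal::FreeHGlobal(buffer);   // the parameter is the IntPtr that was allocated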
The C# string conversion functions are absolutely horrible by C++ standards.
C++/CLI has its own string conversion helpers, which follow the rules of RAII to automatically clean up temporary buffers. Just use:
#include <stdlib.h>
#include <string.h>
#include <msclr\marshal.h>
using namespace msclr::interop;
marshal_context converter;
const char *src = converter.marshal_as<const char*>(Textbox1->Text);
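Keep in mind that converter owns the converted buffer: the const char* stays valid only until converter is destroyed, at which point the temporary memory is freed automatically (that is the RAII part).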
Attached are my two practice snippets for Marshal::FreeHGlobal.
Note that the argument passed to Marshal::FreeHGlobal() is different in each case!
string CPlusPlusString;
String ^VisualString;
VisualString = textBox1->Text;
IntPtr ansiBuffer = Marshal::StringToHGlobalAnsi(VisualString);
CPlusPlusString = (char *)ansiBuffer.ToPointer();   // std::string copies the characters out
Marshal::FreeHGlobal(ansiBuffer);                   // pass the same IntPtr that was allocated
char *CString;
String ^VisualString;
VisualString=textBox1->Text;
CString = (char*) Marshal::StringToHGlobalAnsi(VisualString).ToPointer();
Marshal::FreeHGlobal(IntPtr(CString));   // here the raw pointer must be wrapped back into an IntPtr
Related
I'm quite new to pointers in C.
Here is a snippet of code I'm working on. I am probably not passing the pointer correctly, but I can't figure out what's wrong.
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
__uint16_t CCrc8();
__uint16_t process_command();
int main () {
//command format: $SET,<0-1023>*<checksum,hex>\r\n
char test_payload[] = "SET,1023*6e";
process_command(test_payload);
return 0;
}
__uint16_t process_command(char *str1) {
char local_str[20];
memcpy(local_str, str1, sizeof(str1));
printf(str1);
printf("\n");
printf(local_str);
}
This results in:
SET,1023*6e
SET,1023
I'm expecting both lines to be the same. Anything past 8 characters is left off.
The only thing I can determine is that the problem is something with sizeof(str1). Any help appreciated.
Update: I've learned that sizeof(char *) is 2 on 16-bit systems, 4 on 32-bit systems and 8 on 64-bit systems.
So how can I use memcpy to get a local copy of str1 when I'm unsure of the size it will be?
sizeof is an operator evaluated by the compiler. What you need is strlen from #include <string.h>.
The value of sizeof is determined at compile time. For example, sizeof(char[10]) just means 10. strlen, on the other hand, is a libc function that determines the string length at run time.
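A quick illustration of the difference (hypothetical buffer; the pointer size assumes a 64-bit build):
char buf[10] = "hi";
char *p = buf;
printf("%zu\n", sizeof(buf));   /* 10 - the array size, known at compile time          */
printf("%zu\n", sizeof(p));     /* 8  - the size of the pointer itself, not the string */
printf("%zu\n", strlen(buf));   /* 2  - counted at run time, up to the '\0'            */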
sizeof on a pointer tells you the size of the pointer itself, not of what it points to. Since you're on a 64-bit system, pointers are 8 bytes long, so your memcpy is always copying 8 bytes. Since your string is null terminated, you should use stpncpy instead, like this:
if(stpncpy(local_str, str1, 20) == local_str + 20) {
// too long - handle it somehow
}
That will copy the string until it gets to a NUL terminator or runs out of space in the destination, and in the latter case you can handle it.
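If you specifically want memcpy, as asked, one way is a sketch along these lines: measure the source with strlen (the pointer itself doesn't know the length) and copy the terminator explicitly.
size_t len = strlen(str1);            /* length of the pointed-to string, not of the pointer */
if (len >= sizeof(local_str))         /* sizeof is fine here because local_str is an array   */
    len = sizeof(local_str) - 1;      /* leave room for the terminator                       */
memcpy(local_str, str1, len);
local_str[len] = '\0';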
I want to write a program that reads a string as input and stores it in a dynamically allocated array, so I allocate, for example, 20*sizeof(char) with malloc. If the input string is longer than the allocated memory, I want to grow the buffer, but I get a crash and cannot grow it with realloc.
What can I do?
This is my code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main() {
char *user;
int n = 0;
user = (char*)malloc(20*sizeof(char));
scanf("%s",user);
n = strlen(user);
user = (char*)realloc(user,n);
return 0;
}
The easiest way is to use the m modifier in scanf:
char *user = 0;
scanf("%ms", &user);
// use 'user' -- will be null if there was an error reading.
Unfortunately, this is only available on POSIX systems. On other systems, you'll need to write your own loop reading characters with getchar, and reallocate as you read (as needed).
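For instance, a rough portable sketch of such a loop (the helper name read_word is made up here): it grows the buffer with realloc as characters arrive and stops at whitespace, roughly like scanf("%s").
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

char *read_word(void)
{
    size_t cap = 20, len = 0;
    char *buf = malloc(cap);
    int c;
    if (!buf) return NULL;
    while ((c = getchar()) != EOF && !isspace(c)) {
        if (len + 1 >= cap) {                      /* need room for c and the '\0'   */
            char *tmp = realloc(buf, cap * 2);
            if (!tmp) { free(buf); return NULL; }  /* keep the old buffer on failure */
            buf = tmp;
            cap *= 2;
        }
        buf[len++] = (char)c;
    }
    buf[len] = '\0';
    return buf;                                    /* caller must free() the result  */
}
Then user = read_word(); replaces the scanf call, and the caller frees the result when done.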
How can I convert a string to TCHAR in VC++?
string internetprotocol="127.4.5.6";
TCHAR szProxyAddr[16];
I want to assign:
szProxyAddr=internetprotocol;
How can I do it?
#include <atlstr.h>
string internetprotocol="127.4.5.6";
TCHAR szProxyAddr[16];
_tcscpy_s(szProxyAddr, CA2T(internetprotocol.c_str()));
_tcscpy_s is the generic strcpy_s version, which works in both the Unicode and Multi-Byte Character configurations. CA2T converts the const char* to a TCHAR*, matching the type of the szProxyAddr variable.
Be careful with the length of the destination buffer.
Years ago, I used to do some basic programming in C. Now I am attempting to relearn what I have forgotten as well as learn Visual C++. I am confused though by all the string options and now the extra layer of trying to make my programs Unicode compatible. I have been reading Beginning Visual C++ 2010 as well as online reading to learn this information.
As an exercise I am writing a very basic program that asks a user to input some text and then display that text in the form of a messagebox. The program works, but my way of getting it to work was more through guesswork and looking at other examples than truly understanding why I need to convert the various strings into different types.
The code is:
#include "stdafx.h"
#include <iostream>
#include <string>
#include "Windows.h"
using std::wcin;
using std::wcout;
using std::wstring;
int _tmain(int argc, _TCHAR* argv[])
{
wstring myInput;
wcout << "Enter a string: ";
getline(wcin, myInput);
MessageBoxW(NULL, myInput.c_str(), _T("Test MessageBox"), 64);
return 0;
}
The MessageBox syntax is:
int WINAPI MessageBox(
__in_opt HWND hWnd,
__in_opt LPCTSTR lpText,
__in_opt LPCTSTR lpCaption,
__in UINT uType
);
On the other hand, if I just use the command line argument as the text of the messagebox, I do not need to convert the string at all and I am not sure why.
#include "stdafx.h"
#include <iostream>
#include <string>
#include "Windows.h"
using std::wcout;
int _tmain(int argc, _TCHAR* argv[])
{
MessageBoxW(NULL, argv[1], _T("Test MessageBox"), 64);
return 0;
}
My confusion is:
Why do I need to use the c_str() for argument 2 to MessageBoxW and why do I need to use the _T() macro (?) in argument 3?
Why did the program work in the second code example without doing some sort of conversion?
What exactly does LPCTSTR mean? I see another variant in MSDN functions called LPTSTR.
Thanks!
1) .c_str() is the standard C++ method that gives you a C-style (null-terminated) string from a C++ std::string. _tmain, _T('x'), _T("text") and _TCHAR are (somewhat ugly) Microsoft macros that make your program compile in either Unicode or non-Unicode mode. There's a global setting in the project options that sets some macros to configure your project in one of these two modes.
If you are in non-unicode mode (referred to as ANSI mode in MS's documentation) the macros expand to:
main, 'x', "text", char
If you are in unicode mode, the macros expand to
wmain, L'x', L"text", wchar_t
2) and 3) Windows headers are full of typedefs and macros like that. Sometimes they make code more obscure than it needs to be. In general, LP means pointer (long pointer, I guess, but it's been a while since we needed to distinguish between near and far pointers), C means "const", T means that it will be either char or wchar_t depending on project settings, and STR is obviously "string". After all, it's a plain C type, which is why you can pass C strings to them without conversion.
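Simplified, the machinery behind those names looks roughly like this (the real <tchar.h> and <windows.h> definitions are more involved):
#ifdef _UNICODE
    typedef wchar_t TCHAR;
    #define _T(x) L##x
#else
    typedef char TCHAR;
    #define _T(x) x
#endif
typedef const TCHAR *LPCTSTR;   // "long pointer to a constant TCHAR string"
typedef TCHAR *LPTSTR;          // same, but the characters are writable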
The MessageBoxW function expects a C-style wide-character string (WCHAR *). The _T() macro (together with the project's Unicode setting) turns your literal into a wide L"..." string, so it's compatible (WCHAR * instead of char *).
argv[] doesn't hold C++ string objects; its elements are already _TCHAR* (here WCHAR*) pointers, so you can pass them straight through.
LPCTSTR is basically a WINAPI typedef for const char * or const WCHAR*, depending on whether you are building as UNICODE. Also see this post: LPCSTR, LPCTSTR and LPTSTR
In short, your main function is being passed WCHAR* strings and MessageBoxW expects WCHAR* strings.
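As a small sketch putting it together (assuming a console project; the numeric 64 in your call is MB_ICONINFORMATION), the TCHAR-generic MessageBox macro lets the same source build in both the ANSI and Unicode configurations:
#include <tchar.h>
#include <windows.h>
#include <string>

int _tmain(int argc, _TCHAR* argv[])
{
    std::basic_string<TCHAR> text = argc > 1 ? argv[1] : _T("no argument given");
    // MessageBox expands to MessageBoxA or MessageBoxW to match TCHAR
    MessageBox(NULL, text.c_str(), _T("Test MessageBox"), MB_ICONINFORMATION);
    return 0;
}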
I'm trying to compile a VC6 project with VC10...
I get error C2678 with set_intersection, so I wrote a small example to understand it. Can anybody explain how to compile this snippet?
#include <vector>
#include <algorithm>
#include <iostream>
#include <set>
#include <string>
int main( )
{
using namespace std;
typedef set<string> MyType;
MyType in1, in2, out;
MyType::iterator out_iter(out.begin());
set_intersection(in1.begin(),in1.end(), in2.begin(), in2.end(), out_iter);
}
The output:
c:\program files\microsoft visual\studio 10.0\vc\include\algorithm(4494): error C2678: '=' binary : no operator defined which takes a left-hand operand of type 'const std::basic_string<_Elem,_Traits,_Ax>' (or there is no acceptable conversion)
If I use a std::vector instead of std::set, the compilation succeeds.
Try
set_intersection(in1.begin(),in1.end(), in2.begin(), in2.end(), inserter(out, out.begin()) );
This is because set_intersection writes through the output iterator, and the output container needs to grow as it does so. A plain iterator can't make the container grow; it can only overwrite existing elements, and for a std::set even that fails, because set elements are const when reached through an iterator (hence the compiler's complaint about operator=).
Edit: fixed the typo. Use inserter for adding to a set. A back_inserter only works for vectors and such.
Edit 2: fixed another typo. STL inserter requires a second argument which is a hint iterator to the likely insert position. Thanks chepseskaf.
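Putting it together, a minimal compilable sketch (with made-up sample data) of the corrected call:
#include <algorithm>
#include <iostream>
#include <iterator>
#include <set>
#include <string>

int main()
{
    using namespace std;
    typedef set<string> MyType;
    MyType in1, in2, out;
    in1.insert("a"); in1.insert("b"); in1.insert("c");   // sample data, made up here
    in2.insert("b"); in2.insert("c"); in2.insert("d");
    set_intersection(in1.begin(), in1.end(), in2.begin(), in2.end(),
                     inserter(out, out.begin()));        // inserter can grow the set
    for (MyType::iterator it = out.begin(); it != out.end(); ++it)
        cout << *it << "\n";                             // prints b and c
    return 0;
}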