Dynamic Memory Deletion in vc++ - visual-c++

I am using _aligned_malloc in my code, but it is throwing an error, as shown in the image.
CString sBuffer = _T("Hello");
TCHAR* pBuffer;
pBuffer = (TCHAR *)_aligned_malloc(1024, 16);
if (pBuffer == NULL) {
    ...............Error .. msg
}
pBuffer = sBuffer.GetBuffer(sBuffer.GetLength());
..................................................
.........................................................
sBuffer.ReleaseBuffer(sBuffer.GetLength());
if (pBuffer != NULL) {
    _aligned_free(pBuffer);
}

The CString class implements an (LPCTSTR) cast operator that you can use to get a const TCHAR*.
Please note that TCHAR is defined as char in MBCS mode and as wchar_t in UNICODE mode. For more details, refer to tchar.h, where it is defined.
If you'd like to modify the content of the buffer, you'll need to use the GetBuffer() method. Don't forget to call ReleaseBuffer() when you're done. So there is no need to allocate memory manually.
You can also easily construct a CString from a TCHAR*; there is a constructor for that.
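To illustrate those three points, here is a minimal sketch (assuming an MFC/ATL CString):
CString sBuffer = _T("Hello");

// Read-only access: the LPCTSTR conversion operator, no allocation needed.
const TCHAR* pRead = (LPCTSTR)sBuffer;

// Writable access: the GetBuffer()/ReleaseBuffer() pair, still no manual allocation.
TCHAR* pWrite = sBuffer.GetBuffer(sBuffer.GetLength() + 1);
// ... modify pWrite in place ...
sBuffer.ReleaseBuffer(); // CString recomputes its length from the terminator

// Constructing a CString from a TCHAR* is a single constructor call.
const TCHAR* pRaw = _T("World");
CString sOther(pRaw);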

Related

C++ no acceptable conversion for operator+ with class

I have some 15-year-old C++ code that I am trying to bring up to more modern times. At this stage, I'm trying to get code that compiled with Visual C++ 6.0 to now compile with VS 2003 (Microsoft Visual C++ .NET 69462-335-0000007-18915). If we can get this to compile cleanly & run properly, then we can take another step to get it into a more recent version of VS. But I'm having a number of problems...
Here is a snippet of the (simplified) code:
class toS
{
public:
    toS() { buff[0] = '\0'; }
    operator LPCTSTR() { return buff; }
protected:
    void Append(TCHAR c)
    {
        LPTSTR p = buff + _tcslen(buff);
        *p++ = c;
        *p = '\0';
    }
    TCHAR buff[40];
};

class LtoS : public toS
{
public:
    LtoS(LONG n, TCHAR c = '\0')
    {
        _ltot(n, buff, 10);
        Append(c);
    }
};

void WriteBool(const CString& Section, const CString& Key, bool Value);

CString Section;
int nLine = 0;
std::vector<bool> *BoolVect;
std::vector<bool>::iterator vi;
...
for (vi = BoolVect->begin(); vi != BoolVect->end(); vi++)
    WriteBool(Section, "LineVis " + LtoS(nLine++), *vi);
...
From this I get the following error message:
error C2677: binary '+' : no global operator found which takes type 'LtoS' (or there is no acceptable conversion)
Any idea how this code ever worked? If I can find out what it did in the past, I can begin to define the overloaded operator+ to match the functionality.
The compiler error goes away when I make class toS inherit from CString:
class toS : public CString { ... }
Hopefully this will not only compile but will also execute correctly...
Building on several of the comments, try adding a public const conversion operator to class toS as follows:
operator LPCTSTR() const { return &buff[0]; }
You may need to explicitly construct the string in the for loop as well, e.g.:
WriteBool(Section, CString("LineVis ") + static_cast<LPCTSTR>(LtoS(nLine++)), *vi);
(Side note: As you probably know since you just extracted code for an example, there's a problem here:
std::vector<bool> BoolVect;
...
for (vi = BoolVect->begin(); vi != BoolVect->end(); vi++)
The notation you're using to access the BoolVect implies that it is a pointer, but it's not being declared as such in your example.)
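Putting those suggestions together, here is a hedged sketch of one way to make the original call site compile again; it assumes an MBCS build (so the "LineVis " literal matches LPCTSTR) and the const conversion operator shown above:
// Sketch only: a free operator+ so that "LineVis " + LtoS(nLine++) yields a CString.
inline CString operator+(LPCTSTR lhs, const toS& rhs)
{
    CString result(lhs);
    result += static_cast<LPCTSTR>(rhs); // relies on operator LPCTSTR() const
    return result;
}

// The original loop then compiles unchanged:
// WriteBool(Section, "LineVis " + LtoS(nLine++), *vi);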

Converting between WinRT HttpBufferContent and unmanaged memory in C++cx

As part of a WinRT C++cx component, what's the most efficient way to convert an unmanaged buffer of bytes (say expressed as a std::string) back and forth with a Windows::Web::Http::HttpBufferContent?
This is what I ended up with, but it doesn't seem very optimal:
std::string to HttpBufferContent:
std::string m_body = ...;
auto writer = ref new DataWriter();
writer->WriteBytes(ArrayReference<unsigned char>(reinterpret_cast<unsigned char*>(const_cast<char*>(m_body.data())), m_body.length()));
auto content = ref new HttpBufferContent(writer->DetachBuffer());
HttpBufferContent to std::string:
HttpBufferContent^ content = ...
auto operation = content->ReadAsBufferAsync();
auto task = create_task(operation);
if (task.wait() == task_status::completed) {
    auto buffer = task.get();
    size_t length = buffer->Length;
    if (length > 0) {
        unsigned char* storage = static_cast<unsigned char*>(malloc(length));
        DataReader::FromBuffer(buffer)->ReadBytes(ArrayReference<unsigned char>(storage, length));
        auto m_body = std::string(reinterpret_cast<char*>(storage), length);
        free(storage);
    }
} else {
    abort();
}
UPDATE: Here's the version I ended up using (you can trivially create a HttpBufferContent^ from an Windows::Storage::Streams::IBuffer^):
void IBufferToString(IBuffer^ buffer, std::string& string) {
    Array<unsigned char>^ array = nullptr;
    CryptographicBuffer::CopyToByteArray(buffer, &array); // TODO: Avoid copy
    string.assign(reinterpret_cast<char*>(array->Data), array->Length);
}

IBuffer^ StringToIBuffer(const std::string& string) {
    auto array = ArrayReference<unsigned char>(reinterpret_cast<unsigned char*>(const_cast<char*>(string.data())), string.length());
    return CryptographicBuffer::CreateFromByteArray(array);
}
I think you are making at least one unnecessary copy of your data in your current HttpBufferContent-to-std::string approach. You could improve this by accessing the IBuffer data directly; see the accepted answer here: Getting an array of bytes out of Windows::Storage::Streams::IBuffer
I think it's better to use a smart pointer (no manual memory management needed):
#include <wrl.h>
#include <robuffer.h>
#include <memory>
#include <string>
using namespace Windows::Storage::Streams;
using namespace Microsoft::WRL;
IBuffer^ buffer;
ComPtr<IBufferByteAccess> byte_access;
reinterpret_cast<IInspectable*>(buffer)->QueryInterface(IID_PPV_ARGS(&byte_access));
byte* raw_buffer = nullptr;
byte_access->Buffer(&raw_buffer); // direct pointer into the IBuffer's storage, no extra allocation
std::string str(reinterpret_cast<char*>(raw_buffer), buffer->Length); // just 1 copy

How to convert guid to *char

I would like to convert a CLSID to a char* in C++ so I can display it in a text box. I am new to C++, so please make this as simple as possible.
Thanks
C'ish solution:
/* 128 bit GUID to human-readable string */
char * guid_to_str(const GUID * id, char * out) {
    int i;
    char * ret = out;
    out += sprintf(out, "%.8lX-%.4hX-%.4hX-", id->Data1, id->Data2, id->Data3);
    for (i = 0; i < sizeof(id->Data4); ++i) {
        out += sprintf(out, "%.2hhX", id->Data4[i]);
        if (i == 1) *(out++) = '-';
    }
    return ret;
}
This assumes the output buffer has already been allocated and is at least 37 bytes long (including the null terminating character).
The output is of the form "75B22630-668E-11CF-A6D9-00AA0062CE6C"
Usage example:
GUID g;
char buffer[37];
std::cout << guid_to_str(&g, buffer);
Note:
This code exists because I had to implement GUID parsing under Linux; otherwise I would have used the Windows API function StringFromCLSID mentioned by @krowe.
Here is a great example for converting GUID to string and vice versa that I am using in my projects:
#include <array>
#include <cstdio>
#include <stdexcept>
#include <string>

std::string guidToString(GUID guid) {
    std::array<char, 40> output;
    snprintf(output.data(), output.size(), "{%08lX-%04hX-%04hX-%02hhX%02hhX-%02hhX%02hhX%02hhX%02hhX%02hhX%02hhX}", guid.Data1, guid.Data2, guid.Data3, guid.Data4[0], guid.Data4[1], guid.Data4[2], guid.Data4[3], guid.Data4[4], guid.Data4[5], guid.Data4[6], guid.Data4[7]);
    return std::string(output.data());
}

GUID stringToGUID(const std::string& guid) {
    GUID output;
    const auto ret = sscanf(guid.c_str(), "{%8lX-%4hX-%4hX-%2hhX%2hhX-%2hhX%2hhX%2hhX%2hhX%2hhX%2hhX}", &output.Data1, &output.Data2, &output.Data3, &output.Data4[0], &output.Data4[1], &output.Data4[2], &output.Data4[3], &output.Data4[4], &output.Data4[5], &output.Data4[6], &output.Data4[7]);
    if (ret != 11)
        throw std::logic_error("Invalid GUID, format should be {00000000-0000-0000-0000-000000000000}");
    return output;
}
The example first builds the char* buffer and then converts it to a std::string, so this is exactly what you are looking for, done efficiently.
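For completeness, a small round-trip usage sketch of the two helpers above (assumes <objbase.h> and <cassert>; CoCreateGuid and IsEqualGUID are standard Win32 calls):
GUID g;
CoCreateGuid(&g);                 // generate a GUID to round-trip

std::string s = guidToString(g);  // e.g. "{75B22630-668E-11CF-A6D9-00AA0062CE6C}"
GUID parsed = stringToGUID(s);    // parse it back

assert(IsEqualGUID(g, parsed));   // the round trip should be lossless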
The Windows API has a function for this:
CLSID clsid;
HRESULT hr = CLSIDFromProgID(OLESTR("Adobe.SVGCtl.3"), &clsid);
// Get class id as string
LPOLESTR className;
hr = StringFromCLSID(clsid, &className);
// convert to CString
CString c = (char *) (_bstr_t) className;
// then release the memory used by the class name
CoTaskMemFree(className);
// Now c is ready to use
A CLSID is the same as a UUID, so you can use the UuidToString() function
http://msdn.microsoft.com/en-us/library/windows/desktop/aa379352(v=vs.85).aspx
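A hedged sketch of what that looks like with the ANSI variant (links against Rpcrt4.lib; a CLSID can be passed where a UUID is expected because they share the same layout):
#include <rpc.h>
#pragma comment(lib, "Rpcrt4.lib")

CLSID clsid; // assume this was obtained earlier, e.g. via CLSIDFromProgID above
RPC_CSTR szUuid = nullptr;
if (UuidToStringA(&clsid, &szUuid) == RPC_S_OK) {
    const char* text = reinterpret_cast<const char*>(szUuid); // e.g. "75b22630-668e-11cf-a6d9-00aa0062ce6c"
    // ... display text ...
    RpcStringFreeA(&szUuid); // the string is allocated by the RPC runtime and must be freed here
}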

How to free up memory after converting a managed string to UTF-8 encoded unmanaged char*?

I'm not familiar with C++/CLI, so I'm not sure how to free the memory when using the code below (I got the solution here and modified it a little):
char* ManagedStringToUnmanagedUTF8Char( String^ s )
{
    array<unsigned char> ^bytes = Encoding::UTF8->GetBytes( s );
    pin_ptr<unsigned char> pinnedPtr = &bytes[0];
    return (char*)pinnedPtr;
}
The above code works when I test it by writing the chars to a text file. Please let me know if I'm missing something (do I need to clean up pinnedPtr?).
Now when I use it:
char* foobar = ManagedStringToUnmanagedUTF8Char("testing");
//do something with foobar
//do I need to free up memory by deleting foobar here?
//I tried 'delete foobar' or free(foobar) but it crashes my program
Hans Passant's comment is correct that the returned pointer to the buffer can be moved in memory by the garbage collector. This is because, when the function stack unwinds, pin_ptr will unpin the pointer.
The solution is to:
1. Obtain the System::String buffer and pin it so that the GC cannot move it.
2. Allocate memory on the unmanaged heap (or just heap), where it is not under the GC's jurisdiction and cannot be moved by the GC.
3. Copy memory (converting to the desired encoding) from the System::String buffer to the buffer allocated on the unmanaged heap.
4. Unpin the pointer so the GC can once again move the System::String in memory. (This is done when pin_ptr goes out of the function scope.)
Sample code:
char* ManagedStringToUnmanagedUTF8Char(String^ str)
{
    // obtain the buffer from System::String and pin it
    pin_ptr<const wchar_t> wch = PtrToStringChars(str);

    // get the number of bytes required, including the null terminator
    int nBytes = ::WideCharToMultiByte(CP_UTF8, 0, wch, -1, NULL, 0, NULL, NULL);
    assert(nBytes > 0);

    // allocate memory on the unmanaged heap, where the GC cannot move it
    char* lpszBuffer = new char[nBytes];

    // initialize the buffer to zero
    ZeroMemory(lpszBuffer, nBytes * sizeof(char));

    // convert wchar_t* to char*, specifying UTF-8 encoding
    nBytes = ::WideCharToMultiByte(CP_UTF8, 0, wch, -1, lpszBuffer, nBytes, NULL, NULL);
    assert(nBytes > 0);

    // return the buffer; the caller must delete[] it
    return lpszBuffer;
}
Now, when using:
char* foobar = ManagedStringToUnmanagedUTF8Char("testing");
// do something with foobar
// when foobar is no longer needed, you need to delete[] it, because
// ManagedStringToUnmanagedUTF8Char allocated it with new[] on the unmanaged heap
delete[] foobar;
I'm not familiar with Visual-C++ either, but according to this article
Pinning pointers cannot be used as: [...] the return type of a function
I'm not sure whether the pointer will be valid when the function ends (even though it's disguised as a char*).
It seems that you declare local variables in the function and want to pass them out to the calling scope. However, these will likely be out of scope by the time you return from the function.
Maybe you should reconsider what you are trying to achieve in the first place?
Note that in the article you referenced, a std::string (returned by value, i.e. by copy) is used as the return type.
std::string managedStringToStlString( System::String ^s )
{
    Encoding ^u8 = Encoding::UTF8;
    array<unsigned char> ^bytes = u8->GetBytes( s );
    pin_ptr<unsigned char> pinnedPtr = &bytes[0];
    return std::string( (char*)pinnedPtr, bytes->Length );
}
Thereby no local variables are passed out of their scope. The string is handed over by copy as an unmanaged std::string. This is exactly what this post suggests.
When you need a const char* later, you can use the string::c_str() method to get one. Note that you can also write a std::string to a file using file streams.
Is this an option for you?
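If it is, usage stays simple; here is a hedged sketch (the file name is made up, and the file-stream part assumes <fstream> and writes the raw UTF-8 bytes):
System::String^ managed = "testing";
std::string utf8 = managedStringToStlString(managed); // copied out as UTF-8

const char* p = utf8.c_str();                // valid for as long as utf8 is alive
std::ofstream out("output.txt", std::ios::binary);
out << utf8;                                 // write the UTF-8 bytes to the file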

Access violation exception while using getenv to retrieve an environment variable that doesn't exist

I am using MS Visual Studio 2008 to develop a C++ application. I use the getenv() function to fetch an environment variable, but when the requested environment variable doesn't exist, it throws an access violation exception. What is the issue here, and how do I correct it?
The docs say that the getenv() function will return a NULL pointer if the searched environment variable doesn't exist, but why am I getting this access violation exception?
The std::string class calls strlen when you use std::string(str), which will produce an access violation when passed a NULL pointer. What you need to do is something like:
std::string env(const char *name)
{
    const char *ret = getenv(name);
    if (!ret) return std::string();
    return std::string(ret);
}
or
bool getenv(const char *name, std::string &env)
{
    const char *ret = getenv(name);
    if (ret) env = std::string(ret);
    return !!ret;
}
which you could use like this:
std::string myenv;
if (getenv("MYENV", myenv))
    doSomethingWithMyEnv(myenv);
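With the first helper, a variable that isn't set simply comes back as an empty string instead of crashing, for example (MY_OPTIONAL_DIR is a made-up variable name):
std::string dir = env("MY_OPTIONAL_DIR");
if (dir.empty())
    dir = "C:\\Temp"; // fall back to a default instead of dereferencing NULL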
