Move constructor (rvalue reference) in implicit conversion - rvalue-reference

I am upgrading a C++ project from MSVC 2008 to 2010, and because of the new CComBSTR move constructor [CComBSTR( CComBSTR&& )], I am now getting an ambiguous-call compiler error.
Essentially, we have a String class, very similar to std::wstring, that has a cast operator to CComBSTR. It is similar to the following code:
class CString {
public:
    // ...
    operator CComBSTR() {
        CComBSTR temp;
        /* Encoding conversion here */
        return temp;
    }
};

class CObjectConfig {
public:
    CString GetName() const { return m_name; }
private:
    CString m_name;
};
Now, at some places in the code, we do the following:
CObjectConfig config = GetObjectConfig();
CComBSTR objectName( config.GetName() );
In VS2008, this would work because the CString object would be implicitly converted to a CComBSTR rvalue and the copy constructor of CComBSTR (taking a const CComBSTR&) would be called to construct objectName.
In VS2010 with C++0x, however, the compiler gives an ambiguous-call error because the CComBSTR rvalue seems to fit both the copy constructor and the move constructor.
While a bit clumsy, my workaround is to static_cast the result of GetName:
CComBSTR objectName( static_cast<const CComBSTR&>( config.GetName() ) );
// or
CComBSTR objectName( static_cast<CComBSTR&&>( config.GetName() ) );
Both lines compile without error, but I need your advice on whether this is illegal, bad practice, or undefined behavior. Thank you.

This looks like a VC2010 bug to me. Either that, or I've incorrectly emulated your situation on my computer (I don't have VC2010). Here's what I'm doing:
#include <iostream>

class CComBSTR
{
public:
    CComBSTR() {std::cout << "CComBSTR()\n";}
    CComBSTR(const CComBSTR&) {std::cout << "CComBSTR(const CComBSTR&)\n";}
    CComBSTR(CComBSTR&&) {std::cout << "CComBSTR(CComBSTR&&)\n";}
};

class CString {
public:
    // ...
    operator CComBSTR() {
        CComBSTR temp;
        /* Encoding conversion here */
        return temp;
    }
};

class CObjectConfig {
public:
    CString GetName() const { return m_name; }
private:
    CString m_name;
};

CObjectConfig GetObjectConfig()
{
    return CObjectConfig();
}

int main()
{
    CObjectConfig config = GetObjectConfig();
    CComBSTR objectName( config.GetName() );
}
For me, on g++ 4.4 and clang (with -std=c++0x), this compiles fine, and it either calls or elides a call to CComBSTR(CComBSTR&&). My recommendation for working around this suspected bug is simply:
CComBSTR objectName( CComBSTR(config.GetName()) );
This is equivalent to your:
CComBSTR objectName( static_cast<CComBSTR&&>( config.GetName() ) );
but not as scary looking (and just as efficient). If you want to stay with the static_cast, then go with cast to CComBSTR&& as this will probably be more efficient than construction from a const lvalue.


C++ to C# char[]

C# code:
class Hello{
    public void helloWorld(char[] chars){
        //do something
    }
}
C++ code to call C#:
MyCSDLL::Hello* hello;
//init hello, some calls are ok.
char* myCharPtr;
//init with message
HRESULT result = hello->helloWorld(safeArray, (MyCSDLL::_MyRetVal) _retValPtr);
Adapting from "How to create and initialize SAFEARRAY of doubles in C++ to pass to C#":
void createSafeArray(SAFEARRAY** saData, char* charPtr)
{
    char* iterator = charPtr;
    SAFEARRAYBOUND Bound;
    Bound.lLbound = 0;
    Bound.cElements = 10;
    *saData = SafeArrayCreate(VT_R8, 1, &Bound);

    char HUGEP *pdFreq;
    HRESULT hr = SafeArrayAccessData(*saData, (void HUGEP* FAR*)&pdFreq);
    if (SUCCEEDED(hr))
    {
        do {
            *pdFreq++ = *iterator;
        } while (*iterator++);
    }
}
How do I call hello->helloWorld()? It expects a SAFEARRAY*. The current code gives error 80131538. How do I fix it?
The C++ project is not compiled with /clr.
Let's suppose the C# code is this:
namespace ClassLibrary1
{
    [ComVisible(true)]
    [ClassInterface(ClassInterfaceType.AutoDual)]
    public class Hello
    {
        public void helloWorld(char[] chars)
        {
            ...
        }
    }
}
Then, you can call it with this C/C++ code, for example:
#import "C:\mycode\ClassLibrary1\bin\Debug\classlibrary1.tlb" raw_interfaces_only

using namespace ClassLibrary1;

HRESULT CallHello(wchar_t* charPtr, int count)
{
    CComPtr<_Hello> p;
    HRESULT hr = p.CoCreateInstance(__uuidof(Hello));
    if (FAILED(hr))
        return hr;

    SAFEARRAY* psa = SafeArrayCreateVector(VT_UI2, 0, count);
    if (!psa)
        return E_OUTOFMEMORY;

    LPVOID pdata;
    hr = SafeArrayAccessData(psa, &pdata);
    if (SUCCEEDED(hr))
    {
        CopyMemory(pdata, charPtr, count * 2); // count is the number of chars
        SafeArrayUnaccessData(psa);
        hr = p->helloWorld(psa);
    }
    SafeArrayDestroy(psa);
    return hr;
}
.NET's char type is Unicode, so its binary size is two bytes; the C equivalent is wchar_t (or unsigned short, etc.). The SAFEARRAY element type must match that, which is why I used VT_UI2 (the VT_R8 you used is an 8-byte real, equivalent to .NET's double type).
If you really want to start from C's char, then you must do some kind of conversion to a 2-byte character first.
Also, you can use the SafeArrayCreateVector function, which directly allocates a one-dimensional safe array. Don't forget to call the cleanup functions.
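If the data really starts out as a narrow char* (as in the question), one possible way to widen it before calling CallHello above is MultiByteToWideChar. This is only a minimal sketch assuming the ANSI code page; the Widen helper name is made up for illustration and error handling is omitted:
#include <windows.h>
#include <string>
#include <vector>

// Hypothetical helper: widen a NUL-terminated char* using the ANSI code page.
std::wstring Widen(const char* narrow)
{
    int len = MultiByteToWideChar(CP_ACP, 0, narrow, -1, NULL, 0); // length incl. terminating NUL
    if (len <= 1)
        return std::wstring();
    std::vector<wchar_t> buf(len);
    MultiByteToWideChar(CP_ACP, 0, narrow, -1, &buf[0], len);
    return std::wstring(&buf[0], len - 1);
}

// Usage with CallHello from the answer:
//   std::wstring w = Widen(myCharPtr);
//   HRESULT hr = CallHello(const_cast<wchar_t*>(w.c_str()), (int)w.length());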

C++ no acceptable conversion for operator+ with class

I have some 15-year-old C++ code that I am trying to bring up to more modern times. At this stage, I'm trying to get code that compiled with Visual C++ 6.0 to now compile with VS 2003 (Microsoft Visual C++ .NET 69462-335-0000007-18915). If we can get this to compile cleanly & run properly, then we can take another step to get it into a more recent version of VS. But I'm having a number of problems...
Here is a snippet of the (simplified) code:
class toS
{
public:
    toS() { buff[0] = '\0'; }
    operator LPCTSTR() { return buff; }
protected:
    void Append (TCHAR c)
    {
        LPTSTR p = buff + _tcslen(buff);
        *p++ = c;
        *p = '\0';
    }
    TCHAR buff[40];
};

class LtoS : public toS
{
public:
    LtoS(LONG n, TCHAR c = '\0')
    {
        _ltot(n, buff, 10);
        Append(c);
    }
};

void WriteBool(const CString& Section, const CString& Key, bool Value);

CString Section;
int nLine = 0;
std::vector<bool> *BoolVect;
std::vector<bool>::iterator vi;
...
for (vi = BoolVect->begin(); vi != BoolVect->end(); vi++)
    WriteBool(Section, "LineVis " + LtoS(nLine++), *vi);
...
From this I get the following error message:
error C2677: binary '+' : no global operator found which takes type 'LtoS' (or there is no acceptable conversion)
Any idea how this code ever worked? If I can find out what it did in the past, I can begin to define the overloaded operator+ to match the functionality.
The compiler error goes away when I make class toS inherit from CString:
class toS : public CString { ... }
Hopefully this will not only compile, but will also execute correctly...
Drawing on several of the comments, try adding a public conversion operator to class toS as follows:
operator LPCTSTR() const { return &buff[0]; }
You may need to explicitly construct the string in the for loop as well, e.g.:
WriteBool(Section, CString("LineVis ") + static_cast<LPCTSTR>(LtoS(nLine++)), *vi);
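Putting the two suggestions together, a rough sketch of the relevant parts (untested against VS 2003; the rest of toS is unchanged from the question):
class toS
{
public:
    toS() { buff[0] = '\0'; }
    // public const conversion operator, as suggested above
    operator LPCTSTR() const { return &buff[0]; }
protected:
    void Append (TCHAR c)
    {
        LPTSTR p = buff + _tcslen(buff);
        *p++ = c;
        *p = '\0';
    }
    TCHAR buff[40];
};

// ...

// Build the CString explicitly so the global operator+(const CString&, LPCTSTR) is used:
WriteBool(Section, CString("LineVis ") + static_cast<LPCTSTR>(LtoS(nLine++)), *vi);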
(Side note: As you probably know since you just extracted code for an example, there's a problem here:
std::vector<bool> BoolVect;
...
for (vi = BoolVect->begin(); vi != BoolVect->end(); vi++)
The notation you're using to access the BoolVect implies that it is a pointer, but it's not being declared as such in your example.)

Returning wchar_t* from C++/CLI to native

Let's say we have a pure virtual C++ class:
class INativeInterface {
public:
    virtual ~INativeInterface () {};
    virtual LPCWSTR GetString () = 0;
};
and then we need to provide an implementation of this interface in C++/CLI:
class HalfManagedImplementation : public INativeInterface {
public:
    virtual LPCWSTR GetString () override {
        // need to return wchar_t const * pointer which points to the
        // data of our managedData string
        // pin_ptr is not suitable as it will go out of scope
        // what other options do we have here?
        // perhaps copying managed string contents to unmanaged heap?
        wchar_t * unmanagedString = new wchar_t [managedData->Length + 1];
        pin_ptr<const wchar_t> pinnedString = PtrToStringChars (managedData);
        wcscpy_s (unmanagedString, managedData->Length + 1, pinnedString);
        return unmanagedString;
    }

private:
    String^ managedData;

    void SetString (String^ param){
        // do something in .net
        managedData = param;
    }
};
My main questions are:
Can I allocate a native string on the CRT heap and return a pointer to it to native C++ code, as I did above, given that memory allocated by the managed code will be de-allocated by the native code?
An example of usage:
LPCWSTR data = cppCliObject->GetString ();
// do stuff with returned data or persist it by copying it somewhere else
delete[] data;
Is the first point valid when the native C++ code is in a different DLL than the C++/CLI one?
Are there any other alternatives or best practices when returning wchar_t data to native C++?

Why aren't these arguments valid?

//Block.h
#pragma once

class Block
{
public:
    CRect pos;
    int num;
public:
    Block(void);
    ~Block(void);
};
//view class
public:
    Block currentState[5]; // stores the current state of the blocks

void CpuzzleView::OnDraw(CDC* pDC)
{
    CpuzzleDoc* pDoc = GetDocument();
    ASSERT_VALID(pDoc);
    if (!pDoc)
        return;

    //draw the 4 blocks and put text into them
    for(int i=0;i<4;i++)
    {
        pDC->Rectangle(currentState[i].pos);
        // I'm getting an error for this line:
        pDC->TextOut(currentState[i].pos.CenterPoint(), currentState[i].num);
    }
}
The error says that no instance of the overloaded function CDC::TextOutW() matches the argument list. But the prototype of the function is:
CDC::TextOutW(int x, int y, const CString &str )
All I've done is pass the point object returned by CenterPoint() directly, instead of the two coordinates... shouldn't it work?
That's because you didn't supply the argument list correctly. Please read the compiler error message carefully; it usually helps to solve the problem.
TextOut(currentState[i].pos.CenterPoint(), currentState[i].num);
In this call you passed a CPoint object and an int. This is not correct; you need to pass an int, an int, and a CString (or a const char* and an int length).
To fix this, you should do something like this:
CString strState;
strState.Format(_T("%d"), currentState[i].num); // or build the string with _itot()
pDC->TextOut(currentState[i].pos.CenterPoint().x, currentState[i].pos.CenterPoint().y, strState);
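For reference, the whole loop with this fix applied might look like the following sketch (_T() keeps the format string correct in both ANSI and Unicode builds):
for (int i = 0; i < 4; i++)
{
    pDC->Rectangle(currentState[i].pos);

    CString strState;
    strState.Format(_T("%d"), currentState[i].num);

    CPoint center = currentState[i].pos.CenterPoint();
    pDC->TextOut(center.x, center.y, strState);
}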

Is it necessary to use GC::KeepAlive in C++/CLI when the managed handle is held in a managed container (IList<Foo^>)?

I'm confused about when I need to use KeepAlive in my C++/CLI wrapper code and how lifetimes are handled in it. Consider the following code and note the places where I ask whether KeepAlive is needed.
// convert from managed to native string
inline std::string ToStdString(String^ source)
{
    if (String::IsNullOrEmpty(source))
        return std::string();

    int len = ((source->Length+1) * 2);
    /*** Do I need GC::KeepAlive(source) here? ***/
    char *ch = new char[ len ];
    bool result;
    {
        pin_ptr<const wchar_t> wch = PtrToStringChars( source );
        result = wcstombs( ch, wch, len ) != -1;
    }
    std::string target = ch;
    delete[] ch;
    if(!result)
        throw gcnew Exception("error converting System::String to std::string");
    return target;
}
// convert from native to managed string
inline String^ ToSystemString(const std::string& source)
{
    return gcnew String(source.c_str());
}
// unmanaged C++ class
struct NativeDog
{
    std::string name;
    std::string bark() const {return "woof";}
    void eat(std::string& food) const {food.clear();}
};
typedef shared_ptr<NativeDog> NativeDogPtr;

// C++/CLI wrapper class
ref class ManagedDog
{
    NativeDogPtr* base_;
    NativeDog& base() {return **base_;}

    ManagedDog() {base_ = new NativeDogPtr(new NativeDog);}
    ~ManagedDog() {if (base_) delete base_;}
    !ManagedDog() {delete this;}

    property String^ name
    {
        String^ get() {return ToSystemString(base().name);}
        void set(String^ name)
        {
            base().name = ToStdString(name);
            /*** Do I need GC::KeepAlive(name) here? ***/
        }
    }

    String^ bark() {return ToSystemString(base().bark());}

    void eat(String^ food)
    {
        std::string nativeFood = ToStdString(food);
        base().eat(nativeFood);
        food = ToSystemString(nativeFood);
        /*** Do I need GC::KeepAlive(food) here? ***/
    }
};

// unmanaged C++ class
struct NativeKennel
{
    vector<NativeDogPtr> dogs;
};

// C++/CLI wrapper class
ref class ManagedKennel
{
    NativeKennel* base_;
    NativeKennel& base() {return *base_;}

    IList<ManagedDog^>^ dogs;

    void addDog(ManagedDog^ dog)
    {
        base().dogs.push_back(*dog->base_);
        dogs->Add(dog);
        /*** Do I need GC::KeepAlive(dog) here? Will the IList manage the ManagedDog lifetimes? ***/
    }
};
Right before calling a managed delegate's function pointer.
This is a common failure mode: the garbage collector cannot see references held by native code. The managed code must store a reference to the delegate itself to prevent it from being garbage-collected. There's a debugger assistant for this; not sure why you didn't see it. More details in this MSDN Library article.
None of the above!
If you access managed classes in C++/CLI, KeepAlive won't help. You need to pin the data in memory to stop it from relocating after a garbage collect. In all of these examples, this is done implicitly by the functions you call.
KeepAlive has a different goal. References stored on the stack are subject to garbage collection immediately after the last time the object is dereferenced. KeepAlive prevents this from happening, by extending the lifetime of your object until after the KeepAlive call.
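To illustrate the delegate case from the first answer, here is a minimal C++/CLI sketch. RegisterNativeCallback and the other names are made up for illustration; the point is only that the delegate must stay reachable for as long as native code can call through the raw function pointer:
using namespace System;
using namespace System::Runtime::InteropServices;

// Hypothetical native API that stores the callback and may invoke it later.
typedef void (__stdcall *NativeCallbackFn)(int value);
void RegisterNativeCallback(NativeCallbackFn fn);

public delegate void ManagedCallback(int value);

ref class CallbackOwner
{
    // Storing the delegate in a field keeps it reachable by the GC. Without this
    // (or a GC::KeepAlive placed after the last possible native call), the
    // delegate could be collected while native code still holds the pointer.
    ManagedCallback^ callback_;

public:
    void Register()
    {
        callback_ = gcnew ManagedCallback(&CallbackOwner::OnNative);
        IntPtr fp = Marshal::GetFunctionPointerForDelegate(callback_);
        RegisterNativeCallback(reinterpret_cast<NativeCallbackFn>(fp.ToPointer()));
    }

    static void OnNative(int value) { /* handle the callback */ }
};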
