Can't convert parameter from char[#] to LPWSTR - visual-c++

When I compile this code in Visual C++, I get the error below. Can you help me solve this issue?
DWORD nBufferLength = MAX_PATH;
char szCurrentDirectory[MAX_PATH + 1];
GetCurrentDirectory(nBufferLength, szCurrentDirectory);
szCurrentDirectory[MAX_PATH] = '\0'; // note: MAX_PATH + 1 would write past the end of the array
Error message:
Error 5 error C2664: 'GetCurrentDirectoryW' : cannot convert parameter 2 from 'char [261]' to 'LPWSTR' c:\car.cpp

Your program is configured to be compiled as Unicode. That's why GetCurrentDirectory resolves to GetCurrentDirectoryW, which expects an LPWSTR (wchar_t*).
GetCurrentDirectoryW expects a wchar_t array instead of a char array. You can handle both configurations by using TCHAR, which - like GetCurrentDirectory - depends on the Unicode setting and always expands to the appropriate character type.
Don't forget to wrap your '\0' in _T() (or prefix it with L when using wchar_t directly) so the character literal is wide, too!

It seems you have defined the UNICODE and _UNICODE compiler flags. In that case, you need to change the type of szCurrentDirectory from char to TCHAR.
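For illustration, here is a minimal sketch of the corrected snippet using TCHAR (this assumes <tchar.h> is included for the _T() macro):
#include <windows.h>
#include <tchar.h>

DWORD nBufferLength = MAX_PATH;
TCHAR szCurrentDirectory[MAX_PATH + 1]; // TCHAR expands to wchar_t when UNICODE is defined, char otherwise
GetCurrentDirectory(nBufferLength, szCurrentDirectory);
szCurrentDirectory[MAX_PATH] = _T('\0'); // _T() makes the terminator literal match the character type
This compiles in both the ANSI and Unicode configurations because every type and literal follows the project's character-set setting.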

Headers:
#include <iostream>
#include <fstream>
#include <direct.h>
#include <string.h>
#include <windows.h> //not sure
Function to get current directory:
std::string getCurrentDirectoryOnWindows()
{
    const unsigned long maxDir = 260;
    wchar_t currentDir[maxDir];
    GetCurrentDirectory(maxDir, currentDir);
    std::wstring ws(currentDir);
    // Narrow each wchar_t to char; this is only safe for ASCII paths.
    std::string current_dir(ws.begin(), ws.end());
    return current_dir;
}
To call function:
std::string path = getCurrentDirectoryOnWindows(); //Output like: C:\Users\NameUser\Documents\Programming\MFC Program 5
To make dir (Folder) in current directory:
std::string FolderName = "NewFolder";
std::string Dir1 = getCurrentDirectoryOnWindows() + "\\" + FolderName;
_mkdir(Dir1.c_str());
This works for me in MFC C++.

Related

shm_open() fails with EINVAL when creating shared memory in subdirectory of /dev/shm

I have a GNU/Linux application which uses a number of shared memory objects, and it could potentially be run a number of times on the same system. To keep things tidy, I first create a directory in /dev/shm for each set of shared memory objects.
The problem is that on newer GNU/Linux distributions, I no longer seem to be able to create these in a subdirectory of /dev/shm.
The following minimal C program illustrates what I'm talking about:
/*****************************************************************************
* shm_minimal.c
*
* Test shm_open()
*
* Expect to create shared memory file in:
* /dev/shm/
* └── my_dir
*    └── shm_name
*
* NOTE: Only visible on filesystem during execution. I try to be nice, and
* clean up after myself.
*
* Compile with:
* $ gcc shm_minimal.c -o shm_minimal -lrt
*
******************************************************************************/
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <fcntl.h>
#include <errno.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <sys/types.h>
int main(int argc, const char* argv[]) {
  int shm_fd = -1;
  char* shm_dir = "/dev/shm/my_dir";
  char* shm_file = "/my_dir/shm_name"; /* does NOT work */
  //char* shm_file = "/my_dir_shm_name"; /* works */

  // Create directory in /dev/shm
  mkdir(shm_dir, 0777);

  // make shared memory segment
  shm_fd = shm_open(shm_file, O_RDWR | O_CREAT, 0600);
  if (-1 == shm_fd) {
    switch (errno) {
      case EINVAL:
        /* Confirmed on:
         * kernel v3.14, GNU libc v2.19 (ArchLinux)
         * kernel v3.13, GNU libc v2.19 (Ubuntu 14.04 Beta 2)
         */
        perror("FAIL - EINVAL");
        return 1;
      default:
        printf("Some other problem not being tested\n");
        return 2;
    }
  } else {
    /* Confirmed on:
     * kernel v3.8, GNU libc v2.17 (Mint 15)
     * kernel v3.2, GNU libc v2.15 (Xubuntu 12.04 LTS)
     * kernel v3.1, GNU libc v2.13 (Debian 6.0)
     * kernel v2.6.32, GNU libc v2.12 (RHEL 6.4)
     */
    printf("Success !!!\n");
  }

  // clean up
  close(shm_fd);
  shm_unlink(shm_file);
  rmdir(shm_dir);

  return 0;
}
/* vi: set ts=2 sw=2 ai expandtab:
*/
When I run this program on a fairly new distribution, the call to shm_open() returns -1, and errno is set to EINVAL. However, when I run on something a little older, it creates the shared memory object in /dev/shm/my_dir as expected.
For the larger application, the solution is simple. I can use a common prefix instead of a directory.
If you could help enlighten me about this apparent change in behavior, it would be very helpful. I suspect someone else out there might be trying to do something similar.
So it turns out the issue stems from how GNU libc validates the shared memory name. Specifically, the shared memory object MUST now be at the root of the shmfs mount point.
This was changed in glibc git commit b20de2c3d9 as the result of bug BZ #16274.
Specifically, the change is the line:
if (name[0] == '\0' || namelen > NAME_MAX || strchr (name, '/') != NULL)
which now disallows '/' anywhere in the name (not counting the leading '/').
If you have a third-party tool that was broken by this shm_open change, a brilliant coworker found a workaround: preload a library that overrides the shm_open call and swaps slashes for underscores. It does the same for shm_unlink, so the application can properly free shared memory when needed.
deslash_shm.cc :
#include <dlfcn.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <algorithm>
#include <string>
// function used in place of the standard shm_open() function
extern "C" int shm_open(const char *name, int oflag, mode_t mode)
{
// keep a function pointer to the real shm_open() function
static int (*real_open)(const char *, int, mode_t) = NULL;
// the first time in, ask the dynamic linker to find the real shm_open() function
if (!real_open) real_open = (int (*)(const char *, int, mode_t)) dlsym(RTLD_NEXT,"shm_open");
// take the name we were given and replace all slashes with underscores instead
std::string n = name;
std::replace(n.begin(), n.end(), '/', '_');
// call the real open function with the patched path name
return real_open(n.c_str(), oflag, mode);
}
// function used in place of the standard shm_unlink() function
extern "C" int shm_unlink(const char *name)
{
// keep a function pointer to the real shm_unlink() function
static int (*real_unlink)(const char *) = NULL;
// the first time in, ask the dynamic linker to find the real shm_unlink() function
if (!real_unlink) real_unlink = (int (*)(const char *)) dlsym(RTLD_NEXT, "shm_unlink");
// take the name we were given and replace all slashes with underscores instead
std::string n = name;
std::replace(n.begin(), n.end(), '/', '_');
// call the real unlink function with the patched path name
return real_unlink(n.c_str());
}
To compile this file:
c++ -fPIC -shared -o deslash_shm.so deslash_shm.cc -ldl
And preload it before starting a process that tries to use non-standard slash characters in shm_open:
in bash:
export LD_PRELOAD=/path/to/deslash_shm.so
in tcsh:
setenv LD_PRELOAD /path/to/deslash_shm.so
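Or, to preload it for a single run only (./third_party_tool is a hypothetical program name):
LD_PRELOAD=/path/to/deslash_shm.so ./third_party_tool
The dynamic linker loads deslash_shm.so ahead of librt/libc, so its shm_open and shm_unlink definitions intercept the application's calls for that process only.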

Combining a static char* and a constant string

I want to be able to append a constant string to the end of another string in the form of a char*, and then use the resulting string as an argument for open(). Here's what it looks like:
file1.cpp
#include "string.h"
file2 foo;
char* word = "some";
foo.firstWord = word; //I want the file2 class to be able to see "some"
file2.h
#include <fstream>
#include <iostream>
#define SECONDWORD "file.txt"
class file2{
public:
    file2();
    static char* firstWord;
    static char* fullWord;
private:
    std::ofstream stream; // needs the std:: qualifier (or a using-declaration)
};
file2.cpp
#include "file2.h"
char* file2::firstWord;
char* file2::fullWord;
fullWord = firstWord + SECONDWORD; //so fullWord is now "somefile.txt" ,I know this doesn't work, but basically I am trying to figure out this part
file2::file2(){
    stream.open(fullWord);
}
I am not very well versed in C++, so any help would be appreciated!
A C++-style solution might be the following.
#include <string>
char* a = "file";
char* b = ".txt";
...
stream.open((std::string(a) + b).c_str());
What happens here? First, std::string(a) creates a temporary std::string object. Then the value of b is appended to it. Finally, the c_str() method returns a C-style string containing a + b.
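Put together as a self-contained sketch (the firstWord/secondWord variables here are stand-ins, not the asker's class members):
#include <fstream>
#include <iostream>
#include <string>

int main() {
    const char* firstWord = "some";          // the char* piece
    const char* secondWord = "file.txt";     // the constant string piece
    // std::string overloads operator+, so the concatenation just works
    std::string fullWord = std::string(firstWord) + secondWord;
    std::cout << fullWord << std::endl;      // prints: somefile.txt
    std::ofstream stream(fullWord.c_str());  // the same C string works for open()
    return 0;
}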

Convert a String^ to wstring C++

I wrote a small application in C++. There is a ListBox in the UI, and I want to use the selected item of the ListBox in an algorithm where I can only use wstrings.
All in all, I have two questions:
-how can I convert my
String^ curItem = listBox2->SelectedItem->ToString();
to a wstring test?
-what does the ^ in the code mean?
Thanks a lot!
It should be as simple as:
std::wstring result = msclr::interop::marshal_as<std::wstring>(curItem);
You'll also need header files to make that work:
#include <msclr\marshal.h>
#include <msclr\marshal_cppstd.h>
What this marshal_as specialization looks like inside, for the curious:
#include <vcclr.h>
pin_ptr<WCHAR> content = PtrToStringChars(curItem);
std::wstring result(content, curItem->Length);
This works because System::String is stored as wide characters internally. If you wanted a std::string, you'd have to perform a Unicode conversion with, e.g., WideCharToMultiByte. It's convenient that marshal_as handles all the details for you.
I flagged this as a duplicate, but here's the answer on how to get from a System::String^ to a std::string.
String^ test = L"I am a .Net string of type System::String";
IntPtr ptrToNativeString = Marshal::StringToHGlobalAnsi(test);
char* nativeString = static_cast<char*>(ptrToNativeString.ToPointer());
std::string result(nativeString);         // copy the text before releasing the unmanaged buffer
Marshal::FreeHGlobal(ptrToNativeString);  // otherwise the HGlobal allocation leaks
The trick is to make sure you use interop and marshaling, because you have to cross the boundary from managed code to unmanaged code.
My version is:
Platform::String^ str = L"my text";
std::wstring wstring = str->Data();
With Visual Studio 2015, just do this:
String^ s = "Bonjour!";
C++/CLI
#include <vcclr.h>
pin_ptr<const wchar_t> ptr = PtrToStringChars(s);
C++/CX
const wchar_t* ptr = s->Data();
According to Microsoft:
Ref: How to: Convert System::String to Standard String
You can convert a String to std::string or std::wstring, without using PtrToStringChars in Vcclr.h.
// convert_system_string.cpp
// compile with: /clr
#include <string>
#include <iostream>
using namespace std;
using namespace System;
void MarshalString ( String ^ s, string& os ) {
    using namespace Runtime::InteropServices;
    const char* chars =
        (const char*)(Marshal::StringToHGlobalAnsi(s)).ToPointer();
    os = chars;
    Marshal::FreeHGlobal(IntPtr((void*)chars));
}
void MarshalString ( String ^ s, wstring& os ) {
    using namespace Runtime::InteropServices;
    const wchar_t* chars =
        (const wchar_t*)(Marshal::StringToHGlobalUni(s)).ToPointer();
    os = chars;
    Marshal::FreeHGlobal(IntPtr((void*)chars));
}
int main() {
    string a = "test";
    wstring b = L"test2";
    String ^ c = gcnew String("abcd");
    cout << a << endl;
    MarshalString(c, a);
    c = "efgh";
    MarshalString(c, b);
    cout << a << endl;
    wcout << b << endl;
}
output:
test
abcd
efgh

In Haskell, how can I uppercase a Unicode character with respect to the current locale

It turns out that uppercasing a character is a complicated business. If you get out of the basic ASCII character set, the rules for uppercasing a character and lowercasing a character are actually dependent on the locale in which the application is running.
As a demo application, I am attempting to uppercase the letter 'i' (with a dot) and the letter 'ı' (without a dot). Now, in en_US, 'i' (with a dot) uppercases to 'I', and 'ı' (without a dot) doesn't exist (but still uppercases to 'I').
But, if I switch to Turkish (tr_TR.UTF-8), 'i' (with a dot) must uppercase to 'İ' (also with a dot) and 'ı' (without a dot) must uppercase to 'I' (also without a dot). Lowercase should reverse these operations.
iİıI --> İİII (tr_TR.UTF-8)
iİıI --> IİII (en_US.UTF-8)
Now, I can do this perfectly in C. How can I do it in Haskell? All of the searches that I do point me directly to Data.Char.toUpper, which is not locale-aware. I haven't found any functions that are locale-aware in any way.
Here's a code sample from C. I run it on my Linux machine.
#include <stdio.h>
#include <stdlib.h>
#include <locale.h>
#include <wctype.h>
#include <string.h>
#include <errno.h>
wchar_t latin_small_sharp_s[5] = {0x00df, 0x00df, 0x0053, 0x0053, 0};
wchar_t turkish_is[5] = {0x0069, 0x0130, 0x0131, 0x0049, 0};
char multibyte_turkish_is[7] = {0x69, 0x01, 0x30, 0x01, 0x31, 0x49, 0};

void print_in_locale (const char *locale, const wchar_t *str, const size_t len) {
  wchar_t *dest = calloc(len * 2, sizeof(wchar_t));
  int i;
  if (!setlocale(LC_CTYPE, locale)) {
    fprintf(stderr, "Locale %s failed with error: %s", locale, strerror(errno));
    exit(1);
  }
  for (i = 0; i < len; i++) {
    dest[i] = towupper(str[i]);
  }
  printf("%ls, %ls\n", str, dest);
  free(dest);
}

int main () {
  print_in_locale("de_DE.utf8", latin_small_sharp_s, 5);
  print_in_locale("tr_TR.utf8", turkish_is, 5);
  print_in_locale("de_DE.utf8", turkish_is, 5);
}
If you saved it to "locale_test.c", you can run it on the command line with...
gcc -o locale_test locale_test.c && ./locale_test
Use the Data.Text.ICU.toUpper function from the text-icu package.
toUpper :: LocaleName -> Text -> Text
Uppercase the characters in a string.
Casing is locale dependent and context sensitive. The result may be
longer or shorter than the original.
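A short usage sketch of that function (assuming the text and text-icu packages are installed; the expected output follows the question's Turkish/English table):
{-# LANGUAGE OverloadedStrings #-}
import qualified Data.Text.IO as TIO
import Data.Text.ICU (LocaleName (Locale), toUpper)

main :: IO ()
main = do
  -- the same input string, uppercased under two different locales
  TIO.putStrLn (toUpper (Locale "tr_TR") "iİıI") -- İİII
  TIO.putStrLn (toUpper (Locale "en_US") "iİıI") -- IİII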

How can I initialize bits of an unsigned char?

I want to initialize the bits of a variable of type unsigned char with a string of binary digits, but I don't know how to do it.
From your comments, it seems like you want to parse an ASCII string of binary digits. This is how you do that:
#include <stdlib.h>
const char* binary = "10001001";
char* next;
unsigned char value = strtoul(binary, &next, 2);
if (*next) { /* conversion failed */ }
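The same approach as a complete, runnable program (the example bit string is a hypothetical value):
#include <stdio.h>
#include <stdlib.h>

int main(void) {
  const char* binary = "10001001"; /* ASCII string of binary digits */
  char* next;
  unsigned char value = (unsigned char) strtoul(binary, &next, 2); /* base 2 */
  if (*next) { /* parsing stopped early: not a valid binary string */
    fprintf(stderr, "conversion failed at '%s'\n", next);
    return 1;
  }
  printf("value = 0x%02X (%u)\n", value, value); /* prints: value = 0x89 (137) */
  return 0;
}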
