How can I access an individual character in Platform::String^?
I am using this variable type because it appears to be the only way to write a string to a TextBlock on a Universal Windows App.
I have tried the following methods to get individual characters to no avail:
String ^ str = "string";
std::string::iterator it = str->begin(); //Error: platform string has no member "begin"
std::string::iterator it = str.begin(); //Error: expression must have a class type
str[0] = 't'; // Error: expression must have a pointer-to-object or handle-to-C++/CX mapping-array type
I am putting the String^ in a text block named "textBlock" as follows: textBlock->Text = str;
I am open to approaches other than modifying the Platform::String. My only requirement is that the string ends up in a form that can be put into a TextBox.
Platform::String represents a sequential collection of Unicode characters that is used to represent text. The controlled sequence is immutable: Once constructed, the contents of a Platform::String can no longer be modified.
If you need a modifiable string, the canonical solution is to use another string class, and convert to/from Platform::String when calling the Windows Runtime, or receiving string data. This is explained under Strings (C++/CX):
The Platform::String Class provides methods for several common string operations, but it's not designed to be a full-featured string class. In your C++ module, use standard C++ string types such as wstring for any significant text processing, and then convert the final result to Platform::String^ before you pass it to or from a public interface.
You could rewrite your code sample as follows:
// Use a standard C++ string as long as you need to modify it
std::wstring s = L"string";
s[0] = L't'; // Replace first character
// Convert to Platform::String^ when required
Platform::String^ ps = ref new Platform::String(s.c_str(), s.length());
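If the goal is just to display the result, the converted string can then be assigned to the TextBlock named in the question (assuming the same textBlock control):
// Assign the converted string to the question's TextBlock
textBlock->Text = ps;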
As an addition to the answer already supplied: you can also convert a Platform::String^ into a std::wstring so that you can modify it, and then convert it back to a Platform::String^ afterwards:
Platform::String^ str = "string";
std::wstring orig(str->Data());               // copy the characters into a modifiable wstring
orig[0] = L't';                               // modify the copy
str = ref new Platform::String(orig.c_str()); // convert back to Platform::String^
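If you only need to read individual characters, you can also do that without any conversion: Data() returns a pointer to the string's wide characters, which can be indexed directly. A minimal read-only sketch (writing through this pointer is not allowed, since the string is immutable):
Platform::String^ ps = L"string";
const wchar_t* chars = ps->Data(); // pointer to the immutable character buffer
wchar_t first = chars[0];          // 's' -- read-only access, no copies made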
I need to replace some %20 sequences with spaces and got compile errors which I do not understand:
CString str = _T("foo%20bar");
str.Replace('%20',' '); // C4305 argument: truncation from 'int' to 'wchar_t'
str.Replace(_T('%20'),_T(' ')); // C4305 argument: truncation from 'int' to 'wchar_t'
str.Replace(_T("%20"),_T(" ")); // C2664 'int ATL::CStringT<wchar_t,StrTraitMFC_DLL<wchar_t,ATL::ChTraitsCRT<wchar_t>>>::Replace(const wchar_t *,const wchar_t *)': cannot convert argument 1 from 'const char [4]' to 'wchar_t'
What is wrong?
The CString::Replace() method takes null-terminated string pointers as input, not individual characters. Your string literals need to use double quotes (") instead of single quotes ('), e.g.:
CString str = _T("foo%20bar");
str.Replace(_T("%20"), _T(" "));
Note that this matches your last example, which you say is also erroring. The only way that can fail with the error message you have shown is if you have a misconfiguration in your project, where UNICODE is defined [1] but _UNICODE is not [2].
[1]: as evidenced by CString being mapped to CStringT<wchar_t>.
[2]: as evidenced by the compiler saying _T("%20") is a const char[] rather than a const wchar_t[].
CString uses TCHAR from the Win32 API, not _TCHAR from the C runtime library. Usually they are interchangeable, but not so in your situation. So, you need to either fix your project's configuration so that _UNICODE is also defined, or else use the Win32 TEXT() macro to match CString's use of TCHAR:
CString str = TEXT("foo%20bar");
str.Replace(TEXT("%20"), TEXT(" "));
Or, simply stop using TCHAR-based APIs altogether. TCHAR dates back to the days of Win9x/ME, when Microsoft was pushing everyone to start migrating their code to Unicode, and it really should not be used in modern code if you can avoid it. Just use wchar_t strings directly instead, e.g.:
CStringW str = L"foo%20bar";
str.Replace(L"%20", L" ");
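As a side note (not part of the original snippets), Replace() returns the number of substitutions it performed, so you can check whether anything was actually replaced:
CStringW str = L"foo%20bar%20baz";
int count = str.Replace(L"%20", L" "); // count == 2, str is now L"foo bar baz"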
The last one should have worked, except that you seem to have a wide CString in a project without the UNICODE and/or _UNICODE macro defined.
In this combination, the _T() macro isn't giving you a compatible string literal. But L"whatever" will.
str.Replace(L"%20", L" ");
Notice that this does what you asked, but is not adequate for URL unescaping. You should convert all %xx sequences.
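For completeness, here is a minimal sketch of decoding every %xx escape rather than just %20, using standard C++ (not from the original answer; real URL decoding would also need to reassemble multi-byte UTF-8 sequences):
#include <cwchar>
#include <cwctype>
#include <string>

std::wstring UnescapeUrl(const std::wstring& in)
{
    std::wstring out;
    out.reserve(in.size());
    for (size_t i = 0; i < in.size(); ++i)
    {
        // A valid escape is '%' followed by two hex digits.
        if (in[i] == L'%' && i + 2 < in.size() &&
            std::iswxdigit(in[i + 1]) && std::iswxdigit(in[i + 2]))
        {
            wchar_t hex[3] = { in[i + 1], in[i + 2], L'\0' };
            out += static_cast<wchar_t>(std::wcstol(hex, nullptr, 16));
            i += 2; // skip the two hex digits
        }
        else
        {
            out += in[i];
        }
    }
    return out;
}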
%20 may be treated as a format string, like %d. Also, the Replace function returns the replaced String, and str itself is not replaced.
Try something like: str = str.Replace(_T("%%20"), _T(" "));
or
try: str = str.Replace(_T("%20"), _T(" "));
Extra Info
If you look at the Format specification syntax: printf and wprintf functions article, you will see the following clarification:
A basic conversion specification contains only the percent sign and a type character. For example, %s specifies a string conversion. To print a percent-sign character, use %%. ...
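A quick standalone illustration of that rule (not from the original answer):
#include <cstdio>

int main()
{
    // %% prints a literal percent sign; %d formats an int.
    std::printf("%d%% done\n", 20); // prints: 20% done
    return 0;
}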
My starting point is a string separated by commas, containing a variable number of integers e.g.:
System::String^ tags=gcnew String("1,3,5,9");
Now I would like, with the fewest steps possible, to convert this string into an integer list:
List<System::Int32>^ taglist = gcnew List<System::Int32>(); // to contain 1,3,5,9
Additionally, after manipulating the list, I need to export it back to a string at the end of the day. I saw the question being asked for C# here
but not for C++, which will be slightly different.
I tried directly initializing using the string, but that failed. I also tried .Split, but that produces strings. I also don't want to do any complicated stream-reader stuff.
The answer in the link must have an equivalent for C++/CLI.
As mentioned in the comments, you can use Split to convert the string to an array of strings, then use Array::ConvertAll to convert it to an array of int values, and after manipulating the values you can use String::Join to convert the array of ints back to a single string.
Here's a code sample:
String^ tags = gcnew String("1,3,5,9");
String^ separator = ",";
array<String^>^ tagsArray = tags->Split(separator->ToCharArray());
array<int>^ tagsIntArray = Array::ConvertAll<String^, int>(tagsArray,
gcnew Converter<String^, int>(Int32::Parse));
// Do your stuff
String^ resultString = String::Join<int>(separator, tagsIntArray);
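If you specifically need a List<System::Int32>^, as in the question, the array can be wrapped in a List and joined the same way. A small sketch building on the tagsIntArray above (the added value 11 is just for illustration):
using namespace System::Collections::Generic;

List<int>^ taglist = gcnew List<int>(tagsIntArray);      // list initialized from the array
taglist->Add(11);                                        // manipulate the list
String^ result = String::Join<int>(separator, taglist);  // "1,3,5,9,11"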
I am slowly understanding things in Swift. I am coming from a JavaScript background, so it is somewhat familiar.
However, variables are irking me.
In JS a variable can be
var varName = 1; //Number
var varName2 = "petey" //String
var conCat = varName + varName2; // 1petey
However, in Swift, String and Int variables are troubling me. All I want to do is capture a decimal number from user input from multiple "textField" sources and store that data in variables (which I already have set up). I need to concatenate a few of them with +, then do basic math with the data in the variables with * and /.
How do I make the textField capture only Int, or how do I convert text field strings to numbers, e.g. Int?
A UITextField contains a String, not an Int, so you have to convert it, e.g.
let number : Int? = textField.text.toInt()
So, there actually is a method built into Swift to convert from String to Int. Beware that it returns an optional, because the conversion may fail. So you have to check for nil before you use the variable.
For your other question, take a look at e.g. UITextField - Allow only numbers and punctuation input/keypad. Basically, you will have to adapt UITextField and filter it, once there are new entries. On iOS it might be easier, because you can show a numbers-only keyboard.
The data sent from the textField is a String. There are multiple ways to convert an Int to a String.
var a:String="\(2+3)" //"5"
And to concatenate a String with an Int:
var b:String="Hello "+"\(3*4)" //"Hello 12"
And to convert a String from a textField to an Int:
var b:Int=textField.text.toInt()
Use the Swift function inputmethod.intValue(); if any non-numeric text is entered, it returns 0.
Can I convert directly between a Swift Character and its Unicode numeric value? That is:
var i:Int = ... // A plain integer index.
var myCodeUnit:UInt16 = myString.utf16[i]
// Would like to say myChar = myCodeUnit as Character, or equivalent.
or...
var j:String.Index = ... // NOT an integer!
var myChar:Character = myString[j]
// Would like to say myCodeUnit = myChar as UInt16
I can say:
myCodeUnit = String(myChar).utf16[0]
but this means creating a new String for each character. And I am doing this thousands of times (parsing text) so that is a lot of new Strings that are immediately being discarded.
The type Character represents a "Unicode grapheme cluster", which can be multiple Unicode codepoints. If you want one Unicode codepoint, you should use the type UnicodeScalar instead.
As per the Swift book:
String to Code Unit
To get the code unit/ordinal for each character of the String, you can do the following:
var yourSwiftString = "甲乙丙丁"
for scalar in yourSwiftString.unicodeScalars {
print("\(scalar.value) ")
}
Code Unit to String
Because Swift currently does not have a way to convert ordinals/code units back to UTF, the best way I found is to still use NSString, i.e. if you have int ordinals (32-bit, but representing the 21-bit code points) you can use the following to convert to Unicode:
var i = 22247
var unicode_str = NSString(bytes: &i, length: 4, encoding: NSUTF32LittleEndianStringEncoding)
Obviously, if you want to convert an array of ints, you'll need to pack them into an array first.
I spoke to an Apple engineer who is working on Unicode, and he says they have not completed the implementation of Unicode characters in strings. Are you looking to get a code unit or a full character? The only proper way to get at a full Unicode character is by using a for-each loop on a string, i.e.:
for c in "hello" {
// c is a unicode character of type Character
}
But, this is not implemented as of yet.
I have defined createdirectory(const stdStr& path) in a class, and I am calling that function from another class using Directory::CreateDirectory("C:\\Temp");
I am getting an error on "C:\\Temp" saying:
no suitable constructor exists to convert from "const char [4]" to "std::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t>>"
Because your "C:\\Temp" string is an array of char, but the function expects a string templated on wchar_t. Personally, I avoid Unicode like the plague, but I think you need L"C:\\Temp" (note the preceding L).
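A minimal sketch of the mismatch and the fix (the type alias and function name are hypothetical, based on the question):
#include <string>

typedef std::wstring stdStr;          // a string templated on wchar_t

void createdirectory(const stdStr& path)
{
    // ... create the directory ...
}

int main()
{
    // createdirectory("C:\\Temp");   // error: const char[8] does not convert to std::wstring
    createdirectory(L"C:\\Temp");     // OK: the L prefix gives a wide (wchar_t) string literal
    return 0;
}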