The code below prints the response variable preceded by spaces.
The number of spaces printed before the response variable equals the difference between itemNumber and examineeResponses.
Now, is it possible to pad the string with zero ("0") instead of spaces using String.format?
def converted = examineeResponses+String.format("%${itemNumber - 1}s", response)
Example using the above code:
examineeResponses = '1' // String
itemNumber = 10 //int
response = '3' // String
Output:
" 3"
Desired output:
"000000003"
I believe you can do this, but it's a bit hard to understand your question:
int itemNumber = 10
String examineeResponses = '1'
char response = '3'
"$response".padLeft( itemNumber - examineeResponses.length(), '0' )
Though I suspect (you don't say) that you just want it printed itemNumber characters wide. If this is the case, you just need:
"$response".padLeft( itemNumber, '0' )
And you don't need examineeResponses at all.
One of the bits I struggle with in your question is that I don't know what examineeResponses.length() is supposed to do (other than throw an error). Another is that I'm not sure this is what you want to do ;-)
You can't zero pad Strings with String.format in Java or Groovy; you can only zero pad numerics.
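To make that concrete, here is a quick sketch (plain Groovy, reusing the question's itemNumber value): the 0 flag is accepted for a numeric conversion but rejected for %s, which is why padLeft is the way to go for Strings.
int itemNumber = 10
println String.format("%0${itemNumber - 1}d", 3)  // 000000003 -- zero padding works for numerics
println '3'.padLeft(itemNumber - 1, '0')          // 000000003 -- padLeft handles Strings
try {
    String.format("%0${itemNumber - 1}s", '3')    // the 0 flag is not allowed with %s
} catch (java.util.IllegalFormatException e) {
    println "String.format cannot zero pad a String: $e"
}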
Try something like this:
def formattedString = "blahblah.com?param=${param}&hash=${hash}"
I have a string = "1337" and I want to convert it to a list of Int. I tried to get every element in the string and convert it to Int like this: string[0].toInt, but I didn't get the number, I got the ASCII value. I can do it with Character.getNumericValue(number). How do I do it without using a built-in function, and with good complexity?
What do you mean "without using a built in function"?
string[0].toInt gives you the ASCII value of the character because the fun get(index: Int) on String has a return type of Char, and a Char behaves more like a Number than a String. "0".toInt() == 0 will yield true, but '0'.toInt() == 0 will yield false; the difference being that the first one is a string and the second is a character.
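A quick sketch of that difference (note: '0'.code is the newer Kotlin spelling; the older '0'.toInt() returns the same code value):
println("0".toInt())   // 0  -- parses the String as a number
println('0'.code)      // 48 -- the Char's code, i.e. its ASCII value
println('0' - '0')     // 0  -- subtracting '0' from a digit Char recovers the digit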
A one-liner:
string.split("").filterNot { it.isBlank() }.map { it.toInt() }
Explanation: split("") will take the string and give you a list of every character as a string, however, it will give you an empty string at the beginning, which is why we have filterNot { it.isBlank() }, we then can use map to transform every string in our list to Int
If you want something less functional and more imperative that doesn't use built-in conversion functions, there is this:
val ints = mutableListOf<Int>() // make a list to store the values in
for (c: Char in "1234") { // go through all of the characters in the string
    val numericValue = c - '0' // subtract the character '0' from the character we are looking at
    ints.add(numericValue) // add the Int to the list
}
The reason why c - '0' works is because the ASCII values for the digits are all in numerical order starting with 0, and when we subtract one character from another, we get the difference between their ASCII values.
This will give you some funky results if you give it a string that doesn't have only digits in it, but it will not throw any exceptions.
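For example (a tiny sketch of that behaviour):
println('a' - '0')   // 49 -- "funky", but no exception is thrown
println('z' - '0')   // 74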
As in Java, converting a Char to an Int gives you its ASCII equivalent.
You can instead:
val values = "1337".map { it.toString().toInt() }
println(values[0]) // 1
println(values[1]) // 3
// ...
Maybe like this? Non-digits are filtered out. The digits are then converted into integers:
val string = "1337"
val xs = string.filter{ it.isDigit() }.map{ it.digitToInt() }
Requires Kotlin 1.4.30 or higher and this opt-in annotation:
@OptIn(ExperimentalStdlibApi::class)
Apologies if this is a duplicate. I have a helper function called inputString() that takes user input and returns a String. I want to proceed based on whether an upper or lowercase character was entered. Here is my code:
print("What do you want to do today? Enter 'D' for Deposit or 'W' for Withdrawl.")
operation = inputString()
if operation == "D" || operation == "d" {
print("Enter the amount to deposit.")
My program quits after the first print function, but gives no compiler errors. I don't know what I'm doing wrong.
It's important to keep in mind that there is a whole slew of purely whitespace characters that show up in strings, and sometimes, those whitespace characters can lead to problems just like this.
So, whenever you are certain that two strings should be equal, it can be useful to print them with some sort of non-whitespace character on either end of them.
For example:
print("Your input was <\(operation)>")
That should print the user input with angle brackets on either side of the input.
And if you stick that line into your program, you'll see it prints something like this:
Your input was <D
>
So it turns out that your inputString() method is capturing the newline character (\n) that the user presses to submit their input. You should improve your inputString() method to go ahead and trim that newline character before returning its value.
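A minimal sketch of that trimming, assuming a modern Swift with Foundation available (rawInput here just stands in for whatever inputString() currently returns):
import Foundation

let rawInput = "D\n"   // stand-in for the untrimmed result of inputString()
let cleaned = rawInput.trimmingCharacters(in: .whitespacesAndNewlines)
print("<\(cleaned)>")  // prints <D>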
I feel it's really important to mention here that your inputString method is really clunky and requires importing modules. But there's a way simpler pure Swift approach: readLine().
Swift's readLine() method does exactly what your inputString() method is supposed to be doing, and by default, it strips the newline character off the end for you (there's an optional parameter you can pass to prevent the method from stripping the newline).
My version of your code looks like this:
func fetchInput(prompt: String? = nil) -> String? {
    if let prompt = prompt {
        print(prompt, terminator: "")
    }
    return readLine()
}

if let input = fetchInput(prompt: "Enter some input: ") {
    if input == "X" {
        print("it matches X")
    }
}
The cause of the error that you experienced is explained at Swift how to compare string which come from NSString. Essentially, we need to remove any whitespace or non-printing characters such as newlines.
I also used .uppercaseString to simplify the comparison.
The amended code is as follows:
func inputString() -> String {
    var keyboard = NSFileHandle.fileHandleWithStandardInput()
    var inputData = keyboard.availableData
    let str: String = (NSString(data: inputData, encoding: NSUTF8StringEncoding)?.stringByTrimmingCharactersInSet(
        NSCharacterSet.whitespaceAndNewlineCharacterSet()))!
    return str
}
print("What do you want to do today? Enter 'D' for Deposit or 'W' for Withdrawl.")
let operation = inputString()
if operation.uppercaseString == "D" {
print("Enter the amount to deposit.")
}
I need to split a uint into a list of bits (a list of chars, where every char is "0" or "1", is also OK). The way I try to do it is to convert the uint to a string first, using the binary representation for numeric types, bin(), and then to split it using str_split_all():
var num : uint(bits:4) = 0xF; // Can be any number
print str_split_all(bin(num), "/w");
("/w" is string match pattern that means any char).
The output I expect:
"0"
"b"
"1"
"1"
"1"
"1"
But the actual output is:
0. "0b1111"
Why doesn't it work? Thank you for your help.
If you want to split an integer into a list of bits, you can use the %{...} operator:
var num_bits : list of bit = %{num};
You can find a working example on EDAPlayground.
As an extra clarification to your question, "/w" doesn't mean match any character. The string "/\w/" means match any single character in AWK Syntax. If you put that into your match expression, you'll get (almost) the output you want, but with some extra blanks interleaved (the separators).
Regardless, if you want to split a string into its constituent characters, str_split_all(...) isn't the way to go. It's easier to convert the string into ASCII characters and then convert those back to string again:
extend sys {
    run() is also {
        var num : uint(bits:4) = 0xF; // Can be any number
        var num_bin : string = bin(num);
        var num_bin_chars := num_bin.as_a(list of byte);
        for each (char) in num_bin_chars {
            var char_as_string : string;
            unpack(packing.low, %{8'b0, char}, char_as_string);
            print char_as_string;
        };
    };
};
The unpack(...) syntax is directly from the e Reference Manual, Section 2.8.3 Type Conversion Between Strings and Scalars or Lists of Scalars
I have an Array where some drive data from WMI are captured:
$drivedata = $Drives | select @{Name="Kapazität(GB)";Expression={$_.Kapazität}}
The Array has these values (2 drives):
@{Kapazität(GB)=1.500} @{Kapazität(GB)=1.500}
and I just want to convert the 1.500 into the number 1500.
I tried different suggestions I found here, but couldn't get it working:
-Replace ".","" and [int] doesn't work.
I am not sure if regex would be correct and how to do this.
Simply casting the string as an int won't work reliably. You need to convert it to an Int32. For this you can use the .NET Convert class and its ToInt32 method. The method requires a string ($strNum) as the main input, and the base (10) of the number system the string is written in. This is because you can convert not only from the decimal system (base 10), but also from, for example, the binary system (base 2).
Give this method a try:
[string]$strNum = "1.500"
[int]$intNum = [convert]::ToInt32($strNum, 10)
$intNum
Simply divide the variable containing the number as a string by 1. PowerShell automatically converts the result to a number.
$a = "15"; $b = "2"; $a + $b --> 152
But if you divide each by 1 first:
$a/1 + $b/1 --> 17
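To make the shorthand concrete for the question's value (note that the "." still has to be removed first, otherwise "1.500"/1 gives 1.5 rather than 1500):
$a = "15"; $b = "2"
$a + $b                           # 152 (string concatenation)
$a/1 + $b/1                       # 17  (numeric addition)
("1.500" -replace '\.', '') / 1   # 1500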
Since this topic never received a verified solution, I can offer simple solutions to the two issues you asked about.
Replacing the "." character when the value is a string
The string class offers a replace method for the string object you want to update:
Example:
$myString = $myString.replace(".","")
Converting the string value to an integer
The System.Int32 class (or simply [int] in PowerShell) has a method called "TryParse" which not only passes back a boolean indicating whether the string is an integer, but also returns the value of the integer into an existing variable by reference if it returns true.
Example:
[string]$convertedInt = "1500"
[int]$returnedInt = 0
[bool]$result = [int]::TryParse($convertedInt, [ref]$returnedInt)
I hope this addresses the issue you initially brought up in your question.
I demonstrate how to receive a string, for example "-484876800000", and TryParse the string to make sure it can be assigned to a long. I calculate the date from universal time (milliseconds since the epoch) and return a string. When you convert a string to a number, you must decide the numeric type and precision and test whether the string can be parsed; otherwise, it will throw an error.
function universalToDate
{
    param (
        $paramValue
    )
    $retVal = ""
    if ($paramValue)
    {
        $epoch = [datetime]'1/1/1970'
        [long]$returnedLong = 0
        [bool]$result = [long]::TryParse($paramValue, [ref]$returnedLong)
        if ($result -eq 1)
        {
            $val = $returnedLong/1000.0
            $retVal = $epoch.AddSeconds($val).ToString("yyyy-MM-dd")
        }
    }
    else
    {
        $retVal = $null
    }
    return($retVal)
}
Replace all but the digits in the string like so:
$messyString = "Get the integer from this string: -1.500 !!"
[int]$myInt = $messyString -replace '\D', ''
$myInt
# PS > 1500
The regex \D will match everything except digits and remove them from your string.
This will work fine for your example.
It seems the issue is in "-f ($_.Partition.Size/1GB)}}". If you want the value in MB, then change the 1GB to 1MB.
Can I convert directly between a Swift Character and its Unicode numeric value? That is:
var i:Int = ... // A plain integer index.
var myCodeUnit:UInt16 = myString.utf16[i]
// Would like to say myChar = myCodeUnit as Character, or equivalent.
or...
var j:String.Index = ... // NOT an integer!
var myChar:Character = myString[j]
// Would like to say myCodeUnit = myChar as UInt16
I can say:
myCodeUnit = String(myChar).utf16[0]
but this means creating a new String for each character. And I am doing this thousands of times (parsing text) so that is a lot of new Strings that are immediately being discarded.
The type Character represents a "Unicode grapheme cluster", which can be multiple Unicode codepoints. If you want one Unicode codepoint, you should use the type UnicodeScalar instead.
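A small sketch of that, assuming a newer Swift (4.2 or later) where Character exposes its unicodeScalars directly, so no throwaway String is needed per character:
let myChar: Character = "A"
let codePoint: UInt32 = myChar.unicodeScalars.first!.value   // 65

// Going the other way, from a code unit back to a Character:
let myCodeUnit: UInt16 = 0x0041
if let scalar = UnicodeScalar(UInt32(myCodeUnit)) {
    let roundTripped = Character(scalar)                      // "A"
    print(codePoint, roundTripped)
}
Keep in mind that a Character made of several scalars (an emoji with modifiers, say) has more than one element in unicodeScalars, so taking only the first drops information in that case.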
As per the Swift book:
String to Code Unit
To get the code unit/ordinal for each character of the String, you can do the following:
var yourSwiftString = "甲乙丙丁"
for scalar in yourSwiftString.unicodeScalars {
    print("\(scalar.value) ")
}
Code Unit to String
Because Swift currently does not have a way to convert ordinals/code units back into a String, the best way I found is to still use NSString. I.e., if you have int ordinals (32-bit, but representing the 21-bit code points) you can use the following to convert to Unicode:
var i = 22247
var unicode_str = NSString(bytes: &i, length: 4, encoding: NSUTF32LittleEndianStringEncoding)
Obviously, if you want to convert an array of ints, you'll need to pack them into an array first.
I spoke to an Apple engineer who is working on Unicode, and he says they have not completed the implementation of Unicode characters in strings. Are you looking to get a code unit or a full character? Because the only proper way to get at a full Unicode character is by using a for-each loop on a string, i.e.:
for c in "hello" {
// c is a unicode character of type Character
}
But, this is not implemented as of yet.