Create comma-delimited data pair string from numeric arrays in MATLAB

I have two arrays of values:
t = [0; 1; 2];
q = [0; 100; 200];
I need those to be one string that's like:
str = '0, 0, 1, 100, 2, 200';
I can't see a nice way to do it in MATLAB (R2017a) without using a loop. I'd like to avoid that if possible as there's a pretty large array of values and a lot of files and it'll take forever.
Any ideas?

Combine compose with strjoin:
t = [0; 1; 2];
q = [0; 100; 200];
str = strjoin(compose('%d', [t(:)'; q(:)']), ', ');
Output:
str =
'0, 0, 1, 100, 2, 200'
For non-integer numbers, use %f instead of %d.

Here's a possible approach. This works for integer numbers, or if you want a fixed number of decimals in the string representation:
t = [0; 1; 2];
q = [0; 100; 200];
tq = reshape([t(:).'; q(:).'], 1, []);
s = sprintf('%i, ',tq); % or change '%i' to something like '%.5f'
s = s(1:end-2)
Result:
s =
'0, 0, 1, 100, 2, 200'
If you have non-integer numbers and want the number of decimals in the representation to be chosen automatically, you can use num2str instead of sprintf, but then you need to collapse the whitespace using regexprep or a similar function:
t = [0; 1; 2];
q = [0; 100; 200];
tq = reshape([t(:).'; q(:).'], 1, []);
s = regexprep(num2str(tq), '\s+', ', ');
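For comparison outside MATLAB, the interleave-then-format-then-join idea ports directly to other languages; here is a minimal Rust sketch of the same shape (the function name interleave_join is my own, not a library function):

```rust
// Interleave two equal-length numeric slices pairwise, format each
// value, and join with ", " -- mirroring the compose + strjoin answer.
fn interleave_join(t: &[i64], q: &[i64]) -> String {
    t.iter()
        .zip(q)
        .flat_map(|(a, b)| [a.to_string(), b.to_string()])
        .collect::<Vec<_>>()
        .join(", ")
}
```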


Removing padding at the end of a Rust vector?

I have this neat way of padding vector messages, so that I know they will all be the same length:
let len = 20;
let mut msg = vec![1, 23, 34];
msg.resize(len, 0);
println!("msg {:?}", msg);
Nice, this pads any message with zeros; running this code gives me:
msg [1, 23, 34, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
But let's say I send this message over some connection, and the other party receives it at their end.
How do I take a vector like this, and strip off all the 0's in the end?
Notice that the original message may vary in length, but it is always shorter than 20.
Another thing that could work for me is to have all the padding at the beginning of the vector, doing something like this:
let len = 20;
let msg = vec![1, 23, 34];
let mut payload = vec![0;len-msg.len()];
payload.extend(&msg);
println!("msg {:?}", payload);
And then just removing all the leading zeros.
As stated in the comments above, probably changing your protocol to include the length of the message is cleaner.
But here is a solution that removes the padding at the front of the message (assuming the message itself doesn't start with zeros):
let msg: Vec<_> = payload.into_iter().skip_while(|x| *x == 0).collect();
Note that this allocates a new Vec for your msg; you could probably use the iterator directly.
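For the original formulation — stripping the zeros at the end — one option is to find the last non-zero byte with rposition and truncate in place. A sketch (the function name is mine); like the skip_while approach, it assumes the real payload never ends in a 0, which is the ambiguity a length prefix in the protocol would remove:

```rust
// Drop the zero padding at the end of a received message, in place.
// Assumes the real payload never ends in 0.
fn strip_trailing_zeros(msg: &mut Vec<u8>) {
    // index just past the last non-zero byte, or 0 if all bytes are zero
    let keep = msg.iter().rposition(|&b| b != 0).map_or(0, |i| i + 1);
    msg.truncate(keep);
}
```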

Is there an efficient function in Rust that finds the index of the first occurrence of a value in a sorted vector?

In [3, 2, 1, 1, 1, 0], if the value we are searching for is 1, then the function should return 2.
I found binary search, but it seems to return the last occurrence.
I do not want a function that iterates over the entire vector and matches one by one.
binary_search assumes that the elements are sorted in ascending order. Yours are reversed, so you can use binary_search_by:
let x = 1; // value to look for
let data = [3, 2, 1, 1, 1, 0];
let idx = data.binary_search_by(|probe| probe.cmp(&x).reverse());
Now, as you say, you do not get the first one. That is expected, because the binary search algorithm may land on any element equal to the one searched for. From the docs:
If there are multiple matches, then any one of the matches could be returned.
That is easily solvable with a loop:
let mut idx = data.binary_search_by(|probe| probe.cmp(&x).reverse());
if let Ok(ref mut i) = idx {
    while *i > 0 {
        if data[*i - 1] != x {
            break;
        }
        *i -= 1;
    }
}
But if you expect many duplicates that may negate the advantages of the binary search.
If that is a problem for you, you can try to be smarter. For example, you can take advantage of this comment in the docs of binary_search:
If the value is not found then Result::Err is returned, containing the index where a matching element could be inserted while maintaining sorted order.
So to get the index of the first value with a 1 you look for an imaginary value just between 2 and 1 (remember that your array is reversed), something like 1.5. That can be done hacking a bit the comparison function:
let mut idx = data.binary_search_by(|probe| {
    // the 1s in the slice compare greater than the 1 in x
    probe.cmp(&x).reverse().then(std::cmp::Ordering::Greater)
});
There is a handy function Ordering::then() that does exactly what we need (the Rust stdlib is amazingly complete).
Or you can use a simpler direct comparison:
let idx = data.binary_search_by(|probe| {
    use std::cmp::Ordering::*;
    if *probe > x { Less } else { Greater }
});
The only detail left is that this function will always return Err(i), where i is either the position of the first 1 or the position where a 1 would be inserted if there are none. An extra comparison resolves this ambiguity:
if let Err(i) = idx {
    // beware! i may be one past the end of the slice
    if data.get(i) == Some(&x) {
        idx = Ok(i);
    }
}
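Assembled into a self-contained function (the name first_occurrence_desc is mine), the direct-comparison variant looks like this:

```rust
use std::cmp::Ordering;

// First index of `x` in a slice sorted in descending order, in O(log n).
// The comparator never returns Equal, so every element equal to `x` is
// pushed to the Greater side and the Err index is the left boundary.
fn first_occurrence_desc(data: &[i32], x: i32) -> Option<usize> {
    let i = data
        .binary_search_by(|probe| if *probe > x { Ordering::Less } else { Ordering::Greater })
        .unwrap_err();
    if data.get(i) == Some(&x) { Some(i) } else { None }
}
```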
Since 1.52.0, [T] has the method partition_point to find the partition point with a predicate in O(log N) time.
In your case, it should be:
let xs = vec![3, 2, 1, 1, 1, 0];
let idx = xs.partition_point(|&a| a > 1);
if idx < xs.len() && xs[idx] == 1 {
    println!("Found first 1 idx: {}", idx);
}

Sequence Of Zero

Consider the sequence of numbers from 1 to N. For example, for N = 9,
we have 1, 2, 3, 4, 5, 6, 7, 8, 9.
Now, place one of the three following operators between each pair of adjacent numbers:
"+" addition
"-" subtraction
"#" paste operator --> concatenates the previous and the next operands.
For example, 1#2 = 12
How can I calculate the number of possible sequences that evaluate to zero?
Example for N = 7:
1+2-3+4-5-6+7
1+2-3-4+5+6-7
1-2#3+4+5+6+7
1-2#3-4#5+6#7
1-2+3+4-5+6-7
1-2-3-4-5+6+7
See the fourth sequence: it is the same as 1-23-45+67, which evaluates to 0.
All of the above sequences evaluate to zero.
Here is my recursion-based solution (implemented in C++), just to build your intuition so that you can approach and improve it using dynamic programming on your own:
#include <iostream>
using namespace std;

// N is the input
// index_count is the last operand placed in the sequence
// sum is the running total of the sequence so far
int isEvaluateToZero(int N, int index_count, int sum){
    // a sequence containing only 1 can never be 0
    if(N==1){
        return 0;
    }
    // base case:
    // all operands placed; count the sequence if it sums to 0
    if(index_count==N){
        return (sum==0) ? 1 : 0;
    }
    // recursively call by placing '+' between index_count and index_count+1
    int placeAdd = isEvaluateToZero(N, index_count+1, sum+index_count+1);
    // recursively call by placing '-' between index_count and index_count+1
    int placeMinus = isEvaluateToZero(N, index_count+1, sum-index_count-1);
    // place '#'
    int placePaste = 0;
    if(index_count+2<=N){
        // paste the previous and the next operands by shifting num1 left
        // by the number of digits of num2:
        // (8#9) = 8*(10^1)+9 = 89
        // (9#10) = 9*(10^2)+10 = 910
        // (99#100) = 99*(10^3)+100 = 99100
        int num1 = index_count+1;
        int num2 = index_count+2;
        int shift = 10;
        for(int m = num2; m >= 10; m /= 10){
            shift *= 10;
        }
        int concat_num = num1*shift + num2;
        placePaste = isEvaluateToZero(N, index_count+2, sum+concat_num)
                   + isEvaluateToZero(N, index_count+2, sum-concat_num);
    }
    return placeAdd + placeMinus + placePaste;
}

int main(){
    int N;
    cout << "Enter N:";
    cin >> N;
    // operand 1 is already placed, so index_count = 1 and sum = 1
    cout << isEvaluateToZero(N, 1, 1) << endl;
    return 0;
}
output:
N=1 output=0
N=2 output=0
N=3 output=1
N=4 output=1
N=7 output=6
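As a sketch of the dynamic-programming improvement hinted at above, the same recursion can be memoised on the pair (next operand, running sum), caching repeated subproblems. The function names below are mine, and the recursion mirrors the C++ answer: operand 1 is fixed, and each later operand is added, subtracted, or pasted pairwise onto its successor.

```rust
use std::collections::HashMap;

// Count the operator placements over 1..=n that evaluate to zero,
// memoising on (next operand to place, running sum).
fn count_zero(n: i64, next: i64, sum: i64, memo: &mut HashMap<(i64, i64), u64>) -> u64 {
    if next > n {
        // all operands placed; count the sequence if it sums to 0
        return (sum == 0) as u64;
    }
    if let Some(&c) = memo.get(&(next, sum)) {
        return c;
    }
    // place '+' or '-' before the next operand
    let mut total = count_zero(n, next + 1, sum + next, memo)
        + count_zero(n, next + 1, sum - next, memo);
    // place '#': paste `next` onto `next + 1`, e.g. 9#10 = 9*100 + 10 = 910
    if next + 1 <= n {
        let mut shift = 10;
        let mut m = next + 1;
        while m >= 10 {
            shift *= 10;
            m /= 10;
        }
        let pasted = next * shift + (next + 1);
        total += count_zero(n, next + 2, sum + pasted, memo)
            + count_zero(n, next + 2, sum - pasted, memo);
    }
    memo.insert((next, sum), total);
    total
}

fn sequences_to_zero(n: i64) -> u64 {
    // operand 1 is already placed, so start from operand 2 with sum 1
    count_zero(n, 2, 1, &mut HashMap::new())
}
```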

Problems using newpad with a size larger than the screen

I'm trying to scroll some text larger than the screen.
The docs say a pad is not limited by the screen size, but creating it with more lines or columns than the terminal has fails to print anything:
newpad(LINES + 1, COLS); // fails
newpad(LINES, COLS); // works
Entire code for reference:
extern crate ncurses;
use ncurses::*;
fn main() {
initscr();
start_color();
use_default_colors();
cbreak();
noecho();
curs_set(CURSOR_VISIBILITY::CURSOR_INVISIBLE);
let pad = newpad(1000, COLS);
refresh();
let mut x = 0;
while x < 1000 {
x += 1;
wprintw(pad, &format!("Line number {}\n", x));
}
prefresh(pad, 0, 0, 0, 0, LINES, COLS);
getch();
endwin();
}
The behavior is a bit odd.
If the pad is larger than the screen, the last two prefresh arguments (the bottom-right corner of the on-screen area) must be at most LINES - 1 and COLS - 1 respectively, because screen coordinates are zero-based:
prefresh(pad, 0, 0, 0, 0, LINES - 1, COLS - 1);
If the pad is smaller than the screen, there's no need to subtract 1 and the code works as expected.

How to convert a Swift String to an array of CGGlyph

This snippet can be used for drawing CGGlyphs with a CGContext.
//drawing
let coreGraphicsFont = CTFontCopyGraphicsFont(coreTextFont, nil)
CGContextSetFont(context, coreGraphicsFont);
CGContextSetFontSize(context, CTFontGetSize(coreTextFont))
CGContextSetFillColorWithColor(context, Color.blueColor().CGColor)
CGContextShowGlyphsAtPositions(context, glyphs, positions, length)
But how do I obtain the CGGlyphs from a Swift string that contains emoji symbols like flags or accented characters?
let string = "swift: \u{1F496} \u{65}\u{301} \u{E9}\u{20DD} \u{1F1FA}\u{1F1F8}"
Neither of these approaches renders the special characters, even though they print correctly to the console. Note that the first approach returns NSGlyph values, while CGGlyphs are required for drawing.
var progress = CGPointZero
for character in string.characters
{
    let glyph = font.glyphWithName(String(character))
    glyphs.append(CGGlyph(glyph))
    let advancement = font.advancementForGlyph(glyph)
    positions.append(progress)
    progress.x += advancement.width
}
or this second approach which requires casting to NSString:
var buffer = Array<unichar>(count: length, repeatedValue: 0)
let range = NSRange(location: 0, length: length)
(string as NSString).getCharacters(&buffer, range: range)
glyphs = Array<CGGlyph>(count: length, repeatedValue: 0)
CTFontGetGlyphsForCharacters(coreTextFont, &buffer, &glyphs, length)
//glyph positions
advances = Array<CGSize>(count: length, repeatedValue: CGSize.zero)
CTFontGetAdvancesForGlyphs(ctFont, CTFontOrientation.Default, glyphs, &advances, length)
positions = []
var progress = CGPointZero
for advance in advances
{
    positions.append(progress)
    progress.x += advance.width
}
Some of the characters are drawn as empty boxes with either approach. Kinda stuck here, hoping you can help.
edit:
Using CTFontDrawGlyphs renders the glyphs correctly, but setting the font, size and text matrix directly before calling CGContextShowGlyphsAtPositions draws nothing. I find that rather odd.
If you generate glyphs yourself, you also need to perform font substitution yourself. When you use Core Text or TextKit to lay out and draw the text, they perform font substitution for you. For example:
let richText = NSAttributedString(string: "Hello😀→")
let line = CTLineCreateWithAttributedString(richText)
print(line)
Output:
<CTLine: 0x7fa349505f00>{run count = 3, string range = (0, 8), width = 55.3457, A/D/L = 15/4.6875/0, glyph count = 7, runs = (
<CTRun: 0x7fa34969f600>{string range = (0, 5), string = "Hello", attributes = <CFBasicHash 0x7fa3496902d0 [0x10e85a7b0]>{type = mutable dict, count = 1,
entries =>
2 : <CFString 0x1153bb720 [0x10e85a7b0]>{contents = "NSFont"} = <CTFont: 0x7fa3496182f0>{name = Helvetica, size = 12.000000, matrix = 0x0, descriptor = <CTFontDescriptor: 0x7fa34968f860>{attributes = <CFBasicHash 0x7fa34968f8b0 [0x10e85a7b0]>{type = mutable dict, count = 1,
entries =>
2 : <CFString 0x1153c16c0 [0x10e85a7b0]>{contents = "NSFontNameAttribute"} = <CFString 0x1153b4700 [0x10e85a7b0]>{contents = "Helvetica"}
}
>}}
}
}
<CTRun: 0x7fa3496cde40>{string range = (5, 2), string = "\U0001F600", attributes = <CFBasicHash 0x7fa34b11a150 [0x10e85a7b0]>{type = mutable dict, count = 1,
entries =>
2 : <CFString 0x1153bb720 [0x10e85a7b0]>{contents = "NSFont"} = <CTFont: 0x7fa3496c3eb0>{name = AppleColorEmoji, size = 12.000000, matrix = 0x0, descriptor = <CTFontDescriptor: 0x7fa3496a3c30>{attributes = <CFBasicHash 0x7fa3496a3420 [0x10e85a7b0]>{type = mutable dict, count = 1,
entries =>
2 : <CFString 0x1153c16c0 [0x10e85a7b0]>{contents = "NSFontNameAttribute"} = <CFString 0x11cf63bb0 [0x10e85a7b0]>{contents = "AppleColorEmoji"}
}
>}}
}
}
<CTRun: 0x7fa3496cf3e0>{string range = (7, 1), string = "\u2192", attributes = <CFBasicHash 0x7fa34b10ed00 [0x10e85a7b0]>{type = mutable dict, count = 1,
entries =>
2 : <CFString 0x1153bb720 [0x10e85a7b0]>{contents = "NSFont"} = <CTFont: 0x7fa3496cf2c0>{name = PingFangSC-Regular, size = 12.000000, matrix = 0x0, descriptor = <CTFontDescriptor: 0x7fa3496a45a0>{attributes = <CFBasicHash 0x7fa3496a5660 [0x10e85a7b0]>{type = mutable dict, count = 1,
entries =>
2 : <CFString 0x1153c16c0 [0x10e85a7b0]>{contents = "NSFontNameAttribute"} = <CFString 0x11cf63230 [0x10e85a7b0]>{contents = "PingFangSC-Regular"}
}
>}}
}
}
)
}
We can see here that Core Text recognized that the default font (Helvetica) doesn't have glyphs for the emoji or the arrow, so it split the line into three runs, each with the needed font.
The Core Text Programming Guide says this:
Most of the time you should just use a CTLine object to get this information because one font may not encode the entire string. In addition, simple character-to-glyph mapping will not get the correct appearance for complex scripts. This simple glyph mapping may be appropriate if you are trying to display specific Unicode characters for a font.
Your best bet is to use CTLineCreateWithAttributedString to generate glyphs and choose fonts. Then, if you want to adjust the position of the glyphs, use CTLineGetGlyphRuns to get the runs out of the line, and then ask the run for the glyphs, the font, and whatever else you need.
If you want to handle font substitution yourself, I think you're going to want to look into “font cascading”.
