Reading strings into MATLAB from Excel?

I would like to read strings into MATLAB from an Excel file:
ID = xlsread('data.xlsx',1, 'D2:D4')
The cells in range D2:D4 have strings in them. When I try to import the strings into MATLAB, all I get is an empty array. What can I do to fix this?

If you're on MATLAB 2010 or later, you can also do something like this to avoid having extra variables in your workspace:
[~, ~, raw] = xlsread('data.xlsx',1, 'D2:D4')

You need to use this:
[num, txt, raw] = xlsread('data.xlsx', 1, 'D2:D4')
The txt output will contain the strings imported into MATLAB.
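For example (a minimal sketch, assuming data.xlsx exists and D2:D4 contain text):
[num, txt, raw] = xlsread('data.xlsx', 1, 'D2:D4');
txt{1}     % first string (cell D2)
txt{end}   % last string (cell D4)
Here txt is a cell array of character vectors, one element per cell in the requested range.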

Related

Excel Unicode appearing literally in cell value

I've got strings in an Excel table that contain literal Unicode characters (e.g. the non-breaking space, character 160), and I'm trying to compare them to another string which, instead of the Unicode character, simply has a normal space.
How do I do this? I tried this:
textref = replace(textref, ChrW$(160), " ")
as well as
textref = replace(textref, " ", " ")
and I can't seem to get it to work. Alternatively, is there a way to make Excel render the text as spaces?
Thanks in advance

Table Segmentation, Data Reduction, Sorting and Writing in MATLAB/Excel

I would like to do a data reduction operation on a spreadsheet. Preferably I would like to use MATLAB (or Excel), since I need separate output files for each case.
The link to the spreadsheet is below:
Spreadsheet
A screenshot of the spreadsheet is shown below.
The output I require in the text files looks something like the following.
The first sheet in the .xls file is the main input, whereas the following sheets (d**) are my required output. I also need each of these sheets in a separate ASCII file (.dat) to plot them later on. Here is how the algorithm works:
1. Look up the number/string in column B (FileName).
2. Extract all data in columns C and D (Saturation and ETC) with the same FileName value (column B).
3. Look up the matching FileName (column B) value in column E (ImageIndex).
4. Copy the value of ImageName (column F) to the corresponding value in ImageIndex (column E).
5. The result would be three columns (ImageName, Saturation, ETC). ImageName would be the same for each subcase.
6. Sort the columns based on Saturation.
7. Write each subcase as a separate .dat file.
I tried a few recipes using categorical arrays (findgroups and splitapply) in MATLAB, but they didn't seem to work out for me. I will later be working on a larger data set, so automation is necessary. I think this could be done using macros in Excel, but I would prefer MATLAB since I will use MATLAB to plot the data. Any other alternative suggestions are welcome.
Thanks,
Here's a MATLAB solution. You could do it with a rather convoluted accumarray call, but readability would suffer, so I'm opting for a loop here.
out is a struct array which you can use either to write files or to plot the data.
tbl = readtable('yourFile.xls');
%# get the group indices for the files
%# this assumes that you have cleaned up the dash after the 1
%# so that all of the entries in the FileName column are numeric
idx = tbl.FileName;
%# the uIdx business is to account for the possibility
%# that there are images missing from the sequence
uIdx = unique(idx);
nImages = length(uIdx);
%# preassign output structure
out(1:nImages) = struct('name', '', 'saturation', 0, 'etc', 0);
%# loop to extract relevant information
for iImage = uIdx(:)'
    myIdx = idx == iImage;
    data = tbl(myIdx, {'Saturation', 'ETC'});
    data = sortrows(data, 'Saturation');
    name = tbl.ImageName{tbl.ImageIdx == iImage};
    out(iImage == uIdx).name = name;
    out(iImage == uIdx).saturation = data.Saturation;
    out(iImage == uIdx).etc = data.ETC;
end
%# plotting
for iImage = 1:nImages
    figure('name', out(iImage).name)
    plot(out(iImage).saturation, out(iImage).etc, '.');
end
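To also get the separate .dat files the question asks for, one option is to loop over out and write each sub-case with dlmwrite (a sketch; deriving the file name from out(iImage).name is an assumption, adjust as needed):
%# write each sub-case to its own ASCII .dat file
for iImage = 1:nImages
    fname = sprintf('%s.dat', out(iImage).name);
    M = [out(iImage).saturation(:), out(iImage).etc(:)];
    dlmwrite(fname, M, 'delimiter', '\t');   %# two columns: Saturation, ETC
end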

Importing several text files into an Excel spreadsheet using MATLAB

I have several text files with 2 columns, containing only numbers, which I would like to import into a single Excel spreadsheet (Excel 2016) using MATLAB. The reason for using MATLAB (R2014a) is that I later have scripts that process the data, and it's the only programming language I am mildly familiar with.
I tried to use the following:
Using matlab to save a 100 text files to a one excel file but into different spread sheet?
but I just couldn't understand it, as I am a newbie, and I think that example makes several Excel files while I want only one. Thanks for the help! Greatly appreciated.
content = dir();                          % list everything in the current folder
col = 1;
for i = 1:length(content)
    if content(i).isdir ~= 1              % skip directories, keep only files
        fileID = fopen(content(i).name);  % open the current file from the listing
        data = textscan(fileID, '%s %s'); % read the two columns as strings
        fclose(fileID);
        datum(:, col) = data{1};          % first column of this file
        col = col + 1;
        datum(:, col) = data{2};          % second column of this file
        col = col + 1;
        clear data;
    end
end
filename = 'Datum.xls';
sheet = 1;
xlswrite(filename, datum, sheet, 'A1');   % write everything into one sheet
close all;
This is a basic working algorithm; you need to work on it further to optimize it for speed.
Hints:
1. Pre-declare the size of datum, based on the number of files.
2. If all the files you have to read share the same extension, list only them through dir() (see the sketch below).
Good luck with the fine tuning.
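A minimal sketch of those two hints combined (assumptions here: all input files share the .ASC extension and every file has the same, known number of rows):
content = dir('*.ASC');                % hint 2: list only files with the right extension
nFiles = length(content);
nRows = 100;                           % assumption: every file has this many rows
datum = cell(nRows, 2*nFiles);         % hint 1: preallocate datum up front
col = 1;
for i = 1:nFiles
    fileID = fopen(content(i).name);
    data = textscan(fileID, '%s %s');
    fclose(fileID);
    datum(:, col)   = data{1};
    datum(:, col+1) = data{2};
    col = col + 2;
end
xlswrite('Datum.xls', datum, 1, 'A1');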

How to read a specific range from several Excel files in MATLAB

I want to write a program that is able to read a specific range from numerous Excel files in a folder.
Because I need MATLAB to read from several Excel files, I can't use code like this:
xlsread('Report1',1,'k41')
Is it possible to modify the code below so that it reads cell 'K41' from each Excel file?
clc
clear all
Folder = 'D:\Program Files\MATLAB\R2013a\bin';
XLfiles = dir(fullfile(Folder, '*.xlsx'));
for i = 1:length(XLfiles)
    data = xlsread(fullfile(Folder, XLfiles(i).name));
end
As excaza said, xlsread should work; just check the 'range' parameter of xlsread. It needs to be a string in this format (this example imports only the range C1:C2):
'C1:C2'
If you use K41:K41 it imports nothing, as it is a zero-size range. Maybe that is the confusion here.
See if this might work
all_cells = []; % store all your cells in here
Folder = 'D:\Program Files\MATLAB\R2013a\bin';
XLfiles = dir(fullfile(Folder, '*.xlsx'));
for i = 1:length(XLfiles)
    all_cells(end+1) = xlsread(fullfile(Folder, XLfiles(i).name), 'K41:K42');
end
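A slight variation on the same idea (just a sketch, not from the original answer): preallocate the output and keep each file's name next to its value, so you can tell which number came from which workbook.
Folder = 'D:\Program Files\MATLAB\R2013a\bin';
XLfiles = dir(fullfile(Folder, '*.xlsx'));
nFiles = length(XLfiles);
values = cell(nFiles, 2);              % column 1: file name, column 2: value of K41
for i = 1:nFiles
    values{i, 1} = XLfiles(i).name;
    values{i, 2} = xlsread(fullfile(Folder, XLfiles(i).name), 'K41:K42');  % same range as above
end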

R reading Excel files with carriage returns

I created a routine in R to import multiple Excel files that I need to merge into one big txt file. I use the read.xls function. Some of these xls files have carriage returns in cells ("\n"). Then, when I write the txt files (write.table), R interprets this "\n" as a new line.
How can I clean the xls files, or read them in properly, to remove the unnecessary "\n"?
Thanks!
The columns in your table are almost certainly factors (that's the default for character columns in R). So we can just change the factor levels in each column.
First, some dummy data:
R> dd = data.frame(d1 = c("1", "2\n", "33"),
                   d2 = c("1\n", "2\n", "33"))
## Default: factor
R> levels(dd[,1])
[1] "1" "2\n" "33"
Next, we use a for loop to go over the columns:
for(i in 1:ncol(dd))
    levels(dd[,i]) = gsub("\n", "", levels(dd[,i]))
If you want to remove the for loop and use sapply, this should work:
## Can this be improved?
sapply(1:ncol(dd),
       function(i) levels(dd[,i]) <<- gsub("\n", "", levels(dd[,i])))
