Empty PHPExcel file using liuggio/ExcelBundle in Symfony

I have some code that iterates over the rows and columns of an Excel sheet and replaces text with other text. This is done with a service that takes the Excel file and a dictionary as parameters, like this:
$mappedTemplate = $this->get('app.entity.translate')->translate($phpExcelObject, $dictionary);
The service itself looks like this:
public function translate($template, $dictionary)
{
    foreach ($template->getWorksheetIterator() as $worksheet) {
        foreach ($worksheet->getRowIterator() as $row) {
            $cellIterator = $row->getCellIterator();
            $cellIterator->setIterateOnlyExistingCells(false); // loop over all cells, even if they are not set
            foreach ($cellIterator as $cell) {
                if (!is_null($cell)) {
                    if (!is_null($cell->getCalculatedValue())) {
                        if (array_key_exists((string)$cell->getCalculatedValue(), $dictionary)) {
                            $worksheet->setCellValue(
                                $cell->getCoordinate(),
                                $dictionary[$cell->getCalculatedValue()]
                            );
                        }
                    }
                }
            }
        }
    }
    return $template;
}
After some debugging I found out that the text actually is replaced and that the service works as it should. The problem is that when I return the new PHPExcel file as a response to download, the Excel file is empty.
This is the code I use to return the file:
// create the writer
$writer = $this->get('phpexcel')->createWriter($mappedTemplate, 'Excel5');

// create the response
$response = $this->get('phpexcel')->createStreamedResponse($writer);

// adding headers
$dispositionHeader = $response->headers->makeDisposition(
    ResponseHeaderBag::DISPOSITION_ATTACHMENT,
    $file_name
);
$response->headers->set('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
$response->headers->set('Pragma', 'public');
$response->headers->set('Cache-Control', 'maxage=1');
$response->headers->set('Content-Disposition', $dispositionHeader);

return $response;
What am I missing?

Your code is missing the calls to the writer.
You only create the writer, but never use it, at least not in the code examples you shared:
$objWriter = new PHPExcel_Writer_Excel2007($objPHPExcel);
$response = $this->get('phpexcel')->createStreamedResponse($objWriter);
Another thing is the content type: do you have the Apache content types set up correctly?
$response->headers->set('Content-Type', 'application/vnd.ms-excel; charset=utf-8');
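For reference, a minimal sketch of the full flow, based only on the code from the question (the translate service and $file_name are assumed to exist already). Note that the Excel5 writer produces a legacy .xls file, so the content type is set to match it, and the bundle's createStreamedResponse() invokes the writer's save() when the response is sent:
// translate the template, then stream it back as an .xls download
$mappedTemplate = $this->get('app.entity.translate')->translate($phpExcelObject, $dictionary);

// create the writer and hand it to the streamed response;
// the response callback calls the writer's save() on output
$writer = $this->get('phpexcel')->createWriter($mappedTemplate, 'Excel5');
$response = $this->get('phpexcel')->createStreamedResponse($writer);

// Excel5 writes the legacy .xls format, so use the matching content type
$response->headers->set('Content-Type', 'application/vnd.ms-excel; charset=utf-8');
$response->headers->set(
    'Content-Disposition',
    $response->headers->makeDisposition(ResponseHeaderBag::DISPOSITION_ATTACHMENT, $file_name)
);

return $response;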

Related

Laravel Excel: import and write an Excel file in batches

I have a large number of records in an Excel sheet, and importing them into three database tables takes too much time. To work around this, I am trying to import in batches by creating small Excel files with less data, so that I can run Laravel queue jobs on them. I am trying the code below, but it doesn't work: it throws an "Array to string conversion" error, and the sub Excel files are not created. I am using Maatwebsite Excel, but I am handling it through a controller.
Can someone guide me, please?
function importBatchFiles(Request $request)
{
    $this->validate($request, [
        'file' => 'required|mimes:xls,xlsx,csv'
    ]);

    $file = $request->file('file');
    //$fileName = 'orders_'.$request->get('company_id').'_'.date('Y-m-d').uniqid().'.xlsx';
    if ($file->isValid()) {
        //$file->move('order_list', $fileName);
        $data = Excel::toArray(new OrdersImport, request()->file('file'));

        foreach ($data as $key => $value) {
            foreach ($value as $row) {
                $inputs[] = $row;
            }
        }

        $data1 = array_slice($inputs, 1);
        $parts = array_chunk($data1, 500);

        foreach ($parts as $index => $part) {
            $filename = resource_path('pending-files/'.date('Y-m-d').$index.'.'.$request->file->extension());
            file_put_contents($filename, $part);
        }

        return Response::json(['success' => 'Orders Queued for importing.']);
    } else {
        return Response::json(['error' => 'Some error']);
    }
}
When using the WithChunkReading concern, you can execute each chunk as a queue job. You can do so by simply adding the ShouldQueue contract to the import class.
For more details, please refer to this link: https://docs.laravel-excel.com/3.1/imports/queued.html
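A minimal sketch of what that looks like, following the linked docs (the Order model and the column-to-attribute mapping are assumptions; adjust them to your schema):
namespace App\Imports;

use App\Models\Order;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class OrdersImport implements ToModel, WithChunkReading, ShouldQueue
{
    // map each spreadsheet row to a model; the column order here is an assumption
    public function model(array $row)
    {
        return new Order([
            'item'  => $row[0],
            'price' => $row[3],
        ]);
    }

    // read the source file 500 rows at a time; each chunk becomes its own queue job
    public function chunkSize(): int
    {
        return 500;
    }
}
With this in place, a plain Excel::import(new OrdersImport, $request->file('file')); splits the file into 500-row chunks and pushes each chunk onto the queue, so there is no need to write intermediate sub Excel files at all.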

Laravel Excel: Object of class stdClass could not be converted to string

I created a query that works for a single record, but when I use ->get() it throws:
Object of class stdClass could not be converted to string
Here is my code:
$result = DB::table('users')->get(); // the data you want to download as csv
$csv = (array)$result; // store the data in an array

return Excel::create('csvfile', function ($excel) use ($csv) {
    $excel->sheet('mySheet', function ($sheet) use ($csv) {
        $sheet->fromArray($csv);
    });
})->download('xls');
A quick solution is to decode it: get() returns a collection of stdClass objects, which fromArray() cannot convert to strings, while json_encode followed by json_decode with true as the second argument turns them into plain associative arrays recursively.
json_decode(json_encode($data), true);
Here is my code now:
$data = DB::table('users')->get();
$csv = json_decode(json_encode($data), true);

return Excel::create('SalesReport', function ($excel) use ($csv) {
    $excel->sheet('Sales-Report', function ($sheet) use ($csv) {
        $sheet->fromArray($csv);
    });
})->download('xls');

Generate multiple text files in Acumatica locally

How can I generate several text files at the same time locally?
I am using the method:
throw new PXRedirectToFileException(file, true);
However, this method only generates one text file. I need more than one text file to be generated at a time.
List<object> data1099Misc = new List<object> { };
ARInvoice ari = Base.Document.Current;

foreach (xvrFSCab diot in PXSelect<xvrFSCab,
    Where<xvrFSCab.invoiceNbr,
        In<Required<xvrFSCab.invoiceNbr>>>>.Select(Base, ari.InvoiceNbr))
{
    data1099Misc.Add(CreatePayerARecord(diot));
}

FixedLengthFile flatFile = new FixedLengthFile();
flatFile.WriteToFile(data1099Misc, sw);
sw.Flush();
sw.FlushAsync();

int cont = 0;
while (cont < 3)
{
    cont = cont + 1;
    string path = "DIOTJOSE" + ".txt";
    PX.SM.FileInfo file = new PX.SM.FileInfo(path, null, stream.ToArray());

    // the first throw leaves the loop, so only one file is ever produced
    throw new PXRedirectToFileException(file, true);
}
Acumatica had the same issue when they had to open multiple reports with one click (with a RedirectException).
For this reason, Acumatica supports multiple required exceptions only for reports.
They have a method called "CombineReport" that works with multiple PXReportRequiredException instances (PXReportsRedirectList).
The sad part is that they did not build anything similar for the other RequiredException or RedirectException types.
I tried to make my own "Combine" method, but I was not able to, because the RedirectHelper.TryRedirect method hardcodes the concrete RedirectException types inside its body instead of using a generic or base object :(
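One workaround, since a single redirect is all you get: pack all the generated text files into one ZIP archive and redirect to that single file. The sketch below is not an Acumatica-provided mechanism; it uses standard System.IO.Compression (add using System.IO; and using System.IO.Compression;), and the file names and the reuse of the question's stream variable are placeholders:
using (MemoryStream zipStream = new MemoryStream())
{
    // write each text file as an entry of one ZIP archive
    using (ZipArchive archive = new ZipArchive(zipStream, ZipArchiveMode.Create, true))
    {
        for (int cont = 0; cont < 3; cont++)
        {
            ZipArchiveEntry entry = archive.CreateEntry("DIOTJOSE" + cont + ".txt");
            using (Stream entryStream = entry.Open())
            {
                byte[] content = stream.ToArray(); // file content built earlier
                entryStream.Write(content, 0, content.Length);
            }
        }
    }

    // one file, one redirect: the browser downloads the archive containing all three files
    PX.SM.FileInfo zipFile = new PX.SM.FileInfo("DIOTJOSE.zip", null, zipStream.ToArray());
    throw new PXRedirectToFileException(zipFile, true);
}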

How do I download data trees to CSV?

How can I export nested tree data as a CSV file when using Tabulator? I tried using the table.download("csv","data.csv") function; however, only the top-level data rows are exported.
It looks like a custom file formatter or another option may be necessary to achieve this. It seems silly to rewrite the CSV downloader, so while poking around the CSV downloader in the download.js module, it looked like adding a recursive call to the row parser upon finding a "_children" field might work.
I am having difficulty figuring out where to get started.
Ultimately, I need the parent-to-child relationship represented in the CSV data by a value in a parent ID field in the child rows (this field can be blank in the top-level parent rows because they have no parent). I think I would need to include an ID and a ParentID in the data table to achieve this, and perhaps enforce validation of that key with additional functions as data is inserted into the table.
Below is currently how I am exporting nested data tables to CSV. This will insert a new column at the end to include a parent row identifier of your choice. It would be easy to take that out or make it conditional if you do not need it.
// Export CSV file to download
$("#export-csv").click(function(){
    table.download(dataTreeCSVfileFormatter, "data.csv", {nested:true, nestedParentTitle:"Parent Name", nestedParentField:"name"});
});

// Modified CSV file formatter for nested data trees
// This is a copy of the CSV formatter in modules/download.js
// with additions to recursively loop through children arrays and add a Parent identifier column
// options: nested:true, nestedParentTitle:"Parent Name", nestedParentField:"name"
var dataTreeCSVfileFormatter = function(columns, data, options, setFileContents, config){
    //columns - column definition array for table (with columns in current visible order);
    //data - currently displayed table data
    //options - the options object passed from the download function
    //setFileContents - function to call to pass the formatted data to the downloader
    var self = this,
        titles = [],
        fields = [],
        delimiter = options && options.delimiter ? options.delimiter : ",",
        nestedParentTitle = options && options.nestedParentTitle ? options.nestedParentTitle : "Parent",
        nestedParentField = options && options.nestedParentField ? options.nestedParentField : "id",
        fileContents,
        output;

    //build column headers
    function parseSimpleTitles() {
        columns.forEach(function (column) {
            titles.push('"' + String(column.title).split('"').join('""') + '"');
            fields.push(column.field);
        });

        if (options.nested) {
            titles.push('"' + String(nestedParentTitle) + '"');
        }
    }

    function parseColumnGroup(column, level) {
        if (column.subGroups) {
            column.subGroups.forEach(function (subGroup) {
                parseColumnGroup(subGroup, level + 1);
            });
        } else {
            titles.push('"' + String(column.title).split('"').join('""') + '"');
            fields.push(column.definition.field);
        }
    }

    if (config.columnGroups) {
        console.warn("Download Warning - CSV downloader cannot process column groups");

        columns.forEach(function (column) {
            parseColumnGroup(column, 0);
        });
    } else {
        parseSimpleTitles();
    }

    //generate header row
    fileContents = [titles.join(delimiter)];

    function parseRows(data, parentValue = "") {
        //generate each row of the table
        data.forEach(function (row) {
            var rowData = [];

            fields.forEach(function (field) {
                var value = self.getFieldValue(field, row);

                switch (typeof value === "undefined" ? "undefined" : _typeof(value)) {
                    case "object":
                        value = JSON.stringify(value);
                        break;
                    case "undefined":
                    case "null":
                        value = "";
                        break;
                    default:
                        value = value;
                }

                //escape quotation marks
                rowData.push('"' + String(value).split('"').join('""') + '"');
            });

            if (options.nested) {
                rowData.push('"' + String(parentValue).split('"').join('""') + '"');
            }

            fileContents.push(rowData.join(delimiter));

            if (options.nested && row._children) {
                parseRows(row._children, self.getFieldValue(nestedParentField, row));
            }
        });
    }

    function parseGroup(group) {
        if (group.subGroups) {
            group.subGroups.forEach(function (subGroup) {
                parseGroup(subGroup);
            });
        } else {
            parseRows(group.rows);
        }
    }

    if (config.columnCalcs) {
        console.warn("Download Warning - CSV downloader cannot process column calculations");
        data = data.data;
    }

    if (config.rowGroups) {
        console.warn("Download Warning - CSV downloader cannot process row groups");

        data.forEach(function (group) {
            parseGroup(group);
        });
    } else {
        parseRows(data);
    }

    output = fileContents.join("\n");

    if (options.bom) {
        output = "\uFEFF" + output;
    }

    setFileContents(output, "text/csv");
};
As of version 4.2 it is not possible to include tree data in downloads; this will be coming in a later release.

Fast access to Excel data in X++

Can someone give me a clue how I can get fast access to Excel data? Currently the Excel file contains more than 200K records, and retrieving them all from X++ code takes a lot of time.
These are the classes I am using to retrieve the data: SysExcelApplication, SysExcelWorksheet and SysExcelCells.
I am using the below code to retrieve cells.
excelApp.workbooks().open(filename);
excelWorksheet = excelApp.worksheets().itemFromName(itemName);
excelCells = excelWorksheet.cells();

// pseudo code: one COM call per cell, per row
loop over rows
    excelCells.item(rowCounter, column1);
    // similar calls for all the other columns
end of loop
If any special property needs to be set here, please tell me.
Overall performance will be a lot better (hugely so!) if you can use CSV files. If you are forced to use Excel files, you can easily and straightforwardly convert the Excel file to a CSV file and then read the CSV file. If you can't work that way, you can read Excel files through ODBC (using a connection string, as when connecting to a database), which will perform better than the Office API.
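A rough X++ sketch of the ODBC route (the driver name, file path, sheet name and column index are assumptions to adapt to your file; the classes are the standard LoginProperty/OdbcConnection/Statement/ResultSet set):
// read an Excel sheet through ODBC instead of the Office COM API
LoginProperty  loginProperty;
OdbcConnection connection;
Statement      statement;
ResultSet      resultSet;
str sql = 'SELECT * FROM [Sheet1$]'; // sheet name is an assumption

loginProperty = new LoginProperty();
// driver name and file path are assumptions - adjust as needed
loginProperty.setOther('Driver={Microsoft Excel Driver (*.xls)};DBQ=C:\\Temp\\data.xls');

connection = new OdbcConnection(loginProperty);
statement  = connection.createStatement();

new SqlStatementExecutePermission(sql).assert();
resultSet = statement.executeQuery(sql);

while (resultSet.next())
{
    // columns are read by 1-based index, one whole row per fetch
    info(resultSet.getString(1));
}
CodeAccessPermission::revertAssert();
Because each ResultSet fetch returns a whole row, this avoids the per-cell COM round trips that make the Office API so slow on 200K records.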
First things first: reading Excel files (or any other file) will take a while for 200K records.
You can read an Excel file using ExcelIo, but with no performance guarantees :)
As I see it, you have 3 options (best performance listed first):
Convert your Excel file to a CSV file, then read it with CommaIo.
Read the Excel file using C#, then call back to X++.
Accept the fact and take the time.
Use CSV, it is faster. Below is a code example:
/* Excel Import */
#AviFiles
#define.CurrentVersion(1)
#define.Version1(1)
#localmacro.CurrentList
#endmacro

FilenameOpen        filename;
CommaIo             file;
Container           con;

/* File Open Dialog */
Dialog              dialog;
DialogField         dialogFilename;
DialogField         dialogSiteID;
DialogField         dialogLocationId;
DialogButton        dialogButton;
InventSite          objInventSite;
InventLocation      objInventLocation;
InventSiteID        objInventSiteID;
InventLocationId    objInventLocationID;
int                 row;
str                 sSite;
NoYes               isCountingFound;
int                 iQty;
Counter             insertCounter;
Price               itemPrice;
ItemId              _itemid;
EcoResItemColorName _inventColorID;
EcoResItemSizeName  _inventSizeID;

dialog           = new Dialog("Please select file");
dialogSiteID     = dialog.addField(extendedTypeStr(InventSiteId), objInventSiteId);
dialogLocationId = dialog.addField(extendedTypeStr(InventLocationId), objInventLocationId);
dialogFilename   = dialog.addField(extendedTypeStr(FilenameOpen));
dialog.filenameLookupFilter(["#SYS100852", "*.csv"]);
dialog.filenameLookupTitle("Please select file");
dialog.caption("Please select file");
dialogFilename.value(filename);

if (!dialog.run())
    return;

objInventSiteID     = dialogSiteID.value();
objInventLocationID = dialogLocationId.value();

/* validate warehouse */
while select maxof(InventSiteId) from objInventLocation
    where objInventLocation.InventLocationId == objInventLocationId
{
    if (objInventLocation.InventSiteID != objInventSiteID)
    {
        warning("Warehouse does not belong to the site. Please select a valid warehouse.", "Counting lines import utility");
        return;
    }
}

filename = dialogFilename.value();
file     = new CommaIo(filename, 'r');
file.inFieldDelimiter(',');

try
{
    if (file)
    {
        ttsbegin;
        while (file.status() == IO_Status::OK)
        {
            con = file.read();
            if (con)
            {
                row++;
                if (row == 1)
                {
                    // validate the header row before importing any data
                    if (strUpr(strLtrim(strRtrim(conpeek(con, 1)))) != "ITEM"
                        || strUpr(strLtrim(strRtrim(conpeek(con, 2)))) != "COLOR"
                        || strUpr(strLtrim(strRtrim(conpeek(con, 3)))) != "SIZE"
                        || strUpr(strLtrim(strRtrim(conpeek(con, 4)))) != "PRICE")
                    {
                        error("Imported file is not according to the given format.");
                        ttsabort;
                        return;
                    }
                }
                else
                {
                    isCountingFound = NoYes::No;
                    _itemid         = strLtrim(strRtrim(conpeek(con, 1)));
                    _inventColorID  = strLtrim(strRtrim(conpeek(con, 2)));
                    _inventSizeID   = strLtrim(strRtrim(conpeek(con, 3)));
                    itemPrice       = any2real(strLtrim(strRtrim(conpeek(con, 4))));
                }
            }
        }

        if (row <= 1)
        {
            ttsabort;
            warning("No data found in the file");
        }
        else
        {
            ttscommit;
        }
    }
}
catch
{
    ttsabort;
    error('Upload Failed');
}
