Several people have asked, in a roundabout way, but I have yet to see a workable solution. Is there any way to open an Excel file directly from memory (e.g. from a byte[])? Likewise, is there a way to write a file directly to memory? I am looking for solutions that do not involve the hard disk or juggling temporary files. Thanks in advance for any suggestions.
Excel does not support saving to memory, at least not that I have been able to find - and I have looked because I could use it.
SpreadsheetGear for .NET can save to and load from a byte array. Here is a simple example:
using System;
using SpreadsheetGear;

namespace Program
{
    class Program
    {
        static void Main(string[] args)
        {
            // Create a simple Hello World workbook.
            IWorkbook workbook = Factory.GetWorkbook();
            IWorksheet worksheet = workbook.Worksheets[0];
            worksheet.Cells["A1"].Value = "Hello World";

            // Save to memory.
            byte[] data = workbook.SaveToMemory(FileFormat.OpenXMLWorkbook);

            // Load from memory and output the contents of A1.
            workbook = Factory.GetWorkbookSet().Workbooks.OpenFromMemory(data);
            worksheet = workbook.Worksheets[0];
            Console.WriteLine("A1={0}", worksheet.Cells["A1"].Value);
        }
    }
}
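If the goal is to keep everything off the disk, the resulting byte array can then be streamed straight to the client. For example, in an ASP.NET MVC action (the controller and action names below are only illustrative):

using System.Web.Mvc;
using SpreadsheetGear;

public class ReportController : Controller
{
    // Builds a workbook entirely in memory and returns it as a download; no temp file involved.
    public ActionResult Download()
    {
        IWorkbook workbook = Factory.GetWorkbook();
        workbook.Worksheets[0].Cells["A1"].Value = "Hello World";
        byte[] data = workbook.SaveToMemory(FileFormat.OpenXMLWorkbook);
        return File(data,
            "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
            "Report.xlsx");
    }
}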
You can see live SpreadsheetGear samples here and download the free trial here if you want to try it yourself.
Disclaimer: I own SpreadsheetGear LLC
So I've noticed some strange behavior which I would like to share, to see if anyone has had a similar problem.
We are using an on-prem solution where we pick up a file or an HTTP request, map it to an outgoing XML XSD/schema, and then create the file on-prem.
The problem was that the system where we save the file does not cooperate well with the Logic App; the Logic App sometimes fails because that system takes the file before the Logic App can finish writing the full content.
The system receiving the files only reads .xml files, so we thought we would first create the files under a temporary name, let the Logic App finish writing them, and then rename them.
This solution sounded quite simple before we actually started applying it to the Logic App.
But if we use the File System connector's Rename File action with the "Name" parameter from the on-prem Create File action, we get:
{
"statusCode": 404,
"message": "Resource not found"
}
We get a 404 saying the resource is not found, which complicates things; I've checked the privileges on the account, so that should not be the issue.
What we also tried is listing all the files in the folder, adding a foreach with a condition, and then calling the Rename File action inside it. That makes it work, but the Logic App does not cope well with receiving a lot of files at once with that solution.
So Rename File works when it sits in a foreach loop and we extract the file names from a listing of the root or target folder.
But why does it not work when just calling the Rename File action directly? Is this perhaps a bug in the Logic App's Rename File action?
After discussing this with Microsoft Azure support, they have actually confirmed that there is a bug with the "Create File" action.
It looks like all the data and information from that action's outputs is lost; the support technicians do not know why, but they have had similar cases reported by other people.
I have not stumbled across any of those posts, but I will post how we solved the problem with a workaround.
FYI, the support team has escalated the case so that the Azure developers can look into it, because it's not just the "Name" tag that is lost from Create File: all of its useful output values are lost.
So first we initialize a variable and then set its value in two steps before we create the file:
The name is set to a temporary name plus a GUID.
The next step is creating the file with the temp name produced in the "Set Variable Temp FileName" step.
In the Rename File action we then use the path where the temp file was stored and append \"FILENAME",
and set the "New Name" we actually want.
This works, but it is a workaround; support confirmed that you should be able to simply use Rename File after creating the file with a temp name and then change it to the desired name.
But since Create File does not send or pass along any information at all from its output list, we have to initialize the variables ourselves to make it work; the underlying pattern is sketched below.
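In plain C# terms (the folder and file names below are purely illustrative, and this is ordinary file I/O rather than the Logic App connector itself), the pattern is:

using System;
using System.IO;

class TempThenRename
{
    static void Main()
    {
        string folder = @"C:\Drop";                      // illustrative drop folder
        string tempName = $"temp-{Guid.NewGuid()}.tmp";  // name the consuming system ignores
        string finalName = "order.xml";                  // name the consuming system picks up

        // Write the full content under the temporary name first...
        File.WriteAllText(Path.Combine(folder, tempName), "<order />");

        // ...then publish it by renaming, so the consumer never sees a half-written file.
        File.Move(Path.Combine(folder, tempName), Path.Combine(folder, finalName));
    }
}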
If anyone has stumbled on the same problem, where the backend system reads the files before the Logic App has finished creating them, and you need a workaround, this approach worked well for me.
Hope it helps!
We recently had the same issue, and the workaround of renaming the file also failed.
The cause seems to be that the Azure on-premises data gateway creates a file (or renames a file), then releases its lock, before checking that the file exists. In the gap between releasing the lock and checking that the file exists, the file may be picked up (deleted), causing Logic Apps to think the step failed (reporting a 404 error).
Our workaround was to create a Windows service hosted on the file servers (so it can respond to file changes before anything else on the network). The service has a configuration file which accepts a list of paths and file filters, and it uses a FileSystemWatcher to monitor for new or renamed files. When it detects a match, it takes out a read lock on the file. This ensures it is not blocked by anything writing to the file (i.e. it does not have to wait for the gateway's write action to complete before obtaining its own lock), but while our service holds its lock the file cannot be deleted, so the consumer cannot remove the file, buying time for the gateway to perform its post-write read and report success. Our service releases its lock after a defined period (we went with 30 seconds, though you could likely get away with much less). At that point, the consumer can successfully consume the file.
Basic code for the file watch & locking logic below:
using System;
using System.IO;
using System.Diagnostics;
using System.Threading.Tasks;

namespace AzureFileGatewayHelper
{
    public class Interceptor : IDisposable
    {
        readonly object lockable = new object();
        bool disposed = false;
        readonly FileSystemWatcher watcher;
        readonly int lockTimeInMS;

        public Interceptor(string path, string filter, int lockTimeInSeconds)
        {
            lockTimeInMS = lockTimeInSeconds * 1000;
            watcher = new FileSystemWatcher();
            watcher.Path = path;
            watcher.Filter = filter;
            watcher.NotifyFilter = NotifyFilters.LastAccess
                                 | NotifyFilters.LastWrite
                                 | NotifyFilters.FileName
                                 | NotifyFilters.DirectoryName;
            watcher.Created += OnIntercept;
            watcher.Renamed += OnIntercept;
        }

        public Interceptor(InterceptorConfigElement config)
            : this(config.Path, config.Filter, config.TimeToLockInSeconds)
        {
            Debug.WriteLine($"Loaded config {config.Key}: Path: '{config.Path}'; Filter: '{config.Filter}'; LockTime: '{config.TimeToLockInSeconds}'.");
        }

        public void Start()
        {
            watcher.EnableRaisingEvents = true;
        }

        public void Stop()
        {
            if (watcher != null)
                watcher.EnableRaisingEvents = false;
        }

        private async void OnIntercept(object source, FileSystemEventArgs e)
        {
            // Take a read lock that still allows readers/writers (FileShare.ReadWrite)
            // but prevents the file from being deleted until the delay has elapsed.
            using (var fs = new FileStream(e.FullPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
            {
                Debug.WriteLine($"Locked: {e.FullPath} {e.ChangeType}");
                await Task.Delay(lockTimeInMS);
            }
            Debug.WriteLine($"Unlocked {e.FullPath} {e.ChangeType}");
        }

        public void Dispose()
        {
            if (disposed) return;
            lock (lockable)
            {
                if (disposed) return;
                Stop();
                watcher?.Dispose();
                disposed = true;
            }
        }
    }
}
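For completeness, hosting the watcher outside of the Windows service plumbing can be as simple as the following (the path and filter values are just examples):

using System;
using AzureFileGatewayHelper;

class Program
{
    static void Main()
    {
        // Watch the drop folder for new or renamed .xml files and hold a 30-second read lock on each.
        using (var interceptor = new Interceptor(@"C:\Drop", "*.xml", 30))
        {
            interceptor.Start();
            Console.WriteLine("Watching... press Enter to stop.");
            Console.ReadLine();
        }
    }
}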
The following has been my development strategy for a long time. I divide Excel applications into two large classes:
(a) Excel does not have to be present: the app may even run on an OS such as Linux. Here I use third-party Excel libraries; on Windows, I take advantage of resources such as OpenXML or, even better, ClosedXML.
(b) Office applications such as AddIns: Interop.
However, for a rather complex worksheet that is handled by multiple applications, I have decided that it is best to let Excel itself do all the work. I have the typical AddIns, but I am in the process of porting a regular app, which used ClosedXML, to Interop. As you may see in the following post, some colleagues have had trouble going that route; the missing Globals class seems to present a serious obstacle.
https://social.msdn.microsoft.com/Forums/en-US/cf4b0ca8-9c3b-41df-a24a-b2c195eaee4e/cannot-access-globals-object-from-c-console-application?forum=exceldev
What is the proper way to use Interop in a regular (non-AddIn, non-Office) application?
TIA
Here's the answer:
using Microsoft.Office.Interop.Excel;

public class InteropExcel
{
    private Application excelApp;
    private Workbook workbook;
    private Worksheet worksheet;

    internal void CreateExcelFile(string targetFilepath)
    {
        // Start a hidden Excel instance (Excel must be installed on the machine).
        excelApp = new Application();
        excelApp.Visible = false;

        workbook = excelApp.Workbooks.Add();
        worksheet = (Worksheet)excelApp.ActiveSheet;

        workbook.SaveAs(targetFilepath);
        workbook.Close();
        excelApp.Quit();
    }
}
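One caveat when driving Interop from a regular (non-AddIn) application: EXCEL.EXE can keep running after Quit() unless the COM wrappers are released. A minimal sketch of the usual cleanup, which could be added to the class above (the helper name is made up; Marshal.ReleaseComObject is the real API):

using System.Runtime.InteropServices;

// Call after Quit(), e.g. ReleaseComObjects(worksheet, workbook, excelApp).
private static void ReleaseComObjects(params object[] comObjects)
{
    foreach (var comObject in comObjects)
    {
        if (comObject != null)
        {
            Marshal.ReleaseComObject(comObject);
        }
    }
}

Following that with GC.Collect() and GC.WaitForPendingFinalizers() helps ensure the Excel process actually exits.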
I'm trying to create a custom ImageFilter that requires me to temporarily write the image to disk, because I'm using a third party library that only takes FileInfo objects as parameters. I was hoping I could use IStorageProvider to easily write and get the file but I can't seem to find a way to either convert an IStorageFile to FileInfo or get the full path to the Media folder of the current tenant to retrieve the file myself.
public class CustomFilter : IImageFilterProvider
{
    public void ApplyFilter(FilterContext context)
    {
        if (context.Media.CanSeek)
        {
            context.Media.Seek(0, SeekOrigin.Begin);
        }

        // Save temporary image
        var fileName = context.FilePath.Split(new char[] { '\\' }, StringSplitOptions.RemoveEmptyEntries).LastOrDefault();
        if (!string.IsNullOrEmpty(fileName))
        {
            var tempFilePath = string.Format("tmp/tmp_{0}", fileName);
            _storageProvider.TrySaveStream(tempFilePath, context.Media);
            IStorageFile temp = _storageProvider.GetFile(tempFilePath);
            FileInfo tempFile = ???

            // Do all kinds of things with the temporary file

            // Convert back to Stream and pass along
            context.Media = tempFile.OpenRead();
        }
    }
}
FileSystemStorageProvider does a ton of heavy lifting to construct paths to the Media folder so it's a shame that they aren't publicly accessible. I would prefer not to have to copy all of that initialization code. Is there an easy way to directly access files in the Media folder?
I'm not using multitenancy, so forgive me if this is inaccurate, but this is the method I use for retrieving the full storage path and then selecting FileInfo objects from that:
_storagePath = HostingEnvironment.IsHosted
    ? HostingEnvironment.MapPath("~/Media/") ?? ""
    : Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Media");

files = Directory.GetFiles(_storagePath, "*", SearchOption.AllDirectories)
    .Select(f => new FileInfo(f));
You can, of course, filter down the list of files using either Path.Combine with subfolder names, or a Where clause on that GetFiles call.
This is pretty much exactly what FileSystemStorageProvider uses, but I haven't had need of the other calls it makes outside of figuring out what _storagePath should be.
In short, yes, you will likely have to re-implement whatever private functions of FileSystemStorageProvider you need for the task. But you may not need all of them.
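For example, narrowing the scan to a subfolder and extension (the "tmp" subfolder and .jpg filter below are just placeholders) could look like:

// Only .jpg files under Media/tmp.
var tempImages = Directory
    .GetFiles(Path.Combine(_storagePath, "tmp"), "*", SearchOption.AllDirectories)
    .Select(f => new FileInfo(f))
    .Where(fi => fi.Extension.Equals(".jpg", StringComparison.OrdinalIgnoreCase));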
I was struggling with a similar issue too, and I can say that the IStorageProvider abstraction is pretty restricted.
You can see this when viewing the code of FileSystemStorageFile: the class already uses FileInfo internally to return its data, but the FileInfo itself isn't accessible, and other code is built on top of that. You would therefore basically have to reimplement everything from scratch (your own implementation of IStorageProvider). The easiest option is to simply call
FileInfo fileInfo = new FileInfo(tempFilePath);
but this would break setups where a non-file-system storage provider is used, such as AzureBlobStorageProvider.
The proper way to do this would be to get your hands dirty, extend the storage provider interfaces, and update all the code that depends on them. But as far as I can remember, the issue is that you then need to update the Azure implementation as well, and things get really messy. For that reason I abandoned this approach on my project.
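If rewriting the providers is too heavy, one storage-agnostic fallback (just a sketch, relying on IStorageFile.OpenRead() as exposed by the default Orchard providers) is to copy the stream to a real temp file and hand that FileInfo to the third-party library:

// Copy the stored file to a local temp file so the 3rd-party library can get a FileInfo.
IStorageFile temp = _storageProvider.GetFile(tempFilePath);
string localPath = Path.GetTempFileName();
using (var source = temp.OpenRead())
using (var target = File.Create(localPath))
{
    source.CopyTo(target);
}
FileInfo tempFile = new FileInfo(localPath);
// ... run the filter against tempFile, push the result back through _storageProvider,
// and delete localPath when done.

This works whether the underlying provider is file-system or Azure blob based, at the cost of one extra copy to local disk.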
I need to create docx files based on templates.
A template should contain placeholders, and I should be able to fill the placeholders from Java.
Is it possible to do this? If so, please suggest a good and efficient way to do it.
A little late for the original question, but if anyone else needs to dynamically create docx documents from templates, you might want to have a look at the DocxStamper Java library, which I created on top of docx4j.
It allows you to use the Spring Expression Language in docx templates, and you can create a document from a template with a couple of lines like this:
MyData data = ...;           // your own POJO containing the data
InputStream template = ...;  // InputStream to the template file
OutputStream out = ...;      // OutputStream to the resulting document

DocxStamper stamper = new DocxStamperConfiguration()
        .build();
stamper.stamp(template, data, out);
out.close();
As discussed elsewhere before, there are 3 basic approaches:
BEST: content control data binding
cheap/cheerful: variable replacement (i.e. magic strings on the document surface), but brittle (the split-run problem)
LEGACY: MERGEFIELD with or without other field codes.
Docx4j supports all three approaches, but we generally recommend content control databinding, since it aligns with Microsoft's direction (as best can be ascertained), and is most powerful.
You'll want to consider the technical skills of your template authors.
See https://github.com/centic9/poi-mail-merge for a simple "Variable replacement" method. It does not work if one replacement-string has multiple formats applied, but does work well for simple cases where the template is carefully crafted.
Basically it reads the template and data from CSV or an Excel file and then merges it into multiple result files, one for each line of data.
It works on the DOCX XML content directly, so it does not fully use Apache POI's XWPF support, but this way formatting and other things from the template are preserved as expected, without needing full support for everything in Apache POI (whose DOCX support is still part of the "scratchpad" component, as it is not considered fully done yet).
You can use a Word template with the following LINQ Reporting syntax to achieve your requirements using Aspose.Words for Java.
<< tag_name [expression] -switch1 -switch2 ...>>
A tag body typically consists of the following elements:
A tag name
An expression surrounded by brackets
A set of switches available for the tag, each of which is preceded by the "-" character
Assume that you have the Sender class defined in your application as follows:
public class Sender {

    private String _name;
    private String _message;

    public Sender(String name, String message) {
        _name = name;
        _message = message;
    }

    public String getName() {
        return _name;
    }

    public String getMessage() {
        return _message;
    }
}
To produce a report containing a message of a concrete sender on its behalf, you can use a template document with the following content.
<<[s.getName()]>> says: "<<[s.getMessage()]>>."
To build a report from the template, you can use the following source code.
Document doc = new Document(getMyDir() + "temp_HelloWorld.docx");
Sender sender = new Sender("LINQ Reporting Engine", "Hello World");
ReportingEngine engine = new ReportingEngine();
engine.buildReport(doc, sender, "s");
doc.save(getMyDir() + "out.docx");
I work with Aspose as a Developer Evangelist.
We need to import a SharePoint Document Library (which could hold multiple documents in multiple formats) to a destination folder (on a different server) using SSIS.
There are open source SSIS adapters available for SharePoint. You can use these.
http://sqlsrvintegrationsrv.codeplex.com/
http://sqlsrvintegrationsrv.codeplex.com/releases/view/17652
You can also create a Script Task in SSIS and use C# to extract files from the SharePoint library to a local folder. Make sure the SharePoint URL only contains siteurl/Library/Folder/File (no special characters). The code below copies one file but can be modified to copy multiple files. Good luck.
using System.Net;

public void Main()
{
    // Download a single file from the SharePoint library to a local folder
    // using the credentials of the account running the package.
    WebClient Client = new WebClient();
    Client.UseDefaultCredentials = true;
    Client.DownloadFile("yourSharePointurl/File.ext",
                        @"YourLocalFolder\File.ext");

    Dts.TaskResult = (int)ScriptResults.Success;
}
Change this "using System.net;" to using "System.Net;"
My compilation error got fixed
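If you need more than one file, a minimal variation of the same script (the URLs and local folder below are placeholders) is simply to loop over a list:

using System.IO;
using System.Net;

public void Main()
{
    // Placeholder list of document URLs; in practice this could come from an SSIS variable
    // or a query against the SharePoint list.
    string[] fileUrls =
    {
        "yourSharePointurl/Library/File1.docx",
        "yourSharePointurl/Library/File2.xlsx"
    };

    using (var client = new WebClient())
    {
        client.UseDefaultCredentials = true;
        foreach (string url in fileUrls)
        {
            // Reuse the file name from the URL when saving locally.
            string localPath = Path.Combine(@"YourLocalFolder", Path.GetFileName(url));
            client.DownloadFile(url, localPath);
        }
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}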