SQL Server CE on Azure website

Trying to run SQL Server CE on an Azure website, but I am getting this error:
Unable to load the native components of SQL Server Compact corresponding to the ADO.NET provider of version 8876. Install the correct version of SQL Server Compact. Refer to KB article 974247 for more details

You cannot use SQL Server CE on Azure Web Sites; you have to use an external database instead, such as Azure SQL Database or MySQL.
Generally speaking, in the cloud and with any cloud service (IaaS, PaaS, SaaS), you should never rely on the local file system; persist your files in durable storage such as Azure Blob Storage instead. For that reason you can't (or shouldn't) use technologies like SQL Server CE.
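For example, here is a minimal sketch of pointing ADO.NET at Azure SQL Database instead; the server, database, and credentials are placeholders:
using System.Data.SqlClient;

// Hypothetical Azure SQL Database connection; all identifiers are placeholders.
var connectionString =
    "Server=tcp:myserver.database.windows.net,1433;" +
    "Database=mydb;User ID=myuser;Password=...;Encrypt=True;";
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // run queries here much as you would against SQL Server CE
}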

This seems to work:
Reference System.Data.SqlServerCe 4.0, set "Copy Local" = true.
Azure runs as an x86 process (not AMD64); you can check with:
Environment.GetEnvironmentVariable("PROCESSOR_ARCHITECTURE");
On your dev machine, find the SQL Server CE native dependencies in:
C:\Program Files\Microsoft SQL Server Compact Edition\v4.0\Private\x86
Create a SqlServerCE\x86 folder in the web project.
Copy the x86 files into the new folder:
sqlceca40.dll
sqlcecompact40.dll
sqlceer40EN.dll
sqlceme40.dll
sqlceqp40.dll
sqlcese40.dll
Add this block to the top of Application_Start in global.asax:
string dest = AppDomain.CurrentDomain.SetupInformation.PrivateBinPath;
string src = AppDomain.CurrentDomain.SetupInformation.ApplicationBase + "SqlServerCE\\x86";

// Copy the native SQL Server CE DLLs into the bin folder at startup,
// skipping any that are already there.
foreach (var file in Directory.GetFiles(src))
{
    var fileinfo = new FileInfo(file);
    string destpath = Path.Combine(dest, fileinfo.Name);
    if (!File.Exists(destpath))
    {
        fileinfo.CopyTo(destpath);
    }
}
Note: I am not happy with this solution, but I can't figure out how to get the files into the bin folder on deployment; post-build events don't seem to work. If anyone has a better solution, please suggest it.

Related

Use NetOffice.PowerPointApi on Azure App Service

I have written code to save all the slides in a presentation as JPEGs. It works well locally in Visual Studio, but when I deploy it to Azure App Service, I get a 500 internal server error.
IIS received the request; however, an internal error occurred during the processing of the request. The root cause of this error depends on which module handles the request and what was happening in the worker process when this error occurred. The error page lists the most likely causes:
IIS was not able to access the web.config file for the Web site or application. This can occur if the NTFS permissions are set incorrectly.
IIS was not able to process configuration for the Web site or application.
The authenticated user does not have permission to use this DLL.
The request is mapped to a managed handler but the .NET Extensibility Feature is not installed.
The code:
using pptd = NetOffice.PowerPointApi;
using NetOffice.PowerPointApi.Enums;
using NetOffice.OfficeApi.Enums;

public void genThumbnails(string originalfileName, string renamedFilename, string dirPath)
{
    pptd.Application pptApplication = new pptd.Application();
    pptd.Presentation pptPresentation = pptApplication.Presentations.Open(
        dirPath + renamedFilename, MsoTriState.msoFalse, MsoTriState.msoFalse, MsoTriState.msoFalse);

    // Export each slide as a 1280x720 JPEG next to the original file.
    int i = 0;
    foreach (pptd.Slide pptSlide in pptPresentation.Slides)
    {
        pptSlide.Export(dirPath + originalfileName + "_slide" + i + ".jpg", "jpg", 1280, 720);
        i++;
    }
    pptPresentation.Close();
}
What mistake am I making? Does the NetOffice package also need MS Office installed on the server, like Office.Interop does?
The standard Windows and Linux web apps use blessed operating system images. As part of the PaaS design, customers are limited in what they can run: there is no MS Office interop present, and Azure Web Apps run in a sandbox.
My suggestion would be to create a container image that has the necessary dependencies and then deploy your custom container to an Azure Web App for Containers.

Location for SSH private key and temporary SFTP download data in Azure functions

I am writing an Azure function that uses the WinSCP library to download files over SFTP and upload them to blob storage. This library doesn't allow getting files as a stream; the only option is to download them locally. My code also uses a private key file. So I have two questions.
sessionOptions.SshPrivateKeyPath = Path.GetFullPath("privateKey2.ppk");
is working locally. I have added this file to the solution with the "Copy to Output Directory" option, and it works. But will it work when the Azure function is deployed?
While getting the files, I need to specify the local path where the files will be downloaded:
var transferResult = session.GetFiles(
    file.FullName, Path.GetTempPath() + @"SomeFolder\" + file.Name, false,
    transferOptions);
The second parameter is the local path.
What should I use in place of Path.GetTempPath() that will work when the Azure function is deployed?
For the private key, just deploy it along with your function project. You can simply add it to your VS project.
See also Including a file when I publish my Azure function in Visual Studio.
For the download: The latest version of WinSCP already supports streaming the files. Use the Session.GetFile method.
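A minimal sketch of that streaming approach, assuming WinSCP 5.18+ (where Session.GetFile returns a Stream) and the Azure.Storage.Blobs SDK; host, credentials, and blob names are placeholders:
using System.IO;
using Azure.Storage.Blobs;
using WinSCP;

var sessionOptions = new SessionOptions
{
    Protocol = Protocol.Sftp,
    HostName = "sftp.example.com",                 // placeholder
    UserName = "user",                             // placeholder
    SshHostKeyFingerprint = "ssh-ed25519 255 ...", // placeholder
    SshPrivateKeyPath = Path.Combine(AppContext.BaseDirectory, "privateKey2.ppk")
};

using (var session = new Session())
{
    session.Open(sessionOptions);
    var container = new BlobContainerClient("<connection-string>", "uploads"); // placeholders

    // Stream the remote file straight into blob storage, with no local temp file.
    using (Stream remote = session.GetFile("/remote/path/file.csv"))
    {
        container.GetBlobClient("file.csv").Upload(remote, overwrite: true);
    }
}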
To answer your question about the temporary location, see:
Azure Functions Temp storage.
Where to store files for Azure function?
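As those links note, the function sandbox does expose local temporary storage, so a sketch like the following should work for the download directory (the folder name is illustrative):
// Path.GetTempPath() resolves to the sandbox's local temp area when running in Azure.
string downloadDir = Path.Combine(Path.GetTempPath(), "SomeFolder");
Directory.CreateDirectory(downloadDir);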

Accessing Azure IIS logs from within the same website

I have an Azure website configured to write IIS logs to the file system. I would like a dashboard page within my website where administrators can view reports about traffic on the site, generated by parsing these logs.
I have tried to access the log directory in code both with DirectoryInfo.GetFiles() and by connecting over FTP using FtpLib.
From outside of Azure I can connect to the FTP endpoint and download the logs, but from code running in the Azure website I cannot. My assumption is that Azure does not allow outbound FTP traffic from website code.
The folder structure for Azure (by inspecting the FTP) looks something like:
Site: /site/wwwroot
Logs: /LogFiles/http/RawLogs
Within the Azure portal you can create virtual directories, but they are only allowed within /site.
The site is running as an Azure Web Site: MVC 4, integrated pipeline, 64-bit, .NET 4.5. For FTP I am using FtpLib v1.0.1.2, which fails at Login() with the message: Unknown error (0x2ee2)
I am aware that I can change the logging within Azure to log to Blob Storage, but this would incur additional monthly cost. Are there any other options to access these files?
Thanks.
Edit: I have been asked to supply code. Here is the FTP version (works locally, not on Azure):
using (var ftp = new FtpConnection("XXXXXXXX.windows.net", "XXXXXXXX", "XXXXXXXX"))
{
    ftp.Open();
    ftp.Login(); // Fails here
    ftp.SetLocalDirectory(Server.MapPath("~/")); // Temp
    ftp.SetCurrentDirectory("/LogFiles/http/RawLogs");
    foreach (var f in ftp.GetFiles("*.log"))
    {
        ftp.GetFile(f.Name, f.Name, false);
        ftp.RemoveFile(f.Name);
    }
}
And here is the file system version:
//var logRoot = Server.MapPath("~/../../LogFiles/http/RawLogs"); // Throws error about traversal outside of site root
//var logRoot = "/LogFiles/http/RawLogs"; // Throws: Could not find a part of the path 'D:\LogFiles\http\RawLogs'.
var logRoot = "LogFiles/http/RawLogs"; // Throws: Could not find a part of the path 'D:\Windows\system32\LogFiles\http\RawLogs'.
foreach (var f in new DirectoryInfo(logRoot).GetFiles("*.log"))
{
    f.CopyTo(root + f.Name, true);
    f.Delete();
}
The problem is with the paths to the log files: your hard-coded paths resolve outside the site's home directory. Use Server.MapPath("~") to get the site root, then do string manipulation on top of it to get the home directory. The home directory contains two more directories, LogFiles and site; once you have it, append the LogFiles directory and read all the files from there.
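A minimal sketch of that path manipulation, assuming the usual Azure Web Site layout where Server.MapPath("~") returns something like D:\home\site\wwwroot:
string wwwroot = Server.MapPath("~");                        // e.g. D:\home\site\wwwroot
string home = Directory.GetParent(wwwroot).Parent.FullName;  // e.g. D:\home
string logRoot = Path.Combine(home, "LogFiles", "http", "RawLogs");

foreach (var f in new DirectoryInfo(logRoot).GetFiles("*.log"))
{
    // parse each raw IIS log file here
}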

Microsoft.Jet.OLEDB.4.0 error on Azure Cloud Service

My requirement is to upload an Excel file to a folder under the website root and then read the file's data into a DataTable.
This works fine with my existing hosting provider, but now I've uploaded my website to a Windows Azure cloud service, and after porting I'm getting an error from the "Microsoft.Jet.OLEDB.4.0" provider.
I use the "Microsoft.Jet.OLEDB.4.0" provider to read data from the Excel file and add it to a DataTable. It works fine locally too, but when I host the web application on the Azure cloud service it generates the following error:
"Microsoft.Jet.OLEDB.4.0 provider is not registered on the local machine"
Please have a look at these lines of code:
string strpath = Server.MapPath(OAppPath);
strpath = strpath + "\\MYDATAFOLDER\\" + System.IO.Path.GetFileName(FileUpload1.PostedFile.FileName);
FileUpload1.PostedFile.SaveAs(strpath);

string excelConnectionString =
    "Provider=Microsoft.Jet.OLEDB.4.0;" + "Data Source=" + strpath +
    ";Extended Properties=\"Excel 8.0;IMEX=1;HDR=Yes\"";

var connection = new OleDbConnection(excelConnectionString);
connection.Open();

// List the worksheets in the workbook.
var dtSheets = connection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);

GC.Collect();
GC.WaitForPendingFinalizers();
What is an alternate solution for this?
I looked at the following too:
Link 1
Link 2
Thanks,
Kapil
This is probably caused by your application running in 64-bit mode. According to Microsoft, the Microsoft OLE DB Provider for Jet and the Microsoft Access ODBC driver are available in 32-bit versions only.
As described in How to get a x64 version of Jet?:
The Microsoft Jet Database Engine 4.0 components entered a state of functional deprecation and sustained engineering, and have not received feature level enhancements since becoming a part of Microsoft Windows in Windows 2000.
One alternative is to use a startup task in your cloud service to deploy the Microsoft Access Database Engine 2010 Redistributable. You'll have to change your connection string to the format supported by this newer driver.
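For reference, an ACE-style connection string looks roughly like this (a sketch; the exact Extended Properties depend on your file format):
// Hypothetical ACE connection string for the 2010 redistributable;
// "Excel 12.0 Xml" targets .xlsx, plain "Excel 12.0" targets .xls.
string aceConnectionString =
    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + strpath +
    ";Extended Properties=\"Excel 12.0 Xml;HDR=Yes;IMEX=1\"";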
Another approach, which I have used with success, would be to use a library such as ExcelDataReader to read the spreadsheet data.
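A minimal sketch of the ExcelDataReader approach, assuming the ExcelDataReader and ExcelDataReader.DataSet NuGet packages (strpath is the saved file path from the question):
using System.IO;
using ExcelDataReader;

using (var stream = File.Open(strpath, FileMode.Open, FileAccess.Read))
using (var reader = ExcelReaderFactory.CreateReader(stream))
{
    // One DataTable per worksheet, no OLE DB provider required.
    var result = reader.AsDataSet();
    var firstSheet = result.Tables[0];
}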

Azure storage account

I'm trying to deploy an application on Azure, but I'm facing some problems.
On my dev box everything works fine, but I have a problem when I try to use the application once it is deployed.
On the dev box I have an action that I run manually which creates the test tables in my local SQL Server Express. But I do not know how to create the tables on the server, so when I run my website application it says TableNotFound.
Can someone guide me through this final step? Do I need to do something additional?
Thanks in advance.
The table storage client provides a method to create the schema in cloud storage; I forget the name (will look it up in a second). Call it when you initialise whatever you're using as your data service layer.
Edit: The following snippet is what I use:
StorageAccountInfo info = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();
TableStorage.CreateTablesFromModel( typeof( <Context> ), info );
where <Context> is your data context object.
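For anyone on a later SDK, the equivalent idea (a sketch, assuming the Microsoft.WindowsAzure.Storage package; the connection string and table name are placeholders) is:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Create the table at startup if it does not exist yet.
var account = CloudStorageAccount.Parse("<connection-string>");
var tableClient = account.CreateCloudTableClient();
var table = tableClient.GetTableReference("MyTable");
table.CreateIfNotExists();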
