Accessing Azure IIS logs from within the same website

I have an Azure website configured to write IIS logs to the file system. I would like to have a dashboard page within my website where administrators can view reports about traffic on the site, generated by parsing these logs.
I have tried to access the log directory in code, both with DirectoryInfo.GetFiles() and by attempting to connect over FTP using FtpLib.
From outside of Azure I can connect to the FTP site and download the logs, but from code running in the Azure website I cannot. My assumption is that Azure does not allow outbound FTP traffic from website code.
The folder structure for Azure (by inspecting the FTP) looks something like:
Site: /site/wwwroot
Logs: /LogFiles/http/RawLogs
Within the Azure portal you can create virtual directories, but they are only allowed within /site.
The site is running as an Azure Web Site: MVC 4, integrated pipeline, 64-bit, .NET 4.5. For FTP I am using FtpLib v1.0.1.2, which fails at Login() with the message: Unknown error (0x2ee2).
I am aware that I can change the logging within Azure to log to Blob Storage; however, this would incur additional monthly cost. Are there any other options for accessing these files?
Thanks.
Edit: I have been asked to supply code. Here is the FTP version (works locally, but not on Azure):
using (var ftp = new FtpConnection("XXXXXXXX.windows.net", "XXXXXXXX", "XXXXXXXX"))
{
    ftp.Open();
    ftp.Login(); // Fails here
    ftp.SetLocalDirectory(Server.MapPath("~/")); // Temp
    ftp.SetCurrentDirectory("/LogFiles/http/RawLogs");
    foreach (var f in ftp.GetFiles("*.log"))
    {
        ftp.GetFile(f.Name, f.Name, false);
        ftp.RemoveFile(f.Name);
    }
}
And here is the file system version:
//var logRoot = Server.MapPath("~/../../LogFiles/http/RawLogs"); // Throws error about traversal outside of site root
//var logRoot = "/LogFiles/http/RawLogs"; // Throws error: Could not find a part of the path 'D:\LogFiles\http\RawLogs'.
var logRoot = "LogFiles/http/RawLogs"; // Throws error: Could not find a part of the path 'D:\Windows\system32\LogFiles\http\RawLogs'.

foreach (var f in new DirectoryInfo(logRoot).GetFiles("*.log"))
{
    f.CopyTo(root + f.Name, true);
    f.Delete();
}

I see the problem: it is the paths to the log files. The hard-coded paths in your implementation resolve against the wrong root (hence the errors pointing at D:\LogFiles and D:\Windows\system32\LogFiles) instead of the web app's home directory. Use Server.MapPath("~") and then do string manipulation on top of it to get the right root path. That root directory contains two more directories, LogFiles and site; once you have the root, append the LogFiles directory to it and read all the files from there.
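For illustration, here is a minimal sketch of that approach, assuming the standard Azure Web Sites layout in which the home directory contains both site and LogFiles:
// Sketch only: walk up from ~ (...\site\wwwroot) to the home directory, then into LogFiles.
var wwwroot = Server.MapPath("~");
var home = Directory.GetParent(wwwroot).Parent.FullName;
var logRoot = Path.Combine(home, "LogFiles", "http", "RawLogs");

foreach (var f in new DirectoryInfo(logRoot).GetFiles("*.log"))
{
    // Parse or copy each raw IIS log here.
}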

Related

Use NetOffice.PowerPointApi on azure app service

I have written code to save all the slides in a presentation as JPEG images. It works well in Visual Studio locally on my system, but when I deploy it to an Azure App Service I get a 500 Internal Server Error:
IIS received the request; however, an internal error occurred during the processing of the request. The root cause of this error depends on which module handles the request and what was happening in the worker process when this error occurred.
IIS was not able to access the web.config file for the Web site or application. This can occur if the NTFS permissions are set incorrectly.
IIS was not able to process configuration for the Web site or application.
The authenticated user does not have permission to use this DLL.
The request is mapped to a managed handler but the .NET Extensibility Feature is not installed.
The code:
using pptd = NetOffice.PowerPointApi;
using NetOffice.PowerPointApi.Enums;
using NetOffice.OfficeApi.Enums;

public void genThumbnails(string originalfileName, string renamedFilename, string dirPath)
{
    pptd.Application pptApplication = new pptd.Application();
    pptd.Presentation pptPresentation = pptApplication.Presentations.Open(
        dirPath + renamedFilename, MsoTriState.msoFalse, MsoTriState.msoFalse, MsoTriState.msoFalse);
    int i = 0;
    foreach (pptd.Slide pptSlide in pptPresentation.Slides)
    {
        pptSlide.Export(dirPath + originalfileName + "_slide" + i + ".jpg", "jpg", 1280, 720);
        i++;
    }
    pptPresentation.Close();
}
What mistake am I making? Does the NetOffice package also need MS Office installed on the server, like Office.Interop does?
The standard Windows and Linux web apps use curated ("blessed") operating system images. As part of the PaaS design, customers are limited in what they can run: there is no MS Office interop present, and Azure Web Apps is a sandbox.
My suggestion would be to create a container image that has the necessary dependencies and then deploy your custom container to an Azure Web App for Containers.

NullPointerException downloading pdf file from azure app service, Spring-boot Java 8

I have a Spring Boot application with Java 8 in which I can create and download a PDF file with the iText PDF library. In the local and development environments this works fine, but when I deploy the application to my Azure App Service I can't download the file; I get a NullPointerException.
2020-11-24T13:04:12.310887019Z: [INFO] ****Downloading PDF file: /home/site/descargas/
2020-11-24T13:04:13.814155878Z: [INFO] *** ERROR : java.lang.NullPointerException
2020-11-24T13:04:13.815304521Z: [ERROR] com.itextpdf.text.DocumentException: java.lang.NullPointerException
2020-11-24T13:04:13.816149053Z: [ERROR] at com.itextpdf.text.pdf.PdfDocument.add(PdfDocument.java:821)
2020-11-24T13:04:13.816580670Z: [ERROR] at com.itextpdf.text.Document.add(Document.java:277)
And my code is:
document = new Document();
FileOutputStream coursesFile = new FileOutputStream(DIRECTORIO_TEMP + "cursos.pdf");
PdfWriter writer = PdfWriter.getInstance(document, coursesFile);
PDFEvent eventoPDF = new PDFEvent();
writer.setPageEvent(eventoPDF);
document.open();

// Document margins
document.setMargins(45, 45, 80, 40);

PdfPTable table = new PdfPTable(new float[] {15, 35, 50, 10});
table.setWidthPercentage(100);
table.getDefaultCell().setPaddingBottom(20);
table.getDefaultCell().setPaddingTop(20);

Stream.of("ID", "Área", "Nombre", "Horas").forEach(columnTitle -> {
    PdfPCell header = new PdfPCell();
    header.setBackgroundColor(COLOR_INFRA);
    header.setBorderWidth(1);
    header.setHorizontalAlignment(Element.ALIGN_CENTER);
    header.setPhrase(new Phrase(columnTitle, fuenteCabecera));
    table.addCell(header);
});

for (Curso curso : cursos) {
    table.setHorizontalAlignment(Element.ALIGN_CENTER);
    table.addCell(new Phrase(curso.getCodigoCurso(), fuenteNegrita));
    table.addCell(new Phrase(curso.getAreaBean().getNombre_area(), fuenteNormal));
    table.addCell(new Phrase(curso.getNombreCurso(), fuenteNormal));
    PdfPCell celdaHoras = new PdfPCell(new Phrase(curso.getHoras() + "", fuenteNormal));
    celdaHoras.setHorizontalAlignment(Element.ALIGN_CENTER);
    table.addCell(celdaHoras);
}

document.add(new Paragraph(Chunk.NEWLINE));
document.add(new Paragraph(Chunk.NEWLINE));
document.add(table);
document.close();
coursesFile.close();
The file permissions in my Azure App Service are as shown in the portal (screenshot not included).
Newest answer (for Linux):
I don't know what method you used to deploy the web app. However, the descargas folder will not be created automatically in any way.
Whatever method you use to deploy the web app, I recommend that you log in to Kudu and check whether the descargas folder and the files under it exist. I do not recommend uploading via FTP.
In addition, under Linux there is no concept of a D:\ drive or of virtual applications. I recommend using relative paths in the code to read files.
And the previous answer (for Windows):
This error occurs because the access path must be wrong. I don't know whether your code uses a relative path or an absolute path.
So my advice is:
Use absolute paths to access files.
I have tested this and it works for me, both locally and on Azure.
The file path looks like D:\home\site\myfiles\a.pdf.
Use virtual directories to access files.
You can also use a virtual directory to access your file. The path you access in the browser looks like https://www.aa.azurewebsites.net/myfiles/a.pdf, and you can also check it in Kudu, like D:\home\site\myfiles\a.pdf.
For more details, you can refer to the official doc: https://learn.microsoft.com/en-us/azure/app-service/configure-common#windows-apps-uncontainerized

Files transferred via FTP to Azure App Services are not accessible with URLs

I have deployed an ASP.NET Core Web API project on Azure App Service. I have copied a file to /site/wwwroot using an FTP client. Now suppose the file name is xyz.jpg; it should be accessible at somename.azurewebsites.net/xyz.jpg, but IT'S NOT. I have tried placing the file in other folders too, but nothing works.
I also have a controller for uploading pictures, and it works fine: it uploads the picture to the desired folder, and I can see the picture via the FTP client, but the picture is still not accessible via any link. What am I doing wrong here?
For a Web API application, you have to define the request and response yourself in the controller; otherwise your link can't be recognized by the application.
For example, you can add the following method to your controller. It works on my side.
[Route("myroute/{pic}")]
public IActionResult Get(string pic)
{
Byte[] b = System.IO.File.ReadAllBytes("image/"+pic);
return File(b, "image/jpeg");
}
In my code, pictures are stored in the folder called image in the root directory, and I define a route called myroute.
Here's my link to access the picture: https://myappname.azurewebsites.net/myroute/mypicname.jpg
Hope it helps.
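As an aside (not part of the answer above, and only a sketch): a Web API project does not serve files from wwwroot unless the static files middleware is enabled, which is why the direct link to xyz.jpg is not recognized. Enabling it would look roughly like this, assuming a conventional Startup class:
// Hypothetical Startup excerpt; the question does not show its pipeline configuration.
public void Configure(IApplicationBuilder app)
{
    // Without this, wwwroot/xyz.jpg is not served at /xyz.jpg.
    app.UseStaticFiles();

    app.UseMvc(); // or UseRouting()/UseEndpoints() on newer ASP.NET Core versions
}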

SQL Server CE on Azure website

I am trying to run SQL Server CE on an Azure website, but I am getting this error:
Unable to load the native components of SQL Server Compact corresponding to the ADO.NET provider of version 8876. Install the correct version of SQL Server Compact. Refer to KB article 974247 for more details
You cannot use SQL Server CE on Azure Web Sites. With Azure Web Sites you have to use an external database, such as Azure SQL Database or MySQL.
Generally speaking, in the cloud and with any cloud service (IaaS, PaaS, SaaS), you should never rely on the local file system but rather persist your files to durable storage such as Azure Blob Storage. Thus you can't (and should not) use services like SQL Server CE.
This seems to work:
Reference System.Data.SqlServerCe 4.0, set "Copy Local" = true.
Azure runs the site as an x86 process (not AMD64); you can check with:
Environment.GetEnvironmentVariable("PROCESSOR_ARCHITECTURE");
On your dev machine, find the SQL Server CE native dependencies in:
C:\Program Files\Microsoft SQL Server Compact Edition\v4.0\Private\x86
Create a SqlServerCE\x86 folder in the web project.
Copy the x86 native files into the new folder:
sqlceca40.dll
sqlcecompact40.dll
sqlceer40EN.dll
sqlceme40.dll
sqlceqp40.dll
sqlcese40.dll
Add this block at the top of Application_Start in Global.asax:
// Copy the native SQL CE binaries into the private bin path if they are not already there.
string dest = AppDomain.CurrentDomain.SetupInformation.PrivateBinPath;
string src = AppDomain.CurrentDomain.SetupInformation.ApplicationBase + "SqlServerCE\\x86";

foreach (var file in Directory.GetFiles(src))
{
    var fileinfo = new FileInfo(file);
    string destpath = Path.Combine(dest, fileinfo.Name);
    if (!File.Exists(destpath))
    {
        fileinfo.CopyTo(destpath);
    }
}
Note: I am not happy with this solution, but I can't figure out how to get the files into the bin folder on deployment; post-build events don't seem to work. If anyone has a better solution, please suggest it.

Access denied upon doing a GetDirectories() but Dir in Powershell works

I have a problem I hope someone might help me with.
I've created a custom action page where, among other things, I scan a directory on a remote server for a set of directories, and inside those directories I search for a set of files.
However, when I execute the code on the production server I get an Access denied exception.
If I use the same code on my testserver (accessing the same remote server) it works just fine.
If I use powershell or explorer on the production server I can access the remote directory and files with no problems.
I am using the same account in all scenarios (if I print out Page.User.Identity.Name and SPContext.Current.Web.CurrentUser.LoginName they are the same and equal to the account I use on the test server and the one I am logged on with on the production server when accessing the remote server from command line or explorer).
The code looks like this:
string user = SPContext.Current.Web.CurrentUser.LoginName.Remove(0, 7);
string user_path = "\\\\srv\\share1\\subdir\\dir\\" + user;

// The line below will raise an exception on the production server.
foreach (string board_path in Directory.GetDirectories(user_path, "Board*"))
{
    foreach (string board_file in Directory.GetFiles(board_path, "Board*.xml"))
    {
        .
        .
    }
}
I can't figure out why the code runs on the test server but not on the production machine. I am using SharePoint 2010 Standard.
Thanks in advance for any kind of help I can get.
/Fredrik
The problem was solved by using SPSecurity.RunWithElevatedPrivileges()!
/Fredrik
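For context, SPSecurity.RunWithElevatedPrivileges runs the supplied delegate under the application pool identity rather than the current user's. A minimal sketch of applying it to the loop above (assuming the SharePoint server object model is referenced) might look like:
// Sketch only: run the directory enumeration under the application pool account.
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    foreach (string board_path in Directory.GetDirectories(user_path, "Board*"))
    {
        foreach (string board_file in Directory.GetFiles(board_path, "Board*.xml"))
        {
            // Process each board file here.
        }
    }
});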
