I am getting started with Office Add-ins through the JavaScript API and am working through this tutorial. When I proceed to the Try It Out section, I get an error. The add-in runs fine when I give an absolute path in the source location node of the manifest, for example E:\Excel-Add-in-Javascript\first-excel-addin\Home.html, but it's the relative path that is not working, for example \\SAAD\Excel-Add-in-Javascript\first-excel-addin\Home.html. Kindly let me know if you have a solution.
The source location node should not contain a relative path. It should use a complete path, either on the internet or on a network share.
In your case, you need to make \\SAAD a network share, not just a folder.
I don't think that serving from a file path (file:///C:/Users/username/Desktop/something.html) or a share is a supported scenario. It may work, but note that it will behave differently (and sometimes not run at all, or be overly permissive) compared with how it runs when you deploy the app for real.
To be clear, you can have a manifest file on a network share for ease of testing the add-in -- and in fact, it's the easiest way to get your add-in registered with Office desktop. But the web content should be served from a web server (anything from hosting via a local IIS web server, to using an Azure Website, to putting your content on GitHub and serving it via https://rawgit.com/).
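For reference, here is a minimal sketch of the relevant manifest entry, assuming a hypothetical add-in served from a local web server (the URL is only an illustration, not the tutorial's actual value):

<!-- SourceLocation must point to an absolute URL on a web server, not a relative or file-system path. -->
<DefaultSettings>
  <SourceLocation DefaultValue="https://localhost:3000/Home.html"/>
</DefaultSettings>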
I made a program that counts how many color pages are in an uploaded PDF file, using Ghostscript. It worked perfectly in my local environment, but I can't put the site online, as the web hosting service I use tells me that any application that needs root access cannot be installed on the server. Is that the case with Ghostscript, and if so, is there any other way I can use it?
I'm new to all this, and I know now that I should have asked the web hosting service first thing, but I'd really like not to have to put my work in the bin over this!
No, Ghostscript doesn't require root privileges, and I have no idea why they would claim it does. Also, I would strongly urge you to review the AGPL license.
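For illustration, here is a rough sketch of counting color pages with Ghostscript's inkcov device from an ordinary, unprivileged process; it assumes the gs binary is on the PATH and that the uploaded file is called uploaded.pdf (both placeholders):

import re
import subprocess

# Run Ghostscript's inkcov device as a normal (non-root) user; it prints
# per-page CMYK ink coverage to stdout instead of writing an output file.
result = subprocess.run(
    ["gs", "-q", "-o", "-", "-sDEVICE=inkcov", "uploaded.pdf"],
    capture_output=True, text=True, check=True,
)

color_pages = 0
for line in result.stdout.splitlines():
    match = re.match(r"\s*([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+CMYK", line)
    if match:
        c, m, y, _k = (float(v) for v in match.groups())
        if c > 0 or m > 0 or y > 0:  # any non-black ink means a color page
            color_pages += 1

print(f"Color pages: {color_pages}")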
I am working on a project that has us deploying to an Azure Web Site.
The code is overall working and now we are focusing more on security.
Right now we are having an issue where back-end configuration files are visible via a direct URL.
Examples (links won't work):
https://myapplication.azurewebsite.net/foldername/FileName.xml (this file is in a folder contained within the root application)
https://myapplication.azurewebsite.net/vApp/FileName.css (this file is part of a virtual application subfolder)
I have found this to be true with multiple extensions and locations.
Extensions like:
.css
.htm
.xml
.html
the list likely goes on
I understand that certain files are downloaded to the client side and that those can't be stopped. However, back-end XML files are something we don't pass to the client (especially if they contain connection strings).
I did read a similar article, Azure App Service Instrumentation Profiling?
However, this didn't directly relate to my issue.
Any insight would be extremely helpful.
Do not store sensitive information in flat files, especially under your site root. Even if you get your web.config just right, you're still one botched commit away from disaster.
Use Application Settings instead; that's what they're for.
https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-configure
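As a sketch of the idea: App Service injects each Application Setting into the site's process as an environment variable, so the secret never has to live in a file under the site root. The setting name STORAGE_CONNECTION_STRING below is hypothetical, not something from your app.

import os

# App Service exposes Application Settings to the running site as environment
# variables, so the value is configured in the portal rather than in a file.
connection_string = os.environ.get("STORAGE_CONNECTION_STRING")
if connection_string is None:
    raise RuntimeError("STORAGE_CONNECTION_STRING is not set in the App Service Application Settings")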
For a project I am currently working on, I need to create a setup application for an existing desktop application. The setup application will be downloaded from a website, and will download required files to the correct locations. When the application is started, it will look for newer versions of these files, download them if any exist, then start the application.
I am using Visual Studio Online with TFVC, linked to Azure. I have a test application set up so that when I trigger a build, Release Management finds the build directory, and moves the files to Azure Blob Storage, but prepends a GUID to the file names being transferred. So what I have in my storage container is:
{Some GUID}/2390/Test.exe
{Some GUID}/2389/Test.exe
{Some GUID}/2387/Test.exe
...
What I want in my container is the latest version of Test.exe, so I can connect to the container, and determine whether I want to download or not.
I have put together a NullSoft installer that checks a website, and downloads files. I have also written a NullSoft "launcher" that will compare local file versions with versions on the website (using a version xml file on the website), and download if newer, then launch the application. What I need to figure out is how to get the newer files to the website after a build, with automation being one of the goals.
I am an intern, and new to deployment in general, and I don't even know if I'm going about this the right way.
Questions:
Does what I am doing make sense for what I am trying to accomplish?
We are trying to emulate ClickOnce functionality, but can't use ClickOnce due to the fact that the application dynamically loads a number of DLLs. Is there a way to configure ClickOnce to include non-referenced DLLs?
Is there a best practice for doing what I'm describing?
I appreciate any advice, links to references, or real-world examples.
You mention ClickOnce, which you investigated but can't use. Have you already tried an alternative, Squirrel? With Squirrel you can specify which files should be part of the installation, so you can explicitly include DLLs even when you load them dynamically.
Link: https://github.com/Squirrel/Squirrel.Windows
Squirrel is a full framework for creating an auto-updating application and can work with Azure Blob Storage hosting (and also a CDN if you need to scale up).
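On the Blob Storage side, one way to keep "the latest Test.exe" at a stable location is to have the post-build step overwrite a fixed blob name. A minimal sketch, assuming the azure-storage-blob Python package and placeholder values for the connection string, container name, and build output path:

from azure.storage.blob import BlobServiceClient

# Overwrite a stable blob name after each build so the launcher can always
# download the latest build from one well-known URL.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(container="releases", blob="Test.exe")

with open(r"bin\Release\Test.exe", "rb") as build_output:
    blob.upload_blob(build_output, overwrite=True)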
I'm using multiple computers for development and I want to be able to store my files in my Dropbox folder. I went to change the physical path in IIS from c:\inetpub\wwwroot to the Dropbox folder, but I get this error:
The requested page cannot be accessed because the related configuration data for the page is invalid.
I couldn't find the config file, so I was wondering if anyone had done this before, or whether there is a better way to sync everything nicely across several PCs?
I tried it (IIS 7.5, Win 7) and it should work just fine to point the physical path of your site at your Dropbox folder. I would guess your web.config file contains malformed XML (see KB942055).
I'd suggest trying to map it to an empty folder with just an index.html file and seeing whether the error still occurs.
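For that test, a minimal well-formed web.config could look like the sketch below (the defaultDocument section is only there so the empty folder serves index.html; if even a minimal, valid file like this fails, the problem is elsewhere):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <defaultDocument>
      <files>
        <add value="index.html" />
      </files>
    </defaultDocument>
  </system.webServer>
</configuration>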
As a workaround, I guess you can put Dropbox in your wwwroot folder and set up a virtual directory that points to Dropbox. However, there are some security issues that may hinder you from doing so. I came across a nice tutorial on how to set up Dropbox with IIS as FTP publishing. Hope it helps.
Hodgin's guide on using Dropbox as FTP publishing.
Since 2005, when Microsoft prevented HtmlHelp from functioning off a network share, e.g.:
\\appserver\tos\PointScanner.exe
\\appserver\tos\PointScanner.chm
What are we supposed to do instead?
(Given that the application is not installed locally.)
To rephrase: What is Microsoft's intended, supported, out-of-the-box, help solution?
You can allow access via the Registry setting described here:
http://support.microsoft.com/kb/896054/
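As a rough sketch of what that registry change looks like (the value names are taken from the KB article, but please verify them there before rolling anything out; the share path is just the example from the question):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\HTMLHelp\1.x\ItssRestrictions]
; Raise the maximum allowed zone only as far as needed (1 = local intranet).
"MaxAllowedZone"=dword:00000001
; Explicitly whitelist the share that hosts the .chm file.
"UrlAllowList"="\\\\appserver\\tos;file://\\\\appserver\\tos"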
If you don't want to open any security vulnerabilities by modifying registry settings, your application could also create a local copy of the .chm file, e.g. in the user's temp folder (%TMP%), and open the help from there. You can remove the file again when your application exits (in case you don't want to leave anything behind on the user's workstation).
I started with the registry change mentioned by divo. Eventually I moved from network-folder-based .chm files to actual "HTML help". This was easy for me since I use RoboHelp, which can generate either format from the same source.