Excel Add-In Requires Re-Installation to Update Files Even When Cache Is Being Updated

I currently have a few users set up to use my Add-In via sideloading from a network share. Today I made some changes to a JS file and tested with a user to see whether their computer/Add-In picked up the update, and it did not.
First I tried just running the Add-In; that didn't work.
Then I tried hitting "refresh" --> "Add" to "re-install" the Add-In, but that didn't work either.
I had to completely wipe the user's installation and start from scratch; that worked.
I figured this was something I could set/adjust in my nginx web server, so I set out to read and test. During that testing, I noted in dev tools that the browser was in fact getting 200 responses and fetching updated files whenever they changed on the nginx server.
This tells me that Excel itself is somehow caching the files outside of the browser's awareness, and I'm at a loss as to what to do.
I will NOT change my file names for each deployment, as that is no good for git and I prefer keeping the same file names.
Is there some other way I can push users' Excel to fetch updated code?
Note: users are 100% Windows clients with Office 365.
Update:
I need a server-side solution; I don't have the ability to run a script on end users' computers to clear the cache/re-install with each update.

There are a few ways you can get the job done:
You can set the Cache-Control header for your JavaScript files to
"no-cache" or "no-store", and the client should then fetch them from the server every
time. You can do this by adding the following lines to your nginx
configuration file (note the escaped dot, so the regex matches a literal period):
location ~ \.js$ {
    add_header Cache-Control "no-cache";
}
Use versioned file names for your JavaScript files. This will force
the client's browser to fetch the updated files, because they will
have a different file name than the ones that are cached. For
example, you can use a version number or a timestamp in the file
name, like "script-v1.js" or "script-2022-01-01.js" (see the nginx sketch at the end of this answer).
Use a service worker to control caching explicitly.
Use a different mechanism for updating the Add-In, such as
Microsoft's Office Add-In Deployment API or the Office 365
Management APIs.
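If you go with versioned file names, a minimal nginx sketch that pairs them with long-lived caching could look like this (the "-v<number>" naming scheme is an assumption for illustration):
# Versioned files never change, so cache them aggressively;
# a new file name is what busts the cache.
location ~ "-v\d+\.js$" {
    add_header Cache-Control "public, max-age=31536000, immutable";
}
# All other JS: always revalidate with the server.
location ~ \.js$ {
    add_header Cache-Control "no-cache";
}
Note that nginx evaluates regex locations in order, so the versioned rule must appear before the catch-all.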

Related

IIS 10 Static Website: Deleting default site and creating completely new site (how to access new site)

This post needs help from experienced IIS administrators, but must be explained in detail for EXTREME newbies.
What I am doing:
I have two computers, both running Windows 10. One is a desktop and one is a laptop.
IIS is enabled on both computers. Each computer can access the IIS web server on the other and pull up a page from it, using the IP address.
There is no DNS and no hosts file involved (this is by IP address only), nor do I want to use any sort of naming.
Both computers are running an identical website, and the website files are in a different directory than the default. The structure is like this:
C:\inetpub\ROOT\myWebsite\myIndex.html
web.config
Changes I've made - now a few problems:
On both computers I have deleted the DefaultAppPool and the default website that comes installed with IIS. This has not stopped the website from working, so adding those back seems unlikely to fix my problem.
I have deleted my application pool and website from IIS (never deleting the actual files from the file system) several times, and added them back several times. Each time I do this, my site comes back, but with the same problem I am having.
I have deleted all of the default documents, and the only default document listed in IIS is myIndex.html.
myIndex.html initially displays a graphic image (using the standard <img> tag), and this image comes up. Sort of. See explanation below.
The problem I am having
Before I started this project, I had IIS working on the desktop with the default site and app pool, and I simply added some of my own files with really simple text content and some pics. I had replaced the default IIS splash image with my own image, and all of that worked with no problem.
The image that comes up is a link to another page with a list of links to other stuff in my website. It all works, no problem there.
Now, with the setup I have now, if I pull up my website locally on the desktop I was originally using (in the paragraph above), myIndex.html loads in the browser, my image comes up, and everything works fine.
The same is true on the laptop when I access the site locally.
However, if I attempt to access the desktop site (using its IP address) from the laptop, it pulls up the old splash image from the default site I deleted. (I left those files there even though I deleted the site from within IIS.) All those files are in the default location C:\inetpub\wwwroot.
If I move those files to another directory, thus leaving C:\inetpub\wwwroot completely empty, then when I access the site on the desktop (via the IP address) from the laptop, my new site comes up without a problem.
While it seems I may have solved my problem by moving the files from the previous project, doing that does not teach me how IIS actually works, or why files from a website that no longer exists in IIS are still being served to remote computers.
So, please teach me something about the internal workings of IIS, and how it chooses among the different application pools and websites.
Again, please word your answers for complete newbies, because I know a little, but not enough to get real technical.
I have been reading posts on stackexchange.com and other sites, plus links to Microsoft docs, etc. That isn't helping, as those docs expect too much prerequisite knowledge and speak in terms that don't really explain things in a way I can understand.
You have described several different problems. I will try to address each of them (contrary to S/O recommendations).
First, when you make changes, and they don't seem to show up, it is usually because of caching. IIS always wants to cache files/configs. So does your web browser. So, to force an accurate test, you need to dump your browser cache and cycle IIS (to make sure it drops its cache and loads new files and configs). Start there.
Second, IIS is designed around settings inheritance, which means each app and each folder inherits settings and permissions from its parent unless you override them. Overriding can be done by files and/or IIS configs (application vs. folder); the IIS configs are the stronger of the two.
Also, the IIS config for "default files" might have come into play in your test. If you didn't set up MyIndex.html as the top-most default file, then IIS would look for other files first. In fact, if you don't have MyIndex.html in the list of default files at all, IIS would have to depend on your app to choose it as the default page (MVC routing, etc.).
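For example, the default-document order can be pinned in the site's own web.config; a minimal sketch (the file name comes from the question above, the rest is standard IIS schema):
<configuration>
  <system.webServer>
    <defaultDocument enabled="true">
      <files>
        <!-- drop every inherited default document, then allow only myIndex.html -->
        <clear />
        <add value="myIndex.html" />
      </files>
    </defaultDocument>
  </system.webServer>
</configuration>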

How to clear client side browser cache for Excel 2016 Task Pane add-in on Mac?

This is a follow-up to a previous question, where the answer to "How do I ensure I see the latest JS code in my Task Pane add-in?" involved controlling the client-side caching behavior via server-added meta/no-cache tags (or versioning the server resources).
However, I am looking for a manual way, on the client, for the end user to clear out the client-side cache that appears to be storing JavaScript files and preventing an updated JS file on the server from being used by the Task Pane add-in. During development I'll be updating the JS resources frequently on the server, and I am looking for a client-side solution that allows those updated files to be used.
Environment: Desktop version of Excel 2016 running on Mac (OSX 10.11.5)
Task Pane add-in using v1.2 of the Excel/Office.js.
Scenario: deploy add-in artifacts to web server, run add-in on Mac. Then update code in foo.js in add-in, re-deploy to web server. Run add-in and see old (pre-update) behavior from foo.js.
What I have tried:
On the same Mac, loaded foo.js directly from the web app in Safari. I can see the changes in the js code that I expect to be in the updated version.
Cleared the Safari cache (Privacy > Remove All Website Data) (I suspected that this would not work based on #1 - Safari does not appear to share a cache with Excel but worth a shot) - did not change anything.
Poked around under ~/Library/Containers/com.microsoft.Excel trying to find a cache - deleted ~/Library/Containers/com.microsoft.Excel/Data/Library/Caches/com.microsoft.Excel - no help.
Used the Reload menu item from the Task Pane's context menu (looks like [i] on the Mac) - no difference: still seeing old foo.js.
Where are the JavaScript files referenced by an Excel (desktop) 2016 Task Pane add-in stored? (on the Mac) and how can the end user remove them?
I stumbled upon com.Microsoft.OsfWebHost while poking around with the "defaults" command from a Terminal on the Mac. Some googling turned up an article which basically provided the answer. In a slight refinement to what is instructed there, this is what I did:
ensured that I had quit Excel
in Finder, navigated to ~/Library/Containers/com.Microsoft.OsfWebHost/Data/Library/Caches
renamed (or deleted) the folder named "com.Microsoft.OsfWebHost" there
opened my workbook in Excel (it already had the add-in inserted)
nothing seemed to be loading, so I used the Reload command from the context menu in the Task Pane
Now I could see that my updated version of the JavaScript file had been loaded.
This is based on empirical evidence only, no corroboration from Microsoft or Apple doc, so your mileage may vary.
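For convenience, the same cleanup can be scripted from Terminal (a sketch based only on the empirical paths above):
# Quit Excel first; it recreates the cache folder on the next launch.
rm -rf ~/Library/Containers/com.Microsoft.OsfWebHost/Data/Library/Caches/com.Microsoft.OsfWebHost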
BTW, there are some interesting "defaults" properties for the com.Microsoft.OsfWebHost:
defaults read com.Microsoft.OsfWebHost
{
WebKitCacheModelPreferenceKey = 1;
WebKitDebugFullPageZoomPreferenceKey = 1;
WebKitPluginsEnabled = 0;
WebKitUsesPageCachePreferenceKey = 0;
}
Googling for WebKitCacheModelPreferenceKey did not turn up any official documentation, but there seem to be suggestions that setting it to 0 might suppress caching.
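If you want to experiment with that (untested; an assumption based purely on the key name):
defaults write com.Microsoft.OsfWebHost WebKitCacheModelPreferenceKey 0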

How to specify Excel default proxy when streaming Excel via VCS

I would like to use Excel with PowerPivot to read data from an SSRS report. When I try to access the report using atomsvc, the connection works fine, but I get the error: "The remote server returned an error: (407) Proxy Authentication Required."
Some googling showed me that I need to use a config file (Excel.exe.config), placed in the same directory as Excel.exe, to tell Excel to use the default credentials. My problem is that the company I'm working at uses thin clients and VCS, so Excel with PowerPivot is not present on my machine; it is instead streamed from a server on launch.
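For reference, the config people pointed me to is the standard .NET defaultProxy setting, something like this in Excel.exe.config:
<configuration>
  <system.net>
    <!-- pass the logged-on user's credentials to the proxy -->
    <defaultProxy useDefaultCredentials="true" />
  </system.net>
</configuration>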
Is there any other way to specify the configuration credentials to Excel? I've tried startup switches, but there doesn't seem to be one for this. Perhaps there is a way to configure the streamed application on the VCS side? My alternative is to tap into the SSRS Web Service, but that would be a lot more work, and more complex for whoever supports this after me. Another option could be to add a registry entry?

Deployment race condition causing CDN to cache old or broken files

Our current deploy process goes something like this:
Use grunt to create production assets.
Create a datestamp and point files at our CDN (e.g. /scripts/20140324142354/app.min.js).
Sidenote: I've heard this process called "versioning" before but I'm not sure if it's the proper term.
Commit the build to GitHub.
Run git pull on the web servers to retrieve the new code from GitHub.
This is a node.js site and we are using forever -w to watch for file changes and update the site accordingly.
We have a route setup in our app to serve the latest version of the app via /scripts/*/app.min.js.
The reason we version like this is because our CDN is set to cache JavaScript files indefinitely and this purposely creates a cache miss so that the code is updated on the CDN (and also in our users' browsers).
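For context, a simplified sketch of the kind of route described (assuming Express, which the question doesn't actually specify):
var express = require('express');
var path = require('path');
var app = express();

// The datestamp segment exists only to bust CDN/browser caches;
// every versioned URL serves whatever build is currently on disk.
app.get('/scripts/:version/app.min.js', function (req, res) {
  res.sendFile(path.join(__dirname, 'build/scripts/app.min.js'));
});

app.listen(3000);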
This works fine most of the time. But where it breaks down is if one of the servers lags a bit in checking out the new code.
Sometimes a client hits the page while a deploy is in progress and tries to retrieve the new JavaScript code from the CDN. The CDN tries to retrieve it but hits a server that isn't finished checking out the new code yet and caches an old or partially downloaded file causing all sorts of problems.
This problem is exacerbated by the fact that our CDN has many edge locations and so the problem isn't always immediately visible to us from our office. Some edge locations may have pulled down old/bad code while others may have pulled down new/good code.
Is there a better way to do these deployments that will avoid this issue?
As a general rule of thumb:
Don't do live upgrades (unless the language supports it, but even then, think twice).
Pulling code using git pull and then waiting for the app to notice file changes sounds a lot like the '90s: uploading PHP files to an Apache web server over FTP (or SFTP if you were cool) and waiting for Apache to notice they were updated. It can't happen atomically, so of course there is a race condition. Some users WILL get a half-built, broken site.
I recommend only upgrading your live and running application while no one is using it. Hopefully you have a pool of servers behind a load balancer of some sort, which will allow you to remove them one at a time and upgrade them.
This will mean that users will be able to use both the old and the new site at the same time, depending on how and when they access it, but that is much better than not being able to access it at all.
Ideally you would be able to spin up copies of each of the web servers that you have running with the new version of the site. Check that the new version does work, and then atomically update the load balancer so that everyone gets bumped to the new site at the same time. And only once everything is verified to be working perfectly the old machines are shut down and decommissioned, or reused.
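As an illustration only (assuming an nginx load balancer, which the question does not specify), the atomic switch can be as simple as repointing an included upstream file and reloading:
# /etc/nginx/conf.d/app.conf
upstream app_backend {
    # current.conf is a symlink pointing at either old-fleet.conf or new-fleet.conf,
    # each containing the 'server host:port;' lines for that fleet
    include /etc/nginx/upstreams/current.conf;
}
server {
    listen 80;
    location / {
        proxy_pass http://app_backend;
    }
}
Swap the symlink to the verified new fleet and run nginx -s reload: in-flight requests finish on the old workers while new requests go to the new fleet.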
Step 4 in your procedure should be:
# export a clean copy of the code into a fresh timestamped directory
git archive --remote $yourgithubrepo --prefix=$timestamp/ | tar -xf -
stop-server
# atomically repoint the 'current' symlink at the new build
# (-n keeps ln from descending into the existing symlink's target)
ln -sfn $timestamp current
start-server
Your server would use the current directory (well, a symlink) at all times; no matter how long the deploy takes, your application is in a consistent state.
I'll go ahead and post the far-from-ideal monkey-patch that we're using right now.
We deploy once, which may or may not go as planned. Once we're sure the code is deployed on all the servers, we do another build where the only thing that changes is the version number.
Then we deploy again, server by server.
The race condition still exists, but because the application code between the two versions is the same, it masks the issue: no matter which server the CDN hits, it gets the "latest" code.

How to backup and restore IIS configuration from script

I'm writing a script that sets up a lot of different applications in Windows (mainly SVN and open-source servers for HTTP, DNS, mail, FTP and DB). This script is intended to be executed on new/clean Windows workstations for new developers; it automatically sets everything up to create an environment very similar to the one in production. After it's executed, everything runs locally and the developer can start working right away.
This not only helps new developers, but also existing developers: whenever there are changes in the whole system, everything is replicated locally.
The one thing I'm still not able to do is make some kind of backup of an IIS server that is running a web app (it's on the Prod server) and restore it automatically on the new developer's machine, so he doesn't have to install/configure IIS locally.
I've read about using appcmd.exe to create and restore backups, but that works only on the same machine (it uses encryption keys, and those keys change between computers).
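For reference, the appcmd commands in question look like this ("MyBackup" being a placeholder name):
%windir%\system32\inetsrv\appcmd.exe add backup "MyBackup"
%windir%\system32\inetsrv\appcmd.exe restore backup "MyBackup"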
Is there a way, a scriptable way, to take everything IIS related from one server and restore it on another server, without user intervention and having the restored IIS run exactly as the original?
Thanks in advance!
Francisco
Just putting this here so anyone who comes across this will understand why it wasn't answered. A website has a massive number of variables associated with it, which prevents any easy method of copying all of its configuration with one, or even just a few, cmdlets.
To get started, though, you would want to become very familiar with the applicationHost.config file and how to access the properties within it using Get-WebConfigurationProperty. One way to get familiar with scripting against web-configuration properties is to use the Configuration Editor in IIS. Whenever you make a change in the Configuration Editor, before committing the change there is a nifty little link titled Generate Script, which has a PowerShell tab you can use to gather the proper Get/Set commands for the configuration elements within applicationHost.config.
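As a tiny illustration of that pattern (these are real WebAdministration cmdlets, but the filter shown is just one of the many sections you would need to cover):
Import-Module WebAdministration

# Read the default-document list for a site
Get-WebConfigurationProperty -Filter "system.webServer/defaultDocument/files" -PSPath "IIS:\Sites\Default Web Site" -Name Collection

# Write a single property back (here: disabling directory browsing)
Set-WebConfigurationProperty -Filter "system.webServer/directoryBrowse" -PSPath "IIS:\Sites\Default Web Site" -Name enabled -Value $false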
I've created something almost exactly like what the OP is looking for; it spans 4 modules (over 20,000 lines of code) and has a SQL backend that holds all of the configuration elements.
A website involves everything from underlying DLLs that may need to be registered, ISAPI/CGI restrictions and ISAPI filters, and accounts tied to the AppPool that may need to be added to certain local groups on the server, to secure bindings that require a certificate to be loaded on the server. You can see that this isn't a simple undertaking (and these are just a small portion of the variables a website may contain).
There is, however, a large set of cmdlets in the WebAdministration module that Microsoft provides out of the box, which you can leverage to aid in developing something like this. I know this is four years old, but I hope anyone who stumbles on this will find the above useful.
