Flash File Too Large - Won't Publish - flash-cs5

I have a 105 MB Flash .fla file that needs to be published. It used to publish fine, but after a bit of tweaking it no longer publishes at all: the Flash application crashes. I assume this comes from the sheer size of the file.
Is there any workaround? A third-party Flash compiler, perhaps?

I've found that the only way to solve this problem is to save the file, move it to a different location on your hard drive, and then open it again. This solution has worked for me multiple times.

Related

What kind of .apk file is stored in <project-name>/<module-name>/app/release?

I've been looking for the signed file to upload to an online emulator, but I can't seem to find a legitimate one. I've been forced to use the .apk in //release/apk instead of the "suggested" //build/outputs/apk/.
The reason I don't use the suggested path is that there's no file whatsoever under outputs/apk (only a debug build).
What should I do? Thanks

Why is the downloaded apk file cached, giving users an old version?

We host our landing page on Azure, and it is where users download an Android apk file. The landing page is an HTML file. Here is the markup for the download link:
<a href="http://www.[mysite].com/android/[MyAndroidApp].apk">download here</a>
It all worked fine until now. Users have started to complain that the app they downloaded doesn't work properly, but when we tested it, it worked fine.
We finally found out that although the link is
http://www.[mysite].com/android/[MyAndroidApp].apk
sometimes when users click it, it resolves to
http://101.44.1.131/cloud/223.210.55.28/files/9216...636//www.[mysite].com/android/[MyAndroidApp].apk
This is a cache, and it holds an old version of our app!
Can anyone tell me why this happens and how I can prevent it from serving our old version?
How often do you update this apk file?
It may be a caching issue, but I'm not sure exactly.
Have you tried using Azure Storage? Upload the file there, and then link directly to it.
It should cost you less in the long run and not cause any buffering/cache issues.
I would suggest putting a version number after the filename. This is also good practice for .js files. The problem is very often that the file is cached and the cache isn't updated correctly; it's a general problem on the web.
So: try putting a version number after the file name, and let us know if that works.
Thank you all for your suggestions.
We have found the reason. Looking at the redirect URL, some ISPs are actually caching our apk files. They do this to save themselves money and bandwidth; it is a common practice in some countries and is well documented.
How evil it is.
Our solution is thus to change the file name every time we deploy a new version.
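For illustration, a minimal sketch of that rename-per-deploy idea as a small C# deploy step (the file names and the version argument are hypothetical):

using System;
using System.IO;

class DeployApk
{
    // Copy the freshly built apk to a version-stamped name so an ISP cache
    // can never serve a stale copy under the same URL.
    // Hypothetical paths and version source - adapt to your deploy process.
    static void Main(string[] args)
    {
        string version = args.Length > 0 ? args[0] : "1.0.0";
        string target = $"MyAndroidApp-{version}.apk";
        File.Copy("MyAndroidApp.apk", target, overwrite: true);
        Console.WriteLine("Publish link: http://www.[mysite].com/android/" + target);
    }
}

The same trick (a version suffix, or a query-string parameter) covers the .js caching case mentioned above as well.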

How to throttle bandwidth for OverGrive in Linux (Debian)?

I've installed trickle but can't seem to get it to throttle overGrive.
$ trickle -d 500 -u 100 overgrive
trickle: Could not reach trickled, working independently: No such file or directory
trickle: exec(): No such file or directory
Is there another way to get overGrive to stop sucking up all my bandwidth while syncing?
I managed to solve the issue, and my overGrive has worked fine for the last couple of weeks. It turned out that it had synchronized with some public files created by other users which had nothing to do with my Google Drive account. What all these files had in common was that they belonged to some courses and had names like "MATH&141, Mod 01: Quiz 2 Toolkit". For some reason these files didn't have a .doc extension and had the symbols & and : in their names, which seemed to make overGrive get stuck on them forever.
Anyway, I performed the following steps and it fixed the issue:
1. Download and install the latest version of overGrive.
2. Clear all trash files from Google Drive online.
3. Delete all files from your local Google Drive folder if present, delete these files, and restart overGrive:
.overgrive.lastsync
.overgrive.cache
4. Turn off automatic sync, and start synchronization manually.
5. Wait until the full synchronization is finished.
You can check the log file called .overgrive.log in your Home folder to see if there are errors during the synchronization. It can happen that overGrive blocks on some specific file and tries to synchronize it over and over again, causing heavy download/upload usage.
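For what it's worth, the state-file cleanup from step 3 could be scripted; a minimal sketch (it assumes the two files live in your Home folder and that a .NET runtime is available on Debian):

using System;
using System.IO;

class OvergriveReset
{
    // Remove overGrive's local state files before a manual re-sync.
    // File names are taken from the steps above; the Home-folder
    // location is an assumption.
    static void Main()
    {
        string home = Environment.GetFolderPath(Environment.SpecialFolder.UserProfile);
        foreach (var name in new[] { ".overgrive.lastsync", ".overgrive.cache" })
        {
            string path = Path.Combine(home, name);
            if (File.Exists(path)) File.Delete(path);
        }
    }
}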

How to stop "Failed to delete the temporary file..." warnings from VS2012 compiler

I have two rather large solutions that both experience the same problem. The issue is that I am warned about an inability to delete temporary files. The messages all look like this:
Failed to delete the temporary file
"C:\Users\Don\AppData\Local\Temp\tmp07197280428445c484ba0cda58178903.exec.cmd".
The process cannot access the file
'C:\Users\Don\AppData\Local\Temp\tmp07197280428445c484ba0cda58178903.exec.cmd'
because it is being used by another process.
I have seen suggestions of using pre-build commands to first delete things, but that is a lot of projects, and I'm not going there.
Anyone know how else I might remedy this, that does not involve "fixing" each project individually?
If it makes any difference, I'm compiling C# .NET 3.5 projects.
My idea is to write a small add-in for Visual Studio which can delete files on build. You could configure it with file paths and then just run something like this:
using System.IO;
// delete every configured temporary file
foreach (var item in paths)
    File.Delete(item);
And the config could be kept solution-wide.
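Expanding on that idea, a minimal sketch of such a cleanup (the tmp*.exec.cmd pattern comes from the warning above; running this as a post-build step is an assumption):

using System;
using System.IO;

class TempCleaner
{
    static void Main()
    {
        // The warnings name files like tmp<hash>.exec.cmd in %TEMP%
        string temp = Path.GetTempPath();
        foreach (var file in Directory.GetFiles(temp, "tmp*.exec.cmd"))
        {
            // A file still locked by the build is skipped rather than crashing
            try { File.Delete(file); }
            catch (IOException) { }
        }
    }
}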
I get that too. The problem is that the compilation system itself is holding onto the file when it attempts to delete it. I think it deletes the file afterwards anyway, as I've never seen the named files hanging around, so it's just an annoyance that can be ignored.
The files seem to contain the command that VS is running, built up from the build settings.
I assume it's a .NET thing where the GC hasn't cleaned up the object holding the file handle by the time the system attempts to delete the file. If so, it directly shows the benefit of RAII over GC :-)
A likely source of the problem is that your antivirus software is busy scanning the file in question, which prevents the rightful owner from deleting it. Curb the enthusiasm of the antivirus and your problem will be solved.
Unload the project from your solution, then reload it. It should recreate the missing files and be buildable again.
If you have installed a third-party cleaner tool and activated its active mode (always running in the background), it will lock the temp folder in AppData, so Visual Studio is unable to restore the NuGet packages on build and there will be a build error.
Try uninstalling the cleaner and restarting the system. When I had this problem, that was how I fixed it.

Safe to Run LogParser Against Live Production IIS Log?

Is it safe to run LogParser against our live production IIS log file?
Currently, I have been copying it over to another location and then running LogParser 2.2 against the log file.
Instead, I would really like to run it against the live data so that I can see changes to it immediately, however, I am a little concerned that it might cause issues.
Does anyone know if querying the live IIS logs would cause a problem?
It shouldn't cause any problems, as I don't believe it locks the file. That said, is there any harm in copying the file first just to make sure? Even if you just copy it to a local folder, a batch file could make that easy: copy the file, then run it through LogParser.
But it should be fine against live files.
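A minimal sketch of that copy-then-query approach (the log path is hypothetical, and LogParser.exe is assumed to be on the PATH):

using System;
using System.Diagnostics;
using System.IO;

class SafeLogQuery
{
    static void Main()
    {
        // Hypothetical log path - point this at your site's log directory
        string liveLog = @"C:\inetpub\logs\LogFiles\W3SVC1\u_ex250101.log";
        string snapshot = Path.Combine(Path.GetTempPath(), "iis_snapshot.log");

        // Query a snapshot instead of the live file
        File.Copy(liveLog, snapshot, overwrite: true);

        var psi = new ProcessStartInfo("LogParser.exe",
            $"-i:IISW3C \"SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM '{snapshot}' GROUP BY cs-uri-stem ORDER BY Hits DESC\"")
        { UseShellExecute = false };
        Process.Start(psi)?.WaitForExit();
    }
}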
It's definitely safe since, as Tom says, Log Parser does not lock input files. Running in production against live log files was a key target scenario.
