The extension package size exceeds the maximum package size limit - azure

I made a custom extension task in VS Code, and I'm facing a problem while uploading the extension package to the Marketplace. It shows the error "The extension package size '32440237 bytes' exceeds the maximum package size '26214400 bytes'", since my extension is about 32 MB.
When I looked deeper, I found that the node_modules folder (where all the packages live) grows whenever I install external packages such as:
azure-pipelines-task-lib
azure-pipelines-tasks-azure-arm-rest-v2
@pulumi/azure
I also tried the solution given in this, but no luck.
Is there any way to decrease or compress the size of node_modules or the extension package?
Or, why does the node_modules folder grow so much?
If anyone has knowledge of this, please let me know.
Thanks in advance!

Well, I got the answer to this.
First, the Visual Studio Marketplace's default limit for an uploaded extension package is 25 MB.
So if your extension package exceeds that limit, don't worry; just do one thing:
Email your issue to vsmarketplace@microsoft.com. Someone from the support team will contact you within 24-48 hours.
They will ask why you need the size limit raised, so give them a proper reason. Then, bingo!!
Your issue should be resolved within 2-3 days at most.
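If you would rather shrink the package than ask for a bigger limit, most of the weight usually comes from devDependencies inside node_modules. A minimal sketch, assuming the task lives in its own folder and the extension is packaged with tfx-cli (the folder name and manifest name below are just the conventional ones, not from the original question):
cd my-task                  # hypothetical task folder
npm install --production    # install runtime dependencies only, leaving devDependencies out
cd ..
tfx extension create --manifest-globs vss-extension.json
Bundling the task entry point with a bundler such as webpack can cut the size further still, since only the code that is actually referenced ends up in the package.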

Related

How does the unpacked size affect the minified size of an npm package?

I'm currently trying to reduce the bundle size of an npm package I created, and I've managed to get the unpacked size down to about 210 kB.
https://www.npmjs.com/package/@agile-ts/core/v/0.0.12 <- 210 kB
https://www.npmjs.com/package/@agile-ts/core/v/0.0.11 <- 304 kB (with comments)
One change I made was to remove all the comments with the help of the 'tsconfig' file, which reduced the unpacked size by about 100 kB, BUT the minified size stayed the same (57 kB)?
https://bundlephobia.com/result?p=@agile-ts/core@0.0.12 <- 57 kB
https://bundlephobia.com/result?p=@agile-ts/core@0.0.11 <- 57 kB (with comments)
So I was wondering how the unpacked size affects the minified size. Are comments already removed in the minified size? I found no answer to this question on the internet.
Another package I found has an unpacked size of about 325 kB
https://www.npmjs.com/package/@hookstate/core <- 325 kB
but a minified size of 16.7 kB.
https://bundlephobia.com/result?p=@hookstate/core@3.0.6 <- 16.7 kB
-> It is roughly 55% larger in unpacked size, yet about 70% smaller in minified size?
The only difference I found is that the package I just mentioned consists of 10 files, while my package consists of 66 files.
So it is smaller than my package in that sense.. but then it should be smaller in the unpacked size too.
In case you have any idea how to reduce the package size.. feel free to contribute ^^ or to give me some advice
https://github.com/agile-ts/agile/issues/106
Thank you ;D
What matters is NOT how much the package contains on disk, but how much space it takes up in the final application after all the bundling and minification is applied. That process includes renaming variables, removing comments, tree shaking, and removing unused/unreferenced code. There are tools that let you inspect the final application size and the size each dependency contributes; which one to use depends on your bundler.
It is a very bad idea to remove comments from source code just to shrink the package download size, and the same goes for removing other development-support files such as TypeScript definitions.
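For example, if the consuming application is built with webpack (an assumption; Rollup and esbuild have equivalents), you can see exactly how much of a dependency survives into the final bundle:
npx webpack --profile --json > stats.json    # emit build statistics as JSON
npx webpack-bundle-analyzer stats.json       # visualise the size each dependency contributes
A package whose unused exports are tree-shaken away can be large on disk yet contribute very little here, which is why the unpacked and minified numbers move independently.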

Limit Chromium cache size in Linux

I need to limit Chromium's cache size on my Debian machine. I've tried editing master-preferences to solve this, but every time I reopen the browser the file is restored to its original values.
How can I modify these values to get, for example, a 10 MB cache limit every time?
Absolutely. An easy fix is to add the following argument to the command:
chromium-browser --disk-cache-size=n
If n is 500000000, that would be a 500 MB cache.
You can check that the setting took effect by typing the following into your address bar and looking at the Max Size value:
chrome://net-internals/#httpCache
Please see https://askubuntu.com/questions/104415/how-do-i-increase-cache-size-in-chrome/104429#104429
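If you want the limit applied every time without retyping the flag, one option (assuming you launch Chromium from a shell; desktop launchers differ between distributions) is a small alias:
alias chromium-browser='chromium-browser --disk-cache-size=10000000'    # roughly a 10 MB cache; put this in ~/.bashrc
Alternatively, the same flag can be added to whatever launcher or .desktop entry you normally start the browser from.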

Orchard CMS Disk Space

Evening all,
I am playing with Orchard CMS and I have a quick question. I keep my source code on a 10 GB partition on my PC; I downloaded the source for Orchard (~40 MB) and placed it on that drive.
I started Visual Studio, opened the solution, and kicked off a build. I quickly realised it was going to take some time, so I went off to get a drink, and came back to find the build had errored out and the last 3 GB of free space on my dev drive had been consumed. This can't be normal, can it?
Does anyone know how much free disk space I'll need to build Orchard from source? I'm limited by the size of the SSD in my laptop, and I'm not going to upgrade it just so I can use Orchard!
The problem is that the vanilla source projects don't disable "Copy Local" (Private) for references. As a result, every project in the solution copies all of its references into its own bin folder. That isn't necessary here and multiplies the size of the build output enormously, since these references are shipped together anyway and only need to be included once.
You have two options:
(Recommended) Don't compile the source. I've been writing modules on top of the precompiled version and have never needed to change the core source; doing so may do more harm than good. But if you really need to compile:
Force references not to copy locally, either manually for every single reference in every project, or via a macro or some Visual Studio trick to enforce it globally (a sketch of the underlying project-file change follows).
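As a sketch of that project-file change (the assembly name and hint path here are placeholders), a reference with Copy Local disabled looks like this in the .csproj:
<Reference Include="SomeDependency">
  <HintPath>..\lib\SomeDependency.dll</HintPath>
  <Private>False</Private>  <!-- equivalent of Copy Local = false -->
</Reference>
Applying this across every project is what stops each bin folder from accumulating its own copy of the shared assemblies.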

How do I limit log4j files based on size (only)?

I would like to configure log4j to write only files up to a maximum specified size, e.g. 5MB. When the log file hits 5MB I want it to start writing to a new file. I'd like to put some kind of meaningful timestamp into the logfile name to distinguish one file from the next.
I do not need it to rename or manipulate the old files in any way when a new one is written (compression would be a boon, but is not a must).
I absolutely do not want it to start deleting old files after a certain period of time or number of rolled files.
Timestamped chunks of N MB log files seem like the most basic strategy I can imagine. It's so basic that I almost refuse to believe it isn't provided out of the box, yet I have been unable to figure out how to do it!
In particular, I have tried every incarnation of DailyRollingFileAppender, RollingFileAppender, and the 'Extras' RollingFileAppender. All of these delete old files once the backup count is exceeded. I don't want that; I want all my log files!
TimeBasedRollingPolicy from Extras does not fit because it doesn't have a max file size option and you cannot associate a secondary SizeBasedTriggeringPolicy (as it already implements TriggeringPolicy).
Is there an out of the box solution - preferably one that's in maven central?
I gave up trying to get this to work out of the box. It turns out that uk.org.simonsite.log4j.appender.TimeAndSizeRollingAppender is, as the young folk apparently say, the bomb.
I tried getting in touch with the author some time back to get this into Maven Central, but no luck so far.
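For reference, wiring it up looks roughly like any other log4j 1.x appender configuration. This is only a sketch: the property names below (File, DatePattern, MaxFileSize) are assumptions based on how I remember the appender's documentation, so check them against the actual docs before relying on them.
log4j.rootLogger=INFO, ROLL
log4j.appender.ROLL=uk.org.simonsite.log4j.appender.TimeAndSizeRollingAppender
log4j.appender.ROLL.File=logs/app.log
log4j.appender.ROLL.DatePattern='.'yyyy-MM-dd-HH-mm
log4j.appender.ROLL.MaxFileSize=5MB
log4j.appender.ROLL.layout=org.apache.log4j.PatternLayout
log4j.appender.ROLL.layout.ConversionPattern=%d %-5p %c - %m%n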

GoogleEarth crashed or still loading?

How do you know whether the (free) Google Earth app is still loading data or has crashed for some reason?
I'm loading a huge KML file; CPU usage sits at 100%, but it never stops processing.
Are there any limits on the size of the KML file being displayed?
How many MB of KML can the Google Earth desktop application show without crashing?
I don't think there are any file size limits in Google Earth for .kml files. There are, however, limits on the size of images, so if your KML is trying to load images (such as screen overlays), perhaps your problem lies there.
That limit depends on the specifications of your computer; you can find the maximum image resolution you can import by looking under 'About'. I am not sure where to find info about the maximum image size in kB.
Next, you can try creating a .kmz out of the .kml. A KMZ is simply a compressed archive, much like a .zip; learn more about it here:
http://code.google.com/apis/kml/documentation/kmzarchives.html
You can also try breaking the file up into smaller ones, either by using network links
http://code.google.com/apis/kml/documentation/kmlreference.html#networklink
or Regions
http://code.google.com/apis/kml/documentation/regions.html
By breaking the file up into smaller ones, you may also discover which part/element of the KML is causing trouble, if there is some kind of format error.
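As a rough idea of the network-link approach, a small parent file can simply point at the smaller pieces (the file names here are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Folder>
    <name>Split dataset</name>
    <NetworkLink>
      <name>Part 1</name>
      <Link>
        <href>part1.kml</href>
      </Link>
    </NetworkLink>
    <!-- add one NetworkLink per additional part -->
  </Folder>
</kml>
Google Earth then only has to parse the pieces it actually loads, which also makes it easier to spot which piece causes the hang.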
