On my Windows 10 machine it's 3.5GB. What is it storing? How can I trim it down?
It is storing all the downloaded crates you have used, their binaries, the index of registries, etc. It is going to take a lot of space and will keep increasing.
You can safely remove .cargo/registry/ (but don't remove the entire .cargo folder, or you will lose your installed binaries and your Cargo configuration, if you had one). Afterwards, everything you use again will be re-downloaded and the folder will start growing back. Deleting it is a common way of getting rid of very old dependencies you are no longer using.
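A minimal sketch of that cleanup, assuming the default CARGO_HOME location (installed binaries in .cargo/bin and your configuration are left untouched):

```shell
# Clear Cargo's download caches only; crates and the registry index
# are re-downloaded on demand by the next build.
CARGO_HOME="${CARGO_HOME:-$HOME/.cargo}"

du -sh "$CARGO_HOME/registry" 2>/dev/null   # see what it currently uses
rm -rf "$CARGO_HOME/registry"               # safe: re-created as needed
```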
Related
I built ArangoDB 3 on CentOS 7.4 using the default configuration. Now a 500GB disk has been used up, even though the amount of data is not large. How can I solve this problem?
If you want to compile ArangoDB yourself (instead of using our precompiled binary packages), you only have to keep the source and the build folder until you have run
make install
After that you can remove them to free up disk space.
Please note that after removing the source and build folders you will have to download the source again (e.g. git clone https://github.com/arangodb/arangodb.git) and recompile it completely, which will take longer than an incremental build.
These two folders take up significant space on my hard disk
%USER_HOME%/.gradle
%USER_HOME%/.AndroidStudio3.1
Can they be safely deleted? If not, what subfolders can I safely delete?
Well, it's "safe" in that it won't blow up your computer. But the .gradle folder is a dependency cache - every library your app needs is stored there so that you can actually build / test / run your code. It's to be expected that it'll take up a lot of space. You might get some mileage out of deleting it and re-downloading the dependencies, especially if there are a lot of old dependencies in there that you're no longer using, but as you keep working it's going to get re-created and start filling back up again.
The Android Studio folder is a bit similar - it's not a dependency cache (lots of different libraries won't pile up there), but it's still necessary for you to actually build your code. If you delete it, you're just going to have to reinstall things there to get your code working again.
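If you do want to reclaim space from the Gradle cache, a sketch that targets only the downloaded-dependency portion (Unix-style path; on Windows the folder is %USERPROFILE%\.gradle; modules-2 is the subfolder Gradle uses for downloaded artifacts):

```shell
# Delete only Gradle's downloaded-dependency cache; it is rebuilt on the
# next build, so the only cost is re-download time.
rm -rf ~/.gradle/caches/modules-2
```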
I recently started using BitBake for building Yocto. Every time I build, it consumes more space, and currently I'm running out of disk space. The images are not getting overwritten; a new set of files with timestamps is created for every build. I have deleted old files from build/tmp/deploy/images/, but it doesn't make much difference to the free disk space. Are there any other locations from which I can delete stuff?
The error I observe during build is:
WARNING: The free space of source/build/tmp (/dev/sda4) is running low (0.999GB left)
ERROR: No new tasks can be executed since the disk space monitor action is "STOPTASKS"!
WARNING: The free space of source/build/sstate-cache (/dev/sda4) is running low (0.999GB left)
ERROR: No new tasks can be executed since the disk space monitor action is "STOPTASKS"!
WARNING: The free space of source/build/downloads (/dev/sda4) is running low (0.999GB left)
ERROR: No new tasks can be executed since the disk space monitor action is "STOPTASKS"!
Kindly suggest some pointers to avoid this issue.
In order of effectiveness and how easy the fix is:
Buy more disk space: Putting $TMPDIR on an SSD of its own helps a lot and removes the need to micromanage.
Delete $TMPDIR (build/tmp): old images, old packages and workdirectories/sysroots for MACHINEs you aren't currently building for accumulate and can take quite a lot of space. You can normally just delete the whole $TMPDIR once in a while: as long as you're using sstate-cache the next build should still be pretty fast.
Delete $SSTATE_DIR (build/sstate-cache): If you do a lot of builds sstate itself accumulates over time. Deleting the directory is safe but the next build will take a long time as everything will be rebuilt.
Delete $DL_DIR (build/downloads): If you use a build directory for a long time (while pulling updates from master or changing to newer branch) the obsolete downloads keep taking disk space. Keep in mind that deleting the directory will mean re-downloading everything. Looking at just the largest files and deleting the old versions may be a useful compromise here.
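The deletion order above can be sketched as follows, run from the build directory and assuming the default Poky layout (tmp/, sstate-cache/, downloads/); the riskier steps are left commented out:

```shell
# Cheapest to lose: sstate-cache restores most of tmp/ quickly.
rm -rf tmp

# Only if sstate itself has grown too large (next build is a full rebuild):
# rm -rf sstate-cache

# Last resort (everything will be re-downloaded):
# rm -rf downloads
```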
There are official ways to reclaim space, instead of deleting things by hand.
By deleting things manually you could be forcing unnecessary rebuilds and downloads. Some elements of the build may not be controlled by BitBake, and you can find yourself in a situation where you cannot easily rebuild those items.
With these recommendations, you can beat the unwritten 50GB-per-build Yocto rule:
Check your IMAGE_FSTYPES variable. In my experience it is safe to delete the image files that are neither symlinks nor symlink targets. Keep the most recently generated ones, so you don't break the links to the last build, and keep anything related to bootloaders and configuration files, as they may be regenerated only rarely.
If you are keeping more than one build with the same set of layers, then you can use a common download folder for builds.
DL_DIR ?= "common_dir_across_all_builds/downloads/"
To keep your deploy directory (tmp/deploy) clean:
RM_OLD_IMAGE: reclaims disk space by removing previously built versions of the same image from the images directory pointed to by the DEPLOY_DIR variable. Set this variable to "1" in your local.conf file to remove these images:
RM_OLD_IMAGE = "1"
IMAGE_FSTYPES: remove the image types that you do not plan to use; you can always enable a particular one when you need it:
IMAGE_FSTYPES_remove = "tar.bz2"
IMAGE_FSTYPES_remove = "rpi-sdimg"
IMAGE_FSTYPES_remove = "ext3"
For tmp/work, you do not need to keep the work files of all recipes. You can specify which ones you are interested in for your development.
RM_WORK_EXCLUDE:
With rm_work enabled, this variable specifies a list of recipes whose work directories should not be removed. See the "rm_work.bbclass" section for more details.
INHERIT += "rm_work"
RM_WORK_EXCLUDE += "home-assistant widde"
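A minimal local.conf fragment collecting the settings above (the recipe names in RM_WORK_EXCLUDE and the removed image types are this answer's examples; adjust them to your own builds):

```
INHERIT += "rm_work"
RM_WORK_EXCLUDE += "home-assistant widde"
RM_OLD_IMAGE = "1"
IMAGE_FSTYPES_remove = "tar.bz2 ext3"
DL_DIR ?= "common_dir_across_all_builds/downloads/"
```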
Having installed a few packages with a large number of dependencies via cabal install, I now have several hundred megabytes of source files in my ~/.cabal/packages/hackage.haskell.org directory. As I'm trying to work on a small SSD, space is at a premium for me. Can I safely remove these, or will doing so cause failures later on?
Removing ~/.cabal/packages/hackage.haskell.org won't cause any failure, but cabal-install will re-download the huge 00-index.tar the next time you try compiling something, and this single file is 80+% of the folder's size. It's the index of the whole Haskell universe, now around 200MB, and it will presumably keep growing without bound in the future.
Compiled libraries and executables won't be affected, so if you are not going to build anything more it's fine to remove the whole folder.
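As a sketch, the deletion itself is a one-liner; compiled libraries and installed executables are stored elsewhere and survive it:

```shell
# Drop the downloaded package sources and the Hackage index.
# The index is re-fetched by the next `cabal update` / install.
rm -rf ~/.cabal/packages/hackage.haskell.org
```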
Evening all,
I am playing with Orchard CMS and I have a quick question. I keep my source code on a 10GB partition on my PC; I downloaded the source for orchard (~40MB) and placed it on that drive.
I started Visual Studio, opened the solution and started a build. Realising quite quickly that it was going to take some time, I went off and got a drink, and came back to find it had errored out of the build and that the last 3GB of disk space on my dev drive had been filled. This can't be normal, can it?
Does anyone know how much free disk space I'll need to build orchard from the source? I am limited by the size of the SSD in my laptop and I'm not going to upgrade just so I can use orchard!
The problem is that the vanilla source projects don't disable "Copy Local" for references. Therefore every project in the solution creates copies of all its references in its bin folder. This obviously isn't necessary here and inflates the size dramatically (since these references are shipped together anyway, it's better if they are included just once).
You have 2 options:
(Recommended) Don't compile the source. I've been writing modules on top of the precompiled version and never needed to make changes to the core source; that may do more harm than good. But if you really need to compile:
Force references to not copy locally, either manually for every single reference in every single project, or find a macro or some VS magic to enforce it globally.