Is it safe to delete the %USER_HOME%/.gradle and %USER_HOME%/.AndroidStudio3.1 folder in Android Studio? - android-studio

These two folders take up significant space on my hard disk:
%USER_HOME%/.gradle
%USER_HOME%/.AndroidStudio3.1
Can they be safely deleted? If not, what subfolders can I safely delete?

Well, it's "safe" in that it won't blow up your computer. But the .gradle folder is a dependency cache - every library your app needs has to be stored there for you to actually build, test and run your code, so it's expected to take up a lot of space. You might get some mileage out of deleting it and letting Gradle re-download only the dependencies you still use, especially if there are a lot of old dependencies in there that you're no longer using, but by the nature of your work it's going to get re-created and start filling back up again.
The Android Studio folder is a bit similar - it's not a dependency cache in the sense that lots of different libraries get installed there, but it holds the IDE's settings, caches and plugins, and you still need it to actually build your code. If you delete it, you'll just have to reinstall and reconfigure things there before your project builds again.
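If you want to reclaim space without starting completely from scratch, a minimal sketch (assuming a default Gradle layout; Linux/macOS paths shown, substitute %USERPROFILE% on Windows):

du -sh ~/.gradle ~/.AndroidStudio3.1     # see how big each folder actually is
rm -rf ~/.gradle/caches                  # dependency/build caches; re-downloaded on the next build
# leaving ~/.gradle/wrapper in place avoids re-downloading Gradle distributions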

Related

Why is the ~/.cargo directory so big?

On my Windows 10 machine it's 3.5GB. What is it storing? How can I trim it down?
It is storing all the downloaded crates you have used, their binaries, the index of registries, etc. It is going to take a lot of space and will keep increasing.
You can safely remove .cargo/registry/ (but if you remove the entire .cargo folder instead, you will lose your installed binaries and your Cargo configuration, if you have one). Afterwards, everything you use again will be re-downloaded and the directory will start growing back. This is a common way of getting rid of very old dependencies you are no longer using.
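A minimal sketch of the same clean-up from the command line (assuming the default ~/.cargo location):

du -sh ~/.cargo/registry     # cached crate sources plus the registry index
rm -rf ~/.cargo/registry     # re-downloaded on demand; ~/.cargo/bin and your config are untouched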

Meaning warning "File is touched by more than one package"

I am building a simple Linux kernel image with Buildroot and adding a small driver I've written myself. I created the Config.in file and drivername.mk so the driver can be selected in make menuconfig, and that works successfully.
When executing make to build the image, compilation goes fine until my driver starts to compile. It appears to compile and create the image correctly, but I get lots of warnings saying that different files in ./lib/gcc/arm-buildroot-linux-uclibcgnueabihf/ are touched by more than one package: [u'host-gcc-initial', u'host-gcc-final'].
Can anyone explain a bit about this issue and what is causing it? Do you need any more info to work out what is happening? Is it safe to ignore these warnings?
Thanks beforehand
Actually, doing a search on 'touched by more than one package', I found http://lists.busybox.net/pipermail/buildroot/2017-October/205602.html, where we find that this warning can safely be ignored if you're not doing a parallel build and aren't a kernel maintainer.
That said, if you're submitting code for inclusion in the Linux kernel, please be a good citizen and make sure you identify all of the things your code is dependent upon. (I'm not actually an active kernel hacker, so I don't know what method they're using for this right now.)
The basic idea is that compiling things involves a bunch of steps that need to happen in a logical order. In a small project, we simply declare the dependencies we know about, because we also wrote the code that created them. But in a project the size of the kernel, you can guarantee that not everyone does this. Some people only add a dependency when it's needed for the build to work at all - and if the default build order happens to satisfy it, years can go by before anyone notices the dependency was never declared, at which point it causes grief for whoever tries to update just that one piece, because the code that silently depended on it doesn't get rebuilt.
When you build in parallel, on the other hand, things get much more complicated. Now every dependency really does need to be specified, because there is no longer any inherent, dependable order. Some people will still build serially, others will use two jobs, I'll use 8, and I've worked in groups inclined to use 30 because they're on a 32-processor machine that isn't busy during off hours. Suddenly a file you need, which comes from a directory that normally got processed 30 directories before yours, is being built at the same time as the file that needs it - because you never listed the dependency, and nothing guarantees those 30 directories have finished before yours starts.
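As a toy illustration of why the missing dependency matters (a hypothetical Makefile, not Buildroot's actual rules; assume main.c does #include "generated.h", and note that recipe lines must be indented with a tab):

all: generated.h main.o
generated.h:
	echo '#define VALUE 42' > generated.h
main.o: main.c
	$(CC) -c main.c -o main.o

With a serial make this happens to work, because the prerequisites of all are built left to right. With make -j2 both targets start at once and the compile can run before generated.h exists. Declaring the dependency explicitly (main.o: main.c generated.h) keeps the order correct no matter how many jobs are used.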

Bitbake build consumes more space

I recently started using BitBake for building Yocto. Every time I build, it consumes more space, and currently I'm running out of disk space. The images are not getting overwritten; a new set of files with a timestamp is created for every build. I have deleted old files from build/tmp/deploy/images/, but it doesn't make much difference to the free disk space. Are there any other locations I can delete stuff from?
The error I observe during build is:
WARNING: The free space of source/build/tmp (/dev/sda4) is running low (0.999GB left)
ERROR: No new tasks can be executed since the disk space monitor action is "STOPTASKS"!
WARNING: The free space of source/build/sstate-cache (/dev/sda4) is running low (0.999GB left)
ERROR: No new tasks can be executed since the disk space monitor action is "STOPTASKS"!
WARNING: The free space of source/build/downloads (/dev/sda4) is running low (0.999GB left)
ERROR: No new tasks can be executed since the disk space monitor action is "STOPTASKS"!
Kindly suggest some pointers to avoid this issue.
In order of effectiveness and how easy the fix is:
Buy more disk space: Putting $TMPDIR on an SSD of its own helps a lot and removes the need to micromanage.
Delete $TMPDIR (build/tmp): old images, old packages and work directories/sysroots for MACHINEs you aren't currently building for accumulate and can take quite a lot of space. You can normally just delete the whole $TMPDIR once in a while (see the commands after this list): as long as you're using sstate-cache the next build should still be pretty fast.
Delete $SSTATE_DIR (build/sstate-cache): If you do a lot of builds sstate itself accumulates over time. Deleting the directory is safe but the next build will take a long time as everything will be rebuilt.
Delete $DL_DIR (build/downloads): If you use a build directory for a long time (while pulling updates from master or changing to newer branch) the obsolete downloads keep taking disk space. Keep in mind that deleting the directory will mean re-downloading everything. Looking at just the largest files and deleting the old versions may be a useful compromise here.
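A minimal sketch of checking and reclaiming the big three from a shell (paths assume the default build directory layout used above):

du -sh build/tmp build/sstate-cache build/downloads
rm -rf build/tmp              # safe; the next build is reconstructed largely from sstate
# rm -rf build/sstate-cache   # also safe, but the next build rebuilds everything from scratch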
There are some official mechanisms you can use instead of deleting things by hand.
By deleting blindly you could be forcing unnecessary rebuilds and re-downloads, and some elements of the build may not be controlled by BitBake, so you can end up in a situation where you cannot easily rebuild those items.
With these recommendations, you can beat the unwritten "50GB per build" Yocto rule:
Check your IMAGE_FSTYPES variable. In my experience it is safe to delete the image files of those types that are not symlinks or symlink targets. Keep the most recently generated ones so you don't break the last-build symlinks, and keep anything related to bootloaders and configuration files, as those may be regenerated only rarely.
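To see which deploy artifacts are plain files rather than symlinks (and how big they are), a one-liner along these lines may help (assumes the default tmp/deploy/images layout and GNU find):

find tmp/deploy/images -maxdepth 2 -type f -printf '%s\t%p\n' | sort -rn | head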
If you are keeping more than one build with the same set of layers, then you can use a common download folder for builds.
DL_DIR ?= "common_dir_across_all_builds/downloads/"
Afterwards, to keep your deploy directory clean:
RM_OLD_IMAGE: reclaims disk space by removing previously built versions of the same image from the images directory pointed to by the DEPLOY_DIR variable. Set this variable to "1" in your local.conf file to remove these images:
RM_OLD_IMAGE = "1"
IMAGE_FSTYPES: remove the image types that you do not plan to use; you can always enable a particular one when you need it:
IMAGE_FSTYPES_remove = "tar.bz2"
IMAGE_FSTYPES_remove = "rpi-sdimg"
IMAGE_FSTYPES_remove = "ext3"
For tmp/work, you do not need to keep the work files of all recipes. With rm_work enabled you can specify which ones you want to keep around for your development:
RM_WORK_EXCLUDE:
With rm_work enabled, this variable specifies a list of recipes whose work directories should not be removed. See the "rm_work.bbclass" section for more details.
INHERIT += "rm_work"
RM_WORK_EXCLUDE += "home-assistant widde"

Remove package sources from cabal directory

Having installed a few packages with a large number of dependencies via cabal install, I now have several hundred megabytes of source files in my ~/.cabal/packages/hackage.haskell.org directory. As I'm trying to work on a small SSD, space is at a premium for me. Can I safely remove these, or will doing so cause failures later on?
Removing ~/.cabal/packages/hackage.haskell.org won't cause any failures, but cabal-install will re-download the huge 00-index.tar the next time you try to compile something, and that single file is 80+% of the size of the folder. It's the index of the whole Haskell universe, now around 200MB, and it will only keep growing.
Compiled libraries and executables won't be affected, so if you are not going to build anything more it's fine to remove the whole folder.
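A minimal sketch of the clean-up (assuming the default ~/.cabal location):

du -sh ~/.cabal/packages/hackage.haskell.org
rm -rf ~/.cabal/packages/hackage.haskell.org   # the index and any sources you still need are re-fetched on the next cabal command that wants them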

Orchard CMS Disk Space

Evening all,
I am playing with Orchard CMS and I have a quick question. I keep my source code on a 10GB partition on my PC; I downloaded the source for Orchard (~40MB) and placed it on that drive.
I started Visual Studio, opened the solution and kicked off a build. Realising fairly quickly that it was going to take some time, I went off and got a drink, and came back to find the build had errored out and the last 3GB of disk space on my dev drive had been filled. This can't be normal, can it?
Does anyone know how much free disk space I'll need to build Orchard from source? I am limited by the size of the SSD in my laptop and I'm not going to upgrade just so I can use Orchard!
The problem is that the vanilla source projects don't disable "Copy Local" (Private) for references. Therefore every project in the solution copies all of its references into its own bin folder. That isn't necessary here and inflates the size dramatically, since these references are shipped together anyway and would be better included just once.
You have 2 options:
(Recommended) Don't compile the source. I've been writing modules on top of the precompiled version and have never needed to change the core source; doing so may do more harm than good. But if you really need to compile:
Force references to not copy locally, either manually for every single reference in every single project, or with a macro or some Visual Studio trick to enforce it globally (see the snippet below).
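For a single reference, the project-file change looks roughly like this (Private is the MSBuild metadata behind the IDE's "Copy Local" setting; the assembly name is only an example):

<Reference Include="Orchard.Framework">
  <Private>False</Private>
</Reference>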
