I sync my Gradle projects to a USB stick and then use the stick to sync two computers. Both are used only by me; I'm just switching machines regularly (home vs. work in my case).
After this I often get "cannot resolve symbol" errors from the IDE (mostly with support-v4 classes only; compiling still works) that can't be repaired anymore...
Any ideas how to solve this? And especially how to avoid this?
What I do already
I have defined a build directory outside of the projects to avoid conflicts there caused by stale intermediate files, like the following:
allprojects {
    buildDir = "M:/tmp/${rootProject.name}/${project.name}"
}
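Since the two machines may not share the same drive layout, one possible variation (just a sketch; the buildRoot property name is my own invention) is to read the base path from each machine's gradle.properties:

    // In ~/.gradle/gradle.properties on each machine, e.g.:
    //   buildRoot=M:/tmp        (work PC)
    //   buildRoot=C:/gradle-out (home PC)
    allprojects {
        def root = hasProperty('buildRoot') ? property('buildRoot') : 'M:/tmp'
        buildDir = "$root/${rootProject.name}/${project.name}"
    }

That way neither machine needs the other's drive letters to exist.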
What I try if the problem occurs
I run Invalidate Caches / Invalidate and Restart in Android Studio => almost never helps
I delete the .idea folder from my project => sometimes helps, but it's not very convenient, as I afterwards need to import the project and set up the SDK again... and it does not always help either
After syncing my project I do a clean build.
It's not the most beautiful solution, but it's the best I've found:
if I delete the .idea/libraries folder after syncing, the issue is solved.
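To automate that deletion, a small Gradle task could do it on demand; this is only a sketch (the task name is mine, and the path assumes .idea sits in the project root):

    // Removes the generated .idea/libraries folder so the IDE
    // rebuilds its library entries on the next sync.
    task cleanIdeaLibraries(type: Delete) {
        delete "${rootProject.projectDir}/.idea/libraries"
    }

Running gradlew cleanIdeaLibraries after each sync then replaces the manual step.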
Related
I feel dumb for asking this, but I thought that GitHub and Gradle would fully manage dependencies in Android Studio.
I commented out an implementation dependency from Maven (using "//"), then invalidated caches and restarted, then cleaned and rebuilt the project.
// implementation "com.some-implementation:the-implementation:1.0.0"
And to my surprise, when I double-press Shift to search project files in IntelliJ, the file is still there and accessible.
I've been working on a personal project for some time now, and after finding out that files of unused dependencies are still present in the project, I now wonder how on earth I will get rid of them entirely.
I cannot even remember how many implementations have been used and then abandoned.
Do these files end up adding to the compile time, or to the size of the build?
I am not sure, but I think that even a clean Gradle build won't delete the old files. If you can locate the Android cache, usually at Users/%USER_NAME%/.android/cache-build, you can safely delete all the unwanted jars.
My naïve understanding, though, is that even after deleting the Gradle cache and the Android cache you will keep seeing those files in Search Everywhere; that is probably related to the search mechanism.
But I have no idea why this would be an issue for you: it's cached, and I'm almost sure it won't be loaded into any cloned or newly created project.
So opening a new project, or cloning your current progress to a new location, will remove these unwanted dependencies.
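To check what a module actually pulls into the build (as opposed to what merely lingers in caches and search indexes), you can print the resolved dependency tree; the :app module name below is an assumption:

    ./gradlew :app:dependencies

Anything you commented out should no longer appear anywhere in that tree.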
Every time I restart Android Studio it forgets the configured modules, despite them being shown as configured in the Project Structure > Dependencies menu, and I manually have to remove them there and then add them again.
I also tried invalidating the cache, cleaning, and rebuilding, but the only thing that works is to manually remove the dependencies and add them again.
What is causing this behavior?
I never faced this problem, but I would check these points:
Do you have other software, like git, accessing the project folder and maybe overwriting some important project files?
Does your user have the rights to write data into the project folder and the configuration files?
Does the Event Log show something interesting?
Does the log show something? (Help > Show Log in Files > idea.log)
I recommend using Ctrl+F to find any occurrence of "Error".
If you are on Linux you can use find -cmin -30 to get a list of all files that were changed in the last 30 minutes. That might be useful to spot the problem.
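For example, a minimal invocation (GNU find, run from the project root):

    # list regular files modified in the last 30 minutes
    find . -type f -cmin -30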
It might help if you tell us your operating system (and version) and your Android Studio version.
I want to be able to work across multiple workstations, jumping from one to the other, without having to worry about committing.
I have personal and work Windows desktops and a Mac OS X laptop. At the moment I point my project to a cloud directory and have the local install of Android Studio pointing to a Gradle offline cache in another cloud directory. This keeps failing: it tells me that the path to Gradle is invalid, which I understand, because Gradle is referenced from different locations on different machines (considering the differing file systems of Mac OS X and Windows 7).
Edit: When I try to open the project, it brings up the "Import Project from Gradle" screen, which has the option to select "Use local gradle distribution" and pick the Gradle home directory. I pointed it to the cache directory, and it tells me:
Cannot Save Settings
Gradle location is incorrect.
Location:C:/Users/Username/.gradle
All my research (including these answers here and here) suggests that VCS is the way to go. However, I don't see this as a solution to my problem. I'm not looking for version control; I'm looking to transition seamlessly across workstations. Of course I will still use a version control system to save working versions of my code or to share it with other developers, but there has to be a better way when I simply want to keep all workstations in sync.
I come from web development, where I synchronised a local AMPPS environment across multiple computers without any issue. This meant I could transition between my personal desktop, laptop, and work desktop instantly. It frustrates me to have to remember to commit every time I move around. If I have to do this 20 times a day, and it takes about a minute each time, that's 20 minutes that could have been spent writing a couple of functions. And what if I forget to commit? Then I get to work, or home, and the day is wasted, because I won't actually have the current, up-to-date code...
So the question remains: is there a way to instantly synchronise Android Studio projects? How do I keep my whole code base (including Gradle) in sync?
OK, thanks to the comments above, which pointed me in the right direction.
Android Studio creates some local files that are specific to the machine you are on. Following this principle, to sync the "source" files (the files that are specific to your application only), you must ignore all these local files, similar to what you would store on GitHub. I followed the answer to this question to apply the ignore rules.
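For reference, the machine-specific files are typically the ones below; treat this as a common baseline rather than a complete list:

    # IDE project metadata (regenerated on import)
    .idea/
    *.iml
    # Gradle caches and build output
    .gradle/
    build/
    # machine-specific SDK location
    local.properties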
Having ignored all the "local files", when I create a new project, the source files are synchronised across all my workstations. In order to establish a local version, I need to "import" the project first. Once it has been imported, "local files" will be created for that particular machine. From then on, I can "open" the project locally.
To summarise:
Set your sync to ignore files as per .gitignore (see the example above) or refer to this question.
Create a project on one of your workstation and save it in the cloud.
When you are ready to work on the project for the first time on another workstation, "import" the project.
Once the project has been imported, all local files should have been created.
From then on, use the "open" option to continue working on the project.
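One more note on the "Gradle location is incorrect" error: rather than pointing every machine at a shared Gradle directory, the usual approach is to commit the Gradle wrapper with the project, so each workstation downloads and runs a matching Gradle version by itself. A minimal gradle/wrapper/gradle-wrapper.properties looks roughly like this (the version shown is only an example):

    distributionBase=GRADLE_USER_HOME
    distributionPath=wrapper/dists
    zipStoreBase=GRADLE_USER_HOME
    zipStorePath=wrapper/dists
    distributionUrl=https\://services.gradle.org/distributions/gradle-2.4-all.zip

With the wrapper checked in (and synced), building with gradlew works identically on every machine.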
I hope this helps somebody else and saves hours of googling.
I have two rather large solutions that both experience the same problem: I am warned about an inability to delete temporary files. The messages all look like this:
Failed to delete the temporary file
"C:\Users\Don\AppData\Local\Temp\tmp07197280428445c484ba0cda58178903.exec.cmd".
The process cannot access the file
'C:\Users\Don\AppData\Local\Temp\tmp07197280428445c484ba0cda58178903.exec.cmd'
because it is being used by another process.
I have seen suggestions of using pre-build commands to first delete things, but that is a lot of projects, and I'm not going there.
Does anyone know how else I might remedy this, in a way that does not involve "fixing" each project individually?
If it makes any difference, I'm compiling C# .NET 3.5 projects.
My idea is to write a small add-in for Visual Studio which can delete files on build. You could configure it with file paths and then just run something like this:
foreach (var item in paths)
{
    File.Delete(item);
}
And the configuration could be put solution-wide.
I get that too. The problem is that the compilation system itself is holding onto the file when it attempts to delete it. I think it deletes the file afterwards anyway, as I've never seen the named files hanging around, so it's just an annoyance that can be ignored.
The files seem to contain the command that VS runs, built up from the build settings.
I assume it's a .NET thing where the GC hasn't yet cleaned up the object holding the file handle when the system attempts the delete. If so, this directly shows the benefit of RAII over GC :-)
A likely source of the problem is that your antivirus software is busy scanning the file in question, which prevents the rightful owner from deleting it. Curb the enthusiasm of the antivirus and your problem will be solved.
Unload the project from your solution, then reload it. It should create the missing files and be buildable again.
If you have installed any third-party cleaner tool and activated its active mode (always running in the background), it will lock the temp folder in AppData, so Visual Studio is unable to restore the NuGet packages on build and there will be a build error.
Try uninstalling the cleaner and restarting the system. When I had this problem, that was how I fixed it.
We recently installed NuPeek as our NuGet repository and as a symbols server.
NuGet itself works fine; it was set up within an hour.
The symbols server, on the other hand, is a different story. Packages are pushed to NuPeek (normal packages and symbol packages), and I can see on the server that both are picked up and placed in the correct folder (source files too, .cs in this case).
I have set up Visual Studio so it can find the correct symbols server. When I create a new project, install the package (which also has a symbols package), use code from that package, and try to debug it, the following happens:
In the cache folder the "package" is downloaded
The cache folder also has a src folder; within it is a folder with the same name as the package -> version.
The version folder is empty
The folder cache/packagename.pdb/guid/packagename.pdb is present
Still, Visual Studio cannot find the correct .cs file to show. After some digging in the NuPeek server folders, I noticed that the folder symbolsPath -> temp -> PackageName -> lib -> net45 is empty, while the symbols .nupkg clearly has sources (one .cs file, to be exact).
I had this working before we switched servers (from an Azure Website to an Azure Cloud Service), but I'm 99% sure this is not the problem.
Am I missing something? Does anyone have any clue?
Thanks in advance!
Does anyone have any clue?
If anyone does, then the author of the project, Jérémie Chassaing, would be the most likely candidate. Don't hesitate to add an issue to the issue tracker; there's not much there right now, and he looks pretty responsive, so it's worth your time.
Do run through the setup checklist first:
Tools + Options, Debugging, Symbols: add http://myserver/NuPeek/symbols to the "Symbol file locations" list. Ensure that you have a valid "Cache symbols" directory selected.
Tools + Options, Debugging, General, tick the "Enable source server support" option
Untick the "Enable Just My Code" option.
Tick the "Print source server diagnostics" option. Update your question with what you see in the Output window so we'll have a better shot at figuring out the real problem
OK, this is an old question, but as I found the solution today I'll post it here.
This is probably because you installed NuPeek too deep in your website structure.
A bug in NuPeek requires that it be installed at the root level of your site (for example www.domain.com, not www.domain.com/Nupeek/).
Otherwise, you can fix the bug in SymbolTools.cs by replacing the SourceBaseUri getter with this:
// Requires the System, System.IO and System.Web namespaces.
private static string SourceBaseUri
{
    get
    {
        var httpRequest = HttpContext.Current.Request;
        // GetLeftPart(UriPartial.Authority) already includes the scheme,
        // so this yields scheme://host[:port] as the base URI.
        var baseUri = new Uri(httpRequest.Url.GetLeftPart(UriPartial.Authority));
        // Append the application path plus "source" so sub-site installs work too.
        var applicationUri = new Uri(baseUri, Path.Combine(httpRequest.ApplicationPath, "source"));
        return applicationUri.ToString();
    }
}
Hope this helps.