TFS creates a $tf folder with gigabytes of .gz files. Can I safely delete it? - visual-studio-2012

I am using visual studio 2012 with Microsoft TFS 2012.
On the workspace that is created on my C: drive, a hidden folder named $tf is created. I suspect TFS of creating this folder. It is eating disk space: the current size is several gigabytes, which is about 25% of the total disk space needed for the complete workspace. So this hidden $tf folder is quite huge.
The structure is like this:
c:\workspace\$tf\0\{many files with guid in filename}.gz
c:\workspace\$tf\1\{many files with guid in filename}.gz
Does anyone know if I can delete this $tf folder safely or if it is absolutely necessary to keep track of changes inside the workspace?

TFS keeps a hash and some additional information on all files in the workspace so that it can do change tracking for local workspaces and quickly detect changes in the files. It also contains the compressed baseline for your files. Binary files and already-compressed files will clog up quite a bit of space. Simple .cs files should stay very small (depending on your FAT/NTFS cluster size).
If you want to get rid of these files, set the workspace type to a server workspace, but you lose the advantages of local workspaces.
Deleting these files will only help temporarily, since TFS will force their recreation as soon as you perform a Get operation.
You can reduce the size of this folder by doing a few things:
Create small, targeted workspaces (only grab the items you need to make your changes)
Cloak or exclude folders containing items you don't need, especially folders containing lots of large binary files
Put your dependencies in NuGet packages instead of checking them into source control.
Put your TFS workspace on a drive with a small NTFS/FAT cluster size (a cluster size of 64 KB will seriously increase the amount of disk space required if all you have are 1 KB files).
To set up a server workspace, change the setting hidden in the advanced workspace settings section.
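If you prefer the command line, I believe the workspace type can also be switched with tf.exe's /location option (check tf workspace /? on your installation); a sketch, with MyWorkspace standing in for your actual workspace name:

```shell
REM From a Visual Studio 2012 developer command prompt
REM (MyWorkspace is a placeholder for your workspace name)
tf workspace MyWorkspace /location:server
```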

The simple answer: I deleted the $tf files once. The net result was that newly added files showed up in my pending changes, but when I changed an existing file, the change did not show up in my pending changes. So I would not recommend deleting this folder.

To answer the original question: yes, you can. However, in order for TFS to track changes, the folder will need to be recreated, albeit with fewer subfolders and much smaller disk usage. To do that:
First, delete all the $tf folders currently in your workspace folder.
Next, move all of the remaining contents of the original folder to another, empty folder, preferably one on another drive.
Perform a "Get latest" into the original (now empty) workspace folder (this will cause a single $tf folder to be created in that original folder).
Now copy all of the contents you moved to the backup folder over the top of the "Get latest" results in the original workspace folder.
By performing these steps in that order, you will end up with the $tf entries TFS needs, but in a single folder and much more compact. Additionally, the deltas of any changes you made that had not been checked in will be preserved, and TFS will recognize them as pending changes, as it should.
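The steps above might look like this from a command prompt (the paths are examples, adjust them to your setup, and note that tf get must be run inside a mapped workspace folder):

```shell
REM 1. Delete the $tf folder(s)
rmdir /s /q C:\workspace\$tf
REM 2. Move the remaining contents to an empty folder on another drive
robocopy C:\workspace D:\workspace-backup /e /move
REM 3. Get latest into the now-empty workspace folder
cd /d C:\workspace
tf get . /recursive
REM 4. Copy the backed-up contents over the "Get latest" results
robocopy D:\workspace-backup C:\workspace /e
```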
Our Certitude AMULETs C++ solution has 72 advanced projects in it, and we have to do this once a month to keep compiling and search speeds reasonable.

I deleted the $tf directory, and Get Latest behaved: it asked me whether I wanted to keep the local files or replace them with the server versions. I could then check in as normal.
The mildly annoying part was that about 30 files I had locally, which I had told it to ignore, reappeared.

Related

Recovering an old Perforce depot

I recently lost a lot of files, due to a hard disk error, but still have a Perforce folder containing files from a particular project. Trouble is, it's not the actual project files, but some kind of Perforce storage format, with files ending in ",d" and ",v"
Is there any way I can restore the original files from what I have?
I imagined Perforce would be able to open the folder as a depot, and I'd simply be able to get the files into a new workspace, but I can't see any way to open an existing depot. I tried editing the depot's "Storage location for versioned files" to point at the folder, but the depot still shows as empty.
I only used Perforce briefly (comparing it to Git) so I don't really understand how it works. Any help or advice would be much appreciated.
The thing you're missing is the database (the db.* files), which need to be in the server root (P4ROOT) when you start the server. If you don't have database files in P4ROOT, the server will create an empty database on startup. The database is the source of truth for a Perforce server, so if it's empty, the server is empty.
If you have a checkpoint (a file called checkpoint.N) and/or journal file, that contains the metadata you need to reconstruct the database; recover the server with p4d -jr checkpoint.N journal, and then you can use the p4 depot command to make sure the Map of your depot(s) points at the directory where your archive files are, and use other commands to inspect the actual files (start with p4 verify to see which files are in the database but missing/corrupted in the archive backup).
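Sketched end to end, that recovery path might look like this (checkpoint.N is whatever checkpoint file you actually have, mydepot is a placeholder depot name, and the commands are run from the server root):

```shell
# Replay the metadata from the checkpoint and journal into a fresh database
p4d -r . -jr checkpoint.N journal
# Start the server and inspect it
p4d -r . -p 1666 &
p4 -p 1666 depot -o mydepot      # check the depot's Map field
p4 -p 1666 verify -q //...       # list files missing/corrupt in the archive
```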
If you only have the archive files but no database (and no checkpoint or journal to recover it from), you're in a more difficult spot since the database is what maps the archive files into the actual depot structure (e.g. it includes all the copy-by-reference pointers that constitute branches in Perforce). However, you can extract the contents of the archive files one file at a time using conventional tools; the ,v files are in RCS format (use the co command to retrieve their content), and the ,d directories contain regular old .gz files, one per revision.
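For the no-database case, extracting individual revisions with conventional tools might look like this (the file names are examples; co is the standard RCS checkout tool):

```shell
# Text file: the ,v file is RCS format; co -p prints a revision to stdout
co -p project/readme.txt,v > readme.txt
# Binary file: the ,d directory holds one .gz per revision
gunzip -c project/logo.png,d/1.2.gz > logo.png
```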
Using the deep magic to synthesize a database from the archive files on their own is also a possibility, but the RCS/CVS conversion scripts are so old I'd expect a lot of fiddling to be required to get them working with a current version of Perforce.

Unable to delete renamed folders

I was renaming several files/folders when VS decided that would be the perfect time to crash. On re-opening, I now have 2 versions of the folders I renamed, one with the old name and one with the new. The new folders were not linked to source control, so I added each of them. Now, when trying to delete the old ones, I get the following error:
This operation cannot be completed. You are attempting to remove or delete a source-controlled item where the item is exclusively locked elsewhere or otherwise unable to be deleted from source control.
I know no one else had those files checked out, and all the files in them have the little green + as if they are new files. I can delete all those files so the folder is empty, but I still cannot delete it. I'm also unable to exclude the folder from project.
If I open a file explorer, I can delete the old folder and then VS will allow me to delete the folder. However, when trying to rename one of the folders back, it gives the error above, even though the folder had been deleted.
My internet searching powers are coming up short and I'm not sure what else to try. Any ideas on how I can fix this?
In case it matters, the affected folders contain .cs, .cshtml, and .js files. The OS is Windows 10.
Generally, files/folders are not really deleted from TFS while they are in source control, unless you permanently destroy them; see the Destroy Command.
Files and folders under version control can be easily moved, renamed, and deleted from Source Control Explorer. (Make sure you have these folders mapped in your workspace.)
Just check and try the things below:
Note that you cannot delete a folder that has pending changes on any of its children (including children that are being moved out of the folder; they are still children until that changeset is checked in). In this case you need to move the children out of the folder and check those changes in, then delete the folder as a separate changeset.
Besides, earlier versions of TFS and VS have some problems with deleting empty folders. In this case you can try creating an item inside the folder you are trying to delete, then try deleting again. See this article for details.
You can also try to delete the renamed folder from the command line. See Delete Command (Team Foundation Version Control) for details.
Get latest first, then try deleting again.
Remap the workspace, or create a new workspace and map it to a new location, then try deleting again.
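The command-line route might look like this with tf.exe (the server path is an example; undoing pending changes first clears the lock-style errors described above):

```shell
REM Undo any pending changes on the folder, then delete it and check in
tf undo $/MyProject/OldFolder /recursive
tf delete $/MyProject/OldFolder
tf checkin
```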

Perforce: move directory with keeping history

Colleagues,
I have two different directories:
- path/animals/dir1
- path/cars/dir2
I want to move dir1 into path/cars/ while keeping all the history of this folder.
I tried several ways (merge, copy, and rename/move using P4V), but all of them lead to the history appearing erased in the moved directory.
Is there a way to do this?
Thanks in advance!
We have used p4 move extensively in recent versions of P4 to do these kinds of directory moves.
In my experience, if you branch/delete (or, more recently, move) the files, the history of the original location stays intact (with delete/move records); if you look at a revision graph in P4V, you should see all of the older revisions in the previous locations before they were moved (deleted).
If you use p4 sync with a particular date or changelist, you should also get the original directory back with the versions of the files at that time (and the newly moved versions should be removed after the sync, assuming both directories were in the area that you were syncing).
I have experienced problems (this will probably change in 2013.2, as I've read that Perforce is changing the default integration engine) with integration across move/delete and move/add transactions, which require the use of the "generation 3" integration option; but once that's specified (which you can do by adding -3 to the command-line p4 integrate command), everything works better for me across those moves.
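A minimal sketch of the move itself, using the paths from the question (the files must be opened for edit before p4 move will accept them):

```shell
p4 edit path/animals/dir1/...
p4 move path/animals/dir1/... path/cars/dir1/...
p4 submit -d "Move dir1 under cars, preserving history"
```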
The history of the moved directory has not been erased. I'm guessing you think this is the case because the files are no longer visible in P4V. By default, P4V does not display deleted files in the depot tree. Since you moved the files to a new location, they were deleted from their old location and are now no longer displayed. You need to turn on the option to "Show Deleted Depot Files". If you click on the yellow funnel icon to the right of the Depot/Workspace tabs, you'll see that option. Check it and P4V will then display the deleted files in their original location.
When moving a folder to another destination, all history is still present, but only for the individual files, not for the directories themselves.
Here is a proof:
http://comments.gmane.org/gmane.comp.version-control.perforce/19820

Viewing all descendant files in TFS/VS

Is it possible to see all descendant files in the marked folder in the Source Control Explorer window in Visual Studio?
Other source control software I have used have this option, and it makes it very easy to iterate through all the files in a folder recursively and see what has changed.
I think TFS does not support this feature, but there are other options available:
Pending Changes
You don't need to iterate through all the files to see what has changed; you can view it easily using Pending Changes. A pending change is a change (Add, Edit, Delete, ...) that has not been checked in to TFS. You can view these changes for a single directory or the whole project, and you can check in pending changes for just one directory. You will always see the summary window with all the changes first. Right-click the item (folder or file) and you will see the options: Check In, Undo, or Shelve. More info here.
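The command-line equivalent is tf status, which lists pending changes recursively (the server path is an example):

```shell
REM List pending changes under one folder, recursively
tf status $/MyProject/SomeFolder /recursive
```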
File & Folder Comparison
In Source Control Explorer, you can compare the differences between two server folders, two local folders, or a server folder and a local folder. Simply right-click the target folder. It's quite a powerful feature once you know about it.
Read more here.
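From the command line, I believe the same comparison is available as tf folderdiff; a sketch with example paths:

```shell
REM Compare a server folder against its local mapping
tf folderdiff $/MyProject/Src C:\workspace\Src /recursive
```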

How to exclude a folder and not its children from SVN Update in Tortoise SVN

I am working on a shared project which is put in SVN. The directory structure of the project is as follows:
ParentDir
- Child_Dir_1
+ GrandChild_Dir_1
+ GrandChild_Dir_2
- Child_Dir_2
Child_Dir_1 contains configuration files (Eclipse's .LAUNCH files), and people put all sorts of file in this folder.
So each time I update my source code (by right-clicking ParentDir and picking Update), I get a lot of configuration files that I don't really need, and I have to delete them manually.
I still need to have the children of Child_Dir_1 (which are GrandChild_Dir_1 and GrandChild_Dir_2) to be updated.
I have tried setting the "ignore" property of Child_Dir_1 to exclude *.LAUNCH files, but each time I update the source code, the ones that I manually deleted are restored to Child_Dir_1.
Since you are using TortoiseSVN:
Go into Child_Dir_1, select GrandChild_Dir_1 and GrandChild_Dir_2 and right click, TortoiseSVN -> Update.
That will update only those two folders.
Ignore is so that you can "ignore" (from commit, status, etc.) untracked files, i.e. files that are not checked in.
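The same selective update works from the svn command line too; a sketch, run from inside Child_Dir_1:

```shell
# Update only the two subfolders, leaving Child_Dir_1's own files alone
svn update GrandChild_Dir_1 GrandChild_Dir_2
```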
It could be that manojlds' answer is the solution for you, but I have doubts. The problem here is that those files are really part of the project. They are kind of unavoidable, and must be in sync with the rest of working copy.
Option 1 (best): Remove all configuration files from the repository, or better yet keep only configuration template files in the repository (with, say, $ as the first character of the file name). Each user copies those template files to real configuration files and changes them accordingly. Configuration files are never committed; only template files are, and updating template files will not mess with any user's current configuration files.
Option 2 (second best): Ignore those configuration files. Use your own files for your own configuration, with names that don't clash with the existing ones. You may even add your files to SVN, but you may just as well not add them. It does not matter, as long as you don't need your configuration on another machine.
Option 3: Use the ignore-on-commit changelist. Use the configuration files that already exist; change them to your liking, but don't ever commit them. To ensure that you don't commit them by accident, flag them as non-committable (go to the commit window, select all non-committable files, right-click > Move to changelist > ignore-on-commit). The downside is that your files are not protected from other users' updates, but that may actually be a good thing.
Option 4: Chop the folder out (a horrible hack). Remove Child_Dir_1 from the working copy (right-click it > Update to revision > set Working depth to Exclude). Save the folder elsewhere first, because it will disappear. After that, create it again, check out all the subfolders inside it (GrandChild_Dir_1 and GrandChild_Dir_2), and copy your configuration files back. Now you have complete control over the folder's contents, but update and commit become more complicated.
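On the command line, option 4's exclusion step is a sparse-directory update (available since SVN 1.6); a sketch, run from ParentDir:

```shell
# Exclude Child_Dir_1 from the working copy
svn update --set-depth exclude Child_Dir_1
# To bring it back later in full:
svn update --set-depth infinity Child_Dir_1
```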
Edit: There is an option 5 in theory, but I doubt it can be implemented successfully. You can try using NTFS hard links: copy the whole tree, with all files as hard links to the existing files, except the .svn folders and their contents. The original directory is used for the SVN operations update, commit, add, and delete, and the new directory is used for editing files. From the new directory, delete all the files you don't need, and insert all the files you do need that are not part of SVN. The cost here is a little extra work whenever you delete files from or add them to SVN.
