Recovering an old Perforce depot

I recently lost a lot of files due to a hard disk error, but I still have a Perforce folder containing files from a particular project. Trouble is, it's not the actual project files, but some kind of Perforce storage format, with files ending in ",d" and ",v".
Is there any way I can restore the original files from what I have?
I imagined Perforce would be able to open the folder as a depot, and I'd simply be able to get the files into a new workspace, but I can't see any way to open an existing depot. I tried editing the depot's "Storage location for versioned files" to point at the folder, but the depot still shows as empty.
I only used Perforce briefly (comparing it to Git) so I don't really understand how it works. Any help or advice would be much appreciated.

The thing you're missing is the database (the db.* files), which need to be in the server root (P4ROOT) when you start the server. If you don't have database files in P4ROOT, the server will create an empty database on startup. The database is the source of truth for a Perforce server, so if it's empty, the server is empty.
If you have a checkpoint (a file called checkpoint.N) and/or a journal file, those contain the metadata you need to reconstruct the database. Recover the server with p4d -jr checkpoint.N journal; then you can use the p4 depot command to make sure the Map of your depot(s) points at the directory where your archive files are, and use other commands to inspect the actual files (start with p4 verify to see which files are in the database but missing or corrupted in the archive backup).
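For example, a minimal recovery sketch, assuming a checkpoint named checkpoint.42 and a server root of /p4root (both hypothetical):
p4d -r /p4root -jr checkpoint.42 journal    # replay the checkpoint, then the journal, into the db.* files
p4d -r /p4root -p 1666                      # start the server on port 1666
p4 -p localhost:1666 depot -o depot         # inspect the depot spec; check the Map: field
p4 -p localhost:1666 verify -q //...        # report revisions whose archives are missing or corrupt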
If you only have the archive files but no database (and no checkpoint or journal to recover it from), you're in a more difficult spot since the database is what maps the archive files into the actual depot structure (e.g. it includes all the copy-by-reference pointers that constitute branches in Perforce). However, you can extract the contents of the archive files one file at a time using conventional tools; the ,v files are in RCS format (use the co command to retrieve their content), and the ,d directories contain regular old .gz files, one per revision.
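For example, here's how extracting individual revisions might look with standard tools, using hypothetical file names (co is the RCS checkout program):
co -p foo.c,v > foo.c                    # print the head revision of an RCS archive to stdout
co -p1.2 foo.c,v > foo.c.old             # or a specific revision
gunzip -c bar.bin,d/1.1.gz > bar.bin     # each revision in a ,d directory is one gzipped file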
Using the deep magic to synthesize a database from the archive files on their own is also a possibility, but the RCS/CVS conversion scripts are so old I'd expect a lot of fiddling to be required to get them working with a current version of Perforce.

Related

How can I move a folder to another changelist using P4V?

I have accidentally added a few folders to my default changelist that I don't want to submit to the server. How can I move these changes to another changelist, or remove them from the changelist without affecting the files on disk?
I have created a new changelist and moved some individual files / changes to this list but the folder contains many autogenerated files and this will take too long to do file by file.
I also looked at using the "revert" option but I think some of these files may have been previously added to the server in error. Reverting seems like it will change these files on disk to the previous server version.
You can specify the folder path in "Find File", and use "*" in the "Contains" field to match all files. Now you can select all the files in your folder by using Ctrl+A.
From P4V you can multi-select the files in the pending changes window and then drag them into a new changelist. If they're all in the same directory they'll all be grouped together since it's sorted by depot path.
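If you'd rather do this from the command line, the equivalent of the drag-and-drop is p4 reopen; a sketch with a hypothetical changelist number and path:
p4 reopen -c 1234 //depot/path/...    # move the open files under this path into changelist 1234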
If you just want to have them not be open but also not modify them on disk, go to the command line and do:
p4 revert -k //depot/path/...
The -k option lets you keep your local files. This isn't available from P4V as far as I know (since it leaves your workspace out of sync with the depot state, it's usually a bad idea).
If you have generated files in your workspace that aren't supposed to go into the depot, you should exclude them from your client's View, e.g.:
View:
//depot/... //myclient/...
-//depot/path_to_generated_files/... //myclient/path_to_generated_files/...
This will essentially "hide" these files from all Perforce operations; you will never be able to add files from this workspace path, and if somebody else adds files to that depot path, you won't sync them down to your workspace. Two notes on this:
If you already have some of these files in the depot and they're currently synced, excluding them from your view and then syncing your client will remove them from your client. You can use sync -k, much like revert -k, to keep your local copies while telling the server that your client is properly up to date (see the sketch after these notes).
If you're using streams, you can do this for ALL clients of the stream by adding an Ignored path.
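As a minimal sketch of the sync -k approach, assuming a hypothetical client named myclient whose view now excludes the generated files:
p4 sync -k //myclient/...    # update the server's record of your client without touching local files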

How are perforce stream depots mapped to the backend

When we search the Linux box hosting perforce I only see normal depots. What's the mapping from the front end to the backend for stream depots please?
I.e. in the p4 client I see depot, main, and mynewstream.
In Linux I see var/perforce/depot and var/perforce/main, but no stream depots.
Thanks.
It's the same as for any other depot. Run p4 depot -o DEPOTNAME and look at the Map: field -- that tells you the directory where the archives are stored on the back end. It defaults to DEPOTNAME/... (relative to P4ROOT) but can also be an absolute path pointing to any writable filesystem.
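For example, for a hypothetical stream depot named mynewstream:
p4 depot -o mynewstream
The output should include something like this (abridged):
Depot:  mynewstream
Type:   stream
Map:    mynewstream/...
meaning the archives live under P4ROOT/mynewstream on the server.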
Note that if a depot only includes files that are branched or copied from other depots, there are no physical back end copies (the database "lazy copies" them by reference from archives in other depots). If you've just seeded your stream depot by branching from another depot and haven't made any edits to it, that'd explain why you don't see a directory for it on the back end.

How to uncompress perforce Depot files

The files I have end with [,v], and some end with [,d] containing [1.1.gz].
What I did, in detail:
In P4V I created a workspace, added some important files, and submitted them to the depot. Then I decided to delete what was in the depot; clicking Mark for Delete just marked the files with a red X, so I went to C:\Program Files\Perforce\Server\depot and deleted them from there. The files went to the Recycle Bin, but that didn't make them disappear from P4V, so I opened P4Admin and in the Depot tab I did Obliterate, and they were finally gone.
Later I discovered that marking files for delete in the depot deletes them from the workspace too, and the only thing I have left is what I restored from the Recycle Bin, which is compressed files. How can I uncompress them?
Don't touch the Perforce server's depot or db files unless you know what you're doing -- normally the server handles the job of managing those files and the relationships between them, and randomly messing with those files will usually break things, much like if you randomly shuffled blocks on your hard disk around without knowing how your filesystem works. I mention this first so that you'll know for next time, and second so that if you happen to have access to a time machine, you can fix this problem by going back and informing your past self to keep their paws out of P4ROOT. :)
If in the future you want to temporarily delete files from the depot, use the normal "Mark for Delete" command in P4V (or p4 delete in the CLI) followed by "Submit". If you want to permanently delete them, that's what the "obliterate" command is for. In neither case should you be deleting files out from under the server -- everything should happen from the client (that is, P4V, the p4 CLI, P4Win, etc).
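For example, a normal deletion from the command line looks like this (hypothetical path and description):
p4 delete //depot/old_project/...     # open the files for delete in the default changelist
p4 submit -d "Remove old_project"     # submit the deletion; old revisions remain in the depot's history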
If you restore the deleted files to exactly where they were, you should be able to rely on Perforce to get the files back, provided you have not already obliterated them from the db. (Hopefully obliterate noticed the archive files were gone and it failed with an error instead of blasting the db entries...)
If you no longer have the db entries for the files, you can try to extract the archives manually with command line tools (luckily the content isn't encrypted or in a weird proprietary format) -- you should be able to gunzip the .gz files and co (RCS) the ,v files. I'd expect most unzip utilities to understand gzip, but RCS is a pretty old format so you may have to do a little digging to find Windows tools for it (I think Cygwin may have RCS tools bundled with it). Good luck!
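For example, a couple of hypothetical extractions (using gunzip and RCS co, e.g. from Cygwin):
gunzip -c 1.1.gz > important.doc     # a ,d directory holds one .gz file per revision
co -p notes.txt,v > notes.txt        # print the head revision of a ,v archive to stdout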

Perforce client missing files that are on the HDD

I have a lot of files within the file structure of the Perforce depot that I am unable to see with the Perforce clients (the p4 command line or the P4V GUI), even when logged in as admin.
I have tried to find any meta data I can through p4 files and p4 filelog commands but it always returns:
"- no such file(s)."
Also, I have run p4 verify and p4 dbverify to see if there were any errors on the server, but they returned no errors. There just seems to be no record of the files except for the fact that they are taking up room on the HDD.
My current theory is that they are from failed commits but I do not know how to get perforce to acknowledge the files so I can obliterate them.
Background info:
This is a simple perforce setup with just the main depot and an archive depot for old projects. (The mystery files are in the main depot)
The server version is: P4D/NTX64/2012.2/551823 (2012/11/09).
There isn't necessarily a one-to-one mapping between what's in the server's depot filesystem and the structure of the depot as defined in the metadata -- depot revisions are written once and are never moved or duplicated on disk, even when they're moved or duplicated from the point of view of the client. So you shouldn't assume that a file in the depot filesystem which doesn't correspond to a depot file path isn't providing the underlying storage for some other existing file (especially if you've used obliterate on some branches of a file while leaving others intact -- the remaining archive file may hold the content of one of the branches you kept).
That said, it is also possible for archives to become "orphaned" as part of a failed submit, as you suggest. If the amount of space involved is small, I'd suggest not worrying about it (the orphaned files won't cause any problems in terms of collisions), but if it's important to clean them up, your best bet is to use "snap -n" to make sure there aren't any dependencies on them, and then delete them manually (just to be safe, I'd keep a backup of them at least until you've run your next verify to make sure nothing important has gone missing). Run:
p4 snap -n //... //depot/path/to/mystery/file
This says "show me files anywhere in the depot (//...) with archive dependencies on //depot/path/to/mystery/file". If you run the command without the -n it will actually break those dependencies by making physical copies (don't do this if you're worried about space since you'll end up with N redundant copies of the archive).
The inverse of p4 snap -n (i.e. "where does the archive for this depot file live?") is p4 fstat -Oc //depot/file.
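For example, with a hypothetical depot path (this requires admin permission):
p4 fstat -Oc //depot/main/foo.c
The output should include the archive location fields (lbrFile, lbrRev, etc.) pointing at the physical storage for that revision.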

TFS creates a $tf folder with gigabytes of .gz files. Can I safely delete it?

I am using visual studio 2012 with Microsoft TFS 2012.
On the workspace that is created on my C: drive, a hidden folder $tf is created. I suspect TFS of creating this folder. It's eating disk space: the current size is several gigabytes, about 25% of the total disk space needed for the complete workspace. So this hidden $tf folder is quite huge.
The structure is like this:
c:\workspace\$tf\0\{many files with guid in filename}.gz
c:\workspace\$tf\1\{many files with guid in filename}.gz
Does anyone know if I can delete this $tf folder safely or if it is absolutely necessary to keep track of changes inside the workspace?
TFS keeps a hash and some additional information on all files in the workspace so that it can do change tracking for local workspaces and quickly detect changes in the files. It also contains the compressed baseline for your files. Binary files and already-compressed files will clog up quite a bit of space; simple .cs files should stay very small (depending on your FAT/NTFS cluster size).
If you want to get rid of these, set the workspace type to a server workspace, but you'll lose the advantages of local workspaces.
Deleting these files will only help temporarily, since TFS will force their recreation as soon as you perform a Get operation.
You can reduce the size of this folder by doing a few things:
Create small, targeted workspaces (only grab the items you need to do the changes you need to make)
Cloak folders: exclude folders containing items you don't need, especially folders containing lots of large binary files (see the sketch after this list)
Put your dependencies in NuGet packages instead of checking them into source control
Put your TFS workspace on a drive with a small NTFS/FAT cluster size (a cluster size of 64 KB will seriously enlarge the amount of disk space required if all you have are 1 KB files)
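For example, cloaking a folder of large binaries from the command line might look like this, with hypothetical server path and workspace names:
tf workfold /cloak "$/MyProject/LargeBinaries" /workspace:MyWorkspace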
To set up a server workspace, change the Location setting hidden in the advanced workspace settings section (Edit Workspace > Advanced > Location: Server).
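Alternatively, a sketch of creating a server workspace from the command line, with hypothetical names (the /location option is new in TFS 2012, so check your tf version):
tf workspace /new MyServerWorkspace /location:server /collection:http://tfs:8080/tfs/DefaultCollection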
The simple answer: I deleted the $tf files once. The net result was that newly added files showed up in my pending changes, but when I changed an existing file, the change did not show up in my pending changes. So I would not recommend deleting this folder.
To answer the original question: yes, you can. However, in order for TFS to track changes, the folder will need to be recreated, albeit with fewer subfolders and much smaller disk space. To do that:
First, delete all the $tf folders currently in your workspace folder.
Next, move all of the remaining contents of the original folder to another, empty folder, preferably one on another drive.
Perform a "Get latest" into the original (now empty) workspace folder (this will cause a single $tf folder to be created in that folder).
Now copy all of the contents you moved to the backup folder over the top of the "Get latest" results in the original workspace folder.
By performing these steps in that order, you end up with the $tf entries TFS needs, but in a single folder and much more compact. Additionally, the deltas of any changes you made that had not been checked in are preserved, and TFS will recognize them as pending changes as it should.
Our Certitude AMULETs C++ solution has 72 advanced projects in it, and we have to do this once a month to keep compiling and search speeds reasonable.
I deleted the $tf directory, and Get Latest behaved: it asked me if I wanted to keep the local files or replace them with the server versions. I could then check in as normal.
The mildly annoying part was that about 30 files I had locally, which I had told it to ignore, appeared again.
