Error while integrating a deleted file in Perforce

A parent project is throwing the following error when trying to integrate a child project.
Exception: < FILE > - can't integrate (already opened for delete)
Under what scenario does this occur?

When Perforce integrates changes, it tries to resolve the incoming changes against the files in the local workspace. This exception is basically saying that it can't attempt to resolve the changes for the given file, since the file is open for delete in the local workspace, so there is nothing to resolve them into.
There are flags you can pass to the integrate command to allow the integration to proceed anyway. However, I've found it's generally not a good idea to integrate changes on top of in-progress changes, so I would recommend finishing (or shelving) your in-progress changes, then attempting the integration.
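If you take the shelving route, a rough command-line sketch might look like this (the changelist number 12345 and the depot paths are placeholders, not your actual values):
p4 shelve -c 12345                                 # park the in-progress changes on the server
p4 revert -c 12345 //depot/child/...               # clear them out of the workspace
p4 integrate //depot/parent/... //depot/child/...  # run the integration
p4 resolve -am                                     # auto-resolve where possible
p4 unshelve -s 12345                               # bring the shelved work back afterwards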

This question comes up first on Google when searching for "already opened for delete", so I thought I might post a way around this if you're not integrating, just trying to undo your own delete:
cp file file.bak # Make a backup of the file
p4 revert file # Revert the file
mv file.bak file # Recover from backup
p4 edit file # Open the file for edit

Related

Perforce error while running command p4 copy on new created workspace

I just created a new workspace. I am getting a "Can't clobber writable file" error while doing p4 copy from one branch to another. The file concerned has been deleted on the source branch, and I did not touch it. Even doing p4 sync -f before the p4 copy command does not help. What could be the issue?
See How to fix Perforce error "Can't clobber writable file" or Perforce Error Message - Can't Clobber Writable File for more information on the "can't clobber" error in general.
In the specific case you describe, where you just made a new workspace, my guess would be that you made the new workspace in a folder where files already existed locally (maybe on top of an existing workspace?). If you did create this workspace folder on top of an existing workspace, stop and pick a new root folder for this workspace. The other workspace won't "know" that its filesystem is being modified by the operations you do in this workspace, and when you switch back to it everything is going to be bad (you might find that you've lost pending changes, sync won't fetch the right thing unless you force-sync everything, et cetera).
If messing up another workspace isn't a concern, just do:
p4 clean
and once that's done your p4 copy should work.
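If you want to preview what p4 clean would touch before letting it loose, it has a preview flag (the depot path here is just a placeholder):
p4 clean -n //depot/path/...   # preview: list files that would be deleted, re-added, or refreshed
p4 clean //depot/path/...      # actually reconcile the workspace to match the depot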
The file is writable on your local machine. p4 is trying to protect you from getting rid of a file you might have edited.
Since you say you created a new workspace, I'm assuming it has the same root as your previous workspace. If you know you want to get rid of everything, you can manually delete those files and retry your p4 sync -f (or the handy p4 refresh).

Perforce reconcile command doesn't recognize files opened for edit that were deleted

Scenario:
I have a folder of files that are generated by an external tool and checked into Perforce for revision control; however, we don't know ahead of time when the tool will add or remove files from that structure.
So today our workflow is to check out the entire directory and then let the tool regenerate all of the files/hierarchy. When I run the reconcile command it successfully finds new files, but it fails to find files that were deleted.
Is there a better way to handle this?
Upgrade your Perforce server to 2014.2:
http://www.perforce.com/perforce/doc.current/user/relnotes.txt
Minor new functionality in 2014.2
#841159 **
'p4 reconcile' will now detect files that are open for edit but
missing from the client, and reopen them for delete.
A workaround is to do "p4 revert -k" prior to "p4 reconcile" so that reconcile starts over from scratch. The "-k" option tells revert to forget that the files are open but NOT to actually undo the local changes.
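A minimal sketch of that workaround, assuming the generated files live under a hypothetical //depot/generated/... path:
p4 revert -k //depot/generated/...   # forget the open state; local file contents are left untouched
p4 reconcile //depot/generated/...   # adds, edits, and deletes are now all picked up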
A modified workflow that might make more sense if you're already using "p4 reconcile" religiously is to skip the "p4 edit" and use either the "allwrite" client spec option or the "+w" filetype modifier to make the files writable.
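Two hedged sketches of that alternative setup (the path and filetype here are examples, not a prescription):
# Option A: make the whole workspace writable via the client spec
p4 client   # in the Options: field, change "noallwrite" to "allwrite"
# Option B: give just the generated files a writable filetype
p4 edit -t text+w //depot/generated/...
p4 submit -d "Mark generated files as always-writable (+w)"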

Perforce cmd line option to populate workspace

I am new to Perforce and have some prior experience with ClearCase. I am using a Windows XP client and trying to set up my Perforce client/workspace.
The Perforce view I have has mappings of type:
//depot/path/to/folder/... //my_workspace/depot/path/to/folder/...
However, I have not yet attempted the "Get latest revision" action (in P4V) for this workspace, which means I don't have a local copy of the folder in question.
My question is: how do I populate the workspace with the contents of the folder from the command line when the folder isn't present in the workspace? The manual for p4 sync talks about getting a certain revision when the file is present in the workspace.
In ClearCase terms, when the config spec for a snapshot view also has load rules, cleartool can be told to pick the config spec from a text file and also load the contents of the view. I am trying to achieve a similar thing with Perforce.
Thanks in advance,
Parag Doke
Running the sync command will populate the workspace. If a file or folder isn't already present, it will be created during the sync operation.
With no additional flags specified, p4 sync will populate your workspace with the latest contents at the time the command is started.
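For example, a first-time sync of just the mapped folder could look like this, reusing the placeholder path from the mapping above (the changelist number is made up):
p4 sync //depot/path/to/folder/...        # fetch the latest revision of everything under the folder
p4 sync //depot/path/to/folder/...@1234   # or pin the folder to a specific changelist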

Perforce not syncing files correctly

I'm using P4V, the graphical Perforce client, to interface with our Perforce server at work. I added a project to the depot and then accidentally deleted it from the workspace on my local computer. The problem is that when I use the Get Revision action (the GUI equivalent of sync), the files don't get updated: I can see the files I want on the server, but they won't sync to my local PC. It's frustrating that the files aren't getting pulled from the server. What I assume should happen is that if files are altered in any way on my local PC, I should be able to grab the revision from the server, which pulls the data down and overwrites the local changes, but that isn't happening. Is there something I'm missing?
Perforce keeps track of the files that it thinks you have on your local workstation. If you delete those files locally (and don't "tell" Perforce about it), then Perforce will still think that you have them. If you want to get them back, you need to "force sync" the files. In P4V, use the "Get Revision..." item and, in the subsequent dialog, check the "force operation" checkbox to tell Perforce to give you all the files again regardless of whether it thinks you need them.
Just to complete the information: if you ever do want to remove the files locally, you can do so through P4V by choosing the "Remove from Workspace" item. Doing so will remove the files locally and also tell Perforce that you no longer have them, so that next time you sync, those files will be retrieved from the server.
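For reference, the command-line equivalents of those two P4V actions are roughly the following (the depot path is a placeholder):
p4 sync -f //depot/project/...     # force sync: re-fetch files even if the server thinks you have them
p4 sync //depot/project/...#none   # "Remove from Workspace": delete local copies and record that you have none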
Like other people have mentioned, one solution is to do a "force sync" of the entire depot, which basically overwrites everything from the server into your local workspace. The downside is that it could take a LONG time to finish if you are working in a big depot.
Another alternative is to compare your local workspace with the server and then force-sync only the files that are missing from your workspace.
p4 diff -sd //Depot/path/... | p4 -x - sync -f
-sd option: Show only the names of unopened files that are missing from the client workspace, but present in the depot.
There are more options (-sa, -se, etc.) available if -sd is not what you need; see the p4 diff documentation.
Credit for the command goes to a blog post.
They won't update because according to Perforce you still have the files on your local machine.
You need to use the "Get Revision..." option and enable the "Force Operation" option.
This will tell Perforce to refresh all the files even those it thinks you have the latest version of.
"Get Revision" will update only files that are not opened (checked out) even when "Force Operation" is enabled. You should revert all files marked as checked out in that workspace, and then use "Get Revision" with "Force Operation"
I did as you suggested, but I kept getting a message that the files were still open for edit and could not be deleted when trying Remove from Workspace.
Also, Get Revision returned with a message that no files were updated.
What I ended up having to do was revert the files and then do the Get Revision action; that solved the problem.
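The command-line version of that revert-then-force-sync sequence is roughly this (the depot path is a placeholder):
p4 revert //depot/project/...    # discard the open/checked-out state (and any local edits to those files)
p4 sync -f //depot/project/...   # force-fetch everything fresh from the server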
For people coming to this question, this worked for me on the Mac command line:
cd into your local Perforce workspace (the base directory of the checked-out files you are working on), then run:
p4 sync -f
-f is to force the sync.
This can also come in handy when you restore a Mac from a Time Machine backup.
https://www.perforce.com/perforce/r12.1/manuals/cmdref/sync.html
Check out the file, change it a little bit and then revert. Perforce will replace the local file with the latest revision.
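In p4 terms, that trick amounts to something like the following (the file path is hypothetical); revert rewrites the workspace copy from the revision the server thinks you already have:
p4 edit //depot/project/missing_file.c     # open the file for edit
p4 revert //depot/project/missing_file.c   # revert restores the synced revision over the local copy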

TortoiseSVN check-in error: Checksum mismatch

I cannot figure out why I get this error during check-in. I checked in successfully only a few hours ago, so I'm not sure why it's complaining now.
Error: Commit failed (details follow):
Error: Checksum mismatch for
Error: 'C:\sss\sss\trunk\xxxx\.svn\text-base\Header.ascx.svn-base'; expected:
Error: '3cee96f580409a1711a47541a07860dd', actual: 'a5fc0f8819b88bf32ab38d4c9a6b0654'
Error: Try a 'Cleanup'. If that doesn't work you need to do a fresh checkout.
I got latest and also performed a clean-up, which reported success, so I'm not sure what else to do.
Something has gotten out of sync or has become corrupt, and because it's in your .svn text-base directory, unless you are confident tinkering with this, you're probably better off deleting the parent of the .svn directory and then performing an update. Of course, take a backup or see if an export works before doing this, so you don't lose any changes.
FWIW, I get this sometimes with our library references where Visual Studio seems to keep a lock on some files (even though it's not compiling) and won't let me update them. I believe this is related to the xml documentation files.
Note: Subversion 1.7+ implements a new working-copy format that centralises the metadata, so there is now a single .svn directory at the root of your working copy. Your best bet is a cleanup; failing that, do a fresh checkout into another directory, export or file-copy the corrupted working copy (except for the .svn directory) over to the fresh checkout, and commit any local changes.
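A rough shell-style sketch of that fresh-checkout recovery (the repository URL and folder names are made up; adapt the paths for Windows):
svn cleanup broken-wc                                     # try the cheap fix first
svn checkout http://svn.example.com/repo/trunk fresh-wc   # otherwise, check out a clean copy
svn export broken-wc exported-changes                     # copy out the versioned files, minus .svn metadata
cp -R exported-changes/. fresh-wc/                        # lay your changes over the fresh checkout
cd fresh-wc && svn status                                 # review what changed, then commit
svn commit -m "Recommit local changes after working-copy corruption"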
Looks like one of your SVN files is corrupt. First, check in everything that can safely be checked in, and make sure to back up everything. Then fix the offending file; usually this involves deleting it from your repository. This should be okay if you're checking in a new version anyway.
I received a similar error after our project repository was moved to a new server. Try reverting your file and reapplying your changes.
I had the same problem. After googling for help, I found articles suggesting that I override the checksum in the .svn\entries file, but in that file the checksum was actually the same as the expected one in the error message.
To fix the problem, I navigated to the .svn\text-base directory of the problem file's folder and found a copy of the file I was trying to check in changes for. I opened that copy in Notepad++ and replaced its content with the content of the file to be committed, and I was able to commit afterwards.
But just in case, make a backup copy of the .svn\text-base file first.
I think this happened because I did an svn update before the commit, since it complained that my version was outdated. Anyway, it's fixed for me, and I hope my solution helps someone else too.
With TortoiseSVN, I chose to delete the file in the Repo Browser.
First back up the problem file, then use the Repo Browser to delete the file in the repository, then update the local folder so the local copy is deleted as well. Then copy the backup file back, do Add > Commit, and after that I could update successfully.
The disadvantage of this method is that the re-added file no longer carries its old revision history.
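The same delete-and-re-add approach from the command line might look roughly like this (the repository URL is made up; the file name comes from the error message above):
cp Header.ascx /tmp/Header.ascx.bak        # back up the problem file
svn delete http://svn.example.com/repo/trunk/web/Header.ascx -m "Remove corrupted file"
svn update                                 # the local copy disappears
cp /tmp/Header.ascx.bak Header.ascx        # restore the backup
svn add Header.ascx
svn commit -m "Re-add file after checksum mismatch"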
