How to merge 2 pending changelists with common files - Perforce

I have a pending CL in my workspace with files 'A', 'B', 'C'.
I have a local uncommitted changelist saved on my disk with files 'A', 'C', 'D'.
So while merging this local CL into the existing workspace with p4merge, it shows the following error:
"file already checked out in this workspace"
Is there a proper way to do this?

I am not sure what a local uncommitted changelist means here.
If I consider the CL with files 'A', 'C', 'D' to be another pending CL in the same workspace in which you have the pending CL with files 'A', 'B', 'C', then I think a force integration or merge can help you...
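If the second CL really is just another pending changelist in the same workspace, another option (different from the force integrate suggested above) is to consolidate the overlapping files into one pending changelist with p4 reopen, which works precisely because the files are already checked out. A rough P4Python sketch with placeholder changelist numbers:

from P4 import P4

p4 = P4()
p4.connect()

target_cl = "1001"  # placeholder: the pending CL with files A, B, C
source_cl = "1002"  # placeholder: the pending CL with files A, C, D

# Move every file opened in the second changelist into the first one.
for f in p4.run_opened('-c', source_cl):
    p4.run_reopen('-c', target_cl, f['depotFile'])

p4.disconnect()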

Related

fastest way to match data from two massive lists with differing data types?

I have data regarding a directory structure of unknown (and massive) size and data regarding the same structure from Perforce. Using Python, I need to be able to match the local data with the Perforce data and generate a list of files that reflects all of the data in the user's workspace (local directory), including all of the files missing from Perforce, as well as all the data in the depot that is missing from the workspace.
Local Directory Structure Data:
I have full control over how I mine out that data (currently using os.walk)
Perforce Data:
Not much control over how the data is returned
Currently comes as a list of dictionaries
Data returns very fast regardless of size.
#this list is hundreds of thousands of entries.
p4data_example = [{'depotFile': '//Path/To/Data/file.extension', 'clientFile': 'X:\\Path\\To\\Data\\file.extension', 'isMapped': '', 'headAction': 'add', 'headType': 'text', 'headTime': '00000', 'headRev': '1', 'headChange': '0000', 'headModTime': '00000', 'haveRev': '', 'otherOpen': ['stuff'], 'otherAction': ['move/delete'], 'otherChange': ['00000'], 'otherOpens': '1'}]
I need to operate on the local directory files whether or not they have matching p4 data.
import os

path_to_data = r"X:\Path\To\Data"
p4data = p4.run('fstat', "%s\\..." % path_to_data)

for root, dirs, files in os.walk(path_to_data, topdown=False):
    for file in files:
        file_name = os.path.join(root, file)
        # Linear scan over the fstat results for every local file;
        # this is the N x M comparison that dominates the run time.
        matchingp4 = None
        for p4item in p4data:
            if p4item['clientFile'] == file_name:
                matchingp4 = p4item
                break
        do_stuff_with_data(file_name, matchingp4)
I am confident this is not the most efficient way to handle this.
The extended time seems to come from:
Getting all of the local data
Needing to loop over the data so many times to find matches.
I need this to run as fast as possible. Ideally this would run in just a couple seconds but I understand that not knowing how large the data set can get will cause this to vary by an unknown amount.
Using Python, I need to be able to match the local data with the perforce data and find all of the local files missing from perforce and all of the perforce data that differs from the local data.
(snip)
I am confident this is not the most efficient way to handle this.
Correct. Just run p4 reconcile and Perforce will do all of this automatically. :)
reconcile does essentially what you're trying to do, but much more efficiently -- the client walks the local tree, sends a list of files to the server, and then instead of doing an NxN comparison the server uses the mapping information to directly request additional client checks (i.e. checksumming to detect differences) as appropriate for individual files.
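A minimal P4Python sketch of that approach, reusing the p4 connection and path_to_data from the question (the -n flag previews what reconcile would open, without actually opening anything):

# Preview what reconcile would do: 'add' means the file exists only locally,
# 'edit' means the local content differs, 'delete' means it is missing locally.
results = p4.run('reconcile', '-n', "%s\\..." % path_to_data)
for item in results:
    if isinstance(item, dict):
        print(item.get('action'), item.get('clientFile', item.get('depotFile')))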

What is the meaning of the last letter "a", "b", or "s" in Azure DevOps' C:\agent\_work\1\a? Anyone know what that last letter stands for?

What is the meaning of the last letter "a", "b", or "s" in Azure DevOps' directory structure:
C:\agent\_work\1\a
C:\agent\_work\1\b
C:\agent\_work\1\s
Anyone know what that last letter stands for?
Extra credit: what's the "1" represent too?
Check out the predefined variables:
\a likely stands for "artifacts"
Build.ArtifactStagingDirectory - The local path on the agent where any artifacts are copied to before being pushed to their destination. For example: c:\agent\_work\1\a
\b likely stands for "binaries"
Build.BinariesDirectory - The local path on the agent you can use as an output folder for compiled binaries. For example: c:\agent\_work\1\b.
\s likely stands for "sources"
Build.Repository.LocalPath - The local path on the agent where your source code files are downloaded. For example: c:\agent\_work\1\s. This variable is synonymous with Build.SourcesDirectory.
On hosted build agents "1" seems to be static; on a self-hosted agent I have noticed this number is unique per pipeline.
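Inside a running job these predefined variables are also exposed to scripts as environment variables (dots become underscores, names are uppercased), so a build step never needs to hard-code the "1". A small Python sketch:

import os

# Predefined variables surface as environment variables in pipeline steps:
# Build.ArtifactStagingDirectory -> BUILD_ARTIFACTSTAGINGDIRECTORY, and so on.
artifacts_dir = os.environ.get("BUILD_ARTIFACTSTAGINGDIRECTORY")  # ...\1\a
binaries_dir = os.environ.get("BUILD_BINARIESDIRECTORY")          # ...\1\b
sources_dir = os.environ.get("BUILD_SOURCESDIRECTORY")            # ...\1\s

print("artifacts:", artifacts_dir)
print("binaries:", binaries_dir)
print("sources:", sources_dir)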

P4Python: check which files are changed in a specific changelist

Using Python and P4Python, I'm trying to show the files that are changed in a changelist. I tried:
result = p4.run_describe("2631893", tagged = 0)
This shows the files in a changelist, but not which of them actually changed.
result = p4.run_diff("-sa")
shows all the changed files in the client. What I am looking for is a run_diff-like function that gives the names of the changed files in a specific changelist. Is that possible?
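One way to scope the diff to a single pending changelist (a sketch, not a single built-in call; the changelist number is the one from the question) is to list the files opened in that changelist and then diff only those:

cl = "2631893"
opened = p4.run_opened('-c', cl)           # files open in that changelist
files = [f['depotFile'] for f in opened]
if files:
    # 'diff -sa' reports only the opened files whose content actually differs.
    for item in p4.run_diff('-sa', *files):
        print(item['depotFile'] if isinstance(item, dict) else item)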
UPDATE:
After thinking twice, I realized I should probably describe what I am trying to do.
The idea is this: I check out some Simulink models and run code generation for all of them. There is already some generated code in the depot that belongs to each Simulink model. I need to check whether the models generate the same code that is already in the depot. If they do not, the names of those files should be printed. So my strategy is this:
1) Make a changelist. DONE
2) Check out the models in that changelist. DONE
3) Check out all the already generated files in a different changelist (let's call it CL2). DONE
4) Generate code. DONE
5) Revert unchanged files from that changelist (I don't know how to do this; it should only revert unchanged files from THAT changelist, e.g. CL2).
6) If CL2 is empty, then fine. Otherwise print the file names.
P4.revert('-a', CL2)
does not work, and I don't know how to get the number of files in a CL from Python.
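For steps 5 and 6, a rough P4Python sketch, assuming p4 is an already-connected P4() instance and that the changelist number is a placeholder ('p4 revert -a' reverts only unchanged files, '-c' limits it to one changelist, and 'p4 opened -c' lists what is still open there):

cl2 = "12345"  # placeholder for the CL2 changelist number

# Step 5: revert only the files in CL2 whose content is unchanged.
p4.run_revert('-a', '-c', cl2, '//...')

# Step 6: anything still open in CL2 generated code that differs from the depot.
still_open = p4.run_opened('-c', cl2)
if not still_open:
    print("CL2 is empty - generated code matches the depot")
else:
    for f in still_open:
        print(f['depotFile'])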

Terraform: dynamically create list of resources

Due to some prior decisions, there is a script that creates ALBs and a completely separate one to set up alarms for each ALB created (odd, but I can't change this).
I could hard-code a list of all the ALBs and iterate through them, for example:
albs = ['a', 'b']
I know how to iterate through a list with a "for_each".
What I need is to build the list dynamically so I don't have to manually maintain the list. I know I can get a list of ALBs using:
terraform state list [options] ## https://www.terraform.io/docs/commands/state/list.html
but that doesn't really help (sure, I can pipe that to a file and iterate through the lines in the file and pass them as parameters - but that is ugly as sin)
How do I dynamically build the list with all my ALBs? Something like:
albs = state_list([options])
Thanks! Using AWS.

Perforce - how to back-out changelist from master branch

I have the following changelists in Perforce:
1 - some work on //depot/A/file
2 - some work on //depot/A/file
3 - branching of //depot/A to //depot/B
4 .... - some work on //depot/A/file
And I want to back out changelist 2 on //depot/B.
I've tried the following:
p4 sync //depot/B/file#1
p4 edit //depot/B/file
p4 sync //depot/B/file#2
....
but an error occurred on the first line:
//depot/B/file#1 - no file(s) at that changelist number.
Is there any way how to achieve this without submitting into //depot/A branch?
Here's what I'd do:
p4 copy //depot/A/...#1 //depot/B/...
p4 submit
p4 merge //depot/A/...#2 //depot/B/...
p4 resolve -ay
p4 submit
p4 merge //depot/A/... //depot/B/...
p4 resolve -am
p4 resolve
p4 submit
You could potentially do this all within a single changelist as well, but it gets a little trickier then -- the above keeps it simple and leaves a history that is easy to follow (i.e. each revision is clearly "copied from this change," "ignored this change", or "merged these changes" rather than a single revision that mushes those actions all together).
You can't simply take out 2 from B because it came together from A as one change (1 & 2).
I think the only way to achieve this is:
roll back 3 on B (p4 edit //depot/B/file; p4 sync //depot/B/file#0; p4 submit //depot/B/file or p4 delete //depot/B/file; p4 submit //depot/B/file)
integrate 1 from A to B
integrate 4 from A to B
Having said that, this has two drawbacks:
if you ever want to re-integrate 2 from A to B in the future, P4 will be confused because it knows that it already has integrated 2 from A to B
if you want to integrate back from B to A, this will propagate the reversal of 2 on B back to A, which probably isn't what you want.
So, even though it's more elaborate, the only correct way to revert an integration is exactly what you don't want to do:
roll back 2 on A
integrate A to B
re-submit 2 on A
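A rough P4Python sketch of those three steps, assuming a connected p4 object and the single file from the question; the roll-back uses the classic back-out recipe (sync to the old revision, open for edit, sync to head, resolve by accepting yours), and the submit descriptions are placeholders:

f = '//depot/A/file'

# 1) Roll back change 2 on A.
p4.run_sync(f + '#1')        # get the pre-change content
p4.run_edit(f)
p4.run_sync(f)               # schedules a resolve against the head revision
p4.run_resolve('-ay', f)     # accept yours: keep the rev-1 content
p4.run_submit('-d', 'Back out change 2 on //depot/A')

# 2) Integrate A to B.
p4.run_integrate('//depot/A/...', '//depot/B/...')
p4.run_resolve('-am')
p4.run_submit('-d', 'Integrate A to B without change 2')

# 3) Re-apply change 2 on A (edit the file and restore the change,
#    or back out the back-out with the same recipe as step 1).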
Based on your attempt to sync to //depot/B/file#1, I'm assuming the file did not previously exist on //depot/B/...?
If my assumption is correct, you'll want to delete the file:
p4 delete //depot/B/file
and submit it.
If my assumption is incorrect and your newly-branched file is #2 or higher, then:
p4 edit //depot/B/file#1
p4 resolve -ay //depot/B/file
p4 submit
