Perforce code checkout on Windows

My perforce server is running on Linux.
I installed the Perforce command-line client on Windows. When I try to check out files with the following command:
p4 -u user -P passwd sync -f ...
I see that it's refreshing all the files but not checking them out. It could be related to a path-separator difference on Windows. (Linux and Windows use different separators, e.g. "a/b" vs. "a\b".)
How can I check out the code base in this situation?

If by "refreshing all the files" you mean that you're getting their local copies, then that's exactly what p4 sync does. Don't expect it to check files out in the sense of marking them as being worked on by you. For that, use p4 edit.

How to find the local perforce depot path?

I am trying to write a Maven compile command and want it to work on my colleagues' machines as well; since they have a different Perforce depot path than mine, I need a unified way to find it.
I have tried p4 where but it didn't help.
When you run the p4 where command, make sure that you're using the correct client workspace.
E.g.:
p4 set P4CLIENT=my_client_name
p4 where //depot/...
or:
p4 -c my_client_name where //depot/...
The example you describe of p4 where returning the wrong path indicates that it's giving the answer in terms of a different client workspace.
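For illustration, p4 where prints three paths per file: the depot path, the client path, and the local filesystem path. The workspace name and paths below are made up:
p4 -c my_client_name where //depot/a/b.c
//depot/a/b.c //my_client_name/a/b.c /home/user/ws/a/b.c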

Revert file in Perforce P4V from a different computer

I have a couple of files checked out in Perforce on a different computer that I won't have access to until tomorrow. I received a request to revert them, but I'm not sure how to do that from my home computer. I don't see an option and can't find anything about this for the visual client. Is it even possible?
I ended up finding a suitable solution for myself.
I opened P4V and navigated to my workspaces. I edited the one with the checked-out files and changed the workspace root and host to match my current computer. After applying that, I was able to revert the files as I normally would from the P4V client.
p4 revert accepts the client (i.e. workspace) name as an option.
From P4V: File > Open Command Window Here, then try:
p4 revert -C yourotherworkspace -c changelist //...
If that is not enough, you might have to add global options, e.g.:
p4 -H remote_computer_name revert -C yourotherworkspace -c changelist //...

Going back to the history of my working directory

I ran git commit directly on a remote repo, then did a git reset --hard HEAD^, and I don't know which files in the remote directory were deleted. Is it possible to find the history of the files in my remote working directory? Perhaps a bash command would suffice?
You can view what files were deleted with:
git log --diff-filter=D --summary | grep delete
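Once you know which commit deleted a file, you can restore it from that commit's parent (the commit hash and path below are placeholders):
git checkout <deleting_commit>^ -- path/to/file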

p4 sync issue: Unable to sync all files after changing mapping and deleting folder

I wanted to change the mapping path in my Perforce client.
For example:
//depot/a/... /home/user/xyz/...
to
//depot/a/... /home/user/p4/xyz/...
After changing the path using p4 client, I deleted the folder xyz with rm -rf.
Then I ran p4 sync, but some files are not getting synced to the new path p4/xyz/.
The server thinks that the workspace/client already has the #head revision of these files and does not need to sync them again.
You can try the -f option to force a refresh, but that is taking a hammer to a problem that just needs a pair of pliers.
I would try a p4 sync -k //depot/a/...#none to update the server's 'have' list. Type p4 help sync for more information about the command.
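A sketch of that gentler approach, reusing the depot path from the question:
p4 sync -k //depot/a/...#none   # reset the server's have list without touching local files
p4 sync //depot/a/...           # a normal sync now re-fetches everything into the new path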
Try using the force (-f) flag. See the documentation.
There should also be a checkbox in the GUI, if you're using the visual client.
I had the same problem and none of the solutions worked. It turned out that the file I was trying to sync was open in Perforce. Once I reverted it with p4 revert, it synced fine.
The solution below works for me.
p4 sync -f //depot/a/...
What I did was sync the whole base directory, i.e. the directory that contains all my files and folders.
The -f flag is necessary because it actually copies the files, which doesn't happen in a normal sync (try ls on a file to confirm).
If only a single file was removed with rm -rf, you can do
p4 sync -f path/to/file

Keep Remote Directory Up-to-date

I absolutely love the Keep Remote Directory Up-to-date feature in Winscp. Unfortunately, I can't find anything as simple to use in OS X or Linux. I know the same thing can theoretically be accomplished using changedfiles or rsync, but I've always found the tutorials for both tools to be lacking and/or contradictory.
I basically just need a tool that works in OSX or Linux and keeps a remote directory in sync (mirrored) with a local directory while I make changes to the local directory.
Update
Looking through the solutions, I see a couple which solve the general problem of keeping a remote directory in sync with a local directory manually. I know that I can set a cron task to run rsync every minute, and this should be fairly close to real time.
This is not the exact solution I was looking for as winscp does this and more: it detects file changes in a directory (while I work on them) and then automatically pushes the changes to the remote server. I know this is not the best solution (no code repository), but it allows me to very quickly test code on a server while I develop it. Does anyone know how to combine rsync with any other commands to get this functionality?
lsyncd seems to be the perfect solution. It combines inotify (a kernel facility that watches for file changes in a directory tree) and rsync (a cross-platform file-syncing tool).
lsyncd -rsyncssh /home remotehost.org backup-home/
Quote from GitHub:
Lsyncd watches a local directory trees event monitor interface (inotify or fsevents). It aggregates and combines events for a few seconds and then spawns one (or more) process(es) to synchronize the changes. By default this is rsync. Lsyncd is thus a light-weight live mirror solution that is comparatively easy to install not requiring new filesystems or blockdevices and does not hamper local filesystem performance.
How "real-time" do you want the syncing? I would still lean toward rsync since you know it is going to be fully supported on both platforms (Windows, too, with cygwin) and you can run it via a cron job. I have a super-simple bash file that I run on my system (this does not remove old files):
#!/bin/sh
rsync -avrz --progress --exclude-from .rsync_exclude_remote . remote_login@remote_computer:remote_dir
# options
# -a archive
# -v verbose
# -r recursive
# -z compress
Your best bet is to set it up and try it out. The -n (--dry-run) option is your friend!
Keep in mind that rsync (at least in cygwin) does not support unicode file names (as of 16 Aug 2008).
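If you go the cron route, a hypothetical crontab entry running a script like the one above every minute could look like this (the script path is made up):
* * * * * /home/user/bin/rsync_push.sh >/dev/null 2>&1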
What you want to do for Linux remote access is use sshfs, the SSH File System.
# sshfs username@host:path/to/directory local_dir
Then treat it like a network mount, which it is...
There's a bit more detail, like how to set it up so you can do this as a regular user, on my blog.
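When you are done, unmount it like any other FUSE filesystem (fusermount is the Linux tool; on OS X you would use umount):
fusermount -u local_dir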
If you want the asynchronous behavior of winSCP, you'll want to use rsync combined with something that executes it periodically. The cron solution above works, but may be overkill for the winscp use case.
The following command will execute rsync every 5 seconds to push content to the remote host. You can adjust the sleep time as needed to reduce server load.
# while true; do rsync -avrz localdir user@host:path; sleep 5; done
If you have a very large directory structure and need to reduce the overhead of the polling, you can use 'find':
# touch -d 01/01/1970 last; while true; do if [ "`find localdir -newer last -print -quit`" ]; then touch last; rsync -avrz localdir user@host:path; else echo -ne .; fi; sleep 5; done
And I said cron may be overkill? But at least this is all just done from the command line, and can be stopped via a ctrl-C.
To detect changed files, you could try fam (file alteration monitor) or inotify. The latter is linux-specific, fam has a bsd port which might work on OS X. Both have userspace tools that could be used in a script together with rsync.
I have the same issue. I loved winscp's "keep remote directory up to date" command. However, in my quest to rid myself of Windows, I lost winscp. I did write a script that uses fileschanged and rsync to do something similar, much closer to real time.
How to use:
Make sure you have fileschanged installed
Save this script in /usr/local/bin/livesync or somewhere reachable in your $PATH and make it executable
Use Nautilus to connect to the remote host (sftp or ftp)
Run this script by doing livesync SOURCE DEST
The DEST directory will be in /home/[username]/.gvfs/[path to ftp scp or whatever]
A couple of downsides:
It is slower than winscp (my guess is because it goes through Nautilus and has to detect changes through rsync as well)
You have to manually create the destination directory if it doesn't already exist. So if you're adding a directory, it won't detect and create the directory on the DEST side.
Probably more that I haven't noticed yet
Also, do not attempt to synchronize a SRC directory named "rsyncThis". That will probably not be good :)
#!/bin/sh

# Read changed-file paths on stdin and rsync each one to the remote directory.
upload_files()
{
    if [ "$HOMEDIR" = "." ]
    then
        HOMEDIR=`pwd`
    fi

    while read input
    do
        SYNCFILE=${input#$HOMEDIR}
        echo -n "Sync File: $SYNCFILE..."
        rsync -Cvz --temp-dir="$REMOTEDIR" "$HOMEDIR/$SYNCFILE" "$REMOTEDIR/$SYNCFILE" > /dev/null
        echo "Done."
    done
}

help()
{
    echo "Live rsync copy from one directory to another. This will overwrite the existing files on DEST."
    echo "Usage: $0 SOURCE DEST"
}

case "$1" in
    rsyncThis)
        # Internal mode: invoked by the pipeline below, with the file list on stdin.
        HOMEDIR=$2
        REMOTEDIR=$3
        echo "HOMEDIR=$HOMEDIR"
        echo "REMOTEDIR=$REMOTEDIR"
        upload_files
        ;;
    help)
        help
        ;;
    *)
        # Normal mode: watch SOURCE with fileschanged and pipe changes back to ourselves.
        if [ -n "$1" ] && [ -n "$2" ]
        then
            fileschanged -r "$1" | "$0" rsyncThis "$1" "$2"
        else
            help
        fi
        ;;
esac
You could always use version control, like SVN, so all you have to do is have the server run svn up on a folder every night. This runs into security issues if you are sharing your files publicly, but it works.
If you are using Linux though, learn to use rsync. It's really not that difficult, as you can test every command with -n. Go through the man page; the basic format you will want is
rsync [OPTION...] SRC... [USER@]HOST:DEST
The command I run from my school server to my home backup machine is this:
rsync -avi --delete ~ me@homeserv:~/School/ >> BackupLog.txt
This takes all of the files in my home directory (~) and uses rsync's archive mode (-a), verbose mode (-v), lists all of the changes made (-i), deletes any files that no longer exist (--delete), and puts them in the folder /home/me/School/ on my remote server. All of the information it prints (what was copied, what was deleted, etc.) is also appended to the file BackupLog.txt.
I know that's a whirlwind tour of rsync, but I hope it helps.
The rsync solutions are really good, especially if you're only pushing changes one way. Another great tool is unison; it attempts to synchronize changes in both directions. Read more at the Unison homepage.
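For illustration, a minimal unison invocation that syncs a local tree with a remote one over ssh (paths and host are placeholders):
unison /home/me/project ssh://me@homeserv//home/me/project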
Great question! I searched for an answer for hours!
I have tested lsyncd, and the problem is that the default delay is far too long and no example command line gives the -delay option.
The other problem is that by default rsync asks for a password each time!
Solution with lsyncd:
lsyncd --nodaemon -rsyncssh local_dir remote_user@remote_host remote_dir -delay .2
Another way is to use inotifywait in a script:
while inotifywait -r -e modify,create,delete local_dir ; do
# if you need you can add wait here
rsync -avz local_dir remote_user@remote_host:remote_dir
done
For this second solution, you will have to install the inotify-tools package.
To avoid entering a password on each change, simply use ssh-keygen:
https://superuser.com/a/555800/510714
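The usual recipe, sketched with placeholder user and host names:
ssh-keygen -t ed25519                 # generate a key pair; accept the defaults
ssh-copy-id remote_user@remote_host   # install the public key on the server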
It seems like perhaps you're solving the wrong problem. If you're trying to edit files on a remote computer, then you might try something like the FTP plugin for jEdit (http://plugins.jedit.org/plugins/?FTP). This ensures that you have only one version of the file, so it can't ever be out of sync.
Building off of icco's suggestion of SVN, I'd suggest that if you are using Subversion or similar for source control (and if you aren't, you should probably start), you can keep the production environment up to date by putting the update command into the post-commit hook.
There are a lot of variables in how you'd want to do that, but what I've seen work is have the development or live site be a working copy and then have the post-commit use an ssh key with a forced command to log into the remote site and trigger an svn up on the working copy. Alternatively in the post-commit hook you could trigger an svn export on the remote machine, or a local (to the svn repository) svn export and then an rsync to the remote machine.
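A sketch of such a post-commit hook, assuming key-based ssh access to the live server (user, host, and working-copy path are hypothetical):
#!/bin/sh
# hooks/post-commit -- Subversion passes the repository path and revision
REPOS="$1"
REV="$2"
# update the remote working copy in the background so the commit isn't delayed
ssh deploy@liveserver "svn up /var/www/site" >/dev/null 2>&1 &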
I would be worried about things that detect changes and push them, and I'd even be worried about things that ran every minute, just because of race conditions. How do you know it's not going to transfer the file at the very same instant it's being written to? Stumble across that once or twice and you'll lose all of the time-saving advantage you had by constantly rsyncing or similar.
Will DropBox (http://www.getdropbox.com/) do what you want?
Use watcher.py and rsync to automate this. Step-by-step instructions here:
http://kushellig.de/linux-file-auto-sync-directories/
I used to have the same setup under Windows as you, that is a local filetree (versioned) and a test environment on a remote server, which I kept mirrored in realtime with WinSCP. When I switched to Mac I had to do quite some digging before I was happy, but finally ended up using:
SmartSVN as my subversion client
Sublime Text 2 as my editor (already used it on Windows)
SFTP-plugin to ST2 which handles the uploading on save (sorry, can't post more than 2 links)
I can really recommend this setup, hope it helps!
I have been using WinSCP on Wine for a few years now and it works fine for the syncing operations you mention.
Here are some instructions I posted to GitHub on how to set it up via Wine: WinSCP_On_Wine
Just be aware that WinSCP is not actively tested on Wine, so there may be some quirky issues. However, I use it daily on Ubuntu 20.04 for all my devops work, have never lost a file, and rarely experience any such quirks.
You can also use Fetch as an SFTP client, and then edit files directly on the server from within that. There are also SSHFS (mount an ssh folder as a Volume) options. This is in line with what stimms said - are you sure you want stuff kept in sync, or just want to edit files on the server?
OS X has its own file notification system; this is what Spotlight is based on. I haven't heard of any program that uses it to keep things in sync, but it's certainly conceivable.
I personally use RCS for this type of thing: whilst it has a manual aspect, it's unlikely I'd want to push something even to the test server from my dev machine without testing it first. And if I am working on a development server, then I use one of the options given above.
Well, I had the same kind of problem, and it is possible using these together: rsync, SSH passwordless login, Watchdog (a Python sync utility) and Terminal Notifier (an OS X notification utility made with Ruby; not needed, but it helps to know when the sync has finished).
I created the key for passwordless login using this tutorial from the Dreamhost wiki: http://cl.ly/MIw5
1.1. When you finish, test that everything is OK; if you can't log in without a password, you may have to try an afp mount instead. Dreamhost (where my site is) does not allow afp mounts but does allow passwordless login. In Terminal, type:
ssh username@host.com
You should log in without being asked for a password :P
I installed Terminal Notifier from the GitHub page: http://cl.ly/MJ5x
2.1. I used the gem installer command. In Terminal, type:
gem install terminal-notifier
2.2. Test that the notification works. In Terminal, type:
terminal-notifier -message "Starting sync"
Create a sh script to test the rsync + notification. Save it anywhere you like, with any name you like. In this example, I'll call it ~/Scripts/sync.sh. I used the .sh extension, but I don't know if it's needed.
#!/bin/bash
terminal-notifier -message "Starting sync"
rsync -azP ~/Sites/folder/ user#host.com:site_folder/
terminal-notifier -message "Sync has finished"
3.1. Remember to give execute permission to this sh script. In Terminal, type:
sudo chmod 777 ~/Scripts/sync.sh
3.2. Run the script and verify that the messages are displayed correctly and that rsync actually syncs your local folder with the remote one.
Finally, I downloaded and installed Watchdog from the GitHub page: http://cl.ly/MJfb
4.1. First, I installed the libyaml dependency using Brew (there's plenty of help on installing Brew, which is like an "aptitude" for OS X). In Terminal, type:
brew install libyaml
4.2. Then, I used the easy_install command. Go to the folder of Watchdog and type in Terminal:
easy_install watchdog
Now everything is installed! Go to the folder you want to be synced, adjust the code below to your needs, and type it in Terminal:
watchmedo shell-command \
--patterns="*.php;*.txt;*.js;*.css" \
--recursive \
--command='~/Scripts/Sync.sh' \
.
It has to be EXACTLY this way, with the backslashes and line breaks, so you'll have to copy these lines to a text editor, change the script path, paste them into Terminal, and press Return.
I tried it without the line breaks, and it didn't work!
On my Mac, I always get an error, but it doesn't seem to affect anything:
/Library/Python/2.7/site-packages/argh-0.22.0-py2.7.egg/argh/completion.py:84: UserWarning: Bash completion not available. Install argcomplete.
Now, make some changes to a file inside the folder and watch the magic!
I'm using this little Ruby script:
#!/usr/bin/env ruby
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Rsyncs 2 folders
#
# watchAndSync by Mike Mitterer, 2014 <http://www.MikeMitterer.at>
# with credit to Brett Terpstra <http://brettterpstra.com>
# and Carlo Zottmann <https://github.com/carlo/haml-sass-file-watcher>
# Found link on: http://brettterpstra.com/2011/03/07/watch-for-file-changes-and-refresh-your-browser-automatically/
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

trap("SIGINT") { exit }

if ARGV.length < 2
  puts "Usage: #{$0} watch_folder sync_folder"
  puts "Example: #{$0} web keepInSync"
  exit
end

dev_extension = 'dev'   # currently unused
filetypes = ['css', 'html', 'htm', 'less', 'js', 'dart']

watch_folder = ARGV[0]
sync_folder = ARGV[1]

puts "Watching #{watch_folder} and subfolders for changes in project files..."
puts "Syncing with #{sync_folder}..."

# Poll file mtimes once a second; rsync whenever something changed.
while true do
  files = []
  filetypes.each { |type|
    files += Dir.glob(File.join(watch_folder, "**", "*.#{type}"))
  }
  new_hash = files.collect { |f| [f, File.stat(f).mtime.to_i] }
  hash ||= new_hash
  diff_hash = new_hash - hash

  unless diff_hash.empty?
    hash = new_hash
    diff_hash.each do |df|
      puts "Detected change in #{df[0]}, syncing..."
      system("rsync -avzh #{watch_folder} #{sync_folder}")
    end
  end

  sleep 1
end
Adapt it for your needs!
If you are developing Python on a remote server, PyCharm may be a good choice for you. You can synchronize your remote files with your local files using PyCharm's remote development feature. The guide is here:
https://www.jetbrains.com/help/pycharm/creating-a-remote-server-configuration.html
