Perforce - Map a folder onto another location on the server

In Perforce, we have a few projects that all share the same 'framework' code. Right now we keep a copy of the framework under each project folder and merge between the projects. We have considered keeping the framework in one folder and mapping it into each workspace, but that would require each developer to set up the mapping and might get messy. Is there a way to do this mapping server-side so that anyone who takes a copy of a project also gets the framework folder?
We currently have
\\Project1\Source\Framework
\\Project2\Source\Framework
We would like to have
\\Project1\Source
\\Project2\Source
\\Framework
and somehow have each developer get Framework under each project's Source folder on her machine.

This is what streams are for. Make //Project1 and //Project2 stream depots, and then define streams that look like this:
Stream: //Project1/Source
Paths:
    share ...
    import Framework/... //Framework/...

Stream: //Project2/Source
Paths:
    share ...
    import Framework/... //Framework/...
Clients associated with those streams will automatically get the mappings that you're looking for.
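In case it helps, here is a rough sketch of the server-side setup, assuming the depots don't exist yet (each command opens the corresponding spec form, and exact flags may vary with your server version):

p4 depot -t stream Project1
p4 depot -t stream Project2
p4 depot Framework                       # the imported code can live in a regular depot
p4 stream -t mainline //Project1/Source  # fill in the Paths field as shown above
p4 stream -t mainline //Project2/Source

Each developer then binds a workspace to a stream, and the import mapping comes along automatically:

p4 client -S //Project1/Source project1-ws
p4 sync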

Related

Using Haxe to edit a remote file?

I've searched haxelib for a library I could use to remotely edit a file on a server over an SSH connection with Haxe, or to list the files in a directory.
Has anyone done this with Haxe?
I want to build a desktop app, a YAML editor that changes the settings files of several servers, using a frontend like haxe-ui.
Ok, there are probably a lot of ways you could do it, but I would suggest separating your concerns:
desktop app to create a yaml editor
Ok, that's a fine use case for Haxe / a programming language. Build an editor, check.
change settings files (located on) several servers
Ok, so you have options here. Either
Make the remote files appear as local files via some network file system, or
Copy the files locally, edit them, and copy them back, or
Roll your own network-enabled service that runs on each server, receives commands, and modifies the files.
Random aside: Given that these are settings files, you probably also want to restart some service after changes are made.
I'd say option 2 is the easiest. There are even many ways to do that:
Use scp to bring the settings files to a local location, edit them locally, and then push them back (see the Haxe sketch at the end of this answer). And if you set up SSH keys, you won't have to bother with passwords.
Netcat is another tool for pushing bytes (aka files) over the network. It's simpler than scp, but with no security measures.
Or, get creative / crazy, and say, "my settings files will all be stored in a git repo. The 'sync' process will be a push / pull setup."
There are simply lots of ways to get this done.
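To tie that back to Haxe: option 2 can be as small as shelling out to scp from your editor. A minimal sketch, assuming a sys target (hxcpp, HashLink, Neko, ...), SSH keys already set up, and hypothetical host and paths:

class SettingsSync {
    static function main() {
        var host = "deploy@example-server";        // hypothetical host
        var remote = "/etc/myapp/settings.yaml";   // hypothetical remote path
        var local = "settings.yaml";

        // Pull the file down...
        if (Sys.command("scp", ['$host:$remote', local]) != 0)
            throw "scp download failed";

        // ...let the user edit the local copy in your haxe-ui editor...

        // ...and push it back when they hit save.
        if (Sys.command("scp", [local, '$host:$remote']) != 0)
            throw "scp upload failed";
    }
}

Sys.command runs the external program and returns its exit code, so you can surface failures in the UI instead of throwing.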

Setting up a trigger to watch new folders in Azure Logic Apps

I am trying to create a logic app that will transfer files as they are created from my FTP server to my Azure file share. The folder my trigger is watching is organised by date (see below). Each day that a file is added, a new folder is created, so I need the trigger to check new subfolders, but I don't want to go into the app every day to change which folder the trigger looks at. Is this possible?
Here's how my folder (called DATA) is structured; each day that a file is added, a new folder is created.
-DATA-
2016-10-01
2016-10-02
2016-10-03
...
The FTP Connector uses configurable polling, where you set how often it should look for a file. The trigger currently does not support dynamic folders. However, what you could try is the following:
Trigger your logic app by recurrence (same principle as the FTP trigger in fact)
Action: Create a variable to store the date-time (in the format used in your folder naming)
Action: List files in folder (here you should be able to set the folder name dynamically using the variable you created; see the expression sketch at the end of this answer)
For-each file in folder
Action: Get File Content
Whatever else you need to do with the file (calling a nested logic app is smart if you need to run multiple processing actions on each file or handle resubmits of the flow per file)
In order to avoid picking up every file each time, you will need a way to exclude files which have been processed in an earlier run. So either rename the file after it's processed to an extension you can exclude in the next run, or move the file to a subfolder "Processed\datetime" in the root.
This solution will require more actions and thus will be more expensive. I haven't tried it out, but I think this should work. Or at least it's the approach I would try to set up.
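As a sketch of the dynamic folder name mentioned in the list above, the variable (or the folder path of the "List files in folder" action directly) could use an expression along these lines, assuming the folders are named yyyy-MM-dd under a root called DATA as in the question:

concat('/DATA/', formatDateTime(utcNow(), 'yyyy-MM-dd'))

utcNow(), formatDateTime() and concat() are standard workflow expression functions; if your server folders are not named in UTC you may need convertTimeZone() before formatting.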
Unfortunately, what you're asking is not possible with the current FTP Connector. And there isn't any really great solution right now... :(
As an aside, I've seen this pattern several times and, as you are seeing, it just causes more problems than it could solve, which realistically is 0. :)
If you own the FTP Server, the best thing to do is put the files in one folder.
If you do not own the FTP Server, politely mention to the owner that this pattern is causing problems and doesn't help you in any way, so please put the files in one folder ;)

Source code location for debugging multiple instances of an application

I have an application running separately (one instance per customer) in different folders, one for each customer.
Each customer is a separate user on my machine.
At the moment I have the source code in each of these folders, and I rebuild the code for each instance. Would it be better to do something like the following?
create a shared folder where I build the code
deploy the binary in each user folder.
allow permission for each user to access the source code in READ ONLY mode.
when it is time to debug, gdb run from each user folder will be able to read the source code, so debugging will work.
Do you think this would be a better approach, or is there a better practice?
My only concern is that each user would be able to read the source code, but since users do not access their folders directly (they are under my control), this should not trouble me.
I am using CentOS 6.4, SVN and G++/GDB.
in different folders
There are no "folders" on UNIX, they are called directories.
I rebuild the code per each instance
Why would you do that?
Is the code identical (it sounds like it is)? If so, build the application once. There is no reason at all to have multiple copies of the resulting binary, or the sources.
If you make the directory with sources and binaries world-readable, then every user will be able to debug it independently.
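A minimal sketch of that setup (all paths are hypothetical):

# build once, with debug info, into the shared location
g++ -g -O0 -o /opt/shared/myapp/bin/myapp src/*.cpp

# from any customer directory, debug the shared binary
gdb /opt/shared/myapp/bin/myapp
(gdb) directory /opt/shared/myapp/src
(gdb) set substitute-path /original/build/dir /opt/shared/myapp/src

directory adds the shared source tree to gdb's search path, and set substitute-path is only needed if the binary records a build path different from where the sources are readable. gdb needs nothing more than read access, so a world-readable (or group-readable) directory is enough.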

How to handle downloadable assets in MvvmCross

I have an app that synchronises content with a web server so that the app ends up with an offline and cut down version of the server based web pages. All text and html is stored in a SQLite database but what is the best approach for handling file assets? In my case this is a mix of image and audio files.
The synchronisation is all set up in the core project and my Touch project has a Content directory set up for storing the assets and my intention had been to have a similar setup for Droid. I could pass the list of files needed to the UI projects and download them from there but that seems wrong.
Thanks.
For that I would create a Service in Mvx which the ViewModels you create use to get the external assets. Take, for instance, the Daily Dilbert tutorial. You could consider the daily comics to be very similar to your external assets: the DilbertService is used to get all the comics and present them in a list. However, your list could be a list of files located on the SD card, or wherever you decide to store your files.
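A minimal sketch of what such a service could look like in the Core project (all names here are hypothetical, and the file I/O is shown with System.IO for brevity; in a PCL core you would typically go through the MvvmCross File plugin instead):

using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Implemented per platform: returns where an asset should live
// (e.g. the Content directory on Touch, the SD card on Droid).
public interface IAssetStore
{
    string GetLocalPath(string assetName);
}

public interface IAssetService
{
    Task DownloadMissingAssetsAsync(IEnumerable<string> assetNames);
}

public class AssetService : IAssetService
{
    private readonly IAssetStore _store;

    public AssetService(IAssetStore store)
    {
        _store = store;
    }

    public async Task DownloadMissingAssetsAsync(IEnumerable<string> assetNames)
    {
        using (var http = new HttpClient())
        {
            foreach (var name in assetNames)
            {
                var target = _store.GetLocalPath(name);
                if (File.Exists(target))
                    continue; // already synchronised

                // hypothetical server URL
                var bytes = await http.GetByteArrayAsync("https://example.com/assets/" + name);
                File.WriteAllBytes(target, bytes);
            }
        }
    }
}

A ViewModel can then take IAssetService as a constructor parameter and have it injected, provided the service is registered (for example with the standard CreatableTypes().EndingWith("Service").AsInterfaces().RegisterAsLazySingleton() call in App.cs).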

Publish Web App files to different locations

I'm starting to develop modules for DotNetNuke. I followed different tutorials (most by Chris Hammond), but there is something I don't like, and I'm looking for a different way to do it.
He recommends putting a DotNetNuke installation, with IIS and SQL Server, on the developer PC and putting your project into the DesktopModules folder. I don't like that because I want to keep my project separate from DotNetNuke.
Is there a way to split the build/publish to different locations, e.g. the DLL into folder x and all the other files into folder y?
You could, but I don't really see the point. I see where you're coming from because it seemed awkward to me at first as well, but it really is the most efficient way to develop on the DNN platform. I have mine set up so all of my modules are in the same solution and branched in source from the root DNN folder. We don't keep the DNN core in source so the developer is responsible for that, although that may change at some point to keep versioning consistent.
By keeping your project located where it's installed, you can develop your modules the same way you'd develop any other web app you're building. If you make a change in markup you just have to save the file and refresh your page. If you change something in code just build and refresh.
If you really must keep them separate, you can absolutely do so (really the only benefit of this that I can see is that if you uninstall a module and accidentally click the checkbox to delete files - it happens - you don't have to worry about it). Create your project where you want it, change the Output Path to your DNN bin folder, and create post-build events to copy all of your .js, .ascx, and .css files (plus any others you may need - images, HTML files, XML files, etc.) to appropriate folder(s) in the DesktopModules folder. Just remember that you have to build the project every time you make ANY changes to test them, and you have to write/change your post-build events every time you add a new type of resource, change/add a directory, etc.
Either that or you can build an install package and uninstall/reinstall the module every time you change some padding in your stylesheet ... but I'd stick with keeping the project in the DesktopModules folder.
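For the post-build-event route described above, a rough sketch of what the copy commands could look like (the website path and module name are hypothetical; adjust them to your installation, and remember the Output Path already takes care of the DLL):

rem copy views, stylesheets and scripts into the module's DesktopModules folder
xcopy "$(ProjectDir)*.ascx" "C:\websites\dnndev.me\DesktopModules\MyModule\" /Y
xcopy "$(ProjectDir)Styles\*.css" "C:\websites\dnndev.me\DesktopModules\MyModule\Styles\" /Y /I
xcopy "$(ProjectDir)Scripts\*.js" "C:\websites\dnndev.me\DesktopModules\MyModule\Scripts\" /Y /I

$(ProjectDir) is the standard Visual Studio macro for the project folder; /Y suppresses overwrite prompts and /I tells xcopy the target is a directory.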
Sure, you can do that.
Set up your solution and module projects anywhere, build the projects, and copy the appropriate parts (such as the *.ascx, *.ascx.resx and *.dnn files) back to your website folder -
website/DesktopModules/Your_module_name
Copy the module DLL to the website's bin and you're good to go.
