We are testing a setup to automate building of NSFs using headless Designer.
When a developer pushes a change to a repository on GitHub, this ultimately results in an update of an NSF that resides on a Domino server:
local ODP -> GitHub -> local NSF built with headless Designer -> replace design of NSF on Domino server
However, we noticed that the process sometimes stops. As far as we can see, headless Designer cannot "copy" (or translate) the design elements from the ODP into a new local NSF, so only an empty skeleton NSF is created.
We noticed that the stop does NOT occur when the name property in the .project file of the ODP has changed.
So it looks as if Designer still has the ODP in memory and does not notice any changes, unless it "finds" a "new" project via the project description "name".
Has anyone experienced something similar, or are there recommendations on how to start Designer without any cache?
An (in?)elegant solution I use, since I want to keep things tidy and separate for each build, is to compute a unique filename for the NSF. This creates a separate NSF for each build; while it retains things like the application name and template name, it is unique in enough other ways not to cause issues for the DDE headless build.
Defining in CI Config
Using Env Vars in PowerShell Script
For example, I use an app-specific prefix, defined in my GitLab CI config, combined with the unique build number (both set as environment variables); my modified version of Egor Margineanu's PowerShell script picks these up for the build.
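The naming scheme itself is trivial; here is a minimal Python sketch of the same idea, assuming a hypothetical `APP_PREFIX` variable defined in the CI config, plus GitLab's built-in `CI_PIPELINE_ID`:

```python
import os

def unique_nsf_name() -> str:
    """Compute a per-build NSF filename from CI environment variables."""
    # APP_PREFIX is a made-up variable set in the CI config;
    # CI_PIPELINE_ID is set by GitLab CI for every pipeline.
    prefix = os.environ.get("APP_PREFIX", "myapp")
    build = os.environ.get("CI_PIPELINE_ID", "0")
    return f"{prefix}-{build}.nsf"

# The build script then targets this NSF instead of a fixed filename,
# so DDE never sees a "stale" project/NSF pairing between builds.
```

Because every build writes to a fresh filename, no cached state from a previous build can collide with the current one.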
Bottom Line
The unique namespacing means no conflict from DDE's perspective.
I am creating a Windows 10 Universal App which uses a local SQLite database.
In order for the app to use the database file, it must be placed in:
C:\Users\<Username>\AppData\Local\Packages\<Name of Package>\Local State
Now, I understand this is the 'local' file structure for the application. However, I have a pre-made database that the app needs to interact with, and it should therefore be bundled as part of the app on install.
Is there a method of including my database in a usable fashion when distributing my application via a side-load install?
Furthermore, this problem is of paramount importance because this 'C:\' directory will not exist when pushing my application to a mobile phone or another non-desktop Windows 10 device.
You cannot package the database directly as read-write data (local state). If you only ever need to read from the database, you can just include it in your project and read it from Package.Current.InstalledLocation.
If you need to write to the database, but it contains some initial values you want to ship with your app, then you still need to include the database in your project, but then copy it from the InstalledLocation to ApplicationData.Current.LocalFolder if it doesn't exist when your app starts up.
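The copy-on-first-run pattern described above can be sketched in Python with plain paths (in a real UWP app you would use `Package.Current.InstalledLocation` and `ApplicationData.Current.LocalFolder` instead; the function name here is made up):

```python
import os
import shutil

def ensure_seed_database(install_dir: str, local_dir: str,
                         name: str = "app.db") -> str:
    """Copy the read-only seed database shipped with the app into the
    writable local-state folder, but only if it is not already there."""
    src = os.path.join(install_dir, name)
    dst = os.path.join(local_dir, name)
    if not os.path.exists(dst):
        # First run: seed the writable copy from the packaged file.
        shutil.copyfile(src, dst)
    return dst
```

Checking for the file's existence before copying is what preserves the user's changes across app restarts and updates.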
You can always export your existing database as a SQL script and save it in your project assets.
On the first run of your application, you can create the SQLite file in your LocalFolder and run the script with its CREATE and INSERT queries.
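As a sketch of that first-run approach, shown here with Python's sqlite3 module rather than the UWP C# API (the table and values are invented for illustration):

```python
import os
import sqlite3

# Hypothetical seed script, as exported from the existing database.
SEED_SQL = """
CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT);
INSERT INTO settings VALUES ('theme', 'dark');
"""

def open_database(path: str) -> sqlite3.Connection:
    """Create the database from the seed script on first run;
    on later runs, just open the existing file."""
    first_run = not os.path.exists(path)
    con = sqlite3.connect(path)
    if first_run:
        con.executescript(SEED_SQL)  # run the CREATE and INSERT queries
        con.commit()
    return con
```

Guarding on the file's existence ensures the seed script runs exactly once, so user edits are not overwritten on subsequent launches.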
I have created an Azure Cloud Service (call it A) in Visual Studio 2013, using Azure SDK v2.6. The cloud service has a single role from project B, project B references project C, and project C references project D. Project D includes a content file called D.dll.config. I have verified that when I build, the file D.dll.config exists in D\bin\Debug\, C\bin\Debug\, and B\bin\Debug\. However, when I run A in both the emulator and on Azure, my config file is absent.
On my local machine, this directory is A\csx\Debug\roles\B\approot\. Does anyone know how to get the configuration file included with my cloud service? A brief explanation of why it is not being included in the first place would also be appreciated.
My coworker stumbled across a solution while he was creating a new cloud service. If I add a reference from project B (the one that contains the cloud service role) to project D (the one that contains the config file as content), then D.dll.config is included in the approot directory in my emulator, as well as in Azure.
This solution is still not ideal, as I have to add an explicit reference to every DLL that has content. However, it is the best solution I know of thus far.
I'm trying to use TFS, but there is one problem:
I created a project called NewWeb and added it to source control. After that, I deleted the project and mapped Global to another project in the cloud, but one folder from it is still shown in the tree under the deleted project.
NewWeb is a separate Team Project, so it has nothing to do with MVC4. You cannot map two source control folders to the same folder on your hard drive, so you can't have one folder shared by two Team Projects.
Instead, you will need the global stuff to be held in one place in source control and then referenced from there by your code solutions. Either use a separate Team Project for globals and each application, or have one Team Project for all your code, and place each app in its own subfolder.
Finally, I'd advise you to minimise the mappings, i.e. map $/ to your hard drive so your workspace matches the source control layout. Using mappings to move things around introduces complexity that leads to problems.
Try deleting $tf (hidden) folder from your project folder.
To avoid conflicts, use the tf destroy command to destroy the folders.
Go to File -> Source Control -> Advanced -> Workspaces; in the popup, click Edit and remap the folders to where they should be.
Check in pending changes.
I have a larger solution that I desire to distribute via ClickOnce. It consists of one main shell executable that directly references only a small subsection of libraries and processes that constitute the solution.
The solution consists of a few other processes and several libraries (some C++). I need to be able to include all of these libraries and processes in one ClickOnce distribution for both local builds and TFS server builds.
I cannot reference every other library and process from the shell project. And I do not wish to push these files into an MSI to be treated as a prerequisite, as that would defeat the purpose of using ClickOnce to distribute/update the product.
What is the correct method to incorporate all of our necessary files/projects into a single ClickOnce distribution?
The IDE won't detect native DLLs as dependencies when publishing, but you can run the SDK tools directly to include them manually in your ClickOnce distribution. You can either use mage.exe in your post-build script or run MageUI.exe to have a wizard to guide you through the package generation.
Suggested reading:
Walkthrough: Manually Deploying a ClickOnce Application
Understanding Dependencies of a Visual C++ Application
There is an alternative to Visual Studio for this kind of situation. You could try using Mage, but it can be a little tricky to use. My company wrote an alternative called ClickOnceMore.
ClickOnceMore is a ClickOnce build tool for when you don't want or can't use Visual Studio to do ClickOnce builds.
There is a specific page in the UI for including files (using rules to include anything from a single file to entire directory trees), so you should be able to do exactly what you need with it.
This is what I have done in a similar situation. I use TFS at work, so convert the terms to whatever you may use (or not use) for source control.
I have a main workspace that I use for all development of my application, I keep this workspace pristine.
I then created another workspace with a proper name (ex: solution-deploy) and in this workspace I do the following:
Get latest and merge everything from source-control into the deployment workspace
I build a Release build of my application
I right-click on the root project folder for my deployment project (I put them in the root because I need to access them from there; put them in whatever folder you want) and select "Add -> Existing Item"
In the file selector, I browse to the Release directory of the assemblies I want to add to my deployment package and select them; then I use the arrow next to the Add button and choose "Add As Link". Do this for all of the assemblies you want to add, and place them wherever you want them organized in your deployment
In the Solution Explorer, select the added assemblies and, in the Properties window, set the Build Action to "Content". This should be all you have to do, but others have also had to set "Copy to Output Directory" to "Copy Always"; I don't do that
Run a Release Build
Go to the Properties view for your deployment Project
Go to the Publish Tab and Click on the Application Files button
Your files should all be available and added to the Deployment
Set up your ClickOnce settings however you need them to be
Publish your ClickOnce package
Your published package should contain all of the assemblies you need now.
Keep your separate Deployment workspace set up this way and never check it in. Do your work in your development workspace. Whenever a new deployment is needed, open your solution in your Deployment workspace and get the latest code, build, then publish.