Windows 10 universal app factory install

Other than delivering the Windows 10 app through the Windows Store, is there any other way to deliver it?
The objective is that when the user turns on the laptop/device (out-of-box experience), the Windows 10 app is already factory-installed, so the user does not have to download it from the Windows Store.
Is there some "backdoor" that lets me preload the Windows 10 app while burning the Windows 10 image to the laptop/device?
Thanks!

If you're creating the image that is applied to the machines at the factory, you can include the app in that image.
There are various types of apps and many kinds of licensing to consider, though.
Typically people ask for this kind of thing as part of an enterprise deployment scenario. If that's what you're after, then a provisioning solution applied once the device is out of the box is probably more appropriate. There are MDM and Active Directory-linked solutions for provisioning apps and device configuration. I suggest you start by looking into those.
Update
If this isn't an enterprise scenario, how would you get the apps onto the machine before it's taken out of the box? I think what you're really after is a way to set up the machine once you take it out of the box. For that I'd recommend Chocolatey as an easy way of getting a machine set up the way you want it.
In terms of the types of apps you are installing, there are different considerations for services, Win32 apps and Windows Store apps. What you are installing and how you create the image can have different consequences.
There are many scenarios regarding licensing, but consider an app that is licensed on a per-machine basis. If you create an image for machines as they come from the factory (enterprises typically do this when buying a lot of machines and wanting an easy way to have them preconfigured), then you can't put the same single-use license on every machine.

Related

How to offer C++ Windows software as a service

We (ISV) are currently planning to offer our software on a rental/subscription basis as a service.
It's a native Windows (C++ / .NET) B2B application.
Our software needs access to the file system (drives) on the customer's computer, and it also needs access to the network (e.g. to be able to find other computers on the network).
We want to offer our customers a service where they do not have to bother with setup/updates and always work with the newest version of our software. So we need a single point of maintenance.
In the first phase we do not expect many of our customers (let's say 20) to change to this model, so it would not be a problem to set them up and manage them manually, but in the long run a solution that allows an automated setup/sign-up process would be required.
What I found most promising was Citrix XenDesktop/XenApp with VM-hosted apps and personal vDisks, but it seems that the Citrix solution is not able to access the network on the client PC (I tried it with the trial in the Azure Marketplace). It also seems to be high-priced.
What would be other possible ways to meet these requirements?
Unless you can make significant architectural changes to eliminate the need to access the local filesystem and to do local network browsing, I would recommend focusing on optimizing your local installation and update process, and skipping the virtualization/service idea for now.
You can still move to a subscription model with a locally installed application. Just require your application to "phone home" and check its licensing/subscription status on startup.
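For illustration, here is a minimal sketch of such a startup check. The endpoint URL, the license-key parameter and the plain-text response format are all assumptions for the example, not part of any real licensing service:

    // Minimal "phone home" subscription check on application startup.
    // The endpoint and response format below are hypothetical.
    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class SubscriptionCheck
    {
        private static readonly HttpClient Client = new HttpClient();

        // Returns true if the subscription is active; false if it is expired or the check fails.
        public static async Task<bool> IsSubscriptionActiveAsync(string licenseKey)
        {
            try
            {
                // Hypothetical endpoint; a real service would also require authentication.
                var response = await Client.GetAsync(
                    "https://licensing.example.com/api/status?key=" + Uri.EscapeDataString(licenseKey));
                if (!response.IsSuccessStatusCode)
                    return false;

                var body = await response.Content.ReadAsStringAsync();
                // Assumed response body: the plain string "active" or "expired".
                return body.Trim().Equals("active", StringComparison.OrdinalIgnoreCase);
            }
            catch (HttpRequestException)
            {
                // Network failure: decide on a grace period or offline policy; this sketch fails closed.
                return false;
            }
        }
    }

On startup you would await this check and either unlock the application or show a renewal prompt; in practice you will also want a grace period so a temporary network outage does not lock paying customers out.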

Sharing Data between Windows Phone 8 and Windows 8 Store App

I am about to begin a fairly simple application, but I want to make sure I structure the backend of the application correctly because I plan to expand on it greatly in the future. Here's my question:
I am creating both a Windows Phone 8 and Windows 8 Store application. In this case, it is a unit conversion application where the user is given the ability to define custom unit conversion units. I would like to allow the user to essentially sync those custom units between the two platforms so that they don't need to define them multiple times.
What backend approach should I take?
XML storage coupled with SkyDrive, Azure, a local database that syncs over USB... There are a lot of options, and I'm not sure which way is preferred in the scenario I described above. Any help or suggestions would be greatly appreciated.
As for the actual data sharing I would suggest using Azure, which is a bit more reliable and also transparent to the user (as opposed to a local database syncing over USB) and cleaner than XML files on SkyDrive (the user doesn't need to see those files anyway).
As for code sharing you could use two techniques (a small sketch follows after the links below):
Portable Class Libraries
Linked Files
I have recently written two articles on this:
http://www.kenneth-truyers.net/2013/03/27/portable-class-libraries-or-source-code-sharing/
http://www.kenneth-truyers.net/2013/02/24/patterns-for-sharing-code-in-windows-phone-and-windows-8-applications/
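To make the Portable Class Library option concrete, here is a tiny hypothetical example: conversion logic like the class below could live in a PCL that both the Windows Phone 8 project and the Windows 8 Store project reference. The namespace, class and member names are made up for illustration.

    // Hypothetical shared conversion model for a Portable Class Library
    // referenced by both the Windows Phone 8 and the Windows 8 Store project.
    namespace UnitConversion.Core
    {
        public class CustomUnit
        {
            public string Name { get; set; }
            public double FactorToBaseUnit { get; set; }   // e.g. 0.3048 for feet -> metres
        }

        public static class Converter
        {
            // Converts a value from one custom unit to another via the shared base unit.
            public static double Convert(double value, CustomUnit from, CustomUnit to)
            {
                return value * from.FactorToBaseUnit / to.FactorToBaseUnit;
            }
        }
    }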
It doesn't actually have to be Azure if you are shooting for a lower price range. You can also choose a web host and build a Web API service, which will let you sync your data and get it onto all devices. Of course, Azure is preferred as the ultimate solution because it offers many more features.
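As a rough sketch of what such a sync endpoint could look like in ASP.NET Web API (reusing the hypothetical CustomUnit model from the sketch above; the controller name, route and in-memory store are likewise just illustrative assumptions):

    // Minimal sketch of a sync endpoint for custom units, keyed by a user id.
    // In-memory storage only; a real service would persist per-user data in a database
    // and authenticate the caller.
    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Web.Http;
    using UnitConversion.Core;   // hypothetical PCL from the previous sketch

    public class UnitsController : ApiController
    {
        private static readonly ConcurrentDictionary<string, List<CustomUnit>> Store =
            new ConcurrentDictionary<string, List<CustomUnit>>();

        // GET api/units?userId=...  -> the user's custom units
        public IEnumerable<CustomUnit> Get(string userId)
        {
            List<CustomUnit> units;
            return Store.TryGetValue(userId, out units) ? units : new List<CustomUnit>();
        }

        // POST api/units?userId=...  (body: the full list of units from either device)
        public void Post(string userId, [FromBody] List<CustomUnit> units)
        {
            Store[userId] = units;   // last write wins; a real sync would merge changes
        }
    }

Both the phone app and the Store app would call the same endpoint, so custom units defined on one device show up on the other after the next sync.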
I have used Windows 8 roaming data support for one app. In my case the data is simply the history of the user's operations in the app, and its size is under 1 KB. Per the documentation, Windows 8 roaming data supports up to 100 KB of data, so it is a good start for Windows 8 apps with very low investment. It covers all Windows 8 devices and is certainly good for simple key/value data per user.
Now the caveats: it does not currently support roaming to Windows Phone. That is a feature request for Windows Phone 8 which can be voted up. Finally, this data will not roam to Android and other mobile devices.
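For reference, here is a minimal sketch of storing a small per-user value with the Windows 8 roaming settings container; the key name and helper class are made up for the example.

    // Minimal sketch: persist a small value in the Windows 8 roaming settings container.
    // The "LastConversion" key and the helper class are purely illustrative.
    using Windows.Storage;

    public static class RoamingHistory
    {
        public static void Save(string lastConversion)
        {
            ApplicationData.Current.RoamingSettings.Values["LastConversion"] = lastConversion;
        }

        public static string Load()
        {
            object value;
            return ApplicationData.Current.RoamingSettings.Values.TryGetValue("LastConversion", out value)
                ? (string)value
                : null;
        }
    }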
Another way to think about it - when do you need to build it?
If it is simply per-user data storage, the backend need not be in place in the first release. You can start with Windows 8 roaming data support, and in some future release it can be moved from roaming data to SkyDrive, your own Web API or Azure. What I mean to say is that it does not need to be built up front.
If the backend is going to allow sharing of data between users, or an experience built over aggregated data from multiple users, then it is an altogether different problem. In that case roaming data is not a solution; a backend Web API or service is a must.
HTH.
Other references: guidelines for Windows 8 roaming data.

Azure configuration for a university student

Hopefully my question is in the right forum here. I've just checked out the pricing model of Windows Azure and looked at the different configuration options:
http://www.windowsazure.com/de-de/pricing/calculator/
I have been working as a developer for almost two years now and have worked a lot with IIS and WPF. As a little private project I checked out HTML5 and JS with MVC4 Web API, and I wondered what Azure configuration I'd need to host an MVC 4 Web API project. Would it rather be a virtual machine or a "full calculator" configuration? What benefits does one grant over the other?
I am going to start my studies soon, so I'd like the cheapest option I can possibly get. I won't use it a lot (mainly for testing), and I don't think there will be too much traffic either. Would a virtual machine also include the possibility of using IIS?
Could I also run an MVC project on something other than a VM / "full calculator"?
And what would happen if for some reason my traffic just explodes? Would my services just be shut down until I increase the power of my machine, or would I just get a huge bill and be quite surprised?
Use Web Sites.
You can start with 10 Web Sites absolutely free! So this is the cheapest option, and it certainly supports MVC4 Web API.
For starters you can get a 3-month trial with enough credits to begin. By default you'll have a spending limit on your account. This means that if you start to get too much traffic, your services will shut down and you won't have to pay anything extra. I think you can configure how much you are willing to pay, but I have never tried; mine is still at the default, which is $0.
You should start with shared Web Sites and move to a reserved instance, a VM or a web role later if you ever need to scale up or out.

Azure releasing complications

We are considering building a web application and relying on Azure. The main idea behind this application is that users are able to work together on specific tasks in the cloud. I'd love to go for the concept of instant releasing, where users are not bothered by downtime, but I have no idea how I can achieve this (if it is possible at all). Let's say 10,000 users are currently working in this web application, and I release software with database updates.
What happens when I publish a new release of my software into Azure?
What will happen to the brilliant work in progress of my poor users?
Should I bring the site down first before I publish a new release?
Can I "just release" and let users enjoy the "new" world as soon as they request a new page?
I am surprised that I can't find any information about releasing strategies in Azure, am I looking in the wrong places?
Windows Azure is a great platform with many different features which can simplify lots of software management tasks. However, bear in mind that no matter how great a platform you use, your application depends on proper system architecture and code quality: a well-written application will work perfectly fine; a poorly written application will fail. So do not expect Azure to solve all your issues (though it may help with many).
What happens when I publish a new release of my software into Azure?
Windows Azure Cloud Services has the concept of Production and Staging deployments. New code is deployed to staging first. There you can do a quick QA pass (and sometimes "warm up" the application to make sure all its caches are populated, but that depends on the application design) and then perform a "Swap": your staging deployment becomes production and the production deployment becomes staging. That gives you the ability to roll back in case of any issues with the new code. The swap operation is relatively fast, as it is essentially an internal virtual IP (VIP) swap rather than a redeployment.
What will happen to the brilliant work in progress of my poor users?
It is always a good idea to perform code deployments during the lowest site load (night time). Sometimes that is not possible, e.g. if your application is used by a global organization; then you should pick the period of lowest activity.
To protect users you could implement solutions such as an "automatic draft save" that happens every X minutes. But if your application is designed to work with cloud systems, users should not see any functionality failure during a new code release.
Should I bring the site down first before I publish a new release?
That depends on the architecture of your application. If the application is very well designed then you should not need to do that. The Windows Azure application I work with gets a new code release once a month, and we have never had to bring the site down since the beginning (for the last two years).
I hope this gives you a better understanding of Azure Cloud Services.
Yes you can.
I suggest you create one of the Visual Studio template applications and take a look at the "staging" and "production" environments shown directly when you click your Azure site in the portal manager.
Say, for example, the users work in the "production" environment, which is connected to SqlServer1. You publish your new release to "staging", which is also connected to SqlServer1. Then you just switch the two using the swap, and staging becomes the "production" environment.
I'm not sure what happens to their work if they have something stored in sessions or server caches; I guess it will be lost. But client-side stuff will work seamlessly.
"Should I bring the site down first before I publish a new release?"
I would bring up a warning (if the users' work consists of session state and so forth) saying there will be a brief downtime in 5 minutes, and then after the switch tell everyone it is over.

How do I choose between using a web installer and a standalone installer

For example, the .NET Framework 4.0 is available in either format. Is there any scenario where internet access is not a concern (always on, high bandwidth) but the standalone installer option is still the better choice?
Also, when utilizing a web installer, are there any specific advantages with respect to:
1) long-term disk space usage?
2) the ability to cleanly uninstall/repair the software?
There are many ways to package and distribute software, each optimized for different scenarios. What works well for an online distribution would not work well for an offline distribution, or for the scenarios in between.
For example, consider .NET. It has a Web and a Full installer.
The Full installer is pretty much best for media-based distributions and for enterprise customers who want to put the framework on a network share. The Web installer works best for a single (home) user who wants the shortest possible download.
To understand "shortest", note that the .NET 3.5 SP1 installer is actually a bootstrapper with many packages to account for 2.0, 3.0, 3.5, service packs, hotfixes, 32-bit components and 64-bit components (x64 and Itanium).
For a home user on a .NET 3.0 x86 OS there might be very little to download. An enterprise user can get it all from the network share without repeatedly downloading bits from Microsoft. A media customer might not have any internet connection at all.
This is all separate from caching concerns. An installer can choose to cache or not cache installation files regardless of whether they came from media, a network share or an on-demand internet download.
Other installers might not be as layered as .NET and may have very little distinction between Web and Full (i.e. it is always everything).
If you want a quick download, choose the web installer, but be aware that some web installers may be flagged by antivirus software. For extra safety, choose the standalone installer.
