I've been following VS 2013 as it's progressed through its releases (Beta, RC, and now GA) and have had an overall positive feeling about the release. Today, I spent the day working entirely in the new version (GA Premium) to verify that it provides all of the basic functionality my team needs prior to adoption.
It works well for general dev tasks - though several of my favorite extensions haven't been ported yet - but when I attempted to administer our Azure storage from within the IDE, functionality that was present in the previous release appears to have been removed.
In VS 2012's Server Explorer, I'm able to link my Azure account and discover all of my top-level objects - Cloud Services, Service Bus, Storage, and VMs. In VS 2013 - following the same setup procedure as in the previous version - I only see Mobile Services, SQL Databases, and Web Sites under the Windows Azure node of Server Explorer.
After a little googling I found little insight, and few others asking the same question (since GA), so I'm starting to wonder whether it's just me or whether this is the way it will be from now on. Can anyone confirm, and/or provide evidence from Microsoft indicating that this is intentional?
Installing the Windows Azure SDK should give you all of the features you're looking for. You can find directions for using the current version of the SDK with the VS 2013 GA release here: http://www.windowsazure.com/en-us/develop/visual-studio-2013/
I need to create a virtual machine for Dynamics CRM development.
I need the following software installed on the virtual machine:
Visual Studio 2017 Community Edition
Microsoft Dynamics CRM Report Authoring Extension
Google Chrome Web Browser
Microsoft Office
Adobe Reader
Can someone help me confirm this? Can I install the above software on an Azure virtual machine in Azure DevTest Labs?
If yes, then how much would it cost to use a virtual machine configured as above?
The virtual machine should have 4 cores and 8 GB of RAM.
Any help would be much appreciated.
What you're asking is perfectly possible, and it can be easily achieved using the Visual Studio preconfigured VMs.
The VM size shouldn't be lower than a D2 v3 (in a recent project we used a D4 and I was quite happy with its performance), which costs around $154/month. Of course, this is the price for the VM running 24x7, but the price can go down substantially if you turn off the VM outside working hours (to around $45/month). You can estimate how many hours the VM will be needed and then calculate the price from that.
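The estimate above can be sketched as a quick calculation. The $154/month 24x7 figure is the one quoted in this answer; the derived hourly rate and the working-hours schedule below are illustrative assumptions (actual Azure rates vary by region and over time):

```python
# Rough Azure VM cost estimate: full-time vs. working-hours-only usage.
# The $154/month 24x7 figure comes from the answer above; the hourly rate
# derived from it and the example schedule are illustrative assumptions.

HOURS_PER_MONTH = 730          # Azure's conventional average month length
monthly_24x7 = 154.0           # quoted price for a VM running 24x7
hourly_rate = monthly_24x7 / HOURS_PER_MONTH

def monthly_cost(hours_per_day: float, days_per_month: float) -> float:
    """Cost if the VM is deallocated outside the given schedule."""
    return hourly_rate * hours_per_day * days_per_month

if __name__ == "__main__":
    print(f"hourly rate: ${hourly_rate:.3f}")
    # e.g. 10 hours/day on ~22 working days
    print(f"working hours only: ${monthly_cost(10, 22):.0f}/month")
```

With a 10-hour day over ~22 working days this lands in the mid-$40s per month, which is roughly the figure mentioned above.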
If you're looking to use the Dynamics 365 Report Authoring Extension, please be aware that the latest Visual Studio version it supports is 2015. If you still want to use 2017 for your development, you can keep a separate VM specifically for reporting, with VS 2015 and BIDS. The preconfigured VMs support both versions, so that's not a problem - and again, since you only pay for a VM while it is on, this approach won't double your costs.
I have developed an extension for the Microsoft Edge browser. Now I want to package the extension so that I can publish it, but I have not found any information on how to do so. Can anyone tell me how to package it?
Currently you can't.
For the Windows 10 Anniversary Update, we are intentionally starting with a small set of extensions. The list of extensions is locked - you can see the list at our extensions page here. We want to be mindful about what extensions are available on the platform and watch for telemetry and feedback and make sure the reliability, performance and functionality of the browser isn’t impacted by these new features. Extension developers can submit a request to https://aka.ms/extension-request to be considered for a future update.
https://developer.microsoft.com/en-us/microsoft-edge/platform/faq/
Newly released steps for packaging an Edge extension are available here: https://developer.microsoft.com/en-us/microsoft-edge/platform/documentation/extensions/guides/packaging/
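For orientation, Edge extensions use a Chrome-style manifest.json at the root of the package. A minimal sketch might look like the following - the names and values here are illustrative only, and the exact field set should be checked against the packaging guide linked above:

```json
{
  "name": "My Sample Extension",
  "version": "1.0.0",
  "author": "Example Author",
  "browser_action": {
    "default_title": "My Sample Extension",
    "default_popup": "popup.html"
  },
  "content_scripts": [
    {
      "matches": ["*://*/*"],
      "js": ["content.js"]
    }
  ]
}
```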
As Elad mentions, however, submitting to the Windows Store is still a process managed by Microsoft. Submitting a request to https://aka.ms/extension-request will get you added to the list for future consideration.
I believe you can use Visual Studio 2015 Community to do it (a free download from Microsoft). I've been trying the same thing, but I'm on Windows 7 and it keeps crashing, so I don't know whether it's possible from my OS (or even whether publishing an "app" is the same as publishing an "extension").
Packaging Apps for Windows 10
I am using this video as a reference. It explains how to build Azure API Apps using VS 2013. I want to do the same in VS 2015 and have installed the latest Azure SDK, but I'm finding the following things missing and cannot locate their equivalents in VS 2015, nor any documentation describing how it's done differently in VS 2015.
Here is a list of the things I was not able to find:
At 4:58 in the video, the option to right-click and convert a regular Web API app into an Azure API App.
At 5:23, the apiapp.json file, which is no longer present in a VS 2015 API App project. What if I want to change the Swagger URL? Where do I configure that?
At 12:13 in the video, there is an option to generate an Azure API App client. I could not find this in VS 2015 either. All I have is this below
Am I bound to providing URLs only for a published API? What if I move the API to a different URL later? I have not been through this process yet, so my next question is where I configure the URL, which may well change as I move from DEV all the way through to PROD.
Is there something missing in my tooling? Do I need to install something other than the latest Azure SDK? How do I do those three things if my Azure tools are up to date? I am using Microsoft Azure Tools for Microsoft Visual Studio 2015 - v2.9.40518.2.
Back in December 2015, Microsoft made several changes/improvements to API Apps (they are listed here).
1. This feature no longer exists; you need to add the Swashbuckle NuGet package manually.
2. The apiapp.json file is no longer used. Look for the SwaggerConfig.cs class in the App_Start folder.
3. The option has been renamed. Right-click the project in Solution Explorer and select Add > REST API Client...
After finding out that the caching API diverges severely depending on whether you're targeting Windows Azure or Windows Server, I'm concerned that Microsoft isn't going to continue to develop AppFabric for Windows Server. Does anyone know if AppFabric for Windows Server is still being supported/developed?
It is currently supported on Windows Server 2012:
http://blogs.msdn.com/b/workflowteam/archive/2012/10/25/appfabric-now-supported-on-windows-server-2012.aspx
Microsoft has released 4 cumulative updates for AppFabric (the latest being April 2013):
http://support.microsoft.com/kb/2800726
I think your answer is that it is being supported. However, I have not seen any publications or blog posts about the technology, unfortunately. All the development on distributed caching has been done on the Azure side, where there are 3 different caching API offerings (albeit one deprecated and another in preview).
I personally have stopped using it, since the API causes problems with the Azure SDK/API and there are better options out there if you have a hybrid environment with Linux (e.g. Redis).
Edit (10/06/2014): Note Microsoft's guidance on using the AppFabric Cache for Azure..."We just announced support for the Azure Redis Cache, and we recommend new development use this cache." AND "If you recommend Redis, why do you have Managed Cache, an option that you do not recommend? To support customers who made investments into Velocity Cache, who have dependency on it in their apps, to give them as much time as they need to move to the Redis cache."
Link: https://azure.microsoft.com/en-us/pricing/details/cache/
It looks like Microsoft (at least for Azure) is recommending that developers start using the Redis Cache. That may not apply to Windows Server caching, but with the slower release cadence of AppFabric, I would strongly consider other options.
I asked the team for you. Here's the answer today:
AppFabric will continue to be supported under Microsoft Support Lifecycle.
In March, Microsoft shipped CU5 for AppFabric 1.1 and is actively working on CU6.
The basic information is this:
If AppFabric is currently working for you, stick with AppFabric
For new development, evaluate Redis. This is Open Source and not MS-Supported.
We are working on a supported solution for Redis in the future.
// end
According to an MS blog post, Microsoft is now ending support for AppFabric on the 2nd of April 2016 - see here: http://blogs.msdn.com/b/appfabric/archive/2015/04/02/windows-server-appfabric-1-1-ends-support-4-2-2016.aspx
I am in a situation where the corporation has just recently upgraded to TFS 2008, and they have no intention of upgrading to TFS 2010 at this time. As a development group, we moved to Visual Studio 2010 this week. As with any large corporation, we cannot get our own environment created to install TFS 2010: it steps on too many toes, isn't the corporate standard, etc.
I want to take full advantage of the new testing features in relation to the new UI Testing and other features. This appears to require TFS 2010. So my "dream" is to do my daily work at the office and write tests, but at night, have my code synchronized with my TFS 2010 server at home and run automated builds with the full testing capabilities enabled.
So is there a best practice for this? I've read up on the workspace theory and the binding issues involved, and that sounds like the biggest hurdle to overcome.
Possible solution - Create two workspaces, $/WorkProject and $/WorkProject-Mirror, and use a custom application built on FileSystemWatcher to kick off a job that synchronizes code changes and does a custom rewrite of the bindings. Run the job on both the work laptop and the home machine to allow bi-directional synchronization.
Research to see if TFS Integration Platform will help with this
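The watcher-plus-sync job sketched in the possible solution above could be prototyped as follows. The question mentions .NET's FileSystemWatcher; this stdlib-only Python sketch polls modification times instead, and the workspace paths in the demo are hypothetical stand-ins:

```python
# Minimal one-way mirror sync: copy files from src to dst whenever the
# source copy is newer. A real job would run this on a schedule on both
# machines and also rewrite source-control bindings (not attempted here).
import os
import shutil
import tempfile

def sync_tree(src: str, dst: str) -> list[str]:
    """Copy files newer in src into dst; return relative paths copied."""
    copied = []
    for root, _dirs, files in os.walk(src):
        for name in files:
            src_path = os.path.join(root, name)
            rel = os.path.relpath(src_path, src)
            dst_path = os.path.join(dst, rel)
            os.makedirs(os.path.dirname(dst_path), exist_ok=True)
            if (not os.path.exists(dst_path)
                    or os.path.getmtime(src_path) > os.path.getmtime(dst_path)):
                shutil.copy2(src_path, dst_path)  # copy2 preserves mtime
                copied.append(rel)
    return copied

if __name__ == "__main__":
    # One-shot demo with temp dirs standing in for the two hypothetical
    # workspace mappings (e.g. C:\work\WorkProject and its mirror).
    src, dst = tempfile.mkdtemp(), tempfile.mkdtemp()
    with open(os.path.join(src, "Program.cs"), "w") as f:
        f.write("// demo file")
    print(sync_tree(src, dst))
```

Because copy2 preserves timestamps, a second pass over an unchanged tree copies nothing, so the job is cheap to run frequently.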
You are correct: the new testing UI (Test Manager 2010) requires TFS 2010. You are also correct that you can use the TFS Integration Platform between a TFS 2008 and a TFS 2010 server, and then use Test Manager on the 2010 server.
All of the above should be easy; the tough part will be the bindings in the solution file. I would suggest having a second solution file created that points to your TFS 2010 server, so that you can open the correct solution file for the correct environment without stepping on your co-workers' toes.
I think the two-workspace route is overkill; it's just a solution file you need.
I wonder if you could use a read-only account to perform a get from TFS 2008 and then check in to your TFS 2010 server with a more privileged account. I'm sure those two things and a little clever PowerShell scripting could get you what you're looking for.
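That get-then-check-in pass could be scripted roughly as follows (in Python rather than PowerShell, purely as a sketch). The tf.exe install path, workspace directory, and account names are all hypothetical, and the exact tf.exe switches should be verified against your TFS client version - this sketch only builds the command lines:

```python
# Sketch of a nightly TFS 2008 -> TFS 2010 sync pass: recursive get with a
# read-only account, then check-in with a privileged account.
# All paths and credentials below are hypothetical placeholders.
import subprocess

TF = r"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\tf.exe"

def tf_get(workspace_dir: str, login: str) -> list[str]:
    """Command line for a recursive get with the read-only account."""
    return [TF, "get", "/recursive", "/noprompt", f"/login:{login}",
            workspace_dir]

def tf_checkin(workspace_dir: str, login: str, comment: str) -> list[str]:
    """Command line to check pending changes in to the mirror server."""
    return [TF, "checkin", "/recursive", "/noprompt", f"/login:{login}",
            f"/comment:{comment}", workspace_dir]

def run(cmd: list[str]) -> None:
    subprocess.run(cmd, check=True)  # raises CalledProcessError on failure

if __name__ == "__main__":
    # Print rather than execute, since the paths here are placeholders.
    print(" ".join(tf_get(r"C:\work\WorkProject", "readonly_user,secret")))
    print(" ".join(tf_checkin(r"C:\work\WorkProject-Mirror",
                              "priv_user,secret", "nightly sync")))
```

As suggested above, wrapping something like this in a scheduled task, plus a second watchdog script that alerts you when a run fails, would cover the monitoring concern.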
I would encourage you to write a second utility to monitor that this script continues to work and to notify you if it detects a failure or something.