We are about to split our testing and production instances in Windows Azure into two separate subscriptions. Currently we have 3 Windows Azure SQL Database instances that reside within the same subscription:
Production
Reporting
Testing
In order to completely isolate production we are splitting these into:
Production Subscription
    Production
    Reporting
Testing Subscription
    Testing
At the moment we use the CREATE DATABASE X AS COPY OF [ServerName].Y command to copy databases from production to testing before we obfuscate the live data. This operation can be performed as long as the databases are geo-located in the same data centre and we have a shared login across the instances that created the databases in the first place (as indicated by this article).
However, the article does not indicate whether the source and destination instances need to belong to the same subscription. Are we able to copy the database between the production subscription and the testing subscription (and vice versa), assuming we use a consistent login?
You can just do a backup (Export) to blob storage and then import it into the new subscription.
http://msdn.microsoft.com/en-us/library/f6899710-634e-425a-969d-8db1267e9471
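For anyone who prefers to script this, here is a rough sketch of that Export/Import flow using the current Az PowerShell module (all subscription, resource group, server, storage, and credential names below are placeholders, not taken from the linked article):
# Export the source database to a .bacpac in blob storage (source subscription)
Set-AzContext -Subscription "Production Subscription"
$password = ConvertTo-SecureString "<sql-admin-password>" -AsPlainText -Force
$export = New-AzSqlDatabaseExport -ResourceGroupName "prod-rg" -ServerName "prodserver" -DatabaseName "Production" `
    -StorageKeyType "StorageAccessKey" -StorageKey "<storage-key>" `
    -StorageUri "https://mystorage.blob.core.windows.net/backups/Production.bacpac" `
    -AdministratorLogin "sqladmin" -AdministratorLoginPassword $password
Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink   # poll until the export succeeds
# Import the .bacpac into the target server (testing subscription)
Set-AzContext -Subscription "Testing Subscription"
New-AzSqlDatabaseImport -ResourceGroupName "test-rg" -ServerName "testserver" -DatabaseName "ProductionCopy" `
    -StorageKeyType "StorageAccessKey" -StorageKey "<storage-key>" `
    -StorageUri "https://mystorage.blob.core.windows.net/backups/Production.bacpac" `
    -AdministratorLogin "sqladmin" -AdministratorLoginPassword $password `
    -Edition "Standard" -ServiceObjectiveName "S2" -DatabaseMaxSizeBytes 268435456000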
Update:
If you can use SSMS, this answer is right. I only want to add some details.
You can export the source database to Azure Storage from the Azure Portal.
After the export completes, you will find the .bacpac file in the storage container.
Open SSMS and connect to the destination server.
Right-click the Databases node and select Import Data-tier Application.
Then you can choose to import the database from local disk or from Azure Storage.
After that, you have copied the database from source to destination.
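If you prefer the command line to the SSMS wizard, the same import can be done with SqlPackage (a sketch; the file path, server, database, and credentials are placeholders):
SqlPackage.exe /Action:Import /SourceFile:"C:\temp\source.bacpac" /TargetServerName:"destserver.database.windows.net" /TargetDatabaseName:"CopiedDb" /TargetUser:"sqladmin" /TargetPassword:"<password>"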
For anyone landing here, it does appear to be possible to use CREATE DATABASE newDB AS COPY OF [server].[olddb] ( OPTION [, OPTION ...] ) even when the servers are in different subscriptions.
See more at Create Database (Azure SQL Database) - MSDN
Example from MS Docs:
CREATE DATABASE db_copy AS COPY OF ozabzw7545.db_original ( SERVICE_OBJECTIVE = 'P2' ) ;
In my setup I have the admin account and password (login) the same on both servers - that probably helps.
The operation will fail if you don't have admin permissions on the original server.
I have found through testing that I am not able to change the Edition from Standard to Premium despite including the 'Edition' option - I'm not sure why that is.
I have created copies of databases across Azure subscriptions successfully.
Here are the steps:
1. On the target Azure subscription, create a database server (if you haven't already), and on it create a new DB (any name, it doesn't matter) using the same admin login and password as the source server on your source Azure subscription. For me it didn't work with different credentials, so I just went ahead and used the same ones, but I am sure there is a way to make it work with different credentials as well.
2. Run this against the newly created server in your target Azure subscription:
CREATE DATABASE NEWDBNAME
AS COPY OF [Source Azure Server Name here].[source DB]
Let Azure handle the new DB pricing tier (Basic, Standard, etc.), because you can change it from the portal immediately after the DB is created. In my case the target DB was created with the same pricing tier as the source DB.
Also, server names in Azure are usually NAME.database.windows.net, so for the source name above just put NAME.
Now on your target Azure subscription you will have two databases on the new DB server: the one created in step 1 and the actual copy created in step 2. You can go ahead and safely delete the one you don't need.
If you want to copy other source DBs to the same target server created in step 1, just run the same command again.
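If you want to keep an eye on the copy while it runs, a small sketch (run in the master database of the target server; NEWDBNAME is the name used in step 2):
-- The database shows up immediately but stays in the COPYING state until the copy finishes
SELECT name, state_desc FROM sys.databases WHERE name = 'NEWDBNAME';
-- percent_complete shows how far along the copy is
SELECT database_id, start_date, modify_date, percent_complete FROM sys.dm_database_copies;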
I guess you already have a solution; however, for anyone landing here, you can use the Azure PowerShell APIs to create a new server in the source subscription, create a copy, and switch the new server over to the destination subscription.
Sample code is available on TechNet.
The code is self-explanatory; however, in the interest of SO best practices, the key portions of the code are:
Create a new server:
$newserver = New-AzureSqlDatabaseServer -Location $targetServerLocation -AdministratorLogin $targetServerLoginID -AdministratorLoginPassword $targetServerLoginPassword
Create a database copy:
Start-AzureSqlDatabaseCopy -ServerName $sourceServerName -DatabaseName $sourceDatabaseName -PartnerServer $newserver.ServerName -PartnerDatabase $targetdatabaseName
Transfer the server:
$uri = "https://management.core.windows.net:8443/" + $sourceSubscriptionID + "/services" + "/sqlservers/servers/" + $newserver.ServerName + "?op=ChangeSubscription"
Invoke-RestMethod -Uri $uri -CertificateThumbPrint $certThumbprint -ContentType $contenttype -Method $method -Headers $headers -Body $body
You can do this in SSMS on the target server using
CREATE DATABASE db1 AS COPY OF sourcesrv.db1
to copy from sourcesrv.database.windows.net which is in a different subscription.
However, you must first check you can connect in SSMS to the SOURCE server too, or you will get a totally confusing error message which hides the actual problem.
The source server may be one you regularly connect to, but not from the IP address you're currently on. In that case you must add the IP to the server's firewall rules. This is easily done using the dialog that appears when you try to connect from SSMS:
Leave the default radio button selected ("Add my client IP") and press OK.
If you omit this check and the source server's firewall blocks you, then instead of telling you the correct reason as above, it tells you that you can't make a copy on the SOURCE server!
--In SSMS connected to targetsrv:
CREATE DATABASE db1 AS COPY OF sourcesrv.db1
--Here it should say, "Your client IP address does not have access" to sourcesrv, like when
--you try to connect in SSMS. Instead, it says you can't copy to the SOURCE, even though you
--told it to copy FROM the source to the TARGET:
--Msg 45137, Level 16, State 1, Line 7
--Insufficient permission to create a database copy on server 'sourcesrv'.
Note that at the time of writing, the two servers need to be configured with the same admin credentials, otherwise the CREATE DATABASE command will fail with this same confusing error message.
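If you would rather script the firewall rule than use the SSMS dialog, a minimal sketch (run in the master database of the source server; the rule name and IP address are placeholders):
-- Run in master on sourcesrv.database.windows.net, replacing the IP with your client IP
EXECUTE sp_set_firewall_rule @name = N'copy-client', @start_ip_address = '203.0.113.5', @end_ip_address = '203.0.113.5';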
I understand that this is quite an old question, but I still wanted to add yet another option.
If you want this task automated, do not want to do it manually (Export/Import), want to copy the database to an existing server (rather than create a new temporary one that is moved across subscriptions), and do not want the same credentials on the source and target servers for security reasons, you can use ARM.
There is an option to create a database as a copy ("createMode": "Copy") and it works across subscriptions! Simple example:
{
  "$schema": "http://schema.management.azure.com/schemas/2014-04-01-preview/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
  },
  "resources": [
    {
      "apiVersion": "2014-04-01-preview",
      "location": "australiaeast",
      "name": "{DESTINATION-SERVER-NAME}/{DESTINATION-DATABASE-NAME}",
      "properties": {
        "createMode": "Copy",
        "sourceDatabaseId": "/subscriptions/{SOURCE-SUBSCRIPTION-ID}/resourceGroups/{SOURCE-RESOURCE-GROUP-NAME}/providers/Microsoft.Sql/servers/{SOURCE-SERVER-NAME}/databases/{SOURCE-DATABASE-NAME}",
        "requestedServiceObjectiveName": "S2"
      },
      "type": "Microsoft.Sql/servers/databases"
    }
  ]
}
Two things to note: the service principal executing this deployment needs Contributor access on the source, and sourceDatabaseId is a full resource ID.
If you do it from Azure DevOps using the "Azure resource group deployment" task, it will create a service principal for the subscription. You will need to give it Contributor access. The service principal can be found in Project Settings -> Service Connections.
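For completeness, a sketch of deploying the template above with PowerShell (the resource group and file name are placeholders; whoever runs it needs Contributor access on both the source and the destination):
# Deploy against the destination subscription; the template references the source DB by its full resource id
Set-AzContext -Subscription "{DESTINATION-SUBSCRIPTION-ID}"
New-AzResourceGroupDeployment -ResourceGroupName "{DESTINATION-RESOURCE-GROUP-NAME}" -TemplateFile ".\copy-database.json"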
There is a simpler solution, which maybe wasn't available when this question was first answered. No SSMS or PowerShell needed. It can all be done in the portal. Go to the source SQL Database and click Export. This will create a .bacpac file in Azure Storage. Go to the target SQL Server and click Import. Done.
Note 1: if the target SQL Server is in a different account/subscription that cannot access the source account's Azure Storage, just manually download the file from the source Azure Storage and upload it to an Azure Storage instance that the target can access.
Note 2: the imported database will have a name that includes the export date. You can change the name by running ALTER DATABASE [dbname] MODIFY NAME = [newdbname] on the target database. You can even do this in the portal using the new Query Editor.
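For Note 1, one way to move the .bacpac between the two storage accounts without downloading it by hand is AzCopy (a sketch; the URLs and SAS tokens are placeholders):
azcopy copy "https://sourcestorage.blob.core.windows.net/backups/db.bacpac?<source-sas>" "https://targetstorage.blob.core.windows.net/backups/db.bacpac?<target-sas>"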
I have a C# web application running on IIS in a Windows Server Core container.
In the Dockerfile I create a new user 'myUser' without a password.
I add the credentials to my Azure File store in the Dockerfile as well:
USER myUser
RUN powershell "cmdkey /add:mystore.file.core.windows.net /user:AZURE\mystore /pass:XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=="
I add a new application pool Identity using 'myUser', and use that application pool for my application.
When I start the container and connect using 'docker exec', I am logged on as the new user.
I can access the path with 'ls \\mystore.file.core.windows.net\dockerstore\'
The credentials are listed okay with 'cmdkey /list'.
However, my application, which runs under the same user, complains that it cannot reach the store. A System.IO.IOException is reported on Directory.Exists().
I have done this exercise on my local box as well, and there the application runs without issues.
I have tried using a user with a password as well, to no avail.
The application uses the full UNC path to the store.
I tried the same thing with a Windows service application. Same result: I can list the files in a PowerShell session, but my application cannot access them.
Am I missing something?
Edit: Here's what I did:
NET USER myAzureFilesUser myAzureFilesPasswordXXXXXXXXXXX== /add /y
NET LOCALGROUP Administrators /add myAzureFilesUser
Import-Module WebAdministration
$processModelProperties = @{userName='myAzureFilesUser';password='myAzureFilesPasswordXXXXXXXXXXX==';identitytype=3}
Set-ItemProperty (Join-Path 'IIS:\AppPools\' 'My AppPool Name') -name processModel -value $processModelProperties
You need to create a local user account with the same username and password as the Azure File storage account, and perform some additional tasks as described here: https://blogs.iis.net/davidso/azurefile
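To illustrate what that post describes, a minimal sketch (here "mystore" stands for the storage account name, the password is the storage account key from the portal, and the app pool name is just an example): the local user name has to match the storage account name, the password has to be the account key, and the application pool then runs as that user.
NET USER mystore <storage-account-key> /add /y
Import-Module WebAdministration
Set-ItemProperty IIS:\AppPools\MyAppPool -Name processModel -Value @{userName='mystore'; password='<storage-account-key>'; identitytype=3}  # identitytype 3 = SpecificUser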
A question has come to my mind: how can I find the database server machine info where my SharePoint site exists? I'm using SharePoint 2013 Server and am looking for any way to get my database location info.
There are many ways to find what you are looking for. The first one that springs to mind would be to go to:
Central Admin > Perform a backup, or CentralAdmin/_admin/Backup.aspx
This will give you a detailed list of each web application, and by expanding it you will be able to find the content database's name.
To find the SQL database server, you can go to Central Admin > Application Management > Manage Content Databases.
Select the desired web application; clicking on the content DB name will show you the DB server name and the DB name.
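If you have the SharePoint Management Shell available, a quick sketch that surfaces the same information (the web application URL is a placeholder):
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Get-SPContentDatabase -WebApplication "http://yourwebapp" | Select-Object Name, Server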
How to determine SharePoint DB server using PowerShell
The following script allows you to determine the SharePoint DB server from the connection string stored in the Windows Registry:
if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null)
{
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}
Function GetDbConnectionString()
{
    $SPFarm = Get-SPFarm
    $SPVersion = $SPFarm.BuildVersion.Major.ToString() + "." + $SPFarm.BuildVersion.Minor.ToString()
    $ConfigDBKey = 'HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\' + $SPVersion + '\Secure\ConfigDB'
    (Get-ItemProperty -Path $ConfigDBKey -Name dsn).dsn
}
GetDbConnectionString
I can't describe this problem, but it happened after I restored a site collection from our development machine to the production server. I also see the same error page when I try to access the site collection.
I tried setting customErrors to Off in all web.config files on the server, but no luck.
I deleted the site collection and the web application of that site, but nothing changed. Other site collections are working just fine.
Please help.
This is what I see when trying to create a new Web Application.
Maybe one reason is that there is already a web application with the same name or the same port in Central Administration.
Can you confirm that that is not the case?
You can try to follow steps given below:
Go to Central Admin -> System Settings -> Manage Services on Server
Check the status of the following service:
Microsoft SharePoint Foundation Web Application
If it is stuck in the Stopping state, follow the steps below:
Open a command prompt
Navigate to the 14 hive \BIN path
Enter the following command:
stsadm -o provisionservice -action stop -servicetype spwebservice
Perform an IISRESET with the /noforce switch
Execute the following command:
stsadm -o provisionservice -action start -servicetype spwebservice
Perform an IISRESET with the /noforce switch
Hope this helps
I have one application server and one DB server. I have installed MOSS on the app server and its content DB is on the DB server. Due to policy reasons I had to rename the SharePoint database server.
Now the problem is that MOSS is not working. How do I make it work?
There is the stsadm command renameserver, but that is only for changing the host name.
Go into Central Admin, and under Content Databases, remove the database from the web application. Unfortunately, you can't reattach a content db using Central Admin, so you need to use stsadm. Here's the command:
STSADM -o addcontentdb -url <URL name> -databasename <database name> -databaseserver <database server name>
Are you searching for this command?
stsadm -o setconfigdb
You have to use it when the database server is renamed or when you want to connect MOSS to another config database.
In the case of the content DB, you have to do it with "stsadm -o addcontentdb", as the previous answer said.
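A sketch of what that could look like after the rename (the server and farm account names are placeholders; check the stsadm help for the exact parameters in your version):
stsadm -o setconfigdb -databaseserver NewDbServerName -farmuser DOMAIN\spfarm -farmpassword <password>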