We are using this template from Azure.
The database used is "postgresql flexible-server"; from the web app, we can already connect to the DB.
What is the best practice for deploying and executing a DB/table schema script on the database server as a one-time process?
E.g. using some extension on the server to deploy the script, like we have for VMs with DSC (Desired State Configuration) or CSE (Custom Script Extension)?
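To make the goal concrete, here is a minimal sketch of the kind of one-time step being asked about, assuming psql is available wherever the script runs; the server, user, database, and file names below are placeholders, not from the question:

```powershell
# Minimal sketch: execute a one-time schema script against an Azure Database
# for PostgreSQL flexible server. All names below are placeholders.
$env:PGPASSWORD = "<admin-password>"   # assumption: password auth is used

psql --host "mydemoserver.postgres.database.azure.com" `
     --port 5432 `
     --username "myadmin" `
     --dbname "appdb" `
     --file "schema.sql"
```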
I have a Node.js script which will fetch data from BigQuery and populate some of it into a database (running on Google Cloud Platform).
If the script is fetchdata.js, then running it on a server from the command line would be node fetchdata.js.
However, I am not sure how to accomplish this on Google Cloud Platform.
It seems that I have to deploy this script as a separate service (using Google App Engine / Compute Engine) and then call that service externally (using the browser) to execute that piece of code. Am I missing something here?
Earlier I thought this could be accomplished using Google Cloud Functions, but even that does not seem to be the correct use case.
Yes, you would need a client to connect to your SQL database in order to insert the fetched data, and for that you would need to use one of the services (Google Compute Engine or Google App Engine).
In App Engine you can run your Node.js script and then connect the App Engine instance to the Cloud SQL database (if you are using a Google Cloud SQL database).
In Compute Engine, you can run the script and connect the Compute Engine virtual machine to the database.
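A rough sketch of the Compute Engine route, assuming Node.js is already installed on the VM; the instance name, zone, and paths are placeholders:

```powershell
# Rough sketch: copy the script to a VM and run it there; assumes Node.js
# is installed on the instance. Instance name and zone are placeholders.
gcloud compute scp fetchdata.js my-instance:~/ --zone us-central1-a
gcloud compute ssh my-instance --zone us-central1-a --command "node ~/fetchdata.js"
```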
I want to run tests using real SQL databases. The SQL Server instance and the databases would be on Azure. I'm trying to configure a build definition on VSTS where the tasks would set up an Azure resource group with the databases, set the connection strings in my solution, and run the Entity Framework Core migrations against the databases. After the tests, a task should delete the resource group.
I created an Azure resource group with the databases I need and downloaded the PowerShell deployment files: the script, template.json, and so on.
The PowerShell script task can take either an inline script or a path to a script file. Should I add the PowerShell scripts to the solution so that VSTS can access them? The inline script option seems to be meant for small, few-line scripts, and the Azure deployment is quite big, with multiple files.
Setting up the connection strings shouldn't be too difficult. Several people suggest using the Replace Tokens task from the marketplace.
I'm not sure how to run the database migrations after that. Can I run the migrations in a PowerShell script task? How can I ensure that the needed cmdlets work?
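One possible shape for that PowerShell task, offered only as a sketch: it assumes the AzureRM module and the dotnet-ef tooling are available on the build agent, and the resource group, paths, and project names below are placeholders, not from the question:

```powershell
# Sketch of a build-task script: deploy the resource group from the downloaded
# template, then apply EF Core migrations. Resource group, paths, and project
# names are placeholders; assumes the AzureRM module and dotnet-ef tooling
# are installed on the build agent.
New-AzureRmResourceGroupDeployment `
    -ResourceGroupName "sql-test-rg" `
    -TemplateFile "$env:BUILD_SOURCESDIRECTORY\deploy\template.json" `
    -TemplateParameterFile "$env:BUILD_SOURCESDIRECTORY\deploy\parameters.json"

# Apply the EF Core migrations against the freshly created databases.
dotnet ef database update --project "src\MyApp.Data"
```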
Since the script is quite big, you can't use an inline script; you can add it to the project or to another path on the server (add an additional mapping for the different path).
Regarding database migration: if you enabled EF migrations in a web app, you can create a publish profile (web deploy package) with database migration enabled, build with the MSBuild arguments /p:DeployOnBuild=true /p:PublishProfile=[profile name];DesktopBuildPackageLocation="$(build.artifactstagingdirectory)\webEF.zip", and then deploy the package to the Azure Web App through the Azure Web Deploy task.
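Spelled out as one full command line (the solution and profile names are placeholders; in a VSTS build task, $(build.artifactstagingdirectory) is expanded by VSTS, so it is kept literal here):

```powershell
# The answer's MSBuild arguments as a single command line; solution and
# profile names are placeholders. The VSTS variable is left unexpanded.
msbuild MySolution.sln /p:DeployOnBuild=true `
    '/p:PublishProfile=MyProfile;DesktopBuildPackageLocation=$(build.artifactstagingdirectory)\webEF.zip'
```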
I am new to Cloud Shell, and I want to deploy a SQL database to Azure SQL Server.
Previously (using Windows PowerShell) I deployed a database using a dacpac file in combination with the Dac framework.
But Cloud Shell is something on the portal, and I don't have the Dac framework available there. Is there any other way to update/deploy a database to Azure SQL Server using Cloud Shell?
I found a workaround for the problem. I copied the Dac framework (130) folder, which contains the SqlPackage.exe file, to the Cloud Shell VM.
Then I ran the PowerShell script to deploy the dacpac, and it works!
I am still trying to install SSDT in the Cloud Shell environment, but for now I can deploy a database using a dacpac.
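For reference, the deployment step looked roughly like the following; the paths, server name, and credentials below are placeholders:

```powershell
# Sketch of the dacpac deployment via the copied SqlPackage.exe; paths,
# server name, and credentials are placeholders.
& "$HOME\Dac\130\SqlPackage.exe" /Action:Publish `
    /SourceFile:"$HOME\MyDatabase.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /TargetUser:"sqladmin" `
    /TargetPassword:"<password>"
```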
You can actually use PowerShell in the Azure Cloud Shell through the public preview. From there, you can import .bacpac files into your Azure SQL Database, as explained in this documentation page. The Dac framework is used internally by the Azure import/export module.
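A sketch of what such an import could look like from the Cloud Shell PowerShell preview; every name, key, and size below is a placeholder:

```powershell
# Sketch of a bacpac import with the AzureRM SQL cmdlets; all names, keys,
# and sizes are placeholders.
New-AzureRmSqlDatabaseImport `
    -ResourceGroupName "my-rg" `
    -ServerName "myserver" `
    -DatabaseName "MyDatabase" `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey "<storage-key>" `
    -StorageUri "https://mystorage.blob.core.windows.net/bacpacs/MyDatabase.bacpac" `
    -AdministratorLogin "sqladmin" `
    -AdministratorLoginPassword (Read-Host -AsSecureString) `
    -Edition "Standard" `
    -ServiceObjectiveName "S0" `
    -DatabaseMaxSizeBytes 5000000000
```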
We have a sandbox environment for clients to play in and we want to restore the database to a clean state every night.
I have not had any success in finding an example of how to set this up. It is easy to do in a local environment using a SQL Agent job, but I have no idea how to accomplish this in Azure.
One way to accomplish this would be an ARM template and a bacpac file. Your ARM template can be deployed with PowerShell or the Azure CLI on a cron schedule.
Blog Post: https://blogs.msdn.microsoft.com/kaevans/2016/03/28/deploy-bacpac-to-azure-sql-database-using-arm/
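A sketch of a nightly reset along those lines, assuming the AzureRM module; the resource names and template paths are placeholders, and the script would be triggered by cron or Azure Automation:

```powershell
# Sketch: drop the sandbox database, then redeploy it from an ARM template
# that imports a clean bacpac. All names and paths are placeholders.
Remove-AzureRmSqlDatabase -ResourceGroupName "sandbox-rg" `
    -ServerName "sandbox-server" -DatabaseName "SandboxDb" -Force

New-AzureRmResourceGroupDeployment -ResourceGroupName "sandbox-rg" `
    -TemplateFile ".\template.json" `
    -TemplateParameterFile ".\parameters.json"
```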
My client has a Node app running on Azure as a Cloud Service and using an Azure-hosted MS-SQL database. They would like another instance of the app running which they could use for testing changes before deployment to the production site. This instance would also have a separate database so that testing would not interfere with the live database. What is the most straightforward way to set this up?
I think you can either create another cloud service to deploy your testing app to, or use the staging slot of the cloud service you are using now, and create another database for testing purposes.
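For the staging-slot route on a classic Cloud Service, a sketch using the Azure Service Management PowerShell module; the service name, package, and configuration file are placeholders:

```powershell
# Sketch: deploy the test build to the staging slot of a classic Cloud
# Service. Service name, package, and configuration file are placeholders.
New-AzureDeployment -ServiceName "my-cloud-service" `
    -Slot "Staging" `
    -Package ".\app.cspkg" `
    -Configuration ".\ServiceConfiguration.Staging.cscfg" `
    -Label "test-build"
```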