How do I deploy to GAE from the Google Cloud Source Repository? - google-cloud-source-repos

I'm considering moving my php Google App Engine project from Codenvy to the Google Source Repository and edit it there with the Source Editor, but I don't see how to cause it to deploy my project. How do I do that?

Here's what works for me, found through guesswork, trial and error, and only a little documentation.
I had set up Google Cloud Source Repositories with a repository that mirrors Bitbucket, auto-named default. Note: the gcloud commands referring to default below can fail to recognise a repository that got that name via Rename, and can mistake a non-existent repository for an empty one.
Recipe 1
UPDATE: Now, after updating the Bitbucket source, the deployed app does not show the update, despite reporting "Deployment successful". I don't know why - perhaps due to the version number. Workaround: use Recipe 2.
1 Ensure the project's app.yaml file contains application: and version: entries (see the sample app.yaml sketch after this recipe)
2 Go to Google Cloud Platform and select the project
3 Click Activate Google Cloud Shell
4 In Google Cloud Shell, enter:
gcloud source repos clone default
appcfg.py update default
rm -rf default
This took ~20s to deploy and ~30s to complete.
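For reference, here is a minimal sketch of an app.yaml that satisfies step 1 of Recipe 1. It assumes a PHP 5.5 standard runtime; the project ID, version and handler values are placeholders, not taken from the question, and it is written as a shell heredoc so it can be pasted straight into Cloud Shell.
# Minimal app.yaml sketch for Recipe 1 (placeholder values; adjust runtime/handlers to your project)
cat > app.yaml <<'EOF'
application: your-project-id
version: 1
runtime: php55
api_version: 1
handlers:
- url: /.*
  script: index.php
EOF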
Recipe 2
1 Ensure the project's app.yaml file does not contain application: or version: entries (otherwise the deploy fails with an error)
2 Go to Google Cloud Platform and select the project
3 Click Activate Google Cloud Shell
4 In Google Cloud Shell enter:
gcloud source repos clone default
gcloud --quiet app deploy default/app.yaml
rm -rf default
Warning: This can leave a previous version accessible (see the version-cleanup sketch after the timings below).
This took ~65s to complete.
Regarding timing, compare Recipe 1, which took ~20s. Timings are for a Hello World project.
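Regarding the Recipe 2 warning about an older version remaining accessible, a hedged sketch of how to inspect and clean up leftover versions with gcloud; the version ID shown is a placeholder.
# List the deployed versions, then stop or delete the stale one (placeholder version ID).
gcloud app versions list
gcloud app versions stop 20180101t120000 --service=default
gcloud app versions delete 20180101t120000 --service=default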

Related

How can I deploy arbitrary files from an Azure git repo to a Databricks workspace?

Databricks recently added support for "files in repos", which is a neat feature. It gives a lot more flexibility to our projects, since we can now add .json config files and even write custom Python modules that exist solely in our closed environment.
However, I just noticed that the standard way of deploying from an Azure git repo to a workspace does not support arbitrary files. First off, all .py files are converted to notebooks, breaking the custom modules that we wrote for our project. Secondly, it intentionally skips files ending in one of the following: .scala, .py, .sql, .SQL, .r, .R, .ipynb, .html, .dbc, which means our .json config files are missing when the deployment is finished.
Is there any way to get around these issues or will we have to revert everything to use notebooks like we used to?
You need to stop deploying the old way, as it depends on the Workspace REST API, which doesn't support arbitrary files. Instead, have a Git checkout in your destination workspace and update that checkout to a given branch/tag when doing a release. This could be done via the Repos API or the databricks CLI. Here is an example of how to do that with the CLI from a DevOps pipeline.
- script: |
    echo "Checking out the releases branch"
    databricks repos update --path $(STAGING_DIRECTORY) --branch "$(Build.SourceBranchName)"
  env:
    DATABRICKS_HOST: $(DATABRICKS_HOST)
    DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
  displayName: 'Update Staging repository'
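If you prefer the Repos API route mentioned above instead of the CLI, a rough sketch of the equivalent call; REPO_ID and BRANCH are placeholders you would resolve yourself (e.g. via GET /api/2.0/repos), and the host/token are the same values used in the pipeline step.
# Sketch: update the workspace checkout to a branch via the Databricks Repos REST API.
curl -s -X PATCH "$DATABRICKS_HOST/api/2.0/repos/$REPO_ID" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{\"branch\": \"$BRANCH\"}"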

Github Codespaces - Could not detect the platform/language from repo

I'm trying to open this repository using Github Codespaces. Note that this repository is correctly configured for local devcontainer development.
However, when I try to open it in Codespaces, it seems to build the container correctly, but fails with: Could not detect any language/platform in the source directory (full log here)
What am I missing?
It looks like you may have run into a regression that Codespaces had during the time specified in your log file.
Given your configuration, Oryx should no longer run, which means you should no longer run into this issue.
Would you mind retrying?

Proper way to set up a release pipeline in Azure DevOps for a Python-based Azure Function

I've a working build pipeline in Azure DevOps that essentially installs Python 3.6, sets up a virtual environment (.env) and then executes all unit tests. As its final step, it uses a copy operation to move all files, including the virtual environment, to a drop folder.
My problem arises from creating the release pipeline. I am running a bash script for the release pipeline that essentially installs the Azure Functions Core Tools, and then I activate the Python virtual environment before I call the func azure functionapp publish instruction.
The error I get states that settings are encrypted and that I need to call func settings add to add settings; however, when run locally, the script executes without any error whatsoever.
Does anyone have a working release pipeline in Azure Devops for a python-based Azure Function that they'd be able to share with me, so I can perhaps see what I am doing wrong?
Here is the relevant bit of script that executes:
#!/usr/bin/env bash
FUNCTION_APP_NAME="secret"
FUNCTION_APP_FOLDER="evenMoreSecret"
# Install Azure Functions Core Tools
echo "--> Install Azure Functions Core Tools"
wget -q https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update
sudo apt-get install azure-functions-core-tools -y
echo ">>>>>>>> Initialize Python Virtual Environment"
source .env/bin/activate
echo "--> Publish the function app to Azure Functions"
cd $FUNCTION_APP_FOLDER
func azure functionapp publish $FUNCTION_APP_NAME --build-native-deps
The script is executed using the Azure CLI, with a security principal that is tied to the Azure account it is targeting.
Usually with Azure DevOps you create several build steps that result in some build artifacts - these are defined in the azure-pipelines.yml file. You then do a release step to release the artifacts that you have created - this is created within the UI. This can involve deploying to a test server and then to production, or however you want to configure it. What you are describing does the build and release steps all in one YAML file, since func publish is essentially doing a release, and it all seems to be in one script.
In the next release of the az CLI there is a new command called az functionapp devops-build that will set up the DevOps pipeline with separate build and release steps. However, in the meantime, we have created a series of beta YAML files that we hope you can just drag and drop to do the build and release steps within the build part alone (as you are doing).
The beta yaml files are here:
https://github.com/Azure/azure-functions-devops-build/wiki/Yaml-Samples
I must disclaim that they are not fully tested, nor are they supported yet.
I will answer myself as I've solved the problem.
To @Oliver Dolk: We do NOT want to publish as part of the build pipeline. The only thing I'm interested in there is setting up a virtual environment and running the unit tests.
The RELEASE stage is where we want to deploy the scripts copied over from the build step. These artifacts are then the basis for releasing into the dev, test and production environments.
I was missing a very important step in my script: creating a local.settings.json file, which contains the encrypted settings for the function app.
In order to solve the problem, I only had to call the following:
func azure functionapp fetch-app-settings $FUNCTION_APP_NAME
This calls the Azure function app and retrieves its settings into an encrypted local.settings.json, which is then used during publishing.
For a complete script reference of both the build YAML script and the bash script that does the deployment, I've put both in an anonymized GitHub repo:
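For context, here is a sketch of how that call slots into the deployment script shown earlier, using the same variables; fetch-app-settings runs before publish so the encrypted local.settings.json exists when publishing.
# Sketch: fetch the app settings before publishing (variables as defined in the script above).
cd $FUNCTION_APP_FOLDER
func azure functionapp fetch-app-settings $FUNCTION_APP_NAME
func azure functionapp publish $FUNCTION_APP_NAME --build-native-deps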
https://github.com/digitaldias/Python-Examples

Azure Web Apps - how to run script before deployment

I'm trying to use Azure Web Apps (Linux) to host a basic static site. I configured everything so a new deployment happens with every Git push. I put my pre-built pages in my repo to confirm everything works fine with this setup.
Now I've removed the pre-built pages and kept only the templates and the build script (which is basically just an npm install and a mustatic 'compile') and I'd like to run this build script in my web app. I've scoured the internet but can't find anything.
How can I run a script upon first deployment and after each Git-push-triggered deployment?
First, you need to generate a custom deployment script by using the azure-cli tool.
1) Set the cli working mode to asm.
azure config mode asm
2) Run the custom deployment script generator command.
azure site deploymentscript --node -t bash
This will generate the files required to deploy your site.
.deployment - Contains the command to run for deploying your site.
deploy.sh - Contains the deployment script.
Now you can edit the deploy.sh file and add your custom steps.
Once that is done, add the generated files (.deployment and deploy.sh) to your repository and push it to your Azure Web App to see your custom deployment running.
For more details, please refer to this blog post.
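As an illustration of the kind of custom step the question is after, here is a hedged sketch of what might be added to the generated deploy.sh; the npm script name is an assumption standing in for the question's build script, and DEPLOYMENT_TARGET is the destination folder Kudu exposes to the script.
# Hedged sketch of a custom build step inside deploy.sh (assumed npm script name).
cd "$DEPLOYMENT_TARGET"
npm install
npm run build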

Cannot setup CodeCommit in EB CLI

After terminating my previous environment in the EB CLI with eb terminate, which executed successfully, I have been trying to deploy my node app in a different region. When I navigate to my app folder and run eb init, I am prompted with
$ eb init
Cannot setup CodeCommit because there is no Source Control setup, continuing with initialization
What can I do from here?
I double checked in IAM and the user has full codecommit access
It looks like you have not initialized git in your directory. Try running git init in the directory you want to use with CodeCommit via the EB CLI.
If you have done this and it is still not working, git may not be available in the environment you are using and may need to be installed.
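For reference, a minimal sketch of those steps; the commit message is a placeholder, and the commit is only there so the EB CLI has a branch to associate with CodeCommit.
# Sketch: initialize git before re-running eb init, as suggested above.
git init
git add -A
git commit -m "initial commit"
eb init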
It happened to me.
It turned out that it happens because there was already a .elasticbeanstalk folder in the directory,
which means an Elastic Beanstalk application is already configured there,
and eb init will continue with the existing configuration.
Solution:
Delete the .elasticbeanstalk directory.
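A sketch of that cleanup; note it discards the existing Elastic Beanstalk configuration for the directory.
# Remove the stale configuration and reinitialize.
rm -rf .elasticbeanstalk
eb init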
