I want to evaluate an ARM template file, containing the actual values passed by the user, before it is passed to the deployment engine.
Is there any way to do that?
I have started writing evaluation code with the PEG.js library for Node.js, with which I can evaluate a particular condition or expression from the Azure ARM template functions, but I can't evaluate the actual template that is passed to the deployment engine to create a service.
I have also checked the azure-rest-client SDK but couldn't find a way to do this there. Can you please help me find a solution to the above issue?
There is no built-in way of doing that. You can use the validate deployment API call (it is also implemented in the various SDKs/CLIs), but it doesn't actually guarantee that the template will work; it only does some basic sanity checks.
Your best bet is to write a script that deploys the template plus a set of tests that validate the outcome.
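To illustrate, here is a minimal sketch of calling the validate deployment REST endpoint from Node.js/TypeScript, since the question already uses a Node toolchain. The subscription, resource group, file names, and the ARM_TOKEN environment variable are placeholders, and it assumes Node 18+ for the built-in fetch; acquire the bearer token through whatever Azure AD flow you normally use.

```typescript
import { readFileSync } from "fs";

// Placeholders: substitute your own subscription, resource group and files.
const subscriptionId = "<subscription-id>";
const resourceGroup = "<resource-group>";
const deploymentName = "validation-only";

// The template and the user-supplied parameter values you want to check.
const template = JSON.parse(readFileSync("azuredeploy.json", "utf8"));
const parameters = JSON.parse(readFileSync("azuredeploy.parameters.json", "utf8")).parameters;

const url =
  `https://management.azure.com/subscriptions/${subscriptionId}` +
  `/resourcegroups/${resourceGroup}/providers/Microsoft.Resources` +
  `/deployments/${deploymentName}/validate?api-version=2021-04-01`;

const response = await fetch(url, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.ARM_TOKEN}`, // assumption: token obtained elsewhere
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    properties: { mode: "Incremental", template, parameters },
  }),
});

const result = await response.json();
// On failure, result.error describes which expression or resource failed the checks.
console.log(JSON.stringify(result, null, 2));
```

Keep in mind that a successful validation is still only the sanity check mentioned above, not proof that the deployment will succeed.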
Some context: Our Cloud Build process relies on manual triggers and about 8 substitutions to customize deploys to various firebase projects, hosting sites, and preview channels. Previously we used a bash script and gcloud to automate the selection of these substitution options, the "updating" of the trigger (via gcloud beta builds triggers import: our needs require us to use a single trigger, it's a long story), and the "running" of the trigger.
This bash script was hard to work with and improve, and through the import-run shenanigans actually led to some faulty deploys that caused all kinds of chaos: not great.
However, recently I found a way to pass substitution variables as part of a manual trigger operation using the Node.js library for Cloud Build (runTrigger with subs passed as part of the request)!
Problem: So I'm converting our build utility to Node, which is great, but as far as I can tell there isn't a native way to stream build logs from a running build to the console (except maybe with exec, but that feels hacky).
Am I missing something? Or should I be looking at one of the logging libraries?
I've tried my best scanning Google's docs and APIs (Cloud Build REST, the Node client library, etc.) but to no avail.
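For reference, here is a minimal sketch of the runTrigger call with substitutions described above, assuming the @google-cloud/cloudbuild Node client; the project, trigger ID, and substitution keys are placeholders. It doesn't stream anything, but the resolved Build resource does carry logUrl and logsBucket pointers to the logs.

```typescript
import { CloudBuildClient } from "@google-cloud/cloudbuild";

const client = new CloudBuildClient();

// Run the single shared trigger, passing the substitutions the bash script used to assemble.
const [operation] = await client.runBuildTrigger({
  projectId: "my-firebase-project",    // placeholder
  triggerId: "<trigger-id>",           // placeholder
  source: {
    branchName: "main",
    substitutions: {
      _HOSTING_SITE: "marketing-site", // placeholder substitution keys
      _PREVIEW_CHANNEL: "pr-123",
    },
  },
});

// The long-running operation resolves once the build finishes.
const [build] = await operation.promise();
console.log(`Build ${build.id} finished with status ${build.status}`);
console.log(`Console logs: ${build.logUrl}`); // link only; raw logs land in build.logsBucket
```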
Does anyone have experience using Terraform to deploy Step Functions?
I'm experimenting with step functions and need to deploy to multiple environments in a repeatable and auditable manner. I develop my step functions in the AWS console (Workflow Studio) in my sandbox environment but eventually I need to deploy them to my higher envs.
Currently this is done by exporting the step function from the sandbox environment as JSON and putting that into a Terraform module that is used to deploy the solutions. This poses a problem because the sandbox step function invokes a Lambda that lives in the sandbox environment, while the other environments each have their own Lambda deployed, which the step function should be calling instead.
To solve that problem, the step function JSON is actually a template file where the ARNs for the lambdas are replaced with a variable which is then expanded per environment with the appropriate value.
But all this makes for a terrible development experience. Every time a change is made to the step function, I have to export the JSON, copy it into the Terraform module and replace all the sandbox ARNs with the correct template variables.
Does anybody have suggestions on how to streamline this? Are step functions only good for ad-hoc data processing where repeatable and auditable deployments are not needed or am I missing some obvious solution here?
I would like to know how to post multiple records to SAP using BatchRequestBuilder along with a ChangeSet. I am using a custom OData service call (ODataCreateRequestBuilder), not the VDM model. I didn't find any blog or documentation to start with.
Can you please help me in this regard?
Updated:
Below is what I am trying to post to SAP
[{"purchaseSchAgrmntNo":"","customerMaterialNumber":"","plant":"","vendorNo":""},{"purchaseSchAgrmntNo":"","customerMaterialNumber":"","plant":"","vendorNo":""}]
SAP SDK version : 3.9.0
I have added the below code with only one create request.
ChangeSet changeSet = new ChangeSetBuilder()
        .addCreateRequest(
                ODataCreateRequestBuilder
                        .withEntity(sapConfig.getServiceUrlRepriceList(), sapConfig.getEntityRepriceList())
                        .withBodyAsMap(responseBody)
                        .build())
        .build();

BatchResult batchResult = BatchRequestBuilder
        .withService("URL?")
        .addChangeSet(changeSet)
        .build()
        .execute(httpClient);
Can you let me know if this is correct? Also, could you tell me what I have to pass as the service? Is it the service URL?
Thanks,
Arun Pai
The BatchRequestBuilder is actually not directly part of the SAP Cloud SDK but a dependency that the SDK internally uses to execute batch requests. That is why on the SDK level there is no documentation on how to use it.
Roughly, a batch request comprises multiple change sets, which in turn group together multiple operations. The ChangeSetBuilder allows you to build up change sets which you can then pass to a BatchRequestBuilder.
So if you want to run create requests in batch mode you would want to leverage public ChangeSetBuilder addCreateRequest(ODataCreateRequest oDataCreateRequest).
You can take a look at how the SAP Cloud SDK uses these classes to build up batch requests to get an idea of how it works in detail. As a starting point, look at BatchFluentHelperBasic. However, if you do know at compile time which service you want to query, I recommend leveraging the generator to generate this code so that you can use the VDM instead, which simplifies this.
If you extend your question to hold more specific information on what you actually want to achieve I can expand my answer to give a more concrete example. Also please include the SDK version you are using.
I am trying to find a good example of the JSON body for Create Build Definition in Azure DevOps. Most of the documentation I find has API definitions, but I haven't been able to find an example JSON body to work from.
Microsoft Documentation:
https://learn.microsoft.com/en-us/rest/api/azure/devops/build/definitions/create?view=azure-devops-rest-5.1
I have found this article that describes doing something similar to what I hope to accomplish. However, they are trying to duplicate the same build definition across different projects.
Similar Example:
https://www.nebbiatech.com/2018/11/29/automating-build-pipeline-creation-using-azure-devops-services-rest-api/
Ultimately, I would like to be able to generate (either create new or clone/modify) as many standard build definitions within a single project as are necessary by my automation. Each one of these build definitions will pull from a different repository within the project and have a different cosmetic name for the pipeline, but will be otherwise identical.
Any suggestions are greatly appreciated. Thanks!
Using a YAML build, as the comment suggested, will meet your requirements. It lets you define your build in a YAML file that lives with your code, which means you can use the same branching and code review practices for your build definitions as you do for your code.
The best way to get started with YAML pipelines is through the quickstart guide and Customize your pipeline. After that, to learn how to configure your YAML pipeline the way you need it to work, see conceptual topics such as Build variables and Jobs.
As for a sample application/json body to use when you call the REST API to create a build definition, you could also refer to the links below:
How to create Build Definitions through VSTS REST API
Create VSTS Build Definitions using PowerShell
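For what it's worth, below is a rough sketch of the kind of application/json body the 5.1 endpoint accepts for a YAML-based definition, posted from Node. The organization, project, repository id, pool name and PAT are placeholders, and the exact set of required fields can vary, so treat it as a starting point rather than a canonical example.

```typescript
// Sketch only: creates one YAML-backed build definition; all identifiers are placeholders.
const org = "<organization>";
const project = "<project>";
const pat = process.env.AZDO_PAT ?? "";

const definition = {
  name: "standard-build-my-repo",        // the per-repository cosmetic name
  path: "\\",                            // folder shown in the Pipelines UI
  repository: {
    id: "<repository-id>",               // the repository this definition builds
    type: "TfsGit",
    defaultBranch: "refs/heads/master",
  },
  process: {
    type: 2,                             // 2 = YAML process
    yamlFilename: "azure-pipelines.yml", // path of the YAML file inside the repository
  },
  queue: { name: "Hosted Ubuntu 1604" }, // agent pool; adjust to your organization
};

const response = await fetch(
  `https://dev.azure.com/${org}/${project}/_apis/build/definitions?api-version=5.1`,
  {
    method: "POST",
    headers: {
      Authorization: "Basic " + Buffer.from(":" + pat).toString("base64"),
      "Content-Type": "application/json",
    },
    body: JSON.stringify(definition),
  },
);

const created = await response.json();
console.log(`Created definition ${created.id}`);
```

Cloning a standard definition per repository then comes down to looping over the repositories in the project and varying only name and repository.id.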
We got an assignment to compile selected Siebel objects using VBA macros.
When I say selected, I mean that the list of objects will be available in an Excel sheet.
Is it possible to compile them automatically from VBA?
Any help will be appreciated. Thanks in advance.
I can help you with this.
NO.
You can double check with Oracle support.
As @Ranjith already mentioned, there is no supported API to compile an SRF. This applies to both the VBA COM and the Java Bean.
Even if you managed to find an undocumented way of compiling the SRF using VBA, it would be unsupported by Oracle. For any issue you have afterwards, they will ask you to reproduce it with a standard compile. So I'd also recommend not investing in this route.
For argument's sake, I'll assume for a moment that there is a supported way. Even then I'd argue that Excel is the worst way to automate the compile and deployment of an SRF. It's a client application, it can't (or can only with difficulty) run on a command line, and it doesn't interface with proper Continuous Integration tools like Jenkins, Travis CI, Bamboo and the lot.
Building a CI/CD pipeline for Siebel from scratch is complex. Take your time to research the matter. Have a look at the commercial party support and if you do want to develop your own, find a good DevOps engineer and couple him with a strong Siebel Engineer with deployment experience.
As all previous commentators mentioned, this is a challenge, but still possible.
As a matter of fact, you can use scripting on the Siebel Tools Object Compiler service, which is triggered via the siebdev.exe batch compile call. Playing around with the RepositoryName input parameter gives you a way to pass the Excel file name into the service.
Incremental compilation can be performed by following these steps on the PreInvokeMethod hook:
1. Open a transaction using the EAI Transaction Service (this may require some DLLs from the Windows Siebel Server distribution).
2. Create a new project (e.g. "__my_incremental_compilation__").
3. Find the desired repository objects and move them to your project.
4. Pass the project name to the ProjectsList parameter of the service's Inputs property set.
5. Continue the service call (wait for the compilation to finish).
6. Roll back the transaction.
This worked well for me, when I got stuck with the same question.
Hope it helps you!