Create Dialogflow agents under a single project - node.js

I am currently evaluating ways to create multiple Dialogflow agents programmatically. While doing the analysis I tried the REST interface released on June 13, 2019; with it I am able to edit an existing agent, but I could not create a new one.
I need pointers on the following:
Can we create multiple agents under a single project?

Nope. From this page:
Note: You can only create one agent for a GCP project. If you need multiple agents, you will need to create multiple projects.

Related

How to add all users to a project, including new ones?

I would like to set up a sandbox project in my school GitLab server (self-hosted, free) that all users, especially new ones, can use to test whatever they need.
How can I add all users to the same project?
I already read this related question (which asks the opposite), but it only partially helps; the most useful answer tells me to use the API, which is fine for adding all current users to a project, but I also want to add new ones.
Is there a way to add a user to a project, triggered by that user being confirmed?
One built-in method would be to use system hooks. For example, you can create a hook that responds to user_create events and adds the user to the project.
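For illustration, here is a minimal sketch of such a receiver in Python/Flask, assuming you register its URL as a system hook in the admin area; the instance URL, token, and project ID are hypothetical placeholders:

import requests
from flask import Flask, request

app = Flask(__name__)

GITLAB_URL = "https://gitlab.example.com"  # your instance (assumption)
ADMIN_TOKEN = "REDACTED"                   # token with rights to add members
SANDBOX_PROJECT_ID = 1234                  # the shared sandbox project
ACCESS_LEVEL = 30                          # 30 = Developer

@app.route("/hooks/gitlab", methods=["POST"])
def on_system_hook():
    event = request.get_json()
    # System hook payloads carry an event_name field; react to new users only.
    if event and event.get("event_name") == "user_create":
        requests.post(
            f"{GITLAB_URL}/api/v4/projects/{SANDBOX_PROJECT_ID}/members",
            headers={"PRIVATE-TOKEN": ADMIN_TOKEN},
            data={"user_id": event["user_id"], "access_level": ACCESS_LEVEL},
        ).raise_for_status()
    return "", 204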
Another way may be to run a scheduled CI pipeline (or a cron job on the server) that scripts this or similar automation.
You can use the users list API to enumerate all current users in your GitLab instance (requires admin privileges). You can also use the project membership API to enumerate all members of the project. You can compare the two results to find any users that need to be added.
Pseudocode:
project_members = get_project_members(project_id=1234)  # project members API
for user in get_all_gitlab_users():  # list users API
    if user not in project_members:
        add_project_member(user=user, project_id=1234)
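A concrete version of that pseudocode, as a hedged sketch against the GitLab v4 REST API using python-requests (the instance URL, token, and project ID are placeholders; listing all users requires admin rights):

import requests

GITLAB_URL = "https://gitlab.example.com"
ADMIN_TOKEN = "REDACTED"
PROJECT_ID = 1234
ACCESS_LEVEL = 30  # 30 = Developer

def get_all(path):
    """Yield every item from a paginated GitLab API endpoint."""
    page = 1
    while True:
        resp = requests.get(
            f"{GITLAB_URL}/api/v4{path}",
            headers={"PRIVATE-TOKEN": ADMIN_TOKEN},
            params={"per_page": 100, "page": page},
        )
        resp.raise_for_status()
        items = resp.json()
        if not items:
            return
        yield from items
        page += 1

member_ids = {m["id"] for m in get_all(f"/projects/{PROJECT_ID}/members")}
for user in get_all("/users"):
    if user["id"] not in member_ids:
        requests.post(
            f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/members",
            headers={"PRIVATE-TOKEN": ADMIN_TOKEN},
            data={"user_id": user["id"], "access_level": ACCESS_LEVEL},
        ).raise_for_status()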

How to add test result attachments during an Azure DevOps build or release via a C# application?

I want to add some test case attachments during a build or release, but I'm struggling to find a valid approach for this. I'm not using MSTest.
I tried creating a custom build/release task but I've found the azure-devops-node-api package to be flaky at best, and seemingly lacking contributors.
This is what I would hope to do...
Use C# if possible
Have the code/task available for either a build or release across multiple repositories and projects (same organization) without code duplication
Automatically authenticate with the currently running build/release without needing PAT tokens or any other form of authentication
Access to both Azure File Storage and Azure DevOps
Works with any build or release agent
Is this achievable? I've seen the odd article in various places, but nothing like what's described above. For example, this shows promise in terms of validating the current build/release in a C# application; however, it is 4 years old now and doesn't explain how to integrate with a pipeline.
Can anyone help?
Thanks,
I've always leveraged MSTest, so within the runner we've had access in C# to the TestContext, which supports adding the attachments directly to the test result.
It looks like the API is exposed for adding attachments to the runs, though, so I would think you can create something in either C# or PowerShell that accomplishes what you are asking. You will likely need to make sure the agent phase has access to the OAuth token.
POST https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/Results/{testCaseResultId}/attachments?api-version=5.1-preview.1
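For illustration, a minimal sketch of calling that endpoint from inside a pipeline step, written in Python for brevity (the same call works from C# with HttpClient). It assumes the job exposes System.AccessToken to scripts and that you have already looked up the run and result IDs; the organization, project, IDs, and file name are hypothetical:

import base64
import os
import requests

ORG, PROJECT = "myorg", "myproject"       # placeholders
RUN_ID, RESULT_ID = 42, 100000            # discovered via the Runs/Results APIs
token = os.environ["SYSTEM_ACCESSTOKEN"]  # mapped from $(System.AccessToken)

with open("screenshot.png", "rb") as f:
    body = {
        "stream": base64.b64encode(f.read()).decode("ascii"),  # base64 content
        "fileName": "screenshot.png",
        "comment": "Uploaded from the pipeline",
        "attachmentType": "GeneralAttachment",
    }

resp = requests.post(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/test"
    f"/Runs/{RUN_ID}/Results/{RESULT_ID}/attachments",
    params={"api-version": "5.1-preview.1"},
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
resp.raise_for_status()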

Azure Data Factory V2 multiple environments like in SSIS

I'm coming from a long SSIS background, and we're looking to use Azure Data Factory v2, but I'm struggling to find any clear way of working with multiple environments. In SSIS we would have project parameters tied to the Visual Studio project configuration (e.g. development/test/production); say there were two parameters, SourceServerName and DestinationServerName: these would point to different servers depending on whether we were in development or test.
From my initial playing around I can't see any way to do this in Data Factory. I've searched Google, of course, but any information I've found seems to be about CI/CD, then talks about Git 'branches', and is difficult to follow.
I'm basically looking for a very simple explanation and example of how this would be achieved in Azure Data Factory v2 (if it is even possible).
It works differently. You create an instance of data factory per environment and your environments are effectively embedded in each instance.
So here's one simple approach:
Create three data factories: dev, test, prod
Create your linked services in the dev environment pointing at dev sources and targets
Create the same named linked services in test, but of course these point at your test systems
Now when you "migrate" your pipelines from dev to test, they use the same logical name (just like a connection manager)
So you don't designate an environment at execution time or map variables or anything... everything in test just runs against test, because that's the way the linked services have been defined.
That's the first step.
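For illustration, the dev factory might hold a linked service definition roughly like the following (names and connection string are hypothetical); the test factory holds a definition with the same name, LS_SourceSql, whose connection string points at the test server instead:

{
    "name": "LS_SourceSql",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:sql-dev.database.windows.net,1433;Initial Catalog=SourceDb;User ID=etl_user;Password=..."
        }
    }
}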
The next step is to connect only the dev ADF instance to Git. If you're a newcomer to Git it can be daunting but it's just a version control system. You save your code to it and it remembers every change you made.
Once your pipeline code is in git, the theory is that you migrate code out of git into higher environments in an automated fashion.
If you go through the links provided in the other answer, you'll see how you set it up.
I do have an issue with this approach, though: you have to look up all of your environment values in the keystore, which to me is silly, because why do we need to designate the test server's hostname every time we deploy to test?
One last thing: if you have a pipeline that doesn't use a linked service (say a REST pipeline), I haven't found a way to make it environment-aware. I ended up building logic around the current data factory's name to dynamically change endpoints.
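As a hedged sketch of that trick: ADF exposes the factory name as the system variable pipeline().DataFactory, so a URL field set via dynamic content could switch endpoints on a naming convention (the hostnames here are made up):

@if(endswith(pipeline().DataFactory, '-dev'),
    'https://api-dev.example.com/v1',
    'https://api.example.com/v1')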
This is a bit of a brain dump, but feel free to ask questions.
Although it's not recommended - yes, you can do it.
Take a look at the Linked Service; in this case, I have a connection to an Azure SQL Database.
You can use dynamic content for both the server name and the database name.
Just add a parameter to your pipeline, pass it to the Linked Service and use in the required field.
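As a rough example of that pattern, a parameterized linked service definition could look like this (names are hypothetical); the pipeline then supplies serverName and databaseName when it references the linked service:

{
    "name": "LS_AzureSqlDb",
    "properties": {
        "type": "AzureSqlDatabase",
        "parameters": {
            "serverName": { "type": "String" },
            "databaseName": { "type": "String" }
        },
        "typeProperties": {
            "connectionString": "Server=tcp:@{linkedService().serverName}.database.windows.net,1433;Initial Catalog=@{linkedService().databaseName};"
        }
    }
}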
Let me know whether I explained it clearly enough.
Yes, it's possible, although not as simple as it was in Visual Studio for SSIS.
1) First of all: there is no desktop application for developing ADF, only the browser.
Therefore developers should make their changes in the DEV environment, and for many reasons the best way to do that is to work with a Git repository connected.
2) Then, you need "only" to:
a) Publish the changes (this creates/updates the adf_publish branch in Git)
b) With Azure DevOps, deploy the code from adf_publish, replacing the required parameters for the target environment (see the sketch below).
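For illustration, the adf_publish branch contains generated ARM templates (ARMTemplateForFactory.json and ARMTemplateParametersForFactory.json), so step b) can be as simple as an Azure CLI deployment that overrides the factory name and any environment-specific parameters (the resource group and factory name below are hypothetical):

az deployment group create \
    --resource-group rg-dataplatform-test \
    --template-file ARMTemplateForFactory.json \
    --parameters ARMTemplateParametersForFactory.json \
    --parameters factoryName=adf-myproject-test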
I know that at the beginning it sounds horrible, but the sooner you set up an environment like this the more time you save while developing pipelines.
How to do these things step by step?
I describe all the steps in the following posts:
- Setting up Code Repository for Azure Data Factory v2
- Deployment of Azure Data Factory with Azure DevOps
I hope this helps.

Transfer set of repositories from one gitlab group to another subgroup all at once

How do I transfer a repository, or a whole set of repositories, from one GitLab group to another subgroup? For example, from companyname.gitlab.com/team one/ to gitlab.com/team_first/phase1/.
The repositories/projects themselves still need to be exported via the API, one by one.
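For example, a hedged sketch of triggering those exports one by one over the GitLab v4 API (instance URL, token, and group ID are placeholders):

import requests

GITLAB_URL = "https://gitlab.example.com"
TOKEN = "REDACTED"
GROUP_ID = 42
headers = {"PRIVATE-TOKEN": TOKEN}

# List the group's projects (add pagination for more than 100 projects).
projects = requests.get(
    f"{GITLAB_URL}/api/v4/groups/{GROUP_ID}/projects",
    headers=headers, params={"per_page": 100},
).json()

for project in projects:
    # Schedule an export; poll GET /projects/:id/export until
    # export_status == "finished", then fetch .../export/download.
    requests.post(
        f"{GITLAB_URL}/api/v4/projects/{project['id']}/export",
        headers=headers,
    ).raise_for_status()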
But the new "Group Import/Export" feature from GitLab 13.0 (May 2020) can be a welcome addition.
Export and Import Groups in the UI
Previously, users could only migrate groups by using the Export/Import API to create an Export file, then using the API a second time to upload the file to the target instance.
As a first step toward a more frictionless solution, we have enabled Group Export in the GitLab UI.
We plan to introduce similar Import functionality to the UI within the next few weeks.
See documentation, issue and Epic.
See GitLab 14.2 (August 2021)
Group Migration achieves parity with group import/export
The new GitLab Migration feature can now migrate an entire group with all its subgroups and related data. The data migrated includes everything contained in group exports, making this a much easier way to migrate entire groups.
The pre-existing group import/export is a two-step process that requires exporting a file and then importing it into another GitLab instance.
Now, users can initiate a group migration with a single click. Migration also includes all the subgroups and their data, which previously required separate export and import processes for each subgroup.
See Documentation and Epic.
See GitLab 15.6 (November 2022)
Associate MRs to issues when migrating groups with projects
When migrating groups using GitLab Migration, GitLab now preserves associations of imported merge requests to issues.
This populates the Related merge requests section on the issue details page.
See Documentation and Issue.

Script Deployment Management Tool for NetSuite

We are looking at removing developers from production and want a simple kind of deployment management tool. One suggestion, which some members are using with Salesforce, is Jenkins. I have never used Jenkins or any kind of deployment tool before. I normally just copied my code from the IDE and updated the file in the SuiteScript file cabinet.
Does Jenkins work for NetSuite? Or what do you recommend for this purpose?
We are planning to use Bitbucket (which runs Git in the background) as our version control, in case that matters.
Thank you for any help
IMO the greatest challenge in integrating with any CI environment (be it Jenkins or any other) is the fact that you can move code files from one system to another using code/APIs, but NOT things like scripts, custom records, fields, and their deployments, for which you need a bundling process and hence manual intervention.
At SuiteWorld 2015, NetSuite said that it is coming up with "Change Management", which would allow you to put everything that is part of your app into a version control system such as Git. Please see SuiteAnswer Id 42387. When this feature is rolled out, you can integrate with your CI tool to automatically copy/deploy your app details to another NetSuite account, run your tests there, and pass/fail your build accordingly.
Why do you want to remove developers from Production? This will severely hamper their ability to create solutions for your NetSuite account and will create a ton of overhead for them.
If you must have them out of Production, then probably your "best" option would be to have them build their solutions in Sandbox and then use SuiteBundles for deployment to Production. A Production Admin would need to update the appropriate Bundle(s) for all Production migrations.
NetSuite has also built a SuiteCloud IDE plugin for Eclipse which allows uploading and downloading files (no copy-paste necessary), so if you're not using that I would recommend it.
We are using Jenkins for our own internal automated testing, but not for deployment into NetSuite. I do not know if someone has already built a NetSuite plugin for Jenkins; it is likely you would have to build your own file upload mechanism using the NetSuite Web Services SOAP API, but that would still only allow deployment of source files. Developers will most likely also need to be creating and updating custom records, fields, lists as well as Script records and Script Deployment records, which you will not be able to do through Jenkins or any other tool that I know of.
