Repository and Task input filepath - azure-pipelines-build-task

Is there a way to clear an input field in a task every time the source directory/repository is changed?
Context: I am developing a custom VSTS task that has an input field of type filePath.
I would like this input field to be cleared every time the user changes the Get sources selection or the repository.
Existing behavior in the "Copy Files to:" task:
E.g., under Get sources --> Select a source, I have selected GitHub and a repository from my account.
In the "Copy Files to:" task, in the "Source Folder" input field, I select a folder from my repo (Browse Source Folder --> select path).
If I then return to Get sources and change the repository or the selected source, the value in the "Copy Files to:" task is no longer relevant, but it stays the same; I have to clear the field manually.
I would like to have this cleared automatically.
Is there a way to accomplish this?

I checked the Azure Pipelines tasks samples and the task schema, but did not find a way to do this. If you would like this behavior, you can create a feature request in the Developer Community.
That will let you interact directly with the appropriate product group and makes it easier for them to collect and categorize suggestions.


MS Project : How to set Task Summary Name when importing from Excel

I'm using TargetProcess (Scrum) to manage our projects and enter timesheets.
So I made a tool that extracts all the needed data from TargetProcess to an Excel file, then imports this Excel file into our MS Project file to update everything.
Everything works perfectly except one thing:
In the Import Wizard, since I want to update my MS Project file rather than create a new one, I select the "Merge the data into the active project" option, with an ID as the "Merge Key".
The structure has three levels:
Project
  Epic
    Feature
When I do my import, if there is no new epic or feature, everything works correctly; all my data (time, % complete, dates, etc.) are updated.
But if there is a new feature (for example), it is appended to the end of my MS Project file instead of being placed under the correct project.
So for example, if I had:
Project_A
  Epic_A1
    Feature_A1a
Project_B
  Epic_B1
    Feature_B1a
and I add a new feature "Feature_A1b" under epic "Epic_A1" of project "Project_A" in TargetProcess (or in the Excel file used for the import), the result after the new import (merge) will be:
Project_A
  Epic_A1
    Feature_A1a
Project_B
  Epic_B1
    Feature_B1a
Feature_A1b
instead of:
Project_A
  Epic_A1
    Feature_A1a
    Feature_A1b
Project_B
  Epic_B1
    Feature_B1a
I tried defining the "Task Summary Name" in my Excel file and mapping it to the "Task Summary Name" field in MS Project during the import, but MS Project ignores it.
Is there any way to tell MS Project to append the new feature under the correct project rather than at the end of the file?
That is correct--when the Import Wizard is used to merge data, new tasks are added at the end of the schedule.
The way around this is to avoid having the Import Wizard add the new tasks. Instead, before running the wizard, add placeholder tasks where you want the new tasks to go and update your merge source file with the Unique IDs of these placeholder tasks (presuming that's what is used as the primary key).

Azure Storage Account file details in a table in Databricks

I am loading data via pipelines into an ADLS Gen2 container.
Now I want to create a table that records when each pipeline run started and completed, with fields like the below:
startts - start time of the job
endts - end time of the job
extractts - the extraction duration, which is what I want to derive.
Is there any approach I can use to create such a table? Help will be really appreciated.
It might not be exactly what you need, but you can add columns inside a Copy activity with the expression @pipeline().TriggerTime to populate your "startts" field. Go to Additional columns at the bottom of the Source tab, select New, and for the value choose "Add dynamic content".
For your "extractts" you could use the property of the activity called "executionDuration", which will give you the time it took to adf to run the activity in seconds. Again, use dynamic content and get this: #activity('ReadFile').output.executionDuration, then store this value wherever you need.
In this example I'm storing the value in a variable, but you could use it anywhere.
I dont understand your need for a field "endts", I would just simply do startts+extractts to get it.
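For instance, a minimal PySpark sketch of that derivation (hedged: the table name "pipeline_runs" is hypothetical, and it assumes "startts" is a timestamp column and "extractts" an integer number of seconds, per the question):

from pyspark.sql import functions as F

# `spark` is the SparkSession that Databricks notebooks provide by default
df = spark.table("pipeline_runs")

# endts = startts + extractts seconds: convert to epoch seconds, add, cast back
df = df.withColumn(
    "endts",
    (F.unix_timestamp("startts") + F.col("extractts")).cast("timestamp"),
)
df.show()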

GitLab: is there any way we can select variable values from a drop-down menu

In GitLab CI we have declared our variables like below:
variables:
  USER_NAME:
    value: ""
    description: "Enter Username"
  File_Name:
    description: "Enter the file name"
This only gives a text box where we can enter the value.
Can I make it a drop-down select box? Is there any option available for that, so the value can be selected from a drop-down?
GitLab 15.7 (December 2022) has finally implemented this:
Select predefined CI/CD variables values from a dropdown list
Previously, you could pre-fill CI/CD variables in the “Run pipeline” page, with a specific value.
Unfortunately, if you had multiple options for the variable’s value, you still had to manually input the option you wanted. This was an error-prone process because you could easily input an invalid value, or just mistype it.
In this release, we’ve added the ability to set a list of values which are surfaced in a drop-down list in the “Run pipeline” page.
Now you can define the exact list of values that are valid for each CI/CD variable when running a pipeline manually, greatly simplifying your workflow when using manually-triggered pipelines.
See Documentation and Issue.
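The syntax looks like this (a sketch based on the release notes; the variable name and values are illustrative):
variables:
  DEPLOY_ENVIRONMENT:
    value: "staging"      # default shown in the drop-down
    options:
      - "production"
      - "staging"
      - "canary"
    description: "The deployment target."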
No, it is not possible. Since we wanted something more configurable, what we ended up doing was building a form on GitLab Pages, which uses the API (or a plain POST) to create a new pipeline. This way we are quite flexible with the options we provide to the users.
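That POST is a single call to the pipelines endpoint. A hedged sketch in Python (the project ID, token, branch, and variable values are placeholders):

import requests

# Create a pipeline on a given ref, passing values for the prefilled variables.
resp = requests.post(
    "https://gitlab.com/api/v4/projects/<project-id>/pipeline",
    headers={"PRIVATE-TOKEN": "<your-access-token>"},
    json={
        "ref": "main",
        "variables": [
            {"key": "USER_NAME", "value": "alice"},
            {"key": "File_Name", "value": "report.csv"},
        ],
    },
)
resp.raise_for_status()
print(resp.json()["web_url"])  # link to the pipeline that was just created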
I see an active proposal for it at https://gitlab.com/gitlab-org/gitlab/-/issues/29159 (almost two years old), but there has been no activity for the moment.
It's not possible yet.
A backend ticket was created for it on May 19, 2022:
https://gitlab.com/gitlab-org/gitlab/-/issues/362934
Original proposal: https://gitlab.com/gitlab-org/gitlab/-/issues/29159

How to change the title of a report to be different from the .rdl file name

What is the variable that is displayed in the web portal as the report name?
Example: if my RDL file is named Reportforseeingthings.rdl,
I want the name displayed in the web portal to be "Report for seeing things".
Is there a report property that can be modified before deployment to handle this, along with a server config to use that property instead of the file name?
Open to other solutions as well.
As far as I know, there is no way to do this. You can set the file name and description properties, but the report name will always be the file name. You can modify it once deployed.
So the options are potentially:
Just rename the file in the solution.
Manually edit the name after deployment.
Create linked reports with the desired names; redeploying will update the base report, but the linked report name will stay the same.
Put the required title in the description and have a script that updates the names on demand, or as part of a trigger or scheduled job, something like:
-- Runs against the ReportServer catalog database
UPDATE r
SET Name = Description
FROM ReportServer.dbo.Catalog r
WHERE Name != Description AND Description IS NOT NULL
Note that even after doing this, the breadcrumb trail in the web portal will still show the original name, but the list/card views will show the name you want.

Default folders on SharePoint Document Set

I'm trying to build a project documentation solution on SharePoint 2013 using a custom Document Set content type. Each new project Document Set would hold four types of documents.
My initial thought was to use a folder for each type, to allow easy drag-and-drop when adding documents. The problem is that, by default, folders can be created only by adding default documents to the Document Set content type. I want the folders to be created when a new instance of the Document Set is created, but without any documents inside them.
This is what I have tried so far:
Using a workflow to delete the default documents inside the folders created in the new Document Set.
Custom folder content types.
The reason for using folders is to avoid metadata tagging when uploading documents.
I'm quite restricted in the tools I can use, so the most basic solutions would be appreciated.
To be clearer: I would like a Document Set content type that holds four folders. The Document Set acts as a container for all documentation relevant to a single project, and the folders inside group that documentation into categories, say marketing, finance, production, and HR. Each new project generates documents over its lifespan, and these documents should be grouped into those categories. So when I have a new project, I can create a new instance of the custom Document Set with pre-created, named, empty folders inside.
There are two ways you can achieve this, I think. The first is to set up your Document Set to deploy a document to each of the folders you'd like to create: simply add "Default Content" to your Document Set and specify the folders there. Unfortunately, there's no way to create empty folders this way, so you'll need to put a dummy document in each folder, which the user can then delete or overwrite.
The second requires some coding. By attaching a remote event receiver to the library, you can trigger custom code that creates the folder structure inside the Document Set once it has been created. This lets you create the folder structure without dummy documents; it's more work and requires a developer, but the end result is better (see the sketch below).
You might also be able to pull off the second approach with a workflow (or Microsoft Flow) that uses the APIs to create the folders, but that's most likely less clean and more hassle to get working.
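For illustration, the folder-creation call itself could look like this REST sketch in Python (hedged: the site URL, credentials, document set path, and folder names are placeholders; a production remote event receiver would typically do the equivalent in C# on the ItemAdded event):

import requests
from requests_ntlm import HttpNtlmAuth  # pip install requests-ntlm

SITE = "https://sharepoint.example.com/sites/projects"  # placeholder site
DOCSET = "/sites/projects/Shared Documents/Project X"   # server-relative doc set path
AUTH = HttpNtlmAuth("DOMAIN\\user", "password")
VERBOSE = "application/json;odata=verbose"

# SharePoint requires a form digest for POST requests.
digest = requests.post(
    f"{SITE}/_api/contextinfo", auth=AUTH, headers={"Accept": VERBOSE}
).json()["d"]["GetContextWebInformation"]["FormDigestValue"]

# Create one subfolder per document category inside the document set.
for name in ("Marketing", "Finance", "Production", "HR"):
    resp = requests.post(
        f"{SITE}/_api/web/folders",
        auth=AUTH,
        headers={"Accept": VERBOSE, "Content-Type": VERBOSE, "X-RequestDigest": digest},
        json={"__metadata": {"type": "SP.Folder"}, "ServerRelativeUrl": f"{DOCSET}/{name}"},
    )
    resp.raise_for_status()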
Alternatively, create a custom content type called "Folder Creator" and use it to create the subfolders. Then have the "Folder Creator" items auto-delete using SharePoint Information Management Policies. Be sure to set your "Expiration Policy" job to run at least once a day.
