VSTS extension can't find sdk - azure-pipelines-build-task

I've gone through several examples, but despite my efforts, every time I execute my release task it can't locate the SDK:
[error]System.Management.Automation.CmdletInvocationException: The term 'Trace-VstsEnteringInvocation' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
I've tried placing the SDK under the task root, per the advice of this post. I've also tried it with the version number removed from the path.
|-- task root
|----- ps_modules
|-------- VstsTaskSDK
|----------- 0.10.0
|-------------- <corresponding sdk files, including VstsTaskSdk.psd1>
|----- deploy.ps1
|----- icon.png
|----- task.json
I've also tried it with the SDK folder one level above the task, as I've seen that work. Again, I tried that with the version number both present and removed.
|-- root
|----- task root
|-------- deploy.ps1
|-------- icon.png
|-------- task.json
|----- sdk
|-------- ps_modules
|----------- VstsTaskSDK
|-------------- 0.10.0
|----------------- <corresponding sdk files, including VstsTaskSdk.psd1>
Then I tried adjusting the files stanza in the vss-extension.json file, including sdk, sdk/ps_modules/VstsTaskSdk, etc. I've tried 15 permutations with no success.
When I create the package, I run the following command from the extension root (not sure if that's relevant):
tfx extension create --manifest-globs .\vss-extension.json
Any ideas on what I'm missing?
UPDATE
I have not been able to resolve this, so I simplified my task for testing purposes. My task now contains just the SDK, no parameters, and only two cmdlets in my PowerShell script:
Trace-VstsEnteringInvocation $MyInvocation
Trace-VstsLeavingInvocation $MyInvocation
My task.json:
{
    "id": "<myguid>",
    "name": "VstsSdkTest",
    "friendlyName": "VstsSdkTest",
    "description": "A Test Task to troubleshoot vststasksdk issue",
    "helpMarkDown": "",
    "category": "Deploy",
    "visibility": [
        "Deploy"
    ],
    "instanceNameFormat": "VstsSdkTest",
    "author": "<myAuthor>",
    "version": {
        "Major": 0,
        "Minor": 1,
        "Patch": 1
    },
    "execution": {
        "Powershell3": {
            "target": "VstsTaskSDKTest.ps1"
        }
    }
}
My folder structure looks like this:
|-- VstsSdkTest
|----- VstsTaskSDKTest.ps1
|----- icon.png
|----- task.json
|----- ps_modules
|-------- VstsTaskSDK
|----------- <corresponding sdk files, including VstsTaskSdk.psd1>
Even with this simplified task, every time it executes (on the hosted agent) it cannot find the SDK. It seems to be looking for the SDK in a folder under the task that corresponds to the task version number.
2017-10-24T20:39:08.9599715Z ##[error]File not found: 'd:\a\_tasks\VstsSdkTest_<myGuid>\0.1.1\ps_modules\VstsTaskSdk\VstsTaskSdk.psd1'

The issue was that I didn't understand I had to update the version number in the task.json file whenever I moved the VstsTaskSdk folder. I assumed I only needed to update the vss-extension.json version because it contains the files stanza. That is not the case. The final folder structure that worked for me was having the SDK stored in a folder at the task root.
|-- VstsSdkTest
|----- VstsTaskSDKTest.ps1
|----- icon.png
|----- task.json
|----- ps_modules
|-------- VstsTaskSDK
|----------- <corresponding sdk files, including VstsTaskSdk.psd1>
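For reference, a vss-extension.json that packages this layout would look roughly like the sketch below. The id, publisher, and version values are placeholders (not taken from the original post), and the contribution wiring follows the standard build-task extension shape. The files entry pulls in the whole VstsSdkTest folder, ps_modules included, while the version inside task.json is what determines the folder the agent downloads the task into.
{
    "manifestVersion": 1,
    "id": "<myExtensionId>",
    "name": "VstsSdkTest",
    "version": "0.1.1",
    "publisher": "<myPublisher>",
    "targets": [
        { "id": "Microsoft.VisualStudio.Services" }
    ],
    "files": [
        { "path": "VstsSdkTest" }
    ],
    "contributions": [
        {
            "id": "VstsSdkTest",
            "type": "ms.vss-distributed-task.task",
            "targets": [ "ms.vss-distributed-task.tasks" ],
            "properties": { "name": "VstsSdkTest" }
        }
    ]
}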

You are using PowerShell and calling SDK commands, so use the PowerShell3 execution handler instead (not PowerShell).
Part of the task.json file:
"execution": {
    "PowerShell3": {
        "target": "deploy.ps1",
        "argumentFormat": ""
    }
}

Related

Npm package missing build files after npm install [duplicate]

I would like to publish an npm package that contains my source as well as distribution files. My GitHub repository contains a src folder with the JavaScript source files. The build process generates a dist folder that contains the distribution files. Of course, the dist folder is not checked into the GitHub repository.
How do I publish an npm package in such a way that when someone does npm install, they get the src as well as the dist folder? Currently, when I run npm publish from my Git repository, only the src folder is published.
My package.json file looks like this:
{
    "name": "join-js",
    "version": "0.0.1",
    "homepage": "https://github.com/archfirst/joinjs",
    "repository": {
        "type": "git",
        "url": "https://github.com/archfirst/joinjs.git"
    },
    "main": "dist/index.js",
    "scripts": {
        "test": "gulp",
        "build": "gulp build",
        "prepublish": "npm run build"
    },
    "dependencies": {
        ...
    },
    "devDependencies": {
        ...
    }
}
When you npm publish, if you don't have an .npmignore file, npm will use your .gitignore file (in your case you excluded the dist folder).
To solve your problem, create a .npmignore file based on your .gitignore file, without ignoring the dist folder.
Source: Keeping files out of your Package
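For example, a sketch of such an .npmignore (what it lists besides dist depends on your actual .gitignore; these entries are only illustrative):
# .npmignore -- copied from .gitignore, but without the dist entry
node_modules
npm-debug.log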
Take a look at the "files" field of the package.json file:
package.json, files
From the documentation:
The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.)
Minimal example of how to use data files from a script
Another common use case is to have data files that your scripts need to use.
This can be done easily by using the techniques mentioned at: How can I get the path of a module I have loaded via require that is *not* mine (i.e. in some node_module)
The full example can be found at:
Source: cirosantilli/linux-kernel-module-cheat/npm/data-files/
Published: cirosantilli-data-files
With this setup, the file mydata.txt gets put into node_modules/cirosantilli-data-files/mydata.txt after installation, because we added it to the files entry of package.json.
Our function myfunc can then find that file and use its contents via require.resolve. It also just works from the executable ./cirosantilli-data-files, of course.
package.json
{
    "bin": {
        "cirosantilli-data-files": "cirosantilli-data-files"
    },
    "license": "MIT",
    "files": [
        "cirosantilli-data-files",
        "mydata.txt",
        "index.js"
    ],
    "name": "cirosantilli-data-files",
    "repository": "cirosantilli/linux-kernel-module-cheat",
    "version": "0.1.0"
}
mydata.txt
hello world
index.js
const fs = require('fs');
const path = require('path');
function myfunc() {
    const package_path = path.dirname(require.resolve(
        path.join('cirosantilli-data-files', 'package.json')));
    return fs.readFileSync(path.join(package_path, 'mydata.txt'), 'utf-8');
}
exports.myfunc = myfunc;
cirosantilli-data-files
#!/usr/bin/env node
const cirosantilli_data_files = require('cirosantilli-data-files');
console.log(cirosantilli_data_files.myfunc());
The is-installed-globally package is then useful if you want to generate relative paths to the distributed files depending on whether they are installed locally or globally: How to tell if an npm package was installed globally or locally
Just don't mention src and dist inside the .npmignore file to get src and dist inside node_modules; that's it.
Another point: if there is a .gitignore file and .npmignore is missing, the contents of .gitignore will be used instead.

How to convert a react app to an npm module? [duplicate]

Is there a way to automatically copy files to wwwroot?

I have my index.html in the directory called wwwroot and it's being accessed from the browser on localhost:5001. When I installed some packages using NPM, the node_modules directory was placed at the same level as wwwroot.
When I'm linking to the files, I use a relative path like this:
href="../node_modules/package_this_or_that/package.min.js"
It seems to me that a better approach would be to have those delivered to the wwwroot directory and have them reside there. Not all the contents of the packages, just the files that are actually being used (skipping readmes etc.).
Is there a package for that? Or is it something that needs to be done using a build script?
This answer recommends using Gulp, which seems dated now.
You are not supposed to access node_modules files from the front end, such as your HTML or cshtml files, so you are right that you should copy them to the wwwroot folder.
You can use Grunt as linked in Tseng's comment, but I personally prefer Gulp; I think it's much quicker and easier to use.
Your package.json file:
{
    "version": "1.0.0",
    "name": "asp.net",
    "private": true,
    "devDependencies": {
        "gulp": "3.9.1",
        "gulp-cached": "1.1.0"
    }
}
Then create a gulpfile.js at your project's root level, and you can write something like:
var gulp = require('gulp'),
    // gulp-cached: if the cached version is identical to the current file, it isn't passed
    // downstream, so that file won't be copied again
    cache = require('gulp-cached');

gulp.task('default', ['copy-node_modules']);

gulp.task('copy-node_modules', function () {
    return gulp.src('node_modules/**')
        .pipe(cache('node_modules'))
        .pipe(gulp.dest('wwwroot/node_modules'));
});
Finally, open the Task Runner Explorer (if you are using Visual Studio) and execute either the default task or the copy-node_modules task directly.
Gulp is very useful, and I suggest you explore other Gulp tasks. You can concatenate and minify both CSS and JS files, remove comments, and even create a watch task that executes other tasks as soon as a file changes.
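If you would rather not depend on Gulp at all, a plain Node script run from an npm script does the same job. The sketch below is illustrative; the file list and the wwwroot/lib destination are assumptions, not taken from the question.
// copy-assets.js -- minimal sketch: copy only the package files you actually reference
const fs = require('fs');
const path = require('path');

const files = [
    'node_modules/package_this_or_that/package.min.js'
];

for (const src of files) {
    const dest = path.join('wwwroot', 'lib', path.basename(src));
    fs.mkdirSync(path.dirname(dest), { recursive: true }); // ensure wwwroot/lib exists
    fs.copyFileSync(src, dest);
    console.log(`copied ${src} -> ${dest}`);
}
Hooked up as a "prebuild" (or "postinstall") script in package.json, it keeps wwwroot up to date without a task runner.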

Requiring compiled ES6 Modules from dist

I have two questions.
Question #1
I'm writing an npm package in ES6 and have the following package.json:
{
    "name": "mypackage",
    "bin": {
        "mybin": "dist/bin/mybin.js"
    },
    "dependencies": [...],
    "devDependencies": [...],
    "directories": {
        "lib": "dist"
    },
    "main": "dist/index.js",
    "scripts": {
        "compile": "./node_modules/.bin/babel ./src --optional runtime --presets es2015,stage-0 -d ./dist",
        "prepublish": "npm run compile"
    }
}
Everything is compiled successfully every time.
I can require the package itself:
var mypackage = require('mypackage');
But I'm not able to require a subfolder of it in the same way:
var constants = require('mypackage/core/constants');
Of course, constants.js has the full path mypackage/dist/core/constants.js,
but I would like to require it without the dist part.
For now, to require constants I have to write it like this:
var constants = require('mypackage/dist/core/constants');
Which doesn't make a lot of sense.
I don't like the approach of using NODE_PATH to solve this issue.
I need a solution that doesn't require extra effort from users to include files from the dist folder.
At the very least, users should not have to rely on the compilation/publishing folder structure; they should not even know anything about it.
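One common workaround (a sketch, not something from the question) is to keep small stub files at the package root that re-export the compiled modules, so consumers never see dist:
// core/constants.js -- a stub kept at the package root (sketch, not from the question);
// it just re-exports the compiled module so the dist prefix stays hidden
module.exports = require('../dist/core/constants');
Consumers can then write require('mypackage/core/constants'). Another route some packages take is to publish from inside dist (copying package.json there during the build), so the compiled tree becomes the package root.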
Question #2
How can I compile all the .es6 files to dist and then copy all the other (non-compiled) files from src to dist?
For example, I have different templates, assets, etc.
I would like the structure of dist to mirror src exactly, with all files included but .es6 compiled to .js.
I know the obvious solution is to copy the entire src to dist and then compile everything from dist to dist, but that doesn't look like a smart way to do it.
On the other hand, I wouldn't like to have to list every single asset/image/template to copy to the dist folder.
Maybe there's a gulp plugin that makes an exact copy from folder to folder while excluding all files with a given extension (or matching a regexp)? (See the sketch after the update below.)
Update #1
#molda
I also have structure inside the modules:
-- src/modules
---- module1
------ static
---- module2
------ static
So keeping all the static files in src/static isn't a solution.
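As far as Question #2 goes, an ordinary glob with a negation pattern already handles this, nested module/static folders included. A minimal sketch, assuming the sources really use the .es6 extension as described (not taken from an actual answer):
// gulpfile.js -- copy everything under src except the .es6 sources, preserving structure
var gulp = require('gulp');

gulp.task('copy-assets', function () {
    return gulp.src(['src/**/*', '!src/**/*.es6'])
        .pipe(gulp.dest('dist'));
});
Run next to the existing compile script, this leaves dist mirroring src, with only the .es6 files replaced by their compiled .js output.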

How can I publish an npm package with distribution files?

