What Visual Studio Code extension API should be used to add folders and move files in a Visual Studio Code workspace? - node.js

I want to write an extension that will:
Read the folders and files in the current workspace
Reorganise those files by creating new folders and moving the files into them
So I need an API to do the above. I am not clear on whether I should use
the standard fs node module,
the File System Provider or
the workspace namespace.
I read the answer here, but it doesn't say what to use to create folders and move files within the workspace.

A FileSystemProvider is to be used if you want to serve files from non-local storage, like FTP sites or virtual file systems on remote devices, and present them to Visual Studio Code as storage directories. The FileSystemProvider is the view/controller of that remote storage; in other words, you have to implement all the file manipulation operations yourself by communicating with the remote storage.
If you want to just manipulate the files in the current workspace and also be able to use URIs from FileSystemProviders, use vscode.workspace.fs.
You can also use the Node.js fs module, but that only handles local-disk workspaces (URIs with the file: scheme). I recommend using the synchronous versions of the fs methods; I had some trouble with the asynchronous fs methods in Visual Studio Code (I did not know about vscode.workspace.fs at the time).
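For instance, here is a minimal sketch of the reorganisation described in the question using only vscode.workspace.fs (the folder name 'sorted' and the move-every-file logic are made up for illustration; rename doubles as the move operation):

const vscode = require('vscode');

// A sketch, assuming a workspace folder is open
async function reorganise() {
  const root = vscode.workspace.workspaceFolders[0].uri;
  // readDirectory returns [name, FileType] pairs
  const entries = await vscode.workspace.fs.readDirectory(root);
  // Create the new folder ('sorted' is a made-up name)
  const target = vscode.Uri.joinPath(root, 'sorted');
  await vscode.workspace.fs.createDirectory(target);
  for (const [name, type] of entries) {
    if (type === vscode.FileType.File) {
      // rename() is also how you move a file with vscode.workspace.fs
      await vscode.workspace.fs.rename(
        vscode.Uri.joinPath(root, name),
        vscode.Uri.joinPath(target, name),
        { overwrite: false }
      );
    }
  }
}

Because these calls go through the URI's registered provider, the same code works for local folders and for remote schemes.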

I found extension.ts, Microsoft's sample code for implementing a FileSystemProvider.
The steps below show how to create a new folder (createDirectory) and move files within the workspace (copy all the files in the old folder to the new folder with the copy command, then use the delete command if you don't want the files to remain in the old folder).
The first step in FileSystemProvider is to register the filesystem provider for a given scheme using registerFileSystemProvider:
const memFs = new MemFS();
context.subscriptions.push(
  vscode.workspace.registerFileSystemProvider('memfs', memFs, {
    isCaseSensitive: true
  }));
Next, register a command with registerCommand to perform your FileSystemProvider operations such as readDirectory, readFile, createDirectory, copy, and delete.
context.subscriptions.push(
  vscode.commands.registerCommand('memfs.init', _ => {
    // TODO: add the required functionality
  }));
Read folders and read files
for (const [name] of memFs.readDirectory(vscode.Uri.parse('memfs:/'))) {
  memFs.readFile(vscode.Uri.parse(`memfs:/${name}`));
}
Create the new folder
memFs.createDirectory(vscode.Uri.parse(`memfs:/folder/`));
Move files to the newly created folder. There is no dedicated move command, but rename can serve as a move (see the sketch after the DELETE example), or you can copy files to the new folder with the copy command and then delete the originals:
COPY
copy(source: Uri, destination: Uri, options: {overwrite: boolean})
DELETE
context.subscriptions.push(vscode.commands.registerCommand('memfs.reset', _ => {
  for (const [name] of memFs.readDirectory(vscode.Uri.parse('memfs:/'))) {
    memFs.delete(vscode.Uri.parse(`memfs:/${name}`));
  }
}));
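As mentioned above, rename doubles as a move; a minimal sketch with made-up URIs:

memFs.rename(
  vscode.Uri.parse('memfs:/old-folder/file.txt'),
  vscode.Uri.parse('memfs:/folder/file.txt'),
  { overwrite: true }
);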

Related

How do you create a folder and write to Heroku's ephemeral storage?

I'm using Node.js, and when my script uses fs.mkdir nothing seems to happen... it works well locally. Is there an alternate command/function I can use to create and write to folders in Heroku's file system?
(Yes, I'm aware the ephemeral storage is temporary; in my use case, all files will be deleted after 5 minutes.)
You can use a tmp folder (such as /tmp, used below), which is where you can write to and read files from on Heroku. The first line of code below saves data to a specified file path inside that folder; the second line creates a read stream for that file:
fs.writeFileSync(`/tmp/${filename}.json`, dataToSave)
const fileStream = fs.createReadStream(`/tmp/${filename}.json`)
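Note that fs.mkdir is asynchronous and reports failures only through its callback, which can make it look like nothing happened. A minimal sketch that creates a folder under /tmp and writes into it (the folder and file names are made up):

const fs = require('fs');

// recursive: true creates missing parents and does not error if the folder exists (Node 10+)
fs.mkdirSync('/tmp/my-app', { recursive: true });
fs.writeFileSync('/tmp/my-app/data.json', JSON.stringify({ saved: true }));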

Location for SSH private key and temporary SFTP download data in Azure functions

I am writing an Azure function that uses the WinSCP library to download files using SFTP and upload the files to blob storage. This library doesn't allow getting files as a stream; the only option is to download them locally. My code also uses a private key file. So I have two questions.
sessionOptions.SshPrivateKeyPath = Path.GetFullPath("privateKey2.ppk");
works locally. I have added this file to the solution with the "copy to output" option and it works. But will it work when the Azure function is deployed?
While getting the files I need to specify a local path where the files will be downloaded:
var transferResult = session.GetFiles(
    file.FullName, Path.GetTempPath() + @"SomeFolder\" + file.Name, false,
    transferOptions);
The second parameter is the local path.
What should I use in place of Path.GetTempPath() that will work when the Azure function is deployed?
For the private key, just deploy it along with your function project. You can simply add it to your VS project.
See also Including a file when I publish my Azure function in Visual Studio.
For the download: The latest version of WinSCP already supports streaming the files. Use the Session.GetFile method.
To answer your question about the temporary location, see:
Azure Functions Temp storage.
Where to store files for Azure function?

Possible to create file in sources directory on Azure DevOps during build

I have a node script which needs to create a file in the root directory of my application before the application builds.
The data this file will contain is specific to each build that gets triggered; however, I'm having no luck on Azure DevOps in this regard.
For the writing of the file I'm using fs.writeFile(...), something similar to this:
fs.writeFile(targetPath, file, function(err) { // removed for brevity });
However, this throws an exception:
[Error: ENOENT: no such file or directory, open '/home/vsts/work/1/s/data-file.json']
Locally this works. I'm assuming this has to do with permissions; however, even after adding a blank version of this file to my project, it still throws this exception.
Possible to create file in sources directory on Azure DevOps during build
The answer is yes. This is a fully supported scenario in Azure DevOps Services if you're using a Microsoft-hosted Ubuntu agent.
If you hit this issue when using a Microsoft-hosted agent, it is more likely a path issue. Please check:
The function the no such file or directory error comes from. Apart from fs.writeFile, do you also use fs.readFile in the xx.js file? If so, make sure the two paths are the same.
The structure of your source files and your real requirements. According to your question you want to create the file in the sources directory /home/vsts/work/1/s, but your first line indicates that you actually want to create the file in the root directory of your application.
1) If you want to create the file in the sources directory /home/vsts/work/1/s:
In your JS file, use a relative targetPath like './data-file.json', and make sure you run node xx.js from the sources directory (leave the CMD/PowerShell/Bash task's working directory blank).
2) If you want to do that in the root of the application folder, like /home/vsts/work/1/s/MyApp:
In your JS file, use __dirname, as in fs.writeFile(__dirname + '/data-file.json', file, function(err) { // removed for brevity }); and fs.readFile(__dirname + '/data-file.json', ...).
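For example, a minimal sketch of option 2 (the payload is made up; data-file.json is the file name from the question):

const fs = require('fs');
const path = require('path');

// __dirname is the directory containing this script, so the result does not
// depend on which working directory the pipeline task happens to use
const targetPath = path.join(__dirname, 'data-file.json');
fs.writeFile(targetPath, JSON.stringify({ generatedAt: Date.now() }), function(err) {
  if (err) throw err;
});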

IFileProvider Azure File storage

I am thinking about implementing the IFileProvider interface with Azure File Storage.
What I am trying to find in the docs is whether there is a way to send the whole path to a file to the Azure API, like rootDirectory/sub1/sub2/example.file, or whether that should be mapped to some recursive function that takes the path and traverses the directory structure on file storage.
I just want to make sure I am not missing something and reinventing the wheel for something that already exists.
[UPDATE]
I'm using the Azure Storage Client for .NET. I would not like to mount anything.
My intention is to have several IFileProviders which I could switch based on the environment and other conditions.
So, for example, if my environment is Cloud, I would use the IFileProvider implementation that uses Azure File Services through the Azure Storage Client. If my environment is MyServer, I would use the server's local file system. A third option would be the environment someOther with its particular implementation.
Now, for all of them, IFileProvider operates with a path like root/sub1/sub2/sub3. For Azure File Storage, is there a way to send the whole path at once to get sub3 info/content, or should the path be broken into individual directories, getting a reference/content for each step?
I hope that clarifies the question.
Now, for all of them, IFileProvider operates with a path like root/sub1/sub2/sub3. For Azure File Storage, is there a way to send the whole path at once to get sub3 info/content or should the path be broken into individual directories and get reference/content for each step?
To access a specific subdirectory nested several levels deep, you can use the GetDirectoryReference method to construct the CloudFileDirectory, as follows:
var fileshare = storageAccount.CreateCloudFileClient().GetShareReference("myshare");
var rootDir = fileshare.GetRootDirectoryReference();
var dir = rootDir.GetDirectoryReference("2017-10-24/15/52");
var items = dir.ListFilesAndDirectories();
To access a specific file under a subdirectory, you can use the GetFileReference method to get a CloudFile instance, as follows:
var file = rootDir.GetFileReference("2017-10-24/15/52/2017-10-13-2.png");

Where should I store cache of a custom CLI npm module?

I am developing an npm module that the user interacts with through a terminal by executing commands:
> mymodule init
> mymodule do stuff
When executing certain commands, the user is asked for some data that the module will use. Since this data won't really change while using the module, and since these commands can be executed pretty frequently, it is not ideal to ask the user for the data every time they run a command. So I've decided to cache this data, and since it should live through multiple module calls, the easiest way I see to store it is a file (the data structure allows storing it as simple JSON). But I am not quite sure where this file should go on a user's machine.
Where in the file system should I store a cache, in the form of one or more files, for a custom npm module, considering that the module itself can be installed globally, on multiple operating systems, and can be used in multiple projects at the same time?
I was thinking about storing it in the module's folder, but that might be tricky in the case of global installation plus multi-project use. My second idea was to store it in OS-specific tmp storage, but I am not sure about that either. I am also wondering whether there are alternatives to file storage in this case.
I would create a hidden folder in the user's home directory (starting with a dot). For instance, /home/user/.mymodule/config.cfg. The user's home directory isn't going anywhere, and the dot will make sure it's out of the user's way unless they go looking for it.
This is the standard way that most software stores user configs, including SSH, Bash, Nano, Wine, Ruby, Gimp, and even NPM itself.
On some systems you can cache to ~/.cache by creating a subdirectory there to store your cache data, though it's much more common for applications to create a hidden directory in the user's home directory. On modern Windows machines you can create a directory under C:/Users/someUser/AppData. On Windows, a leading dot will not hide a file. I'd recommend you do something platform-agnostic, like so:
var fs = require('fs'); // needed for fs.mkdir below
var path = require('path');

function getAppDir(appName, cb) {
  var plat = process.platform;
  // Windows keeps the home path in USERPROFILE; Unix-likes use HOME
  var homeDir = process.env[(plat == 'win32') ? 'USERPROFILE' : 'HOME'];
  var appDir;
  if (plat == 'win32') {
    appDir = path.join(homeDir, 'AppData', appName);
  } else {
    appDir = path.join(homeDir, '.' + appName);
  }
  fs.mkdir(appDir, function(err) {
    // Treat "already exists" as success so repeated calls still work
    if (err && err.code !== 'EEXIST') return cb(err);
    cb(null, appDir);
  });
}
Just declare a function to get the app directory. This should handle most systems, but if you run into a case where it does not, it should be easy to fix because you can just add some alternate logic there. Let's say you later want to allow the user to specify a custom location for app data in a config file; you could easily add that logic. For now this should suit most of your cases on nearly all Unix/Linux systems and on Windows Vista and up.
If you store in the system temp folder then, depending on the system, your cache could be lost on an interval (cron) or on reboot. Using the global install path would lead to some issues. If you need this data to be unique per project, you can extend that functionality to store the data in the project root, but not the module root. It would be best not to store it in the module root, even if the module is just installed as a local/project module, because then the user can't include this folder in their repositories without including the entire module.
So in the event that you need to store this cached data relevant to a project, do so in the project root, not in node_modules. Otherwise store it in the user's home directory in a system-agnostic way.
First you need to know what kind of OS you are running on:
Your original idea is not bad, because global modules are not really global on every OS and in every virtual environment.
Using /home/user may not work on Windows. On Windows you have to check process.env.HOMEPATH.
I recommend a chain of checks to determine the best place (see the sketch after this list):
Let the user take control. Choose your own env variable, say MYMOD_HOME. First check whether process.env.MYMOD_HOME exists, and use it.
Check the Windows-standard process.env.LOCALAPPDATA.
Check the Windows-standard process.env.HOMEPATH.
Check whether '/home/user' or '~' exists.
Otherwise use __dirname.
In all cases, create a .mymodule directory there.
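A minimal sketch of that chain of checks (MYMOD_HOME is the example variable from above, and .mymodule follows the hidden-directory convention from the earlier answer):

const path = require('path');

// First match wins: user override, then the Windows locations,
// then the Unix home directory, then the module's own directory
function resolveConfigDir() {
  const base = process.env.MYMOD_HOME
    || process.env.LOCALAPPDATA
    || process.env.HOMEPATH
    || process.env.HOME
    || __dirname;
  return path.join(base, '.mymodule');
}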
