I am using a C# test project. I wish to load an XML file that is available inside the project under a folder named Dump. I am able to do:
string path = @"C:\APP\FrameworkTest\TestProject\Dump\GetAddressById.xml";
but I don't want to use it like this, because if the drive changes my code will fail.
In ASP.NET we have something like Server.MapPath(). Is there something like this?
For example:
// Requires System.IO and System.Reflection.
var dir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
var path = Path.Combine(dir, "Dump", "GetAddressById.xml");
Hope this helps.
If you know that the Dump folder will always be present in the deployment folder of your application, then you certainly don't need to hard-code the full path.
For ASP.NET:
var path = System.IO.Path.Combine(Server.MapPath("/"), "Dump", "GetAddressById.xml");
For C#:
var path = System.IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Dump", "GetAddressById.xml");
Related
I wanted to find where my .exe file is when it runs.
For example, its location is C:\User and you need to know this when the app starts.
string path = System.AppContext.BaseDirectory;
With this code you are able to find where your .exe file is running from.
😉
I have a single directory with a few million JSON files in it. I ultimately want to iterate over each file in the directory, read it, do something with the information, and then write something into a database.
My script works perfectly when I use a test directory with a few hundred files. However, it stalls when I use the real directory. I strongly believe that I have pinpointed the problem to the use of:
fs.readdirSync('my dir path')
Converting this to the async version would not help, since I need the file names before anything else can happen anyway. However, my belief is that this operation hangs because it simply "takes too long" to read the entire directory.
For reference here is a broader portion of the function:
var fs = require('fs');

function traverseFS() {
    var path = 'my dir name and path';
    var files = fs.readdirSync(path);
    for (var i in files) {
        var currentFile = path + '/' + files[i];
        var fileText = fs.readFileSync(currentFile, 'utf8');
        var json = JSON.parse(fileText);
        if (json) {
            // do something
        }
    }
}
My question is one of the following:
Is there something I can do to get this to work using readdirSync?
Is there another operation I should be using instead?
You would need to either use a child process (easiest) that creates a directory listing and parse that, or write your own streamable binding to scandir() (on *nix) and/or whatever the equivalent is on Windows and use that. For the latter, you may want to use the libuv code (*nix, Windows) as a guide.
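As a rough illustration of the child-process route, here is a minimal sketch that shells out to find on a *nix system and handles file names as they stream in, instead of waiting for readdirSync to build the whole list in memory. The directory path and the onFile callback are placeholders, not part of the original question.
// Sketch only: stream a directory listing from a child process (assumes *nix `find`).
var spawn = require('child_process').spawn;
var readline = require('readline');
var fs = require('fs');

function traverseWithFind(dirPath, onFile) {
    // List regular .json files directly under dirPath, one path per line.
    var finder = spawn('find', [dirPath, '-maxdepth', '1', '-type', 'f', '-name', '*.json']);
    var rl = readline.createInterface({ input: finder.stdout });

    rl.on('line', function (filePath) {
        // Handle each file as its name arrives, instead of loading millions of names first.
        var json = JSON.parse(fs.readFileSync(filePath, 'utf8'));
        if (json) {
            onFile(json);
        }
    });

    rl.on('close', function () {
        console.log('Finished listing ' + dirPath);
    });
}
The readFileSync call is kept from the original code for simplicity; it could be swapped for the async version once the listing itself no longer blocks.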
I've got a task. It is not hard, but I'm having some trouble, and maybe someone has already run into a similar problem.
My task is to write a folder, including its files and subfolders, into a zip archive using Node.js.
I am trying to use the adm-zip package.
My project folder structure is an Archive folder containing filesFolder. My code:
var AdmZip = require('adm-zip');

let Archive = new AdmZip();
Archive.addLocalFolder('Archive/filesFolder', '');
Archive.writeZip('Archive/newArchive.zip');
I should get an archive containing filesFolder, but instead I get an archive with an Archive folder, and inside it filesFolder.
Does anybody know how to archive only the target folder, and not the whole chain of parent folders?
What happens is that you are working with the Archive/filesFolder path, and that means the zip ends up including the Archive folder and, inside that, filesFolder.
For testing purposes, change the value passed to writeZip() to just filesFolder.zip; it should zip the contents of Archive as filesFolder.zip in the current working directory. See below (you can copy/paste this bit of code and run it, and it should work).
var zip = new AdmZip();
// I don't know why you provided a second argument to addLocalFolder; I removed it.
// There was nothing about it in the documentation.
zip.addLocalFolder('./Archive');
// Write the archive as filesFolder.zip in the current working directory.
zip.writeZip('filesFolder.zip');
The above should output the contents of Archive to the current working directory as filesFolder.zip.
I did mention this in my comment, and your comment seems to indicate that you have a path issue, so try the above script.
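For what it's worth, if the goal is an archive whose root contains filesFolder itself (rather than the Archive parent), addLocalFolder also accepts a second zipPath argument naming the folder inside the zip. This is a hedged sketch based on that argument, using the same illustrative paths as above:
var AdmZip = require('adm-zip');

var zip = new AdmZip();
// Put the contents of Archive/filesFolder under a "filesFolder" entry at the zip root.
// (The zipPath second argument may not be documented in every adm-zip version.)
zip.addLocalFolder('Archive/filesFolder', 'filesFolder');
zip.writeZip('newArchive.zip');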
Hi, I am building a Mozilla extension with the CFX tool, and I have used a ChromeWorker in it. It works fine when I run the cfx run command, but when I build a package using cfx xpi, the ChromeWorker file is not included in the XPI package.
I am using this to create the worker thread.
var tworker = new ChromeWorker("chrome://addons/content/t_worker.js");
My t_worker.js file is in addons/lib.
I have also put a chrome.manifest file in the package that contains:
content addons ./resources/addons/lib/
Please tell me the possible reason for this problem and how to fix it.
Try moving your file into the data folder, then do:
const self = require('sdk/self');
var tworker = new ChromeWorker(self.data.url('t_worker.js'));
I'm not totally sure of this syntax; I just typed it off the top of my head.
I had put that worker file in lib, and that's why it wasn't working. I moved the file to the data folder and changed my chrome.manifest to: content addons ./resources/addons/data/ (previously it was lib/). It's working fine now. Thanks to @Noitidart for the suggestion of putting it in the data folder.
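To summarize the working setup described above as a small sketch (assuming the file layout from the question, with t_worker.js living in the add-on's data folder):
// chrome.manifest: map the "addons" content package to the data folder.
//   content addons ./resources/addons/data/
// lib/main.js: create the worker from the matching chrome:// URL.
var tworker = new ChromeWorker("chrome://addons/content/t_worker.js");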
I want to split the code into different files. I currently write all get and post methods in the same file, but I want greater readability and manageability.
I've tried putting the code in different files, but when running the main app, the GET and POST methods in the other files cannot be called.
I include this:
var Db = require('/filename.js');
// ...but I can't call those methods.
I want to split my single file code for readability. How do I achieve this?
Just have a look at the module documentation:
A path starting with / is resolved as an absolute path, e.g.:
require('/home/user/module.js');
A path starting with ./ is resolved relative to the directory of the calling file:
require('./lib/module.js');
__dirname has the same effect as ./:
require( __dirname + '/module.js');
Try:
var Db = require('./filename.js');
or, if filename.js is installed as a package under node_modules:
var Db = require('filename.js');
Also, have a look at this blog post.
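Once the path is right, the other file also has to export whatever the main app needs to call; otherwise require() returns an empty object. Here is a minimal sketch with hypothetical file and method names:
// db.js — hypothetical module holding helpers used by the GET/POST handlers.
function getUser(id) {
    // ... look the record up somewhere
    return { id: id, name: 'example' };
}

// Whatever is attached to module.exports is what require() returns.
module.exports = {
    getUser: getUser
};

// app.js — require db.js relative to this file and call the exported method.
var Db = require('./db.js');
console.log(Db.getUser(42).name);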