Planning to build a new project on Windows Azure Websites using Node.js, I ran into an issue with the SQL Server module that I just can't resolve.
I'm using the downloaded binary of the module with simple code from the example, connecting to an Azure-hosted SQL Server:
var sql = require('node-sqlserver');
var http = require('http');

var conn_str = "Driver={SQL Server Native Client 10.0};Server=tcp:server.database.windows.net,1433;Database=database;Uid=user@server;Pwd=mypass;Encrypt=yes;Connection Timeout=30;";
var port = process.env.port || 3000;

http.createServer(function (req, res) {
    sql.query(conn_str, "SELECT * FROM testdb.testtable", function (err, results) {
        if (err) {
            res.writeHead(500, { 'Content-Type': 'text/plain' });
            res.write("Got error :-( " + err);
            res.end("");
            return;
        }
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        res.end(JSON.stringify(results));
    });
}).listen(port);
The example works locally on my PC: it is able to connect to Azure SQL and retrieve rows (although I had to downgrade Node to 0.6.19 to make it work), but it fails when I deploy it to Azure. When I try to access the website, after a long wait I receive:
iisnode encountered an error when processing the request.
HRESULT: 0x6d
HTTP status: 500
HTTP reason: Internal Server Error
You are receiving this HTTP 200 response because system.webServer/iisnode/@devErrorsEnabled configuration setting is 'true'.
In addition to the log of stdout and stderr of the node.exe process, consider using debugging and ETW traces to further diagnose the problem.
The node.exe process has not written any information to the stdout or stderr.
I tried to compile the node-sqlserver module from source; again it worked on my PC, with the same result on Azure. I also verified that the module binary gets deployed (I'm using Git to deploy).
Any ideas?
OK, it turned out that the SQL database and the website were in different geo-regions, and despite the "Make available to Azure Services" setting, Azure was not able to connect to a SQL server in a different region. I had to move the database, and then it worked.
I am building a Node.js app and would like to use SQL Server for persistent storage. I have never had problems connecting to MySQL Server before, but I am now getting the following error when attempting to connect to SQL Server:
ConnectionError: Error: [Microsoft][SQL Server Native Client 11.0]SQL Server Network Interfaces: Connection string is not valid [87].
Error: [Microsoft][SQL Server Native Client 11.0]Login timeout expired
Error: [Microsoft][SQL Server Native Client 11.0]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.
Below are my config and connection code using the mssql/msnodesqlv8 module.
const mssql = require('mssql/msnodesqlv8')
const dbConfig = {
    "server": "127.0.0.1",
    "user": "sa",
    "password": "PasswordHere",
    "database": "DBName_Here",
    "driver": "msnodesqlv8",
    "dialect": "mssql",
    "port": 1433,
    "options": {
        "enableArithAbort": true,
        "instanceName": "MSSQLSERVER"
    },
    "connectionTimeout": 15000,
    "pool": {
        "max": 100,
        "min": 0,
        "idleTimeoutMillis": 30000
    }
}
ValidateUser = async (data) => {
    console.log(data)
    return new Promise(async (resolve, reject) => {
        console.log(dbConfig)
        try {
            let pool = await mssql.connect(dbConfig)
            let results = await pool.request().query(`SELECT * FROM Users WHERE Username = ${data.username} AND Password = ${data.password}`)
            results = results.recordSets
            console.log(results)
            if (results.length > 0) {
                resolve(results)
            } else {
                reject('Wrong Username or Password!')
            }
        } catch (err) {
            console.log(err)
            reject('DB Exception: Authentication failed!')
        }
    })
}
I have attempted variations of the config, including replacing the IP with my machine name DESKTOP-GLPFSS\COMPUTERNAME, but to no avail. Any relevant help would be greatly appreciated.
Please check that the settings in SQL Server Configuration Manager are set properly. Follow the steps below to verify and configure them:
Press the Windows key + R to open the Run window.
Type compmgmt.msc in the Open: box.
Click OK.
Expand Services and Applications.
Expand SQL Server Configuration Manager.
Instead of "MSSQLSERVER", look for the instance name you provided at installation time.
I recently deployed my very first Node.js app to AWS Elastic Beanstalk. It is a very simple portfolio page. The site worked with no problem for several hours, but then the instance went to Severe status and the page returned this message:
502 Bad Gateway
nginx/1.12.1
The error message in the log was "First argument must be a string or Buffer".
I restarted the app server, and the page worked for 12 hours with no problem, but then it went down again with the same message. So I started troubleshooting and tried these things:
The Node.js version in Elastic Beanstalk was different than the version used to create my app, so I changed it to the same version the site was created with (8.12.0). Restarted app server. Same problem.
I thought that maybe the load balancer was having trouble reading the response, so I started converting the data sent in the response to a string (.toString()), but that did not help. And it turns out that my configuration does not even have a load balancer.
The Node documentation for fs.readFile said that the readFile method uses a lot of memory and to consider using readStream instead, so I made that change, but I'm getting the same result with readStream.
I rebuilt the environment and tried again. This time the page ran successfully for two days, then it errored again with this message:
Error: ENOENT: no such file or directory, open 'public//hell.php'
events.js:183
throw er; // Unhandled 'error' event
^
I don't use ANY php code. Why is it referencing a php file called "hell"?
Here is my code in the server.js file:
const http = require("http");
const fs = require("fs");

// Use AWS's default port, or if it's not available, use port 8081.
const port = process.env.PORT || 8081;

const server = http.createServer(function (req, res) {
    res.statusCode = 200;
    if (req.url == "/" || req.url == "/index.html" || req.url == "/home") {
        let readStream = fs.createReadStream("public/index.html");
        // When the stream is done being read, end the response
        readStream.on('close', () => {
            res.end();
        })
        // Stream chunks to response
        readStream.pipe(res);
    }
    else {
        let readStream = fs.createReadStream("public/" + req.url);
        // When the stream is done being read, end the response
        readStream.on('close', () => {
            res.end();
        })
        // Stream chunks to response
        readStream.pipe(res);
    }
}).listen(port);
A copy of the "public/index.html" file being read by fs can be found at:
https://zurafuse.github.io/index.html
Does anyone have any idea what I am doing wrong?
I have resolved this issue. It turns out bots frequently hit AWS sites like mine looking for vulnerabilities, and in my case they were trying to open pages that do not exist (like WordPress pages). So I modified my code to serve only pages that I have explicitly defined, and if an HTTP request asks for anything unexpected, I return a "page not found" response. I have not had a problem since. A minimal sketch of that change is below.
Because my site was constantly erroring while trying to open pages that do not exist, it kept crashing my AWS Elastic Beanstalk instance. And since I am on the free tier, it is not scalable at all and so not very forgiving.
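Here is a minimal sketch of that whitelist approach, assuming the same server.js structure as above; the exact file list is illustrative. Handling the stream's 'error' event also prevents a missing file from crashing the process:
const http = require("http");
const fs = require("fs");

const port = process.env.PORT || 8081;

// Only these request paths are ever mapped to files on disk.
const pages = {
    "/": "public/index.html",
    "/index.html": "public/index.html",
    "/home": "public/index.html"
};

http.createServer(function (req, res) {
    const file = pages[req.url];
    if (!file) {
        // Anything not explicitly defined (e.g. a bot probing for hell.php) gets a 404.
        res.writeHead(404, { "Content-Type": "text/plain" });
        res.end("page not found");
        return;
    }
    const readStream = fs.createReadStream(file);
    // Without an 'error' handler, a failed read becomes an unhandled
    // 'error' event and crashes the process.
    readStream.on("error", () => {
        res.writeHead(500, { "Content-Type": "text/plain" });
        res.end("server error");
    });
    readStream.pipe(res);
}).listen(port);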
As a preface, I am new to web development and have never published a site before.
I have built a website which runs fine locally and I want to publish it to the web using Azure.
The site uses a Node.js server which I wrote myself (no Express) and which is connected to an SQLite3 database using the sqlite3 node module.
All I want to do is publish this site. I've tried using Azure to do so, using the Azure command-line tools to create a web app from the Git repo I have for the site.
I have a package.json file pointing to the server.js file, which is the backend for the site; as well as serving the site's files, it also returns data from an SQLite3 database that lives in the site folder. I also have the web.config file from https://github.com/projectkudu/kudu/wiki/Using-a-custom-web.config-for-Node-apps, with the path to the server changed to match mine.
When I try to visit the site all I get is a blank screen, the application log gives this error: Error: SQLITE_CANTOPEN: unable to open database file
at Error (native)
So I'm guessing this means it has a problem with the DB being bundled into the site in this way; if I comment out the database code it loads the site (minus the DB content) just fine. When I try to run the server in the Azure console I get a "Bad Request" error, while running it on my own machine works fine.
My question is basically: how should I go about getting the site up, given these challenges? Is having an integrated DB file completely the wrong approach, or can I make it work? I've played around with creating an Azure DB, but I cannot work out how to get the data from my DB file into it. Are Azure virtual machines the way to go? The advice I read was that they're for more computationally intensive projects, and I'm only hosting a site.
I tried to reproduce your issue on my side and built a simple Node.js server with the sqlite3 module on Azure Web Apps, but it works fine for me. Here are my test steps; you can try following them to fix your issue.
Installing sqlite3 requires node-pre-gyp, i.e. it is a native module, and building native modules is not supported by the deployment task on Azure Web Apps. So we can install the sqlite3 module locally and deploy the node_modules folder to Azure together with the application.
Because the Node.js runtime on Azure is a 32-bit (ia32) build, we need to install locally with the command npm install sqlite3 --target_arch=ia32.
Here is the code from my test:
var http = require("http");

var server = http.createServer(function (request, response) {
    response.writeHead(200, {"Content-Type": "text/html"});
    response.write("<!DOCTYPE html>");
    response.write("<html>");
    response.write("<head>");
    response.write("<title>Hello World Page</title>");
    response.write("</head>");
    response.write("<body>");

    var sqlite3 = require('sqlite3').verbose();
    var db = new sqlite3.Database('test.db');
    var data = [];

    db.serialize(function () {
        db.run("CREATE TABLE IF NOT EXISTS lorem (info TEXT)");

        var stmt = db.prepare("INSERT INTO lorem VALUES (?)");
        for (var i = 0; i < 10; i++) {
            stmt.run("Ipsum " + i);
        }
        stmt.finalize();

        db.each("SELECT * FROM lorem", function (err, row) {
            data.push(row);
        }, function () {
            response.write(JSON.stringify(data));
            response.write("</body>");
            response.write("</html>");
            response.end();
        });
    });

    db.close();
});

server.listen(process.env.PORT || 1337);
If you have any further concerns, please feel free to let me know.
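One more thing worth checking, as a guess about the original SQLITE_CANTOPEN error: on Azure the working directory may not be the folder that contains server.js, so a relative database path can point at a location the app cannot open. Resolving the path against __dirname rules that out; test.db is just the placeholder file name from the sample above:
var path = require('path');
var sqlite3 = require('sqlite3').verbose();

// Resolve the database file relative to this script rather than the
// process working directory, which may differ on Azure Web Apps.
var dbPath = path.join(__dirname, 'test.db');
var db = new sqlite3.Database(dbPath, function (err) {
    if (err) {
        console.error('Failed to open ' + dbPath + ': ' + err.message);
    }
});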
I have a Node server deployed to Azure and am using Edge.js to make some WCF calls. When I host the server locally, everything works great because I can put the .NET config for the web service calls in node.exe.config located next to my local node.exe, as recommended here.
However, this does not seem feasible for a Node server hosted on Azure. Azure doesn't seem to let users arbitrarily put files on the file system of whatever machine the user's server happens to be running on (which is completely reasonable). Is there a way I can tell Edge or Node to look for the config in a different location?
First, we should check whether your WCF service can be reached from the public network via its endpoint. If it can, we can call it directly from Node.js.
For example, I have a WCF REST service endpoint like "http://IP:port/Service.svc/Select?type=-1", and here is my code snippet for calling this service from Node.js:
router.get('/wcfCall', function (req, res, next) {
    var request = require('request');
    var r = request('http://IP:port/Service.svc/Select?type=-1');
    r.on('data', function (data) {
        console.log('decoded chunk: ' + data)
    })
    .on('response', function (response) {
        response.on('data', function (data) {
            console.log('received ' + data.length + ' bytes of compressed data')
        })
    })
})
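If the goal is to return the service's data to the caller rather than just log it, the request stream can be piped straight into the Express response. This is only a sketch using the same placeholder endpoint as above:
var express = require('express');
var request = require('request');
var router = express.Router();

router.get('/wcfCall', function (req, res, next) {
    // Stream the WCF REST response directly back to the client,
    // passing any network error to the Express error handler.
    request('http://IP:port/Service.svc/Select?type=-1')
        .on('error', function (err) {
            next(err);
        })
        .pipe(res);
});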
I am trying to make an external request from my custom Azure Mobile Services web API.
I know that it runs Node.js, so I searched for a solution... I came up with this:
exports.get = function (request, response) {
    var http = require('http');
    http.get("YOURSITE", function (res) {
        console.log("Got response: " + res.statusCode);
    }).on('error', function (e) {
        console.log("Got error: " + e.message);
    });
    response.send(statusCodes.OK, { message: 'YOURMESSAGE' });
};
I get the following message: connect EACCES, which means that I don't have the right permissions... Besides this, I went to the Configure tab of my mobile service and added MYSITE under the cross-origin resource sharing settings, but I was pretty sure it wouldn't help...
Do you have any suggestions?
I wasn't paying attention and put in my local IP address, which is why I didn't have access. This works fine with external IPs.