I'm running an application on Bluemix using Node-RED. I added a debug node to output the complete msg object, but it is being truncated in the debug console. How can I see the complete object?
You can have the output sent to the console as well as the debug sidebar by ticking the corresponding checkbox in the debug node's configuration. The whole object will be sent to the console.
The current debug tab always truncates long messages, although there are plans to possibly add a separate debug window that could show the whole message. Also have a look in settings.js: I believe the character limit that triggers truncation is set there, so if it is too short for you, you can increase it.
EDIT:
I'd missed the bluemix tag earlier. To view the console log you need to use the cf command to tail the output. For example, for an app called node-red you would run the following:
cf logs node-red
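If you only want to dump the most recent output rather than keep a live tail open, the cf CLI also supports a --recent flag (assuming a reasonably recent CLI version):

cf logs node-red --recent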
Not sure if you can access your settings.js file, but if you do, look for the debugMaxLength property and set it to a larger number. It will display more of your debug info.
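For reference, the relevant entry in settings.js looks roughly like the sketch below; the default limit is commonly 1000 characters, but check your own file, and the value 10000 here is just an example:

module.exports = {
    // Maximum number of characters the debug sidebar shows per message
    // before truncating (increase to see more of each msg object)
    debugMaxLength: 10000,
};

Node-RED needs to be restarted before a change to settings.js takes effect.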
This also applies to the Node-RED add-on for Home Assistant. After changing the debugMaxLength value in config/node-red/settings.js, you need to restart the add-on.
I am writing tests using pytest, and if my validation fails it generates a lot of logs, which is needed. I am using Python's standard logging module to log these errors to a file using the FileHandler class.
I am using pytest-html to generate the html report.
Everything is working as expected, except that pytest also writes the log messages to the terminal, which takes more time than the actual test execution. (Imagine printing around 200k log messages to the terminal.)
Is there any option in pytest that would disable writing logs to the terminal but still capture them and make them available in the HTML report? It would still be desirable to show the exceptions or the assertion errors if possible.
You might be able to configure the output using the --show-capture option.
From pytest --help:
--show-capture={no,stdout,stderr,log,all}
Controls how captured stdout/stderr/log is shown on failed tests. Default is 'all'.
If you want finer-grained control over what gets filtered out, you might need to look into the logging configuration itself (change the captured log level, etc.).
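For example, you could try something along these lines (report.html is just an example name for pytest-html's output; as far as I can tell --show-capture only changes what is echoed to the terminal for failed tests, so the captured logs should still show up in the HTML report, but verify that against your setup):

pytest --show-capture=no --html=report.html

Or make it the default in pytest.ini:

[pytest]
addopts = --show-capture=no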
I'm able to successfully call my functions and make them do what I want them to do. The problem is that the logs don't appear to be saved anywhere, and I don't see how I can view them, which I'll want to do in the event of an error. As a test I have my working function just doing a log.Info as soon as it's called. When testing locally it prints the message to the console. I believe I've enabled everything correctly, but let me explain what I've done in case I didn't.
In my App Service, under Monitoring -> Diagnostic Logs, I have enabled everything: Application Logging (Filesystem) set to verbose, Application Logging (Blob) set to verbose (with the storage location configured), detailed error messages, and failed request tracing turned on.
In my function, I'm using the TraceWriter object that's passed to my Run method (I started from a template).
Please note that functions are set to require authentication. If I click on the "Monitor" tab nothing appears. It just says "Loading..." forever and there's no information. Perhaps this is because of the authentication?
I used the Azure Storage Explorer to browse to my blob. The "log" blob exists, and I do see a set of nested directories that lead up to now. However, it just contains a 354-byte file with a few lines of some random info. This file never seems to update or get larger.
I used FTP to try and browse to where the logs might be, but there's no directory on there that contains any log files.
I also went to KUDU for my function app ({myfunctionapp}.scm.azurewebsites.net/azurejobs/#/functions). While I do see that my function was called successfully, I don't see anything from the call to log.Info anywhere.
I tried using a different logger, and as a test did: System.Diagnostics.Trace.TraceError("test error");
I also don't see this message appearing anywhere.
Am I missing something as far as setup goes? Is the problem the fact that I require authentication? If it's the latter, is there still a way to view logs? I definitely have to have auth enabled. Thanks. And if it helps, below are links to what my settings and the monitor tab look like.
Settings: https://postimg.org/image/u57m2xbl5/
Monitor: https://postimg.org/image/uou10arch/
Authentication should not cause any problems with logging, and log.Info should work out of the box; no setup is required.
I highly recommend that you enable AlwaysOn for your dedicated function app. The long loading of the Monitor tab could be because your site is in a 'cold' state, where it takes longer to start up.
If you go to {myfunctionapp}.scm.azurewebsites.net/DebugConsole and navigate to LogFiles/Application/Functions do you see any expected logs there? Also, when you run a function from the portal do you see logs in the log window?
The same thing happened to me when I had Fiddler open. Close Fiddler and all is good.
I would like to know if anyone has suggestions for how to fix my broken debug node. So far I've done the following:
1) I have set up an inject node with a repeating message that sends the string "Repeated Message" every second. I've also connected this node to a debug node. When I deploy with this configuration (making sure that the debug node's deactivation toggle is not selected), I see nothing in the debug output.
2) I've also set up a Twitter input node linked to my account that searches for basic keywords in tweets as they are sent. I've connected it to both a debug node and a Twitter output node, which publishes the tweets containing my desired keyword. It's strange because the debug node does not send any info to the debug tab after being deployed, yet my Twitter account is constantly publishing each tweet that contains my chosen keyword.
I'd appreciate any help pointing to why the debug node is not working.
To confirm, you have the debug node enabled, such that its side tab is green as shown in the pic below?
If it is white, it will not show the data in your debug window.
I had the same situation. In my case I had set USERID and PASSWORD authentication in the IoTF environment variables. Once I removed the user ID and password from my environment variables, the debug node started working again.
I hope this can help you.
Best Regards
If you are behind a proxy, add your application's address to the exceptions in your web browser's proxy settings. It worked for me; I had the same issue as you did while using Firefox.
I deployed a Node.js app on Google App Engine following this tutorial: https://github.com/GoogleCloudPlatform/appengine-nodejs-quickstart. It was successful, and now I want to check the logs of the Node.js server, as I would in development from the terminal console. The VMs are managed by Google, but even if I SSH into them I don't know where to look for the logs.
You can read the stdout of the Docker container that your app runs in by doing docker logs <container id> on the VM instance. You can get the container id from docker ps.
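For example, from an SSH session on the VM (docker ps lists the running containers so you can find your app's container id; -f keeps following new output):

docker ps
docker logs -f <container id>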
No need to SSH into the instance though. You can simply fetch the logs from the Developers Console under Monitoring > Logs.
As @tamberg mentioned in a comment, the easiest option I've found for looking at logs produced by Google App Engine instances running Node.js is to simply use the log viewer at:
https://console.cloud.google.com/logs/viewer?resource=gae_app
Detailed instructions from https://cloud.google.com/appengine/docs/standard/nodejs/building-app/viewing-service-logs are as follows:
Open the Logs Viewer in the GCP Console: https://console.cloud.google.com/logs/viewer?resource=gae_app
In the first filter dropdown at the top of the page, ensure GAE Application is selected, and choose Default Service.
Use the second filter dropdown to select only stdout and click OK. This limits the viewer to logs sent to standard output.
Use the text field above the dropdowns to search for the name you used when you submitted your form. You should see the logs corresponding to your submissions.
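If you would rather stay in the terminal, the gcloud CLI can also read these logs. A minimal example, assuming the Cloud SDK is installed, authenticated against your project, and your app runs in the default service:

gcloud app logs tail -s default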
The default logging is really awful. None of my console.log messages show up! There are a few ways you can fix this.
1) Write logs to a log file.
For example, /var/log/app_engine/custom_logs/applogs.log (see the sketch after this list).
https://cloud.google.com/appengine/articles/logging
"Cloud Logging and Managed VMs apps Applications using App Engine
Managed VMs should write custom log files to the VM's log directory at
/var/log/app_engine/custom_logs. These files are automatically
collected and made available in the Logs Viewer. Custom log files
must have the suffix .log or .log.json. If the suffix is .log.json,
the logs must be in JSON format with one JSON object per line. If the
suffix is .log, log entries are treated as plain text."
2) Use winston with winston-gae.
Create a transport that will send the logs to App Engine.
3) Use the gcloud-logging module
Too verbose for my liking, but it is another option.
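As a minimal sketch of option 1, assuming the custom log directory exists and is writable from your app (the file name applogs.log and the helper name logToFile are just examples, not anything App Engine requires):

// logToFile.js: append plain-text log lines to App Engine's custom log directory
const fs = require('fs');
const path = require('path');

// Files in this directory are picked up by the Logs Viewer;
// the suffix must be .log for plain text (or .log.json for JSON lines).
const LOG_FILE = path.join('/var/log/app_engine/custom_logs', 'applogs.log');

function logToFile(message) {
  const line = new Date().toISOString() + ' ' + message + '\n';
  // appendFile creates the file on first write if it does not already exist
  fs.appendFile(LOG_FILE, line, (err) => {
    if (err) console.error('could not write log file:', err);
  });
}

module.exports = logToFile;

// usage: require('./logToFile')('something worth logging');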
Is there a way, through browsers like Firebug or another browser plugin, to do traces or log to the console from a CFC file?
I'm completely new to CF so sorry if this seems like a stupid question.
If you want logs to be visible in the browser, ColdFire is your best choice. With it, you can see all of ColdFusion's extended debugging information, even on a production site. Unless you have the proper authentication via ColdFire, the server won't spit out the extended info.
As @gillesc recommended, you can use LogBox, which is extracted from the ColdBox framework. The ColdBox framework has a debugging mode that allows you to trace messages to the bottom of the page or to a separate window. This is useful even on production sites, since you can observe the tracer output from other users.
Finally, you can simply print to the console using writeDump(var="my log message", output="console") for quick debugging, or use the <cflog> tag to save log messages to a named log file, which you can monitor using tail. For a dead simple solution, you can save the log file to the root of your site and simply press F5 to see the new log entries; however, I do not recommend this practice (unless you are saving credit card information and share that file with me :).
Hope this reply helps.
Aaron
There is a cftrace tag that will allow you to log output to the console, among other spots in your application and development environment.
<cftrace category="init data" type="Information" var="myvartooutput" />
Calling this tag will output the relevant content in a few places:
The console in ColdFusion Builder, if you are using that IDE
In Dreamweaver, the Adobe docs mention a server debug tab/view (I don't use DW, so am not sure)
At the end of the request in the debug output
In cftrace.log, which is in your log directory (/COLDFUSION/INSTALL/DIR/logs/cftrace.log)
You can also use the cflog tag to write data to one of the standard log files, or you can have it write the desired data to a custom log file.
<cflog file="customlog" application="no" text="Output #somevar#!" />
If "customlog" does not exist, CF will create it for you (in the same location noted above).
Hope that helps!
EDIT: I offered this more as an alternative to using Firebug, in case you want the logs/traces but are not necessarily wed to a browser plug-in.
If you've got CF Builder you can actually set up a debugger, but it's terribly slow. Here's the documentation on that: http://help.adobe.com/en_US/ColdFusionBuilder/Using/WS0ef8c004658c1089-31c11ef1121cdfd6aa0-7fff.html
There's also ColdFire, which is a Firebug add-on. Never used it before but I hear good things: https://github.com/nmische/ColdFire/
Try the ColdFire extension for Firebug:
http://coldfire.riaforge.org/