I cannot use the import command - Linux

I can't import zxlolbot.py; when I try, I get this error:
No handlers could be found for logger "zxlolbot"

This means zxlolbot uses the logging module without configuring it properly. In zxlolbot.py you should see:
logger = logging.getLogger(__name__)
Add this line after it:
logger.addHandler(logging.NullHandler())
To save log messages to a file, you could add this in bot.py:
if __name__ == "__main__":
    import logging
    logging.basicConfig(filename='bot.log', level=logging.INFO)

This sounds like you didn't set up the logging module properly.

Related

Python logging: use a single logger for an entire project, with the log file name defined by command-line arguments

Hey, I am trying to set up a logger for a whole multi-file project in Python 3.9. I want to define the logger in main.py, using command-line arguments to set the log file name.

import sys
import logging
import processing_chain

logging.basicConfig(format='%(asctime)s %(levelname)-8s %(message)s', level='INFO')
logger = logging.getLogger(__name__)

def main():
    file_name = sys.argv[1]
    lookback_minutes = int(sys.argv[2])
    file_handler = logging.FileHandler(f'log/{file_name}.log')
    logger.addHandler(file_handler)
    logger.info(f'Running processing chain for: {file_name}')
    processing_chain.run(lookback_minutes)

The processing_chain file:

import logging

def run(lookback_minutes):
    logging.info(lookback_minutes)
This works for main: the info statement is written to the log file. However, I do not understand how to make this work in the files that main calls. How do I bring the file handler into the processing_chain file? From what I could gather elsewhere on Stack Overflow, I should just import logging, use logging.info (or any other level), and it should follow, but it logs only to the console, not to the file.

SonarQube (SAST scan) log injection hotspot issue

I have written code that adds logs using the logging module in Python. When I run the code through SonarQube, it shows the following hotspot:
Make sure that this logger's configuration is safe.
Python code:

from alembic import context
from logging.config import fileConfig
import logging

# This is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
logger = logging.getLogger("alembic.env")

class DefaultConfig:
    DEVELOPMENT = False
    DEBUG = False
    TESTING = False
    LOGGING_LEVEL = "DEBUG"
    CSRF_ENABLED = True
Please help me resolve this hotspot. One more question: is it mandatory to look into low-priority hotspots?
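One way to make the review easy (a sketch of my own, not an official SonarQube remediation) is to keep the logging configuration hard-coded in the source with dictConfig, instead of loading it from an external .ini file. Nothing in the configuration is then derived from user input, which is what the "make sure this logger's configuration is safe" hotspot asks you to verify:

```python
import logging
import logging.config

# Hard-coded configuration (sketch): every value is a literal in the
# source, so a reviewer can confirm it is safe at a glance.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "default": {"format": "%(asctime)s %(levelname)s %(name)s %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "default"},
    },
    "root": {"handlers": ["console"], "level": "INFO"},
}

logging.config.dictConfig(LOGGING_CONFIG)
logger = logging.getLogger("alembic.env")
```

Note that Alembic's env.py conventionally uses fileConfig on the project's .ini file; if you keep that, the hotspot is typically resolved by reviewing the file and marking the hotspot as safe in the SonarQube UI.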

How to redirect abseil logging messages to stdout instead of stderr?

I am using Python 3.7.6 and the abseil module for logging messages, with absl-py 0.9.0. I am using this piece of code for my tests:
from absl import logging
from absl import app

def main(argv):
    # logging.set_stderrthreshold(logging.ERROR)
    # logging._warn_preinit_stderr = False
    logging.set_verbosity(logging.DEBUG)
    print(' 0 -----')
    logging.debug(' 1 logging-debug-test')
    logging.info(' 2 logging-info-test')
    logging.warning(' 3 logging-warning-test')
    logging.error('4 logging-error-test')
    print(' 5 -----')

if __name__ == '__main__':
    app.run(main)
When testing it in a Jupyter notebook, the color coding of the output background makes it clear that the abseil messages go to the stderr stream. The same thing happens when executing the Python code in a shell.
I tried a few things with different values, like:
logging.set_stderrthreshold(logging.DEBUG)
logging._warn_preinit_stderr = True
but the output is still exactly the same.
How can I redirect abseil logging messages to stdout instead of stderr? Is it expected for logging output to go to stderr rather than stdout? I am probably missing something in the logging logic and want to understand it better.
I was told that this is the standard behavior, matching what Python's standard logging module does. In my case, adding the following line redirects the logging messages to stdout:
logging.get_absl_handler().python_handler.stream = sys.stdout
After that, the messages in my Jupyter notebook appear on stdout.
This did NOT work for me for some reason:
from absl import logging
import sys
logging.get_absl_handler().python_handler.stream = sys.stdout
But this did:
import logging
import sys
logging.basicConfig(stream=sys.stdout)
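The redirection idea can be checked with the standard logging module alone (a sketch, independent of absl): a StreamHandler writes to whatever stream object it is given, so pointing it at sys.stdout, or at an in-memory buffer as below, moves records off stderr:

```python
import io
import logging

buf = io.StringIO()                   # stand-in for sys.stdout
handler = logging.StreamHandler(buf)  # any writable stream works
logger = logging.getLogger("redirect_demo")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("not on stderr")
print(buf.getvalue().strip())  # -> not on stderr
```

This is the same mechanism the absl fix uses: `python_handler.stream` is simply the stream backing absl's underlying handler.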

Python, Flask print to console and log file simultaneously

I'm using Python 3.7.3 with Flask 1.0.2.
When running my app.py file without the following imports:
import logging
logging.basicConfig(filename='api.log',level=logging.DEBUG)
Flask will display relevant debug information to console, such as POST/GET requests and which IP they came from.
As soon as DEBUG logging is enabled, I no longer receive this output. I have tried running my application in debug mode:
app.run(host='0.0.0.0', port=80, debug=True)
But this produces the same results. Is there a way to have both console output, and python logging enabled? This might sound like a silly request, but I would like to use the console for demonstration purposes, while having the log file present for troubleshooting.
Found a solution:

import logging
from flask import Flask

app = Flask(__name__)

logger = logging.getLogger('werkzeug')     # grabs underlying WSGI logger
handler = logging.FileHandler('test.log')  # creates handler for the log file
logger.addHandler(handler)                 # adds handler to the werkzeug WSGI logger

@app.route("/")
def index():
    logger.info("Here's some info")
    return "Hello World"

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=80)
Other examples:

# logs to console and to the log file
logger.info("Some text for console and log file")

# prints the exception and logs it to the file
try:
    ...  # code that may raise
except Exception as ue:
    logger.error("Unexpected Error: malformed JSON in POST request, check key/value pair at: ")
    logger.error(ue)
Source:
https://docstrings.wordpress.com/2014/04/19/flask-access-log-write-requests-to-file/
If link is broken:
You may be confused because adding a handler to Flask’s app.logger doesn’t catch the output you see in the console like:
127.0.0.1 - - [19/Apr/2014 18:51:26] "GET / HTTP/1.1" 200 -
This is because app.logger is for Flask and that output comes from the underlying WSGI module, Werkzeug.
To access Werkzeug’s logger we must call logging.getLogger() and give it the name Werkzeug uses. This allows us to log requests to an access log using the following:
logger = logging.getLogger('werkzeug')
handler = logging.FileHandler('access.log')
logger.addHandler(handler)
# Also add the handler to Flask's logger for cases
# where Werkzeug isn't used as the underlying WSGI server.
# This wasn't required in my case, but can be uncommented as needed
# app.logger.addHandler(handler)
You can of course add your own formatting and other handlers.
Flask has a built-in logger that can be accessed using app.logger. It is just an instance of the standard library logging.Logger class which means that you are able to use it as you normally would the basic logger. The documentation for it is here.
To get the built-in logger to write to a file, you have to add a logging.FileHandler to the logger. Setting debug=True in app.run, starts the development server, but does not change the log level to debug. As such, you'll need to set the log level to logging.DEBUG manually.
Example:

import logging
from flask import Flask

app = Flask(__name__)

handler = logging.FileHandler("test.log")  # Create the file handler
app.logger.addHandler(handler)             # Add it to the built-in logger
app.logger.setLevel(logging.DEBUG)         # Set the log level to debug

@app.route("/")
def index():
    app.logger.error("Something has gone very wrong")
    app.logger.warning("You've been warned")
    app.logger.info("Here's some info")
    app.logger.debug("Meaningless debug information")
    return "Hello World"

app.run(host="127.0.0.1", port=8080)
If you then look at the log file, it should have all 4 lines printed out in it and the console will also have the lines.

How to send errors to a .log file in Python pandas?

I searched on Google but did not find an answer, and I am struggling with the code. My requirement is simple: if any error occurs in the program below, it should be captured in a .log file. But the code below is not writing anything to the .log file. Please help me. Thanks in advance.
import pandas as pd
import pyodbc
import time
import logging

df = pd.read_csv('Testing.csv')

logger = logging.getLogger('simple_example')
logger.setLevel(logging.DEBUG)

# create file handler which logs even debug messages
fh = logging.FileHandler("Archieve\\spam.log")
fh.setLevel(logging.DEBUG)

# create console handler with a higher log level
ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)

# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
fh.setFormatter(formatter)

# add the handlers to the logger
logger.addHandler(ch)
logger.addHandler(fh)

# 'application' code
logger.debug('debug message')
logger.info('info message')
logger.warning('warn message')   # logger.warn is deprecated; use warning
logger.error('error message')
logger.critical('critical message')
First, make sure all the imports resolve by installing the respective dependencies. Then make sure you have a file named "Testing.csv" in your current working directory or on your PYTHONPATH. Finally, make sure you have a folder named "Archieve" in your current working directory; if it is not there, create it first:
mkdir Archieve
Then check the log file in the Archieve folder:
cat Archieve\spam.log
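Note that the code above only emits test messages; to actually send errors to the log file, wrap the failing call in try/except and use logger.exception, which logs at ERROR level and appends the full traceback. A sketch (using a forced ZeroDivisionError in place of the pandas call, and the current directory instead of the Archieve folder):

```python
import logging

logger = logging.getLogger("simple_example")
logger.setLevel(logging.DEBUG)
fh = logging.FileHandler("spam.log")
fh.setFormatter(logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s"))
logger.addHandler(fh)

try:
    1 / 0  # stand-in for df = pd.read_csv('Testing.csv')
except Exception:
    # logger.exception records the message AND the traceback
    logger.exception("error while processing Testing.csv")
```

With `pd.read_csv` in the try block, a missing or malformed Testing.csv would land in the log file the same way.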
