I am using the logging config below to create a log file. It works perfectly fine outside pytest, but when run through pytest it fails to create the log file.
logging.basicConfig(filename='test.log', filemode='a',format='%(message)s',level=logging.INFO)
My guess is that the config above is overridden by pytest's own logging configuration, which is why the file is not created.
How can I make this logging config take effect and get the log file when running under pytest?
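One workaround sketch (an assumption on my part, not stated in the question; it requires Python 3.8+, where basicConfig accepts force=True to remove handlers that pytest may already have attached to the root logger):

```python
import logging

# force=True (Python 3.8+) removes any handlers already attached to
# the root logger -- e.g. by pytest's logging plugin -- before applying
# this configuration, so the file handler is actually installed.
logging.basicConfig(filename='test.log', filemode='a',
                    format='%(message)s', level=logging.INFO, force=True)
logging.info('this line reaches test.log even under pytest')
```

Alternatively, pytest's own `--log-file=test.log` option writes captured log records to a file without touching the script's config.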
I have a Google Cloud Run Flask application named "HelloWorld1" already up and running; however, I need to create a second Flask application. I followed the steps below as per the documentation:
1- In the "Cloud Shell Editor" I clicked "<>Cloud Code" --> "New Application" --> "Cloud Run Application Basic Cloud Run Application .." --> "Python (Flask): Cloud Run", provided a new folder, and the application was created.
2- When I try to run it using "Run on Cloud Run Emulator", I get the following error:
Starting to run the app using configuration 'Cloud Run: Run/Debug Locally' from .vscode/launch.json...
To view more detailed logs, go to Output channel : "Cloud Run: Run/Debug Locally - Detailed"
Dependency check started
Dependency check succeeded
Starting minikube, this may take a while...................................
minikube successfully started
The minikube profile 'cloud-run-dev-internal' has been scheduled to stop automatically after exiting Cloud Code. To disable this on future deployments, set autoStop to false in your launch configuration /home/mian/newapp/.vscode/launch.json
Update initiated
Update failed with error code DEVINIT_REGISTER_BUILD_DEPS
listing files: file pattern [requirements.txt] must match at least one file
Skaffold exited with code 1.
Cleaning up...
Finished clean up.
I tried the following:
1- Tried to create a different type of application, e.g. Django instead of Flask, but I always get the same error.
2- Tried to give the full path of [requirements.txt] in the Docker settings, with no luck.
Could someone please help me understand why I am not able to run a second Cloud Run Flask app due to this error?
It's likely that your Dockerfile references the requirements.txt file, but that file is not in your local directory, so Skaffold reports it as missing:
listing files: file pattern [requirements.txt] must match at least one file
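If the file is simply missing from the new application's folder, one minimal fix (a sketch, assuming a pip-based Flask project; the exact package list is an assumption) is to create a requirements.txt next to the Dockerfile so the Skaffold file pattern matches something:

```shell
# Create requirements.txt in the application folder, alongside the
# Dockerfile that references it.
printf 'Flask\ngunicorn\n' > requirements.txt
cat requirements.txt
```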
I am running my e2e WebdriverIO tests on a GitLab pipeline. Now I am trying to run my e2e tests after the deployment on Azure. The tests run after deployment as expected, except that I get a strange error on Azure.
I have a test case to upload a file. Here is how I get the file:
const filePath = process.env.PWD + '/test/resources/files/logo.jpg'
console.log('file = ' + filePath);
When my tests run on the GitLab pipeline, the file is located as follows:
file = /builds/hw2yvbjx/0/xxx/xxx/xxx/xxx/e2e/test/resources/files/logo.jpg
But when my tests run on the Azure pipeline, the path contains undefined, as follows:
file = undefined/test/resources/files/logo.jpg
and the full log is as follows:
Error: ENOENT: no such file or directory, open 'C:\azagent\xxx\xxx\xxx\xxx\_xxx\e2e\undefined\test\resources\files\logo.jpg'
The path is correct, except that an extra undefined is inserted between e2e and test. Does anyone know why this undefined appears in the path, and how to fix it?
Thanks
process.env.PWD will log as undefined on Windows. The PWD environment variable is a 'Linux thing'.
That's why you're seeing this in your log statement on Azure (running on Windows, deduced from the path starting with C:\...) but not in your GitLab job, which runs on a Linux host.
To fix it, you can use process.cwd(), which is platform agnostic, instead of process.env.PWD.
I am following along here - https://docs.solana.com/developing/on-chain-programs/debugging
I have included this line in my .bashrc:
export RUST_LOG=solana_runtime::system_instruction_processor=trace,solana_runtime::message_processor=info,solana_bpf_loader=debug,solana_rbpf=debug
When I deploy a program to solana-test-validator and call the function, the program completes successfully; however, the msg! calls in the Rust program are not printed to the console.
I am also getting a command not found error for the below. Do I have to configure the GitHub Cargo registry?
$RUST_LOG
solana_runtime::system_instruction_processor=trace,solana_runtime::message_processor=info,solana_bpf_loader=debug,solana_rbpf=debug: command not found
To see the msg! calls, you can use the solana logs tool, pointed at your local validator, by opening up another terminal and running:
solana logs --url localhost
Check out https://docs.solana.com/cli/usage#solana-logs for the full info on how to use it.
I have a Python script with a CLI argument parser (based on argparse).
I am calling it from a batch file:
set VAR1=arg_1
set VAR2=arg_2
python script.py --arg1 %VAR1% --arg2 %VAR2%
Within script.py I create a logger:
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
This script uses chromedriver, Selenium, and requests to automate some clicking and moving between web pages.
When running from within PyCharm (configured so that the script has arg_1 and arg_2 passed to it), everything is great - I get log messages from my logger only.
When I run the batch file, I get a bunch of logging messages from chromedriver or requests (I think).
I have tried:
#echo off at the start of the batch file.
Setting the level on the root logger.
Getting the logging logger dictionary and setting each logger to WARNING - based on this question.
None of these work, and I keep getting log messages from submodules - ONLY when the script is run from a batch file.
Anybody know how to fix this?
You can use the following logging configuration options to do this:
import logging.config

logging.config.dictConfig({
    'version': 1,
    # Disable any loggers that already exist when this call runs,
    # e.g. loggers created at import time by third-party libraries.
    'disable_existing_loggers': True,
})
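A fuller sketch of the same idea (the logger name 'script' is an assumption - use your module's __name__): pre-existing third-party loggers are disabled, the root logger is capped at WARNING, and only your own logger stays at DEBUG.

```python
import logging
import logging.config

logging.config.dictConfig({
    'version': 1,
    # Silence loggers created before this call, e.g. by imported libraries.
    'disable_existing_loggers': True,
    'formatters': {
        'plain': {'format': '%(name)s %(levelname)s %(message)s'},
    },
    'handlers': {
        'console': {'class': 'logging.StreamHandler', 'formatter': 'plain'},
    },
    # Everything created from now on defaults to WARNING ...
    'root': {'level': 'WARNING', 'handlers': ['console']},
    'loggers': {
        # ... except this script's own logger (hypothetical name).
        'script': {'level': 'DEBUG'},
    },
})

logger = logging.getLogger('script')
logger.debug('only this logger emits DEBUG records')
```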
I'm using the debug npm module to log stuff; is there a way to log to a file programmatically?
Right now I'm doing DEBUG=* node myApp.js > abc.log. How can I log to abc.log by simply running DEBUG=* node myApp.js, while also outputting to stderr?
I didn't find any package that does this.
The package doesn't seem to provide a builtin feature to do this, but it provides you with a hook to customise how logs are emitted.
There is an example in the Readme here.
Note: the example is a bit confusing, because it shows you how to replace writing to stdout with ... writing to stdout using the console!
So here is what you should do at application startup:
Open a stream that writes to a file. Tutorial here if you need help on this
Override log.log() as explained in the docs to write to your file instead of using console.log().