Hi,
I have a filebeat.yml file (installed on a DEV server, sending data to Logstash and on to Kibana) and I would like to show one extra field indicating the environment I am working in.
environment = "DEV"
But I couldn't find out how to add such a field in my Filebeat configuration.
Can anyone help me?
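For what it's worth, Filebeat can attach static fields to every event via the fields option of an input/prospector. A minimal sketch (the log path is a placeholder, and depending on your Filebeat version the section is named filebeat.inputs or filebeat.prospectors):

```yaml
filebeat.prospectors:
  - paths:
      - /var/log/myapp/*.log   # placeholder path
    fields:
      environment: "DEV"       # the extra field shipped with every event
    fields_under_root: true    # optional: put "environment" at the event's
                               # top level instead of under "fields.environment"
```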
I have two environments: development (local) and QA. The QA environment is a cluster with two nodes.
The problem came with the deploy to the QA environment: I can't see the logs on the server, but locally the console logging prints without problem.
I'm sure the module structure is the same in both environments, and my configuration XML file is on the classpath.
Which aspect could cause this difference?
Locally the server prints console logging, but the QA environment doesn't.
I was able to solve this. I checked the server in the development environment: it is a domain configuration, and for this reason it did not show the application log in the user wizard. Due to the application's conditions, the applications only write to the .log file. At first I was not allowed to check this domain log directory, but after checking it from the command line, my app's log file was right there.
I hope this helps someone else.
My goal is to collect some custom logs in Azure Monitor from an external VM running Linux. To that end, I've installed the Log Analytics agent according to the official MS documentation and ran the wizard to set up a custom log - that includes a sample file, a record delimiter, and a location from which to collect the logs. However, I'm getting a warning message saying:
Two successive configuration applications from OMS Settings failed – please report issue to github.com/Microsoft/PowerShell-DSC-for-Linux/issues (1)
I tried to follow the proposed link, which points to GitHub, but I wasn't able to find any solution there (or via any other link), which is why I decided to give it a chance and ask the community here.
It is weird, though, that the machine's heartbeat and manually sent syslog messages are being collected - everything except the custom logs.
Has anyone encountered this and managed to get past it? Thanks
Apparently, according to the MS answer, it is normal for the above warning message to be displayed. However, the reason the logs were not being collected was that you need to keep appending new entries to the target file processed by the OMS agent: this triggers the agent, which compares and checks whether the file has new entries since the last check.
Hope this helps someone!
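A quick way to exercise that, sketched below (the log path is a placeholder for whichever file you configured in the custom log wizard), is to append a timestamped entry so the agent sees the file grow:

```shell
# Placeholder path; substitute the file your custom log definition points at.
LOGFILE="/tmp/myapp_custom.log"

# Append a fresh entry; the OMS agent only collects when new lines appear.
echo "$(date -u '+%Y-%m-%dT%H:%M:%SZ') heartbeat test entry" >> "$LOGFILE"

# Show what was written.
tail -n 1 "$LOGFILE"
```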
I am setting up a SolrCloud configuration for an already existing Solr configuration with Drupal 7. I have configured ZooKeeper on 3 different machines and SolrCloud on 2 other machines. All the conf files are present in the configs directory in ZooKeeper.
Everything is fine up to here, but communication between Drupal and Solr is not happening due to the following error.
Error: "You are using an incompatible schema.xml configuration file. Please follow the instructions in the handbook for setting up Solr."
Currently, the application is running on Drupal 7 with the 7.x-1.13 Solr module installed.
So far, I haven't touched any Solr configuration files on the Drupal server.
What other configuration do I have to modify here to resolve the schema.xml incompatibility error?
I tried configuring SolrCloud with versions 5.4.1 and 6.4.1, but I am getting the same error.
In my case, what fixed this issue was killing the solr process and then starting solr again.
First, find the relevant solr process by trying to start solr...
cd /base/path/for/your/solr
bin/solr start
You will see something like...
Port 8983 is already being used by another process (pid: 12345)
Kill whatever process ID is mentioned in the "already being used" message...
kill 12345
Now you should be able to start solr...
bin/solr start
After this restart of solr, I refreshed the page in Drupal and the "incompatible schema.xml" message was gone.
You will have to look at the solr error logs to see what part of your schema.xml is not right.
You'd really have to do this in each one of your SolrCloud nodes, since there isn't any guarantee that ZooKeeper uploaded the correct schema.xml to all shards, and that's why you could be getting that error.
You could use zkcli to upload your configs (https://lucene.apache.org/solr/guide/6_6/command-line-utilities.html), and then reload your collection on all nodes to apply the changes, but even then there's no guarantee it'll work.
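As a sketch of that upload-and-reload cycle (the ZooKeeper hosts, config directory, config name, and collection name are all placeholders, and the zkcli.sh path assumes a Solr 5/6 layout):

```shell
# Upload the config set to the ZooKeeper ensemble.
server/scripts/cloud-scripts/zkcli.sh \
  -zkhost zk1:2181,zk2:2181,zk3:2181 \
  -cmd upconfig -confdir /path/to/conf -confname drupal_config

# Reload the collection so every node picks up the new schema.xml.
curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=my_collection"
```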
To save time and stress, you could just use a SaaS service such as https://opensolr.com
You can get it set up for free, and you get a UI to edit your config files, upload them to your server, and a lot of other nice UI features to manage your Solr index.
I just started using LogParser. An already existing system uses Log Parser to read the IIS logs and update the DB to calculate hits, etc.
I am trying to understand the flow, and I need to extract two more new fields from the IIS log and update the DB.
On my local desktop I have a sample log file and Log Parser. I tried this query: LogParser.exe “Select top 10 * from c:\LogParser*.log” and got Error: detected extra argument "top" after query. Why can't I read the log file that exists on my local machine?
I also have a batch file from production. I changed the path to access my desktop files and scheduled a Windows task, but it is not working either. The command is:
logparser file:Extract.sql?inputfile=c:\LogParser*.log -o:SQL -database:dbname server:test1 -username:username -password:password -createtable:OFF -maxStrFieldLen:2048 -clearTable:OFF
I just need to simulate the existing system to update the database, and I need to add more fields.
Please help me go further; I'm really stuck.
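One thing worth checking (an assumption about the cause, not something confirmed in the post): the query above uses curly “smart quotes”, which the Windows command line does not treat as string delimiters, so the query text gets split into separate arguments - which would produce exactly a detected extra argument "top" error. With plain ASCII quotes the same query would look like:

```shell
LogParser.exe "SELECT TOP 10 * FROM c:\LogParser*.log"
```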
I am not sure if this will solve your problem, but you can try LogParser Studio - it gives an IDE to the traditional Log Parser.
It's definitely easier to rectify your common mistakes, and you get help/documentation at your disposal. You can get more info and download it from here.
Hope it helps!
I am trying to create an instance using the Configuration Manager of WCS 7. I am working on a Win 7 x64 machine with DB2 9.5, 64-bit version.
I am stuck with this Massloading error when the instance creation happens:
In createInstanceANT.log file :
[Massload] Massloading
C:\IBM\WebSphere\CommerceServer\schema\xml\wcs.keys.xml Error in
MassLoading, please check logs for details.
The error log shows the following error :
[jcc][10165][10044][4.3.111] Invalid database URL syntax:
jdbc:db2://:0/WCSDEMO. ERRORCODE=-4461, SQLSTATE=42815
C:\IBM\WEBSPH~1\COMMER~2\config\DEPLOY~1\xml\createBaseSchema.xml:185:
Error in massloading
WCSDEMO is the database name. The MassLoader is not able to get the URL and port to connect. It is supposed to get them from the createInstance.properties file, but that is not working. The createInstance.properties file has all the details of the DB to connect to.
What could be the reason for this error, and how can I resolve it? Is there any configuration change that I am missing?
Can you provide some more details?
Look inside the messages.txt file located in WC_install_dir/instances/instance_name/logs and confirm what the exact issue is. If it is related to the JDBC driver being wrong, I may be able to help you.
I've been running into massloading problems with external systems, e.g. databases not on the same machine as the WAS installation.
In these cases I look for the loaderDBName setting in the massload scripts. Setting loaderDBName to just the name of the database makes the loader look on the local machine. But by changing that statement so you load with the syntax
loaderDBName=[DATABASE_SERVER_NAME]:[PORT]/[DATABASE_NAME]
you'll be able to massload using the standard Commerce scripts. These changes need to be made in many scripts, both for updating fixpacks and for enabling features. If you run database updates without the changes, it will crash at first, having already applied schema changes to the database that you then need to comment out before trying again.
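As a concrete sketch of that syntax (the host name is an assumption, and 50000 is just DB2's common default port), the empty host and zero port in the error URL jdbc:db2://:0/WCSDEMO would become:

```properties
# Hypothetical host and port; substitute your actual DB2 server.
loaderDBName=dbserver.example.com:50000/WCSDEMO
```

which yields a JDBC URL of the form jdbc:db2://dbserver.example.com:50000/WCSDEMO.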
IBM Software Support is your friend. They'll help you fix it.