How to change a numeric ID into a sentence in Graylog using pipelines? - graylog2

I am trying to "beautify" the data I receive from some Windows logs in Graylog. My idea is to change the Windows event log ID from a number to the actual definition of that ID. For example, when I receive a log with ID 4625, I want my widget to show "An account failed to log on".
To do that, I am using a pipeline and a lookup table, which reads the IDs and their respective natural-language definitions from a .csv file that I've uploaded to the server.
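For reference, the lookup .csv is just an ID-to-description mapping. A minimal sketch of what such a file could look like, assuming the defaults of Graylog's CSV file data adapter (comma separator, quoted values) and column names of my own choosing ("event_id" as the key column, "description" as the value column):
"event_id","description"
"4624","An account was successfully logged on"
"4625","An account failed to log on"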
This is the rule that I wrote for my pipeline, which doesn't seem to work:
rule "eventid_windows_rule"
when
has_field("winlogbeat_winlog_event_id")
then
let winlogbeat_winlog_italiano = lookup("winlogbeat_winlog_event_id", to_string($message.winlogbeat_winlog_event_id));
set_field("winlogbeat_winlog_italiano", winlogbeat_winlog_italiano);
end
I think my problem is specifically in this rule, because Graylog lets you test lookup tables, and when I manually enter an ID, the lookup table finds the corresponding description.

I solved the issue myself; this is the correct code for the rule:
rule "eventid_windows_rule"
when
has_field("winlogbeat_winlog_event_id")
then
let winlogbeat_winlog_italiano = lookup("eventid_widget_windows_lookup", $message.winlogbeat_winlog_event_id);
set_field("winlogbeat_winlog_italiano", winlogbeat_winlog_italiano);
end
This rule checks whether the log has the field "winlogbeat_winlog_event_id"; if it does, it looks up that numeric value in the lookup table backed by my .csv file and stores the natural-language description in the new field "winlogbeat_winlog_italiano".
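For anyone reading along, here is a hedged variant of the same rule that uses lookup_value() instead of lookup(), so it returns a single value and falls back to a default string when an event ID is missing from the table. The variable name and fallback text are my own; check the function against your Graylog version.
rule "eventid_windows_rule"
when
  has_field("winlogbeat_winlog_event_id")
then
  // lookup_value() returns a single value from the table;
  // the third argument is the default used when the ID is not in the CSV
  let descrizione = lookup_value("eventid_widget_windows_lookup",
                                 to_string($message.winlogbeat_winlog_event_id),
                                 "ID sconosciuto");
  set_field("winlogbeat_winlog_italiano", descrizione);
end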

Related

QuickSight Row Level Security: DatasetRulesUnexpectedError

I'm attempting to apply row-level security. I have an S3-based dataset with a username column and the column I want to filter on. The dataset looks good in QuickSight, and I can create an analysis on it. rls_rater_action_username maps to a column in my dataset. However, no matter what I do with this file, I get the error "An unexpected error occurred. If this problem continues, contact your administrator. Error code: DatasetRulesUnexpectedError".
csv file contents:
username,rls_rater_action_username
Dave, test
The error is kind of useless, and I have no idea what the issue is. Does anyone have any guesses?
Usernames are case sensitive; write the user column header exactly as Amazon suggests, with an uppercase "Username".
Make sure the data types in both datasets (the main dataset and the rules dataset) are strings, so that the filter will apply.
Column order also matters in your S3 rules dataset.
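Putting those three points together, a corrected rules file might look like the sketch below. This is an assumption based on the answer above (uppercase user column, string values, no stray space after the comma); "UserName" is used here as a guess at the casing, so copy the exact header name from the current QuickSight documentation.
UserName,rls_rater_action_username
Dave,test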

Excel Import of custom mandatory field doesn't work [Hybris 6.7.0]

I'm using Hybris version 6.7.0 and I'm stuck with the following problem:
When I try to import products from an Excel file, it gives me the following error:
I've checked the Excel file and, of course, the field "Subscription Term*" is there; it is mandatory, which is why there is an asterisk. It's worth mentioning that this field is custom, so I wrote a custom translator for it. The export part works fine, but while debugging the import part I found a strange fact:
The WorkbookMandatoryColumnsValidator validator calls the method findColumnIndex(typeSystemSheet, sheet, this.prepareSelectedAttribute(mandatoryField)); from DefaultExcelTemplateService. This method returns -1, so the validation does not pass. I dug into the method and found this line of code:
String attributeDisplayName = this.findAttributeDisplayNameInTypeSystemSheet(typeSystemSheet, selectedAttribute);
which, as you can see, returns the string "Subscription Term" without an asterisk.
I've checked the other mandatory fields, e.g. "Catalog version*^": for those, the display name is returned with both symbols after it.
The thing is that comparing "Subscription Term" and "Subscription Term*" for string equality returns false, so the validation fails, as you can see here:
attributeDisplayName.equals(this.getCellValue(headerRow.getCell(i)))
Of course, the second value is taken from the Excel file, where the asterisk is present.
If I remove the asterisk from the Excel file, then I receive an "Unknown attributes of type ISku" error in the WorkbookTypeCodeAndSelectedAttributeValidator validator.
So the asterisk should be present in the Excel file; I just checked what would happen without it...
That doesn't help me understand what is really happening.
I can't understand one thing: what is the source of the "Subscription Term" string? Why is it without an asterisk? Is it a predefined constant somewhere?
From debugging, I couldn't figure out which source that string comes from.
I don't know for sure, but I expect that string (i.e. "Subscription Term") comes from a localization file based on the Backoffice's current session language (e.g. {extensionName}-locales_en.properties if the current language is en).
Try searching for "Subscription Term" in all the properties files.
Maybe, if the attribute is mandatory (i.e. optional="false" in items.xml), Hybris adds an "*" to its name when performing the import; a hypothetical example of the kind of entry to look for is sketched below.
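If it helps the search, localized attribute names in Hybris typically live in entries of the form type.<ItemType>.<attribute>.name in the extension's locales properties file. The entry below is purely a guess at what it might look like for this case; the actual type and attribute names depend on your items.xml:
# {extensionName}-locales_en.properties (hypothetical entry)
type.ISku.subscriptionTerm.name=Subscription Term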
Check whether you granted read and write permission on that attribute for that user.
Check with an admin user first: if there is no issue with the admin user, then it is only a permission issue with your user.

Unable to copy file from SFTP in Azure Data Factory when using wildcard(*) in the filename

I am unable to copy csv files from an SFTP connection to blob storage when using the wildcard (*) in the filename.
More specifically, I receive csv files over SFTP on a daily basis, and they are of the format "ddMMyyyyxxxxxx.csv", where "xxxxxx" is a timestamp. More concretely, my csv file for the 13th of March is "13032019083647.csv", while for the 14th of March it is "14032019083556.csv". Obviously, the timestamp is different every day, so I want to copy the file regardless of whatever string exists between the date and the file extension.
In the "File" subfield of the "File path" on the "Connection" tab of my dataset, I give as input "13032019*.csv", as instructed by the help icon next to the field.
When I do so, my Debug run fails with:
{"errorCode": "2200", "message":
"ErrorCode=UserErrorInvalidCopyBehaviorBlobNameNotAllowedWithPreserveOrFlattenHierarchy,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot
adopt copy behavior PreserveHierarchy when copying from folder to a
single file.,Source=Microsoft.DataTransfer.ClientLibrary}
I receive a similar error no matter which type of copy behaviour I choose. I have also tried experimenting with the fileFilter parameter (even though ADF warns that the same behaviour can be achieved with the fileName option), but I still end up getting the same error.
For further clarification, I am attaching the Code segment that ADF produces for this configuration:
I should also mention that when I use the full fileName in the corresponding field, namely the value "13032019083647.csv", copying works normally.
Any help would be greatly appreciated!
My guess is that the wildcard operation might match more than one file.
In such cases we need to use a Get Metadata activity, a Filter activity and a ForEach activity to copy these files (a rough sketch follows after the steps below).
1. Get Metadata activity: point its dataset at the folder containing the files and pass childItems as the field list.
2. Filter activity: use the filter to narrow down the file list based on your needs.
3. ForEach activity: feed it the items from the previous activity and add a Copy activity inside the ForEach.
In the Copy activity, the source dataset's file name should be @item().name.
I hope this will solve your issue.
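Below is a rough JSON sketch of that Get Metadata -> Filter -> ForEach chain. All activity and dataset names are placeholders, the source dataset is assumed to take a fileName parameter, the Copy activity's source/sink typeProperties are omitted, and the exact output property of the Filter activity should be verified against your Data Factory version.
{
  "name": "CopyDailySftpFiles",
  "activities": [
    {
      "name": "GetFileList",
      "type": "GetMetadata",
      "typeProperties": {
        "dataset": { "referenceName": "SftpFolderDataset", "type": "DatasetReference" },
        "fieldList": [ "childItems" ]
      }
    },
    {
      "name": "FilterTodaysCsv",
      "type": "Filter",
      "dependsOn": [ { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "items": { "value": "@activity('GetFileList').output.childItems", "type": "Expression" },
        "condition": { "value": "@startswith(item().name, '13032019')", "type": "Expression" }
      }
    },
    {
      "name": "CopyEachFile",
      "type": "ForEach",
      "dependsOn": [ { "activity": "FilterTodaysCsv", "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "items": { "value": "@activity('FilterTodaysCsv').output.Value", "type": "Expression" },
        "activities": [
          {
            "name": "CopyFile",
            "type": "Copy",
            "inputs": [ {
              "referenceName": "SftpFileDataset",
              "type": "DatasetReference",
              "parameters": { "fileName": "@item().name" }
            } ],
            "outputs": [ { "referenceName": "BlobSinkDataset", "type": "DatasetReference" } ]
          }
        ]
      }
    }
  ]
}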
What worked for me was the following: I kept the same wildcard pattern for the input file, but I set "Copy behaviour: Merge Files". Since, as mentioned, there is only one file that satisfies the pattern, only one file was created as output. I am aware that this is a sort of "dirty" solution, but it did the trick for me.

Netsuite - Transfer Inventory error

I have been using NetSuite for only a short time, and I already hate it. I am sorry if this is a stupid question, but I haven't been able to find an answer so far, either in the NetSuite docs, on Stack Overflow, or on other websites. In fact, the answers I did find have resulted in an error.
My company requires a script to transfer inventory based on an EDI input file. Reading the file is no problem, even parsing it is working. However, actually inserting the data is proving problematic.
I have been able to insert normal records, but Inventory Transfer records are giving me problems.
From Stack Overflow I found and adapted some code into the following:
var xfer = nlapiCreateRecord("inventorytransfer");
xfer.setFieldValue("trandate", FormatDate("20160101"));
xfer.setFieldValue("location", 9);
xfer.setFieldValue("transferlocation", 9);
nlapiSelectNewLineItem('invt');
nlapiSetLineItemValue("invt","invtid",1, 189);
nlapiSetLineItemValue("invt","adjustqtyby", 1, "5");
nlapiCommitLineItem('invt');
var id = nlapiSubmitRecord(xfer);
The FormatDate function just converts the date from the text file into a date format NetSuite can understand.
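For completeness, a hypothetical sketch of such a helper in SuiteScript 1.0-era JavaScript; the output format is assumed to be d/m/yyyy and should be adjusted to the account's date preference:
// Hypothetical helper: turns "yyyymmdd" text from the EDI file into a
// date string NetSuite will accept (assumed here to be d/m/yyyy).
function FormatDate(yyyymmdd) {
    var year  = yyyymmdd.substring(0, 4);
    var month = yyyymmdd.substring(4, 6);
    var day   = yyyymmdd.substring(6, 8);
    return parseInt(day, 10) + '/' + parseInt(month, 10) + '/' + year;
}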
However, when I run this code I get the following error:
USER_ERROR: You must enter at least one line item for this transaction.
I thought inserting the line item was the reason to use nlapiSelectNewLineItem, but I guess not. Also nlapiCreateNewLineItem doesn't seem to exist.
The values I am inserting are all just test data, as I'm testing this in the debugger. Location 9 exists, as does item 189.
My full script finds these IDs based on string values from the text files, but since this is the section that doesn't work, I have isolated it to test.
Can anyone help with this?
You did not specify the type of script you are using, but it looks like you are not setting the line items on the record object; instead you are setting them on the current record. Below is the suggested code.
Also, there is no sublist named invt; it should be inventory. Likewise, there is no field called invtid; you most probably want to set the item, whose field ID is item. You might want to refer to the SuiteScript Records Browser for help with the correct IDs.
var xfer = nlapiCreateRecord("inventorytransfer");
xfer.setFieldValue("trandate", FormatDate("20160101"));
xfer.setFieldValue("location", 9);
xfer.setFieldValue("transferlocation", 9);
// Work with the sublist through the record object, not the global nlapi* line item functions
xfer.selectNewLineItem('inventory');
xfer.setCurrentLineItemValue("inventory", "item", 189);
xfer.setCurrentLineItemValue("inventory", "adjustqtyby", "5");
xfer.commitLineItem('inventory');
var id = nlapiSubmitRecord(xfer);
If you are using Bin/Lot Numbered Items, please see the help topic "Sample Scripts for Advanced Bin / Numbered Inventory Management".

Can I import SAP tables that were exported by SE16?

I have exported the contents of a table with transaction SE16 by selecting all the entries and choosing Download, unconverted.
I'd like to import these entries into another system (where the same table exists and is active).
Furthermore, when I import, there's a possibility that the specific key already exists for a number of entries (old entries).
Other entries won't have a field with the same key present in the table where they're to be imported (new entries).
Is there a way to easily update my table in the second system with the file provided from the first system? If needed, I can export the data in the three other format types (Spreadsheet, Rich text format and HTML format). It seems to me, though, that the spreadsheet and rich text formats sometimes corrupt the data, and the HTML is far too verbose.
[EDIT]
As per popular demand, the table I'm trying to export/import is a Z table whose fields are all numeric, character, date or time fields (flat data types).
I'm trying to do it like this because the clients don't have any Basis resource to help them with transports, and they would like to "kinda" automate the process of updating one of the tables in one system.
At the moment it's a business request to do it like this, but I'm open to suggestions (and the clients are open too).
Edit
OK, I doubt that what you describe in your comment exists out of the box, but you can easily write something like this:
Create a method (or a function module, if that floats your boat) that accepts the following parameters:
iv_table_name TYPE string and
iv_filename TYPE string
This would be the method:
method upload_table.

  data: lt_table type ref to data,
        lx_root  type ref to cx_root.

  field-symbols: <table> type standard table.

  try.
      create data lt_table type table of (iv_table_name).
      assign lt_table->* to <table>.

      call method cl_gui_frontend_services=>gui_upload
        exporting
          filename            = iv_filename
          has_field_separator = abap_true
        changing
          data_tab            = <table>
        exceptions
          others              = 4.
      if sy-subrc <> 0.
        "Some appropriate error handling
        "message id sy-msgid type 'I'
        "        number sy-msgno
        "        with sy-msgv1 sy-msgv2
        "             sy-msgv3 sy-msgv4.
        return.
      endif.

      modify (iv_table_name) from table <table>.
      "write: / sy-tabix, ' entries updated'.

    catch cx_root into lx_root.
      "lv_text = lx_root->get_text( ).
      "some appropriate error handling
      return.
  endtry.

endmethod.
This would still require that you make sure that the exported file matches the table that you want to import. However cl_gui_frontend_services=>gui_upload should return sy-subrc > 0 in that case, so you can bail out before you corrupt any data.
Original Answer:
I'll assume that you want to update a Z table and not an SAP standard table.
You will probably have to format your data file a little bit to make it tab- or comma-delimited.
You can then upload the data file using cl_gui_frontend_services=>gui_upload
Then if you want to overwrite the existing data in the table you can use
modify zmydbtab from table it_importeddata.
If you do not want to overwrite existing entries, you can use:
insert zmydbtab from table it_importeddata.
You will get a return code of sy-subrc = 4 if any of the keys already exist, but any new entries will be inserted.
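As a hedged aside: a plain INSERT dbtab FROM TABLE dumps on duplicate keys, so to get the behaviour described above you would normally add ACCEPTING DUPLICATE KEYS, roughly like this (table and internal table names as in the snippets above):
* Insert only rows whose keys do not exist yet; duplicates are silently skipped.
insert zmydbtab from table it_importeddata accepting duplicate keys.
if sy-subrc = 4.
  " At least one key already existed; only the new rows were inserted.
  write: / 'Some entries already existed and were not overwritten.'.
endif.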
Note
There are many reasons why you would NOT do this for an SAP standard table. The most prominent is that there is almost always more to the data model than we are aware of. Also, when creating transactional data there are often follow-on events or workflows that kick off, which will not happen if you update the database directly. As a rule of thumb, it is usually a bad idea to update SAP standard tables directly.
In that case, try to find a BAdI, or, if that's not available, record a BDC and do the updates that way.
If the system landscape was set up correctly, your client would not need any kind of Basis operations support whatsoever to perform the transports. So instead of re-inventing the wheel, I'd strongly suggest catching up on what the CTS and TMS can do once they're set up with sensible settings.
