Lost headers when using UnZipResultSplitter - spring-integration

I'm using the Spring Integration Zip extension and it appears that I'm losing headers I've added upstream in the flow. I'm guessing that they are being lost in UnZipResultSplitter.splitUnzippedMap() as I don't see anything that explicitly copies them over.
I seem to recall that this is not unusual with splitters but I can't determine what strategy one should use in such a case.

Yep!
It looks like a bug.
The splitter contract is like this:
if (item instanceof Message) {
    builder = this.getMessageBuilderFactory().fromMessage((Message<?>) item);
}
else {
    builder = this.getMessageBuilderFactory().withPayload(item);
    builder.copyHeaders(headers);
}
So, if the split items are already messages, as is the case with UnZipResultSplitter, the message is used as-is and the headers from the upstream message are not copied.
Please raise a JIRA ticket (https://jira.spring.io/browse/INTEXT) for this.
Meanwhile, here is a possible workaround:
public class MyUnZipResultSplitter {

    public List<Message<Object>> splitUnzipped(Message<Map<String, Object>> unzippedEntries) {
        final List<Message<Object>> messages = new ArrayList<Message<Object>>(unzippedEntries.getPayload().size());
        for (Map.Entry<String, Object> entry : unzippedEntries.getPayload().entrySet()) {
            final String path = FilenameUtils.getPath(entry.getKey());
            final String filename = FilenameUtils.getName(entry.getKey());
            final Message<Object> splitMessage = MessageBuilder.withPayload(entry.getValue())
                    .setHeader(FileHeaders.FILENAME, filename)
                    .setHeader(ZipHeaders.ZIP_ENTRY_PATH, path)
                    .copyHeaders(unzippedEntries.getHeaders())
                    .build();
            messages.add(splitMessage);
        }
        return messages;
    }

}
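For completeness, here is a minimal Java DSL sketch (the flow bean and channel names are assumptions, not from your flow) showing how such a custom splitter could be wired in place of the stock UnZipResultSplitter:
// Sketch only: "unzippedMapChannel" is assumed to carry the Map<String, Object>
// produced by the unzip step upstream.
@Bean
public IntegrationFlow unzipSplitFlow() {
    return IntegrationFlows.from("unzippedMapChannel")
            .split(new MyUnZipResultSplitter(), "splitUnzipped") // headers are copied onto each entry
            .channel("unzippedEntriesChannel")
            .get();
}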

Related

Parse CLR Event with TraceProcessor

I have followed the guidance at https://learn.microsoft.com/en-us/windows/apps/trace-processing/extensibility to get my hands on the .NET Runtime events. When I get an EventContext instance with the unparsed data, I have no convenient way to parse things further.
Ideally there should be a parser generator for manifest-based events, as there is with TraceEvent. Something like
TraceProcessorGen -generateEvents c:\Windows\Microsoft.NET\Framework\v4.0.30319\CLR-ETW.man
would be a big help. I am not inclined to manually write the parsing code with hundreds of hard-coded offsets for dozens of events.
class ClrDataSource : IFilteredEventConsumer
{
    public IReadOnlyList<Guid> ProviderIds { get; } = new Guid[] { new Guid("e13c0d23-ccbc-4e12-931b-d9cc2eee27e4") };

    public int Count { get; private set; }

    public void Process(EventContext eventContext)
    {
        ReadOnlySpan<byte> data = eventContext.Event.Data;
        // What to do next?
    }
}
TraceEvent from Vance Morrison has an easy way to explore manifest-based events, where Payload and PayloadNames are already pre-parsed based on the manifest. This is not very performant, but it is very helpful for many cases and for exploratory research.
(I am a developer at Microsoft who works on the TraceProcessor project.)
IFilteredEventConsumer is a way to get at the unparsed events in the trace, and it's true that we have not added support for using a manifest file to simplify that parsing.
However, parsed events for that provider should be available in the IGenericEventDataSource like this:
using (ITraceProcessor trace = TraceProcessor.Create(tracePath))
{
    Guid[] providerIds = new[] { Guid.Parse("e13c0d23-ccbc-4e12-931b-d9cc2eee27e4") };
    IPendingResult<IGenericEventDataSource> pendingEventsData = trace.UseGenericEvents(providerIds);

    trace.Process();

    IGenericEventDataSource eventData = pendingEventsData.Result;
    foreach (IGenericEvent genericEvent in eventData.Events)
    {
        // Process event here
    }
}
Within each IGenericEvent, there is a property called Fields, which should let you access the fields either by integer index or by name.

Release invoice on new screen

I need your help.
I have created a new screen where I list all invoices pending release.
I have a problem with the release: I show a message asking whether you want to release.
That message is shown over and over again, in an endless loop.
It should ask me only once, then proceed and follow the normal process.
public ProcessDocNew()
{
    // Acuminator disable once PX1008 LongOperationDelegateSynchronousExecution [Justification]
    Document.SetProcessDelegate(
        delegate (List<ARInvoice> list)
        {
            List<ARRegister> newlist = new List<ARRegister>(list.Count);
            foreach (ARInvoice doc in list)
            {
                newlist.Add(doc);
            }
            ProcessDoc(newlist, true);
        }
    );
    Document.SetProcessCaption(ActionsMensje.Process);
    Document.SetProcessAllCaption(ActionsMensje.ProcessAll);
}
public virtual void ProcessDoc(List<ARRegister> list, bool isMassProcess)
{
    string title = "Test";
    string sms = "¿Stamp?";
    var Graph = PXGraph.CreateInstance<ARInvoiceEntry>();
    ARInvoice document = Document.Current;
    PEFEStampDocument timbrar = new PEFEStampDocument(); /* this is the class that contains all my methods */
    if (isMassProcess == true)
    {
        Document.Ask(title, sms, MessageButtons.YesNo, MessageIcon.Question);
        {
            PXLongOperation.StartOperation(Graph, delegate
            {
                timbrar.Stamp(document, Graph); /* here is my release method */
            });
        }
    }
}

public static class ActionsMensje
{
    public const string Process = "Process";
    public const string ProcessAll = "Process All";
}
I await your comments
It should ask me only once, then proceed and follow the normal process.
That is not how the processing pattern works. The process delegate is called for each record and is therefore not a valid location to display a message that should be shown only once.
You would need to add a custom action to achieve that behavior. To comply with best practices, the scenario you're looking for should be implemented with a processing filter and a processing filter checkbox.
Documentation on processing screens implementation is available here:
https://help-2019r2.acumatica.com/Help?ScreenId=ShowWiki&pageid=a007b57b-af69-4c0f-9fd1-f5d98351035f

Spring Integration - FTP should synchronize with local folder

I have files in an FTP location and a local folder. On the first run the files are copied to the local folder, but on restarting the server the already-copied files are copied to the local folder again. The adapter should not pick up files that already exist locally; it should look for new files only. Please let me know whether it is possible to achieve this using Spring Integration FTP.
I have also added a filter, but it is still not working. Please let me know where I am going wrong:
@Bean
@InboundChannelAdapter(value = "inputChannel", poller = @Poller(fixedDelay = "1000", maxMessagesPerPoll = "1"))
public MessageSource<?> receive() {
    FtpInboundFileSynchronizingMessageSource messageSource = new FtpInboundFileSynchronizingMessageSource(synchronizer());
    PropertiesPersistingMetadataStore metadataStore = new PropertiesPersistingMetadataStore();
    FileSystemPersistentAcceptOnceFileListFilter acceptOnceFilter = new FileSystemPersistentAcceptOnceFileListFilter(metadataStore, "*.xml");
    File temp = new File(TEMP_FOLDER);
    metadataStore.setBaseDirectory(TEMP_FOLDER);
    messageSource.setLocalDirectory(temp);
    messageSource.setAutoCreateLocalDirectory(false);
    messageSource.setLocalFilter(acceptOnceFilter);
    return messageSource;
}
private AbstractInboundFileSynchronizer<FTPFile> synchronizer() {
    folderCleanUp();
    AbstractInboundFileSynchronizer<FTPFile> fileSynchronizer = new FtpInboundFileSynchronizer(sessionFactory());
    fileSynchronizer.setRemoteDirectory(ftpFileLocation);
    fileSynchronizer.setDeleteRemoteFiles(false);
    Pattern pattern = Pattern.compile(".*\\.xml$");
    FtpRegexPatternFileListFilter ftpRegexPatternFileListFilter = new FtpRegexPatternFileListFilter(pattern);
    fileSynchronizer.setFilter(ftpRegexPatternFileListFilter);
    return fileSynchronizer;
}
To clarify Artem's advice about implementing your custom FileListFilter, here is an example of such a filter (aimed at filtering out files older than a given moment):
@Component
public class OldFilesFilter extends AbstractFileListFilter<FTPFile> {

    // (oldFilesTimestamp field declaration and its source)

    @Override
    protected boolean accept(FTPFile file) {
        String fileName = file.getName();
        long fileTimestamp = file.getTimestamp().getTimeInMillis();
        ZonedDateTime fileModTimestamp = ZonedDateTime.ofInstant(Instant.ofEpochMilli(fileTimestamp), ZoneId.systemDefault());
        boolean isFileAcceptable = fileModTimestamp.isAfter(oldFilesTimestamp);
        if (log.isTraceEnabled()) {
            log.trace("File {}:\n" +
                    "file timestamp : {};\n" +
                    "given timestamp: {};\n" +
                    "file is new    : {}",
                    fileName, fileModTimestamp, oldFilesTimestamp, isFileAcceptable);
        }
        return isFileAcceptable;
    }

}
Also note that Spring Integration allows multiple filters to be applied to a single file source at the same time. This can be achieved with CompositeFileListFilter:
private CompositeFileListFilter<FTPFile> remoteFileFilter() {
    FtpPersistentAcceptOnceFileListFilter persistentFilter =
            new FtpPersistentAcceptOnceFileListFilter(metadataStore, "remoteProcessedFiles.");
    return new CompositeFileListFilter<>(Arrays.asList(new FtpSimplePatternFileListFilter("*.zip"),
            persistentFilter,
            oldFilesFilter /* known from the previous example */));
}
Yes, it is. Take a look at the local-filter property; FileSystemPersistentAcceptOnceFileListFilter is there for you to track local files via an external MetadataStore, e.g. Redis, MongoDB or any other store that keeps its data across system restarts.
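To make that concrete, here is a minimal sketch (the bean names and the "remote-"/"local-" key prefixes are assumptions; TEMP_FOLDER, ftpFileLocation and sessionFactory() are taken from your question) that keeps both the remote and the local state in a PropertiesPersistingMetadataStore, so already-fetched files are not picked up again after a restart. Any other ConcurrentMetadataStore (Redis, MongoDB, ...) can be dropped in instead:
@Bean
public PropertiesPersistingMetadataStore metadataStore() {
    PropertiesPersistingMetadataStore store = new PropertiesPersistingMetadataStore();
    store.setBaseDirectory(TEMP_FOLDER); // where the metadata .properties file is persisted
    return store;
}

@Bean
public FtpInboundFileSynchronizer ftpSynchronizer() {
    FtpInboundFileSynchronizer synchronizer = new FtpInboundFileSynchronizer(sessionFactory());
    synchronizer.setRemoteDirectory(ftpFileLocation);
    synchronizer.setDeleteRemoteFiles(false);
    // remembers which remote files were already fetched, across restarts
    synchronizer.setFilter(new FtpPersistentAcceptOnceFileListFilter(metadataStore(), "remote-"));
    return synchronizer;
}

@Bean
@InboundChannelAdapter(value = "inputChannel", poller = @Poller(fixedDelay = "1000", maxMessagesPerPoll = "1"))
public MessageSource<File> ftpMessageSource() {
    FtpInboundFileSynchronizingMessageSource source =
            new FtpInboundFileSynchronizingMessageSource(ftpSynchronizer());
    source.setLocalDirectory(new File(TEMP_FOLDER));
    // remembers which local files were already emitted as messages, across restarts
    source.setLocalFilter(new FileSystemPersistentAcceptOnceFileListFilter(metadataStore(), "local-"));
    return source;
}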

Spring + ibatis + String matching

I have a Spring application integrated with iBATIS.
I am calling a third-party application from which I get a String message as output (the output is a combination of messages: individual messages concatenated with a \ delimiter).
I have to filter this output based on String matching. There are about 150 known strings; if the output message contains any of them, I have to add some functionality.
I need suggestions on how to implement this. I am thinking of putting the 150 strings in a table, as the count may increase in the future. The output may contain none of these 150 messages, or any combination of them.
I am new to Spring. Please tell me how to get these messages from the database, since I do not have an id to fetch them by, or whether I should fetch all of them as a list and then compare against the output string from the third party. Also, please tell me whether it is wise to keep these messages in a database or whether a property file would be better in terms of performance.
Thanks in advance.
OK, let's start with some possibilities:
If you will only be adding a few messages in the future, and only with new releases, then storing the messages in an enum would be a viable choice:
enum ErrorMessage {

    SOME_MESSAGE("something, bla bla"),
    SOME_OTHER_MESSAGE("something_else"),
    ...;

    private String message;

    private ErrorMessage(String message) {
        this.message = message;
    }

    public static ErrorMessage getByErrorMessage(String message) {
        for (ErrorMessage errorMessage : values()) {
            if (errorMessage.message.equals(message)) {
                return errorMessage;
            }
        }
        return null;
    }

    public static boolean exists(String message) {
        return getByErrorMessage(message) != null;
    }
}
Please note that this version is quite primitive and could be improved by adding all the messages to a static Set:
static Set<String> messagesCache = new HashSet<String>();

// in the constructor:
messagesCache.add(message);

// better exists() method:
public static boolean exists(String message) {
    return messagesCache.contains(message);
}
Or, as with other solutions, you could store only the hash codes of your strings. A hash code is simply a numeric representation of your string and will usually be unique enough for you to identify it. Same solution as above:
static Set<Integer> messagesHashCodes = new HashSet<Integer>();

// in the constructor:
messagesHashCodes.add(message.hashCode());

// better exists() method:
public static boolean exists(String message) {
    return messagesHashCodes.contains(message.hashCode());
}
(Of course, it would be a good idea to check for null values, etc.)
The enum version has one big advantage: if you want DIFFERENT actions taken for some of the messages, you can code them into the enum, for example...
SOME_MESSAGE_REQUIRING_AN_ACTION("...") {
    @Override
    public void doAction(StringBuilder finalString) {
        // ...do something else
    }
}
...
// default implementation in the enum:
public void doAction(StringBuilder finalString) {
    finalString.append(this.message);
    finalString.append(SOME_SEPARATOR);
}

public static void doAction(StringBuilder builder, String errorMessage) {
    if (exists(errorMessage)) {
        getByErrorMessage(errorMessage).doAction(builder);
    }
}
In this example, you CAN override the doAction method in each enum value if it should do more than (or something other than) append the message to the StringBuilder. It would also be a nice touch to add an UNKNOWN_MESSAGE to the enum that does nothing and is only there to allow easier handling:
UNKNOWN_MESSAGE(null) {
    @Override
    public void doAction(StringBuilder finalString) {
        // do nothing here
    }
}

public static ErrorMessage getByErrorMessage(String message) {
    for (ErrorMessage errorMessage : values()) {
        if (errorMessage.message != null && errorMessage.message.equals(message)) {
            return errorMessage;
        }
    }
    return UNKNOWN_MESSAGE;
}
This way, you can simply pass every single string to your enum method doAction(StringBuilder, String) and get the result: if a message matches, it is appended (and any other action taken); if not, it is ignored, null checks included.
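Applied to your case, a minimal sketch (the helper name and the backslash splitting are assumptions based on your description of the third-party output) could look like this:
// Hypothetical helper: splits the third-party output on the '\' delimiter described
// in the question and lets the enum handle each individual message.
public static String handleThirdPartyOutput(String rawOutput) {
    StringBuilder result = new StringBuilder();
    for (String part : rawOutput.split("\\\\")) {   // '\' is the delimiter between messages
        ErrorMessage.doAction(result, part.trim()); // appends only the known messages
    }
    return result.toString();
}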
On the other hand, if your messages change quite often, you might not want to do a release for every such change and should keep the values in a database instead. In this case, I would use the hashCode() of the message as an id (as I said, usually unique enough) and load the whole thing into memory when the application starts, allowing you, for example, to again build a Set of hash codes to compare your error messages' hash codes against.
protected void init() {
    // load all error messages from the database
    // put them into a Map<Integer, String> (hashCode -> value) or even just a Set<Integer> (hash codes)
}
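For example, a minimal sketch of that init() (the ERROR_MESSAGE table, its MESSAGE_TEXT column and the injected JdbcTemplate are assumptions; with iBATIS you would issue the equivalent select through a mapped statement):
private final Set<Integer> messageHashCodes = new HashSet<Integer>();

protected void init() {
    // load all error messages once at startup and keep only their hash codes in memory
    List<String> messages = jdbcTemplate.queryForList(
            "SELECT MESSAGE_TEXT FROM ERROR_MESSAGE", String.class);
    for (String message : messages) {
        messageHashCodes.add(message.hashCode());
    }
}

public boolean exists(String message) {
    return messageHashCodes.contains(message.hashCode());
}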

dynamic template generation and formatting using freemarker

My goal is to format a collection of Java maps into a string (basically a CSV) using FreeMarker, or anything else that does it smartly. I want to generate the template from configuration data stored in a database and managed from an admin application.
The configuration will tell me at what position a given piece of data (a key in the hash map) needs to go, and also whether any script needs to run on this data before placing it at a given position. Several positions may be blank if the data is not in the map.
I am thinking of using FreeMarker to build this generic tool and would appreciate it if you could share how I should go about this.
I would also like to know whether there is any built-in support in Spring Integration for building such a process, as the application is an SI application.
I am no freemarker expert, but a quick look at their quick start docs led me here...
public class FreemarkerTransformerPojo {

    private final Configuration configuration;

    private final Template template;

    public FreemarkerTransformerPojo(String ftl) throws Exception {
        this.configuration = new Configuration(Configuration.VERSION_2_3_23);
        this.configuration.setDirectoryForTemplateLoading(new File("/"));
        this.configuration.setDefaultEncoding("UTF-8");
        this.template = this.configuration.getTemplate(ftl);
    }

    public String transform(Map<?, ?> map) throws Exception {
        StringWriter writer = new StringWriter();
        this.template.process(map, writer);
        return writer.toString();
    }

}
and
public class FreemarkerTransformerPojoTests {

    @Test
    public void test() throws Exception {
        String template = System.getProperty("user.home") + "/Development/tmp/test.ftl";
        OutputStream os = new FileOutputStream(new File(template));
        os.write("foo=${foo}, bar=${bar}".getBytes());
        os.close();
        FreemarkerTransformerPojo transformer = new FreemarkerTransformerPojo(template);
        Map<String, String> map = new HashMap<String, String>();
        map.put("foo", "baz");
        map.put("bar", "qux");
        String result = transformer.transform(map);
        assertEquals("foo=baz, bar=qux", result);
    }

}
From a Spring Integration flow, send a message with a Map payload to
<int:transformer ... ref="fmTransformer" method="transform" />
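If you prefer Java configuration, a rough Java DSL equivalent (the channel names and the template path are assumptions) would be:
@Bean
public FreemarkerTransformerPojo fmTransformer() throws Exception {
    return new FreemarkerTransformerPojo("/some/path/test.ftl"); // assumed template location
}

@Bean
public IntegrationFlow csvFlow() throws Exception {
    return IntegrationFlows.from("mapInputChannel")
            .transform(fmTransformer(), "transform") // Map payload in, CSV String out
            .channel("csvOutputChannel")
            .get();
}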
Or you could do it with a Groovy script (or another supported scripting language) using Spring Integration's existing scripting support, without writing any code (except the script).
