How do you create a more accurate timer in OpenLaszlo than the lz.Timer class that can pause and loop? - openlaszlo

The lz.Timer class in OpenLaszlo can sometimes fire up to 256ms late; how do you create one that fires more accurately?

The following OpenLaszlo timer class I designed will fire more accurately and also has a nice looping feature with pause and reset for ease of use. This code works in OpenLaszlo 4.9.0 SWF10 and DHTML run-times:
<library>
<!---
Class: <loopingtimer>
Extends: <node>
This class is a looping timer that can be used to call a method to perform
an action repeatedly after a specified number of milliseconds.
Override the abstract method reactToTimeChange(theTime) to do something
useful in your application. theTime will be the time elapsed since the
timer's last firing; it will be close to the 'timer_resolution' setting but
may be off by about 47ms-78ms in Firefox 2 and 47ms-94ms in IE6
(see the note below for an explanation).
NOTE:
This class originally used the LzTimer class but it was firing up to 256ms
late, so this has been replaced with a setTimeout() method and embedded
JavaScript which is more accurate, but still fires about 59ms late on
average.
For this reason the firing times of this class are approximate; it will
probably fire about 47ms to 78ms late (roughly 59ms on average). As a workaround
for this problem the timer uses the system time to calculate how much time
has actually elapsed since the last timer firing and passes the actual time
elapsed ('theTime') in milliseconds to the abstract 'reactToTimeChange()'
method.
-->
<class name="loopingtimer" extends="node">
<switch>
<when property="$as3">
<passthrough>
import flash.utils.setTimeout;
</passthrough>
</when>
</switch>
<!-- *** ATTRIBUTES *** -->
<!-- Public Attributes -->
<!---
@param number timer_resolution: number of milliseconds between timer
firings (default: 40ms)
Note: OpenLaszlo seems to typically have a lower limit of 47-78
milliseconds between firings, so setting it below this may be useless.
-->
<attribute name="timer_resolution" type="number" value="40" />
<!-- Private Attributes -->
<!--- @keywords private -->
<!---
@param number formertime: used internally to calculate the number of
elapsed milliseconds since playback was started
-->
<attribute name="formertime" type="number" value="0" />
<!--- @keywords private -->
<!---
Used internally for tracking virtual current time in milliseconds
for pause functionality.
-->
<attribute name="currenttime" type="number" value="0" />
<!--- @keywords private -->
<!--- @param string timer_state: 'PAUSED' | 'COUNTING' -->
<attribute name="timer_state" type="string" value="PAUSED" />
<!-- *** METHODS *** -->
<!-- Public Methods -->
<!--- @keywords abstract -->
<!---
ABSTRACT METHOD: override this to do something useful in your program
@param number theTime: the time in milliseconds elapsed since playback
was started
-->
<method name="reactToTimeChange" args="theTime">
if ($debug){
Debug.write('WARNING: reactToTimeChange(): This is an abstract method that should be overridden to do something useful in your application');
Debug.write('reactToTimeChange(): Time elapsed since last firing in milliseconds: '+theTime);
}
</method>
<!--- Start Timer (Note: This will reset timer to 0) -->
<method name="startTimer">
this.setAttribute('timer_state', 'COUNTING');
var now = new Date();
var rawTime = now.getTime();
this.setAttribute('formertime', rawTime);
this.doForTime();
</method>
<!--- Pauses timer at current time -->
<method name="pauseTimer">
this.setAttribute('timer_state', 'PAUSED');
</method>
<!--- Resumes timer from time it is paused at -->
<method name="unpauseTimer">
this.setAttribute('timer_state', 'COUNTING');
var now = new Date();
var rawTime = now.getTime();
this.setAttribute('formertime', rawTime-this.currenttime);
this.repeat();
</method>
<!--- Stop Timer - stops timer and resets to 0 -->
<method name="stopTimer">
this.pauseTimer();
this.resetTimer();
</method>
<!--- Resets Timer to 0 -->
<method name="resetTimer">
this.setAttribute('formertime', 0);
this.setAttribute('currenttime', 0);
</method>
<!---
Seeks to the given time in milliseconds.
@param number(int) iTimeMs: the time to seek to
-->
<method name="seekToTime" args="iTimeMs">
this.setAttribute('currenttime', Math.floor(iTimeMs));
</method>
<!-- Private Methods -->
<!--- @keywords private -->
<!---
Called internally by the timer. Computes the actual time in milliseconds
that has elapsed (usually 16-100ms more than the timer firing interval
would suggest) and passes it to reactToTimeChange().
-->
<method name="doForTime">
// Prevent Timer Incrementing When Paused
if (this.timer_state == 'PAUSED')
return;
var now = new Date();
var rawTime = now.getTime();
var currentTime = (this.formertime != 0) ? rawTime - this.formertime : this.currenttime;
this.setAttribute('currenttime', currentTime);
// Call Abstract Method:
this.reactToTimeChange(currentTime);
this.repeat();
</method>
<!--- @keywords private -->
<!---
Used internally for timer looping.
-->
<method name="repeat">
// This function uses an embedded JavaScript function which
// can be called via the standard JavaScript setTimeout()
// method, which results in more accurate timer firing than the
// standard OpenLaszlo LzTimer() class. LzTimer() fired up to
// 256ms late, while setTimeout() usually fires only
// 47ms-78ms late.
var f = function(){
doForTime();
}
setTimeout(f, this.timer_resolution);
</method>
</class>
</library>
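For reference, here is a minimal usage sketch; the file name loopingtimer.lzx, the id gTimer, and the debug output are illustrative assumptions, not part of the class above. It includes the library, overrides reactToTimeChange() on an instance, and starts the timer:
<canvas debug="true">
<!-- Hypothetical usage: assumes the library above was saved as loopingtimer.lzx -->
<include href="loopingtimer.lzx" />
<loopingtimer id="gTimer" timer_resolution="40">
<!-- Override the abstract method to do something useful -->
<method name="reactToTimeChange" args="theTime">
if ($debug) Debug.write('Elapsed time in ms: ' + theTime);
</method>
</loopingtimer>
<!-- Start counting once the application has initialized -->
<handler name="oninit">
gTimer.startTimer();
</handler>
</canvas>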

Related

Spring Integration - Scheduling Job from configuration file

I'm using Spring Integration to parse an XML file, and I need to create a thread (each one with a different rate) for each tag.
Right now (with the help of many users here :)) I'm able to split the XML by tag and then route it to the appropriate service-activator.
This works great, but I'm not able to redirect to a channel that creates "a thread" and then executes the operations. Right now I have the following configuration, and in my mind (which I don't know is correct...) the flow is:
Split tag -> Route to the appropriate channel -> Start a thread (from tag configuration) -> Execute the operation
This is my current configuration that splits the tags and redirects to the channels.
The router should not redirect to a channel directly, but schedule the messages.
At first it will be enough to redirect to a pool with a fixed rate; later I will use XPath to get the attribute and replace this "fixed" rate with the correct value.
I've tried many solutions to create this flow, but each one fails or does not compile :(
<context:component-scan base-package="it.mypkg" />
<si:channel id="rootChannel" />
<si-xml:xpath-splitter id="mySplitter" input-channel="rootChannel" output-channel="routerChannel" create-documents="true">
<si-xml:xpath-expression expression="//service" />
</si-xml:xpath-splitter>
<si-xml:xpath-router id="router" input-channel="routerChannel" evaluate-as-string="true">
<si-xml:xpath-expression expression="concat(name(./node()), 'Channel')" />
</si-xml:xpath-router>
<si:service-activator input-channel="serviceChannel" output-channel="endChannel">
<bean class="it.mypkg.Service" />
</si:service-activator>
UPDATE:
Using this configuration, the service with id="service1" should run a task every 10 seconds and the one with id="service2" every 5 seconds. In the same way I can have another tag that is handled by another class (because it will have different behaviour).
<root>
<service id="service1" interval="10000" />
<service id="service2" interval="5000" />
<activity id="activity1" interval="50000" />
</root>
I will have a general class (Service) that handles the service tag; it completes some operation and then "returns" a value so I can redirect it to another channel.
public class Service {
public int execute() {
// Execute the task and return the value to continue the "chain"
}
}
It's not at all clear what you mean; you split a tag and route it, but want to "schedule" it at a rate from the XML. It's not clear what you mean by "schedule" here - normally each message is processed once, not multiple times on a schedule.
As I said, I don't understand what you need to do, but a smart poller might be suitable.
Another possibility is the delayer where the amount of the delay can be derived from the message.
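For illustration only, a delayer whose delay is derived from the message could be sketched like this; the channel names, the delayMillis header, and the evaluation-type are assumptions, and the xpath-header-enricher simply copies the split fragment's interval attribute into a header that the delayer then evaluates:
<!-- Sketch: copy each tag's interval attribute into a header, then delay on that header -->
<si-xml:xpath-header-enricher input-channel="serviceChannel" output-channel="delayedServiceChannel">
<si-xml:header name="delayMillis" xpath-expression="/*/@interval" evaluation-type="NUMBER_RESULT" />
</si-xml:xpath-header-enricher>
<si:delayer id="serviceDelayer" input-channel="delayedServiceChannel" output-channel="serviceExecChannel"
default-delay="0" expression="headers['delayMillis']" />
Note that a delayer postpones each message once; it does not reschedule it repeatedly, which is why the inbound-channel-adapter approach shown below may fit your case better.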
EDIT
Since your "services" don't seem to take any input data, it looks like you simply need to configure/start an <inbound-channel-adapter/> for each service, and then start it, based on the arguments in the XML.
<int:inbound-channel-adapter id="service1" channel="foo"
auto-startup="false"
ref="service1Bean" method="execute">
<int:poller fixed-delay="1000" />
</int:inbound-channel-adapter>
Note auto-startup="false".
Now, in the code that receives the split
@Autowired
SourcePollingChannelAdapter service1;
...
public void startService1(Node node) {
...
service1.setTrigger(new PeriodicTrigger(...));
service1.start();
...
}
I don't know if this is the right way to implement the flow, but I've written the following code:
applicationContext.xml
<context:component-scan base-package="it.mypkg" />
<!-- Expression to extract interval from XML tag -->
<si-xml:xpath-expression id="selectIntervalXpath" expression="//*/@interval" />
<si:channel id="rootChannel" />
<!-- Split each tag to redirect on router -->
<si-xml:xpath-splitter id="mySplitter" input-channel="rootChannel" output-channel="routerChannel" create-documents="true">
<si-xml:xpath-expression expression="//service|//activity" />
</si-xml:xpath-splitter>
<!-- Route each tag to the appropriate channel -->
<si-xml:xpath-router id="router" input-channel="routerChannel" evaluate-as-string="true">
<si-xml:xpath-expression expression="concat(name(./node()), 'Channel')" />
</si-xml:xpath-router>
<!-- Activator for Service Tag -->
<si:service-activator input-channel="serviceChannel" method="schedule">
<bean class="it.mypkg.Service" />
</si:service-activator>
<!-- Activator for Activity Tag -->
<si:service-activator input-channel="activityChannel" method="schedule">
<bean class="it.mypkg.Activity" />
</si:service-activator>
<!-- Task scheduler -->
<task:scheduler id="taskScheduler" pool-size="10"/>
Each tag handler will extend an Operation class (to avoid duplicating the bean-injection code).
Operation.java
public abstract class Operation {
protected TaskScheduler taskScheduler;
protected XPathExpression selectIntervalXpath;
abstract public void schedule(Node document);
@Autowired
public void setTaskScheduler(TaskScheduler taskScheduler) {
this.taskScheduler= taskScheduler;
}
public TaskScheduler getTaskScheduler() {
return this.taskScheduler;
}
@Autowired
public void setSelectIntervalXpath(XPathExpression selectIntervalXpath) {
this.selectIntervalXpath = selectIntervalXpath;
}
public XPathExpression getSelectIntervalXPath() {
return this.selectIntervalXpath;
}
}
And an example of the Service class (which handles all service tags provided in the .xml):
public class Service extends Operation {
private static final Logger log = Logger.getLogger(Service.class);
@Override
public void schedule(Node document) {
log.debug("Scheduling Service");
long interval = Long.parseLong(this.selectIntervalXpath.evaluateAsString(document));
this.taskScheduler.scheduleAtFixedRate(new ServiceRunner(), interval);
}
private class ServiceRunner implements Runnable {
public void run() {
log.debug("Running...");
}
}
}
Now, to continue my flow, I will need to find a way to redirect the output of each job back to Spring Integration (applicationContext.xml).

ChannelResolutionException: no output-channel or replyChannel header available - Only with many requests

I am running the client portion of the Spring Integration TCP Multiplex example. I was trying to see how many requests it could handle at once, and at around 1000 I started to get this error: ChannelResolutionException: no output-channel or replyChannel header available.
Everything is fine below about 1000 calls.
<beans:description>
Uses conversion service and collaborating channel adapters.
</beans:description>
<context:property-placeholder />
<converter>
<beans:bean class="org.springframework.integration.samples.tcpclientserver.ByteArrayToStringConverter" />
</converter>
<!-- Fastest Wire Protocol - takes a byte array with its length defined in the first x bytes -->
<beans:bean id="fastestWireFormatSerializer" class="org.springframework.integration.ip.tcp.serializer.ByteArrayLengthHeaderSerializer">
<beans:constructor-arg value="1" />
</beans:bean>
<!-- Client side -->
<gateway id="gw"
service-interface="org.springframework.integration.samples.tcpclientserver.SimpleGateway"
default-request-channel="input" />
<ip:tcp-connection-factory id="client"
type="client"
host="localhost"
port="${availableServerSocket}"
single-use="false"
serializer="fastestWireFormatSerializer"
deserializer="fastestWireFormatSerializer"
so-timeout="10000" />
<publish-subscribe-channel id="input" />
<!-- scheduler - thread used to re-establish the connection so the other threads aren't starved while waiting to re-establish the connection -->
<!-- client-mode - Automatically re-establishes the connection if lost -->
<ip:tcp-outbound-channel-adapter id="outAdapter.client"
order="2"
channel="input"
client-mode="true"
connection-factory="client" /> <!-- Collaborator -->
<!-- Also send a copy to the custom aggregator for correlation and
so this message's replyChannel will be transferred to the
aggregated message.
The order ensures this gets to the aggregator first -->
<bridge input-channel="input" output-channel="toAggregator.client"
order="1"/>
<!-- Asynch receive reply -->
<ip:tcp-inbound-channel-adapter id="inAdapter.client"
channel="toAggregator.client"
connection-factory="client" /> <!-- Collaborator -->
<!-- dataType attribute invokes the conversion service, if necessary -->
<channel id="toAggregator.client" datatype="java.lang.String" />
<aggregator input-channel="toAggregator.client"
output-channel="toTransformer.client"
correlation-strategy-expression="payload.substring(0,3)"
release-strategy-expression="size() == 2"
expire-groups-upon-completion="true" />
<transformer input-channel="toTransformer.client"
expression="payload.get(1)"/> <!-- The response is always second -->
<task:scheduler id="reconnectScheduler" pool-size="10"/>
And the code used to test:
TaskExecutor executor = new SimpleAsyncTaskExecutor();
final CountDownLatch latch = new CountDownLatch(100);
final Set<Integer> results = new HashSet<Integer>();
for (int i = 100; i < 1050; i++) {
results.add(i);
final int j = i;
executor.execute(new Runnable() {
public void run() {
String result = gateway.send(j + "Hello world!"); // first 3 bytes is correlationid
System.out.println("Test Result: " + result);
results.remove(j);
latch.countDown();
}});
}
I haven't figured out entirely why you are getting that exception, but there are several problems with your test.
The countdown latch needs to be initialized to 950.
Since you are exceeding 999, we need to change the correlation expression:
payload.substring(0,4)
With those changes, it works for me.
I'll try to figure out why we're getting that exception when I get a bit more time.
EDIT
The issue is indeed caused by the conflicting correlation ids.
The last 50 messages all have correlation id 100, which means messages are released in an indeterminate fashion (given the release is based on size). In some cases two input messages are released (causing the wrong reply to reach the test case). When two replies are released, there is no output channel.
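For clarity, with the wider correlation key the aggregator definition from the configuration above becomes (only the correlation expression changes):
<aggregator input-channel="toAggregator.client"
output-channel="toTransformer.client"
correlation-strategy-expression="payload.substring(0,4)"
release-strategy-expression="size() == 2"
expire-groups-upon-completion="true" />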

File inbound-channel-adapter spring integration for Multiple Files aggregation into one master File for Job processing

I have written code to combine multiple files into one single master file.
The issue is with the int:transformer: I am getting one file at a time, although I have aggregated a List of File in the composite filter of the file inbound-channel-adapter. The size of the List of File in the composite filter is correct, but in the transformer bean the list size is always one; it never receives the full list of files aggregated by the filter.
Here is my config:
<!-- Auto Wiring -->
<context:component-scan base-package="com.nt.na21.nam.integration.*" />
<!-- intercept and log every message -->
<int:logging-channel-adapter id="logger"
level="DEBUG" />
<int:wire-tap channel="logger" />
<!-- Aggregating the processed Output for OSS processing -->
<int:channel id="networkData" />
<int:channel id="requests" />
<int-file:inbound-channel-adapter id="pollProcessedNetworkData"
directory="file:${processing.files.directory}" filter="compositeProcessedFileFilter"
channel="networkData">
<int:poller default="true" cron="*/20 * * * * *" />
</int-file:inbound-channel-adapter>
<bean id="compositeProcessedFileFilter"
class="com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine" />
<int:transformer id="aggregateNetworkData"
input-channel="networkData" output-channel="requests">
<bean id="networkData" class="com.nt.na21.nam.integration.helper.CSVFileAggregator">
</bean>
</int:transformer>
CompositeFileListFilterForBaseLine:
public class CompositeFileListFilterForBaseLine implements FileListFilter<File> {
private final static Logger LOG = Logger
.getLogger(CompositeFileListFilterForBaseLine.class);
@Override
public List<File> filterFiles(File[] files) {
List<File> filteredFile = new ArrayList<File>();
int index;
String fetchedFileName = null;
String fileCreatedDate = null;
String todayDate = DateHelper.toddMM(new Date());
LOG.debug("Date - dd-MM: " + todayDate);
for (File f : files) {
fetchedFileName = StringUtils.removeEnd(f.getName(), ".csv");
index = fetchedFileName.indexOf("_");
// Add plus one to index to skip underscore
fileCreatedDate = fetchedFileName.substring(index + 1);
// Format the created file date
fileCreatedDate = DateHelper.formatFileNameDateForAggregation(fileCreatedDate);
LOG.debug("file created date: " + fileCreatedDate + " today Date: "
+ todayDate);
if (fileCreatedDate.equalsIgnoreCase(todayDate)) {
filteredFile.add(f);
LOG.debug("File added to List of File: " + f.getAbsolutePath());
}
}
LOG.debug("SIZE: " + filteredFile.size());
LOG.debug("filterFiles method end.");
return filteredFile;
}
}
The Class file for CSVFileAggregator
public class CSVFileAggregator {
private final static Logger LOG = Logger.getLogger(CSVFileAggregator.class);
private int snePostion;
protected String masterFileSourcePath=null;
public File handleAggregateFiles(List<File> files) throws IOException {
LOG.debug("materFileSourcePath: " + masterFileSourcePath);
LinkedHashSet<String> allAttributes = null;
Map<String, LinkedHashSet<String>> allAttrBase = null;
Map<String, LinkedHashSet<String>> allAttrDelta = null;
LOG.info("Aggregator releasing [" + files.size() + "] files");
}
}
Log Output:
INFO : com.nt.na21.nam.integration.aggregator.NetFileAggregatorClient - NetFileAggregator context initialized. Polling input folder...
INFO : com.nt.na21.nam.integration.aggregator.NetFileAggregatorClient - Input directory is: D:\Projects\csv\processing
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - Date - dd-MM: 0103
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - file created date: 0103 today Date: 0103
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - File added to List of File: D:\Projects\NA21\NAMworkspace\na21_nam_integration\csv\processing\file1_base_0103.csv
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - file created date: 0103 today Date: 0103
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - File added to List of File: D:\Projects\NA21\NAMworkspace\na21_nam_integration\csv\processing\file2_base_0103.csv
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - SIZE: 2
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - filterFiles method end.
DEBUG: org.springframework.integration.file.FileReadingMessageSource - Added to queue: [csv\processing\file1_base_0103.csv, csv\processing\file2_base_0103.csv]
INFO : org.springframework.integration.file.FileReadingMessageSource - Created message: [GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]]
DEBUG: org.springframework.integration.endpoint.SourcePollingChannelAdapter - Poll resulted in Message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.channel.DirectChannel - preSend on channel 'networkData', message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.handler.LoggingHandler - org.springframework.integration.handler.LoggingHandler#0 received message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.handler.LoggingHandler - csv\processing\file2_base_0103.csv
DEBUG: org.springframework.integration.channel.DirectChannel - postSend (sent=true) on channel 'logger', message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.transformer.MessageTransformingHandler - org.springframework.integration.transformer.MessageTransformingHandler#606f8b2b received message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: com.nt.na21.nam.integration.helper.CSVFileAggregator - materFileSourcePath: null
INFO : com.nt.na21.nam.integration.helper.CSVFileAggregator - Aggregator releasing [1] files
Can someone help me identify the issue with the filter, and why the full list is not being collected for transformation?
Thanks in advance.
The issue is with int:aggregator, as I am not sure how to invoke it. I used it earlier in my design but it didn't get executed at all. Thanks for the quick response.
For this problem I have written a FileScanner utility which scans all the files inside the folder, and the aggregation is working perfectly.
Please find below the config with the aggregator which didn't work; hence I split the design into two pollers: the first produces all the CSV file(s) and the second collects and aggregates them.
<!-- Auto Wiring -->
<context:component-scan base-package="com.bt.na21.nam.integration.*" />
<!-- intercept and log every message -->
<int:logging-channel-adapter id="logger" level="DEBUG" />
<int:wire-tap channel = "logger" />
<int:channel id="fileInputChannel" datatype="java.io.File" />
<int:channel id="error" />
<int:channel id="requestsCSVInput" />
<int-file:inbound-channel-adapter id="pollNetworkFile"
directory="file:${input.files.directory}" channel="fileInputChannel"
filter="compositeFileFilter" prevent-duplicates="true">
<int:poller default="true" cron="*/20 * * * * *"
error-channel="error" />
</int-file:inbound-channel-adapter>
<bean id="compositeFileFilter"
class="com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForTodayFiles" />
<int:transformer id="transformInputZipCSVFileIntoCSV"
input-channel="fileInputChannel" output-channel="requestsCSVInput">
<bean id="transformZipFile"
class="com.nt.na21.nam.integration.file.net.NetRecordFileTransformation" />
</int:transformer>
<int:router ref="docTypeRouter" input-channel="requestsCSVInput"
method="resolveObjectTypeChannel">
</int:router>
<int:channel id="Vlan" />
<int:channel id="VlanShaper" />
<int:channel id="TdmPwe" />
<bean id="docTypeRouter"
class="com.nt.na21.nam.integration.file.net.DocumentTypeMessageRouter" />
<int:service-activator ref="vLanMessageHandler" output-channel="newContentItemNotification" input-channel="Vlan" method="handleFile" />
<bean id="vLanMessageHandler" class="com.nt.na21.nam.integration.file.handler.VLanRecordsHandler" />
<int:service-activator ref="VlanShaperMessageHandler" output-channel="newContentItemNotification" input-channel="VlanShaper" method="handleFile" />
<bean id="VlanShaperMessageHandler" class="com.nt.na21.nam.integration.file.handler.VlanShaperRecordsHandler" />
<int:service-activator ref="PweMessageHandler" output-channel="newContentItemNotification" input-channel="TdmPwe" method="handleFile" />
<bean id="PweMessageHandler" class="com.nt.na21.nam.integration.file.handler.PseudoWireRecordsHandler" />
<int:channel id="newContentItemNotification" />
<!-- Adding for aggregating the records in one place for OSS output -->
<int:aggregator input-channel="newContentItemNotification" method="aggregate"
ref="netRecordsResultAggregator" output-channel="net-records-aggregated-reply"
message-store="netRecordsResultMessageStore"
send-partial-result-on-expiry="true">
</int:aggregator>
<int:channel id="net-records-aggregated-reply" />
<bean id="netRecordsResultAggregator" class="com.nt.na21.nam.integration.aggregator.NetRecordsResultAggregator" />
<!-- Define a store for our network records results and set up a reaper that will
periodically expire those results. -->
<bean id="netRecordsResultMessageStore" class="org.springframework.integration.store.SimpleMessageStore" />
<int-file:outbound-channel-adapter id="filesOut"
directory="file:${output.files.directory}"
delete-source-files="true">
</int-file:outbound-channel-adapter>
The code works fine up to the point where messages are routed to the channels below:
<int:channel id="Vlan" />
<int:channel id="VlanShaper" />
<int:channel id="TdmPwe" />
I am trying to return a LinkedHashSet containing CSV data from the handlers of the above channels, and I need to merge and aggregate all the
LinkedHashSet vAllAttributes instances to get the master output CSV file.
List<String> masterList = new ArrayList<String>(vAllAttributes);
Collections.sort(masterList);
Well, it looks like you misunderstood the <int-file:inbound-channel-adapter> behaviour a bit. By its nature it produces one file per message to the channel. This doesn't depend on the logic of the FileListFilter. The flow is like this:
The FileReadingMessageSource uses a DirectoryScanner to retrieve files from the provided directory into an internal toBeReceived queue.
Since we scan the directory for files, the design of the DirectoryScanner looks like List<File> listFiles(File directory). I guess this has led you astray.
After that the filter is applied to the original file list and returns only the appropriate files.
They are stored in the toBeReceived queue.
And only after that does the FileReadingMessageSource poll an item from the queue to build a message for the output channel.
To achieve your aggregation requirements you really should use an <aggregator> between the <int-file:inbound-channel-adapter> and your <int:transformer>.
You can mark the <poller> of the <int-file:inbound-channel-adapter> with max-messages-per-poll="-1" to poll all your files during a single scheduled task. But either way there will be as many messages as your filter returns files.
After that you must accept some tricks for the <aggregator>:
correlationKey - to allow your file messages to be combined into a single MessageGroup and released as a single message to the further <transformer>. Since we don't have any context from the <int-file:inbound-channel-adapter>, but we know that all messages are produced by a single polling task within one scheduled thread (you don't use a task-executor on the <poller>), we can simply use the correlation key:
correlation-strategy-expression="T(Thread).currentThread().id"
But that is not enough, because we still have to produce a single message in the end somehow. Unfortunately we don't know the number of files (though you could track it via a ThreadLocal in your custom FileListFilter) to allow the ReleaseStrategy to return true for the aggregation phase. Hence we never get a normal group completion. But we can force-release uncompleted groups from the aggregator by using a MessageGroupStoreReaper or group-timeout on the <aggregator>.
In addition to the previous point you should supply these options on the <aggregator>:
send-partial-result-on-expiry="true"
expire-groups-upon-completion="true"
And that's all. There is no reason to provide any custom aggregation function (ref/method or expression), because the default one just builds a single message with a List of the payloads from all messages in the group. And that is appropriate for your CSVFileAggregator. Alternatively, you could drop that <transformer> and use this CSVFileAggregator as the aggregation function itself.
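For illustration, a consolidated sketch of the wiring described above might look like this; the aggregatedFiles channel name and the 10-second group-timeout are assumptions, and group-timeout requires Spring Integration 4.0 or later:
<int-file:inbound-channel-adapter id="pollProcessedNetworkData"
directory="file:${processing.files.directory}"
filter="compositeProcessedFileFilter"
channel="networkData">
<!-- max-messages-per-poll="-1" emits every filtered file within one scheduled poll -->
<int:poller cron="*/20 * * * * *" max-messages-per-poll="-1" />
</int-file:inbound-channel-adapter>
<int:aggregator input-channel="networkData" output-channel="aggregatedFiles"
correlation-strategy-expression="T(Thread).currentThread().id"
group-timeout="10000"
send-partial-result-on-expiry="true"
expire-groups-upon-completion="true" />
<!-- The transformer now receives a single message whose payload is a List of File -->
<int:transformer input-channel="aggregatedFiles" output-channel="requests">
<bean class="com.nt.na21.nam.integration.helper.CSVFileAggregator" />
</int:transformer>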
Hope that is clear.

Is it possible to have a while loop connected to the mouseover state in an OpenLaszlo app?

Is it possible to do something like this
while (view.mouseover == true) {
perform action
}
I want to have an action repeat for as long as the mouse is over a specific view.
(asked on the laszlo-user mailing list)
Well, it looks like you answered your own question while I was testing my solution to make sure it worked correctly, but here is an alternative solution that works under OpenLaszlo 4.9.0 SWF10 and OpenLaszlo 4.9.0 DHTML run-times:
<canvas width="1000" height="665" debug="true">
<view id="v" bgcolor="0xcccccc" width="200" height="200">
<!--- @param boolean mouseisover: true when the mouse is over -->
<attribute name="mouseisover" type="boolean" value="false" />
<!--- @keywords private -->
<!--- @param lz.Delegate dlgt_repeat: stores the lz.Delegate object -->
<attribute name="dlgt_repeat" type="expression" />
<!--
Called when the 'onmouseover' event occurs
-->
<handler name="onmouseover">
// Step 1) unregister any existing delegate
// mark it for garbage collection
// and prevent its event from triggering:
if (this['dlgt_repeat'])
this.dlgt_repeat.unregisterAll();
// Step 2) update this.mouseisover flag:
if (!this.mouseisover)
this.setAttribute('mouseisover', true);
// Step 3) create an event Delegate and call it
// on the next application idle event:
var objDlgt = new lz.Delegate(this, 'doSomething');
this.setAttribute('dlgt_repeat', objDlgt);
lz.Idle.callOnIdle(this.dlgt_repeat);
</handler>
<!--
Called when the 'onmouseout' event occurs
-->
<handler name="onmouseout">
// Step 1) unregister any existing delegate
// mark it for garbage collection
// and prevent its event from triggering:
if (this['dlgt_repeat'])
this.dlgt_repeat.unregisterAll();
// Step 2) Update this.mouseisover flag:
if (this.mouseisover)
this.setAttribute('mouseisover', false);
</handler>
<!--- @keywords private -->
<!---
Called on application idle event by lz.Idle repeatedly
when the mouse is down.
@param ??? objDummy: required for SWF9+ run-times for methods
called by delegates, due to ActionScript 3 compiler
requirements. Just set the default to null to make the
compiler happy and ignore it...
-->
<method name="doSomething" args="objDummy=null">
<![CDATA[
// Note: CDATA allows '&&' to be used in the script below;
// alternatively omit CDATA and use '&amp;&amp;' instead
// of '&&'
// Step 1) Execute your code you want to run here:
if ($debug) Debug.debug('Do something...');
// Step 2): If mouse is still over and the event
// delegate exists then schedule the event to be
// executed upon the next application idle state:
if (this.mouseisover && this['dlgt_repeat'] != null)
lz.Idle.callOnIdle(this.dlgt_repeat);
]]>
</method>
<text text="Move mouse over" />
</view>
</canvas>
Since both ActionScript and JavaScript are single-threaded, it's not possible to have a while loop with pauses between each loop iteration. In the SWF10/11 runtime, you need to make sure that the code within each method or function can be executed within one frame of your application (the duration of a frame depends on the framerate of the SWF clip).
As a workaround you can use a timer; here is an example:
<canvas debug="true">
<class name="mouseoverview" extends="view"> <attribute name="timer" type="object" value="null" />
<!-- lz.Delegate instance used by the timer -->
<attribute name="timerdel" type="object" value="null" />
<attribute name="timeractive" type="boolean" value="false" />
<!-- milliseconds to pause before each call to doWhileMouseover method -->
<attribute name="tick" type="number" value="500" />
<handler name="onmouseover">
Debug.info('mouseover');
if (this.timeractive == false) {
this.setAttribute('timeractive', true);
this.timerdel = new lz.Delegate( this, "timerCallback" );
this.timer = lz.Timer.addTimer( this.timerdel, this.tick );
// When the timer is activated, do one call to the method
// containing the loop logic. The following calls will be
// handled by the timer and delegate.
this.doWhileMouseover();
}
</handler>
<handler name="onmouseout">
Debug.info('mouseout');
if (this.timeractive) {
this.setAttribute('timeractive', false);
lz.Timer.removeTimer(this.timerdel);
}
</handler>
<method name="timerCallback" args="millis">
if (this.timeractive) {
lz.Timer.resetTimer(this.timerdel, this.tick);
this.doWhileMouseover();
}
</method>
<method name="doWhileMouseover">
Debug.info("This is your virtual while loop for mouseover");
</method>
</class>
<mouseoverview x="100"
y="100"
width="400"
height="400"
bgcolor="#33aaff"
tick="250">
</mouseoverview>
</canvas>
When a mouseover occurs, a timer is started using timerdel (an instance of lz.Delegate). The doWhileMouseover() method is then called once directly, and afterwards repeatedly by the timer, as long as no onmouseout event has occurred.

Aggregating local events in SCXML

My state machine has a self loop that is taken every time a request event is created. I want to store these events in a local context list against a key, and every time this self loop is executed an element is appended to the list. Then, after a certain expiry period of, say, 1 hour, this list is added to the global context of the SCXML. How can I achieve this?
Basically I want to aggregate the requests before I trigger a particular action.
<state id="S02" label="REQUEST CREATED">
<onentry>
<action:trigger id="ACC1" name="EXPIRY_EVENT_expiry.sm00007" />
</onentry>
<transition event="expiry.sm00007" target="S03">
<action:trigger id="ACC2" name="TO_DO_SOMETHING" />
// add the local event list to global context
</transition>
<transition event="reqCreated" target="S02" >
// keep adding the event to local context like appending to list
</transition>
</state>
In the SCXML spec, all datamodel variables are global so there's not really a "local" context. But you could use a key to index into a JavaScript object. Something like:
<datamodel>
<data id="globalEventList"/>
<data id="localEventListMap" expr="{}"/>
<data id="localKey" expr="'foo'"/>
</datamodel>
<state id="init">
<onentry>
<script>
localEventListMap[localKey] = [];
</script>
</onentry>
<transition target="S02"/>
</state>
<state id="S02" label="REQUEST CREATED">
<onentry>
<action:trigger id="ACC1" name="EXPIRY_EVENT_expiry.sm00007" />
</onentry>
<transition event="expiry.sm00007" target="S03">
<action:trigger id="ACC2" name="TO_DO_SOMETHING" />
<script>
// add the local event list to global context
globalEventList = localEventListMap[localKey];
</script>
</transition>
<transition event="reqCreated" target="S02" >
<script>
// keep adding the event to local context like appending to list
localEventListMap[localKey].push(_event);
</script>
</transition>
</state>
