Hello Sai,

I'm going to paraphrase what I think your use case is first; let me know if I've got it wrong. You want to keep track of the number of logs coming in and, every hour, record how many arrived during that hour. Currently NiFi doesn't handle this type of "stateful" event processing very well, so with what it offers out of the box you are quite limited.

That said, I've done some work to help move NiFi into the "stateful" event processing space that may help you. I currently have an open PR[1] to add state to UpdateAttribute. This lets you keep stateful values (like a count) and even use UpdateAttribute as a stateful rule engine (via its Advanced Tab).

So to solve your use case, you can set up one stateful UpdateAttribute along your main flow that counts all your incoming FlowFiles. Then add a GenerateFlowFile processor running on an hourly cron schedule and route it to the stateful UpdateAttribute to act as a trigger. When the stateful UpdateAttribute is triggered, it adds the count as an attribute of the triggering FlowFile and resets the count. Then just put a RouteOnAttribute after the stateful UpdateAttribute to separate the triggering FlowFile from the incoming data and send it to Elasticsearch.
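
To make that concrete, here's a rough conceptual sketch (plain Python, not actual NiFi configuration or the PR's syntax) of what the stateful UpdateAttribute would be doing. The attribute names "trigger" and "hourly.count" are just placeholders I made up for illustration:

state = {"count": 0}  # per-node state kept by the processor

def on_flowfile(attributes):
    # attributes: dict of the incoming FlowFile's attributes
    if attributes.get("trigger") == "true":
        # FlowFile from the hourly GenerateFlowFile: stamp the count and reset
        attributes["hourly.count"] = str(state["count"])
        state["count"] = 0
    else:
        # ordinary data FlowFile: just count it and pass it through
        state["count"] += 1
    return attributes

# Downstream, RouteOnAttribute can match something like
# ${hourly.count:isEmpty():not()} to pull the triggering FlowFile out of the
# main flow and send only that one on to Elasticsearch.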

That may not have been the best explanation; if it isn't clear, I can create a template and take screenshots tomorrow if you're interested. One thing to keep in mind, though: the stateful processing in this PR only works with local state. So no tracking counts across a whole cluster, just per node.

[1] https://github.com/apache/nifi/pull/319

Joe
 
- - - - - - 
Joseph Percivall
linkedin.com/in/Percivall
e: joepercivall@yahoo.com



On Wednesday, November 9, 2016 11:41 AM, "Peddy, Sai" <Sai.Peddy@capitalone.com> wrote:


Hi All,
 
I previously posted this in the Dev listserv; I'm moving it over to the Users listserv.
 
I'm currently working on a use case to track the number of individual logs that come in and put that information into Elasticsearch. I wanted to see if there is an easy way to do this and whether anyone has any good ideas.
 
The approach I am currently considering: route the incoming log files through SplitText and RouteText processors to make sure no empty logs get through and to get the individual log count when a file contains multiple logs. At the end of this, the total number of logs is visible in the UI queue, where it displays the queue count, but this information is not readily available to any processor. My current thought is that I can use an ExecuteScript processor to keep track of the count in a local file and insert the document into Elasticsearch hourly.
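
Roughly, the ExecuteScript idea above might look like the following Jython sketch (the counter file path and the "log.count" attribute name are placeholders, and a real version would need to guard against concurrent threads writing the same file):

import os

COUNT_FILE = "/var/nifi/log_count.txt"  # placeholder local path

flowFile = session.get()
if flowFile is not None:
    # read the running count, defaulting to 0 if the file doesn't exist yet
    count = 0
    if os.path.exists(COUNT_FILE):
        with open(COUNT_FILE) as f:
            count = int(f.read().strip() or 0)
    count += 1
    with open(COUNT_FILE, "w") as f:
        f.write(str(count))
    # expose the running total as an attribute for downstream processors
    flowFile = session.putAttribute(flowFile, "log.count", str(count))
    session.transfer(flowFile, REL_SUCCESS)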
 
Any advice would be appreciated
 
Thanks,
Sai Peddy
