[ARGUS] Argus flows to Kafka
deneaulp at bc.edu
Wed Sep 11 14:39:19 EDT 2019
On Wed, Sep 11, 2019 at 11:53 AM Carter Bullard <carter at qosient.com> wrote:
> If you’re interested, let’s see if we can’t get something developed in a
> reasonable bit of time. Have radium.1 be a Kafka stream processor, since
> it’s our argus stream processor. So if you want to write to Kafka, use
> radium … if you want to read Kafka, use radium and then all the apps that
> want to get data get it from radium ???
I think it would be really great if radium produced flows to Kafka. There
are others who would also be interested in using something like that.
I'm not sure how much utility there would be in the reverse use case,
consuming data from Kafka into the ra* tools, but you are more tapped into
the flow-based ecosystem and your own customers than I am. Personally, I've
found the ra* tools keep up well with the rate of record creation in Argus.
The only thing Kafka might buy you there is robustness in case of a
disruption of network traffic between control points.
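The producer-side glue Phil describes below (read argus records, emit them as JSON, hand them to a Kafka topic) could be sketched roughly as follows. This is only an illustration, not an existing radium feature: the field list, the sample record, and the "argus-flows" topic name are all made up here, and the actual Kafka send is left as a comment because it needs a live broker (e.g. kafka-python's KafkaProducer).

```python
# Sketch: convert ra's CSV output (ra -c , -s <fields>) into one JSON
# document per flow record, ready to be produced to a Kafka topic.
# Everything Kafka-specific is commented out; only stdlib is used.
import csv
import io
import json

# Illustrative field list -- match whatever you pass to ra with -s.
FIELDS = ["stime", "saddr", "daddr", "proto", "sbytes", "dbytes"]

def ra_csv_to_json(stream):
    """Yield one JSON document per comma-separated ra record."""
    for row in csv.reader(stream):
        rec = dict(zip(FIELDS, row))
        yield json.dumps(rec)

# A fabricated sample record, standing in for live ra output on stdin.
sample = io.StringIO("1568212459.1,10.0.0.1,10.0.0.2,tcp,1200,3400\n")
for doc in ra_csv_to_json(sample):
    # With a real broker this is where the hand-off would happen, e.g.:
    # producer.send("argus-flows", doc.encode())
    print(doc)
```

Once the records are JSON on a topic, Kafka does the buffering and the Elastic side can consume at its own pace, which is exactly the decoupling being asked for.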
> On Sep 11, 2019, at 11:37 AM, Phillip Deneault <deneaulp at bc.edu> wrote:
> Hi Carter,
> It's less about wanting something new, and more about optimizing how to
> get what we already have there.
> Kafka is simply a message queue system. In principle, Argus or radium
> could directly send the logs to a Kafka 'topic' (maybe simply using the
> defined collector id or something) that would be buffered for as long as
> someone wanted their buffers to be. Then other consumers can come along
> and process the data. In my use case, it allows me to run local Kafka
> queues on my sensors, which produce events as fast as they can, while
> letting my consumers (who are far less tolerant of spikes in events) come
> along and pick up those logs as they can. Additionally, I'd like to avoid
> writing an XML log to disk, only to have to tail that log back in (and all
> that comes with that) to convert it into JSON for Elastic. If a tool can
> write directly
> to a Kafka buffer, in any sort of structured format, Kafka can do all the
> heavy lifting of managing the records from there and you can leave
> downstream processing to something else.
> And I'm personally intending to use ELK, but there are tons of
> applications and processors out there that will happily process Kafka
> queues. I know you have been hesitant in the past to support lots and lots
> of output formats and types and I would be too, since it would take away
> from the core flow development, but it would be great for us who like Argus
> and want to load data directly into other tools.
> On Tue, Sep 10, 2019 at 11:20 AM Carter Bullard <carter at qosient.com>
> wrote:
>> Hey Phil,
>> There are a few groups that have done specific ML strategies using
>> streaming argus data, Oak Ridge National Labs has an operational system,
>> Situ, and a number of commercial entities have used argus as a part of
>> their ML offerings, but I’m not sure if they are using Kafka … TensorFlow
>> has always been the most common buzz word with these groups … since
>> TensorFlow and Kafka are a common pair of terms, I think some of these
>> companies are probably doing Kafka streaming and Argus data, but not sure
>> that anyone will tell you ...
>> Is there something that Kafka would need from radium or the
>> argus-clients that isn’t already there .... Is there a specific thing that
>> Kafka wants in its streaming pipeline ??
>> > On Sep 10, 2019, at 8:08 AM, Phillip Deneault <deneaulp at bc.edu> wrote:
>> > Is there an ra* tool, or is anyone aware of a 3rd party tool, that can
>> process argus output directly into Kafka? Final stop would be an ELK
>> database, but using Kafka would be a better middle ground from a
>> performance and maintenance point of view.
>> > Thanks,
>> > Phil
>> > _______________________________________________
>> > argus mailing list
>> > argus at qosient.com
>> > https://pairlist1.pair.net/mailman/listinfo/argus