not approved

Automated Cyber Threat Intelligence

$40,000.00 Requested
Community Review Results (1 reviewer)
Addresses Challenge
Feasibility
Auditability
Problem:

<p>The Cardano ecosystem is producing massive volumes of data; we need a scalable framework to correlate this data into real-time threat intelligence.</p>

Yes Votes:
₳ 27,717,245
No Votes:
₳ 38,366,019
Votes Cast:
251

Detailed Plan

<u>Summary</u>

The Kafka ecosystem provides the components to correlate massive volumes of data in real time, giving situational awareness across the Cardano ecosystem and finding every needle in the haystack. Contextually rich data reduces false positives:

1. Collect all events from data sources with Kafka Connect

2. Filter event streams with Kafka Connect's Single Message Transforms (SMTs) so that only relevant data reaches the Kafka topic

3. Empower real-time streaming applications with Kafka Streams or ksqlDB to correlate events across various source interfaces

4. Forward priority events to other systems such as the SIEM/SOAR with Kafka Connect or any other Kafka client (Java, C, C++, .NET, Go, JavaScript, HTTP via REST Proxy, etc.)
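Steps 1 and 2 above can be sketched as a single Kafka Connect source connector configuration that applies the built-in `Filter` SMT under a predicate, so unwanted records never reach the topic. The connector class is Confluent's syslog source; the connector name, port, predicate name, and topic pattern are illustrative assumptions, not a tested deployment:

```json
{
  "name": "cardano-syslog-source",
  "config": {
    "connector.class": "io.confluent.connect.syslog.SyslogSourceConnector",
    "syslog.listener": "UDP",
    "syslog.port": "5454",
    "transforms": "dropNoise",
    "transforms.dropNoise.type": "org.apache.kafka.connect.transforms.Filter",
    "transforms.dropNoise.predicate": "isNoise",
    "predicates": "isNoise",
    "predicates.isNoise.type": "org.apache.kafka.connect.transforms.predicates.TopicNameMatches",
    "predicates.isNoise.pattern": "noise-.*"
  }
}
```

The `Filter` transform drops any record its predicate matches, so filtering happens before the data is ever stored, rather than in every downstream consumer.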

These can be further split up into the following categories:

Data Producers: events come from various sources, including real-time syslogs, batch systems, and network logs / pcap captures

Data Normalization and Enrichment: data correlation in real time at scale. This includes normalization and processing such as filtering, aggregation, and transformation. Data governance concepts for enforcing data structures and ensuring data quality are crucial on both the client side and the server side

Data Consumers: workloads require real-time data correlation to detect anomalies or even prevent threats as early as possible. Kafka Streams and ksqlDB provide out-of-the-box stream processing capabilities.
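The consumer-side correlation described above can be sketched in ksqlDB as a stream-table join: raw events are enriched against a threat-intelligence table, and matches flow to a priority topic for the SIEM/SOAR. The stream, table, topic, and column names are assumptions for illustration:

```sql
-- Raw events landed by Kafka Connect (illustrative schema)
CREATE STREAM auth_events (src_ip VARCHAR, node VARCHAR, action VARCHAR)
  WITH (KAFKA_TOPIC='cardano-auth', VALUE_FORMAT='JSON');

-- Enrichment table of known-bad addresses
CREATE TABLE threat_intel (src_ip VARCHAR PRIMARY KEY, severity VARCHAR)
  WITH (KAFKA_TOPIC='threat-intel', VALUE_FORMAT='JSON');

-- Correlate in real time; matches land in a topic the SIEM/SOAR can consume
CREATE STREAM priority_alerts AS
  SELECT e.src_ip, e.node, e.action, t.severity
  FROM auth_events e
  JOIN threat_intel t ON e.src_ip = t.src_ip
  EMIT CHANGES;
```

The persistent query keeps running as new events arrive, so enrichment happens continuously rather than in periodic batch jobs.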

<u>High Level Solution Design</u>

TBC

<u>Threat Model</u>

Sigma is a generic, open signature format that allows you to describe relevant log events. The goal is to provide a structured form in which Sigma rules can describe Cardano detection methods and be shared with other SPOs and the community. More about this project can be found here: <https://github.com/SigmaHQ/sigma>
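A hypothetical rule in the standard Sigma format gives a feel for what a shared Cardano detection could look like. The log source and the matched event value are illustrative assumptions, not an agreed community rule:

```yaml
title: Repeated Peer Suspensions on a Cardano Relay
status: experimental
description: >
  Flags cardano-node log events where the error policy suspends a peer,
  a possible sign of a misbehaving or hostile connection.
logsource:
  product: cardano-node
  service: syslog
detection:
  selection:
    event: 'ErrorPolicySuspendPeer'
  condition: selection
level: medium
```

Because Sigma is backend-agnostic, the same rule could be converted for a SIEM query or, in this architecture, into a filter over the relevant Kafka topic.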

<u>Roadmap</u>

TBC

<u>Working Example</u>

Analyse Zeek IDS data with ksqlDB running on Confluent Platform

<https://github.com/berthayes/cp-zeek>
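Following the cp-zeek pattern above, Zeek conn.log records streamed into Kafka can be queried continuously in ksqlDB. The topic name, the flattened field names, and the port-scan threshold below are illustrative assumptions rather than the repository's exact setup:

```sql
-- Zeek conn.log records in a Kafka topic (fields flattened for readability)
CREATE STREAM zeek_conn (src_ip VARCHAR, dst_ip VARCHAR, dst_port INT, proto VARCHAR)
  WITH (KAFKA_TOPIC='conn', VALUE_FORMAT='JSON');

-- Surface potential port scans: one source touching many distinct ports per minute
SELECT src_ip, COUNT_DISTINCT(dst_port) AS ports_touched
FROM zeek_conn
WINDOW TUMBLING (SIZE 1 MINUTE)
GROUP BY src_ip
HAVING COUNT_DISTINCT(dst_port) > 50
EMIT CHANGES;
```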

<u>Measuring Success</u>

Success means achieving situational awareness across SPOs at levels beyond raw network events, covering all environments: application data, logs, people, and processes.

<u>The Team</u>

OSO was founded in 2018 by two engineers who have worked on some of the largest distributed, cloud-based systems in Europe. Now a global, fully remote team of experts working across technologies and building tooling, we're passionate about the Cardano project and driven by its potential to deliver true change. Our Cardano Ansible role has helped countless SPOs get up and running in AWS: https://github.com/osodevops/ansible-role-cardano-node

<u>Notes</u>

High level solution design diagrams and roadmap will be uploaded in the coming days.
