Implementing Splunk Data Stream Processor (DSP) (ISDSP)

Course code: ISDSP

This hands-on module provides the fundamentals of deploying a Splunk DSP cluster and designing pipelines for core use cases. It covers installation, source and sink configurations, pipeline design and backup, and monitoring a DSP environment.

This course contains 18 total hours of content.

2 135 EUR

2 583 EUR including VAT

Do you have a question?
+420 731 175 867, edu@edutrainings.cz

Professional and certified lecturers

Internationally recognized certifications

Wide range of technical and soft skills courses

Great customer service

Courses tailored exactly to your needs

Course dates

Starting date: Upon request

Type: In-person/Virtual

Course duration: 4 days

Language: en/cz

Price without VAT: 2 135 EUR

Register


Didn't find a suitable date?

Write to us to arrange an alternative, tailor-made date.

Contact

Course description

  • Introduction to Splunk Data Stream Processor
  • Deploying a DSP cluster
  • Prepping Sources and Sinks
  • Building Pipelines – Basics
  • Building Pipelines – Deep Dive
  • Working with 3rd party Data Feeds
  • Working with Metric Data
  • Monitoring a DSP Environment

Target group

This 4-day module is designed for experienced Splunk administrators who are new to Splunk DSP.

Course structure

Topic 1 – Introduction to DSP

  • Review Splunk deployment options and challenges
  • Describe the purpose and value of Splunk DSP
  • Understand DSP concepts and terminology

Topic 2 – Deploying a DSP Cluster

  • List DSP core components and system requirements
  • Describe installation options and steps
  • Check DSP service status
  • Learn to navigate the DSP UI
  • Use scloud

Topic 3 – Prepping Sources and Sinks

  • Ingest data with the DSP REST API service (see the sketch after this list)
  • Configure DSP source connections for Splunk data
  • Configure DSP sink connections for Splunk indexers
  • Create Splunk-to-Splunk pass-through pipelines
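
To give a flavour of the REST API item above, here is a minimal Python sketch that posts a single test event to a DSP tenant. The host name, port, tenant ("default") and token are placeholders, and the endpoint path follows the DSP 1.x Ingest API and may differ by version; the course covers the exact URL and how to obtain a token with scloud.

  # Minimal sketch: send one event to the DSP Ingest REST API.
  # DSP_HOST, the tenant, port 31000 and the token are placeholders;
  # verify the exact endpoint for your DSP version.
  import requests

  DSP_HOST = "dsp.example.com"              # hypothetical DSP node
  TOKEN = "<bearer-token-from-scloud>"      # obtained via scloud login

  event = {
      "body": "Hello from the Ingest API",
      "sourcetype": "dsp:test",
      "source": "isdsp-lab",
      "attributes": {"index": "main"},
  }

  resp = requests.post(
      f"https://{DSP_HOST}:31000/default/ingest/v1beta2/events",
      headers={"Authorization": f"Bearer {TOKEN}"},
      json=[event],      # the API expects a JSON array of events
      verify=False,      # lab clusters often use self-signed certificates
  )
  resp.raise_for_status()
  print(resp.status_code, resp.text)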

Topic 4 – Building Pipelines – Basics

  • Describe the basic elements of a DSP pipeline
  • Create data pipelines with the DSP canvas and SPL2
  • List DSP pipeline commands
  • Use scalar functions to convert data types and schema
  • Filter and route data to multiple sinks (see the sketch after this list)
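
As a plain-Python analogy (not DSP or SPL2 code), the sketch below mirrors the last two items: a scalar "cast" converts a string field to an integer, and a predicate routes each record to one of two stand-in sinks. In DSP the same logic is built on the canvas or in SPL2 with pipeline commands and scalar functions.

  # Plain-Python analogy of a filter-and-route pipeline step; the two
  # lists stand in for sinks, nothing here is a DSP API.
  records = [
      {"host": "web01", "status": "200"},
      {"host": "web02", "status": "503"},
  ]

  def to_int(value, default=0):
      # Scalar "cast" step: convert a string field to an integer.
      try:
          return int(value)
      except (TypeError, ValueError):
          return default

  errors, normal = [], []                  # two stand-in sinks
  for record in records:
      record["status"] = to_int(record["status"])
      (errors if record["status"] >= 500 else normal).append(record)

  print("error sink:", errors)
  print("normal sink:", normal)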

Topic 5 – Building Pipelines – Deep Dive

  • Manipulate pipeline options (see the sketch after this list):
    Extract
    Transform
    Obfuscate
    Aggregate and conditional trigger
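
For the obfuscation item, the following plain-Python sketch shows the idea: long digit runs in an event body are masked before the record moves downstream. In DSP this would be done with SPL2 scalar functions inside the pipeline; the snippet is only a conceptual illustration.

  # Conceptual illustration of the "obfuscate" step: mask long digit runs
  # in an event body before forwarding it; this is not DSP/SPL2 code.
  import re

  record = {"body": "user=alice card=4111111111111111"}
  record["body"] = re.sub(r"\d{12,19}",
                          lambda m: "*" * len(m.group()),
                          record["body"])
  print(record["body"])   # user=alice card=****************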

Topic 6 – Working with 3rd party Data Feeds

  • Read from and write data to pub-sub systems like Kafka
  • List sources supported by the collect service
  • Transform and normalize data from Kafka (see the sketch after this list)
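
To illustrate the read-and-normalize items, here is a minimal sketch using the kafka-python client. The topic and broker names are placeholders and the field mapping is only an example; in DSP the equivalent is a Kafka source connection feeding an SPL2 pipeline that writes to a sink.

  # Minimal sketch: consume JSON records from Kafka and normalize field
  # names; "web-logs" and "kafka01:9092" are hypothetical placeholders.
  import json
  from kafka import KafkaConsumer

  consumer = KafkaConsumer(
      "web-logs",
      bootstrap_servers=["kafka01:9092"],
      value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
  )

  for message in consumer:
      event = message.value
      normalized = {
          "host": event.get("hostname") or event.get("host"),
          "body": event.get("msg") or event.get("message"),
          "source": f"kafka:{message.topic}",
      }
      print(normalized)   # in DSP this would flow on to the configured sink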

Prerequisites

Required:

  • Splunk Enterprise System Administration (SESA)
  • Splunk Enterprise Data Administration (SEDA)

Recommended:

  • Architecting Splunk Enterprise Deployments (ASED)
  • Working knowledge of:
    Distributed system architectures
    Apache Kafka (user level)
    Apache Flink (user level)
    Kubernetes (admin level)

Do you need advice or a tailor-made course?
