Spark context configured for using the Spark Cassandra connector. Relevant configuration parameters are:

spark.cassandra.connection.host
: hostname of a Cassandra contact point

spark.cassandra.connection.port
: port of a Cassandra contact point

spark.cassandra.auth.username
: connection username

spark.cassandra.auth.password
: connection password

Configuration for a Cassandra storage backend and application-specific event serializers. Relevant storage backend configuration parameters are:

eventuate.log.cassandra.keyspace
: keyspace of the event log table

eventuate.log.cassandra.table-prefix
: prefix of the event log table
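The connector parameters above are typically set on the Spark configuration before the context is created. A minimal sketch, assuming a local Spark master and placeholder host, port, and credentials:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Spark Cassandra connector parameters from the list above.
// Host, port, username, and password are placeholders for illustration.
val sparkConf = new SparkConf(true)
  .setAppName("eventuate-spark-adapter")
  .setMaster("local[4]")
  .set("spark.cassandra.connection.host", "127.0.0.1")
  .set("spark.cassandra.connection.port", "9042")
  .set("spark.cassandra.auth.username", "cassandra")
  .set("spark.cassandra.auth.password", "cassandra")

val sparkContext = new SparkContext(sparkConf)
```

The eventuate.log.cassandra.* parameters, by contrast, belong to the storage backend configuration (e.g. an application.conf loaded with Typesafe Config), not to the Spark configuration.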
Exposes an event log with given logId as Spark RDD[DurableEvent]. Events are read concurrently from multiple Cassandra partitions, so events in the RDD are ordered per partition (by DurableEvent.localSequenceNr). Applications that need a total event ordering should sort the resulting RDD with .sortBy(_.localSequenceNr).
Id of the event log.
Sequence number from where to start reading.
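A usage sketch of the method described above, assuming the adapter is constructed from a Spark context and a loaded Typesafe Config, and that it exposes an eventBatch(logId, fromSequenceNr) method returning RDD[DurableEvent]; the log id "example" and the adapter construction details are illustrative assumptions:

```scala
import com.typesafe.config.ConfigFactory

import com.rbmhtechnology.eventuate.DurableEvent
import com.rbmhtechnology.eventuate.adapter.spark.SparkBatchAdapter

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// sparkContext: a SparkContext configured with the Spark Cassandra
// connector parameters; the config carries the eventuate.log.cassandra.*
// storage backend settings and application-specific event serializers.
val adapter = new SparkBatchAdapter(sparkContext: SparkContext, ConfigFactory.load())

// Events ordered per Cassandra partition only (by localSequenceNr).
val events: RDD[DurableEvent] = adapter.eventBatch("example", fromSequenceNr = 1L)

// Total event ordering across the whole RDD.
val ordered: RDD[DurableEvent] = events.sortBy(_.localSequenceNr)
```

Sorting is a full shuffle, so it should only be applied when a total order is actually required by the batch job.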
Eventuate Spark adapter for batch processing events from event logs with a Cassandra storage backend.