
Redis

Redis source connector

Description

Used to read data from Redis.

Key features

Options

| name                | type   | required              | default value |
|---------------------|--------|-----------------------|---------------|
| host                | string | yes                   | -             |
| port                | int    | yes                   | -             |
| keys                | string | yes                   | -             |
| batch_size          | int    | yes                   | 10            |
| data_type           | string | yes                   | -             |
| user                | string | no                    | -             |
| auth                | string | no                    | -             |
| db_num              | int    | no                    | 0             |
| mode                | string | no                    | single        |
| hash_key_parse_mode | string | no                    | all           |
| nodes               | list   | yes when mode=cluster | -             |
| schema              | config | yes when format=json  | -             |
| format              | string | no                    | json          |
| common-options      |        | no                    | -             |

host [string]

redis host

port [int]

redis port

hash_key_parse_mode [string]

hash key parse mode, supports all and kv; it tells the connector how to parse the value of a hash key.

When set to all, the connector treats the whole value of the hash key as one row and parses it with the schema config. When set to kv, the connector treats each key-value pair inside the hash key as one row and parses it with the schema config:

For example, if the value of the hash key is as shown below:

{
  "001": {
    "name": "tyrantlucifer",
    "age": 26
  },
  "002": {
    "name": "Zongwen",
    "age": 26
  }
}

If hash_key_parse_mode is all and the schema config is as shown below, the connector will generate the following data:

schema {
  fields {
    001 {
      name = string
      age = int
    }
    002 {
      name = string
      age = int
    }
  }
}

| 001                             | 002                       |
|---------------------------------|---------------------------|
| Row(name=tyrantlucifer, age=26) | Row(name=Zongwen, age=26) |

If hash_key_parse_mode is kv and the schema config is as shown below, the connector will generate the following data:

schema {
  fields {
    hash_key = string
    name = string
    age = int
  }
}

| hash_key | name          | age |
|----------|---------------|-----|
| 001      | tyrantlucifer | 26  |
| 002      | Zongwen       | 26  |

Each key-value pair in the hash key is treated as a row and sent downstream.

Tips: the connector uses the first field of the schema config as the field name that holds the key of each key-value pair.
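
For reference, a minimal source sketch that reads hash keys in kv mode and reuses the kv schema above might look like the following; the host, port, and keys pattern are placeholder values, not part of the official example:

Redis {
  host = localhost          # placeholder host
  port = 6379
  keys = "user_hash*"       # placeholder pattern matching hash keys
  data_type = hash
  hash_key_parse_mode = kv
  format = json
  schema {
    fields {
      hash_key = string     # first field receives the key of each key-value pair
      name = string
      age = int
    }
  }
}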

keys [string]

keys pattern

batch_size [int]

indicates the number of keys to attempt to return per iteration; default is 10

Tips: the Redis source connector supports fuzzy key matching; the user needs to ensure that all matched keys are of the same type
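
As an illustration of the two options above (host, port, and pattern are placeholders), a fuzzy key pattern combined with a larger scan batch could be configured like this:

Redis {
  host = localhost      # placeholder host
  port = 6379
  keys = "key_test*"    # fuzzy pattern; all matched keys must be of the same type
  batch_size = 500      # scan up to 500 keys per iteration instead of the default 10
  data_type = key
  format = text
}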

data_type [string]

redis data types, supports key, hash, list, set and zset (a config sketch follows the list below)

  • key

The value of each key will be sent downstream as a single row of data. For example, if the value of a key is SeaTunnel test message, the data received downstream is SeaTunnel test message, and only one message will be received.

  • hash

The hash key-value pairs will be formatted as JSON and sent downstream as a single row of data. For example, if the value of the hash is name:tyrantlucifer age:26, the data received downstream is {"name":"tyrantlucifer", "age":"26"}, and only one message will be received.

  • list

Each element in the list will be sent downstream as a single row of data. For example, if the value of the list is [tyrantlucifer, CalvinKirs], the data received downstream are tyrantlucifer and CalvinKirs, and only two messages will be received.

  • set

Each element in the set will be sent downstream as a single row of data. For example, if the value of the set is [tyrantlucifer, CalvinKirs], the data received downstream are tyrantlucifer and CalvinKirs, and only two messages will be received.

  • zset

Each element in the sorted set will be sent downstream as a single row of data. For example, if the value of the sorted set is [tyrantlucifer, CalvinKirs], the data received downstream are tyrantlucifer and CalvinKirs, and only two messages will be received.
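
As referenced above, a hedged sketch that reads the list example could look like this; host, port, and the keys pattern are placeholders:

Redis {
  host = localhost      # placeholder host
  port = 6379
  keys = "list_test*"   # placeholder pattern matching list keys
  data_type = list      # each list element becomes one row
  format = text         # elements are passed through unchanged in a single content field
}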

user [string]

redis authentication user; you need it when you connect to an encrypted cluster

auth [string]

redis authentication password; you need it when you connect to an encrypted cluster

db_num [int]

Redis database index ID. It connects to db 0 by default.
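
Putting the three authentication-related options together, a sketch with placeholder credentials might look like this:

Redis {
  host = localhost      # placeholder host
  port = 6379
  user = "default"      # placeholder user for an ACL-enabled Redis
  auth = "mypassword"   # placeholder password
  db_num = 1            # read from database 1 instead of the default 0
  keys = "key_test*"
  data_type = key
  format = text
}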

mode [string]

redis mode, single or cluster; default is single

nodes [list]

redis nodes information, used in cluster mode. It must be in the following format:

["host1:port1", "host2:port2"]

format [string]

the format of upstream data; currently only json and text are supported. Default is json.

When format is json, you should also set the schema option. For example:

If the upstream data is the following:

{"code":  200, "data":  "get success", "success":  true}

you should define the schema as follows:

schema {
  fields {
    code = int
    data = string
    success = boolean
  }
}

The connector will generate data as follows:

| code | data        | success |
|------|-------------|---------|
| 200  | get success | true    |

When format is text, the connector does not parse the upstream data. For example:

If the upstream data is the following:

{"code":  200, "data":  "get success", "success":  true}

the connector will generate data as follows:

| content                                                |
|--------------------------------------------------------|
| {"code": 200, "data": "get success", "success": true}  |

schema [config]

fields [config]

the schema fields of redis data

common options

Source plugin common parameters, please refer to Source Common Options for details

Example

simple:

Redis {
  host = localhost
  port = 6379
  keys = "key_test*"
  data_type = key
  format = text
}

Redis {
  host = localhost
  port = 6379
  keys = "key_test*"
  data_type = key
  format = json
  schema {
    fields {
      name = string
      age = int
    }
  }
}

Read string type keys and write (append) them to a list:

source {
  Redis {
    host = "redis-e2e"
    port = 6379
    auth = "U2VhVHVubmVs"
    keys = "string_test*"
    data_type = string
    batch_size = 33
  }
}

sink {
  Redis {
    host = "redis-e2e"
    port = 6379
    auth = "U2VhVHVubmVs"
    key = "string_test_list"
    data_type = list
    batch_size = 33
  }
}

Changelog

2.2.0-beta 2022-09-26

  • Add Redis Source Connector

next version

  • [Improve] Support redis cluster mode connection and user authentication 3188
  • [Bug] Redis scan command supports versions 5, 6, 7 7666