I am new to redis and spark-redis.
I have a problem that I'm struggling to understand and solve, so I would greatly appreciate any help:
I am trying to write a dataset with 99 rows to Redis and then read it back. Both the write and the read succeed, but 3 out of the 14 columns in the dataset are simply never written to Redis. What these 3 columns have in common is that they contain a large number of nulls: one column contains only nulls, one contains 93 nulls, and the third contains 43 nulls.
The weird thing is that three other columns also contain nulls but are written to Redis every time: one with 51 nulls, one with 31 nulls, and one with 3 nulls.
This is the code for writing, pretty standard:
```java
dataset.write()
    .format("org.apache.spark.sql.redis")
    .option("table", uniqueName)
    .mode(SaveMode.Overwrite)
    .save();
```
How does spark-redis handle nulls, and what can I do to have all columns written to Redis?
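My guess (I haven't checked the spark-redis source) is that each row is persisted as a Redis hash and null values are simply skipped, since a Redis hash has no way to store a null field. If the reader then infers the schema from the stored hashes, a column that is null in the sampled rows would never show up at all. A minimal plain-Java sketch of that hypothesis; `toHash` is my own illustrative name, not a spark-redis API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class NullFieldSketch {

    // Hypothetical mirror of flattening one row into a Redis hash:
    // fields whose value is null are dropped, so they never reach Redis.
    static Map<String, String> toHash(Map<String, String> row) {
        Map<String, String> hash = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : row.entrySet()) {
            if (e.getValue() != null) {
                hash.put(e.getKey(), e.getValue());
            }
        }
        return hash;
    }

    public static void main(String[] args) {
        Map<String, String> row = new LinkedHashMap<>();
        row.put("id", "42");
        row.put("name", "foo");
        row.put("optional", null); // stands in for a mostly-null column

        Map<String, String> hash = toHash(row);
        // The "optional" field is simply absent from the stored hash.
        System.out.println(hash.containsKey("optional")); // false
    }
}
```

If that is indeed what happens, then replacing nulls with a sentinel before writing (e.g. via Spark's `dataset.na().fill(...)`) or supplying an explicit schema when reading back should make the missing columns reappear, but I'd appreciate confirmation of the actual behavior.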
Thanks in advance,
Iulian