
Some columns are not written to Redis #375

Open
iulianm82 opened this issue Jul 14, 2023 · 0 comments
Hi,

I am new to Redis and spark-redis.
I have a problem that I am struggling to understand and solve, so I would greatly appreciate any help:

I am trying to write a dataset with 99 rows to Redis and then read it back. Both the write and the read succeed, but 3 of the dataset's 14 columns are simply not written to Redis. What these 3 columns have in common is that they contain a large number of nulls: one column contains only nulls, one contains 93 nulls, and the third contains 43.
The weird thing is that there are three other columns containing nulls that are written to Redis every time: one column with 51 nulls, one with 31, and one with 3.

This is the write code, which is pretty standard:
dataset.write()
.format("org.apache.spark.sql.redis")
.option("table", uniqueName)
.mode(SaveMode.Overwrite)
.save();
How does spark-redis handle nulls, and what can I do to have all columns written to Redis?
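For context on what I suspect is happening: spark-redis persists each row as a Redis hash, and a hash can only store fields that have a value, so null cells produce no hash field at all. A column that is null in every row (or in the rows sampled for schema inference) then leaves no trace in Redis. Below is a minimal plain-Java sketch of that behavior (no Spark; `rowToHash` is a hypothetical helper standing in for the connector's row serialization, not spark-redis's actual API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class NullHashDemo {
    // Models how a row might be serialized to a Redis hash:
    // only non-null cells become hash fields (HSET has no notion of null).
    static Map<String, String> rowToHash(Map<String, String> row) {
        Map<String, String> hash = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : row.entrySet()) {
            if (e.getValue() != null) {
                hash.put(e.getKey(), e.getValue());
            }
        }
        return hash;
    }

    public static void main(String[] args) {
        Map<String, String> row = new LinkedHashMap<>();
        row.put("id", "1");
        row.put("name", "alice");
        row.put("alwaysNull", null);

        Map<String, String> hash = rowToHash(row);
        // The all-null column has vanished from the stored hash:
        System.out.println(hash.containsKey("alwaysNull")); // false
        System.out.println(hash.size()); // 2
    }
}
```

If that is indeed the cause, one workaround (assuming the standard Spark read path applies) would be to pass the original schema explicitly when reading, e.g. `spark.read().format("org.apache.spark.sql.redis").schema(dataset.schema())...`, so columns with no fields in Redis still come back as null instead of being dropped by schema inference.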

Thanks in advance,
Iulian
