
Change pipeline behaviour to autoexecute upon exit of the with block #1383

Closed
wants to merge 1 commit into from

Conversation

eoghanmurray

A behaviour I very much expected!

I have a similar open pull request for aredis at NoneGG/aredis#168.
The author there suggested that redis-py is the template being followed, so I thought it would be good to open the discussion here as well and see if there are any objections.

The idea is that if you are just setting a few values in a transaction, you'd expect them all to be completed upon exiting the with block (you don't care about the results in this case):

        with r.pipeline() as pipe:
            pipe.set('d', 'val')
            pipe.set('d1', 'val2')
            pipe.set('d1.1', 'val3')

I didn't find any documentation relating to pipelines, but didn't look all that hard.
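For illustration, the proposed behaviour could look roughly like the toy sketch below. This is a hypothetical stand-in, not redis-py's actual Pipeline (the real one buffers commands and sends them to a Redis server in one round trip); the class and attribute names here are made up for the example:

```python
class AutoExecPipeline:
    """Toy stand-in for a pipeline whose __exit__ auto-executes queued commands."""

    def __init__(self):
        self.command_stack = []
        self.last_results = None

    def set(self, key, value):
        # A real pipeline buffers the command for a later batched send.
        self.command_stack.append(('SET', key, value))
        return self

    def execute(self):
        # A real pipeline sends the buffered commands and returns their replies;
        # here we just pretend every SET succeeded.
        results = [True] * len(self.command_stack)
        self.command_stack = []
        self.last_results = results
        return results

    def reset(self):
        self.command_stack = []

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Proposed change: flush any still-queued commands before resetting,
        # but only if the block exited cleanly.
        if exc_type is None and self.command_stack:
            self.execute()
        self.reset()


with AutoExecPipeline() as pipe:
    pipe.set('d', 'val')
    pipe.set('d1', 'val2')
    pipe.set('d1.1', 'val3')
# On exit, the three queued SETs were executed automatically.
```

Note the `exc_type is None` guard: if the block raises, the queued commands are discarded rather than half-applied, which matches the transactional intent.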

@codecov-commenter

Codecov Report

Merging #1383 into master will increase coverage by 0.03%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master    #1383      +/-   ##
==========================================
+ Coverage   85.92%   85.96%   +0.03%     
==========================================
  Files           6        6              
  Lines        2892     2892              
==========================================
+ Hits         2485     2486       +1     
+ Misses        407      406       -1     
Impacted Files Coverage Δ
redis/client.py 86.02% <100.00%> (+0.05%) ⬆️


@andymccurdy
Contributor

Sometimes you do care about the reply values from commands within the pipeline and we need to provide a way to access those values. See the discussion on #730.

@Fogapod

Fogapod commented Aug 23, 2020

Sometimes you do care about the reply values from commands within the pipeline and we need to provide a way to access those values.

You can still get results by calling execute manually.
Calling it one more time in __exit__ won't hurt: https://github.com/andymccurdy/redis-py/blob/b80d423cc531c5db972d35ba72424cf9cf8772ff/redis/client.py#L4109-L4110

@eoghanmurray
Author

eoghanmurray commented Aug 24, 2020

Sometimes you do care about the reply values from commands

Yes. In addition to calling execute manually, I wondered about setting a pipe.results attribute, but I'd say that's unneeded and not very elegant.

@eoghanmurray
Author

eoghanmurray commented Aug 24, 2020

To play devil's advocate, could there be code in the wild written as follows?

with r.pipeline() as pipe:
    pipe.some_command()
    if some_condition:
        pipe.execute()
    else:
        pass  # don't execute

But I'm struggling to come up with what some_condition would be (as you don't get any results/errors until the pipe is executed).

Also presumably they'd be calling pipe.reset() explicitly instead of letting a pass or the lack of an else take care of it?

@github-actions
Contributor

This pull request is marked stale. It will be closed in 30 days if it is not updated.

@github-actions github-actions bot added the Stale label Aug 25, 2021
@eoghanmurray
Author

Prompted by bot ...

Re. the 'access to results' problem, I think @Fogapod is saying that this change request is not preventing you from doing the following:

    with r.pipeline() as pipe:
        pipe.set('d', 'val')
        pipe.set('d1', 'val2')
        pipe.set('d1.1', 'val3')
        pipe.get('d1.1')
        res = pipe.execute()

It does, however, prevent the following:

    with r.pipeline() as pipe:
        pipe.set('d', 'val')
        pipe.set('d1', 'val2')
        pipe.set('d1.1', 'val3')
        pipe.get('d1.1')
    res = pipe.execute()  # would be empty

Is the lack of the latter a concern (or is it even possible, given that __exit__ has already been called by that point)?

If it is, we could store the result in the pipe, such that a later pipe.execute() returns the stored result.
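That result-caching idea could be sketched as follows. This is a hypothetical toy, not actual redis-py code; the `_cached_results` attribute is invented for the example:

```python
class CachingPipeline:
    """Toy pipeline: __exit__ auto-executes and caches the replies,
    so a later execute() call still returns them instead of []."""

    def __init__(self):
        self.command_stack = []
        self._cached_results = None

    def set(self, key, value):
        self.command_stack.append(('SET', key, value))
        return self

    def execute(self):
        if not self.command_stack:
            # Nothing queued: return the replies cached by __exit__, if any.
            return self._cached_results or []
        # Pretend every buffered command succeeded (a real pipeline would
        # send them to the server and collect real replies here).
        results = [True] * len(self.command_stack)
        self.command_stack = []
        self._cached_results = results
        return results

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        if exc_type is None and self.command_stack:
            self.execute()  # caches replies in self._cached_results
        self.command_stack = []


with CachingPipeline() as pipe:
    pipe.set('d', 'val')
    pipe.set('d1', 'val2')
res = pipe.execute()  # no longer empty: returns the cached replies
```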

@github-actions github-actions bot removed the Stale label Sep 9, 2021
@jaklinger

Just a passing visitor, but I noticed this PR as I've also encountered this problem. One light-touch way to satisfy all parties would be to modify redis.utils.pipeline:

i.e. change

@contextmanager
def pipeline(redis_obj):
    p = redis_obj.pipeline()
    yield p
    p.execute()

to

@contextmanager
def pipeline(redis_obj):
    p = redis_obj.pipeline()
    try:
        yield p
    finally:
        p.execute()

and then encourage users who want this behaviour to change their code from:

with r.pipeline() as pipe:
    ...

to

from redis.utils import pipeline
with pipeline(r) as pipe:
    ...

Obviously it's light-touch enough that I don't really need this PR, but perhaps it would be useful for others.

@chayim chayim requested a review from dvora-h September 29, 2022 08:37
@github-actions
Contributor

This pull request is marked stale. It will be closed in 30 days if it is not updated.

@github-actions github-actions bot added the Stale label Sep 30, 2023
@github-actions github-actions bot closed this Oct 31, 2023