You have a big slow piece of Emacs Lisp, and it blocks Emacs for a painful few seconds. You rack your brain: how to fix it? You’ve already tuned the performance to your best ability. You’ve ruled out async.el or shelling out, because your code needs to manipulate the current Emacs state. So you go look up the libraries deferred.el (inspired by jsdeferred) or emacs-aio (inspired by asyncio)…
But wait! These concepts can be way too sophisticated for your use-case, as they were for mine. They take time to understand well enough to debug code that uses them. You need to learn computer-science terms like “iteration deferred object”, “errorback” and “callback” before you can make heads or tails of what’s going on. At some point you’re questioning the meaning of a function call and you realize you’ve spent 20 hours implementing something that ought’ve taken 1 hour.
This package is not so abstract, but it suffices for a use case I suspect is common.
Use-case
Say you’ve carved up your big slow function into piecemeal functions, each with near-imperceptible latency. How will you call those mini-functions? Through a dolist? No, dolists can’t be paused midway through, so that’d block just as long as the whole function. You need some kind of smarter dolist… perhaps something that utilizes run-with-idle-timer in between each step…
Enter this library. It helps you run a series of functions without hanging Emacs, implemented on a chain of timers.
(Technically, it does not put a timer between every function call, but that’s an implementation detail)
Features
- weeds out many edge cases; rolling your own timer-based non-blocking mechanism is laden with gotchas, as I’ve found
- gets out of the way when the user is operating Emacs
- begins instantly, not after a second of idle time or something like that, because sometimes a second is roughly all the time you get
- (For example: the time between when the user switches to the minibuffer, and when the user begins to type. It can be nice to maximize the chance your loop completes the relevant work within that tight time-space.)
- zero delay in-between function calls
- near-zero performance overhead
- easy profiling: check the log buffer to see how long each of your functions take
- easy debugging: check the log buffer to see what each of your functions returned
- (The return value is only used for this purpose, so you have freedom to return whatever you think would be informative.)
- made with 250 lines of code, and no dependencies
Usage
The API consists mainly of asyncloop-run; consult its docstring. You may also find reason to call:
- (asyncloop-remainder LOOP): access the state of the running loop, so you can modify it with setf, push, pop, delete etc.
  - The value is the list of functions left in the loop. The docstring of asyncloop-run can tell you more.
  - If you don’t know object-oriented programming: this is actually an accessor / “place expression”, so modifying its return value affects the place in memory where it got that value, but only if you use generalized setters like setf instead of setq or set. So you can do (setf (asyncloop-remainder LOOP) (list t #'other-function #'other-function-2)) and it really will change what the loop does next.
- (asyncloop-cancel LOOP): cancel the loop
- (asyncloop-pause LOOP): pause the loop
- (asyncloop-resume LOOP): resume the loop
- (asyncloop-log LOOP &rest ARGS): send an extra message to the log buffer
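For orientation, here is a rough sketch of what a call might look like. The function names, the variable and the work they do are invented, and the calling convention is an assumption: each function in the list is assumed to receive the loop object as its sole argument, so that it can call asyncloop-log or touch asyncloop-remainder. Consult the docstring of asyncloop-run for the real signature.

;; Rough sketch: names are invented and the calling convention is an
;; assumption (check the docstring of asyncloop-run).  State lives in
;; an external variable, per the advice under “Anti-wishlist” below.
(defvar my-buffers nil)

(defun my-collect-buffers (loop)
  (setq my-buffers (buffer-list))
  (asyncloop-log loop "Collected the buffer list"))

(defun my-process-one-buffer (loop)
  (let ((buf (pop my-buffers)))
    ;; Re-queue this same function until the list is drained; this
    ;; relies on asyncloop-remainder being modifiable mid-run, as the
    ;; list above describes.
    (when my-buffers
      (push #'my-process-one-buffer (asyncloop-remainder loop)))
    ;; The return value only matters for the log buffer, so return
    ;; something informative.
    (buffer-name buf)))

(asyncloop-run (list #'my-collect-buffers #'my-process-one-buffer))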
Example
For example usage, see the use of “asyncloop-run” inside deianira.el, the package that gave birth to this library.
Hygiene tips
What to include in each mini-function? Consider
- writing a general sanity check and calling it at the top of each function
- writing a cleanup function and calling it whenever you call asyncloop-cancel
- doing the heavy or bug-prone calculations inside some let-bindings, and then the side-effects at the end, so that interruptions are likely to happen before any of the side-effects
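For instance, a mini-function shaped by those tips might look like the following. It is only a rough sketch: the buffer it checks and the work it does are made up, and it assumes the same calling convention as the sketch under Usage (the loop object as sole argument, here ignored).

(require 'cl-lib)

(defvar my-long-line-count nil)

(defun my-count-long-lines (_loop)
  ;; Sanity check at the top.
  (unless (buffer-live-p (get-buffer "*scratch*"))
    (error "Expected buffer is gone"))
  ;; Heavy, side-effect-free work inside a let-binding: an interruption
  ;; here leaves no state half-modified.
  (let ((count (with-current-buffer "*scratch*"
                 (save-excursion
                   (goto-char (point-min))
                   (cl-loop until (eobp)
                            count (> (- (line-end-position)
                                        (line-beginning-position))
                                     79)
                            do (forward-line 1))))))
    ;; Side effects only at the end.
    (setq my-long-line-count count)
    count))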
Anti-wishlist (won’t implement)
- The “argument-passing style” enabled in deferred.el among others, where you can use the return value from each function as input for the next function, sounds elegant but I’ve found it awful for debugging. Perhaps I was missing some tooling?
- I recommend instead the “crude” approach of keeping state in external variables that you can inspect along the way, making it easy to figure out what work a broken chain did or didn’t get done (sketched after this list). This approach also helps clarify the intent of your code because you have to name those variables.
- The upside of ignoring return values is you can return any value you want and it’s printed in the log buffer, making for pleasant debugging.
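To make the second point above concrete, here is a small sketch of the external-variable style (the names and the work are invented): each step deposits its results in a named variable, so if the chain breaks halfway you can inspect with C-h v exactly how far it got.

(require 'seq)

(defvar my-candidates nil)
(defvar my-elisp-files nil)

(defun my-step-1 (_loop)
  ;; Leaves its work in my-candidates, inspectable at any time.
  (setq my-candidates (directory-files user-emacs-directory))
  (length my-candidates))

(defun my-step-2 (_loop)
  ;; Consumes my-candidates, leaves its work in my-elisp-files.
  (setq my-elisp-files
        (seq-filter (lambda (f) (string-suffix-p ".el" f))
                    my-candidates))
  ;; Returned counts land in the log buffer, per the third point above.
  (length my-elisp-files))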
The following snippet shows a way you can write your own solution. When I first wrote asyncloop, I didn’t know about either input-pending-p or while-no-input, so I wrote complicated comparisons involving current-idle-time… please don’t retrace my steps!
Here’s how you do it!
(require 'cl-lib)

;; A queue is just a struct holding a list of functions yet to run.
(cl-defstruct (queue (:constructor queue-create))
  fns)

;; Run the queued functions one by one until none remain.  Returns nil
;; so that `while-no-input' in `resume' only returns non-nil when it
;; was aborted by user input.
(defun eat (queue)
  (funcall (car (queue-fns queue)))
  (pop (queue-fns queue))
  (if (queue-fns queue)
      (eat queue)
    (message "All done"))
  nil)

;; Try to drain the queue.  If user input interrupts it, retry once
;; Emacs has been idle for a second; if the user hits C-g, give up
;; until `resume' is called again by hand.
(defun resume (queue)
  (when (while-no-input (with-local-quit (eat queue)))
    (run-with-idle-timer 1 nil #'resume queue)
    (message "Pausing for a moment")))

;; Background variables, eval once.
(progn
  (setq full-fns (list
                  (lambda () (sit-for .4) (message "foo"))
                  (lambda () (sit-for .4) (message "bar"))
                  (lambda () (sit-for .4) (message "baz"))
                  (lambda () (sit-for .4) (message "zab"))
                  (lambda () (sit-for .4) (message "rab"))
                  (lambda () (sit-for .4) (message "oof"))))
  (setq my-queue (queue-create)))

;; Test by evalling this.  Watch *Messages* and try to interrupt what
;; it's doing via user activity such as moving the text cursor around.
(progn
  (setf (queue-fns my-queue) full-fns)
  (resume my-queue))

;; After a C-g, it stops trying to resume.  Resume from where it
;; stopped by evalling this.
(resume my-queue)
The function eat uses recursion, which is cool and all but risks tripping max-lisp-eval-depth in a very long loop. So let’s edit it to prune the call stack every 100 calls:
(setq recursion-ctr 0)

(defun eat (queue)
  (funcall (car (queue-fns queue)))
  (pop (queue-fns queue))
  (if (queue-fns queue)
      (if (> 100 (cl-incf recursion-ctr))
          ;; Recurse directly for up to 100 calls...
          (eat queue)
        ;; ...then reset the counter and let a short timer continue,
        ;; which unwinds the call stack.
        (setq recursion-ctr 0)
        (run-with-timer .01 nil #'eat queue)
        nil)
    (message "All done")
    nil))
Why not just use the timer like that for every call? That’s what I did at first, and found that the loop takes longer to complete: there’s some dead time between each call. By limiting it to once per 100 calls, we eliminate 99% of that overhead.
Don’t like series of functions? A different method Stefan Monnier mentions in passing:
> Of course, the body of while-no-input can do regular “checkpoints” so as to know where to start next time.
The keyword is checkpoints. In many ways, this is equivalent to having a list of functions, but you may find it simpler to reason about.
Something like this skeleton:
(defvar stage-1-done nil)
(defvar stage-2-done nil)
(defvar stage-3-done nil)

(defun big-work ()
  (while-no-input
    (unless stage-1-done
      (clean the carpet)
      (dress the windows)
      (setq stage-1-done t))
    (unless stage-2-done
      (make the bed)
      (work. work never changes)
      (setq stage-2-done t))
    (unless stage-3-done
      ...)))
Then just re-try calling big-work however often that makes sense, and it’ll eventually reach the last stage.
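One possible way to do that re-trying (the helper names and the one-second interval below are arbitrary choices for illustration) is to attempt the work each time Emacs goes idle, then cancel the timer once the last stage is done:

(defvar my-big-work-timer nil)

(defun my-big-work-and-maybe-stop ()
  (big-work)
  ;; Assumes the elided stage 3 sets stage-3-done when it finishes.
  (when stage-3-done
    (cancel-timer my-big-work-timer)))

;; Attempt the work each time Emacs has been idle for one second.
(setq my-big-work-timer
      (run-with-idle-timer 1 t #'my-big-work-and-maybe-stop))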