
Utkarsh/issue 186 #189

Merged

Conversation

Contributor

@Sigma-Verma Sigma-Verma commented Sep 23, 2024

Made cache removal safer by using a lock to avoid repeated checks

Pull Request

Description

Enhanced thread safety by consolidating cache removal logic within a lock, preventing redundant iterations and ensuring consistent access to shared data.
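A minimal sketch of what that consolidated logic could look like (assumptions: the names `cache_lock`, `last_updated`, and `response` match the diff excerpts quoted later in this thread, but the one-hour expiry window and the function signature are illustrative, not taken from the PR):

```python
import threading
from datetime import datetime, timedelta, timezone

cache_lock = threading.Lock()
last_updated: dict = {}  # route -> datetime of last refresh
response: dict = {}      # route -> cached payload

def remove_old_cache(max_age: timedelta = timedelta(hours=1)) -> None:
    """Drop cache entries older than max_age.

    Holding the lock for the whole scan-and-delete prevents the
    'dictionary changed size during iteration' RuntimeError that
    occurs when another thread mutates the dict mid-iteration.
    """
    now = datetime.now(tz=timezone.utc)
    with cache_lock:
        # Collect expired keys first, then delete, so we never
        # mutate the dict while iterating over it.
        keys_to_remove = [
            key for key, updated in last_updated.items()
            if now - updated > max_age
        ]
        for key in keys_to_remove:
            last_updated.pop(key, None)
            response.pop(key, None)
```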

Checklist:

  • My code follows OCF's coding style guidelines
  • I have performed a self-review of my own code
  • I have checked my code and corrected any misspellings

Made cache removal safer by using a lock to avoid repeated checks

sentry-io bot commented Sep 23, 2024

🔍 Existing Issues For Review

Your pull request is modifying functions with the following pre-existing issues:

📄 File: pv_site_api/cache.py

Function: remove_old_cache
  • RuntimeError: dictionary changed size during iteration /sites/{site_uuid}/pv_fo... (Event Count: 4)
  • RuntimeError: dictionary changed size during iteration /sites/pv_... (Event Count: 1)
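The underlying failure is easy to reproduce in isolation: mutating a dict while iterating over it raises exactly this RuntimeError in Python 3, while iterating over a snapshot avoids it.

```python
cache = {"a": 1, "b": 2, "c": 3}

# Deleting keys from a dict while iterating over that same dict
# raises RuntimeError on the next iteration step.
try:
    for key in cache:
        del cache[key]  # mutates the dict mid-iteration
except RuntimeError as err:
    message = str(err)  # "dictionary changed size during iteration"

# Iterating over a snapshot (list(...) or .copy()) avoids the error,
# because the loop no longer touches the live dict's iterator.
cache = {"a": 1, "b": 2, "c": 3}
for key in list(cache):
    del cache[key]
```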


Contributor

@peterdudfield peterdudfield left a comment


Thanks @Sigma-Verma for this.

Do you mind fixing the linting? See CI.

Thread locking seems a really good way to do it. Do you know how this behaves if two calls to the API are made very close to each other? Does the lock slow down the second one? Perhaps this is the only way to do it.

Perhaps there's a speed-up here: check whether the cache is already locked, and if so, carry on and do nothing.
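One way to sketch that "carry on and do nothing" idea is a non-blocking lock acquire (`try_remove_old_cache` and the `cleanup` parameter are hypothetical names for illustration, not code from this PR):

```python
import threading

cache_lock = threading.Lock()

def try_remove_old_cache(cleanup) -> bool:
    """Attempt cache cleanup without blocking.

    If another request already holds the lock (cleanup is in
    progress), skip the work instead of queuing behind it, so the
    second of two near-simultaneous API calls is not slowed down.
    Returns True if cleanup ran, False if it was skipped.
    """
    if not cache_lock.acquire(blocking=False):
        return False  # someone else is cleaning up; carry on
    try:
        cleanup()
        return True
    finally:
        cache_lock.release()
```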

added old-cache-check for skipping unnecessary processing, improving response time
@Sigma-Verma
Contributor Author

Hey, can you guide me on how to fix the linting?
As for the question about two very close calls: I added a cache-entry removal check. If no entries need to be removed, it skips the locking process entirely.

@peterdudfield
Contributor


See the errors in the GitHub Actions log and fix them manually:
[screenshot: lint errors in CI]
or run make format.

logger.debug(f"Memory is {process.memory_info().rss / 10 ** 6} MB")

return last_updated, response
now = datetime.now(tz=timezone.utc)
Contributor


Is this indented correctly?

@peterdudfield
Copy link
Contributor

Also, do you mind incorporating #199?

keys_to_remove.append(key)

with cache_lock:
    for key, value in last_updated.items():
Contributor


Could you include the .copy() that I added, please?
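For context, the .copy() snapshots the dict before iterating, so a concurrent writer cannot change its size mid-loop. Roughly (the one-hour threshold and the sample entry are assumptions for illustration; only the names last_updated, keys_to_remove, and cache_lock come from the diff):

```python
import threading
from datetime import datetime, timedelta, timezone

cache_lock = threading.Lock()
last_updated = {
    "/sites": datetime.now(tz=timezone.utc) - timedelta(hours=2),
}

now = datetime.now(tz=timezone.utc)
keys_to_remove = []
with cache_lock:
    # .copy() gives the loop a stable snapshot: even if another
    # thread inserts a key mid-iteration, this loop is unaffected.
    for key, value in last_updated.copy().items():
        if now - value > timedelta(hours=1):  # assumed expiry window
            keys_to_remove.append(key)
    for key in keys_to_remove:
        del last_updated[key]
```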

@peterdudfield
Contributor

peterdudfield commented Oct 22, 2024

Thank you for this. Just one small lint change left:

[screenshot: lint error in CI]

@peterdudfield peterdudfield changed the base branch from main to development October 28, 2024 15:08
@peterdudfield peterdudfield merged commit 70eddd5 into openclimatefix:development Oct 28, 2024
1 check failed