Update the Paradox GitHub package information #138
Comments
Perhaps it'd be useful to come up with a way to do this on the user end, to avoid running into stale data for lack of timely regeneration. Perhaps it can be done asynchronously every so often; not sure if that's possible.
This specific part could actually be done on the client (it's a single JSON file provided by Melpa). The original reason why I cached Paradox information on a server is that Paradox displays the number of stars for a repository, and there's no way to get that unless we do one query for each package (which is way too much for the client).
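For a concrete sense of why this is too much for the client: each repository's star count lives behind its own REST endpoint, so getting counts for every package means one HTTP request per package. A minimal sketch, assuming the standard GitHub v3 REST API (the helper names here are hypothetical, not Paradox's actual code):

```python
import json
from urllib.request import urlopen  # real fetch shown for illustration only

API_ROOT = "https://api.github.com/repos"

def star_count_url(repo):
    """Build the REST endpoint for a single "owner/name" repository."""
    return f"{API_ROOT}/{repo}"

def extract_stars(payload):
    """Pull the star count out of a decoded /repos/:owner/:name response."""
    return payload["stargazers_count"]

def fetch_stars(repo):
    """One HTTP request per package - thousands in total for all of Melpa."""
    with urlopen(star_count_url(repo)) as resp:
        return extract_stars(json.load(resp))
```

With thousands of Melpa packages, calling `fetch_stars` in a loop also runs straight into GitHub's rate limits, which is another reason to do it server-side.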
I think that'd be a good idea. For example, I personally don't use the stars functionality, so it's pretty unfortunate that a very useful, basic function is limited by a relatively frivolous feature (star count). If the star counts are a bit outdated it's not that big of an issue, but not being able to see a new package's commit list is pretty annoying.
Agreed, this has been coming up more and more recently, with packages and Paradox saying "not a GitHub repo." Displaying commit logs is one of my favorite (and I would argue most useful) features of Paradox, so it would be great if it were given priority.
Mon Jul 30 19:55:24 BST 2018
Was about to open an issue, but found this one. Questions:
Update: Mon Jul 30 21:12:41 BST 2018
Yes, the process can and should be automated (I think I still have the script for that). It's supposed to be run on a server a few times a day. I used to run it on my PC at the university, but now that I work at a regular company I don't want to use my company computer for that. Obviously the correct thing to do is to run it on something like Heroku or AWS, so it's a little more permanent, but I never got around to it.
For context, the reason we need to run this script is to compile a single file with the star counts of all packages (which takes thousands of REST API requests, and shouldn't be done on the client's computer). Other alternatives are to:
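The compilation step described above (one file aggregating every package's star count) could look roughly like this; `compile_star_file` is a hypothetical name, and `fetch_stars` stands in for the per-repo REST request:

```python
import json

def compile_star_file(repos, fetch_stars, out_path):
    """Aggregate every package's star count into one JSON file.

    `fetch_stars` performs one REST request per repository, which is
    why this belongs on a server run a few times a day rather than on
    the client's computer.
    """
    counts = {repo: fetch_stars(repo) for repo in repos}
    with open(out_path, "w") as f:
        json.dump(counts, f)
    return counts
```

The client then only has to download the single resulting JSON file instead of making thousands of API requests itself.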
Wed Aug 1 12:55:58 BST 2018
The script is responsible for creating all 4 hash tables - including
Wed Aug 1 18:20:05 BST 2018
An extra step to resolve relocated repositories would be to make an HTTP HEAD request to the repo and use the new location (in the hash table) if a 301 is encountered.
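The extra step above boils down to reading the `Location` header off a redirect response. A minimal sketch of the resolution logic (hypothetical function name; the `joaotavora/sly` slug below assumes sly's relocation from `capitaomorte/sly` discussed later in this thread):

```python
from urllib.parse import urlparse

def resolved_repo(status, location):
    """Given the status code and Location header from a HEAD request
    against https://github.com/<owner>/<name>, return the new
    "owner/name" slug, or None if the repo was not relocated.

    301 is what GitHub sends for renamed/transferred repos; 308 is the
    permanent-redirect variant that preserves the request method.
    """
    if status not in (301, 308) or not location:
        return None
    return urlparse(location).path.strip("/")
```

The actual HEAD request could be done with `urllib.request` or `http.client`; only the pure resolution step is shown here.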
Tue Aug 7 18:12:14 BST 2018
Think I've figured it out. The GitHub API also correctly handles redirects for relocated repos. (There is now no need to add the extra step.)
Curl output for `capitaomorte/sly`:
Also, just noticed this. It'll probably take you 2 mins to write that.
Not in a single request but close enough. I have just implemented this for Emir. See emacscollective/emir@021df24 (here on GitHub - I have added some comments).
By the way, I have also updated the recipes for @joaotavora's packages on Melpa.
Tue Aug 7 23:35:10 BST 2018
@tarsius - I see this was done in melpa/melpa@4150455 - 7 repos. Thank you!
Nice to know. Hopefully I can spend some time on that this year. :-D
Thanks for the help @tarsius, I've finally managed to look into it. :-) Unfortunately, it seems that pagination is limited to 100 repositories, so we'd still have to make dozens of queries. While that's a lot better than thousands, I'm not sure it would be a good user experience to try to do that while refreshing the package list.
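One standard way to stay under that 100-item limit is GraphQL aliasing: pack up to 100 repositories into a single query, one aliased `repository` field each, so thousands of packages become a few dozen requests. A hedged sketch of the query construction only (`stargazerCount` is a real field of GitHub's GraphQL v4 schema, but this batching scheme is an illustration, not necessarily what Emir actually does):

```python
def chunked(repos, size=100):
    """Split the package list into batches of at most `size` repos."""
    return [repos[i:i + size] for i in range(0, len(repos), size)]

def build_query(repos):
    """Build one GraphQL query with an aliased `repository` field per
    "owner/name" slug, each selecting only the star count."""
    fields = []
    for i, repo in enumerate(repos):
        owner, name = repo.split("/")
        fields.append(
            f'r{i}: repository(owner: "{owner}", name: "{name}") '
            "{ stargazerCount }"
        )
    return "query { " + " ".join(fields) + " }"
```

Each batch's query would then be POSTed to `https://api.github.com/graphql` with an auth token; the aliases (`r0`, `r1`, ...) let the response be matched back to the package list.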
Actually I have updated this to use GraphQL and to be done asynchronously. The difficult bit is implemented in
Yes, I went through the code and I can see that being done. The problem is that all of this has to happen before the package buffer is rendered, and I can't make the user wait for 42 consecutive requests. I could run the requests async after the package buffer is rendered, but populating the buffer with the star counts after it has already been rendered is going to be a lot of work. :-P
There's a middle ground. Instead of periodically fetching the data on some server and then distributing the information, you could just do the same locally. I don't see a need to suddenly make sure this information is completely up-to-date. So it should be enough to provide a separate "update melpa statistics" command and/or to automatically update periodically and in the background.
Evil has migrated to GitHub, and its repo information is reflected in the Melpa recipes, but one still can't view the commit list in Paradox: it claims Evil is not a GitHub package, because the Paradox data source constructed from the Melpa recipes hasn't been updated for some time.