
updates token handling for Azure #234

Open · wants to merge 1 commit into main
Conversation

SokolovAnatoliy

Description of changes

This pull request improves token management for Azure OpenAI. The token is cached in a gptstudio-specific directory, and tokens are refreshed as needed.

@calderonsamuel (Collaborator) commented Oct 31, 2024

Thank you for taking the time to contribute!

Some comments:

  1. This PR doesn't seem to be linked to an issue, so I have no context on what problem it solves.
  2. The new dependency Microsoft365R should be declared in DESCRIPTION. Using require() in the .onLoad() function seems a bit strange to me; we don't do that for any other package, so I would like to know the reasons.
  3. Do you know of a programmatic way to test that this does what is expected?
  4. I'm curious why you didn't use the full PR template.

@JamesHWade (Collaborator)

Tony and I talked about the PR a few days ago (we work together). This improves handling of the Azure token; before, we were generating a token far too often.

@JamesHWade (Collaborator) left a comment:
Looks good! Let me know if you have any questions.

},
error = function(e) NULL
)
token <- retrieve_azure_token_object() %>% suppressMessages()

We no longer have a magrittr dependency, so we're using the base pipe.
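A minimal sketch of the suggested change to the line above, assuming only that `retrieve_azure_token_object()` is the PR's helper: the magrittr pipe `%>%` can be swapped for the native pipe `|>` (R >= 4.1), or dropped entirely.

```r
# With magrittr (adds a dependency):
token <- retrieve_azure_token_object() %>% suppressMessages()

# With the base pipe, no extra dependency:
token <- retrieve_azure_token_object() |> suppressMessages()

# Or simply nested, which works on any R version:
token <- suppressMessages(retrieve_azure_token_object())
```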

@@ -10,6 +10,9 @@
))
}

require("Microsoft365R")

It would be good to refactor this so that Microsoft365R is an optional dependency. Most users won't need it, since most don't use Azure OpenAI.
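One common way to do what this comment suggests, sketched under the assumption that `retrieve_azure_token_object()` is where the package first touches Microsoft365R: list the package under `Suggests:` in DESCRIPTION instead of `Imports:`, and guard its use at the call site rather than in `.onLoad()`.

```r
# Hypothetical sketch, not the PR's actual code.
# DESCRIPTION would list Microsoft365R under Suggests, then:
retrieve_azure_token_object <- function() {
  # Errors with an informative "install Microsoft365R" prompt if missing.
  rlang::check_installed("Microsoft365R",
                         reason = "to authenticate with Azure OpenAI")
  # ... proceed with Microsoft365R calls only after the check passes
}
```

This keeps the dependency invisible to users who never call the Azure code path.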

#' A function that determines the appropriate directory to cache a token
#' @export
gptstudio_cache_directory <- function() {
  rappdirs::user_data_dir(appname = glue::glue("gptstudio"))

Can we switch this to tools::R_user_dir()?
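The suggested swap would look roughly like this; `tools::R_user_dir()` ships with base R (>= 4.0), so it drops both the rappdirs and glue dependencies:

```r
# Sketch of the reviewer's suggestion, not the merged code.
gptstudio_cache_directory <- function() {
  # "data" selects the per-user data directory, mirroring
  # rappdirs::user_data_dir(); "config" and "cache" are also available.
  tools::R_user_dir("gptstudio", which = "data")
}
```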

@@ -110,31 +110,33 @@ query_api_azure_openai <-
retrieve_azure_token <- function() {
rlang::check_installed("AzureRMR")

swap out for Microsoft365R?


invisible(token$token$credentials$access_token)
client <- Microsoft365R:::do_login(tenant = Sys.getenv("AZURE_OPENAI_TENANT_ID"),

CRAN might yell at us for depending on an unexported function (the `:::` call), but I'm not sure what the rules are.


In these situations I would prefer to reverse-engineer the function and keep a local version rather than take on the dependency, but the level of effort required varies between cases.
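A local replacement could be built on the exported AzureAuth API, which Microsoft365R itself wraps, instead of the unexported `do_login()`. This is only a sketch: the function name, the resource URL, and the `AZURE_OPENAI_CLIENT_ID` environment variable are assumptions, not anything from the PR.

```r
# Hypothetical vendored login helper; every name here except
# AzureAuth::get_azure_token() and AZURE_OPENAI_TENANT_ID is assumed.
gptstudio_azure_login <- function(
    tenant = Sys.getenv("AZURE_OPENAI_TENANT_ID")) {
  rlang::check_installed("AzureAuth")
  # Acquire (or silently refresh) a cached OAuth token for the
  # Azure Cognitive Services resource.
  AzureAuth::get_azure_token(
    resource = "https://cognitiveservices.azure.com/",  # assumed resource
    tenant   = tenant,
    app      = Sys.getenv("AZURE_OPENAI_CLIENT_ID")     # assumed env var
  )
}
```

AzureAuth caches tokens on disk and refreshes them when expired, which is the behavior this PR is after.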

@calderonsamuel (Collaborator)

> Tony and I talked about the PR a few days ago (we work together). This does better handling of the Azure token. Before we were generating a token way too often.

Thanks @JamesHWade, I'll move away from this one then.
