[FEA] Support Casting WholeGraph Embeddings to PyTorch Tensors and NumPy Arrays in cuGraph Feature Store #4078

Closed
alexbarghi-nv (Member) opened this issue on Jan 4, 2024 · 1 comment
The cuGraph feature store currently throws an exception for the torch and numpy backends when a WholeGraph embedding is passed. It should instead accept a WG embedding and cast it to the appropriate backend, possibly emitting a warning.
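Roughly what I had in mind, as a sketch only: the helper name `_cast_wg_embedding` and the `get_embedding_tensor().get_local_tensor()` accessor below are placeholders for whatever the pylibwholegraph API actually exposes, not the real cuGraph implementation.

```python
import warnings

import numpy as np
import torch


def _cast_wg_embedding(embedding, backend: str):
    """Cast a WholeGraph embedding to the requested backend ('torch' or 'numpy').

    `embedding` is assumed to be a WholeGraph embedding object; the accessor
    used below is a placeholder for whatever pylibwholegraph provides to
    materialize the data as a torch tensor.
    """
    warnings.warn(
        "A WholeGraph embedding was passed to the "
        f"'{backend}' backend; casting it, which copies data out of WholeMemory."
    )

    # Placeholder accessor: pull the embedding contents out as a torch tensor.
    tensor = embedding.get_embedding_tensor().get_local_tensor()

    if backend == "torch":
        return tensor
    if backend == "numpy":
        return tensor.cpu().numpy()
    raise ValueError(f"Unsupported backend: {backend}")
```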

alexbarghi-nv added the feature request (New feature or request) and non-breaking (Non-breaking change) labels on Jan 4, 2024
alexbarghi-nv added this to the 24.04 milestone on Jan 4, 2024
alexbarghi-nv self-assigned this on Jan 4, 2024
alexbarghi-nv (Member, Author) commented:

Not planned; we're changing the way WG is integrated as a feature store.

alexbarghi-nv closed this as not planned on Apr 29, 2024
Projects: none yet
Development: no branches or pull requests
1 participant