Replies: 1 comment
Since S3 is part of the official tractusx-edc release, I tried with this (and not Azure Blob Storage) and got S3 -> S3 transfers working. Because it was quite difficult for me to get this up and running (I couldn't find the relevant information), I tried to put together my learnings in the following sequence diagram.

Most important learning: at least for S3, my Q1 from the original post was a misunderstanding. With S3, the consumer provides the required access keys during the transfer process (a sketch of such a request follows below). Still, I think the question remains whether it wouldn't be more efficient to trigger a cloud-to-cloud copy instead of downloading and uploading the file via the provider data plane.

I hope this helps others here; let me know in case you have questions.

--
Matthias Binzer
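To make concrete what "the consumer provides the required access keys during the transfer process" could look like, here is a minimal sketch of a consumer-side transfer request body. The property names (`accessKeyId`, `secretAccessKey`, `bucketName`, ...) and the endpoint values are assumptions for illustration only; please check the current tractusx-edc samples for the exact `AmazonS3` data address schema.

```java
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.Map;

/**
 * Hypothetical consumer-side helper that builds a transfer request body in
 * which the consumer hands its (ideally short-lived) S3 credentials to the
 * provider as part of the dataDestination. Property names and values are
 * illustrative assumptions, not copied from the tractusx-edc schema.
 */
public class S3TransferRequestSketch {

    public static void main(String[] args) throws Exception {
        Map<String, Object> dataDestination = Map.of(
                "type", "AmazonS3",
                "bucketName", "consumer-destination-bucket",
                "region", "eu-central-1",
                // Ideally temporary credentials (e.g. issued via AWS STS)
                // created by the consumer right before the transfer:
                "accessKeyId", "ASIA...",
                "secretAccessKey", "<temporary-secret>"
        );

        Map<String, Object> transferRequest = Map.of(
                "assetId", "example-asset",
                "contractId", "example-contract-agreement-id",
                "connectorAddress", "https://provider-controlplane.example.com/api/v1/dsp",
                "dataDestination", dataDestination
        );

        // This JSON body would be POSTed to the consumer's management API;
        // the exact endpoint path differs between EDC versions, so it is
        // intentionally omitted here.
        System.out.println(new ObjectMapper()
                .writerWithDefaultPrettyPrinter()
                .writeValueAsString(transferRequest));
    }
}
```

The point of the sketch is only the shape of `dataDestination`: the credentials travel with the transfer request, so the provider does not need a pre-shared vault entry for them.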
Because of an internal review, I looked into how blob storage transfers work (at least to my understanding).
This is related to #639 and others.
I have three major questions where I am not sure whether this is the intended approach, or whether I have misunderstood the concept or the code.
Q1: To my understanding, the consumer provides the `keyName` under which the provider does a lookup in its vault to find the appropriate token for uploading to the consumer's blob storage. Is this correct? If yes, shouldn't this be a token, created dynamically or statically on the consumer side, that is sent during the transfer process? This would allow a short lifetime for those tokens, would not require out-of-band token transfers, and would not allow consumers to tell the provider which key to load and send to an external API!
Q2: Also related to Q1, I think there is a minor security question here. Can it happen that the consumer sets someone else's `keyName` and the provider would load this from the vault and send it to the (evil) consumer's blob storage API? Currently I only see the provider-side configuration of the endpoint `"https://%s.blob.core.windows.net"` preventing this (assuming Azure does NOT log the access tokens; after a quick search it seems to me that only hashes of those can be logged). I think this needs some further security investigation IF it is really the case that the provider loads any consumer-given value from the vault and sends it to an external API! Can someone please deny that this is possible :-)
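I am not claiming this is how tractusx-edc handles it, but as an illustration of the kind of check I would expect, here is a sketch that restricts consumer-referenced aliases to a dedicated namespace and pins the destination endpoint to the expected Azure Blob pattern. The prefix and the regex are made up for this example.

```java
import java.util.regex.Pattern;

/**
 * Illustrative mitigation for the concern in Q2 (NOT taken from the
 * tractusx-edc code base): before resolving a consumer-supplied keyName,
 * the provider could restrict the alias to a dedicated namespace and pin
 * the destination endpoint to the expected Azure Blob pattern.
 */
class DestinationValidationSketch {

    // Hypothetical naming convention: only aliases created for transfers
    // may be referenced by consumers.
    private static final String ALLOWED_ALIAS_PREFIX = "transfer-secret-";

    // Mirrors the provider-side endpoint template quoted above; storage
    // account names are 3-24 lowercase letters or digits.
    private static final Pattern ALLOWED_BLOB_ENDPOINT =
            Pattern.compile("https://[a-z0-9]{3,24}\\.blob\\.core\\.windows\\.net");

    static void validate(String keyName, String destinationEndpoint) {
        if (!keyName.startsWith(ALLOWED_ALIAS_PREFIX)) {
            throw new IllegalArgumentException("keyName outside the allowed namespace: " + keyName);
        }
        if (!ALLOWED_BLOB_ENDPOINT.matcher(destinationEndpoint).matches()) {
            throw new IllegalArgumentException("Unexpected destination endpoint: " + destinationEndpoint);
        }
    }
}
```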
Q3: Wouldn't a transfer be much quicker with the described/proposed platform-to-platform transfer?
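For comparison, a platform-to-platform transfer for S3 could be a server-side copy, roughly like the sketch below (AWS SDK for Java v2; the `sourceBucket`/`sourceKey` builder methods require a reasonably recent SDK version, and the bucket names are made up). The catch is that a single principal needs read access to the provider's bucket and write access to the consumer's bucket, so the cross-account permissions would still have to be handled out of band or by the connectors.

```java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CopyObjectRequest;

/**
 * Sketch of a server-side S3 copy (AWS SDK for Java v2). This avoids
 * streaming the object through the provider data plane, but it only works
 * if a single principal has read access to the source bucket and write
 * access to the destination bucket. Bucket names and keys are made up.
 */
public class ServerSideCopySketch {

    public static void main(String[] args) {
        try (S3Client s3 = S3Client.builder().region(Region.EU_CENTRAL_1).build()) {
            s3.copyObject(CopyObjectRequest.builder()
                    .sourceBucket("provider-bucket")
                    .sourceKey("data/example-asset.csv")
                    .destinationBucket("consumer-bucket")
                    .destinationKey("incoming/example-asset.csv")
                    .build());
        }
    }
}
```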
--
Matthias Binzer