
Commit

Update documentation for Importing a single object for local blockstore (#7069)
Isan-Rivkin authored Nov 28, 2023
1 parent c9703a3 commit 9bd29a8
Showing 2 changed files with 2 additions and 2 deletions.
docs/howto/deploy/onprem.md: 1 addition, 1 deletion
@@ -227,7 +227,7 @@ blockstore:
- Using a local adapter on a shared location is relatively new and not battle-tested yet
- lakeFS doesn't control the way a shared location is managed across machines
- Import works only for folders
- When using lakectl or the lakeFS UI, you can currently import only directories. If you need to import a single file, use the [HTTP API](https://docs.lakefs.io/reference/api.html#/import/importStart) or API Clients with `type=object` in the request body and `destination=<full-path-to-file>`.
- Garbage collection (for committed and uncommitted data) and the lakeFS Hadoop FileSystem are currently unsupported

{% include_relative includes/setup.md %}
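As a quick illustration of the single-object import that the added line above describes, here is a minimal sketch of an importStart call with curl. The endpoint path, the `paths`/`commit` body layout, the `local://` source format, and the repository, branch, and file names below are assumptions for this example; the linked API reference is authoritative for the exact schema.

```shell
# Hypothetical single-object import via the importStart API (sketch only).
# Replace the endpoint, credentials, repository, branch, and paths with your own,
# and verify the request body against the importStart API reference linked above.
curl -u "$LAKEFS_ACCESS_KEY_ID:$LAKEFS_SECRET_ACCESS_KEY" \
  -X POST "https://lakefs.example.com/api/v1/repositories/example-repo/branches/main/import" \
  -H "Content-Type: application/json" \
  -d '{
        "paths": [
          {
            "type": "object",
            "path": "local://shared-storage/reports/2023-11.csv",
            "destination": "datasets/reports/2023-11.csv"
          }
        ],
        "commit": {
          "message": "Import a single object from the local blockstore"
        }
      }'
```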
docs/howto/import.md: 1 addition, 1 deletion
@@ -73,7 +73,7 @@ lakectl import \
1. The import duration depends on the number of imported objects; as a rough estimate, expect a few thousand objects per second.
1. For security reasons, if you are using lakeFS on top of your local disk (`blockstore.type=local`), you need to enable the import feature explicitly.
To do so, set the `blockstore.local.import_enabled` to `true` and specify the allowed import paths in `blockstore.local.allowed_external_prefixes` (see [configuration reference]({% link reference/configuration.md %})).
Presently, local import is allowed only for directories, and not single objects.
When using lakectl or the lakeFS UI, you can currently import only directories locally. If you need to import a single file, use the [HTTP API](https://docs.lakefs.io/reference/api.html#/import/importStart) or API Clients with `type=object` in the request body and `destination=<full-path-to-file>`.
1. Changes made to data in the original bucket will not be reflected in lakeFS and may cause inconsistencies.

## Examples
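For context on the note above about enabling local import, here is a minimal sketch of how the two settings might appear in the lakeFS configuration file. The key names come from the changed documentation; the allowed prefix value is a placeholder, and the configuration reference linked above is authoritative for the exact layout.

```yaml
# Hypothetical configuration fragment (sketch only). The key names are taken
# from the note above; the allowed prefix value is a placeholder for this example.
blockstore:
  type: local
  local:
    import_enabled: true
    allowed_external_prefixes:
      - /mnt/shared-data
```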
