- `save_object()` now uses `httr::write_disk()` to avoid having to load a file into memory. (#158, h/t Arturo Saco)
- Remove usage of `endsWith()` in two places to reduce (implicit) base R dependency. (#147, h/t Huang Pan)
- Bump aws.signature dependency to 0.3.4. (#142, #143, #144)
- Attempt to fix bug introduced in 0.3.4. (#142)
- Update code and documentation to use aws.signature (>=0.3.2) credentials handling.
- `put_object()` and `put_bucket()` now expose explicit `acl` arguments. (#137)
- `get_acl()` and `put_acl()` are now exported. (#137)
- Added a high-level `put_folder()` convenience function for creating an empty pseudo-folder.
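As a sketch of how these interfaces fit together (the bucket name `"mybucket"` and the object keys are placeholders, not names from this package's documentation):

```r
library("aws.s3")

# Create a bucket with an explicit canned ACL, then an empty
# pseudo-folder inside it ("mybucket" is a hypothetical name)
put_bucket("mybucket", acl = "private")
put_folder("reports", bucket = "mybucket")

# Upload a local file under the pseudo-folder, making it public
put_object("results.csv", object = "reports/results.csv",
           bucket = "mybucket", acl = "public-read")

# Inspect the object's ACL afterwards
get_acl(object = "reports/results.csv", bucket = "mybucket")
```

Running this requires configured AWS credentials and a globally unique bucket name.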
- `put_bucket()` now errors if the request is unsuccessful. (#132, h/t Sean Kross)
- Fixed a bug in the internal function `setup_s3_url()` when `region = ""`.
- DESCRIPTION file fix for CRAN.
- CRAN (beta) release. (#126)
- `bucketlist()` gains both an alias, `bucket_list_df()`, and an argument `add_region` to add a region column to the output data frame.
- Exported the `s3sync()` function. (#20)
- `save_object()` now creates a local directory if needed before trying to save. This is useful for object keys containing `/`.
- Some small bug fixes.
- Updated examples and links to API documentation.
- Tweak region checking in `s3HTTP()`.
- Fix reversed argument order in `s3readRDS()` and `s3saveRDS()`.
- Fixed the persistent bug related to `s3readRDS()`. (#59)
- Updated some documentation.
- Mocked up multipart upload functionality within `put_object()`. (#80)
- Use `tempfile()` instead of `rawConnection()` for high-level read/write functions. (#128)
- Allow multiple CommonPrefix values in `get_bucket()`. (#88)
- `get_object()` now returns a pure raw vector (without attributes). (#94)
- `s3sync()` relies on `get_bucket(max = Inf)`. (#20)
- `s3HTTP()` gains a `base_url` argument to (potentially) support S3-compatible storage on non-AWS servers. (#109)
- `s3HTTP()` gains a `dualstack` argument to provide "dual stack" (IPv4 and IPv6) support. (#62)
- Fixed a bug in `get_bucket()` when `max = Inf`. (#127, h/t Liz Macfie)
- Two new functions, `s3read_using()` and `s3write_using()`, provide a generic interface to reading and writing objects from S3 using a specified function. This provides a simple and extensible interface for the import and export of objects (such as data frames) in formats other than those provided by base R. (#125, #99)
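For illustration, these generics pair an S3 object with an arbitrary reader or writer function (the bucket name `"mybucket"` and the object key below are hypothetical):

```r
library("aws.s3")

# Write a data frame to S3 using any serialization function;
# extra arguments are passed through to FUN
s3write_using(mtcars, FUN = write.csv,
              object = "mtcars.csv", bucket = "mybucket")

# Read it back with the matching deserialization function
dat <- s3read_using(FUN = read.csv,
                    object = "mtcars.csv", bucket = "mybucket")
```

The same pattern extends to any format for which a read/write function exists, which is the extensibility the entry above describes.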
- `s3HTTP()` gains a `url_style` argument to control use of "path"-style (new default) versus "virtual"-style URL paths. (#23, #118)
- All functions now produce errors when requests fail rather than returning an object of class "aws_error". (#86)
- `s3save()` gains an `envir` argument. (#115)
- `get_bucket()` now automatically handles pagination based upon the specified number of objects to return. (PR #104, h/t Thierry Onkelinx)
- `get_bucket_df()` now uses an available (but unexported) `as.data.frame.s3_bucket()` method. The resulting data frame always contains character rather than factor columns.
- Further changes to region verification in `s3HTTP()`. (#46, #106, h/t John Ramey)
- `bucketlist()` now returns (in addition to its past behavior of printing) a data frame of buckets.
- New function `get_bucket_df()` returns a data frame of bucket contents; `get_bucket()` continues to return a list. (#102, h/t Dean Attali)
- `s3HTTP()` gains a `check_region` argument (default is `TRUE`). If `TRUE`, attempts are made to verify the bucket's region before performing the operation in order to avoid confusing out-of-region errors. (#46)
- Object keys can now be expressed using "S3URI" syntax, e.g., `object = "s3://bucket_name/object_key"`. In all cases, the bucket name and object key will be extracted from this string (meaning that a bucket does not need to be specified explicitly). (#100, h/t John Ramey)
- Fixed several places where query arguments were incorrectly being passed to the API as object key names, producing errors.
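To illustrate the S3URI syntax, the two calls below address the same object; the names `bucket_name` and `object_key` are taken from the example string above, not real resources:

```r
library("aws.s3")

# Explicit bucket and key arguments
get_object(object = "object_key", bucket = "bucket_name")

# Equivalent S3URI form: the bucket is parsed out of the
# string, so no separate bucket argument is needed
get_object(object = "s3://bucket_name/object_key")
```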
- Update and rename policy-related functions.
- Exported the `get_bucket()` S3 generic and methods.
- Fixed a bug related to the handling of object keys that contained spaces. (#84, #85; h/t Bao Nguyen)
- Fixed a bug related to the handling of object keys that contained atypical characters (e.g., `=`). (#64)
- Added a new function `s3save_image()` to save an entire workspace.
- Added a temporary fix for GitHub installation using the DESCRIPTION `Remotes` field.
- Added function `s3source()` as a convenience function to source an R script directly from S3. (#54)
- Added support for S3 "Acceleration" endpoints, enabling faster cross-region file transfers. (#52)
- `s3save()`, `s3load()`, `s3saveRDS()`, and `s3readRDS()` no longer write to disk, improving performance. (#51)
- Added new functions `s3saveRDS()` and `s3readRDS()`. (h/t Steven Akins, #50)
- Operations on non-default buckets (outside "us-east-1") now infer bucket region from bucket object. Some internals were simplified to better handle this. (h/t Tyler Hunt, #47)
- All functions now use snake case (e.g., `get_object()`). Previously available functions that did not conform to this format have been deprecated; they continue to work, but issue a warning. (#28)
- Separated authenticated and unauthenticated testthat tests, conditional on the presence of AWS keys.
- Numerous documentation fixes and consolidations.
- Dropped XML dependency in favor of xml2. (#40)
- The structure of an object of class "s3_bucket" has changed. It now is simply a list of objects of class "s3_object" and bucket attributes are stored as attributes to the list.
- The order of `bucket` and `object` names was swapped in most object-related functions, and the Bucket name has been added to the object lists returned by `getbucket()`. This means that `bucket` can be omitted when `object` is an object of class "s3_object".
- Initial release.