# googleCloudStorageR NEWS
* Fixes for `gcs_auth()` and `gcs_load_all()` (#112) - thanks @jasonmhoule
* Configure the upload limit via `option(googleCloudStorageR.upload_limit)` or `gcs_upload_set_limit()` - default is 5000000L or 5MB (#120)
* Allow `gcs_save_all()` to also accept bucket level ACL (#129)
* Fix `gcs_list_buckets()` (#100)
* Add `gcs_setup()` to help first time setup
* Fix `gcs_save_all()` to use new zip for filepaths (Thanks @caewok) #107
* Add `gcs_version_bucket()` - thanks @j450h1 ! (#96)
* `gcs_upload()` will use file extension of name in its temporary file (#91)
* Add `gcs_copy_object()`
* Add `gcs_compose_objects()`
* Migrate `gcs_list_objects()` paging to googleAuthR > 0.7 `gar_api_page()`
* Support `MULTI_REGIONAL`, `REGIONAL`, and `COLDLINE` storage classes
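
The upload limit above can be set either through the option directly or with the new helper; a minimal sketch (the 20MB value is an arbitrary example):

```r
library(googleCloudStorageR)

# set via the option directly...
options(googleCloudStorageR.upload_limit = 20000000L)  # 20MB

# ...or with the helper added in this release, which sets the same option
gcs_upload_set_limit(upload_limit = 20000000L)
```

Files at or below the limit are sent as a simple upload; larger files go through the resumable upload API.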
* Fix bug where `gcs_load` wouldn't work if file name not ".RData"
* Add `gcs_first` and `gcs_last` to autosave your file workspace to GCS
* Add `gcs_save_all` and `gcs_load_all`, which will zip, save/load and upload/download a directory
* Add a `_gcssave.yaml` file to control `gcs_first`/`gcs_last` behaviour
* `gcs_object_metaname` no longer requires `name` so it can be reused (#56 - thanks seandavi)
* Support `gs://` style URLs for object names (#57 - thanks seandavi)
* Fix `gcs_get_bucket()` to only expect length 1 character vectors for bucket name (#60)
* Updates to `gcs_object_list` (#58 - thanks @G3rtjan)
* Add `saveToDisk` option to `gcs_load` (#52 - thanks @tomsing1)
* Updates to `gcs_get_object()` (#63 - thanks @nkeriks)
* Add `prefix` and `delimiter` in `gcs_object_list` to filter objects listed (#68)
* `gcs_get_object` now supports downloads over 2GB (#69)
* Updates to `gcs_upload`
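
The `prefix` and `delimiter` arguments can be combined to emulate folder-style listing; a sketch assuming an existing bucket named `my-bucket`:

```r
library(googleCloudStorageR)

# only objects whose names start with "reports/"
objs <- gcs_object_list("my-bucket", prefix = "reports/")

# adding delimiter = "/" stops the listing recursing into
# "sub-folders" below the prefix
top_level <- gcs_object_list("my-bucket",
                             prefix = "reports/",
                             delimiter = "/")
```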
* Add `gcs_save` to store R session data in the cloud
* Add `gcs_load` to restore session data stored with `gcs_save`
* No longer need `options(googleAuthR.rawResponse = TRUE)` when using `gcs_get_object`
* Updates to `object_name` handling in `gcs_get_object` etc.
* Add `gcs_global_bucket` and `gcs_get_global_bucket` to set global bucket name
* Add metadata support to `gcs_upload` via `gcs_metadata_object`
* Update `gcs_upload` to allow uploads over 5MB, limit now 5TB
* Add `gcs_retry_upload`
* Add `gcs_delete_object`
* Add `gcs_source` to source .R files from GCS
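
Taken together, the session helpers above can be used like this; a sketch assuming an existing bucket `my-bucket` and a script `analysis.R` already uploaded to it:

```r
library(googleCloudStorageR)
gcs_global_bucket("my-bucket")            # default bucket for all calls

results <- data.frame(x = 1:3)
gcs_save(results, file = "session.RData")  # save R objects to GCS
rm(results)
gcs_load("session.RData")                  # restore them in a later session

gcs_source("analysis.R")                   # source an .R file straight from GCS
```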