download_file() improvements #3331

Open
jay7x opened this issue Jun 28, 2024 · 3 comments


jay7x commented Jun 28, 2024

Use Case

The existing approach of storing a downloaded file in the $destination directory under the basename($source) file name is quite limiting. It becomes especially painful when downloading a bunch of files from a host, or when downloading files that share a name but live in different paths on the same host.
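
For illustration, with the current behaviour both of the calls below store their file under <destination>/<target>/example.txt, so the second call collides with the result of the first (the destination name example_dir is only illustrative):

download_file('/tmp/1/example.txt', 'example_dir', $targets)
download_file('/tmp/2/example.txt', 'example_dir', $targets)  # same basename, same destination -> collision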

Describe the Solution You Would Like

  1. It'd be nice if download_file() accepted an Array as the source. This way we could download a bunch of files with one call. That raises the question of how to download a bunch of files within one transport connection, but that's a separate topic.
  2. Make $destination a dynamic template which can accept the source file name and the host name as parameters. For example:
$res = download_file(['/tmp/1/example.txt', '/tmp/2/example.txt'], 'examples/<%= md5sum($hostname)[1,8] %>/<%= $source_path.regsubst('[/]', '_') %>/<%= shellquote($source_filename) %>.txt', _dynamic_destination => true)

Side notes

  1. That would also solve "download_file shouldn't be destructive" #3198. There is no real reason to wipe the target directory, I guess.
  2. Maybe it's better to implement another function (download_files()) instead of extending download_file()? Though there is no breaking change as long as template processing is requested explicitly in $options. A rough sketch of such a wrapper against the current API is included below.
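
For what it's worth, something close to a download_files() can be sketched today as a small wrapper plan that loops over the sources and derives a per-file destination from the full source path. This is only a sketch against the current download_file() API; the plan name, parameter names, and the path-mangling scheme are all illustrative:

plan mymodule::download_files(
  TargetSpec    $targets,
  Array[String] $sources,
) {
  $sources.each |String $source| {
    # Turn e.g. /tmp/1/example.txt into tmp_1_example.txt so files with the
    # same basename but different paths don't collide in the downloads dir.
    $dest = $source.regsubst('^/', '').regsubst('[/]', '_', 'G')
    download_file($source, $dest, $targets)
  }
}

Note this still opens one connection per file, so it doesn't address the single-connection question from point 1 of the proposal.
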
jay7x added the Feature (New features and improvements) label on Jun 28, 2024

jay7x commented Jun 28, 2024

Regarding downloading multiple files from a host in one SSH session: I've just managed to use sftp -b- with a generated batch file to achieve that. Roughly like this:

run_command("echo \"${batchfile}\" | sftp -b- ${target.user}@{$target.uri}", 'localhost')


dionysius commented Aug 21, 2024

I'd also love to be able to download multiple files in one command. We use oscap to generate a report file and a results file per target, and currently I need to call download_file twice. Wildcard support would also be appreciated.

And since those files are text files in my case, could we also introduce a flag to compress them during transport (if that isn't already handled in some way)?

This brings me to a further idea: can we have a pipe plan function? (Two of them may be needed, one per direction, e.g. pipe_in(...) and pipe_out(...).) This way we could generically run any command on one side and pipe arbitrary data through the connection. The source side could tar multiple files together, gzip them, and send the stream through the pipe, and the receiving side would gunzip and untar it. With one function, a flexible type, amount, and compression of data transport becomes possible.

Example:

$cmd_on_target = "tar zcf - /tmp/somefolder/report*"
$cmd_receive = "tar zxf - -C /srv/tostorethis"
$res = pipe_in($cmd_on_target, $cmd_receive, $targets)

(Consider the case where multiple targets are set; maybe there's a variable we can use in $cmd_receive.)
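
Until something like pipe_in() exists, a rough approximation is sometimes possible with run_command on localhost, assuming the controller can reach the targets over plain ssh. This bypasses Bolt's transport configuration (inventory ports, keys, proxies, etc. are not applied), and the paths are illustrative:

get_targets($targets).each |$t| {
  # Stream a gzipped tarball from each target into a per-target directory on the controller.
  run_command("mkdir -p /srv/tostorethis/${t.name} && ssh ${t.user}@${t.host} 'tar zcf - /tmp/somefolder/report*' | tar zxf - -C /srv/tostorethis/${t.name}", 'localhost')
}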


jay7x commented Aug 27, 2024

Hit this today again. So far I can say the only good use case is when you want to download one file or one directory from a bunch of hosts, once. For anything else it's easier to script rsync or sftp directly.

P.S. As a consequence, I now need a file::move() or file::rename().
