How to capture cURL output to a file?

#11
My favorite is `lwp-download`, which can be found here: <URL>

You can use it like this:

```bash
lwp-download <URL>
```

This will store the file as "latest.tar.gz" in your current directory, so no further options are needed.

#12
Either `curl` or `wget` can be used in this case. All 3 of the commands below do the same thing: they download the file at <URL> and save it locally as "my_file.txt".

_Note that with `curl` I also recommend using the `-L` or `--location` option in order to follow HTTP 302 redirects to the file's new location, if it has moved. `wget` requires no additional option for this, as it follows redirects automatically._

```bash
# Save the file locally as my_file.txt

wget <URL> -O my_file.txt    # my favorite--it has a progress bar
curl -L <URL> -o my_file.txt
curl -L <URL> > my_file.txt
```

Alternatively, to save the file locally under the same name it has remotely, use either `wget` by itself, or `curl` with `-O` or `--remote-name`:
```bash
# Save the file locally as file.txt (the remote filename)

wget <URL>
curl -LO <URL>
curl -L --remote-name <URL>
```

_Notice that the `-O` in all of the commands above is the capital letter "O"._
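As a quick illustration of the lowercase `-o` vs. capital `-O` distinction, here is a small sketch you can run locally (assumptions: `curl` is installed, and a `file://` URL stands in for the hidden link so that no network access is needed; all file names are placeholders):

```bash
# Create a sample "remote" file and point a file:// URL at it.
printf 'demo content\n' > source.txt
url="file://$PWD/source.txt"

# Lowercase -o: save under a name you choose.
curl -s -o renamed.txt "$url"

# Capital -O: keep the remote filename (source.txt). Run it in a
# subdirectory so it does not clobber the original file.
mkdir -p out
(cd out && curl -s -O "$url")

ls renamed.txt out/source.txt
```

Both downloaded copies end up byte-identical to the original.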

The nice thing about `wget` is that it shows a progress bar.

You can prove that the files downloaded by each of the 3 techniques above are exactly identical by comparing their SHA-512 hashes. Running `sha512sum my_file.txt` after each of the commands above and comparing the results reveals that all 3 files have the same hash, meaning they are byte-for-byte identical.
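The hash comparison described above can be sketched like this (the two sample files here are stand-ins for copies downloaded by `wget` and `curl`; `sha512sum` is part of GNU coreutils):

```bash
# Two identical stand-in files, as if the same URL were downloaded twice.
printf 'same bytes\n' > copy_a.txt
printf 'same bytes\n' > copy_b.txt

# sha512sum prints "HASH  FILENAME"; keep only the hash field.
hash_a=$(sha512sum copy_a.txt | awk '{print $1}')
hash_b=$(sha512sum copy_b.txt | awk '{print $1}')

if [ "$hash_a" = "$hash_b" ]; then
  echo "identical"
else
  echo "different"
fi
```

Since the two files contain the same bytes, this prints "identical"; any single-byte difference would change the hash completely.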


## References
1. I learned about the `-L` option with `curl` here: [Is there a way to follow redirects with command line cURL?](<URL>)

<sub>See also: [`wget` command to download a file and save as a different filename](<URL>)</sub>
