Curl Syntax

Mike Hatfield:
I am trying to specify the folder into which a file is downloaded when using curl over FTP.
It uses the current folder by default.
I have trawled all the online doco but I can't find the syntax to specify the local destination.

LET MPROGRAM$="""\hit\pvx\curl.exe""",MDESTINATION$="ftp://ftp.hatfields.com.au/public_html/Future/Texts/"+TT$,MUSER$=%NW_WEB_USER$+ \
             ":"+%NW_WEB_PASSWORD$

LET SS$=MPROGRAM$+" -k --ssl -O """+MDESTINATION$+""" --user """+MUSER$+"""       "
INVOKE WAIT SS$

The above becomes:


"\hit\pvx\curl.exe" -k --ssl -O "ftp://ftp.domain.com.au/public_html/Future/Texts/mytestfile.txt" --user "username:password"

1. What is the syntax to specify the destination on the local computer?
I assume it's possible because *web/ftp can do it.
Unfortunately *web/ftp won't work with the new certificate requirements on the FTP server; you always get an error.
I've had that discussion before, hence using Curl. The -k and --ssl parameters overcome the certificate issue.

2. Is there a syntax to download all files matching a pattern, e.g. *.txt, from the FTP server?
I can't find that either.
I did find ranges, e.g. [100-200].txt, but that won't work in my case because the file names could be random.
I have resorted to retrieving a listing ( ?.txt ) of the FTP server folder and parsing that for the individual files.
 
Thanks

koenv:
-O tells curl to download it to the current directory using the original filename (https://curl.haxx.se/docs/manpage.html#-O). Replace it with "-o /path/to/local/file" (https://curl.haxx.se/docs/manpage.html#-o).
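
For example (untested, and the local folder c:\hit\texts\ is just a placeholder), your command build would become something like:

LET MLOCAL$="c:\hit\texts\"+TT$
LET SS$=MPROGRAM$+" -k --ssl -o """+MLOCAL$+""" --user """+MUSER$+""" """+MDESTINATION$+""""
INVOKE WAIT SS$

which expands to:

"\hit\pvx\curl.exe" -k --ssl -o "c:\hit\texts\mytestfile.txt" --user "username:password" "ftp://ftp.domain.com.au/public_html/Future/Texts/mytestfile.txt"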

For your second question, Google gave me 2 options:
Use -l to get the directory listing first, then use that to get all the files (https://curl.haxx.se/docs/manpage.html#-l); a rough example follows below.
Use something other than curl.
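
As a rough illustration of the first option (untested; the user, folders and file names are placeholders), you would first save a name-only listing of the remote folder, then fetch each name from that list. The trailing slash on the URL tells curl to list the folder rather than fetch a file:

"\hit\pvx\curl.exe" -k --ssl -l --user "username:password" -o "c:\hit\texts\listing.txt" "ftp://ftp.domain.com.au/public_html/Future/Texts/"

and then, for each wanted name in listing.txt, something like:

"\hit\pvx\curl.exe" -k --ssl --user "username:password" -o "c:\hit\texts\file1.txt" "ftp://ftp.domain.com.au/public_html/Future/Texts/file1.txt"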

Devon Austen:
I can confirm that it is -o /path/to/local/file that you want instead of -O.

cURL will not handle the * wildcard for you. You have to get the listing from the server, do the pattern match yourself, and then pass each matching file to cURL as a separate download. This is what *web/ftp has to do to support wildcards.
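
Roughly, in PxPlus terms, that could look something like this (an untested sketch; the local folder, listing file name and the loose ".txt" check are only placeholders):

! Untested sketch: save a name-only listing, then fetch each matching file
LET MPROGRAM$="""\hit\pvx\curl.exe"""
LET MBASE$="ftp://ftp.domain.com.au/public_html/Future/Texts/"
LET MUSER$="username:password",MLIST$="c:\hit\texts\listing.txt"
! Get a name-only listing (-l) of the remote folder into MLIST$
LET SS$=MPROGRAM$+" -k --ssl -l --user """+MUSER$+""" -o """+MLIST$+""" """+MBASE$+""""
INVOKE WAIT SS$
! Read the listing and download each name that looks like a .txt file
LET CHAN=UNT
OPEN (CHAN)MLIST$
NEXT_NAME:
READ (CHAN,END=DONE)F$
IF POS(".txt"=LCS(F$))=0 THEN GOTO NEXT_NAME ! loose pattern match on the name
LET SS$=MPROGRAM$+" -k --ssl --user """+MUSER$+""" -o ""c:\hit\texts\"+F$+""" """+MBASE$+F$+""""
INVOKE WAIT SS$
GOTO NEXT_NAME
DONE:
CLOSE (CHAN)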

In a future release *web/ftp will support implicit and explicit FTP over SSL/TLS.

Mike Hatfield:
Guys

Thanks for the input.
I had worked out that I needed to download the folder listing and then parse it.
I had read about the -o but didn't realise you could specify the path.
So the bit I was missing was the local path.
Thanks.
