
Unsuccessful transfer of very large file (2 posts)
- Started 13 years ago by Lissa
- Latest reply 13 years ago from Scott McGuire
Scott McGuire Administrator
Hi,
So, both times, the download stopped after exactly an hour, is that correct?
The fact that it happened twice, after exactly the same period of time, and that the time is a round number like one hour, suggests that either the server or your Internet provider is automatically disconnecting downloads that run for longer than an hour. I'd begin by contacting the people who run the server to ask whether they have such a limit and, if so, whether there's a way they can make an exception for you. If they say they don't have a limit, I'd check with your Internet provider next. (It's also possible, although unlikely, that your network hardware - your router or cable/DSL modem - causes disconnects after an hour. But the server or your ISP are much more likely candidates.)
(The length of the iCal event wouldn't affect your transfer - all iCal does is tell Fetch to start the transfer at the time you specify; after that, iCal doesn't communicate with Fetch again.)
Fetch does have a Resume Download command, which lets you pick up downloading a large file where the previous attempt left off, so if you have no other options, you could use Resume Download repeatedly over several nights to complete the download. While Resume Download isn't directly AppleScriptable, there is a way to schedule a specific file to resume downloading with iCal. If nothing else works out, let us know and we can help you set this up.
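(For reference, here's a rough sketch of the same resume idea using Python's standard ftplib rather than Fetch - an illustration only, not something Fetch runs for you. The host, user name, folder, and file name are taken from your transcript; the password is a placeholder you'd fill in.)

import os
from ftplib import FTP

HOST = "72.167.53.211"
USER = "tc5w0r1d"
PASSWORD = "********"          # placeholder - use your real password
REMOTE = "_restore7-1-09.gz"   # file name on the server
LOCAL = "_restore7-1-09.gz"    # local copy on your Mac

ftp = FTP(HOST)
ftp.login(USER, PASSWORD)
ftp.cwd("private")             # same folder change as in your transcript

# Resume from the end of whatever was downloaded on the previous attempt.
offset = os.path.getsize(LOCAL) if os.path.exists(LOCAL) else 0

f = open(LOCAL, "ab")          # append to the partial file
# rest=offset sends an FTP REST command so the server skips the bytes
# already on disk - the same idea as Fetch's Resume Download.
ftp.retrbinary("RETR " + REMOTE, f.write, rest=offset)
f.close()
ftp.quit()

Each run picks up where the last one stopped, so you could run it once a night inside your after-midnight window until the file is complete; if the connection is cut after an hour again, just run it the next night.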
Please let us know if you have further questions.
Thanks,
Scott McGuire
Fetch Softworks
Lissa Member
I've tried several ways of getting a 30+GB file from my hosting company's server. I thought Fetch was going to save me, as I set up a test with a small file, following instructions here for having the transfer start via Automator and iCal (the transfer has to start after midnight and complete before noon to avoid bandwidth throttling). The small file worked.

But the big file just stops at about 3GB at 2 AM (I had it start at 1). I thought the first time that it was iCal shutting the program down because the "event" ran over an hour, so I made a 6-hour event, and it still shut down after an hour.

Here's the Fetch Transcript. I'm on OS X 10.5.7, and this is a trial version of Fetch (5.5.1). Any help will be mightily appreciated!
Connecting to 72.167.53.211 port 21 (Mac OS X firewall is limiting connections to specific applications) (7/18/09 1:00 AM)
Connected to 72.167.53.211 port 21 (7/18/09 1:00 AM)
220 ProFTPD 1.3.1 Server (ProFTPD) [72.167.53.211]
USER tc5w0r1d
331 Password required for tc5w0r1d
PASS
230 User tc5w0r1d logged in
SYST
215 UNIX Type: L8
PWD
257 "/" is the current directory
MACB ENABLE
500 MACB not understood
CWD private/
250 CWD command successful
PWD
257 "/private" is the current directory
PWD
257 "/private" is the current directory
TYPE A
200 Type set to A
PASV
227 Entering Passive Mode (72,167,53,211,173,86).
Making data connection to 72.167.53.211 port 44374
LIST -al
150 Opening ASCII mode data connection for file list
drwx------ 3 tc5w0r1d root 4096 Jul 10 04:02 .
drwxr-xr-x 15 root root 4096 Jul 10 03:54 ..
-rw------- 1 tc5w0r1d psaserv 165 Apr 1 02:56 README.txt
drwxr-xr-x 14 tc5w0r1d psacln 4096 Jul 7 22:24 _restore7-1-09
-rw-r--r-- 1 root root 33163315233 Jul 10 05:23 _restore7-1-09.gz
-rw-r--r-- 1 root root 29404282880 Jul 9 22:27 httpdocs_backup_7-9-2009.tar.gz
-rw-r--r-- 1 tc5w0r1d psacln 17 Jul 6 11:42 myuntitled.txt
226-Transfer complete
226 Quotas off
TYPE I
200 Type set to I
SIZE _restore7-1-09.gz
213 33163315233
MDTM _restore7-1-09.gz
213 20090710122316
PASV
227 Entering Passive Mode (72,167,53,211,144,75).
Making data connection to 72.167.53.211 port 36939
RETR _restore7-1-09.gz
150 Opening BINARY mode data connection for _restore7-1-09.gz (33163315233 bytes)
Unsuccessful transfer of _restore7-1-09.gz as binary data (3,251,811,248/33,163,315,233 bytes, 902,528 bytes/sec, 1:00:03 elapsed) stopped at 7/18/09 2:00 AM (error: 2,-30014)
ftp_retrieve: 2,-30014 (state == RGET_RETRIEVING)
Posted 13 years ago #