FTP download failure - fails at exactly the same point in the transfer file.


Recommended Posts

Guest mark4asp
Posted

I am trying to download a backup copy of the main database from a

Windows 2003 server. This has been done every day at 1:15 for the last

11 months using a script (with the built-in command line ftp utility).

Not one of those transfers failed. [I know that because an MD5

fingerprint is taken before and after].

 

Today all 5 of my attempted transfers failed. The first 3 using the

command line utility. The last two using FileZilla. The first two

transfers both failed at exactly the same point (byte-wise), as did the

last two transfers. Transfers 1 & 2 were trying to copy version 1 of

the backup file. Transfers 3,4,5 were trying to copy version 2 of the

backup file.

 

What is causing these problems?

Guest Pegasus (MVP)
Posted

Re: FTP download failure - fails at exactly the same point in the transfer file.

 

 

"mark4asp" <mark4asp@gmail.com> wrote in message

news:01d785b7$0$21126$c3e8da3@news.astraweb.com...

> [...]
>
> What is causing these problems?

 

Hard to say. Disk full? Server reboot? File too large? What

happens when you copy the file instead of using FTP?

Guest mark4asp
Posted

Re: FTP download failure - fails at exactly the same point in the transfer file.

 

"mark4asp" <mark4asp@gmail.com> wrote in message

news:01d785b7$0$21126$c3e8da3@news.astraweb.com...

>> What is causing these problems?

 

On Mon 17 Dec 2007 18:35:02 +0100 Pegasus wrote:

> Hard to say.

> Disk full?

 

No. None of the discs on the client or server are full.

> Server reboot?

 

Why would that happen? But the server was upgraded last Friday with a

software update from Microsoft, installed by the web-host.

> File too large?

 

It is 76,987,904 bytes. Is that too large for ftp to handle?

> What happens when you copy the file instead of using FTP?

 

I can copy it from disc to disc on the server with no problems.

 

The server FTP protocol seems to be broken. Could it have been last

Friday's update that broke it? The update is always followed by a

server reboot.

 

Every day at about 1:15 I send myself a message like this:

Last ftp upload of my database was OK

Hash1: ED6DB477F67CB68A3C790274256573F2

Hash2: ED6DB477F67CB68A3C790274256573F2

 

The two hashes refer to MD5 fingerprints on the FTPed file before and

after. The last one of these I got was dated 13 Dec 2007, which was

Thursday morning. I should've got Friday morning's too if the cause of

the fault was the server reboot the host did on Friday!!! Puzzle!!!

It seems to have broken at least 9 hours before my web host updated the

server.
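[Editor's note] The before-and-after fingerprint check described in this post can be sketched in Python. The file paths here are hypothetical placeholders, not the names used by the actual script:

```python
# Sketch of a before/after MD5 fingerprint check, as described above.
# Paths are hypothetical placeholders, not the script's real names.
import hashlib

def md5_fingerprint(path, chunk_size=64 * 1024):
    """Hash a file in chunks so a 76 MB backup never sits in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest().upper()

def transfer_ok(path_before, path_after):
    """A transfer is good only if both copies have identical digests."""
    return md5_fingerprint(path_before) == md5_fingerprint(path_after)
```

A mismatch between the two digests is what distinguishes a truncated or corrupted transfer from a good one.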

Guest mark4asp
Posted

Re: FTP download failure - fails at exactly the same point in the transfer file.

 

This is not any kind of random failure as I'm always getting a certain

number of bytes transferred.

 

My web host advised me to use Active mode.

 

I changed to active mode and repeated that attempt using FileZilla and

the same backup file as used for #4 and #5. The new attempt [#6] failed

precisely at 32.1% [24,726,410 bytes].

 

32.1% is exactly where it failed for attempts #4 and #5. I didn't

record the number of bytes transferred for #4, #5.

 

After abandoning #6, when it got stuck, I found the portion downloaded

here (on disk) to be: 24,707,072 bytes. FileZilla shows that to be

24,726,410 bytes in its file listing. I will now repeat the process.

I close down FileZilla and rename the portion of the file downloaded to

xxxx.file6. As soon as I close down FileZilla I notice that the portion

now has a filesize of: 24,726,410 bytes.

 

Attempt #7 gave 24,707,072 bytes on my local disk at the point where I

had to disconnect the ftp client because it got stuck again. This

reads as 24,726,410 bytes in the FileZilla listing (again).

 

24,707,072 bytes is the number of bytes of the partial file stored on

my local disk when the transfer freezes. I can't rename the partial

file at this point because even though I have stopped the transfer

FileZilla still has a handle on it. I have to close FileZilla down.

Then I read the filesize as 24,726,410 bytes and open up FileZilla again.

 

I am now going to make an eighth attempt, but with the Microsoft

command line ftp utility. This has, at least, managed to get past 32.1%!

Last time I tried the command line ftp utility with this backup file

(#3) it failed at about 48 MB. This time it got stuck at: 39,112,162

bytes. Eventually, after getting stuck, the command line utility closes

itself down. The final portion transferred for #8 is: 39,112,162 bytes.

 

#9. I repeated the process (ftp command line utility, default mode). It

eventually managed to transfer: 43,685,376 bytes before it got stuck.

 

#10 (ftp command line utility, default mode). After getting stuck and

closing this left a portion of 43,685,376 bytes too. [same as #9]. So

#8 was the anomaly here.

 

#11. I created a 3rd version of the database backup file and tried to

use the ftp command line utility, default mode. This one stopped at:

24,726,410 bytes [same as #7, #6 and probably also #4,#5]

 

FTP is consistently getting stuck and stopping at identical places for

identical file transfers when using the same software.

 

OK. I give up. FTP does not work on Win 2003. What did Microsoft do to

break it and will they be fixing what they broke? Is this a gentle

nudge telling us all that we need to upgrade to 2008? [joke]

Guest Pegasus (MVP)
Posted

Re: FTP download failure - fails at exactly the same point in the transfer file.

 

 

"mark4asp" <mark4asp@gmail.com> wrote in message

news:00be1b68$0$25784$c3e8da3@news.astraweb.com...

> This is not any kind of random failure as I'm always getting a certain
> number of bytes transferred.
>
> [...]

 

I have not come across this issue. If I was in your position then

I would do this:

- Consider using "copy" instead of ftp, and/or

- Check Google for other posts on the subject of ftp, large files, service

packs.

Guest mark4asp
Posted

Re: FTP download failure - fails at exactly the same point in the transfer file.

 

Pegasus (MVP) wrote:

> [...]

> I have not come across this issue. If I was in your position then

> I would do this:

> - Consider using "copy" instead of ftp, and/or

> - Check Google for other posts on the subject of ftp, large files,

> service packs.

 

Yesterday I attempted and failed (11 times) to do an ftp download of

the application database backup to my PC. I tried 2 different types of

software (Filezilla and the command line utility) with 3 different

versions of the database backup. In every case the download got stuck

mid-way.

 

This morning, at home, I managed an ftp download of the database backup

with no problems.

 

Just now, back in the office, when I tried yet again it got stuck at

24,726,410 bytes again. On resuming the download it managed a further

24 MB or so, then got stuck again. Resuming a 2nd time does not work.

Even closing the ftp package down (FileZilla) and resuming a 3rd time

results in nothing being transferred.
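[Editor's note] The resume behaviour described above relies on FTP's REST command, which restarts the stream at a byte offset; Python's ftplib exposes it as the rest= parameter. A sketch with placeholder host, credentials and file names (none of them from this thread):

```python
# Sketch: resume a stalled download from the current partial-file size
# via FTP's REST offset (ftplib's rest= parameter). The host, paths and
# credentials below are placeholders, not values from the thread.
import os
from ftplib import FTP

def resume_offset(local_path):
    """Bytes already on disk; the server restarts the stream from here."""
    return os.path.getsize(local_path) if os.path.exists(local_path) else 0

def resume_download(host, user, password, remote_name, local_path):
    offset = resume_offset(local_path)
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.set_pasv(True)                  # passive mode
        with open(local_path, "ab") as fh:  # append to the partial file
            ftp.retrbinary(f"RETR {remote_name}", fh.write, rest=offset)
```

This only works when the server actually honours REST; a server (or middlebox) that stalls the data connection, as described above, will stall a resumed transfer the same way.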

 

I've also noticed that my automatic ftp download, using a back-room

server, is broken too. Every night, at about 1:15 in the morning, the

backup database file should be downloaded to a server in our back-room.

This has not been done since last Thursday morning. The last successful

notification I have of this is dated: 13 December 2007 01:20. The

attempt on 14 Dec, 1:15 resulted in only 7 MB being transferred before

the download broke. It looks like the script used to run the automatic

download has not run since then, or if it has run, it has stopped

before attempting the ftp download. Is there any way that the server

patch installed last Friday (at about 11:00) could be stopping my

script from running? I have scripts on both the remote server (which

holds the database) and my local server. The remote script backs the

database up and does an MD5 check. The local script copies the backup

and MD5 check over to the local server, validates the MD5 fingerprint

and sends me an email indicating success or failure.

 

The typical size of a database backup is now 76 MB.

 

There seems to be a problem at our office end, but the network

administrator here insists there are no such ftp problems.

 

Here is my ftp script:

 

open xxx.xxx.xxx.xxx

myUserName

myPassword

cd private

lcd C:\Data\Development\myApp\Database

ascii

get MW40.MD5

binary

get MW40.BAK

disconnect

bye

 

How can the script be improved? If I use the actual ftp location

ftp://myapp.com/private rather than the IP would that be better?

 

Should I consider using a script written in WSH instead?

 

The remote server has no problem creating the daily backup.
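[Editor's note] The ftp.exe script above maps almost line for line onto Python's ftplib; here is a sketch of the same sequence (the masked IP and credential placeholders are kept exactly as in the script):

```python
# Sketch: the ftp.exe script above rewritten with Python's ftplib.
# The masked IP and credentials are placeholders, as in the script.
import os
from ftplib import FTP

def fetch_backup(host="xxx.xxx.xxx.xxx", user="myUserName",
                 password="myPassword", local_dir="."):
    with FTP(host) as ftp:          # "open xxx.xxx.xxx.xxx"
        ftp.login(user, password)   # user name + password lines
        ftp.cwd("private")          # "cd private"
        # "ascii" then "get MW40.MD5": text mode for the fingerprint file
        with open(os.path.join(local_dir, "MW40.MD5"), "w") as fh:
            ftp.retrlines("RETR MW40.MD5", lambda ln: fh.write(ln + "\n"))
        # "binary" then "get MW40.BAK": image mode for the backup itself
        with open(os.path.join(local_dir, "MW40.BAK"), "wb") as fh:
            ftp.retrbinary("RETR MW40.BAK", fh.write)
        # "disconnect"/"bye" happen when the with-block closes the session
```

On the hostname question: using ftp://myapp.com/private instead of the IP changes nothing about the transfer itself; it only adds a DNS lookup before connecting.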

Guest Pegasus (MVP)
Posted

Re: FTP download failure - fails at exactly the same point in the transfer file.

 

 

"mark4asp" <mark4asp@gmail.com> wrote in message

news:00b9f6bd$0$32763$c3e8da3@news.astraweb.com...

> [...]

> open xxx.xxx.xxx.xxx

> myUserName

> myPassword

> cd private

> lcd C:\Data\Development\myApp\Database

> ascii

> get MW40.MD5

> binary

> get MW40.BAK

> disconnect

> bye

> [...]

 

I used the native WinXP version of ftp.exe to download a 50 MByte

binary file from an external FTP server, using the commands you

listed above, and had no problem at all.

 

If this was my own problem then I would narrow the problem down

by a process of elimination:

- Download the same file while working on a different machine;

- Download a file from a different (e.g. external) ftp server to your

current machine;

- Repeat both exercises at a different site, e.g. from your home.

 

The results should be revealing.

Guest mark4asp
Posted

Re: FTP download failure - fails at exactly the same point in the transfer file.

 

Pegasus (MVP) wrote:

> [...]

> I used the native WinXP version of ftp.exe to download a 50 MByte

> binary file from an external FTP server, using the commands you

> listed above, and had no problem at all.

>

> If this was my own problem then I would narrow the problem down

> by a process of elimination:

> - Download the same file while working on a different machine;

> - Download a file from a different (e.g. external) ftp server to your

> current machine;

> - Repeat both exercises at a different site, e.g. from your home.

>

> The results should be revealing.

 

Thanks for your help. It seems that there may be a problem or

configuration change in the internet connection at this office which is

problems.

 

I can't get ftp.exe to work properly. I was able to get FileZilla to

work by setting the transfer mode to passive and enabling keep-alive

messages. ftp.exe doesn't have such a feature (keep-alives). I will

investigate writing a WSH program which calls a suitable .NET object.

If nothing exists in .NET we may buy something in. I've given up on

ftp.exe.

 

I even extended the server ftp timeout to 240 seconds and it had no

effect on ftp.exe.
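[Editor's note] The FileZilla configuration that worked (passive mode plus keep-alives) can be approximated with Python's ftplib, which ftp.exe cannot do. ftplib has no application-level keep-alive loop either, but TCP-level keep-alive can be enabled on the control socket; host and credentials below are placeholders:

```python
# Sketch: passive mode plus TCP keep-alive with ftplib, approximating
# the FileZilla settings that worked. Host/credentials are placeholders.
import socket
from ftplib import FTP

def open_with_keepalive(host, user, password, timeout=240):
    ftp = FTP(host, timeout=timeout)
    ftp.login(user, password)
    ftp.set_pasv(True)  # passive mode, as in the working FileZilla setup
    # TCP keep-alive on the control connection, so an idle NAT/firewall
    # table entry is less likely to be dropped during a long transfer.
    ftp.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    return ftp
```

The symptoms in this thread (transfers stalling at a fixed offset from one site but not another) fit a firewall or NAT device dropping the data connection, which is why keep-alives and passive mode help.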

