
Problem with Large file ZModem xfer

(@gizmola)
Active Member
Joined: 24 years ago
Posts: 6
Topic starter  

I recently uploaded some large files from my XP computer to a RedHat 9 server using rz. This was across a 100Mbit switch between the computers. The files were 400MB+ in size.

At first everything was fine and the upload proceeded rapidly, but as the file grew very large the transfer slowed to a crawl, and I noticed that CPU utilization on the Windows box began to cycle between 98-100%. This is on a 2.3GHz P4 with 1GB of RAM.

I'm wondering if there might be some memory management issue, so I thought I'd bring it up.

[ October 16, 2003, 05:17 PM: Message edited by: Brian T. Pence ]


   
(@bpence)
Member Admin
Joined: 1 year ago
Posts: 1375
 

Gizmola,

I don't think you would be the first to have this problem. Do you see an increase in overall memory size when this occurs? Are you using SSH2, SSH1, or telnet?


   
(@gizmola)
Active Member
Joined: 24 years ago
Posts: 6
Topic starter  

Brian,
I was using SSH2. I didn't really check the memory size, so I can't say. I do know that I stopped most of the other processes just in case it was a thrashing issue, but doing so had no positive effect once the condition occurred.


   
(@bpence)
Member Admin
Joined: 1 year ago
Posts: 1375
 

Does it happen every time? How large does the file have to be before you run into this problem? At what point in the transfer do you begin noticing a slowdown?

I've been doing some transfer tests, and so far I haven't been able to reproduce the problem.

What version of AbsoluteTelnet are you using?


   
(@gizmola)
Active Member
Joined: 24 years ago
Posts: 6
Topic starter  

All I can say is that it was reproducible in the sense that it happened for every file. They were all larger than 400MB. I'm not 100% sure where the problems occurred, since I was doing something on another machine while the uploads were running. I'm running version 2.13 of AT.

If you upload a file of, say, 600MB (the ones I was uploading were gzipped) and you can't get it to reoccur, I'm willing to accept it might be something to do with my particular environment. Since this is something I don't do that often (I was installing Oracle, so I was uploading the Linux Oracle installation programs), I can't say I tried too hard to figure out the specifics. I did, however, reboot after the first issue, just in case it was a problem with that machine, and even after a clean boot the problem happened again when I uploaded the next file. Based on that, I suspect that if I uploaded one of the files again I could probably make it happen, but without knowing what information would be helpful to you, I wouldn't be doing any substantially large uploads in my regular use of AT.

[ October 17, 2003, 11:51 PM: Message edited by: Gizmola ]


   
(@bpence)
Member Admin
Joined: 1 year ago
Posts: 1375
 

My test file was only 200MB in size. Much larger than I normally use, but perhaps not large enough to trigger the problem. I'll try a few larger files before I give up on this, though.


   
(@msa)
Estimable Member
Joined: 23 years ago
Posts: 111
 

Just my two cents on this:
I do a lot of transfers of Linux distribution ISO images (600+ MB), and I haven't seen any problems so far. I normally download the images to another computer (on another network); a small shell script on that computer starts "sz" in the middle of the night, and boom - the images are on my desktop when I am back in the morning. The last download was the Severn (Fedora) release from RedHat (wget to the remote host, then ZModem over SSH2 during the night).
Could it be something else that messes up the SSH2 connection, like a firewall? Also, when cancelling the ZModem transfer, was the SSH2 connection still intact (i.e. did you still have a working prompt)?
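The overnight fetch-then-send described above can be sketched as a small shell function. This is a hypothetical reconstruction, not msa's actual script; the function name, script path, and URL are placeholders.

```shell
# Hypothetical sketch of the overnight fetch-then-send described above.
# fetch_and_send and the example URL are placeholders, not from the post.
fetch_and_send() {
    url=$1
    file=$(basename "$url")            # local name derived from the URL
    wget -q "$url" -O "$file" || return 1
    sz "$file"                         # stream it down the ZModem-capable terminal
}

# Run from cron on the remote host so the transfer happens overnight, e.g.:
#   0 3 * * * /home/msa/fetch_and_send.sh http://example.org/some.iso
```

On the client side, AbsoluteTelnet (or any ZModem-capable terminal) detects the incoming sz stream and starts receiving automatically.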

Cheers,
/msa


   
(@bpence)
Member Admin
Joined: 1 year ago
Posts: 1375
 

Speed problems can always be rooted in network problems, but the fact that processor utilization is pegged at 98-100% leads me to believe it is probably more of a client issue.

Also, in the original poster's example, the issue is with a transfer of a file from the client TO the server using rz, rather than server to client using sz.

Mattias, are you able to run a test the other way, transferring one of those monster files from client to server?

Thanks,

Brian


   
(@gizmola)
Active Member
Joined: 24 years ago
Posts: 6
Topic starter  

Just to clarify, the transfers worked. The problem was that, once the program bogged down, it took an inordinate amount of time to complete them relative to what it should have taken.

For example, assuming it took 1 hour to get to 90% completion, it took another hour to finish the last 10%... something along those lines. I don't have any idea what the actual times were, but hopefully you get the idea.


   
(@msa)
Estimable Member
Joined: 23 years ago
Posts: 111
 

Sorry for the delay in answering.
Sending ISO image 2 of RedHat Linux 9.0 (646 MB) over standard 100 Mbps Ethernet to a RedHat Linux 8.0 server takes 33 min 28 sec (~2.7 Mbit/s).
AT connects to the RH 8.0 box running SSH. CPU on the sending computer (800MHz, running Windows XP) stays between 60-75%, with 2,220K memory usage. Downloading the file to the laptop takes about the same time. No problems or resends during the transfers.
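As a sanity check on those figures, the quoted rate works out if the 646 MB is read as binary MiB and the rate as megabits per second:

```shell
# 646 MiB sent in 33 min 28 s; convert to megabits per second.
size_mib=646
secs=$((33 * 60 + 28))    # 2008 seconds
awk -v mib="$size_mib" -v s="$secs" \
    'BEGIN { printf "%.1f Mbit/s\n", mib * 1048576 * 8 / 1e6 / s }'
# prints: 2.7 Mbit/s
```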


   