<div dir="ltr">1) Save message to a scratch file<div>2) egrep '(http:|https:)' /path/to/scratchfile > /path/to/newfile</div><div>3) wget `cat /path/to/newfile`</div><div><br></div><div>The single quote marks in step 3 are back tics (`), not normal single quotes (') [aka not the quote on the same key as the double quote key]</div><div><br></div><div>If you are sure that there is only one http/https line in the file (and that the URL you want is shown as an http/https), then steps 2 and 3 can be done as:</div><div>wget `egrep '(http:|https:)' /path/to/scratchfile`</div><div><br></div><div>(that is a back tic before the egrep, a single quote around the http/https re-ex, and a closing back tic)</div><div><br></div><div>This also assumes the URL fits on a single line in the scratch file. I've also assumed that the text file is ASCII, not base64 (or similar) encoded. Decode to ASCII before trying this.</div><div><br></div><div><br></div><div>Windows copy (^C) is of course typically interpreted as kill (^C) [technically interrupt] in UNIX/UNIX-like OS's which is why that doesn't work. You could use stty to remap the "intr" character that would fix that (it'll drive anyone expecting this functionality nuts however). The man page for stty should help there.</div><div><br></div><div>You should be able to use cut and paste with the mouse (highlight with left mouse, paste with middle) as well. I don't have access to putty or Windows to test this (putty may remap the mouse buttons needed- check the manual and settings as I vaguely recall having to change some defaults when I used it to get it to play nice/normal).</div><div><br></div><div>If the URL has UNIX command line unfriendly things in it, you may need to quote (as in single quote characters: wget '<a href="http://blah/blah/blah">http://blah/blah/blah</a>') the URL. If it has quotes (single, double, and/or backtics), life gets unpleasant as you'd then need to escape them with backslashes: wget http://\'\"\...</div><div><br></div><div>Hopefully one of these will work.</div><div><br></div><div>I used wget (personal preference). Any CLI alternative should work.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 8, 2019 at 10:57 AM Walt Smith <<a href="mailto:waltechmail@yahoo.com">waltechmail@yahoo.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
On Thu, Aug 8, 2019 at 10:57 AM Walt Smith <waltechmail@yahoo.com> wrote:

hi,

I have an offsite unix account, assume telnet access, using putty.

I want to use it to download a file (via http) and store it.
Right now, a long link to that file to be downloaded
is sent to my offsite unix account in an email.
The link is long, so I need copy/paste.

However, my *nix know-how for copying text between apps
is lost.

At that site, I have pine available, plus lynx and links.
In pine, I managed to export the email text body
containing only the link (I think).

would:
lynx < myfilelink
be a way to download the link?

or

wget > myfilelink ?  or a pipe?

(to avoid actually typing the link)

Is there an easier way to do this?
I seem to be unable to take the short route and
simply copy the text inside the pine email body
into an OS buffer (^C).

suggestions?

thx,

Walt . . .

I want to do that by telnetting or ssh.

Tools available there are pine, lynx, links.

---- The government is lawless, not the press (people). ( [Supreme Court] Justice Douglas re: The Pentagon Papers )

_______________________________________________
CALUG mailing list
CALUG@unknownlamer.org
http://lists.unknownlamer.org/listinfo/calug