[CALUG] offsite for file access

Howard Bampton howard.bampton at gmail.com
Thu Aug 8 13:08:33 EDT 2019


1) Save message to a scratch file
2) egrep '(http:|https:)' /path/to/scratchfile > /path/to/newfile
3) wget `cat /path/to/newfile`

The single quote marks in step 3 are backticks (`), not normal single
quotes (') [i.e., not the quote on the same key as the double quote]
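The three steps above can be sketched end-to-end with a throwaway scratch file (the paths and URL here are hypothetical stand-ins):

```shell
# Step 1 stand-in: the message saved to a scratch file
cat > /tmp/scratchfile <<'EOF'
Hi, the download link is below:
http://example.com/path/to/file.tar.gz
Regards
EOF

# Step 2: keep only the line(s) containing a URL
egrep '(http:|https:)' /tmp/scratchfile > /tmp/newfile
cat /tmp/newfile

# Step 3 would then be:
#   wget `cat /tmp/newfile`
```

(As an aside, wget's own -i option, as in wget -i /tmp/newfile, reads URLs from a file and avoids the backticks entirely.)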

If you are sure that there is only one http/https line in the file (and
that the URL you want appears as an http/https URL), then steps 2 and 3 can
be done as:
wget `egrep '(http:|https:)' /path/to/scratchfile`

(that is a backtick before the egrep, single quotes around the http/https
regex, and a closing backtick)
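The same substitution can be written with $(...), which modern shells accept and which is easier to read than backticks; a sketch using a hypothetical scratch file:

```shell
# Hypothetical scratch file; $(...) replaces the backticks
printf 'some text\nhttps://example.com/file.iso\n' > /tmp/scratchfile
url=$(egrep '(http:|https:)' /tmp/scratchfile)
printf '%s\n' "$url"
# Then: wget "$url"
```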

This also assumes the URL fits on a single line in the scratch file. I've
also assumed that the text file is ASCII, not base64 (or similar) encoded.
Decode to ASCII before trying this.
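The decode step can be done with the base64 utility; a sketch with a hypothetical encoded body (the string below decodes to http://example.com/file.tar):

```shell
# Hypothetical base64-encoded message body
printf 'aHR0cDovL2V4YW1wbGUuY29tL2ZpbGUudGFy\n' > /tmp/encoded
# 'base64 -d' is the GNU coreutils spelling (BSD/macOS may want -D or --decode)
base64 -d /tmp/encoded > /tmp/decoded
cat /tmp/decoded
```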


Windows copy (^C) is of course typically interpreted as kill (^C)
[technically interrupt] in UNIX and UNIX-like OSes, which is why that
doesn't work. You could use stty to remap the "intr" character, which would
fix that (though it'll drive anyone expecting the usual behavior nuts). The
man page for stty should help there.
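The remap would look something like this (^G is an arbitrary choice here; guarded so it is a no-op when stdin is not a terminal):

```shell
# Only meaningful on a real terminal
if [ -t 0 ]; then
    saved=$(stty -g)     # save the current terminal settings
    stty intr ^G         # Ctrl-G now sends the interrupt; Ctrl-C passes through
    stty -a | grep intr  # confirm the change
    stty "$saved"        # restore the original settings
fi
```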

You should be able to use copy and paste with the mouse (highlight with the
left button, paste with the middle) as well. I don't have access to PuTTY
or Windows to test this (PuTTY may remap the mouse buttons needed; check
the manual and settings, as I vaguely recall having to change some defaults
when I used it to get it to play nice/normal).

If the URL contains characters the UNIX command line treats specially, you
may need to quote it (as in single quote characters: wget
'http://blah/blah/blah'). If it contains quotes (single, double, and/or
backticks), life gets unpleasant, as you'd then need to escape them with
backslashes: wget http://\'\"\...
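A sketch of the quoting problem with a hypothetical URL containing shell metacharacters:

```shell
# &, ?, and = are all special to the shell; single quotes protect them
url='http://example.com/get?file=a&rev=2'
# Unquoted, '&' would background the command and '?' could glob;
# quoting (above) or "$url" (below) passes the URL through intact
printf '%s\n' "$url"
# Then: wget "$url"
# A literal single quote inside the URL must be escaped outside quotes:
#   wget http://example.com/o\'brien.txt
```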

Hopefully one of these will work.

I used wget (personal preference). Any CLI alternative should work.
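Since any fetcher will do, a sketch that picks whichever one is installed (lynx and links, mentioned as available on the account in question, can both fetch non-interactively):

```shell
# Print the first available CLI fetcher, or "none"; any of them can
# download the URL once it has been extracted
fetcher=none
for f in wget curl lynx links; do
    if command -v "$f" >/dev/null 2>&1; then
        fetcher=$f
        break
    fi
done
echo "$fetcher"
```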

On Thu, Aug 8, 2019 at 10:57 AM Walt Smith <waltechmail at yahoo.com> wrote:

>
> hi,
>
> I have an offsite unix account, assume telnet access, using putty.
>
> I want to use it to download a file ( via http)  and store it.
> Right now, a long link to that file to be downloaded
> is sent to my offsite unix is in an email.
> The link is long, so I need copy/paste.
>
> However, my *nix know how editing text between apps
> is lost.
>
> At that site,  I have pine available, lynx, links.
> In pine , I managed to export the email text body
> containing only the link  (I think ).
>
> would:
> lynx < myfilelink
> be a way to download the link ?
>
> or
>
> wget > myfilelink ?  or a pipe ??
>
> ( to avoid actually typing the link )
>
> Is there an easier way to do this?
> I seem to be unable to take the short route and
> simply copy into a OS buffer  ( ^C ) the text
> inside the pine email body.
>
>
> suggestion ?
>
> thx,
>
> Walt . . .
>
> I want to do that by telnetting or ssh .
>
>
>
> Tools available there are pine, lynx, links
>
> ---- The government is lawless, not the press (people). ( [Supreme Court]
> Justice Douglas re: The Pentagon Papers )
>
> _______________________________________________
> CALUG mailing list
> CALUG at unknownlamer.org
> http://lists.unknownlamer.org/listinfo/calug
>