Web on console

Most people think of graphical interfaces when they think of surfing the Web, and there are indeed great programs like Firefox and Chrome under X11. But the console is not a wasteland: it also has plenty of utilities for browsing, downloading, and uploading content.

Let's say you want to surf the Web. The first utility to try is one of the oldest commands, Lynx. I started using Lynx when my machine could not handle X11. It is a simple command: you give it a file name or a URL you want to visit. So, say you want to hit Google:

lynx http://www.google.com

Lynx then asks whether you want to accept the cookies Google is trying to set. Once you accept or reject them, Lynx loads the page. You will notice there are no images, but all the links and the text box for entering a search query are there, and you can navigate from link to link using the arrow keys. Since the content is rendered as text, items end up at different locations than in a graphical browser. You can also pass more than one URL to Lynx. Use the option -accept_all_cookies to avoid being prompted about cookies every time a URL is loaded.
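A couple of illustrative invocations (the URL is just a placeholder; -dump is handy for scripting because it writes the rendered page as plain text to standard output instead of starting an interactive session):

```shell
# Pre-accept cookies so Lynx never prompts for them
lynx -accept_all_cookies http://www.example.com

# Non-interactive use: render the page as plain text on stdout
lynx -dump http://www.example.com | less
```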

Lynx does have a few issues: it has a hard time interpreting some HTML, and it does not handle frames. You can send it through a proxy by setting the http_proxy environment variable (related console browsers in the links family take an -http-proxy host:port option).

As a text replacement for Lynx you have ELinks, which adds colors, tables, background downloading, and tabbed browsing. Useful options are -anonymous 1 and -lookup; with -lookup, ELinks prints the resolved IP addresses for a given domain name.
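For example (the domain is hypothetical; -lookup prints the addresses and exits, while -anonymous 1 runs a restricted session that disables features like saving to local files, which is useful on shared accounts):

```shell
# Resolve a host name and print its IP addresses, then exit
elinks -lookup example.com

# Run a restricted, "anonymous" browsing session
elinks -anonymous 1 http://www.example.com
```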

Now that you can view Web content from the command line, how can you interact with the Web? Say you want an offline copy to read at your leisure on a terrace. You can use curl for that. It speaks HTTP (including POST), FTP, and SSL connections; for a POST you just specify name=value pairs. It can also fetch several URLs at once:
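For instance, submitting a form as an HTTP POST looks like this (the URL and field name are made up; -d supplies one name=value pair and can be repeated for more fields):

```shell
# POST a single form field to a (hypothetical) search endpoint
curl -d "q=console+browsers" http://www.example.com/search
```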

curl "http://your_site.{a,b,c}.com"

which hits all three sites.

curl "http://your_site.com/doc[1-5].odt"

which downloads all five files.
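Because curl's URL globbing operates on the URL string itself, you can try it locally with file:// URLs before pointing it at a real site. This sketch (paths are just for the demo) fabricates three files, fetches them with a [1-3] range, and uses -o with #1 so each download is named after the matched number:

```shell
# Create stand-ins for remote documents
mkdir -p /tmp/curl_demo
for i in 1 2 3; do echo "doc $i" > /tmp/curl_demo/doc$i.odt; done

# Quote the URL so the shell does not touch the brackets; curl expands the range
curl -s "file:///tmp/curl_demo/doc[1-3].odt" -o "/tmp/curl_demo/out#1.odt"

cat /tmp/curl_demo/out2.odt   # the copy fetched from doc2.odt
```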

What if you want to copy an entire site for offline viewing? wget is the tool you need.

wget -p -k -r http://www.your_site.com

-r recurses through the site, -k rewrites links in the downloaded files so they point at the local copies, and -p pulls in the extra content a page needs, e.g. images. And if you want to upload content to a Web server, there is wput.
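Two sketches putting this together (site, server, and credentials are all placeholders; wput takes the local files first and the FTP URL last):

```shell
# Mirror a site for offline reading: recurse, fix up links, grab page requisites
wget -p -k -r http://www.example.com/

# Upload a file to an FTP server with wput
wput index.html ftp://user:password@ftp.example.com/public_html/
```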

So now you can interact with the Web without ever starting a graphical interface. Yet another reason to stay on the command line :)
