Hostwinds VPS and Dedicated Server clients who create websites sometimes wish to test out their site on a local machine. If you use a Linux system and want to test out new URLs on your site, you can use either cURL or wget. Both commands serve a similar purpose: making HTTP requests from the command line. However, both can do far more than that.
What do cURL and wget have in common?
Both cURL and wget are open-source programs that are freely available to use and modify on your server. Both support HTTP and its secure version, HTTPS, which are the base protocols for websites. They also both support FTP and SFTP, which are commonly used to upload files to servers.
cURL boasts a wide variety of supported protocols. According to curl's website, curl supports "DICT, FILE, FTP, FTPS, Gopher, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, Telnet and TFTP. curl supports SSL certificates, HTTP POST, HTTP PUT, FTP uploading, HTTP form-based upload, proxies, HTTP/2, cookies, user+password authentication (Basic, Plain, Digest, CRAM-MD5, NTLM, Negotiate and Kerberos), file transfer resume, proxy tunneling, and more."
wget currently supports only HTTP- and FTP-based protocols. However, GNU Wget2 is under active development, which may change in future versions.
Using cURL/wget to view a website
You can use curl like so:
The first thing you'll notice about curl is that it sends everything to standard output, so if you run it against your site, you'll see the raw HTML printed to your terminal:
Using wget is similar, but the result is a little different.
As you can see, wget actually downloads the file, saving it as index.html by default rather than displaying it on your screen.