I’ve jumped on the bandwagon with Digital Ocean’s cheap SSD VPS hosting ($5/mo, say what!) and have been delighted to land in a fresh Ubuntu shell with no issues. If you’re like me and haven’t been setting up servers for years, you might find some of these tips helpful. I may add more later, so let me know what general (not too fancy!) topics would help you in the future.
Let’s start with everyone’s favorite: users and permissions. Working as root all the time is risky, and I always find it somewhat annoying when a freshly added user can’t do simple things like write to a file or add a directory. The fix is a regular user with sudo privileges.
adduser myuser
This will create a /home/myuser directory. Granting sudo privileges to that user is as easy as
usermod -aG sudo myuser
The -a flag appends the sudo group to the user’s existing groups rather than replacing them.
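A quick sanity check: switch to the new user and try a sudo command (myuser here is just the example name from above):

groups myuser # should now include sudo
su - myuser
sudo whoami # asks for myuser’s password, then prints root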
Password Protect A Public Directory
Recently, I needed to create a staging site for a client but I wanted to keep the directory password protected and prevent it from being crawled by search engines.
You can password protect a public directory using the htpasswd command and .htaccess file.
You need to make sure a few things are in place.
First, make sure that your .htaccess files are actually read by Apache. That’s configured in your site’s virtual host file (on recent Ubuntu releases, Apache uses per-site .conf files instead of a single httpd.conf).
Navigate to /etc/apache2/sites-available.
cp 000-default.conf seojeek.conf # copy the default settings
vi seojeek.conf
ServerName seojeek.com
ServerAlias www.seojeek.com
ServerAdmin me@seojeek.com
DocumentRoot /var/www/seojeek

<Directory "/var/www/seojeek">
    Options Indexes FollowSymLinks MultiViews
    AllowOverride All
    Order allow,deny
    allow from all
</Directory>
The AllowOverride All line was added so that Apache reads and obeys nested .htaccess files. Once I added .htaccess to the staging directory, I wanted to make sure the directory was hidden from search engines (robots.txt) and password protected (.htpasswd).
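One thing the default setup won’t do for you: on Ubuntu, a new vhost has to be enabled before Apache will serve it. Assuming the seojeek.conf file name from above:

a2ensite seojeek.conf
apachectl configtest # check for syntax errors first
service apache2 reload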
Robots.txt
We’ll start with the preventive measure of making sure search engines aren’t crawling the staging site. Note that robots.txt is only a polite request; well-behaved crawlers honor it, which is why we’re adding a password as well.
At the site root, touch a robots.txt file and add the following:
User-agent: *
Disallow: /client_site_directory/
Somewhere on the server, we’ll create a .htpasswd file that stores our usernames and encrypted passwords. We generate it with the htpasswd command, which ships with Apache’s utilities package (you may need to install it first).
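On Ubuntu, that’s the apache2-utils package:

sudo apt-get install apache2-utils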
htpasswd -c .htpasswd username # -c creates the file; drop it when adding more users
Now, to password protect the directory, you’ll modify .htaccess:
AuthType Basic
AuthName "restricted area"
AuthUserFile /var/www/seojeek/client_site_directory/.htpasswd
Require valid-user
Be sure to restart Apache.
apachectl restart
Now, when you navigate to yoursite.com/client_site_directory, you’ll receive a login prompt!
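You can verify from the terminal as well; curl’s -u flag sends Basic auth credentials (username here being whatever you added to .htpasswd):

curl -u username http://yoursite.com/client_site_directory/

curl will prompt for the password and, with valid credentials, return the page instead of a 401.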