Modifying robots.txt on Google Compute Engine

By default, Google Compute Engine instances ship with a robots.txt file that disallows crawling. This means your site will not be indexed by, or show up on, Google and other search engines.

To fix this, connect to your instance over SSH and open the file in a text editor (the path below assumes Apache's default document root):

sudo nano /var/www/html/robots.txt
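For reference, the default file typically looks like the first block below, and after the edit it should look like the second. The exact contents may differ on your image; lines starting with # are comments:

```
# Before (blocks all crawlers):
User-agent: *
Disallow: /

# After (allows all crawlers):
User-agent: *
Allow: /
```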

Change the "Disallow: /" line to "Allow: /", then press Ctrl + X, confirm with Y, and press Enter to save the file.
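If you prefer not to edit the file interactively, the same change can be made with a sed one-liner. The sketch below demonstrates it on a temporary copy; on the instance you would target /var/www/html/robots.txt (with sudo) instead:

```shell
# Create a sample robots.txt matching the default, for demonstration only.
printf 'User-agent: *\nDisallow: /\n' > /tmp/robots.txt

# Replace the blanket "Disallow: /" rule with "Allow: /" in place.
sed -i 's|^Disallow: /$|Allow: /|' /tmp/robots.txt

# Show the result.
cat /tmp/robots.txt
```

It is a good idea to back up the original file (e.g. copy it to robots.txt.bak) before running sed with -i on the real path.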