Drupal still works without a .htaccess file

I'm having a weird problem with my site.

Recently I found that a domain (not mine) was pointing to my IP. My IP hosts a Drupal 7 system (LAMP) on Ubuntu. My site (still in development and only reachable via my IP) had already been indexed by Google under that other domain name, so I decided to block that site with a simple ".htaccess" trick...

I couldn't block the "intruder" site with this or that snippet... Frustrated after many tries, I wrote some garbage into the file to see if anything would happen... but everything was still working. Then I erased the .htaccess file in the Drupal root... and everything is still working.

I've been developing for almost 10 years and managing small VPSs for a few years, and normally when you make a tiny mistake in .htaccess, nothing works! And that's ok! So, as you can imagine, I'm lost.

Here is the result when I search for all .htaccess files:

(screenshot: results of the .htaccess search)

My Drupal is installed in /var/www/html

Using the Devel module, this is part of my phpinfo (as you can see, .htaccess is not in the Drupal root):

(screenshot: phpinfo output)

I'm probably making a very stupid mistake, but I've been at this for about 2 hours and right now I don't have any more places to look.

I restarted Apache, assuming some kind of unknown cache of the .htaccess file. I also restarted the VPS (DigitalOcean)... No .htaccess file, but my Drupal installation is still working...

I'm missing something terribly obvious... please help!

UPDATE: Every page is still working... nice URLs too.

UPDATE 2: I'm attaching what the 'sites-enabled' folder from Apache looks like, with its single file and its contents.

sites-enabled folder and 000-default.conf contents

Most likely you have some rewrite rules in your VirtualHost configuration and/or Apache configuration files. Check all files in /etc/apache2/sites-enabled for any RewriteRule directives.
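For example, Drupal's clean-URL rewrite rules (which normally live in .htaccess) could have been copied into a vhost file. This is a sketch modeled on Drupal 7's stock rules; the paths are assumptions:

```apache
<VirtualHost *:80>
    DocumentRoot /var/www/html
    <Directory /var/www/html>
        # Same effect as Drupal 7's stock .htaccess clean-URL rules:
        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^ index.php [L]
    </Directory>
</VirtualHost>
```

If a block like this exists in the server config, Drupal keeps working with no .htaccess at all, since Apache never needs to read one.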

Even if the .htaccess file is not present, the home page will load. If you want to stop Google from indexing the site, disallow it in the robots.txt file. You can also add .htaccess password protection. Also, try pinging the URL to see whether the dev or production server is responding, if the URLs are the same.
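A minimal robots.txt for this, placed in the web root, would look like:

```
# Ask all crawlers to stay out of the whole site
User-agent: *
Disallow: /
```

Keep in mind robots.txt is only a request to well-behaved crawlers; denying access at the Apache level, as another answer suggests, is a stronger guarantee.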

Well... Finally I discovered my own error.

I had put my rewrite configuration inside /etc/apache2/apache2.conf. I probably had some trouble setting up nice URLs at the beginning and then forgot to configure it properly :/

I know Drupal can't have nice URLs without these rewrite rules, so they had to be somewhere. I searched for the string "RewriteEngine on" and "apache2.conf" came up as a result (see the image).
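That search can be done with a recursive grep over the Apache config tree. A minimal sketch, using a throwaway directory in place of the real /etc/apache2 (the file contents here are hypothetical):

```shell
# Build a stand-in config tree so the example is self-contained.
mkdir -p /tmp/apache2-demo
cat > /tmp/apache2-demo/apache2.conf <<'EOF'
<Directory /var/www/html>
    RewriteEngine on
</Directory>
EOF

# On a real server you would point this at /etc/apache2/ instead.
# -R recurse, -n show line numbers, -i ignore case.
grep -Rni "RewriteEngine on" /tmp/apache2-demo/
```

Any file that matches is a place where rewriting was enabled outside of .htaccess.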

Now it's solved. My apologies.

(screenshot: search results showing "RewriteEngine on" inside apache2.conf)

You could put

Deny from all
Allow from your.ip.address 

in the .htaccess file. That will prevent everyone except you from accessing the site.
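Note that those directives use Apache 2.2 syntax. If your Ubuntu release ships Apache 2.4, the equivalent (assuming mod_authz_core is loaded, as it is by default) would be:

```apache
# Apache 2.4 replacement for "Deny from all / Allow from your.ip.address"
Require ip your.ip.address
```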

Ref: https://httpd.apache.org/docs/2.2/howto/access.html

There is also a header which you can add like so:

Header set X-Robots-Tag noindex,noarchive

This requests that search engines not index or archive the content. If access is denied, it won't be necessary. Be sure to remove it when the site goes live.

Ref: https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?hl=en