10-06-2017, 03:25 PM | #151 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
No -- in a bash .sh script the ; at the end of a statement is not needed.
var=something is a declaration (like JavaScript); $var afterwards reads the declared variable, e.g. echo $var. The caps are just what I used -- they could be lowercase too -- but bash is case sensitive. In a terminal, $ dothis; dothat; runs the statements one after another on a single line, && runs the next command only if the previous one succeeded, and | (pipe) sends the output on to the next statement.
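For what it's worth, a tiny sketch of that syntax (dothis, dothat and dosomethinggood are placeholder commands, not real programs):
Code:
#!/bin/bash
var=something              # declaration: no $ and no spaces around =
echo $var                  # $ only when reading the variable
dothis; dothat             # ; separates statements on one line
dothis && dosomethinggood  # && runs the second command only if the first succeeds
dothis | grep pattern      # | pipes the output into the next command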
10-09-2017, 01:19 PM | #152 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So,
I did that... Code:
#!/bin/bash
# Shell script to backup MySQL database
# To backup MySQL databases file to /backup dir and later pick up by your
# script. You can skip few databases from backup too.
# For more info please see (Installation info):
# http://www.cyberciti.biz/nixcraft/vivek/blogger/2005/01/mysql-backup-script.html
# Last updated: Aug - 2005
# --------------------------------------------------------------------
# This is a free shell script under GNU GPL version 2.0 or above
# Copyright (C) 2004, 2005 nixCraft project
# -------------------------------------------------------------------------
# This script is part of nixCraft shell script collection (NSSC)
# Visit http://bash.cyberciti.biz/ for more information.
# -------------------------------------------------------------------------
STARTTIME=date +%s
MyUSER=root        # USERNAME
MyPASS=Alfarenna79 # PASSWORD
MyHOST=localhost   # Hostname

# Linux bin paths, change this if it can't be autodetected via which command
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
CHOWN="$(which chown)"
CHMOD="$(which chmod)"
GZIP="$(which gzip)"

# Backup Dest directory, change this if you have some other location
DEST="/var/backup"

# Main directory where backup will be stored
MBD="$DEST/mysql"

# delete old backups
rm $MBD/*

# Get hostname
HOST="$(hostname)"

# Get data in dd-mm-yyyy format
NOW="$(date +"%d-%m-%Y")"

# File to store current backup file
FILE=""
# Store list of databases
DBS=""

# DO NOT BACKUP these databases
IGGY="information_schema cond_instances mysql performance_schema phpmyadmin"

[ ! -d $MBD ] && mkdir -p $MBD || :

# Only root can access it!
$CHOWN 0.0 -R $DEST
$CHMOD 0600 $DEST

# Get all database list first
DBS="$($MYSQL -u $MyUSER -h $MyHOST -p$MyPASS -Bse 'show databases')"

for db in $DBS
do
    skipdb=-1
    if [ "$IGGY" != "" ]; then
        for i in $IGGY
        do
            [ "$db" == "$i" ] && skipdb=1 || :
        done
    fi
    if [ "$skipdb" == "-1" ] ; then
        #FILE="$MBD/$db.$HOST.$NOW.gz"   # no gzip, I compress the whole folder afterwards
        FILE="$MBD/$db.$HOST.$NOW.sql"
        # do all in one job in pipe,
        # connect to mysql using mysqldump for select mysql database
        # and pipe it out to gz file in backup dir :)
        #$MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db | $GZIP -9 > $FILE   # no gzip, I compress the whole folder afterwards
        $MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db > $FILE
    fi
done

# compress everything
zip -r $DEST/mysql-backup-$HOST.zip $MBD/
#tar -zcvf $DEST/mysql-backup-$HOST.tar.gz $MBD

ENDTIME=date +%s
TOTTIME=$ENDTIME-$STARTTIME
echo Elapsed_time: $TOTTIME
Code:
/var/backup/mysql_backup: line 15: +%s: command not found |
10-10-2017, 05:03 AM | #153 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
STARTTIME=`date +%s`
Try it like this and the time will be in epoch seconds.
ENDTIME=`date +%s`
10-10-2017, 02:52 PM | #154 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
It works, but the result is kinda odd...
Code:
Elapsed_time: 1507671766-1507671705

I tried putting quotes, parentheses, etc. etc., but it just won't do the subtraction... can we get this last little thing working too? It's taking more time than configuring the whole server did...
10-10-2017, 10:15 PM | #155 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Maybe in $()
TOTTIME=$($ENDTIME-$STARTTIME)

Do the math -- the difference is in seconds:

barry@paragon-DS-7:~$ bc <<< 1507671766-1507671705
61

61 seconds
10-11-2017, 02:30 PM | #156 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Says:
Code:
/var/backup/mysql_backup: line 93: 1507756208-1507756139: command not found

This, instead, works:
Code:
TOTTIME=`expr $ENDTIME - $STARTTIME`

Now I'm worried about those odd quotes... In PHP, when I find those quotes it usually means there was a copy-paste error from the HTML and nothing works anymore, so I'm in the habit of removing them as soon as I see them and replacing them with a normal apostrophe... in sh, instead, they seem to be fundamental... I've surely removed some of them thinking they were an error... I shouldn't have done any damage, because everything seems to work, but maybe I'll go look for the original script and check whether there were any...

P.S. It's strange how we can install an entire server, and then the simplest things drive us crazy...
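For reference, a minimal sketch of the same elapsed-time measurement using bash's built-in arithmetic expansion, which needs neither expr nor bc (this assumes bash, not plain sh):
Code:
STARTTIME=$(date +%s)               # epoch seconds at start
# ... the backup work goes here ...
ENDTIME=$(date +%s)                 # epoch seconds at end
TOTTIME=$(( ENDTIME - STARTTIME ))  # integer math happens inside $(( ))
echo "Elapsed_time: ${TOTTIME}s"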
10-11-2017, 03:03 PM | #157 | ||
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Those are called backticks.

bc is a terminal calculator program:
apt install bc
man bc

Backticks do command substitution -- like our @array = (<FILENAME>); in Perl.
||
10-12-2017, 02:34 PM | #158 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
There's one last thing that scares me a lot: Load.

Much of that red is due to the phase of moving sites and all the import errors of those damned databases. The other server also had a lot of red at the beginning, and then it slowly normalized. This one is taking a little longer...

But what sounds strange to me is that, looking at the server detail, you can't see why there is all that red. The CPU rarely reaches 90%, the memory is a bit chubby but it works, there's plenty of disk, and there are no errors or special problems... The sites run well, fast, without interruptions or visible slowdowns...

The CPU sometimes says "stolen" even when it is working at maybe 70%, and that alone is odd. But it is the load that worries me most: sometimes it's 4-5, and I even saw 7 on cronjob days (they are still synchronizing a lot of data because of the missing cronjobs on the other servers).

What does load actually indicate? And how much do I have to worry? On a scale that goes from "relax, everything's fine" to "shit, the server is about to explode, everybody run before it's too late, shit we're all going to die", where am I?
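For context, a quick way to read that load number from the shell (generic commands; comparing load against core count is only a rough rule of thumb):
Code:
uptime   # prints the 1-, 5- and 15-minute load averages
nproc    # number of CPU cores; sustained load well above this means work is queuing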
10-12-2017, 06:58 PM | #159 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Your problem is your PHP script and the MySQL daemon (server) -- the software for your application; or,

If you look at the times of peak usage and grep those times in the server access logs, you may find that Bing is indexing too many pages too fast -- you can place a directive in robots.txt:

User-agent: bingbot
Crawl-delay: 5

(5 or 10 seconds) -- see
https://www.siteground.com/kb/how_to..._eng ine_bot/
https://www.bing.com/webmaster/help/...ntrol-55a30302

Slow Bing down -- don't Disallow Bing; they bring good converting traffic, and they sell their PSaaS or indexed database to Yahoo and other search engines.

You may find Baidu is indexing too many pages too fast -- block them at your firewall, I have had luck that way. Porn is illegal in China and you won't sell to legit Chinese buyers either.

# Free IP2Location Firewall List by Search Engine
# Source: Whitelist Robots by Search Engine | IP2Location

Code:
whois -h v4.whois.cymru.com " -c -p 183.131.32.0/20"
AS      | IP              | BGP Prefix          | CC | AS Name
4134    | 183.131.32.0    | 183.128.0.0/11      | CN | CHINANET-BACKBONE No.31,Jin-rong Street, CN

" -c -p 12.1.72.32/27"
7018    | 12.1.72.32      | 12.0.0.0/9          | US | ATT-INTERNET4 - AT&T Services, Inc., US

" -c -p 104.193.88.0/22"
55967   | 104.193.88.0    | 104.193.88.0/24     | US | CNNIC-BAIDU-AP Beijing Baidu Netcom Science and Technology Co., Ltd., CN

https://github.com/arineng/nicinfo will give you full RDAP/whois information.

A third way is just: $ whois <ip address>

If you are generating many dynamic pages, search engines may be causing this problem. Scrapers and *bad bots* may be the issue too. This is what server logs are for: to search for problems and find patterns.

A firewall is the way to go -- just do not answer -- drop the packet.
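A minimal sketch of the log check described above -- counting requests per user-agent during one peak minute (the timestamp and log path are placeholders, adjust them to your own logs):
Code:
grep '12/Oct/2017:20:00' /var/log/apache2/access.log \
  | awk -F'"' '{print $6}' | sort | uniq -c | sort -nr | head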
10-16-2017, 03:40 PM | #160 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
But Holy cow
I was away 2 days and the server was invaded by bots, just like you said... Code:
51.255.65.66 - - [16/Oct/2017:22:25:31 +0000] "GET /27 HTTP/1.1" 302 3634 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.2; +http://ahrefs.com/robot/)" 157.55.39.234 - - [16/Oct/2017:22:25:09 +0000] "GET /cimla+sexy+photos.com/ HTTP/1.1" 200 32929 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 93.105.187.11 - - [16/Oct/2017:22:25:05 +0000] "GET /search.php?q=shemale+mia+isabella+teacher+her+student+a+lesson+free+porn&sort=date&page=5 HTTP/1.1" 200 10438 "http://www.bigbigbigboobs.com/search.php?q=shemale+mia+isabella+teacher+her+student+a+lesson+free+porn&sort=date" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 216.244.66.245 - - [16/Oct/2017:22:25:30 +0000] "GET /search-amy+anderssen+photos+pk/ HTTP/1.1" 200 79767 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])" 207.46.13.86 - - [16/Oct/2017:22:25:24 +0000] "GET /search-bigboob+s+saree+woman+photo+pk/ HTTP/1.1" 200 24426 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" ::1 - - [16/Oct/2017:22:25:31 +0000] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.4.18 (Ubuntu) OpenSSL/1.0.2g (internal dummy connection)" 66.249.64.3 - - [16/Oct/2017:22:25:16 +0000] "GET /love+sex+move/ HTTP/1.1" 200 34386 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" 40.77.167.14 - - [16/Oct/2017:22:25:31 +0000] "GET /search-bbw+back+sid+girl+photos.com/ HTTP/1.1" 200 24797 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 66.249.70.19 - - [16/Oct/2017:22:25:32 +0000] "GET /74277 HTTP/1.1" 200 19774 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" 66.249.70.30 - - [16/Oct/2017:22:25:32 +0000] "GET /savita+babhi/ HTTP/1.1" 200 20327 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" root@ubuntu-1gb-nyc3-01:~# tail /var/log/apache2/access.log 8.37.233.40 - - [16/Oct/2017:22:26:54 +0000] "GET /download+video+bokep+jepang+rina+araki/ HTTP/1.1" 200 29346 "https://www.google.co.id/search?client=ucweb-b-bookmark&q=video+ngentot+rina+araki&oq=video+ngentot+rina+araki&aqs=mobile-gws-lite.." 
"Mozilla/5.0 (Linux; U; Android 6.0.1; en-US; SM-G532G Build/MMB29T) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 UCBrowser/11.3.5.972 U3/0.8.0 Mobile Safari/534.30" 49.34.127.70 - - [16/Oct/2017:22:26:56 +0000] "GET /xvillage+desi+8+saal+ki+bachi+ki+chudai+video/ HTTP/1.1" 200 31479 "android-app://com.google.android.googlequicksearchbox" "Mozilla/5.0 (Linux; Android 5.1.1; SM-J200G Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.91 Mobile Safari/537.36" 93.105.187.11 - - [16/Oct/2017:22:26:30 +0000] "GET /page-17/search-googleweblight.comlite_url+2+mom+big+naked+milky+boobs+images.com/date/ HTTP/1.1" 200 26231 "http://www.monsterboobshardpics.com/page-14/search-googleweblight.comlite_url+2+mom+big+naked+milky+boobs+images.com/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 216.244.66.228 - - [16/Oct/2017:22:27:08 +0000] "GET /search-big+assas+larag+ass+masive+ass+huge+cock+large+cock+hardcore+anal+gp+download+free/ HTTP/1.1" 200 97124 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])" 93.105.187.11 - - [16/Oct/2017:22:26:18 +0000] "GET /page-7/search-desi+bhabi+sexy+boob+press+fuck+pussy+mp+mobile+ipone/date/ HTTP/1.1" 200 24021 "http://www.monsterboobshardpics.com/search-desi+bhabi+sexy+boob+press+fuck+pussy+mp+mobile+ipone/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [16/Oct/2017:22:26:23 +0000] "GET /page-14/search-boobs+milk+breathing+imeges/date/ HTTP/1.1" 200 24406 "http://www.monsterboobshardpics.com/page-9/search-boobs+milk+breathing+imeges/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 207.46.13.183 - - [16/Oct/2017:22:27:07 +0000] "GET /page-15/search-african+black+pussy/ HTTP/1.1" 200 24382 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 202.46.58.190 - - [16/Oct/2017:22:27:06 +0000] "GET /search-big+black+fatty+boom+shemale+fuck/ HTTP/1.1" 200 24940 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36" 51.255.65.27 - - [16/Oct/2017:22:27:09 +0000] "GET /36331 HTTP/1.1" 200 19518 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.2; +http://ahrefs.com/robot/)" 93.105.187.11 - - [16/Oct/2017:22:26:08 +0000] "GET /page-12/search-tite+big+round+heavy+boobs+hd+pics/date/ HTTP/1.1" 200 25087 "http://www.monsterboobshardpics.com/page-7/search-tite+big+round+heavy+boobs+hd+pics/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" root@ubuntu-1gb-nyc3-01:~# tail /var/log/apache2/access.log 216.244.66.228 - - [16/Oct/2017:22:27:24 +0000] "GET /search-bbw+big+hips+mom+churidar+hot+photo+xxxin/ HTTP/1.1" 200 92852 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])" 46.229.168.79 - - [16/Oct/2017:22:27:25 +0000] "GET /52199 HTTP/1.1" 200 19569 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)" 93.105.187.11 - - [16/Oct/2017:22:27:06 +0000] "GET /search.php?q=leanne+crow+huge+boobs+fake&page=5 HTTP/1.1" 200 10927 "http://www.bigbigbigboobs.com/search.php?q=leanne+crow+huge+boobs+fake" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [16/Oct/2017:22:27:02 +0000] "GET /search.php?q=windian+bhabi+tight+salwar+gand+penty+showing+sexy+pic&page=2 HTTP/1.1" 200 10397 "http://www.bigbigbigboobs.com/search.php?q=windian+bhabi+tight+salwar+gand+penty+showing+sexy+pic" "Mozilla/5.0 (X11; 
Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" ::1 - - [16/Oct/2017:22:27:26 +0000] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.4.18 (Ubuntu) OpenSSL/1.0.2g (internal dummy connection)" 157.55.39.77 - - [16/Oct/2017:22:27:24 +0000] "GET /page-13/search-pornstar+aunty+sex+videos+downloadiporntv.net/ HTTP/1.1" 200 26726 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 216.244.66.233 - - [16/Oct/2017:22:27:24 +0000] "GET /303/ HTTP/1.1" 200 79410 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])" 93.105.187.11 - - [16/Oct/2017:22:27:16 +0000] "GET /?q=face+book+hot+nice+aunty+xxx+back+side+imagedate/ HTTP/1.1" 200 9673 "http://www.bigbigbigboobs.com/search.php?q=face+book+hot+nice+aunty+xxx+back+side+imagedate&page=6" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 40.77.167.62 - - [16/Oct/2017:22:27:25 +0000] "GET /search-anteysex+photo.com/ HTTP/1.1" 200 23531 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 180.76.15.8 - - [16/Oct/2017:22:27:26 +0000] "GET /page-3/search-japanese+boobs+pics/random/ HTTP/1.1" 500 637 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)" root@ubuntu-1gb-nyc3-01:~# tail /var/log/apache2/access.log 157.55.39.238 - - [16/Oct/2017:22:27:20 +0000] "GET /page-5/search-african+aunty+without+dress+and+bra+big+boobs+sexy+photos/ HTTP/1.1" 200 24875 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 207.46.13.20 - - [16/Oct/2017:22:27:28 +0000] "GET /page-16/search-sa+tranny+nude+pics/ HTTP/1.1" 200 25587 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 207.46.13.39 - - [16/Oct/2017:22:27:21 +0000] "GET /desi+girl+in+loose+tshirt+pics/ HTTP/1.1" 200 27165 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 157.55.39.29 - - [16/Oct/2017:22:27:26 +0000] "GET /page-14/search-hot+sexy+aunty+boobs+in+saree+hd+picturescom/ HTTP/1.1" 200 25844 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 202.46.57.88 - - [16/Oct/2017:22:27:28 +0000] "GET /page-5/search-naked+pics+of+nicole+charming/ HTTP/1.1" 200 24457 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36" 157.55.39.149 - - [16/Oct/2017:22:26:51 +0000] "GET /page-2/search-big+boobs+pandora+peaks+bikini/ HTTP/1.1" 200 0 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 202.46.58.166 - - [16/Oct/2017:22:27:28 +0000] "GET /search-lesbian+sucking+boobs/random/ HTTP/1.1" 200 24133 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36" 164.132.161.3 - - [16/Oct/2017:22:27:31 +0000] "GET /7241 HTTP/1.1" 302 3638 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.2; +http://ahrefs.com/robot/)" 207.46.13.152 - - [16/Oct/2017:22:27:30 +0000] "GET /search-big+boobs+tite+studant/ HTTP/1.1" 200 23030 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" 46.229.168.67 - - [16/Oct/2017:22:27:27 +0000] "GET /sunnyleone%20sexbeg/ HTTP/1.1" 200 20184 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)" Thanks, just in time |
10-16-2017, 06:11 PM | #161 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So, I limited Bing via robots.txt on all my sites. For now I see no big difference, but maybe it takes a while because of the cache.
I also found these rules in my .htaccess that should stop Yandex and China:
Code:
RewriteCond %{HTTP_USER_AGENT} ^.*MJ12bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Yandex [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Baidu [NC]
RewriteRule .* - [L,F]

Then I went to IP2Location, signed up and generated the file, but I did not understand how to use the file they gave me... I generated the Linux iptables version, and they gave me something like this:
Code:
iptables -A INPUT -s 104.146.0.0/18 -j DROP
iptables -A INPUT -s 104.146.100.0/22 -j DROP
iptables -A INPUT -s 104.146.104.0/21 -j DROP
iptables -A INPUT -s 104.146.112.0/24 -j DROP

Do I need to install iptables? Will UFW still work? I have seen some sites that say to open a UFW configuration file and add the lines, but my lines have a different format, and the files to modify indicated on those sites are always different... I also thought of manually changing this:
Code:
iptables -A INPUT -s 104.146.100.0/22 -j DROP
into this:
Code:
# block IP
-A ufw-before-input -s 104.146.100.0/22 -j DROP

But I'm not sure that doing this manually is a good idea; I'm not really understanding anything. And I would not use the rules in .htaccess, first because they also have a different format from what I used before, and then because there are so many of them...

Do I need to install Fail2ban? I really need to fix this quickly because my server is melting down -- can you help me?
10-17-2017, 06:09 AM | #162 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Code:
ufw deny from 192.187.100.58 to any;
ufw deny from 112.137.167.30 to any;
ufw deny from 82.117.194.229 to any;
ufw deny from 91.121.45.246 to any;
Code:
root@ds12-ams-2gb:/home/work# ufw status numbered
Status: active

     To                         Action      From
     --                         ------      ----
[ 1] 22                         ALLOW IN    99.30.xxx.xx/29
[ 2] 80                         ALLOW IN    Anywhere
[ 3] 443                        ALLOW IN    Anywhere
[ 4] xxxx                       ALLOW IN    99.30.xxx.xx/29
[ 5] 80,443/tcp                 ALLOW IN    Anywhere
[ 6] Nginx Full                 ALLOW IN    Anywhere
[ 7] Anywhere                   DENY IN     69.30.222.130
[ 8] Anywhere                   DENY IN     155.133.82.122
[ 9] Anywhere                   DENY IN     54.196.30.74
[10] Anywhere                   DENY IN     66.240.205.0/26
[11] Anywhere                   DENY IN     188.165.2.183
[12] Anywhere                   DENY IN     71.6.146.130
[13] Anywhere                   DENY IN     89.163.146.57
[14] Anywhere                   DENY IN     139.162.199.176
[15] Anywhere                   DENY IN     180.97.106.37
[16] Anywhere                   DENY IN     104.193.252.165
[17] Anywhere                   DENY IN     190.248.153.234
[18] Anywhere                   DENY IN     142.54.183.226
[19] Anywhere                   DENY IN     158.106.67.0/24
[20] Anywhere                   DENY IN     170.210.156.91
[21] Anywhere                   DENY IN     81.4.125.125
[22] Anywhere                   DENY IN     66.240.192.128/26
[23] Anywhere                   DENY IN     35.188.194.96
[24] Anywhere                   DENY IN     149.202.207.121
[25] Anywhere                   DENY IN     158.106.64.0/18
[26] Anywhere                   DENY IN     142.54.161.10
[27] Anywhere                   ALLOW IN    99.30.xx.xx/29 21
[28] Anywhere                   DENY IN     66.240.192.0/18
[29] Anywhere                   DENY IN     192.187.100.58
[30] Anywhere                   DENY IN     112.137.167.30
[31] Anywhere                   DENY IN     82.117.194.229
[32] Anywhere                   DENY IN     91.121.45.246
[33] 80 (v6)                    ALLOW IN    Anywhere (v6)
[34] 443 (v6)                   ALLOW IN    Anywhere (v6)
[35] 80,443/tcp (v6)            ALLOW IN    Anywhere (v6)
[36] Nginx Full (v6)            ALLOW IN    Anywhere (v6)
Code:
root@ds12-ams-2gb:/home/work# ufw delete 37
Deleting:
 allow 21/tcp
Proceed with operation (y|n)? y
Rule deleted

Mapping the rules is a better idea, but I haven't seen a good solution for ufw -- only for iptables and now nftables. ufw is an acronym for Uncomplicated FireWall:
UFW: The Linux Uncomplicated Firewall <uncomplicated tutorial
iptables is sort of hard to understand and has been superseded by
https://linux-audit.com/nftables-beg...fic-filtering/ <nftables

Baidu doesn't play by the rules regarding robots.txt and will use IPs to spider you without any user-agent signature that says 'baidu', making your .htaccess code useless. Get the IP CIDRs and block them in the ufw firewall.
10-17-2017, 04:23 PM | #163 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So, I downloaded the CIDR of the engines that I want to block
and launched this: Code:
while read line; do sudo ufw insert 1 deny from $line to any; done < cdir.txt

But in the access.log, the ones I see most often are:
opensiteexplorer.org/dotbot, [email protected]
semrush.com/bot.html
bing.com/bingbot.htm
ahrefs.com/robot/

Apart from Bing, the rest seem to be marketing tools, some more or less connected to Google or moz.com. I don't use them, and more to the point I don't need them if they destroy my server first... Can I block them? Again via IP and UFW? And in that case, which IPs should I block? Their IPs in my access.log keep changing, e.g.:
Code:
46.229.168.76 - - [17/Oct/2017:23:04:53 +0000] "GET /search-busty%20mom%20loves%20to%20suck%20cock/ HTTP/1.1" 200 24789 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)"
46.229.168.69 - - [17/Oct/2017:22:57:38 +0000] "GET /search-big%20brest%20sex%20photo/random/ HTTP/1.1" 200 21170 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)"
46.229.168.67 - - [17/Oct/2017:22:55:13 +0000] "GET /desi%20girls%20boobs%20suckers%20bees%20photos2015/ HTTP/1.1" 200 31832 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)"

Or is it better in this case to use robots.txt? Is there a serious list (robots.txt or IP) of bad bots to block?
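For the SEO crawlers listed above, robots.txt is usually enough: AhrefsBot, SemrushBot and DotBot all document that they obey it (unlike the spoofed traffic discussed earlier), so a minimal robots.txt entry would be something like:
Code:
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: dotbot
Disallow: /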
10-18-2017, 05:00 PM | #164 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So, in the meantime... I don't know if I've made a mess...
But I did this: I grepped the access.log for some of the bots
Code:
grep ahrefs /var/log/apache2/access.log

Then I copied a few thousand lines and wrote a PHP script that creates a file with only the IPs, one after the other, skipping duplicates:
Code:
<?
$my_database_txt = 'seznambot.txt';
$array_righi = file($my_database_txt);
$ip_list = '';   // initialise to avoid a PHP notice on first use
foreach($array_righi as $key => $capi){
    list($ip, $merda) = explode(" - - ", $capi);
    if(strpos($ip_list, $ip) === false){
        $ip_list = $ip_list.$ip.' ';
    }
}
$nome_file_index = "ip_list.txt";
//chmod($nome_file_index, 0666);
$file = fopen($nome_file_index, "w");
fwrite($file, $ip_list);
fclose($file);
?>

http://porn-update.com/temp/bad-bot-cidr.txt

Then, with the usual while loop, I added the rules to UFW:
Code:
while read line; do sudo ufw insert 1 deny from $line to any; done < /var/www/html/bad-bot-cidr.txt

(and this one removes them again if needed)
Code:
while read line; do sudo ufw delete deny from $line; done < /var/www/html/bad-bot-cidr.txt

The CPU graph goes up and down at the moment, but the load graph is slowly descending. I'll wait a little and see what happens...
10-18-2017, 06:47 PM | #165 | |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
If the only tool in your toolbox is a hammer that is how you screw in a wood screw ...
try this: Code:
$ cut -d'-' -f1 /home/work/domain.com/logs/access.log | grep -v '99\.3' | uniq -c | sort -nr | sed 's/\([0-9]\) \([0-9]\)/\1:\2/g' | less

(returns unique hits:IP -- the grep -v deletes your own IP pattern)

2742:173.208.249.226
189:158.69.229.6
155:160.202.163.148
153:78.190.44.124
91:82.165.75.132
64:178.137.82.201
62:46.2.77.72
62:201.18.18.173
62:201.18.18.173
62:185.81.155.40

Instead of | less you can end with > fileName.* -- you don't need a hammer to tighten a screw.

After checking the whois:
Code:
ufw deny from 173.208.249.224/29 to any;
Rule added
|
10-18-2017, 09:56 PM | #166 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
I thought of a fast way to get an IP's CIDR
Code:
$ whois 173.208.249.226 | grep 'CIDR:' | cut -d':' -f2 | sed -e 's/^/ufw deny from /g' -e 's/  / /g' -e 's/$/ to any;/g'

# returns
ufw deny from 173.208.249.224/29 to any;
ufw deny from 173.208.128.0/17 to any;
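If the generated lines look right, they can be fed straight to the shell instead of being copy-pasted (this just pipes the same one-liner to bash; it assumes the whois output has one CIDR per line, as in the example above, and that you are root):
Code:
whois 173.208.249.226 | grep 'CIDR:' | cut -d':' -f2 \
  | sed -e 's/^/ufw deny from /' -e 's/$/ to any/' | bash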
10-19-2017, 06:04 PM | #167 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So, I've done something...
Now there are virtually only the Google and Bing bots in my access.log. But CPU and load are still at absurd levels... The strangest thing is that something has changed on both my servers... This week the visits have not doubled (in fact, they have fallen a bit), but something has obviously changed, and I have no idea what it is... I have not changed anything, I searched in all the logs I know of, but I find nothing that can explain this sudden increase in CPU usage.

In the error logs I often find lines like this:
Code:
[Fri Oct 20 01:00:20.885615 2017] [core:error] [pid 4771] [client 37.9.113.202:36406] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.
[Fri Oct 20 01:00:25.593807 2017] [core:error] [pid 4771] [client 37.9.118.28:37754] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.

These days I have also been checking the sites often, and they seem to work well. Nixstats often says the mysqld process is at 130%, 150%, 180%; my sites definitely make heavy use of MySQL, but since I haven't changed anything, I do not understand why. (It was very high when I had problems importing tables, but it normalized after I fixed them.)

With visits not having increased, and many of the bots eliminated, who or what is using my CPU and my MySQL? I don't know what else to do to understand what's going on... I'm also kinda worried because visits usually increase on Saturday and Sunday, and I have no idea what will happen this week.
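A few generic checks that usually answer the "who is eating the CPU" question (standard tools; the slow-log path is MySQL's default and may differ on your box):
Code:
top -o %CPU                          # sort by CPU: is it apache2, php or mysqld?
mysqladmin -u root -p processlist    # what MySQL is executing right now
mysql -u root -p -e "SET GLOBAL slow_query_log=1; SET GLOBAL long_query_time=1;"
tail -f /var/lib/mysql/*-slow.log    # default slow-log location in the data dir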
10-20-2017, 05:03 AM | #168 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
https://www.google.com/search?client...utf-8&oe=utf-8
https://gist.github.com/JustThomas/141ebe0764d43188d4f2

I usually try searching for the exact error to get some idea -- this seems to be some .htaccess rewrite error...
10-20-2017, 08:02 PM | #169 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
It's the first thing I always do; I only ask here when I find nothing.
I found a lot of solutions for WordPress, but my sites are not WordPress, and I understand very little of URL rewriting... I wrote these rules a long time ago, following guides, and I never saw this error before, until I started managing my own server... My URL rewriting is really simple and stupid, and the error does not give many clues about what creates it.
Code:
RewriteEngine On
RewriteBase /

RewriteCond %{HTTP_USER_AGENT} ^.*MJ12bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Yandex [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Baidu [NC]
RewriteRule .* - [L,F]

RewriteRule ^page-([^/]+)/$ index.php?page=$1 [L]
RewriteRule ^([^/]+)\.html$ video.php?title=$1 [L]
RewriteRule channels/ channels.php [L]
RewriteRule tags/ tags.php [L]
RewriteRule ^tags-([^/]+)\/$ tags.php?letter=$1 [L]
RewriteRule search/ search.php [L]
RewriteRule ^search-([^/]+)$ search.php?query=$1 [L]
RewriteRule ^search-([^/]+)\/$ search.php?query=$1 [L]
RewriteRule ^page-([^/]+)/search-([^/]+)/$ /search.php?page=$1&query=$2 [L]
RewriteRule ^search-([^/]+)\/([^/]+)\/$ search.php?query=$1&sort=$2 [L]
RewriteRule ^page-([^/]+)/search-([^/]+)/([^/]+)/$ /search.php?page=$1&query=$2&sort=$3 [L]
RewriteRule ^([0-9]+)$ out.php?id_photo=$1 [L]
RewriteRule ^([0-9]+)/$ out.php?id_photo=$1 [L]
RewriteRule ^([0-9]+)-([^/]+)$ out.php?id_photo=$1 [L]

But I also found another thing... Last week Google decided to crawl my sites, all of them at once, and a lot of pages on each site... Now I'm thinking, and I don't know whether to:
wait a few days, since maybe when Google has finished this crawl the server will return to a normal regime; or
try to limit Google's consumption, for example by adding If-Modified-Since and Last-Modified handling in the headers of my pages. They were already there, but a while ago I had to comment them out because they created problems with some crappy VPS.
10-20-2017, 08:57 PM | #170 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
I had a problem with the new Apache2 version then changed to Nginx for other reasons.
Code:
</IfModule>
<Directory /home/xxxxx/xxx.xxx.com/public_html>
    Order allow,deny
    Allow from all
    # New directive needed in Apache 2.4.3:
    Require all granted
</Directory>
10-21-2017, 05:11 PM | #171 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
But where?
I remember you had already shown me this, but for some strange reason it is not in my .conf files... Where should I add it -- in the mysitecom.conf files? In sites-enabled or sites-available? Or both? Do I then have to disable and re-enable the sites? Restart Apache, clearly, but do I have to reboot the server too?

I want to fix this, because my logs are still full of those errors. Perhaps there is some new hope: in the last few hours something is changing; just as suddenly as it started, it seems perhaps to be returning to normal...
10-21-2017, 06:42 PM | #172 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Code:
cd /etc/apache2/sites-available
ls -1
cp file file.bk    # make a backup copy
nano file          # then edit
# then make the symbolic link
Code:
a2ensite <file>
/etc/init.d/apache2 reload
or
service apache2 reload

reload, as opposed to restart, just reloads the new configuration.
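One step worth adding before the reload (standard Apache tooling, not shown above): check the configuration syntax first so a typo doesn't take the sites down.
Code:
apache2ctl configtest     # prints "Syntax OK" or the offending file and line
service apache2 reload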
10-31-2017, 07:52 PM | #173 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So, I made the changes on all my domains, then I waited some days, because here things never seem to happen right away... maybe because of the various caches, but it always seems to take a few days to see the changes actually applied.
(Then I waited a few more days because of a flu and fever, dammit.)

Unfortunately the errors did not go away; in fact, I find some very strange ones, like this:
Code:
[Wed Nov 01 01:28:37.585964 2017] [core:error] [pid 31598] [client 113.200.85.109:36600] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace., referer: https://m.baidu.com/from=1005640b/bd_page_type=1/ssid=0/uid=0/pu=cuid%40Yiv_ugul2ill82uxgaBWiguwHtY5iHue_u2b8_uh2iqMuHi3A%2Cosname%40baidubrowser%2Ccua%40_a2St4aQ2igyNm65fI2UN_aLXioouvhBmmNqA%2Ccut%4099mjqk4iv9grivh5gaXliouRL8_4h2NlgNEHjA4msqqSB%2Ccsrc%40ios_box_suggest_txt%2Cctv%402%2Ccfrom%401005640b%2Cbbi%40ga2Mijah2uz3uSf3lh2ti_O3sf0kuSNT0uvo8guESilSu2iuA%2Ccen%40cuid_cua_cut_bbi%2Csz%401320_2004%2Cta%40iphone_1_11.0_5_4.10%2Cusm%401/baiduid=242780940BA28A3DDDE364D833464EE4/w=0_10_/t=iphone/l=1/tc?ref=www_iphone&lid=11746036396691745334&order=3&fm=alop&tj=www_normal_3_0_10_title&url_mf_score=3&vit=osres&m=8&cltj=cloud_title&asres=1&title=JapanesePornUpdates%7CJapanese%2CAsian%2CExotic...&dict=32&w_qd=IlPT2AEptyoA_yiiC6SnGjEtwQ4INvD8&sec=25105&di=2c8f31196ae39669&bdenc=1&tch=124.167.24.701.1.0&nsrc=IlPT2AEptyoA_yixCFOxXnANedT62v3IEQGG_yRZAje5mFqyavvxHcFqZj0bNWjMIEb9gTCc&eqid=a3024dde9f9ee8001000000359f92319&wd=&clk_info=%7B%22srcid%22%3A%221599%22%2C%22tplname%22%3A%22www_normal%22%2C%22t%22%3A1509499715089%2C%22sig%22%3A%2241388%22%2C%22xpath%22%3A%22div-a-h3%22%7D

The CPUs continue to come and go; over the last 15 days they have behaved in a really odd way. It seems that my sites need about 20% to run normally, and then when the search engines arrive they would need another 10 servers... I don't know if there is something that doesn't work, but in that case it doesn't work only with the search engines, because the sites themselves work very well. Visits have not seen major increases or losses; they are more or less stable.

In the access_log I see a lot of Google and especially a lot of Bing, even though I added Crawl-delay: 1 in robots.txt and reduced the crawl time and frequency in Bing Webmaster Tools.

At this point I'm kinda confused about what to do (maybe also because of the flu), and it's not clear to me whether everything is working fine or whether there is something that really doesn't work. Any suggestions to understand what the situation actually is and whether I can do something to improve it?
10-31-2017, 08:48 PM | #174 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
1 sec? Try 5 (a 5-second crawl delay).
Run it until it croaks -- who wrote that PHP script that is (possibly) leaking?

whois 113.200.85.109
CHINA, of course
11-01-2017, 08:01 PM | #175 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Sorry, this time I did not really understand what I have to do; maybe I'm still kinda foggy from the flu...
I wrote every single line of code of all my sites; my rule is to write code in the simplest possible way, to avoid as many problems as possible. I can't figure out what is creating the problems or how to fix them. Sorry, I'm not at my best these days...
11-01-2017, 10:27 PM | #176 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Do I have to activate/set something on the server in order to use these headers?
PHP Code:
|
11-02-2017, 07:01 AM | #177 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
IDK but if you make a script and run it -- then look at the browser w/ firefox live HTTP Headers add on; or,
curl your test page URL Code:
barry@paragon-DS-7:~$ curl -X HEAD -i 'https://gfy.com/'
Warning: Setting custom HTTP method to HEAD with -X/--request may not work the
Warning: way you want. Consider using -I/--head instead.
HTTP/1.1 200 OK
Date: Thu, 02 Nov 2017 13:55:53 GMT
Content-Type: text/html; charset=ISO-8859-1
Connection: keep-alive
Set-Cookie: __cfduid=dd618d22df242d865d99e4178888286b31509630953; expires=Fri, 02-Nov-18 13:55:53 GMT; path=/; domain=.gfy.com; HttpOnly
Set-Cookie: bbsessionhash=ecf9c70481cf1feb1bf1dd91ac56c2b1; path=/; HttpOnly
Set-Cookie: bblastvisit=1509630953; expires=Fri, 02-Nov-2018 13:55:53 GMT; path=/
Set-Cookie: bblastactivity=0; expires=Fri, 02-Nov-2018 13:55:53 GMT; path=/
Expires: Thu, 02 Nov 2017 13:55:52 GMT
Cache-Control: no-cache
Pragma: no-cache
X-UA-Compatible: IE=7
Server: cloudflare-nginx
CF-RAY: 3b77981133232579-ORD

On the SEO question: if your content has not changed, then you would be gaming Googlebot and the other indexing bots...
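A hedged example of testing the conditional-request side from the shell once Last-Modified is being sent (standard curl flags; the URL and date are placeholders) -- a 304 Not Modified response means If-Modified-Since is being honoured:
Code:
curl -I -H 'If-Modified-Since: Thu, 02 Nov 2017 13:55:53 GMT' 'http://www.example.com/somepage.html'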
11-02-2017, 04:20 PM | #178 |
Confirmed User
Industry Role:
Join Date: Jul 2017
Location: Worldwide
Posts: 37
|
Did not read the whole 4 pages, but:
- Any slow DB queries, or intensive queries that are being executed together? Enable the slow query log and check what will happen. If there are slow queries that mark the start of the overloads, are they malformed queries or are they regular ones? If they are a result of the overload and not the cause, they would pop up after the CPU spikes, so then what is PHP doing at the same time?
- Is the DB itself optimised, with indexes and everything?
- Is disk I/O out of the question, or is there a significant disk wait during the time when the CPU hits 100%, or shortly before that moment?
- What does "top" show when this happens? Is PHP spiking its CPU usage or is MySQL showing on top?

My opinion is that you should hire someone and get them to profile the whole thing if that is possible, as the graphs above don't provide any clues. And I don't think that you have a problem with bots either.
11-02-2017, 08:59 PM | #179 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
If it were a hardware issue, DigitalOcean would be aware of it. You are using a VPS cloud, so you are not an isolated user with dedicated hardware.
You are having the same sort of problems you were having with your shared hosting.
Code:
root@paragon-DS-7:/var/log/mysql# zcat error.log.4.gz
root@paragon-DS-7:$ cat /var/log/apache2/error.log

Check your PHP code.
Code:
$ php file.php > file_name_warnings.txt
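A related quick check with the PHP CLI (standard flag): lint a script for parse errors without executing it.
Code:
php -l search.php    # prints "No syntax errors detected" or the first parse error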
11-02-2017, 10:25 PM | #180 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So, I think we need a fuller view.
On the other VPS it's hard to say; the sites could not even stay online 24 hours there... However, on DigitalOcean I have 2 servers, one in NY and the other in Bangalore, and they practically go into crisis together.

The error that I find most often in the logs is still the one about too many redirects; it has not gone away and I do not know what else to do, but it is an error that is always there, not only when the server goes into crisis, but also when the CPU is working at 20-30%.

The page with the most visits on all my sites is the internal search engine -- about 80-90% of the visits -- and a MySQL table that also contains about 80-90% of the content of the site, in this case about 400,000 search tags and about 5,000 pages of content. This is the structure of the table:
Code:
-- Table structure for table `tgp_search`
CREATE TABLE `tgp_search` (
  `id_search` int(11) NOT NULL,
  `query` varchar(255) NOT NULL,
  `views` int(11) NOT NULL DEFAULT '1',
  `insitemap` int(1) NOT NULL DEFAULT '0',
  `data_ins` varchar(255) NOT NULL DEFAULT '1388796621',
  `last_mod` varchar(255) DEFAULT NULL,
  `engine` varchar(255) DEFAULT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1;

-- Indexes for table `tgp_search`
ALTER TABLE `tgp_search`
  ADD PRIMARY KEY (`id_search`),
  ADD KEY `query` (`query`);
ALTER TABLE `tgp_search` ADD FULLTEXT KEY `query_2` (`query`);

-- AUTO_INCREMENT for table `tgp_search`
ALTER TABLE `tgp_search`
  MODIFY `id_search` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=409143;
Code:
SELECT query, MATCH(query) AGAINST('tits') as score_search FROM tgp_search WHERE MATCH(query) AGAINST('tits') ORDER BY views, score_search DESC LIMIT 25

UPDATE tgp_search SET views='2407', last_mod = '1509683869' WHERE query = :query

SELECT SQL_CALC_FOUND_ROWS id_photo, title, description, url, img, MATCH(title, description) AGAINST('tits') as score FROM tgp_photo WHERE MATCH(title, description) AGAINST('tits') ORDER BY score DESC LIMIT 0, 66

SELECT FOUND_ROWS() AS 'found_rows';

SELECT query FROM tgp_search WHERE MATCH(query) AGAINST('tits') ORDER BY views DESC LIMIT 25
Code:
SELECT SQL_CALC_FOUND_ROWS id_photo, title, description, url, img, MATCH(title, description) AGAINST('tits') as score FROM tgp_photo WHERE MATCH(title, description) AGAINST('tits') ORDER BY score DESC LIMIT 0, 66

Compared to a simple query like: LIKE '%boobs%'

I have made complete PDFs of the Nixstats statistics of the last 15 days, including everything. I/O is used very little, as is the disk; the RAM is little, but that's all we have; the most active are the CPU and the MySQL process.
ubuntu-1gb-nyc3-01-NIXStats.pdf
ubuntu-2gb-blr1-14-04-3-NIXStats.pdf

Obviously the site uses MySQL heavily: when the CPU goes to 100%, the MySQL process goes to 150-200% (although I do not understand how that is possible).

I activated the log for long queries about 4 hours ago; for now it is empty.

I almost thought I'd try blocking Bing for a few days, just to see what happens and whether it could be its fault, but maybe that's a stupid idea. I'm very undecided on whether or not to use the Last-Modified and If-Modified-Since headers, because I'm always afraid that once you set them, big G doesn't consider you anymore.

Anyway, now my sites are online 24 hours a day, and they seem fast and work without problems even when the server goes into crisis.
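One way to check whether those FULLTEXT queries actually use the index rather than scanning the table is a plain EXPLAIN (standard MySQL; the database name is a placeholder):
Code:
mysql -u root -p -e "EXPLAIN SELECT id_photo FROM tgp_photo
  WHERE MATCH(title, description) AGAINST('tits');" your_database
# the type column should show "fulltext"; ALL would mean a full table scan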
11-10-2017, 02:27 PM | #181 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So, these days I tried, studied, and tested, but I still understand very little of .htaccess and URL rewriting...
This error is still there... it's always there...
Code:
[Fri Nov 10 21:22:42.691073 2017] [core:error] [pid 23733] [client 95.108.129.196:58013] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.

I did not use the structure indicated in the tutorials ("/var/www/html/site.com/public_html"); I never liked public_html, so I ignored it: the files of my sites are directly in site.com. All my .htaccess files start with this line:
Code:
RewriteBase /

A virtually identical error occurs with WordPress multisite (which I have never used and do not know) on Apache 2.4. The solution seems to be this couple of lines:
Code:
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f

Any ideas?

P.S. One other thing: can I run a different PHP version on a single site -- PHP 7 in general, and PHP 5 on one specific site?
11-11-2017, 08:05 AM | #182 | |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
mod_rewrite - Apache HTTP Server Version 2.4
Quote:
!-d   not a directory
!-f   not a file
!     is a negation
=     equal to
!=    not equal to
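A hedged sketch of how those two conditions are normally placed directly above a rewrite rule, so that requests for real files and directories are never rewritten (the rule reuses one of the patterns posted earlier; each RewriteCond only applies to the RewriteRule that follows it):
Code:
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^search-([^/]+)\/$ search.php?query=$1 [L]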
|
11-13-2017, 02:53 PM | #183 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
I don't think that's my solution...
None of this exists physically on the server; it is all URL rewriting:
Code:
http://www.alternativegirlshardpics.com/search-boobs/
Code:
http://www.alternativegirlshardpics.com/search.php?query=boobs

I'm also starting to doubt that the problem is the URL rewriter. On the same page, there is also this:
Code:
tail -f error_log | fgrep '[rewrite:'

Today I decided to download the log files locally; the result is 0, not found.

Over the weekend I also tried blocking Bing via the IP list downloaded from IP2Location. The CPU behavior hasn't changed much -- Bing is not the problem.
11-13-2017, 09:12 PM | #184 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
cd to the path for error_log
Code:
$ tac error_log | grep 'rewrite:' | cat -n | less
11-14-2017, 06:28 PM | #185 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
The output is empty... nothing... only: (END)
Very strange... there seem to be no errors from the URL rewriter, but searching the web for the redirect error, the only solution I find refers to correcting an .htaccess error in WordPress multisite URL rewriting... I'm really confused.
11-15-2017, 09:49 PM | #186 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Sometimes strange things happen...
For example, I deleted a site: I disabled it, deleted the files and databases, and deleted the virtual host files, but I forgot to change the DNS... All of that site's requests ended up on the first site (in alphabetical order) on my server... Now I have changed the DNS too, but I keep seeing errors like this:
Code:
[Thu Nov 16 04:15:35.503536 2017] [:error] [pid 4988] [client 66.249.64.128:48685] script '/var/www/html/mysite.com/search.php' not found or unable to stat

But I find even stranger errors, like this:
Code:
[Thu Nov 16 04:16:00.637321 2017] [:error] [pid 4835] [client 199.59.91.34:49152] script '/var/www/html/mysite.com/status.php' not found or unable to stat

Sometimes I also see errors related to WordPress -- directories or files like wp-admin or wp-login -- but there is no WordPress on my servers.
Code:
[Thu Nov 16 04:43:18.759137 2017] [:error] [pid 7967] [client 120.28.68.192:64647] script '/var/www/html/veryhardsexupdates.com/wp-login.php' not found or unable to stat
[Thu Nov 16 04:45:10.462677 2017] [:error] [pid 10408] [client 120.28.68.192:64801] script '/var/www/html/extremehardpics.com/wp-login.php' not found or unable to stat
11-15-2017, 11:44 PM | #187 | |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Code:
barry@paragon-DS-7:~$ host 66.249.64.128
128.64.249.66.in-addr.arpa domain name pointer crawl-66-249-64-128.googlebot.com.
barry@paragon-DS-7:~$ host 199.59.91.34
34.91.59.199.in-addr.arpa domain name pointer cs2508.mojohost.com.
barry@paragon-DS-7:~$ host 120.28.68.192
Host 192.68.28.120.in-addr.arpa. not found: 3(NXDOMAIN)

So there is a 404 -- so what? You could set up a Permanent Redirect.
Quote:
|
|
11-16-2017, 05:36 PM | #188 | |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Something really weird is happening...
Today I got an AdSense alert with this URL:
Quote:

The structure with subdomains (e.g. analvideoupdates.bigbigbigboobs.com) existed on the old VPS, where bigbigbigboobs.com was the "main domain" and a sub-domain of the main domain was created for every domain added. I have absolutely no idea why, or what they were supposed to do, but that idiot cPanel created them... I never indexed or used them, but for some strange reason the search engines seem to know they exist.

The thing that worries me now is that every time I changed VPS the "main domain" changed: to avoid putting all the sites offline, a new one was created and everything was quietly moved over. Now I have no idea how many of these subdomains have been created, and under which "main domains"...

For now I have temporarily worked around the situation by moving a site, so that now even the first one is a porn site. I do not think I can do everything with Apache aliases; I thought I would create a script that intercepts the referer (including domain, subdomain and query string) and redirects, based on the sub-domain, to the right site.
|
11-16-2017, 08:17 PM | #189 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
If you are redirecting 404s -- DON'T.
Better that links which do not exist dead-end and get deindexed.
11-17-2017, 07:08 AM | #190 |
So Fucking Banned
Industry Role:
Join Date: Apr 2017
Posts: 104
|
Interesting Topic :3
|
11-17-2017, 12:38 PM | #191 | |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
What do you mean?
I have this in all my .htaccess files, plus a 404 page on all my sites (e.g. 404 Page Not Found | Alternative Girls Hard Pics):
Code:
ErrorDocument 404 /404.php
ErrorDocument 403 /403.php

I thought I would recover something with these 2 sites (one per server, the first site in alphabetical order): Adult Hashtag and Adult Sex Search. This type of site saves searches from the query string... at least I recover the search keywords that the search engines read on all those sub-domains.

What do you think? Stupid idea?
|
11-17-2017, 01:16 PM | #192 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Your 404.php is OK -- it says 'page not found'.
I meant a 404 that 302-redirects to index.php.
11-24-2017, 02:13 PM | #193 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Are the permissions correct if practically everything in the /html directory has these?
Code:
-rwxr-xr-x www-data www-data |
11-24-2017, 05:19 PM | #194 |
It's 42
Industry Role:
Join Date: Jun 2010
Location: Global
Posts: 18,083
|
Generally it is either
user:www-data
or
www-data:www-data -- if you don't want to put files up via sftp or ftp.
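A common permission layout for a web root owned by a deploy user with the web server's group ('youruser' and the path are placeholders; this is a convention, not a requirement):
Code:
chown -R youruser:www-data /var/www/html/site.com
find /var/www/html/site.com -type d -exec chmod 755 {} \;
find /var/www/html/site.com -type f -exec chmod 644 {} \;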
12-02-2017, 07:28 PM | #195 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Maybe I found the bastard who's cloning my sites.
Right now my server is at 100% CPU, 100% memory, load 50, and the sites are practically offline. In the access.log I see a lot of accesses from this IP: 93.105.187.11
Code:
93.105.187.11 - - [03/Dec/2017:02:10:52 +0000] "GET /page-12/search-mooty+mooty+boobs+dhod+figar+onley+fack+pick+full+screen+hd/random/ HTTP/1.1" 200 25269 "http://www.monsterboobshardpics.com/page-9/search-mooty+mooty+boobs+dhod+figar+onley+fack+pick+full+screen+hd/random/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:10:52 +0000] "GET /page-2/search-gp+tranny+danika+haze+anal+porn/random/ HTTP/1.1" 200 23399 "http://www.monsterboobshardpics.com/page-3/search-gp+tranny+danika+haze+anal+porn/random/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:10:55 +0000] "GET /page-1/search-indian+aunty+visible+nipples+in+clothes/date/ HTTP/1.1" 200 23789 "http://www.monsterboobshardpics.com/search-indian+aunty+visible+nipples+in+clothes/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:10:54 +0000] "GET /page-19/search-yasmine+james+boobs+all+pic/random/ HTTP/1.1" 200 24789 "http://www.monsterboobshardpics.com/page-23/search-yasmine+james+boobs+all+pic/random/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:10:52 +0000] "GET /page-2/search-north+indian+mix+pussy+photo+fap/popular/ HTTP/1.1" 200 23788 "http://www.monsterboobshardpics.com/search-north+indian+mix+pussy+photo+fap/popular/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:12:30 +0000] "GET /page-15/search-extra+large+booty+stuffed+with+black+cock+gp+king/date/ HTTP/1.1" 200 24549 "http://www.monsterboobshardpics.com/page-10/search-extra+large+booty+stuffed+with+black+cock+gp+king/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:12:29 +0000] "GET /page-9/search-anna+carlene+milky+nipples+porn/ HTTP/1.1" 200 23543 "http://www.monsterboobshardpics.com/page-12/search-anna+carlene+milky+nipples+porn/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:12:29 +0000] "GET /page-11/search-a+huge+bellied+ssbbw+chuddy+3gp+video+sex/date/ HTTP/1.1" 200 24278 "http://www.monsterboobshardpics.com/page-6/search-a+huge+bellied+ssbbw+chuddy+3gp+video+sex/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:15:18 +0000] "GET /search.php?q=latin+angels+images+of+2014+of+hot+and+sexy+models&sort=date&page=1 HTTP/1.1" 200 25222 "http://www.bigbigbigboobs.com/search.php?q=latin+angels+images+of+2014+of+hot+and+sexy+models&sort=date" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=video+hd+karla+james+bath+bubbles+and+boobs&sort=date&page=8 HTTP/1.1" 200 24747 "http://www.bigbigbigboobs.com/search.php?q=video+hd+karla+james+bath+bubbles+and+boobs&sort=date" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=lila+bbw+pictures&sort=date&page=4 HTTP/1.1" 200 23594 "http://www.bigbigbigboobs.com/search.php?q=lila+bbw+pictures&sort=date&page=1" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=patti+with+kerala+girls&sort=random&page=4 HTTP/1.1" 200 25217 "http://www.bigbigbigboobs.com/search.php?q=patti+with+kerala+girls&sort=random" "Mozilla/5.0 
(X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=sexy+video+downloddate&sort=random HTTP/1.1" 200 25473 "http://www.bigbigbigboobs.com/search.php?q=sexy+video+downloddate&page=1" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=malu+big+boobs+in+t+shirt+pics&sort=popular&page=10 HTTP/1.1" 200 25418 "http://www.bigbigbigboobs.com/search.php?q=malu+big+boobs+in+t+shirt+pics&sort=popular" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=w+feat+auntybhabhi+moti+gand+hole+fuckingphotodate&sort=date&page=1 HTTP/1.1" 200 24731 "http://www.bigbigbigboobs.com/search.php?q=w+feat+auntybhabhi+moti+gand+hole+fuckingphotodate&sort=date" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=bigblack+mama+wet+pussy+pics&sort=popular&page=9 HTTP/1.1" 200 25187 "http://www.bigbigbigboobs.com/search.php?q=bigblack+mama+wet+pussy+pics&sort=popular" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:14:59 +0000] "GET /search.php?q=aucking+big+boobs+mothers+son+photos&sort=random&page=3 HTTP/1.1" 200 25325 "http://www.bigbigbigboobs.com/search.php?q=aucking+big+boobs+mothers+son+photos&sort=random" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" I tried to block it with this: ufw deny from 93.105.187.11 to any; But I still see access from this IP... am I doing something wrong? Why isn't it blocked? The firewall is enabled and I have also restarted the server. |
12-02-2017, 07:39 PM | #196 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
Now my second server has started going into crisis too, and I am finding the same IP here as well
Code:
93.105.187.11 - - [03/Dec/2017:02:35:43 +0000] "GET /page-23/search-pics+gigantomastia+granny+tits/ HTTP/1.1" 200 24056 "http://www.bigboobsupdate.com/page-26/search-pics+gigantomastia+granny+tits/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:35:44 +0000] "GET /page-4/search-desi+xxx+penti+ass+photo/popular/ HTTP/1.1" 200 25648 "http://www.bigboobsupdate.com/search-desi+xxx+penti+ass+photo/popular/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:34:55 +0000] "GET /page-28/search-xxx+pussy+4minit+hd/ HTTP/1.1" 200 25543 "http://www.bigboobsupdate.com/page-24/search-xxx+pussy+4minit+hd/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:34:54 +0000] "GET /page-4/search-japan+wife+gif+fucking+photo/longest/ HTTP/1.1" 200 26546 "http://www.bigboobsupdate.com/search-japan+wife+gif+fucking+photo/longest/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:34:54 +0000] "GET /page-8/search-beautiful+girls+fucked+stepwise+photo/ HTTP/1.1" 200 26726 "http://www.bigboobsupdate.com/search-beautiful+girls+fucked+stepwise+photo/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:34:53 +0000] "GET /page-7/search-japanese+xxl+tits+squirty+naked+sex/popular/ HTTP/1.1" 200 25591 "http://www.bigboobsupdate.com/search-japanese+xxl+tits+squirty+naked+sex/popular/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:34:15 +0000] "GET /page-14/search-aunty+boob+popout+from+blous+image/ HTTP/1.1" 200 23940 "http://www.bigboobsupdate.com/page-4/search-aunty+boob+popout+from+blous+image/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" 93.105.187.11 - - [03/Dec/2017:02:34:18 +0000] "GET /page-6/search-big+monster+suck+lmage/popular/ HTTP/1.1" 200 25986 "http://www.bigboobsupdate.com/page-5/search-big+monster+suck+lmage/popular/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0" |
12-03-2017, 10:25 AM | #197 | |
♥♥♥ Likes Hugs ♥♥♥
Industry Role:
Join Date: Nov 2001
Location: /home
Posts: 15,841
|
Quote:
|
12-03-2017, 03:18 PM | #198 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
I don't have allow rules, just apache, postfix and ssh...
My UFW is really weird. If I try again:
Code:
ufw deny from 93.105.187.11 to any;
Skipping adding existing rule

And this IP continues to pull content from my server... UFW is not doing anything...

I saw that DigitalOcean can apply a firewall to the droplet (DigitalOcean Cloud Firewalls, https://www.digitalocean.com/communi...loud-firewalls), whose limits are: "Total incoming and outgoing rules per firewall: 50". I do not use it because I trust Ubuntu's UFW, but maybe the DigitalOcean firewall also affects UFW? Does it only read the first 50 rules? (I have 534 now; it would really be crap if it worked like that...)
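Two standard checks to see whether that deny rule actually reached the kernel -- ufw is only a front end for iptables, and rules added by hand land in the ufw-user-input chain:
Code:
sudo ufw status numbered | grep 93.105.187.11
sudo iptables -L ufw-user-input -n | grep 93.105.187.11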
12-03-2017, 04:51 PM | #199 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
As a last hope, I tried to launch this:
Code:
iptables -I INPUT -s 93.105.187.11 -j DROP

But I have no idea how iptables works, and from what I understand, if I reboot the server everything resets... Can anyone help me figure out how to properly set up iptables, or how to make UFW work properly?
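For reference, the usual way to keep plain iptables rules across reboots on Ubuntu 16.04 (this is the package the next post ends up installing; on 14.04 the save step goes through the iptables-persistent init script instead):
Code:
sudo apt install iptables-persistent   # offers to save the current rules
sudo netfilter-persistent save         # writes them to /etc/iptables/rules.v4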
12-06-2017, 04:27 PM | #200 |
Confirmed User
Industry Role:
Join Date: Apr 2014
Posts: 385
|
So... these days I studied a little iptables (very little).
I worked up a little courage... I ran sudo ufw reset and deleted all the rules of the server firewall. Then I re-applied the basic rules both in UFW and in iptables. I found out how to block an IP and a list of IPs with this guide: Aggiungere regole a iptables da lista IP e renderla persistente (adding rules to iptables from an IP list and making them persistent), and added all the IPs that I want to block (regional, Baidu, Yandex, bad spiders, etc.). I installed iptables-persistent, saved, and restarted the server...

Aaaand I made a mess... The sites seem to be online and work well, but initially I did not receive mail from the server (e.g. the cronjob mails) and I did not see the Nixstats stats. With some adjustments Nixstats now works and some mail has arrived, but I'm afraid I have made some other mess, because the statistics of the server with Ubuntu 14.04 in particular (my other server is 16.04) have fallen drastically these days -- or at least that's what the statistics services say... but maybe I blocked Yandex Metrica's and Analytics' access to the server? I don't know...

Today, after some modifications and a reboot, when I opened the rules.v4 file I found it empty... Did I miss something? I don't really know what I'm doing...

This is my current rules.v4 -- is something missing? Is there anything that shouldn't be there?
http://porn-update.com/temp/rules.v4
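On the empty rules.v4: that file is only a snapshot and is not updated automatically, so after changing rules it has to be re-saved (assuming the iptables-persistent / netfilter-persistent setup mentioned above):
Code:
sudo netfilter-persistent save   # rewrite /etc/iptables/rules.v4 from the live rules
sudo iptables -S | less          # review what is actually loaded in the kernel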