Gossamer Forum
Home : Products : Links 2.0 : Discussions

My LINKS Site Has Been Hacked! Beware of yours...

Help! My site gets a lot of hits a day, and I am the target of some hacker who is screwing up my database. I'm pretty sure it is a hacker and not a software problem, because I disabled admin.cgi and somehow the hacks still persist. My link is

http://lyricsearch.virtualave.net

Do you know what file I could replace, or something like that?
P.S. My url.db file is 0 bytes. I'm guessing that's not a good thing.
The error I get now is:
Link ID etc... not Found.

In Search Of Serious Help.

------------------
lyricsearch99
Re: My LINKS Site Has Been Hacked! Beware of yours...
One thing that you can do is add url.db and the other files in your data directory to the back-up process in the nph-build.cgi file, so that you can restore them later. I would also recommend securing your data and backup directories via .htaccess.
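For the .htaccess part, something along these lines in each of the data and backup directories is the usual recipe on Apache-style hosts like VirtualAve (a sketch only; not tested against their particular setup, and it only blocks web access, not FTP or telnet):

```apache
# data/.htaccess and backup/.htaccess -- deny all web access
# to the flat database files. FTP/telnet access is unaffected.
<Limit GET POST>
order deny,allow
deny from all
</Limit>
```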

However, it seems that the hack is coming in via telnet or FTP, and .htaccess does not stop anyone with that kind of access from deleting or editing files.

Unfortunately, the only way you are going to be able to restore it now is to rebuild the url.db file manually from your links.db file.
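That manual rebuild can be scripted rather than done by hand. The sketch below makes two assumptions you must verify against your own install first: that links.db is pipe-delimited with the link ID in field 0 and the URL in field 2 (field positions come from your links.def), and that url.db holds one "ID|URL" line per link (check how jump.cgi reads url.db and adjust the print to match):

```perl
#!/usr/bin/perl
# Hypothetical rebuild of url.db from links.db.
# ASSUMPTIONS (verify before running):
#   - links.db: pipe-delimited, link ID in field 0, URL in field 2
#   - url.db:   one "ID|URL" line per link
use strict;
use warnings;

sub rebuild_url_db {
    my ($links_db, $url_db) = @_;
    open my $in,  '<', $links_db or die "Can't read $links_db: $!";
    open my $out, '>', $url_db   or die "Can't write $url_db: $!";
    my $count = 0;
    while (my $line = <$in>) {
        chomp $line;
        next unless $line;                       # skip blank lines
        my ($id, undef, $url) = split /\|/, $line;
        print $out "$id|$url\n";
        $count++;
    }
    close $in;
    close $out;
    return $count;
}

# Only run when the data file is actually present.
if (-e 'data/links.db') {
    my $n = rebuild_url_db('data/links.db', 'data/url.db');
    print "Rebuilt data/url.db with $n entries\n";
}
```

With 600-odd links this runs in well under a second, which beats re-entering them by hand.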

Regards,

------------------
Eliot Lee
Founder and Editor
Anthro TECH, L.L.C
http://www.anthrotech.com/
info@anthrotech.com
==========================
Coconino Community College
http://www.coco.cc.az.us/
Web Technology
Coordinator
elee@coco.cc.az.us
Re: My LINKS Site Has Been Hacked! Beware of yours...
You could modify jump.cgi to look the URL up in links.db instead. Slower, though.

Jerry
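Roughly, that change means scanning links.db for the requested ID on every jump.cgi request instead of consulting url.db. A hedged sketch of such a lookup helper (field positions assumed from a stock links.def: ID in field 0, URL in field 2; adjust to yours):

```perl
# Hypothetical fallback for jump.cgi: resolve a link ID straight from
# links.db instead of url.db. Slower, because it scans the whole file
# on every request. Assumes pipe-delimited records with the link ID in
# field 0 and the URL in field 2 -- check your links.def and adjust.
use strict;
use warnings;

sub url_from_links_db {
    my ($links_db, $id) = @_;
    open my $fh, '<', $links_db or die "Can't open $links_db: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my @rec = split /\|/, $line;
        if (defined $rec[0] and $rec[0] eq $id) {
            close $fh;
            return $rec[2];   # the URL field
        }
    }
    close $fh;
    return undef;             # caller shows the usual "not found" error
}
```

jump.cgi would then print a `Location:` redirect when the lookup succeeds, and fall through to its error page otherwise.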
Re: My LINKS Site Has Been Hacked! Beware of yours...
OK, how would I rebuild it manually? I think that would be very hard for me, because I have over 600 links (lyrics) listed on the site, so that's a lot of manual labor. How can I modify the back-up procedure to include url.db? Any help would be greatly appreciated. Also, about the .htaccess: I have tried that, but somehow it keeps failing and I don't know what the problem is. If anyone here is on VirtualAve and knows how to configure .htaccess, please let me know, because my site is quite vulnerable at the moment.

thanks.
Brian
http://lyricsearch.virtualave.net

------------------
lyricsearch99
Re: My LINKS Site Has Been Hacked! Beware of yours...
And change all of your passwords ASAP. If they are getting in by FTP or telnet, they have obviously gotten hold of your VirtualAve login and password.
Re: My LINKS Site Has Been Hacked! Beware of yours...
To add url.db to your backup scheme, do the following:

1) In the sub build_backup routine in your nph-build.cgi file, add the following code:

Code:
&File::Copy::copy ("$db_script_path/data/url.db", "$db_script_path/backup/$date.url.db") or &cgierr ("Unable to copy url backup. Reason: $!");

BEFORE the following line:

Code:
# Otherwise, the ugly way.

2) Replace the following line:

Code:
print "\tBacking up links, category and email database (Regular - $@) ... \n";

with the following:

Code:
print "\tBacking up links, category, email, and url database (Regular - $@) ... \n";

3) Replace the following line:

Code:
foreach (qw!links categories email!) {

with the following:

Code:
foreach (qw!links categories email url!) {

So, the complete new sub-routine should look like the following:

Code:
sub build_backup {
# --------------------------------------------------------
# Backs up important database files.
#
my $date = &get_date;
if (-e "$db_script_path/backup/$date.links.db") {
print "\tBackup exists for today.. Skipping\n";
return;
}

# Try to do it the right way..
eval { require File::Copy; };
if (!$@) {
print "\tBacking up links, category, email, and url database (File::Copy) ... \n";
&File::Copy::copy ("$db_script_path/data/links.db", "$db_script_path/backup/$date.links.db") or &cgierr ("Unable to copy links backup. Reason: $!");
&File::Copy::copy ("$db_script_path/data/categories.db", "$db_script_path/backup/$date.category.db") or &cgierr ("Unable to copy category backup. Reason: $!");
&File::Copy::copy ("$db_script_path/data/email.db", "$db_script_path/backup/$date.email.db") or &cgierr ("Unable to copy email backup. Reason: $!");
&File::Copy::copy ("$db_script_path/data/url.db", "$db_script_path/backup/$date.url.db") or &cgierr ("Unable to copy url backup. Reason: $!");
}
# Otherwise, the ugly way.
else {
print "\tBacking up links, category, email, and url database (Regular - $@) ... \n";
foreach (qw!links categories email url!) {
open (TMP, "$db_script_path/data/$_.db") or &cgierr ("Unable to open $db_script_path/data/$_.db. Reason: $!");
open (TMPOUT, ">$db_script_path/backup/$date.$_.db") or &cgierr ("Unable to open $db_script_path/backup/$date.$_.db. Reason: $!");
while (<TMP>) {
print TMPOUT;
}
close TMP;
close TMPOUT;
}
}
}

Hope this helps.

Regards,

------------------
Eliot Lee