Gossamer Forum
Home : Products : Gossamer Links : Version 1.x :

Possible nice feature

Here's an idea I got from someone using PHP that might carry over well to Links SQL. What he has done is randomize the meta tags throughout his site, keeping the odds high of at least one spider ranking pages well at any given time. As long as the meta tags are database driven, this could be done pretty easily with Links, couldn't it?

You could have the standard field for adding keywords, plus have the option to turn randomizing on or off. If it's on, it would select a certain number of words or characters up to a default setting.
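A very rough sketch of what that option could look like (everything here is hypothetical -- the field names and selection rule are made up for illustration, not actual Links code):

```perl
#!/usr/bin/perl
# Rough sketch of the proposed option: pick a random subset of a link's
# keywords whose combined length stays under a character budget.
# Everything here is hypothetical -- these are not actual Links fields.
use strict;
use warnings;
use List::Util qw(shuffle);

sub random_keywords {
    my ($pool, $max_chars) = @_;    # arrayref of words, character budget
    my @picked;
    my $len = 0;
    for my $word (shuffle @$pool) {
        my $cost = length($word) + (@picked ? 2 : 0);   # account for ", "
        next if $len + $cost > $max_chars;              # skip words that won't fit
        push @picked, $word;
        $len += $cost;
    }
    return join ', ', @picked;
}

print random_keywords([qw(postcards vintage collectibles greeting cards)], 30), "\n";
```

Each rebuild would then emit a different subset of the pool, never longer than the configured limit.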

Any thoughts?

Dan
Re: Possible nice feature
And it increases the number of garbage hits you get when you try to do a search on any of the major engines.

<flame off>



------------------
POSTCARDS.COM -- Everything Postcards on the Internet www.postcards.com
LinkSQL FAQ: www.postcards.com/FAQ/LinkSQL/

Re: Possible nice feature
How is this a 'nice' feature for Links? It's just a strategy someone thought of to get higher placement in search engines and more hits, isn't it?

------------------
Jerry Su
Links SQL User
------------------
Re: Possible nice feature
Let me try explaining that again... I am very much against keyword stuffing with irrelevant terms. What I meant is using keywords related to your site, but randomizing the selection of them. There's nothing wrong with that, and since every search engine ranks differently, it might take a little pressure off people to come up with the "perfect" meta tags.

Dan
Re: Possible nice feature
Scoring high in search engines is not only about meta tags. In fact, I'd say that meta tags count for maybe 10%-15%, and randomizing keywords wouldn't accomplish anything at all.
Re: Possible nice feature
How do you figure that it wouldn't accomplish anything? If we take your numbers of 10-15%, then a different set of keywords will have at worst a 1 in 10 chance of affecting things (loosely playing with the statistics Wink). Don't forget, this can be taken over several hundred Links pages, so some of the pages are bound to show up differently at some point.

I realize there is potential for abuse, but not really any more so than a spammer periodically changing static keywords.

Dan
Re: Possible nice feature
Dan,

Periodically changing keywords related to a site's content is not spamming. Spamming, basically, is submitting URLs multiple times, repeating keywords an excessive number of times, or using keywords that lead to unrelated content.

I know that randomizing keywords won't get you in top positions because I've worked with search engines and have been positioning clients' sites and my sites for a while.

You should consider about 7-9 factors for each engine, and the keywords are among them... that's why I say they count for about 10-15% Smile

But anyway, this is going out of the thread.

Emilio

[This message has been edited by ekaram (edited January 27, 2000).]
Re: Possible nice feature
  
Quote:
I know that randomizing keywords won't get you in top positions...

Quote:
You should consider about 7-9 factors for each engine, and among them, are the keywords... that's why I say they count like 10-15%

Ok, so if I'm to take your two points together, I would conclude that removing keywords from a page would have no effect on the listing (after all, the selection of the keywords doesn't matter). Thus, 85-90% is really 100%. That doesn't quite make sense.

Just because there are 7-9 factors, as you say, doesn't mean that each is worth anywhere near the same amount, especially considering the differences between search engine algorithms. It bears repeating that I am talking about randomizing the selection of the keywords, not just the order. However, many "experts" would say that even the order of the keywords is important.

But you're right, this is getting pretty far off topic. Since interest has been at best ambivalent, and at worst strongly against (not quite sure why the instant rush to judgement of my motives), I will drop it.

Dan

[This message has been edited by Dan Kaplan (edited January 27, 2000).]
Re: Possible nice feature
Dan,

I know what you are saying, you are looking at your site as a single entity, so that any hits on a page in your site will bring you traffic.

What others are saying, and I am saying, is that search engines currently index PAGES, not sites. So, if you put a keyword that has nothing to do with a page on 20 pages, you have increased the noise and decreased the value of any search.

You should carefully pick your keywords to reflect your pages, and use the meta tags for the category builds, etc.

This is the best way to increase your "hits".

The more diverse pages you get, the more "hits" you'll capture on a wider range of searches.

If someone is used to seeing your pages show up having absolutely nothing to do with their search, eventually they'll just ignore your site even for proper 'hits'.

I do that... there are sites that have the same keywords on every page, and I just ignore them completely, since all they do is waste my time.

I hope that clears it up a bit....

No one is jumping on you, per se, but rather on the idea of degrading the signal:noise ratio to the point of uselessness.

Re: Possible nice feature
Would it be hypocritical of me to respond to a response when I said I was done? Wink

Quote:
I know what you are saying, you are looking at your site as a single entity, so that any hits on a page in your site will bring you traffic.
Close, but that's not quite what I'm saying. I only consider the site a single entity if it all deals with one subject, but that's not really the situation I'm considering.

What I was trying to describe is selecting from individual groups of meta keywords relevant to each of the categories. For example, categories 1 and 2 might each have 400 characters' worth of keywords relevant to their content, but only 250 could be used at any one time from the standpoint of the search engine. Since there is no guarantee that the same selection and order of keywords will always rank highest, why not randomize the process and, if nothing else, average out your odds of doing well in any one listing? How often do you hear the complaint, "I'm listed on the first page on SEs A, B, and C, but nowhere in the top 2,000 on D?!" Smile

Additionally, with Links listings, we are dealing with an essentially ever-changing page. Assuming you have a max # of links per page set, the content will change over time. So why not have an allowance for the same to happen with the keywords without having to do it manually?

Does that make a little more sense?

Again, let me say that I am as strongly opposed to garbage keyword listings and similar spam as anyone. I generally use indexes such as Yahoo instead of true search engines for this very reason.

Dan

[This message has been edited by Dan Kaplan (edited January 28, 2000).]
Re: Possible nice feature
Ok, so what you are saying is to randomize the string of keywords for each page the list appears on.

All you want to do is shift the order around inside the meta tag to put ones at the end at the beginning, etc.

I see what you are saying now, exactly, but again, that goes to the question of whether it's worth indexing every page of a CGI-generated site. Every time you rebuild, the information is going to change; links will move around from page to page, etc.

You actually have a better chance of getting properly indexed by using the proper keywords, the right wording on your front pages and in your meta description, and a robots.txt file to direct the search engines to the pages you want indexed.

Most engines stop at a certain depth, or number of pages. They won't get to all your pages. Some won't index pages that have refresh, redirect, or other non-static html on them.

Randomizing the keywords may not hurt, but many of the engines score by the order in which words are found, so randomizing the keywords rather than properly picking the order will actually get you LOWER scores on "hits" from people looking for "word1 word2" when your randomization has turned it into "word2 word1".

It shouldn't be hard to randomize the meta tags: just parse on the white space, stuff the words into a hash using key=>value pairs of random(64000) => word, then read the hash back sorted on the keys. Every time you do that, your words will come out in a different order.

You can find code fragments for all of this in the existing LinkSQL code, and probably hack it together in a short time.
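As a rough sketch of that shuffle (assuming the keywords are already in hand as one whitespace-separated string), it could look like the following. I use [key, word] pairs rather than a flat hash so that two words drawing the same random number can't clobber each other:

```perl
#!/usr/bin/perl
# Sketch of the shuffle described above: tag each word with a random
# key, sort on the keys, and read the words back out -- a new order on
# every call. Pairs are used instead of a plain hash so that two words
# drawing the same random number can't clobber each other.
use strict;
use warnings;

sub randomize_meta {
    my ($keywords) = @_;
    return join ' ',
        map  { $_->[1] }                      # keep just the word
        sort { $a->[0] <=> $b->[0] }          # sort on the random key
        map  { [ rand(64000), $_ ] }          # pair: random(64000) => word
        split /\s+/, $keywords;
}

print randomize_meta('postcards vintage collectibles greeting cards'), "\n";
```

Every run prints the same five words in a (usually) different order.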

Re: Possible nice feature
 
Quote:
Every time you rebuild, the information is going to change, links will move around, from page to page, etc.
Precisely my reason for thinking random would be as good as anything. Smile I do agree, though, that it may or may not help. I currently have spiders indexing quite a few of my Links pages (I don't go too deep with categories), and lots of 404's for categories that have been moved. I haven't yet decided if I want to set robots.txt to keep the spiders out, assuming they even bother to follow the guidelines...

At any rate, thanks for the suggestions as to how I might go about implementing it. It's currently a bit over my head -- hey, I'm not through chapter 6 of the MySQL docs yet -- but my Links SQL is "in the mail," so I'll be able to start playing with stuff soon.

Dan
Re: Possible nice feature
Dan,

I never meant to judge or criticize your motives; all I wanted to make clear is that it's better to optimize your keywords than to randomize them.

If you still want to randomize them, you could use a CGI script to output a random list of keywords, and include it in your <meta keywords> tag with an SSI call.
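A minimal version of such a CGI might look like the sketch below; the keyword pool is hard-coded purely for illustration (in practice it would come out of the database):

```perl
#!/usr/bin/perl
# Hypothetical keywords.cgi along these lines: print a freshly shuffled
# keyword list so an SSI include can drop it into the page's meta tag.
# The pool is hard-coded here for illustration; in practice it would
# come out of the database.
use strict;
use warnings;
use List::Util qw(shuffle);

sub keyword_line {
    my @pool = qw(postcards vintage collectibles greeting cards);
    return join ', ', shuffle @pool;
}

print "Content-type: text/html\n\n";
print keyword_line(), "\n";
```

The page template would then pull it in with an SSI call along the lines of <meta name="keywords" content="<!--#include virtual="/cgi-bin/keywords.cgi"-->"> (the script path being, again, an assumption).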

Emilio
Re: Possible nice feature
Actually, most of this isn't SQL; it's all Perl. You use DBSQL to get the meta-keyword tags, then parse the result with "split", hash it with the random numbers, and sort it...

It's actually probably only 8 to 10 lines of code (and could probably be done in fewer) if you sacrifice readability.

Maybe someone will do it up as a quick project, I have too much on my burners to commit to it... I owe a lot already.

Re: Possible nice feature
Ok, I think we're all one big happy family again. Smile

That doesn't sound too tough to implement, although quite honestly, I'm not sure it's even something I would put to use. I'm quite happy setting up my own meta tags; I was thinking more in terms of the people who are less comfortable doing so.

Cheers,
Dan