Gossamer Forum

Auto....

Quote Reply
Auto....
Does Links 2.0 have the ability to search the website it's installed on and add all the URL links it finds to its database?

I sure like this product, but I have thousands of links on my site and it would be a huge task to manually enter them into Links 2.0.

Thanks,
Rpgman

Quote Reply
Re: Auto.... In reply to
Hi,

Here is the software to check your links and export them to the Links 2.0 format:
http://www.innerprise.net/us4.htm



Ciao,
nicky
Here you can visit the German Gossamer Links forum:
http://forum.nicky.net
Quote Reply
Re: Auto.... In reply to
Could you, or anyone who knows, tell me how to use URL Spider Pro v1.9 to export links from my web pages into Links 2.0?

Thanks,
RPGMAN

Quote Reply
Re: Auto.... In reply to
1/ Run the software
2/ Direct it to your first web page
Example: http://mywebsite.com/index.html
3/ Set Filter-2 to only import from mywebsite.com
4/ Set follow links to YES
5/ Set spider depth to unlimited
6/ All your pages will be scanned.
7/ It will import the URL, Title and Keywords.
8/ Use the export function to export to the Links 2.0 format
That's it.
Regards,
Sanuk

Quote Reply
Re: Auto.... In reply to
Set Filter 2? I'm using version 1.9. When I click on Filter 2, this is what I get: "Optional: Web sites at these domains will not be saved. Case insensitive."

I'm a little confused about how that filter works.

Question: how is this filter supposed to read all the URLs on my site only, and not any other sites?

Basically, what I want URL Spider Pro v1.9 to do is extract the URLs on my site without following them. I just want it to find all the links on my site.



Quote Reply
Re: Auto.... In reply to
I am using v1.8b, which has 3 filters.
The URL filter may have been moved to a different location in v1.9, I don't know.
Normally the help file is very good.
I spent 2 hours examining the help file for v1.8b.
I advise you to do the same for v1.9.
I have been to the Innerprise site
and have read the update notes for 1.9.
It seems there are more filters, so it should be easier to filter only your own pages.
Did you go and read that version update?
Did you study their facts on v1.9?
The help file included in every version is normally very good, and as a registered user you can use their online forum for help - people there are friendly and help comes fast.
regards,
Sanuk


Quote Reply
Re: Auto.... In reply to
v1.9 also only has 3 filters.

I'm just confused about how the filters work.

I just want it to find the links on my site and not follow them.

RPGMAN

Quote Reply
Re: Auto.... In reply to
It has to follow the links to find them.
As I explained, direct it to your index.htm (.html).
It will save this page.
Then it follows all the links there to your other pages, scanning and saving them as well.
Then again, on those pages it follows the links to more of your pages, scanning and downloading until your complete site has been scanned.
Just be sure to set one of the filters to only follow links on pages from yoursite.com; otherwise, if some of your pages contain links to someother.com, Spider Pro will follow those too without stopping.
But normally the info in the help file is very good and detailed - I don't understand your problem?

Quote Reply
Re: Auto.... In reply to
This is from the help file of v1.8:

URL Spider Pro will check for the keywords and phrases added to this list box within the web page it's currently indexing. If URL Spider Pro can't find any or all (depending on what you've specified) of those keywords and phrases the web page will not be saved.

Why would you want to use this?
Simple, if you have a web site dedicated to automobiles and only want to index other web sites having to do with automobiles then you just enter in a few keywords or phrases like: automobile, classic car, etc. You will notice we left off an 's' on each keyword and phrase, by doing this URL Spider Pro will find both: automobile and automobiles.

Tip: You can now specify if URL Spider Pro needs to find all or at least one of the keywords. That's done using the check box labeled: "Web pages must contain ALL words/phrases".

Note: 100 keyword/phrase limitation.

So in this filter you add: yoursite.com
and only pages from yoursite.com will be scanned.
Simple... Read your help file.
Read the facts.
Go to the Spider Pro forum.
Everything is there.
Regards,
Sanuk

Quote Reply
Re: Auto.... In reply to
OK, here's my problem: I have hundreds of links on my site that link to other sites. I only want URL Spider Pro v1.9 to include those links, not all the other links found on those other sites. For example, I have a link to ABC.com, and I only want that link to be included, not all of ABC.com's links as well.

RPGMAN

Quote Reply
Re: Auto.... In reply to
You should have a copy of your complete site on your hard disk, haven't you?
Spider all of these files with
Web Address Extractor from GBCS software
or any other URL extractor.
Convert the results to a txt or csv file.
Import this file with the Data Import function of Spider Pro into Spider Pro's que.db file.
Set follow links to NO.
Spider Pro will scan and index the complete Que file.
Web Address Extractor can be downloaded as a 30-day demo,
more than enough for your need.
I don't know the URL of GBCS.
Do an AltaVista search for GBCS software.
Regards and goodnight.
It is now 11.58 pm in Thailand.
I go sleep. Good luck.
(But this is also explained in the Spider Pro forum. Did you ever go look?)
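
If you cannot get hold of Web Address Extractor, a small Perl script along these lines should do much the same job: walk a local copy of your site, pull out every http link it finds, and write them one per line, ready for the Que import described further down the thread. This is only a rough sketch; the output name "que.txt" and the directory argument are just examples.

#!/usr/bin/perl
# url2que.pl - walk a local copy of a site, grab every http link,
# and write them one per line for the Spider Pro Que import.
# (The output file name "que.txt" is only an example.)
use strict;
use warnings;
use File::Find;

my $site_dir = shift || '.';   # path to the local copy of your site
my %seen;                      # skip duplicate URLs

open my $out, '>', 'que.txt' or die "Cannot write que.txt: $!";

find(sub {
    return unless /\.html?$/i;             # only look at .htm / .html files
    open my $fh, '<', $_ or return;
    my $page = do { local $/; <$fh> };     # slurp the whole page
    close $fh;
    # grab anything that looks like href="http://..."
    while ($page =~ /href\s*=\s*["']?(http[^"'\s>]+)/gi) {
        print {$out} "$1\n" unless $seen{$1}++;
    }
}, $site_dir);

close $out;
print "Done - ", scalar(keys %seen), " URLs written to que.txt\n";

Run it as: perl url2que.pl C:\mysite
and then point Spider Pro's Import Data at the resulting que.txt.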

Quote Reply
Re: Auto.... In reply to
Yes, I have my site backed up on my hard drive.

Here's the problem:

I did what you suggested and took one of my web pages that has links on it and extracted them using JimTool Extractor.

It extracted what I wanted (I had to prune some links), but now I can't import them into URL Spider Pro v1.9. It's a text file, but it just won't import.

Does anyone out there know of a way to take the links off one's own site and automatically import them into Links 2.0? I would hate to have to key each link into Links 2.0.

RPGMAN

Quote Reply
Re: Auto.... In reply to
FROM THE HELP FILE
Import Data
This will allow you to import the URL Spider Pro text database, the Que file (a text file of URLs, one on each line), and the Track file (a text file of URLs, one on each line).
It works - I use it.
Regards,
Sanuk

Quote Reply
Re: Auto.... In reply to
Important:
Before starting a new scan,
clear your Track file.
All URLs residing in the Track file
are assumed already visited and are skipped.
Regards,
Sanuk

Quote Reply
Re: Auto.... In reply to
When you use the import function, does the file you're importing have to have a certain name? Like I said, I'm trying to import a file that was created using JimTool Extractor.

I choose the import function, it asks me for the file name, I browse the C drive and find the directory it's in, choose the file to import, and click Run.

It doesn't seem to do anything.

Is there a way I can check to see if it imported? A file name or something.

Thanks,
RPGMAN

Quote Reply
Re: Auto.... In reply to
Your file has to be a .txt file,
with 1 URL on every line.
Example, named "siam.txt":

http://siam.to/gallery
http://thcity.com/guidingstar
http://www.siamcool.com
http://www.xaap.com/mainth.asp
http://i.am/nosecandy
http://i.am/photon
http://i.am/startservice

Then, as you did, just point URL Spider Pro's Import Data at this file on your computer.
Of course, import into the Que and not into the Track file or database.
URL Spider Pro will start importing, and you will see an import status move from 0% to 100% until your file is imported into the Que.
As for a check:
at the bottom of your Spider screen you will see "Que 7",
because in the above example we imported 7 URLs into the Que.

Importing 3000 URLs takes about 15 seconds.
Now your Spider is set to visit and scan all the URLs in your Que file.
Remember to first empty the Track file and database before starting.
Good luck
Regards,
Sanuk

Quote Reply
Re: Auto.... In reply to
I did what you said, and when I choose the .txt file to import to the to-do list, it just sits there saying "importing que". It seems to be doing nothing; no counts in the done box. What could be going on? I'm using the version of URL Spider Pro v1.9 that you download for free. Maybe that version has import disabled?

I really appreciate you trying to help me. It just seems that it's so difficult to import links into Links 2.0. There should be a simple way.


RPGMAN

Quote Reply
Re: Auto.... In reply to
Yes, come to the point!
You are not registered and are using a trial demo.
I talked about being registered and using the help & forum.
You did not react.
Many functions are disabled in the demo,
AS IS ALSO EXPLAINED IN YOUR HELP FILE!
To do what you want to do, you need to register.
And buy the software, of course.
Regards,
Sanuk

Quote Reply
Re: Auto.... In reply to
I have a few static pages on my site with several sets of links with descriptions on each page. Is there a way to import these individual links (not pages on my site) into a Links database?

Thanks

Quote Reply
Re: Auto.... In reply to
Sorry to butt in... It would be very easy to create a one-time Perl script to go through defined files, extract the URLs and dump them out to the appropriate URL field of a Links 2.0 database record. I suspect a little more would be required, but I don't see any real problems.

I'm on holiday in California at the moment, and when I return home I know I will be a little busy; however, if no one else resolves this problem I will create a script to do the job (as I see it) sometime during August. Please don't take me to task on target dates. I promise nothing, but I'll do my best to help.

George E.D. Burville
Ed: www.appbe.com
Quote Reply
Re: Auto.... In reply to
That would be great. The sooner the better. I just thought that there might be something out there to make it easier to do what I want to do.

Does anyone know how to use that bulkload.cgi in the resource directory? What should the .txt file look like? An example would be nice. I could put all my links in there and bulk-load them up into Links 2.0. The only problem would be getting each record in the text file into the format required by Links 2.0.

Does anyone know, or could anyone give me, an example of what a record would look like in the text file required by the bulkload.cgi program in the resource directory?

Thanx,
RPGMAN

Quote Reply
Re: Auto.... In reply to
I had a quick look at bulkload.cgi - it appears to want the records to be in links2 database format with the 'ID' field removed.
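
For illustration only, here is roughly what such a record might look like, assuming a stock pipe-delimited links.db with the default field order (Title, URL, Date, Category, Description, Name, Email, Hits, isNew, isPopular, Rating, Votes, ReceiveMail) once the ID is dropped. Check a line of your own links.db and your links.def before loading, since custom setups and the date format may differ:

My Example Site|http://www.example.com/|01-Jan-2001|New Links|A short description of the site|RPGMAN|rpgman@email.msn.com|0|No|No|0|0|No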

This morning I decided to have a quick look at what was needed to grab URLs and put them into a links2 database. In consequence I started creating the utility. I've finished the raw code, which dumps out the URL and the description, if any, to a links2 database. It would not be a problem to remove the 'ID' field.

What I need to know is what defaults you require in all other fields of the new records.

George E.D. Burville
Ed: www.appbe.com
Quote Reply
Re: Auto.... In reply to
Great, would the script work off a text file, or just grab the URLs from the HTML pages on my local drive?

If from a text file, I could fill in the blanks, or you could just use the name for the description and then I could pretty them up once I get them into the Links 2.0 database.

If you want some ideas for what to fill in, send me an e-mail on exactly what you need.

Send it to rpgman@email.msn.com

Thanks,
RPGMAN

Quote Reply
Re: Auto.... In reply to
The script will need a text file that has a list of each file to be scanned. Each filename on a new line. Filenames can contain directory paths. Any URL starting with http will be extracted together with its description. I'll make the output into a file suitable for bulkload.cgi.
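
Something along these lines is the general idea - this is only a rough sketch, not the finished script, and the field order, date and defaults in the output line are placeholders you would change to match your own links.def:

#!/usr/bin/perl
# grab_links.pl - sketch: read a list of local HTML files, pull out every
# http link plus its link text, and print pipe-delimited records of the
# kind bulkload.cgi expects (ID field omitted). The defaults below are
# placeholders only - adjust them to your own links.def.
use strict;
use warnings;

my $list_file = shift or die "Usage: perl grab_links.pl filelist.txt > import.txt\n";

open my $list, '<', $list_file or die "Cannot open $list_file: $!";
while (my $file = <$list>) {
    chomp $file;
    next unless length $file;
    my $fh;
    if (!open $fh, '<', $file) {
        warn "Skipping $file: $!\n";
        next;
    }
    my $page = do { local $/; <$fh> };   # slurp the whole page
    close $fh;

    # <a href="http://...">link text</a>  ->  URL and description
    while ($page =~ /<a[^>]+href\s*=\s*["']?(http[^"'\s>]+)[^>]*>(.*?)<\/a>/gis) {
        my ($url, $desc) = ($1, $2);
        $desc =~ s/<[^>]+>//g;           # strip tags inside the link text
        $desc =~ s/\s+/ /g;
        $desc =~ s/^\s+|\s+$//g;
        $desc =~ s/\|/ /g;               # '|' is the field separator, keep it out of the data
        # Title|URL|Date|Category|Description|Name|Email|Hits|isNew|isPopular|Rating|Votes|ReceiveMail
        print join('|', ($desc || $url), $url, '01-Jan-2001', 'New Links',
                   $desc, 'Admin', 'admin@yoursite.com', 0, 'No', 'No', 0, 0, 'No'), "\n";
    }
}
close $list;

Put the filenames to scan in filelist.txt (one per line), run
perl grab_links.pl filelist.txt > import.txt
and feed the resulting import.txt to bulkload.cgi.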

It's all but done. What is needed now is the info on how to install and run it, etc. It's enjoyable creating code but a bit of a chore creating the 'How To'. When it's ready I'll put it on a page on my site and give you the URL.

I'm not sure how you test your code so I'm assuming you will do everything online. Here in California I test everything using a program called Perl Builder. Hence I can scan files offline. Back home I have more facilities and have therefore taken easy ways out. I've never looked into running Perl on a standalone without specific tools.

Once you've got a test run completed we can consider adjustments. My problem is that I'm running out of time; I won't be able to do anything from Sunday evening through to about Friday of next week.

George E.D. Burville
Ed: www.appbe.com
Quote Reply
Re: Auto.... In reply to
That will give me a week to test it.

That's great... looking forward to hearing from you.

Please do post the URL.

Can't wait to test it and finally get those links from my site into Links 2.0.

Thanks for everything,

RPGMAN
