Gossamer Forum

New Import problems! 1.1x to 2.0.5

New Import problems! 1.1x to 2.0.5
Hello Alex!

I tried to import a couple of categories and a couple of links from 1.1x into a default 2.0.5 installation, and I ran into a problem.

I can import the categories and links, no problem. (There is also a problem with the subscribe feature, but I will address that later.)

However, I cannot import a second time. It works only once, and it has to go through the entire import from scratch each time. That is not what I want. I would like the categories frozen while I keep on importing links. Those are two separate issues.

I have the categories, with their respective IDs, imported in the new format exactly ONCE. Then there are 400 links in the database: I import them and empty the table.

Then I refill the table and import another 400 links, batch by batch.

How can I do this? Without it, I have no way to import 60,000+ links.

--------------------IMPORTANT-------------------------------
This also means the import needs to check the categories in New_Table_Category first, and then import the links using the CategoryID from the new table (not the old one).

Also, it should be able to import links even if Old_Table_Category is empty. I could not import links when the old category table was empty.

-------------------------------------------------------------
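The lookup order described above can be sketched roughly as follows. This is only an illustration using SQLite and made-up table/column names (`New_Category`, `New_Links`), not the real Links SQL schema or import code:

```python
import sqlite3

# Hypothetical, simplified schema for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE New_Category (CategoryID INTEGER PRIMARY KEY, Name TEXT UNIQUE);
    CREATE TABLE New_Links    (LinkID INTEGER PRIMARY KEY, Title TEXT, CategoryID INTEGER);
""")
con.execute("INSERT INTO New_Category (CategoryID, Name) VALUES (1, 'Arts')")

def import_link(title, category_name):
    """Insert a link only if its category already exists in the NEW table,
    using the new table's CategoryID (never the old table's)."""
    row = con.execute("SELECT CategoryID FROM New_Category WHERE Name = ?",
                      (category_name,)).fetchone()
    if row is None:
        return False  # skip the link instead of inserting a bogus category
    con.execute("INSERT INTO New_Links (Title, CategoryID) VALUES (?, ?)",
                (title, row[0]))
    return True

import_link("Louvre", "Arts")      # imported, with the new table's CategoryID
import_link("Orphan", "Missing")   # skipped: category not in the new table
```

The key design point is that the link rows never carry an old-table CategoryID; every insert resolves the ID against the new table at import time.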

Any ideas for tweaking the code? I would be thankful.
Re: [rajani] New Import problems! 1.1x to 2.0.5 In reply to
I have similar problems with importing, but from Links 2.0 to Links SQL 2.04. Because the database is about 100 MB or more, I have to split it up and import it step by step. That does not work, because every time I import, the same categories are added again and again. We currently have 71 categories; after I import the second part of the old database I have 142 categories, after the third time 223, etc. With the "Extra Data Integrity" option enabled, no categories are added, but no links either. It always tells me "Invalid category, Link skipped", which is not correct.

Deleting the duplicate categories also deletes the links that were added, so what can I do?
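One way out of this bind, sketched below, is to repoint the links at a single surviving copy of each category before deleting the duplicates, so no links are lost. This is a SQLite sketch with an invented, simplified schema (`Category`, `Links`), not the actual Links SQL tables:

```python
import sqlite3

# Hypothetical, simplified schema for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Category (CategoryID INTEGER PRIMARY KEY, Name TEXT);
    CREATE TABLE Links    (LinkID INTEGER PRIMARY KEY, CategoryID INTEGER);
""")
# Simulate a double import: 'Arts' exists twice, links point at both copies.
con.executemany("INSERT INTO Category VALUES (?, ?)",
                [(1, "Arts"), (2, "Science"), (3, "Arts")])
con.executemany("INSERT INTO Links VALUES (?, ?)", [(10, 1), (11, 3), (12, 2)])

def dedupe_categories():
    """Repoint links at the lowest CategoryID per name, then drop the copies."""
    keepers = con.execute(
        "SELECT Name, MIN(CategoryID) FROM Category GROUP BY Name").fetchall()
    for name, keep in keepers:
        con.execute("""UPDATE Links SET CategoryID = ?
                       WHERE CategoryID IN
                         (SELECT CategoryID FROM Category WHERE Name = ?)""",
                    (keep, name))
        con.execute("DELETE FROM Category WHERE Name = ? AND CategoryID <> ?",
                    (name, keep))
    con.commit()

dedupe_categories()
```

After this runs, each category name exists once and every link points at the surviving copy, so the duplicates can be deleted safely.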
Re: [harry346] New Import problems! 1.1x to 2.0.5 In reply to
Not only that, but the JavaScript throws a lot of errors; it gets even more confused than the admin area.

I have a database of 40+ MB, and it is very difficult to get it going.

I need this because I always have to work offline in Access.

It would also be good if the export area let one define which links to export, e.g. from ID=1000 to ID=2000, so the export could be done step by step. It could even be by date added!
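The windowed export idea above is a small query. A SQLite sketch with made-up names (`Links`, `ID`, `Add_Date`), purely to illustrate pulling one ID window per run so each batch stays small:

```python
import sqlite3

# Hypothetical table for illustration; the real schema differs.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Links (ID INTEGER PRIMARY KEY, Title TEXT, Add_Date TEXT)")
con.executemany("INSERT INTO Links VALUES (?, ?, ?)",
                [(i, "link %d" % i, "2001-12-06") for i in range(1, 3001)])

def export_range(lo, hi):
    """Return one batch of rows, e.g. ID 1000..2000, ready to write out."""
    return con.execute(
        "SELECT ID, Title FROM Links WHERE ID BETWEEN ? AND ? ORDER BY ID",
        (lo, hi)).fetchall()

batch = export_range(1000, 2000)   # inclusive window: IDs 1000 through 2000
```

The same pattern works by date: swap the `BETWEEN` clause to filter on `Add_Date` instead of `ID`.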

Last edited by rajani: Dec 6, 2001, 2:14 PM
Re: [harry346] New Import problems! 1.1x to 2.0.5 In reply to
Hi,

Can you run your import from shell? It should be able to work without splitting it up then.

Cheers,

Alex
--
Gossamer Threads Inc.
Re: [rajani] New Import problems! 1.1x to 2.0.5 In reply to
Hi,

Is your script getting killed if you run it from shell?

Cheers,

Alex
--
Gossamer Threads Inc.
Re: [Alex] New Import problems! 1.1x to 2.0.5 In reply to
Hello Alex!

Since the script works from the web, it had not occurred to me to use telnet. The script does not get killed in the web browser during an import of 400 links.

If you mean whether the script gets killed during an import of 60,000 links, then yes, it gets killed. It can import at most about 450 links and then it dies, regardless of telnet or the web. Running it from telnet would still start a process under nph-**.cgi, and the process gets killed irrespective of my user ID. I run it under my own user ID, so there is no difference between shell, telnet, or the web browser. This has to do with the CPU restriction on the entire server, since I am on a shared server with pair.com!

It does seem to be true that there is no difference whether I run it via the web or telnet. I have tried running the various scripts and found that it is the same. The provider's administrator confirmed in plain text that the CPU restriction applies to all processes, web or telnet, and that it is 30 seconds.

Running through telnet still helps with things like installing, directories, permissions, etc. But to lift the CPU restriction for me specially, the provider would have to create a special group and remove the restriction, risking major trouble with the other hundred users on the same server.

This CPU restriction has been a problem throughout the scripts. A general routine inserted everywhere, for everything, would be a fundamental change to the basics of all the scripts. For example:

The subroutine could work in cycles of a defined size, with a checkpoint on every batch of queries and inserts; for example, the routines hand back control after every thousand queries and thousand inserts.

This would apply to Verify, etc., wherever a lot of CPU time is required. The Links SQL module would then run in such a loop. It can still get killed, but by then it has done a lot of work, and it should be able to continue from where it left off, like resuming a download over FTP.

Or, if that does not sound feasible, I suggest another approach:

If I could work with predefined values, i.e. enter FROM=150 and TO=500, then the module would take those values and do everything it is supposed to do within that range. That way I would have better control over the CPU time used.

In Verify, one could filter by date. The same kind of control is needed in import and export as well.

Or, if that does not sound feasible either, I suggest further:

If the links could be deleted from the old table after they are imported, that would also help. The delete would occur only after the row has been copied into the new table. Then the import could be done step by step, within the CPU restrictions. However, it would not solve the exports.
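The FROM/TO idea, combined with deleting old rows only after a successful copy, can be sketched like this. Again a SQLite illustration with invented tables (`Old_Links`, `New_Links`), not the real import code; each window would be one short run that fits inside a 30-second CPU limit:

```python
import sqlite3

# Hypothetical schema for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Old_Links (ID INTEGER PRIMARY KEY, Title TEXT);
    CREATE TABLE New_Links (ID INTEGER PRIMARY KEY, Title TEXT);
""")
con.executemany("INSERT INTO Old_Links VALUES (?, ?)",
                [(i, "link %d" % i) for i in range(1, 1001)])

def import_window(lo, hi):
    """Copy Old_Links rows with lo <= ID <= hi into New_Links, then remove
    them from the old table, all in one transaction (all-or-nothing)."""
    with con:  # commits on success, rolls back if anything fails
        con.execute("""INSERT INTO New_Links (ID, Title)
                       SELECT ID, Title FROM Old_Links
                       WHERE ID BETWEEN ? AND ?""", (lo, hi))
        con.execute("DELETE FROM Old_Links WHERE ID BETWEEN ? AND ?", (lo, hi))

import_window(1, 400)    # first run
import_window(401, 800)  # second run, and so on
```

Because the copy and the delete share one transaction, a run that gets killed mid-window leaves the old table untouched and can simply be repeated.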

Further, if a UNIQUE index is involved somewhere and a duplicate turns up, the import breaks, forcing me to repeat the entire process.

Further, there is no need to enter the categories again and again from the old category table. If I have created new categories, I am not able to use them, as the import needs to delete them first.

Further, if a category already exists in the new table, the import does not check for it; it simply inserts it again. I had the same category about six times before I realized it had been imported six times over. So I changed the column to UNIQUE; then the import stops on the categories and cannot import the links at all! That is the problem. It needs to stop force-checking and force-inserting categories from the old import table.
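What both failure modes above call for is an idempotent category insert: check first, reuse the existing ID, insert only once. A SQLite sketch with a made-up `Category` table, not the real Links SQL code:

```python
import sqlite3

# Hypothetical table for illustration only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Category (CategoryID INTEGER PRIMARY KEY, Name TEXT)")

def category_id(name):
    """Return the existing CategoryID for name, inserting the name only if
    it is not already there. Re-running an import neither duplicates the
    category nor trips a UNIQUE constraint."""
    row = con.execute("SELECT CategoryID FROM Category WHERE Name = ?",
                      (name,)).fetchone()
    if row:
        return row[0]
    cur = con.execute("INSERT INTO Category (Name) VALUES (?)", (name,))
    return cur.lastrowid

a = category_id("Arts")
b = category_id("Arts")   # second import pass: same ID back, no duplicate row
```

With this pattern the category step becomes safe to repeat, and the link import can simply call `category_id()` for each row instead of blindly inserting.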


Re: [Alex] New Import problems! 1.1x to 2.0.5 In reply to
Hello,

Maybe that would be possible, but it would not solve the script's problem. I run the old Links as a parallel system (for automated submissions only) and want to import the daily validate.db from it into the Links SQL database as well, which is currently impossible. It would be better not only to pre-check existing category names, but also to really remove the duplicates after a second import, or to make the "Extra Data Integrity" option work 100%.

Regards

Harald
Re: [harry346] New Import problems! 1.1x to 2.0.5 In reply to
Hello Harald!

I think both of us have the same problem, exactly the same in nature: being able to import links into the table on a regular basis.

I also receive automatic submissions from Microsoft bCentral and have gotten them into the database through FormMail!

Check out my website www.AtoZ.com in Microsoft's bCentral!!!
Re: [rajani] New Import problems! 1.1x to 2.0.5 In reply to
Hello!

Someone wrote me an email asking where my site is listed in Microsoft's Submit It.

It is listed under General Directories in the advanced submission area. Here is the URL from Microsoft:

http://submitit.bcentral.com/system/ucc/SIOSpec2.cfm

Also, from the beginning:
http://submitit.bcentral.com/...UCC/siostartpage.cfm

One has to log in through their Passport .NET nightmare.

They wrote me earlier this year that they like my concept, and that if I have many links they will upgrade my site to their high-submission sites.

Last edited by rajani: Dec 7, 2001, 10:56 AM
Re: [rajani] New Import problems! 1.1x to 2.0.5 In reply to
Hi Rajani,

It's strange that nobody has had this problem before; I searched the forum for some hours but could not find any mention of it. Let's hope Gossamer can fix it quickly.

Regards

Harald
netcollector@gmx.net

Re: [harry346] New Import problems! 1.1x to 2.0.5 In reply to
Hello there!

I too hope that Alex offers a small tweak so that we can work. I am optimistic.

I believe this is quite important, well beyond just you and me having an import problem.
Re: [rajani] New Import problems! 1.1x to 2.0.5 In reply to
Any help, Alex?

By the way, why did you remove (or restrict) the photos of the staff from the web? They were good!
Re: [Alex] New Import problems! 1.1x to 2.0.5 In reply to
Hello Alex!

I have been waiting a week for an answer.

The import function worked very well with text files in version 1.13x!

In the new version the import works perfectly overall, but it has certainly been restricted.

I am not willing to convert my database into a binary tarball that I have no way to work with.

It is not just my specific problem that I cannot import; I also have not the slightest idea how to turn my database into a database Links understands. I need to work with the data, process it, and upload the processed data as well.

If the database is on a remote server, then just uploading the backup file would be 60 megabytes for a links database. I wonder how much of a problem this must be for all the other users.

What I would be very thankful for is if you would simply tell me whether there is no way to import text files into the Links table, and whether you are not planning to bring the feature back. The feature was there, many users including myself worked with it, and now it has suddenly disappeared.
Re: [rajani] New Import problems! 1.1x to 2.0.5 In reply to
This issue is very complex.

It's similar to trying to "update" from the ODP RDF, rather than reimport it.

It's not a "feature" in links, or a feature that 95%+ of users of Links would use on a regular basis - if at all.

Links was not designed to be worked off-line, then updated. It was designed to run, full time, and update in "real time" through the program itself.

It sounds like you are trying to run a large server/directory on an inadequate account, i.e. your CPU time is being restricted to the point where you cannot manage your directory.

This isn't a problem with Links, or the web, or the web interface, it's a problem with your account. You need a better hosting company, better account, or a dedicated machine.

Honestly, I don't see why Alex/GT has to take the time away from adding in new features, and upgrades, to deal with something that is not relevant to the program, and can be fixed simply by upgrading your account.

You might be able to get your ISP to run the import for you, without it timing out, but the bottom line seems to be that you need an account that doesn't restrict your CPU usage.

Dedicated servers are coming down in price; we were able to get two servers with 10x the RAM for half the cost of our single server 2.5 years ago (same company, Sun SPARC). It is not prudent, or economically feasible, to invest development time in something that is a "non-issue" in a real-world environment. That hurts the overall user base in the long run.

I'm not directing this just at you, but towards everyone who tries to run a large service of any sort on a shared server, free host, or other non-dedicated platform.

I've been paying my way on the Internet for years. If you are trying to run a large program or service, you need to be able to buy the resources to do so. If you can't, then running that service _costs_ someone else. This is why ISP's impose CPU restrictions.

Quote:
It does seem to be true that there is no difference whether I run it via the web or telnet. I have tried running the various scripts and found that it is the same. The provider's administrator confirmed in plain text that the CPU restriction applies to all processes, web or telnet, and that it is 30 seconds.

Running through telnet still helps with things like installing, directories, permissions, etc. But to lift the CPU restriction for me specially, the provider would have to create a special group and remove the restriction, risking major trouble with the other hundred users on the same server.

You say it all right there: 100 shared users on the server???

GT is offering competitive hosting options, and I'm sure they don't have a CPU time out on the basic links scripts.




PUGDOG Enterprises, Inc.

The best way to contact me is to NOT use Email.
Please leave a PM here.
Re: [pugdog] New Import problems! 1.1x to 2.0.5 In reply to
Hello Robert!

Here we are talking about many issues, and the shared server is only a small part of them.

Even if I had a dedicated server, the problems, many of which are discussed above, would remain the same. You also need to remember the discussions we had a couple of years ago, when Links SQL was in its earlier versions.

When users use certain features and those are wiped out in later versions, that is not fair.

I do not believe in stamping my database into a binary tarball simply because Alex decided to do it this way, and then having millions of support problems later on, REGARDLESS of shared or dedicated server. That has been truly disappointing.

Even if I used it online rather than offline, as you presumed, the problems would remain the same.

The fact that the Validate area of the admin produces a terribly long web page has still not changed in years. With an inflow of a hundred links per day, I cannot watch the Validate page turn into a kilometer-long web page, and a dedicated server does not help there. That is a fact that bothers me a lot, and it is why I prefer to work offline. Besides, it is very comfortable.

So your point about the shared server is in fact true to a small degree, but my problems go far beyond it, to the main programming and all the complications that result from it.