Gossamer Forum

Argh, Help! Dynamic URLs overriding hard coded URLs

Argh, Help! Dynamic URLs overriding hard coded URLs
Help! I've been building my site under a subdirectory to make sure I had it all working before launching. I just changed the paths in Setup > Paths & URLs to the root. Everything looks and seems fine, but in my include_headers, static hard-coded URLs are being rewritten to the dynamic path whenever they match the root URL. I have an iframe calling a separate page, and it's not working because the path is being changed, but I can't figure out why. Any ideas?

Example:
http://www.mydomain.com is the root
http://www.mydomain.com/includes/file.php is called in the iframe
GLinks rewrites it to:
http://www.mydomain.com/cgi-bin/page.cgi?g=includes%2Ffile.php;d=1 so it doesn't work.
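
For anyone trying to reproduce this, here's a minimal sketch of the kind of rewrite that appears to be happening. This is not the actual GLinks code; the page.cgi endpoint and the g=/d= parameters are just copied from the URL above. The path under the root is URI-escaped and wrapped into the dynamic URL:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Illustrative sketch only -- not Glinks code.
my $root_url = 'http://www.mydomain.com';

sub dynamicize {
    my ($url) = @_;
    (my $path = $url) =~ s/^\Q$root_url\E\///;                    # strip the root prefix
    $path =~ s/([^A-Za-z0-9\-_.~])/sprintf '%%%02X', ord $1/ge;   # URI-escape the rest
    return "$root_url/cgi-bin/page.cgi?g=$path;d=1";
}

print dynamicize("$root_url/includes/file.php"), "\n";
# http://www.mydomain.com/cgi-bin/page.cgi?g=includes%2Ffile.php;d=1
```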
Re: [pshadow] Argh, Help! Dynamic URLs overriding hard coded URLs
It's being modified by the clean_output code in Links.pm. I'll add something for future versions, but for now you'll have to make the modification to the code yourself. If you've got all the latest updates, then change admin/Links.pm (around line 577):
Code:
if ($url =~ m/^\Q$CFG->{build_static_url}\E/) {
$begin .= $url;
}
to:
Code:
if ($url =~ m{^(?:\Q$CFG->{build_static_url}\E|\Qhttp://www.mydomain.com/includes/\E)}) {
$begin .= $url;
}
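
To sanity-check the patched condition, here's a standalone sketch showing which URLs the new alternation leaves alone. The $CFG hash here is a stand-in; build_static_url is the real Glinks setting name, but the value is made up for illustration. Note the m{...} delimiters: with the usual m/.../ delimiters, the literal http:// inside the pattern would end the regex early, since \Q does not protect the delimiter.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-in for the real $CFG; this value is illustrative only.
my $CFG = { build_static_url => 'http://www.mydomain.com/pages' };

sub keep_as_is {
    my ($url) = @_;
    return $url =~ m{^(?:\Q$CFG->{build_static_url}\E|\Qhttp://www.mydomain.com/includes/\E)} ? 1 : 0;
}

print keep_as_is('http://www.mydomain.com/pages/Computers/index.html'), "\n";  # 1: static page, left alone
print keep_as_is('http://www.mydomain.com/includes/file.php'), "\n";           # 1: whitelisted iframe file
print keep_as_is('http://www.mydomain.com/other.php'), "\n";                   # 0: would still be rewritten
```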

Adrian

Last edited by: brewt: Nov 28, 2005, 10:20 PM
Re: [brewt] Argh, Help! Dynamic URLs overriding hard coded URLs
I used http://www.domain.com for the dynamic version and just http://domain.com for the hard-coded hyperlinks; the latter weren't changed.
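
If that's right, the workaround presumably works because clean_output only does a literal prefix match against the configured URL; a URL written without the www. doesn't share that prefix, so it's never touched. A quick sketch (the configured value is illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Illustrative: the configured root is matched as a literal prefix,
# so dropping the "www." makes a URL invisible to the rewrite.
my $configured = 'http://www.mydomain.com';

for my $url ('http://www.mydomain.com/includes/file.php',
             'http://mydomain.com/includes/file.php') {
    print $url =~ m/^\Q$configured\E/ ? "rewritten: $url\n" : "left alone: $url\n";
}
```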
Re: [brewt] Argh, Help! Dynamic URLs overriding hard coded URLs
Well, at least a year ago I already complained to Alex about using the clean_output solution in Links.pm, but at that point I didn't have enough reasons to support my opinion, just the speed overhead that clean_output causes with its regexp matching.

Now there is another reason why using the clean_output function is bad...


IMHO, clean_output should not be used.
I believe constructing the URLs at the point where they are needed is the best solution,
and it adds no overhead, unlike clean_output's parsing of the full webpage content...
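
To make the contrast concrete, here's a hypothetical sketch (none of these names are from Glinks) of building each URL at the point it's emitted, so no second regexp pass over the finished page is needed:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper -- not Glinks code. The template asks for the URL
# it needs, in the right form, at generation time.
my $root    = 'http://www.mydomain.com';
my $dynamic = 1;   # e.g. the visitor arrived via page.cgi

sub link_url {
    my ($path) = @_;
    if ($dynamic) {
        (my $escaped = $path) =~ s/([^A-Za-z0-9\-_.~])/sprintf '%%%02X', ord $1/ge;
        return "$root/cgi-bin/page.cgi?g=$escaped;d=1";   # dynamic form
    }
    return "$root/$path";                                 # static form
}

print link_url('Computers/index.html'), "\n";
```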


Reference URLs:
Post 1: URL generating
Post 2: [BUG] in clean_output()
Post 3: dynamic_preserve still not implemented

Best regards,
Webmaster33



Last edited by: webmaster33: Dec 5, 2005, 9:16 AM