Hi Alex,
There is a bug in the clean_output() subroutine.
Regardless of our other discussion (about changing the use of clean_output() to dynamic URL assignment at the start of the CGI scripts), this bug should be fixed in clean_output() itself.
The BUG:
clean_output() does not check whether a parameter listed in $CFG{dynamic_preserve} is already present in the URL. It appends the dynamic_preserve parameters regardless of whether the URL already contains them.
Example:
$CFG{dynamic_preserve} = t,d,s,page,order
a) Original URL is: http://www.site.com/cgi-bin/lsql/page.cgi?d=1&page=3&order=a
b) The rewritten URL becomes: http://www.site.com/cgi-bin/lsql/page.cgi?d=1&page=3&order=a&d=1&page=1&order=d
(assuming the input values in $IN were: d=1&page=1&order=d)
As you can see in the resulting URL, clean_output() appends the dynamic_preserve parameters on top of the original ones, producing duplicate parameters.
It should instead skip those additions: a parameter already in the URL should take priority over the dynamic_preserve input parameters in $IN. So if a parameter exists in both the original URL and $IN, the copy from $IN must not be added.
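To illustrate the intended behavior, here is a minimal Perl sketch of the fix. The names are hypothetical (%CFG and %in stand in for the real $CFG and $IN objects, and add_preserved_params() is not the actual clean_output() code); it only demonstrates the "URL wins over $IN" rule described above.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-ins for $CFG and $IN, using the values from the example.
my %CFG = ( dynamic_preserve => 't,d,s,page,order' );
my %in  = ( d => 1, page => 1, order => 'd' );    # simulated $IN input values

sub add_preserved_params {
    my ($url) = @_;

    # Collect the names of parameters already present in the URL's query string.
    my %existing;
    if ( my ($query) = $url =~ /\?(.*)/ ) {
        for my $pair ( split /&/, $query ) {
            my ($name) = split /=/, $pair, 2;
            $existing{$name} = 1;
        }
    }

    # Append a preserve parameter only if the URL does not already carry it;
    # the URL's own value has priority over the value in $IN.
    for my $name ( split /,/, $CFG{dynamic_preserve} ) {
        next unless defined $in{$name};    # nothing to preserve for this name
        next if $existing{$name};          # URL already has it: skip the $IN copy
        $url .= ( $url =~ /\?/ ? '&' : '?' ) . "$name=$in{$name}";
    }
    return $url;
}
```

With the example URL from above, this leaves the URL unchanged because d, page and order are already present; a URL carrying only ?d=1 would gain page=1&order=d (t and s are skipped because they are not in the input).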
I hope this is clear.
Let me know if there are still questions.
It would be fine to post a patch here once it is corrected.
Best regards,
Webmaster33