Script called without the required parameter(s) Line 70

Started by jiqbal, May 11, 2010, 06:12:11 PM

jiqbal

Hi - I am receiving the following error when uploading a 30MB file. The server itself can handle files larger than that size; I have checked the settings. If I upload a file that is 10MB it works fine, but I get this error when it's larger.

Critical error

Script called without the required parameter(s).

File: /home/junaidiq/public_html/pinkxter.com/host/db_input.php - Line: 70

Website is -- www.pinkxter.com/host

Username is -- Test
Password is -- Test

Debug mode is enabled.

I would appreciate the assistance to get this resolved.

Thanks

phill104

Well done on providing all the required details and settings. That seems to be a rarity at the moment.

What you are experiencing is quite common. Most servers will run into limitations set by your host. If you have full control over your php settings you could try looking at everything outlined here - http://forum.coppermine-gallery.net/index.php/topic,24088.0.html
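
For reference, these are the php directives that usually come into play for large http uploads. The values below are only an illustration (your host may ignore or cap them), and note that post_max_size must be at least as large as upload_max_filesize, otherwise the upload fails even though upload_max_filesize looks big enough:

# illustrative .htaccess overrides - they only take effect if the host permits php_value here
php_value upload_max_filesize 64M
php_value post_max_size 64M
php_value max_execution_time 300
php_value max_input_time 300
php_value memory_limit 128M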

However, with files that large you are much better off uploading via ftp and using batch add.

Joachim Müller

Quote from: jiqbal on May 11, 2010, 06:12:11 PM
Username is -- Test
Password is -- Test
That's not true. Actually, it's test/test (lower case). Capitalization does matter.

Regular uploads work as expected (see http://www.pinkxter.com/host/displayimage.php?pid=30 and http://www.pinkxter.com/host/displayimage.php?pid=31) - it's expected behaviour that such huge files will time out when http-uploading. Use FTP-upload plus batch-add to upload such huge files. If that doesn't work for you, post a link to one of the files you're having issues with when batch-adding.

Quote from: jiqbal on May 11, 2010, 06:12:11 PM
The server itself can handle files larger than that size, I have checked the settings.
Yeah, really? What settings did you check? How about your browser?

Joachim

jiqbal


Thank you for the feedback.

1. I apologize for the uppercase; I didn't realize caps were on. The username/password is test/test.

2. The settings I have in the .htaccess file are the following:

php_value upload_max_filesize 2000M
php_value max_execution_time 10000
php_value max_input_time 10000

3. I have another script on another site that uses http uploading, and I did not have an issue uploading the same file there.

4. I have not tried the FTP upload with batch-add; I am pretty sure that will work. I was just hoping to get http uploads working so users can upload videos or large zip files.

5. I tried all three browsers (Chrome/Firefox/IE) and got the same error.

Thanks!

phill104

I think you need to read up on php settings, as those values are just silly IMHO. Does the .htaccess file actually have an effect? I doubt any host would allow such silly limits. What does your phpinfo suggest is set? What does your host actually allow? The settings you see will almost certainly be above what your host permits.
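
If you're not sure what actually applies, one quick way to check (just an example, delete the file again afterwards) is to drop a tiny script into the gallery folder and call it in your browser:

<?php
// example only: print the effective upload-related limits as PHP sees them
$settings = array('upload_max_filesize', 'post_max_size', 'max_execution_time', 'max_input_time', 'memory_limit');
foreach ($settings as $setting) {
    echo $setting . ' = ' . ini_get($setting) . '<br />';
}
?>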

jiqbal


My phpinfo does show a local value of 'upload_max_filesize = 2000M', so the .htaccess file does override the php.ini file.

I contacted my host and they helped me set up larger upload sizes. I uploaded a 40MB file with another script I have in place, so I have confirmed that large uploads work fine through http without any problems.

I am not sure why the error itself occurs on the coppermine script.

jiqbal

I'm not sure how I might have accidentally marked this as 'Solved', but it is not; it is still an issue.

Thanks

jiqbal

So I wanted to follow up: will this issue be looked into by the developers?

Thanks

Joachim Müller

Nothing to look into from a developer's perspective - as I said, it's expected behaviour:
Quote from: Joachim Müller on May 11, 2010, 07:14:36 PM
Regular uploads work as expected [...] - it's expected behaviour that such huge files will time out when http-uploading.

You haven't done as I suggested:
Quote from: Joachim Müller on May 11, 2010, 07:14:36 PM
Use FTP-upload plus batch-add to upload such huge files. If that doesn't work for you, post a link to one of the files you're having issues with when batch-adding.

Quote from: jiqbal on May 11, 2010, 07:50:33 PM
php_value upload_max_filesize 2000M
php_value max_execution_time 10000
php_value max_input_time 10000
That's just wishful thinking: PHP is only one component of the webserver. The webserver daemon has configuration settings of its own that apply as well, and so does the OS underneath it. To give you an analogy: if you want your car to run faster, you cannot just swap the speedometer for a model that shows higher speeds; you also need to modify the engine and transmission to actually give the car more power. There are several limiting factors, and you need to take care of all of them if you want your system to work reliably.
Uploads of 30 MB do not work reliably through http because of the many factors that have an impact on that process. It's hard to figure out all the limitations that can affect it (in other words: there are several things that can go wrong), basically because http uploading wasn't built to cater for such large files in the first place. That's why you should use the mechanism that has been designed to transfer larger files to a webserver: FTP upload. Coppermine offers a nifty mechanism to cater for that, i.e. batch-add. There is nothing that we (as Coppermine developers) can do to circumvent this, as we have no power over the limiting factors that apply to http uploads.
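
Just to give one concrete example of a limit outside of PHP (assuming an Apache webserver here; other servers have their own equivalents): Apache can cap the size of any request body, and if that cap is lower than your PHP values, the upload dies no matter what the .htaccess says. Something like

# example Apache directive - value in bytes (here roughly 100 MB), 0 means unlimited
LimitRequestBody 104857600

would have to be raised by whoever controls the server configuration.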

jiqbal

1. The reason I haven't tried FTP is that I am looking to have multiple users upload large files, since I have the plugin for zip files, etc.

2. I was checking to see if there was a solution because I was using another script called 'webshare', which allowed me to upload a 121MB file through http without any issues.

Hence, this led me to believe there is an issue somewhere in the Coppermine script causing large uploads to fail.

If it can't be researched, then please close the thread.

Thanks

Αndré

Quote from: jiqbal on May 12, 2010, 05:33:24 PM
I have the plugin for zip files
Which plugin? Can you please provide a link?

Quote from: jiqbal on May 11, 2010, 06:12:11 PM
Username is -- Test
Password is -- Test
Doesn't work. Please re-enable the test account.