This is an error from the Nginx web server. It happened when I tried to import a large SQL backup into a MySQL server through phpMyAdmin.
First, I changed the upload limit (maximum file size) in php.ini:
upload_max_filesize = 100M
post_max_size = 100M
Then I restarted HHVM. But I still got the error:
413 Request Entity Too Large
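Changing php.ini alone does not lift the limit enforced by Nginx itself: the 413 status typically comes from Nginx's client_max_body_size directive, which defaults to 1 MB. A minimal sketch of that fix, assuming you can edit the server configuration (the path is an assumption):

```nginx
# /etc/nginx/nginx.conf (or the site's server block) -- path is an assumption
http {
    # Allow request bodies up to 100 MB, matching the php.ini limits above
    client_max_body_size 100M;
}
```

After editing, reload Nginx (for example with nginx -s reload) for the change to take effect.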
Continue reading “413 Request Entity Too Large – Nginx Web Server”
When uploading photos to Gallery 3.0.2, I see errors most of the time. About 30% of the pictures cannot be uploaded.
Gallery 3.0.2 is running on a Dreamhost hosting plan.
The error code is 2038.
According to the official Gallery 3 support page, some servers have simultaneous-upload limitations.
To work around this, log in and click Admin -> Settings -> Advanced.
Change the simultaneous_upload_limit setting to 1 and save.
After changing the setting, uploads work without problems, though slowly.
Slow is better than an error.
I have a website that allows users to upload large files, such as a 200 MB FLV file, through a web interface. This kind of upload does not use FTP, the regular PHP upload function, or a GET/POST form; the large file has to be uploaded through a socket.
It requires sockets to be enabled in PHP.
I do have the --enable-sockets option in my PHP build.
The problem happens when I upload a big file: the transfer always stops or dies at a certain point, around the 4 MB mark.
I think it must be a timeout in the PHP script.
Later I found a relevant PHP directive. Its default value is 60, which means 60 seconds.
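The excerpt does not name the directive. As an assumption, the 60-second default matches PHP's default_socket_timeout; raising it in php.ini would look like this (the directive name and the value 600 are illustrative guesses, not confirmed by the post):

```ini
; php.ini -- assumption: the directive is default_socket_timeout (default 60 s)
default_socket_timeout = 600
```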
Continue reading “Upload file with socket enabled”
I just got an email from Amazon announcing Multipart Upload for S3 storage.
We are excited to announce Multipart Upload which enables faster, more flexible uploads into Amazon S3. Multipart Upload allows you to upload a single object as a set of parts. Each part is a contiguous portion of the object's data. You can upload these object parts independently and in any order. If transmission of any part fails, you can retransmit that part without affecting other parts. After all parts of your object are uploaded, Amazon S3 then presents the data as a single object.
Multipart Upload enables several features we’ve heard customers ask for:
* Parallel uploads – upload multiple parts of an object in parallel to improve throughput
* Pause – begin uploading parts of an object then stop and wait as long as you like before resuming
* Resume – resume an upload that was paused or one that only partially succeeded
* Unknown size uploads – begin uploading data before you know the total size of the object
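The core idea behind the features above can be sketched in a few lines of plain Python. This is an illustration of the concept only, not the real S3 API: an object is cut into numbered contiguous parts, the parts may be transmitted in any order, and the receiver reassembles them by part number.

```python
import random

def split_into_parts(data, part_size):
    """Cut data into contiguous parts numbered from 1, like S3 part numbers."""
    return [(i // part_size + 1, data[i:i + part_size])
            for i in range(0, len(data), part_size)]

def assemble(parts):
    """Order-independent reassembly: sort by part number, then concatenate."""
    return b"".join(chunk for _, chunk in sorted(parts))

obj = bytes(range(256)) * 100            # a pretend 25,600-byte object
parts = split_into_parts(obj, 5 * 1024)  # five 5 KiB parts
random.shuffle(parts)                    # parts may be sent in any order
assert assemble(parts) == obj            # the result is one single object
```

Because each part carries its own number, a failed part can simply be resent, and parts can be produced before the total object size is known.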
Continue reading “Amazon S3 has Multipart Upload”
There is a picture presenting a global download-speed comparison, sourced from Akamai.
First of all, I would like you to test your own Internet speed. The test covers download speed, upload speed, and ping time.
I recommend the speed-testing service Speedtest.net.
Just a few minutes ago, I tested my Internet download speed in the office, which is connected through Telus.
The result is as follows:
Testing between a Telus Internet user and a Lynnwood, WA server.
Now that I have my own speed-test result, let us see how the rest of the world compares.
Continue reading “Download speed comparison Globally”