Hi Keith,
Today I tried using s3util to stream 50 x 50MB videos to S3 in one second, and my Heroku dyno (512MB) ran out of memory.
vegeta attack -targets=targets-video.txt -rate=50 -duration=1s > results.bin
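For context, the targets file would look something like this — the URL and body path here are hypothetical placeholders, and the `@file` body syntax assumes a vegeta version that supports it:

```
POST https://myapp.example.com/upload
@/path/to/video-50mb.mp4
```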
So time to figure out the memory profiler. This is what it reports for a smaller 500K upload:
      flat  flat%   sum%        cum   cum%
   76800kB 96.17% 96.17%    76800kB 96.17%  github.com/kr/s3/s3util.(*uploader).Write
  529.26kB  0.66% 96.83%   529.26kB  0.66%  bufio.NewReaderSize
  399.30kB   0.5% 97.33%   814.94kB  1.02%  crypto/x509.parseCertificate
   32.01kB  0.04% 97.37% 76970.79kB 96.38%  io.Copy
io.Copy is allocating a 32K buffer for every upload. My app reads from a multipart.Part, which doesn't implement io.WriterTo (that would let io.Copy skip the buffer), and s3util's uploader doesn't implement the other escape hatch either:

func (u *uploader) ReadFrom(r io.Reader) (n int64, err error)

I'm not sure that's the main issue, though. I'm still trying to understand uploader.go, particularly Write.