After the recent update of the community site to the latest 1.9 release candidate, followed by enabling search through SOLR integration, we're proud to announce that the community is now served entirely over SSL. This change not only improves the security and privacy of our users, but also allows us to benefit from state-of-the-art HTTP optimizations such as SPDY.
The goal of SPDY is to reduce web page load time by prioritizing and multiplexing the transfer of page subresources so that only one connection per client is required. As of today we serve the community site over spdy/3.1 as well as http/1.1, so if your browser supports SPDY, you benefit from it without changing anything.
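For reference, enabling this in nginx comes down to a couple of directives. The snippet below is only a minimal, illustrative server block (the host name and certificate paths are placeholders), assuming nginx 1.5.10 or later built with the SPDY module:

    server {
        # spdy/3.1 is negotiated via NPN during the TLS handshake;
        # clients without SPDY support simply fall back to http/1.1
        listen 443 ssl spdy;
        server_name community.example.org;                  # placeholder host name

        ssl_certificate     /etc/nginx/ssl/community.crt;   # placeholder certificate
        ssl_certificate_key /etc/nginx/ssl/community.key;   # placeholder key

        # optional: advertise SPDY availability to plain-http visitors
        add_header Alternate-Protocol "443:npn-spdy/3.1";
    }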
If you're interested in the more technical aspects of our setup, you might want to check how our SSL configuration validates, as well as our SSL configuration for Elgg 1.9 on Nginx.
If you plan to configure your own server to use SSL, please make sure to use the latest version of OpenSSL; we still remember the Heartbleed bug and should be wary of more recent ones like CVE-2014-0224.
You wouldn't happen to have any comparisons you could show us between SPDY and non-SPDY performance of the community, would you?
Nope, and I'd expect visible improvements mostly on cold-cache requests or crappy connections. You can disable it to do some benchmarks, of course.
I'd love to quantify the impact if possible
Tests of 5 concurrent sequences of a fresh connection plus a repeat view, on the "Mobile 3G (1.6 Mbps/768 Kbps, 300 ms RTT)" connection simulation, run twice per configuration:
With SPDY:
http://www.webpagetest.org/result/140620_3M_3WM/
http://www.webpagetest.org/result/140620_MJ_3Y3/
Without SPDY:
http://www.webpagetest.org/result/140620_ZR_3T3/
http://www.webpagetest.org/result/140620_Y8_3ZA/
I'm (sadly) not seeing that much difference with caching disabled on my laptop. There's enough variance that sometimes it's even faster with traditional http loading.
The problem seems to be that even though the requests are all issued in parallel with SPDY, *all* the downloads happen in serial, i.e. it literally opens only *one* connection. With HTTP, only a limited number of requests run in parallel, but the downloading happens in parallel too.
The more recent versions of nginx include many SSL/SPDY fixes and performance boosts (I saw an error message come up on the community here saying that nginx 1.1-something was being used, while I am using 1.7.1).
After months of playing with the latest nginx builds here, I now have a very fast serve time with HTTPS only on nginx.
We are using 1.6 now, so that's not the issue.
oh ok.
Looking at the waterfalls above, I notice that my site, although SPDY-enabled, does not show the very short waterfall shape that the Elgg community does. I don't really know what is going on in those charts; they only appear to include SSL-related transfers and not the actual page data.
Does anyone know why that is?
From the test results and Google PageSpeed Insights, it seems that the community does not use any compression such as gzip in nginx. That would save bandwidth and speed up the site, so I would suggest enabling it.
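For what it's worth, gzip in nginx is only a few directives. A minimal sketch (the MIME types and thresholds below are just illustrative defaults, not the community's actual settings):

    # text/html is compressed by default once gzip is on;
    # list the other text-like types explicitly
    gzip            on;
    gzip_types      text/css application/javascript application/json image/svg+xml;
    gzip_min_length 256;   # skip tiny responses where the gzip overhead isn't worth it
    gzip_vary       on;    # emit "Vary: Accept-Encoding" for intermediate caches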
Evan's test results show an improvement in first byte time and page load time using SPDY. The first byte time is rated as good, but that is relative to the total load time (see the remark below).
Using HHVM further improves load time and first byte (response) time, even compared to using SPDY:
Apache with SPDY
http://www.webpagetest.org/result/140622_4P_ABX/
Apache with HHVM
http://www.webpagetest.org/result/140622_MA_AET/
Remark: my page load time and first byte time are way faster than the community's, yet get a D rating. The first byte time is mostly caused by the Elgg init process running all the start.php files from the modules. I did some work on reducing that, and it seems to pay off even without HHVM, which still speeds things up by 40% thanks to its faster PHP processing.