After much frustration I am turning to you for help and expertise.
We have set up a new site on a subdomain to complement our main site and the other sub-sites (also on subdomains) that we operate. The concept is a stock market mini-portal with news feeds and trading analysis.
Both this site and the main site run on WordPress and use the same theme.
Although the main site and the other subdomains (each hosting an author's personal blog) respond in a timely fashion and load within a reasonable time, this site has shown erratic behavior from the beginning.
The first time it is accessed, the time to first byte is extremely long and the page takes an unacceptably long time to start rendering, but on subsequent refreshes it loads relatively fast.
We use the WP Super Cache plugin for caching purposes.
Speed results can be found here: http://www.webpagetest.org/result/150119_V7_QFX/
Any suggestions as to the cause of the problem?
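The erratic first-load behavior can be quantified from the command line. A quick sketch with curl (the 30-second timeout is arbitrary):

```shell
# Five back-to-back requests; if the back end (not the network) is the
# culprit, the first request should show the outlier TTFB while the
# cached refreshes come back fast.
for i in 1 2 3 4 5; do
  curl -o /dev/null -s -m 30 \
    -w 'TTFB: %{time_starttransfer}s  total: %{time_total}s\n' \
    "http://capital.analyst.gr/"
done
```

Running this a few times, cold versus warm, separates slow origin responses from slow asset delivery.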
Do any of the news feeds or trading analyses require calls out to external services or live calculations that are served directly in the HTML, or are they all populated by Ajax on the front end? That jumps out as the most obvious place I'd look for back-end delays.
The news feeds are handled as RSS feeds using the Hungryfeed RSS plugin for WordPress, with caching enabled.
Part of the code:
And a follow-up question, if you would be so kind:
For the static images that serve as header titles for the various categories, is it better to load them as external files or to embed them in the main HTML page using base64 encoding?
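For reference, the inline option means emitting the image bytes as a base64 data URI inside a CSS rule. A sketch (header.png and the .cat-header selector are placeholders, not names from the actual site):

```shell
# Emit a CSS rule with the image embedded as a base64 data URI.
# header.png is a stand-in; point this at a real category-header image.
printf 'PNG' > header.png   # placeholder bytes so the example is runnable
printf '.cat-header { background: url("data:image/png;base64,%s"); }\n' \
  "$(base64 < header.png | tr -d '\n')"
```

Inlining saves a request but the encoded bytes are ~33% larger and cannot be cached separately from the page, so it usually only pays off for small images.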
New test results:
Try minifying your JS and CSS files; this plugin does that and many other things that boost WordPress performance: https://wordpress.org/plugins/w3-total-cache/
Your site is running surprisingly fast for being on a soup-kitchen server (shared hosting packed with many sites, each an unknown resource drain depending on its traffic).
Use W3TC at your peril. You can see the difference yourself by running a WPT test with W3TC active and then inactive.
Try ZenCache instead.
Also your site requires 148 objects to render a page… Shudder…
Try reducing the object count, or defer loading of non-essential images and JS until after page load.
If you must have all images present, then spritify your images.
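For the sprite route, ImageMagick's montage tool can stitch the individual icons into a single sheet. A sketch with generated placeholder icons (file names are assumptions, and the command is skipped if ImageMagick is not installed):

```shell
#!/bin/sh
# Combine per-category header icons into one horizontal sprite sheet.
command -v montage >/dev/null 2>&1 || { echo "ImageMagick not found"; exit 0; }
convert -size 32x32 xc:gray icon1.png   # placeholder icons; substitute
convert -size 32x32 xc:gray icon2.png   # your real category headers
montage icon1.png icon2.png -tile x1 -geometry +0+0 sprite.png
```

Each element then references sprite.png with a CSS background-position offset, so the browser makes one request instead of dozens.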
I’ve run the tests on your website a few times now and I don’t really see much of the erratic TTFB you mention.
http://capital.analyst.gr is currently offline.
WebsiteSpeedExperts suggests what is likely your next best area of optimization - spritify your site.
That said, just making all your jpeg images progressive might dramatically improve your site performance.
Specifically, making your images progressive will likely drop your onload time from 12 seconds to around 3-4 seconds.
A simple and fast way to do this is to run the ImageMagick convert program across all your JPEG images, via a script that keeps a copy of each original and replaces it with the compressed, progressive, stripped version.
The command I use for this is…
convert $in -strip -interlace Plane -quality 90% $out
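The keep-and-replace loop around that command might look like this (the originals/ backup directory name is my assumption; adjust to taste):

```shell
#!/bin/sh
# Back up each JPEG into originals/, then overwrite it with a stripped,
# progressive version produced by ImageMagick's convert.
progressify_all() {
  mkdir -p originals
  for f in *.jpg; do
    [ -e "$f" ] || continue                      # no matches: nothing to do
    cp "$f" "originals/$f"                       # keep the original
    convert "$f" -strip -interlace Plane -quality 90% "$f.tmp" &&
      mv "$f.tmp" "$f"                           # swap in the new version
  done
}
# Usage: cd into the image directory, then run: progressify_all
```

Writing to a temp file and moving it into place only on success means a failed conversion never clobbers the image being served.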
Progressive images won’t change the onload time at all, assuming the original images are also well compressed (though they may help the visual experience, depending on where you sit in that debate). The browser still waits for the full image to finish loading before firing onload.