I am currently running on EC2, and when load-testing with locust.io at 100 concurrent users I only achieve about 15 RPS, with CPU usage near 100%.
Does anyone know how to track down the line(s) of PHP 7 code that are using the most CPU, so I can optimize them?
I'd suggest Xdebug and KCachegrind to profile and analyse the behaviour of your code.
Configure Xdebug like this in php.ini to enable on-demand profiling (note: profiler_enable must be 0 for the trigger to take effect; setting it to 1 would profile every single request):

    xdebug.profiler_enable = 0
    xdebug.profiler_enable_trigger = 1
    xdebug.profiler_output_name = xdebug.out.%t
    xdebug.profiler_output_dir = /tmp
If you then pass XDEBUG_PROFILE as a GET or POST parameter (or as a cookie), Xdebug will produce profiling data for that request in the configured output directory.
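For example, to trigger a single profiled request from the command line (the URL is a placeholder -- point it at your own application):

```shell
# Request the page once with the trigger parameter set; Xdebug then
# writes /tmp/xdebug.out.<timestamp> for this one request only.
curl "http://localhost/index.php?XDEBUG_PROFILE=1"
```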
Open the resulting files with KCachegrind to drill into them.
Beware: execution takes noticeably longer while profiling, and the files produced can get pretty big -- keep an eye on disk usage.
You can also give XHProf a try; it may solve your problem.
We used to rely on it when we wanted to know which part of a script was taking the most time.
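As a minimal sketch of how XHProf is used (this assumes the xhprof PECL extension is installed; the output path is a placeholder):

```php
<?php
// Start collecting wall time plus CPU and memory data.
xhprof_enable(XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY);

// ... run the code you want to measure ...

// Stop profiling; returns the raw data as an array keyed per call edge.
$data = xhprof_disable();

// Persist it somewhere; the xhprof_html viewer bundled with the
// extension can then render call graphs and per-function CPU time.
file_put_contents('/tmp/myapp.xhprof', serialize($data));
```

Unlike Xdebug's profiler, XHProf is lightweight enough that some teams run it sampled in production.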
There are quite a few commercial products explicitly designed to solve these problems (often with decent free trials) for most languages and application types.
New Relic is really good at this for web software: it can generate burn charts showing which methods consume the most CPU cycles, and it can specifically highlight bad queries, API calls and other external services.
It's my go-to tool when trying to diagnose performance issues in PHP or Java web applications.
AWS has just launched 'X-Ray', which is billed as doing a similar job, although I've not had a chance to try it.