Resolving performance problems is hard (even more so when you have to do it in somebody else's code), so some clear measurements are very welcome. I tried out XDebug some years ago and it didn't work very well back then, but the latest release seems quite good. Here is how to use its profiler:
- First and foremost, make sure that you are not using it on a production server! Good as it may be, it adds a substantial slowdown to the processing, and in some rare cases the server could crash!
- Find out the PHP version you have and the location of your php.ini file. You can do this by viewing a simple PHP file containing nothing more than:
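```php
<?php
// phpinfo() reports the PHP version and, under "Loaded Configuration File",
// the php.ini that is actually being used.
phpinfo();
```

Look for the "Loaded Configuration File" entry in the output to see which php.ini is in use.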
- Get the XDebug module corresponding to your PHP version. This can be as simple as running sudo apt-get install php-xdebug, or going to the XDebug site and downloading it manually.
- Register XDebug as a Zend extension. It is very important not to register it as a normal extension (that is, with extension=), but as a Zend extension (with zend_extension, or zend_extension_ts for thread-safe builds), because it needs to hook in at a much deeper level than a normal extension.
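In php.ini this looks something like the following (the exact path to the module is an example and depends on your distribution and PHP version):

```ini
; Use zend_extension_ts instead if your PHP build is thread-safe.
zend_extension="/usr/lib/php/modules/xdebug.so"
```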
- Setup the correct configuration in your php.ini. The documentation is a good starting point. Some things you should remember:
- Make sure that xdebug.profiler_enable is set to 1
- Make sure that xdebug.profiler_output_dir is set to an existing directory
- Change the xdebug.profiler_output_name setting, since the default will almost certainly result in data loss (it saves the profile for a given page to the same file every time, overwriting earlier runs). I usually use cachegrind.out.%R.%t.%r
- The directory containing the trace files can grow very big very quickly, so make sure that you keep an eye on it and purge it from time to time. This is another reason to use it only on a test server.
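Putting the settings above together, the relevant php.ini section could look like this (the output directory is just an example; make sure it exists and is writable by the web server):

```ini
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp/xdebug-profiles
xdebug.profiler_output_name = cachegrind.out.%R.%t.%r
```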
- At this point you have two options: you can visit the target URL either with your web browser or with a command-line program like cURL or wget. The advantages and disadvantages:
- With a web-browser you are using a real client which fetches all the dependencies of the page. Also, you can fetch pages which have complex access methods (like HTTPS with sessions which require login). The disadvantage is that you have no precise measure of time, only impressions ("this feels slow"), which can mislead you.
- With a command-line program. These don't fetch all the dependencies of the page and need a little work to set up for authenticated pages (if you need HTTPS, I definitely suggest cURL), but they can be benchmarked much more easily.
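As a sketch of the command-line approach, here is how repeated timed fetches could look with cURL (the URL is a placeholder; substitute your own target page):

```shell
# Fetch the page three times, discard the body and print only the total time,
# so the numbers can be compared before and after an optimization.
URL="http://localhost/page.php"   # placeholder -- use your target URL
for i in 1 2 3; do
    curl -s -o /dev/null -w "%{time_total}s\n" "$URL"
done
```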
- Now that you have the trace data, load it into something like KCacheGrind (for Linux) or WCacheGrind (for Windows). Don't let the fact that the latter hasn't been updated in three years scare you; it still works very well.
- At this point there are two possibilities: a function taking a long time to execute, or a function being called many, many times. Also, keep in mind that the times shown in the profiling data are meaningful only as relative magnitudes, not as exact values (again, enabling profiling slows the system down considerably, so disabling it will speed everything up). There is no general silver bullet to make scripts fast, however here are two tips to get you started:
- Know the functions available in PHP and avoid rewriting them, since the built-ins are implemented directly in C, which is much more efficient. Be especially wary of hand-written constructs which loop over arrays, for example.
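As an illustration, summing an array with a hand-written loop duplicates work that the built-in array_sum() already does in C:

```php
<?php
$values = range(1, 10000);

// Hand-rolled version: every iteration goes through the PHP interpreter.
$total = 0;
foreach ($values as $v) {
    $total += $v;
}

// Built-in version: a single call that runs entirely in C.
$total = array_sum($values);
```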
- If a function is called repeatedly but always gives the same result for the same parameters, consider caching the result: try a lookup in the cache first, and compute the answer only on a miss.
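A minimal sketch of such a cache, using a static array inside the function (the function name and its body here are hypothetical placeholders for your own expensive code):

```php
<?php
function expensive_lookup(int $id): string
{
    static $cache = [];
    if (isset($cache[$id])) {
        return $cache[$id];           // cache hit: skip the expensive work
    }
    // ... the actual expensive computation (DB query, parsing, ...) ...
    $result = "value-for-$id";
    return $cache[$id] = $result;     // store for subsequent calls
}
```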
Happy profiling! To give you an idea about the benefits: I used this technique on the blog plugin for DokuWiki and obtained a speedup of ~20%!