QA: Add benchmarking system to test speed of different parts of Piwik in different situations #3177
Comments
Attachment: Patch for this issue.
Attachment: Screenshot of patch.
Attachment: Real screenshot. Ignore the other one.
Added a patch for this issue. It contains a system for benchmarking and the following benchmarks:
I didn't test all of them since they can take up to ~40 mins to set up, but here are the results of the ones I did test:
Some other notes:
Really interesting concept -- it has value as it is, since it allows manually comparing Before and After a particular change. It would be VERY nice to run it on Jenkins as part of the broader plan in #2000. For example, we could create a new build on http://qa.piwik.org:8080/jenkins/ called "Performance testing" that would run the performance tests and report the time spent in each test. Ideally we would reuse some tool that lets us draw graphs to see how performance is evolving across releases, maybe?
Attachment: New patch. Improved speed for adding visits.
I added a new patch for this issue w/ improved speeds for adding visits. It essentially bypasses cURL for adding visits. I've added a check for the useLocalTracking query param in the integration tests, so the improved speed can be used when running tests. For me, running all_tests.php?useLocalTracking=true sped things up from ~640s to ~470s. Let me know what you think. Replying to matt:
Do you think it would be a good idea to store the results in the piwik_log_profiling table? Then the graphs could be made from the data there. Or maybe we could create a Jenkins plugin...
Code review
For sure, if the table is good enough then please reuse it, that would be great! We could then write the performance report as a Piwik plugin (and even export it as HTML via the report publisher #3118 :) You can commit as it is, but we definitely need some kind of reporting in Jenkins or on a webpage to keep an eye on performance :) Not sure what the best or easiest way would be?
(In [6533]) Refs #3177, added benchmarking system and set of benchmarks to test Piwik performance. Notes:
I committed the benchmarking system; however, there's still more to do. Specifically, the benchmarks are still pretty slow (though manageable), and of course there needs to be some way to integrate w/ Jenkins. Here are my ideas:

Speeding things up: I think instead of running setup each time a benchmark is run, we can run it once in a database named after the test case, then keep the database (ie, don't drop it) so the setup method won't have to be run again.

Integrating with Jenkins: I've looked at some plugins for Jenkins and found two that might be useful for tracking performance:
Both of these process XML to get performance data. The benchmarks could be set up to output this XML when certain query parameters are used, and, if possible, Jenkins could get this data and feed it to the plugin. However, I have no experience setting up/configuring Jenkins, so I don't know what this will entail -- or if it's possible. What do you think?
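If one of those plugins is the Jenkins Performance plugin, the XML it consumes is roughly the JMeter result format, something along these lines (attribute meanings per the JMeter format -- the exact schema and the sample values below are illustrative and should be checked against the plugin's documentation):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testResults version="1.2">
  <!-- t = elapsed ms, lb = label, ts = timestamp (ms), s = success -->
  <httpSample t="241000" lb="OneSiteTwelveThousandVisitsOneDay" ts="1340000000000" s="true"/>
  <httpSample t="63000" lb="TrackerBenchmark" ts="1340000000000" s="true"/>
</testResults>
```

A benchmark runner could emit one sample per benchmark, with the elapsed time as `t`, and Jenkins would then plot the values across builds.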
(In [6534]) Refs #3177, fixing build. |
Code review
I think setup/teardown will take a pretty small amount of time compared to running the test itself? I think it's OK as it is (deleting/recreating the tables).
Looks nice - Would this Jenkins plugin store the value for each build automatically? Could we plot simple graphs like "measure time to run benchmark" for each benchmark + total benchmark time, using this plugin?
(In [6535]) Refs #3177, fix require bug when loading single benchmark group. |
Replying to matt:
The Url.test.php changes are needed because adding a query parameter (like useLocalTracking=true) to all_tests.php will cause some checks to fail. The port change was for Jenkins, since it doesn't use port 80.
Fixed the bug. BTW, if you load benchmark_runner.php directly, it'll show every benchmark.
Benchmarks will not get loaded when running tests since the files don't end w/ .test.php. If they did, Jenkins would never complete a build :).
The tests for the ones I ran take ~1-4mins. The setup takes between 7-11mins when using useLocalTracking=true. W/o local tracking, it can take > 40 mins.
Not sure. I was hoping someone w/ more experience w/ Jenkins would be able to more easily figure that out.
From what I understand of what's being said, I don't believe the port change is necessary. Jenkins runs in a Jetty container on port 8080. On the CI server, we run php-cgi through Apache on port 80.
Replying to vipsoft:
Sorry, I wasn't clear enough; by 'port change' I meant 'the change to Url.test.php that deals w/ the port in the current URL'. No ports were actually changed :)
Replying to capedfuzz:
Maybe the question is: Why is this change now necessary? @vipsoft, do you have any experience with Jenkins plugins and performance reporting? |
Replying to matt:
The change to Url.test.php is in [6533]. Originally it tested getCurrentUrl() == getCurrentUrlWithoutQueryString(). If you use a query parameter like useLocalTracking=true, however, the test will fail. I modified it to use parse_url manually, so we can use useLocalTracking=true on all_tests.php (to speed things up). What I didn't do was make sure the port in the current URL was added to the test's expected URL, which caused the test to fail in Jenkins, since it uses port 8080. So I committed a fix in [6534].
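The kind of check described above can be sketched as follows: build the expected URL from parse_url() components, including the port (Jenkins serves on 8080, not 80), and drop the query string so extra parameters like useLocalTracking=true don't break the comparison. The function name is illustrative, not the actual test code.

```php
<?php
// Build an expected URL from parse_url() components. Including the port
// makes the test pass on Jenkins (8080); dropping the query string makes
// it pass when all_tests.php is given extra parameters.
function buildExpectedUrl($currentUrl)
{
    $parts = parse_url($currentUrl);
    $port = isset($parts['port']) ? ':' . $parts['port'] : '';
    return $parts['scheme'] . '://' . $parts['host'] . $port . $parts['path'];
}

echo buildExpectedUrl('http://localhost:8080/tests/all_tests.php?useLocalTracking=true');
// http://localhost:8080/tests/all_tests.php
```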
Thanks @capedfuzz for the clarification. Do you think this ticket should be closed? |
Replying to matt:
There's still the issue that it's a bit too slow to run a lot of benchmarks. I can make using persisted DB data an option (ie, not dropping the test data). Also, I still have to try w/ the database dump. But other than that, the issue looks done.
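The persisted-DB option could work along these lines: derive a database name from the fixture, and only run the expensive setup when that database doesn't exist yet. This is a minimal sketch of just the naming/decision logic; the function names are hypothetical and the actual CREATE DATABASE and setup calls are left out.

```php
<?php
// Map a fixture class to a stable database name, e.g.
// OneSiteTwelveThousandVisitsOneDay -> piwik_bench_onesitetwelvethousandvisitsoneday
function getBenchmarkDatabaseName($fixtureClass)
{
    return 'piwik_bench_' . strtolower($fixtureClass);
}

// Setup only needs to run when the fixture's database is not already there.
function shouldRunFixtureSetup($fixtureClass, array $existingDatabases)
{
    return !in_array(getBenchmarkDatabaseName($fixtureClass), $existingDatabases);
}

var_dump(shouldRunFixtureSetup('OneSiteTwelveThousandVisitsOneDay',
    array('piwik_bench_onesitetwelvethousandvisitsoneday'))); // bool(false)
```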
See follow-up ticket #3280: write some kind of reports and graphs to track performance across several builds, and see if and when the software becomes faster or slower.
(In [6580]) Refs #3177, fix small bug in benchmark runner. |
The benchmark tests are using the SimpleTest library, right? I'm currently working on migrating all tests to PHPUnit. Most core and plugin tests are already migrated; some integration tests are still missing. The tests using SimpleTest will be removed when I'm done. Is there a way to migrate the performance tests to PHPUnit or something else?
Replying to SteveG:
For the most part they depend on the Integration base test case class. SimpleTest itself isn't really used. The only other requirements are that:
I haven't been keeping up w/ the PHPUnit changes, though. Do you think this would still be possible? |
Attachment: benchmarking w/ phpunit
I've uploaded a patch that implements benchmarking w/ the PHPUnit test infrastructure. I've modified visualphpunit to show the total execution time of the executed tests and made it possible to set GLOBAL variables before running a test through the browser. To run a benchmark, you select a benchmark, and set the PIWIK_BENCHMARK_FIXTURE global variable. Then you run the test and it'll tell you how long it took. Some notes on this implementation:
It's a bit preliminary, and I don't think it's quite working (running the archiving process w/ 12000 visits takes .02s...). Matt, what do you think of my patch? And Stefan, do you see a problem w/ the way I've implemented it (in terms of using PHPUnit)? |
Attachment: Another patch for phpunit benchmarking. |
I uploaded a new patch. It's been tested and provides several more fixture types and a benchmark for the Tracker. The SqlDump fixture uses a hardcoded URL, but there's currently no publicly hosted SQL dump to use, so it will have to be changed. Let me know what you think of it. |
Another note about my patch: To run a benchmark, go into visualphpunit and select a benchmark to run. Then, in the 'GLOBALS' section on the left panel, enter two key-value pairs:
- 'PIWIK_BENCHMARK_FIXTURE' => the name of the fixture to use, such as 'OneSiteTwelveThousandVisitsOneDay'
- 'PIWIK_BENCHMARK_DATABASE' (optional) => the name of the database to permanently store data in. If not set, the fixture setup will be run every time, which will slow things down if you run benchmarks more than once.
Then run the test. Be sure to only run one benchmark at a time. I think after I commit, I'll have to edit the README.
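For illustration, a benchmark's setup could read those two globals roughly like this (a sketch with a hypothetical helper function, not the committed code):

```php
<?php
// Resolve the fixture and optional persistent database from the $GLOBALS
// entries set through visualphpunit. Hypothetical helper, for illustration.
function resolveBenchmarkFixture()
{
    if (empty($GLOBALS['PIWIK_BENCHMARK_FIXTURE'])) {
        throw new Exception('Set PIWIK_BENCHMARK_FIXTURE before running a benchmark.');
    }
    $fixture = $GLOBALS['PIWIK_BENCHMARK_FIXTURE'];
    // If PIWIK_BENCHMARK_DATABASE is set, data is kept in that database
    // across runs instead of being rebuilt by the fixture setup each time.
    $database = isset($GLOBALS['PIWIK_BENCHMARK_DATABASE'])
        ? $GLOBALS['PIWIK_BENCHMARK_DATABASE']
        : null;
    return array($fixture, $database);
}

$GLOBALS['PIWIK_BENCHMARK_FIXTURE'] = 'OneSiteTwelveThousandVisitsOneDay';
list($fixture, $database) = resolveBenchmarkFixture();
echo $fixture; // OneSiteTwelveThousandVisitsOneDay
```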
Looks good to me, nice work! I would just suggest to remove some fixtures, and keep the 3 most useful ones? Also as per email, please prepare a comprehensive DB dump for the test case for #3330 |
Attachment: Ugly CoffeeScript script used to generate visits. |
(In [6954]) Fixes #3177, added benchmarking system that uses phpunit and visualphpunit. |
(In [6955]) Refs #3177, set svn property. |
(In [7029]) Refs #3177, add README section for benchmarking system. |
(In [7031]) Refs #3177, remove some unnecessary XHProf files and add a section to the tests README describing how to use XHProf. |
When optimizing or doing major refactoring, it's necessary to run performance tests to make sure new changes are as fast as or faster than the unchanged code. There should be a system in place to allow this sort of testing.
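A minimal sketch of what such before/after timing could look like (a generic illustration only, not the system that was eventually committed; the workload shown is a placeholder):

```php
<?php
// Time a callable over a few iterations and report wall-clock seconds,
// so a change can be compared before vs. after.
function benchmark($label, $callable, $iterations = 3)
{
    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) {
        $callable();
    }
    $elapsed = microtime(true) - $start;
    printf("%s: %.4fs over %d iteration(s)\n", $label, $elapsed, $iterations);
    return $elapsed;
}

benchmark('archiving (placeholder workload)', function () {
    usleep(1000); // stand-in for e.g. running the archiving process
});
```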
Some ideas on how benchmarking should be done: