Opened 3 years ago

Closed 3 years ago

#2243 closed Task (fixed)

Bulk insert reports when archiving

Reported by: matt Owned by:
Priority: normal Milestone: Piwik 1.3
Component: Core Keywords:
Cc: Sensitive: no

Description

Extract from Custom date range processing with sql profiler enabled (see global.ini.php)

Executed 5314 times in 5313.7ms (average = 1ms)

	INSERT IGNORE INTO piwik_archive_blob_2010_01 (idarchive, idsite, date1, date2, period, ts_archived, name, value) VALUES (?,?,?,?,?,?,?,?)

What happens is that the Page URLs / Page Titles reports have many children, i.e. thousands, and Piwik will INSERT them one at a time, in this case 5313 times. Instead, Piwik should log all these rows into a file and use LOAD DATA INFILE.

I'm pretty sure we'll hit some bugs with some user configurations, so plan for a setting in the config file to disable the bulk insert.
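To illustrate the idea (a sketch in Python rather than Piwik's PHP; the helper name is hypothetical, while the table and column names are the ones from the profiler extract above): buffer the rows into a temporary tab-separated file, then issue one LOAD DATA statement instead of thousands of INSERTs.

```python
import csv
import os
import tempfile

def write_bulk_file(rows):
    """Write rows to a temporary tab-separated file and return the
    file path together with a LOAD DATA statement that would load it.
    Sketch only: real code must also escape tabs/newlines/backslashes
    inside values and eventually delete the temporary file."""
    fd, path = tempfile.mkstemp(suffix=".csv")
    with os.fdopen(fd, "w", newline="") as f:
        csv.writer(f, delimiter="\t", lineterminator="\n").writerows(rows)
    columns = "idarchive, idsite, date1, date2, period, ts_archived, name, value"
    statement = (
        "LOAD DATA LOCAL INFILE '%s' "
        "INTO TABLE piwik_archive_blob_2010_01 "
        "FIELDS TERMINATED BY '\\t' LINES TERMINATED BY '\\n' "
        "(%s)" % (path, columns)
    )
    return path, statement
```

One round trip to the server would then replace the 5314 prepared INSERTs shown in the profiler extract.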

Attachments (2)

unhex test.txt (1.6 KB) - added by matt 3 years ago.
2243.patch (490 bytes) - added by vipsoft 3 years ago.
Hack for charset mismatch


Change History (39)

comment:1 Changed 3 years ago by matt (mattab)

(In [4239]) Refs #2243

  • better logging
  • better performance when archiving Custom Date Range by changing cache key
  • remove unnecessary class hierarchy - ArchiveProcessing/Record*
  • Adding helper function to bulk-load data into MySQL + unit tests - tested with MySQL PDO but not mysqli (are webtests still running with mysqli?)

If the bulk import fails for any reason, we fall back to loading results in a loop and doing plain INSERTs (rather than a config file setting to disable it)

comment:2 Changed 3 years ago by vipsoft (robocoder)

are webtests still running with mysqli?

Only unit tests. I'll address this when we have parameterized builds running on Jenkins.

comment:3 Changed 3 years ago by matt (mattab)

OK cool, I actually meant unit tests; that's perfect

comment:4 Changed 3 years ago by matt (mattab)

  • Resolution set to fixed
  • Status changed from new to closed

(In [4253]) Fixes #2243

  • Enabling the bulk loading on BLOB records.
  • Had to first run the blobs through bin2hex... indeed, even though it *looked* like it was working at first, the tests once again saved us from a *VERY* nasty bug which would have shown up only later and been extremely difficult to find. LOAD DATA INFILE was working fine most of the time, but a few tests were failing! Then I discovered that the file containing the blob wasn't loaded properly in all cases unless the binary blob was hex'ed. I ran a quick performance test and it seems approximately the same speed, and uses the same disk space as well. However, in some cases it will definitely improve performance. Memory usage is somehow much lower in my test, so definitely a great point. It would have been nice to have versioning in the archive_blob table, since now we have 2 different data types (old blobs are pure binary, new blobs are hex'ed), but it works nonetheless (old blobs will fail the first gzuncompress(hex2bin()) and then fall back to gzuncompress() only).
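The dual-format fallback described in the last sentence can be sketched like this (Python stand-ins for PHP's hex2bin/gzuncompress; illustrative only, not Piwik's code):

```python
import binascii
import zlib

def read_blob(value: bytes) -> bytes:
    """Decode an archive blob that may be stored in either format:
    new blobs are hex-encoded compressed data (written via LOAD DATA
    INFILE), old blobs are the raw compressed bytes."""
    try:
        # New format: undo the bin2hex step, then decompress.
        return zlib.decompress(binascii.unhexlify(value))
    except (binascii.Error, zlib.error, ValueError):
        # Old format: the stored value is already the compressed blob.
        return zlib.decompress(value)
```

Old blobs almost never form valid hex, so the first branch raises and the plain decompression path is taken.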


comment:5 Changed 3 years ago by vipsoft (robocoder)

Can you use MySQL's native HEX() and UNHEX() functions? I would think this would give us backward compatibility on blobs while also reducing php's runtime memory usage (because the string conversion happens in the db server).

comment:6 Changed 3 years ago by matt (mattab)

To use HEX we would have to use LOAD DATA INFILE ... SET value = HEX(value), but the SET feature is only available as of 5.0.3

For the SELECT part, I gave it a quick try, but it doesn't seem to work. Attached patch for reference.
In any case, if this code works perfectly as I hope it does, then it's fine ;)
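For reference, the server-side pattern under discussion would look roughly like this (a sketch, not tested code; requires MySQL >= 5.0.3 for the SET clause; a user variable receives the hex'ed column and UNHEX() restores the binary blob on load):

```sql
LOAD DATA INFILE '/path/to/archive.csv'
INTO TABLE piwik_archive_blob_2010_01
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(idarchive, idsite, date1, date2, period, ts_archived, name, @hex_value)
SET value = UNHEX(@hex_value);
```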

Changed 3 years ago by matt (mattab)

comment:7 Changed 3 years ago by vipsoft (robocoder)

That wasn't what I meant. Let me test something.

comment:8 Changed 3 years ago by vipsoft (robocoder)

I'm still investigating an alternative to hex2bin/bin2hex because the current method doubles the blob size, but I want to log some observations first:

re: r4259. Not sure why it passed (before commenting it out) on your box. The CI server runs 5.3.6, so it's falling back to databaseInsertIterate. I suspect the test fails here because it inserts binary data (e.g., containing NULs) into a varchar column.

MySQL documentation says LOAD DATA INFILE uses the database's character set but we haven't explicitly set this in the CREATE DATABASE. Also, the "CHARACTER SET" clause wasn't added to the LOAD DATA INFILE syntax until MySQL 5.0.38. (At minimum, we should set the database charset for consistency.)

One thought is conditionals to use the newer features if available, e.g., Zend_Db_Adapters have a getServerVersion() method.
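That conditional could be as simple as comparing version tuples (hypothetical helper name; the 5.0.38 and 5.0.3 thresholds are the ones quoted above):

```python
def mysql_at_least(version: str, minimum: tuple) -> bool:
    """Check a MySQL server version string (e.g. the value returned by
    a Zend_Db_Adapter's getServerVersion()) against a minimum version.
    Assumes a plain numeric version string like '5.1.33'."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= minimum

# e.g. the CHARACTER SET clause of LOAD DATA INFILE needs >= 5.0.38,
# and the SET clause needs >= 5.0.3
```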

comment:9 Changed 3 years ago by vipsoft (robocoder)

(In [4263]) refs #2243 - set database's default charset explicitly

comment:10 Changed 3 years ago by vipsoft (robocoder)

(In [4264]) refs #2243 - use Piwik_Exec() for MYSQLI compatibility, i.e., can't prepare LOAD DATA INFILE when using Piwik_Query(). Tested with 5.1.3, 5.1.6, 5.2.0, and 5.2.17.

comment:11 Changed 3 years ago by vipsoft (robocoder)

  • Resolution fixed deleted
  • Status changed from closed to reopened

I'll add a test case that uses a table with a blob column. It appears the bin2hex/hex2bin is unnecessary, as long as the column's datatype is blob.

comment:12 Changed 3 years ago by matt (mattab)

You say it doubles the blob size, but my tests show that in fact it doesn't.
I ran the archiving on 2 full years with the batch insert and without, and the DB size was the same in both cases.

OK for charset, good thinking.

I'm just slightly concerned about the added escaping function, but tests are passing, so it should be fine.

When you test, it is good not to rely only on the unit tests: the blob sizes & scope they generate are quite limited. Try generating loads of data with the Visitor Generator, then archive the data with the old & new algorithms to see if both work as expected. You can also easily compare the piwik_archive_* db sizes.
To switch from one to the other, simply apply:

### Eclipse Workspace Patch 1.0
#P trunk
Index: core/ArchiveProcessing.php
===================================================================
--- core/ArchiveProcessing.php	(revision 4260)
+++ core/ArchiveProcessing.php	(working copy)
@@ -696,7 +696,7 @@
 	protected function insertBulkRecords($records)
 	{
 		// Using standard plain INSERT if there is only one record to insert
-		if($DEBUG_DO_NOT_USE_BULK_INSERT = false
+		if($DEBUG_DO_NOT_USE_BULK_INSERT = !false
 			|| count($records) == 1)
 		{
 			foreach($records as $record)

comment:13 Changed 3 years ago by vipsoft (robocoder)

But that doesn't disable the bin2hex/hex2bin. When I say "doubles the size", I'm comparing the old blobs (octets) vs the new blobs (nybbles as hexits). So DEBUG_DO_NOT_USE_BULK_INSERT doesn't affect the size, as it still uses hexits.

With the escaping function (used only when writing to the infile), blobs go back to being binary data.
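That escaping (a sketch, not the actual patch) only needs to protect the bytes that MySQL's default LOAD DATA INFILE terminators and escape character would misinterpret, so the stored blob itself stays binary:

```python
def escape_infile_field(data: bytes) -> bytes:
    """Escape one raw binary field for MySQL LOAD DATA INFILE with the
    default FIELDS TERMINATED BY '\\t' ... LINES TERMINATED BY '\\n'.
    MySQL reverses these sequences on load."""
    out = bytearray()
    for byte in data:
        if byte == 0x5C:      # backslash: the escape character itself
            out += b"\\\\"
        elif byte == 0x09:    # tab: the field terminator
            out += b"\\t"
        elif byte == 0x0A:    # newline: the line terminator
            out += b"\\n"
        elif byte == 0x00:    # NUL inside binary data
            out += b"\\0"
        else:
            out.append(byte)
    return bytes(out)
```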

comment:14 Changed 3 years ago by vipsoft (robocoder)

(In [4272]) refs #2243 - looks good so far

comment:15 Changed 3 years ago by matt (mattab)

But that doesn't disable the bin2hex/hex2bin.

Oh no, I messed up! And I thought the results were strange; well, I should have thought about it a bit more indeed.

The tests are failing for missing table: http://qa.piwik.org:8080/webtest/003_UnitTests/001_response_invoke.html

when you are confident all is working and tested properly, please ping me and I will test on the demo :)

comment:16 Changed 3 years ago by vipsoft (robocoder)

Yeah, I forgot to drop the database before running my unit test changes. I'll fix this a.m.

comment:17 Changed 3 years ago by vipsoft (robocoder)

Fixing the unit tests is taking longer than I thought ... encountering more test runner side-effects (similar to #896).

comment:18 Changed 3 years ago by vipsoft (robocoder)

(In [4285]) refs #2243 - move test code around

comment:21 Changed 3 years ago by vipsoft (robocoder)

(In [4290]) refs #2243, refs #2253 - use a pseudo test method since destruct is called too late

comment:24 Changed 3 years ago by vipsoft (robocoder)

The non-integration tests are working but the integration tests are broken. I'm getting different errors depending on whether I run Main.test.php or all_tests.php.

I've been hacking this since r4287. I'm off for dinner, and will revisit later tonight, unless someone beats me to it. ;)

comment:25 Changed 3 years ago by vipsoft (robocoder)

*blink blink* Main.test.php passes from command line and browser. Only failing in the browser when run as part of all_tests.php.

comment:26 Changed 3 years ago by vipsoft (robocoder)

  • Resolution set to fixed
  • Status changed from reopened to closed

(In [4293]) fixes #2243, refs #2253

comment:27 Changed 3 years ago by vipsoft (robocoder)

(In [4294]) fixes #2255

  • When the LOCAL keyword is omitted from the statement, the unit tests fail on dev6 because the db user doesn't have FILE privilege granted. (Granted.)
  • When the LOCAL keyword is included in the statement, the unit tests fail on dev6 because MySQL was not built with --enable-local-infile. (Thus, PDO_MYSQL on dev6 is affected by PHP bug 54158.)

refs #2243

  • fixes problem where infile not being deleted when Piwik_Exec() threw an exception (MYSQLI)
  • set driver_options uniformly between MYSQLI and PDO_MYSQL
  • update unit test conditions (when batch insert should work)
  • renamed methods
  • the performance gain is worth the effort of trying both LOAD DATA LOCAL INFILE and LOAD DATA INFILE
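The last point (trying both LOCAL and non-LOCAL) can be sketched as a simple fallback chain (hypothetical interface; `execute` stands in for Piwik_Exec):

```python
def bulk_load(execute, path, table, columns):
    """Try LOAD DATA LOCAL INFILE first, then LOAD DATA INFILE; return
    False if both fail so the caller can fall back to row-by-row INSERTs.
    `execute` is any callable that runs a SQL string and raises on error."""
    column_list = ", ".join(columns)
    for keyword in ("LOCAL INFILE", "INFILE"):
        try:
            execute("LOAD DATA %s '%s' INTO TABLE %s (%s)"
                    % (keyword, path, table, column_list))
            return True
        except Exception:
            # e.g. local_infile disabled, or the FILE privilege missing
            continue
    return False
```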

comment:28 Changed 3 years ago by matt (mattab)

  • Resolution fixed deleted
  • Status changed from closed to reopened

Reopening as on Windows at least there are some issues:

ArchiveProcessing.test.php

Fail: Test_Piwik_ArchiveProcessing -> test_tableInsertBatch -> Record 4 bug, not 简体中文 BUT ???? at [D:\piwik\svn\trunk\tests\core\ArchiveProcessing.test.php line 305]
Fail: Test_Piwik_ArchiveProcessing -> test_tableInsertBatch -> Record 4 bug, not 简体中文 BUT ???? at [D:\piwik\svn\trunk\tests\core\ArchiveProcessing.test.php line 305]

Also, some Main.test.php tests fail with "no data" when there should be some, which I think is because the blobs don't uncompress properly and Piwik defaults to "no data" rather than throwing an error.

Maybe the ArchiveProcessing failure above can help: it looks like UTF-8 characters. Could it be some kind of charset issue, or a Windows-specific issue?

Alternatively, we could disable bulk loading on Windows, but only once we are 100% sure that the failing tests are due to Windows.

comment:29 Changed 3 years ago by vipsoft (robocoder)

I hope it's trivial ... I'll take a quick look in the a.m.

comment:30 Changed 3 years ago by matt (mattab)

Actually, only ArchiveProcessing.test.php is failing; Main.test.php seems fine.

comment:31 Changed 3 years ago by vipsoft (robocoder)

We may need some more users to test on Windows. ArchiveProcessing.test.php passes for me with r4314 on Windows using:

  • PHP 5.2.9 + Apache + MySQL 5.1.33
  • PHP 5.2.14 + IIS + MySQL 5.1.33

Can you run SHOW VARIABLES LIKE 'character_set%'; ?

On my Windows box, my.ini has:

| character_set_client     | latin1
| character_set_connection | latin1
| character_set_database   | latin1
| character_set_filesystem | binary
| character_set_results    | latin1
| character_set_server     | latin1
| character_set_system     | utf8

This is the same as my Ubuntu dev box:

| character_set_client     | latin1
| character_set_connection | latin1
| character_set_database   | latin1
| character_set_filesystem | binary
| character_set_results    | latin1
| character_set_server     | latin1
| character_set_system     | utf8

comment:32 Changed 3 years ago by matt (mattab)

Running the query on the piwik_tests database, I get:

Variable_name 	Value
character_set_client 	utf8
character_set_connection 	utf8
character_set_database 	utf8
character_set_filesystem 	binary
character_set_results 	utf8
character_set_server 	latin1
character_set_system 	utf8
character_sets_dir 	C:\xampp\mysql\share\charsets\

and the values from mysql.exe

$ /cygdrive/c/xampp/mysql/bin/mysql.exe --verbose --help | grep set

character-sets-dir                (No default value)
default-character-set             latin1

comment:33 Changed 3 years ago by matt (mattab)

FYI the code is now running on demo.piwik.org

I had to DROP the archive tables for March & April since the ones built with the previous RC were HEXed and therefore not read properly. After dropping the tables, I ran archiving.

The problem is that apparently LOAD DATA INFILE isn't being used; I saw this in the SQL profiler output:

Executed 2 times in 1.2ms (average = 0.6ms)

	INSERT IGNORE INTO piwik_archive_blob_2011_01 (idarchive, idsite, date1, date2, period, ts_archived, name, value) VALUES (?,?,?,?,?,?,?,?)

investigating...

Changed 3 years ago by vipsoft (robocoder)

Hack for charset mismatch

comment:34 Changed 3 years ago by vipsoft (robocoder)

matt: can you try the attached patch?

comment:35 Changed 3 years ago by matt (mattab)

On demo.piwik.org: there is just one issue, that somehow Piwik_Exec doesn't log the query execution in the profiler like the other functions do.

After adding lots of logging the bulk insert code is executed as expected, cool!!

comment:36 Changed 3 years ago by matt (mattab)

I applied the patch locally on Windows and it fixes ArchiveProcessing.test.php !!!

comment:37 Changed 3 years ago by vipsoft (robocoder)

  • Resolution set to fixed
  • Status changed from reopened to closed

(In [4320]) fixes #2243 - charset mismatch hack; also handle case where dbhost is blank (eg socket), or a pseudo-domain was added to the hostname
