I can only send data when the string contains 523264 characters or fewer. HELP!
include("conn.php");

// Bind the column as a blob ("b"); the actual value is streamed afterwards
// with send_long_data(), so NULL is bound as a placeholder.
$stmt = $conn->prepare("INSERT INTO json_t (json_string) VALUES (?)");
$null = NULL;
$stmt->bind_param("b", $null);
$stmt->send_long_data(0, $json_to_save);
$stmt->execute();
$stmt->close();
$conn->close();
I've tried this, but I still get the same error:
"Error executing prepared statement. Row size too large (> 8126). Changing some columns to TEXT or BLOB or using ROW_FORMAT=DYNAMIC or ROW_FORMAT=COMPRESSED may help. In current row format, BLOB prefix of 768 bytes is stored inline."
Please HELP.
$stmt = $conn->prepare("INSERT INTO json_t (json_string) VALUES (?)");
$null = NULL;
$max_allowed_packet = 100000;

if (!$stmt->bind_param('b', $null))
    die("Error binding parameters. {$stmt->error}");
echo "<br/><br/>";

// Send the JSON string in chunks no larger than $max_allowed_packet
foreach (str_split($json_to_save, $max_allowed_packet) as $packet)
    if (!$stmt->send_long_data(0, $packet))
        die("Error sending long packet. {$stmt->error}");
echo "<br/><br/>";

if (!$stmt->execute())
    die("Error executing prepared statement. {$stmt->error}");
OK, problem solved. I changed the engine to MyISAM.
Allows to send parameter data to the server in pieces (or chunks), e.g. if the size of a blob exceeds the size of max_allowed_packet. This function can be called multiple times to send the parts of a character or binary data value for a column, which must be one of the TEXT or BLOB datatypes.
Emphasis mine.
You need to break the string down yourself, for instance with str_split($json_to_save, 100000), then call send_long_data() for each piece so the chunks are sent to the MySQL server one at a time.
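A minimal sketch of that chunked upload, assuming conn.php provides a mysqli connection in $conn, the table json_t has a TEXT/BLOB-family column json_string, and the payload sits in $json_to_save (names taken from the question; the chunk size is illustrative and should stay below the server's max_allowed_packet):

```php
<?php
// Sketch only: requires a live MySQL server reachable via conn.php.
include("conn.php");

$stmt = $conn->prepare("INSERT INTO json_t (json_string) VALUES (?)");
$null = NULL;
$stmt->bind_param("b", $null); // "b": the value arrives later via send_long_data()

$chunk_size = 100000; // illustrative; keep it under max_allowed_packet
foreach (str_split($json_to_save, $chunk_size) as $chunk) {
    if (!$stmt->send_long_data(0, $chunk)) {
        die("send_long_data failed: {$stmt->error}");
    }
}

if (!$stmt->execute()) {
    die("execute failed: {$stmt->error}");
}
$stmt->close();
$conn->close();
```

Each send_long_data() call appends to the same parameter, so the server reassembles the full string before execute() runs.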
There are tons of questions about this already.
https://dba.stackexchange.com/questions/75328/row-size-error-with-mysql/75333#75333
(I can't flag as duplicate)
The problem is your server's innodb_log_file_size setting.
How to figure out a good setting: https://stackoverflow.com/a/18811422
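If you stay on InnoDB, a sketch of the relevant server settings in my.cnf (the values are illustrative, not recommendations; innodb_log_file_size must comfortably exceed your largest blob write, and changing it requires a server restart):

```
# my.cnf -- illustrative values, tune for your workload
[mysqld]
innodb_log_file_size = 256M   # must accommodate your biggest blob writes
max_allowed_packet   = 64M    # ceiling on a single client/server packet
```

The error message itself also points at the table-level fix: ALTER TABLE json_t ROW_FORMAT=DYNAMIC; stores long blob values off-page instead of keeping a 768-byte prefix inline.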