I want to run about 50,000 INSERT queries against a MySQL DB. For this I have two options:
1- Directly import the (.sql) file: the following error occurs: "You probably tried to upload too large file. Please refer to documentation for ways to workaround this limit." (I believe this comes from the php.ini upload limits; see the snippet after my code below.)
2- Use PHP code to run these queries in chunks read from the (.sql) file. Here is my code:
<?php
// Configure DB
include "config.php";

// Read the whole .sql file into an array, one query per line
$file = file('country.txt');

// Run every query in the file
for ($position = 0; $position < sizeof($file); $position++)
{
    // mysql_query() returns false on failure, so test the return
    // value itself; isset($flag) is always true after an assignment
    $flag = mysql_query($file[$position]);
    if ($flag)
    {
        echo "Insert successful<br />";
    }
    else
    {
        echo mysql_error() . "<br>\n";
    }
}
echo "<br />End of file";
?>
But this fails with a memory-size error, which can happen even if I raise the 128M memory limit to 256M or even 512M.
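For reference, I believe these are the php.ini settings involved in both problems: the upload limit that breaks option 1 and the memory/time limits that bite in option 2. The values below are only examples, not settings I have verified on my host:

; example php.ini values (not verified)
upload_max_filesize = 64M   ; max size of an uploaded file (phpMyAdmin import)
post_max_size = 64M         ; must be at least as large as upload_max_filesize
memory_limit = 256M         ; per-script memory cap that file() runs into
max_execution_time = 300    ; seconds before a long-running import is killed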
So I think that if I could load a limited number of rows from the (.sql) file, say 1000 at a time, and execute those queries, it might be possible to import all the records from the file into the DB. But I have no idea how to track the position in the file from start to end, or how to update the start and end positions, so that previously fetched rows are not fetched again from the .sql file.
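Something like the following rough sketch is what I have in mind (untested; 'offset.txt' is just a hypothetical file for persisting the position, and the chunk size of 1000 is a placeholder). It saves the byte offset with ftell() after each run and resumes from it with fseek(), so no line is ever fetched twice and only one chunk of lines is read per run:

<?php
// Rough sketch (untested): resume reading country.txt from a saved
// byte offset so previously processed lines are never re-read.
include "config.php";

$chunkSize = 1000;            // lines per run (placeholder value)
$offsetFile = 'offset.txt';   // hypothetical file storing the last position

// Restore the last byte offset, or start from the beginning of the file
$offset = file_exists($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$handle = fopen('country.txt', 'r');
fseek($handle, $offset);

for ($i = 0; $i < $chunkSize; $i++)
{
    $line = fgets($handle);   // reads one line; returns false at end of file
    if ($line === false)
    {
        echo "<br />End of file";
        break;
    }
    if (!mysql_query($line))
    {
        echo mysql_error() . "<br>\n";
    }
}

// Remember where we stopped, for the next run
file_put_contents($offsetFile, ftell($handle));
fclose($handle);
?>

Re-running this script until it prints "End of file" (manually, via a redirect, or from cron) would then import the whole file in 1000-line steps without ever loading all 50,000 lines into memory at once.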