<?xml version="1.0" encoding="utf-8" ?><rss version="2.0" xml:base="https://www.webmaster-forums.net/crss/node/1042047" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title></title>
    <link>https://www.webmaster-forums.net/crss/node/1042047</link>
    <description></description>
    <language>en</language>
          <item>
    <title>The only problem I see with</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1273959</link>
    <description> &lt;p&gt;The only problem I see with that is I would have to have a third database to store the data that doesn&#039;t change every day. I don&#039;t have any other tables as the site is fairly static besides the import data...just a potential problem in the future.&lt;/p&gt;
 </description>
     <pubDate>Thu, 15 Mar 2012 12:07:11 +0000</pubDate>
 <dc:creator>David26</dc:creator>
 <guid isPermaLink="false">comment 1273959 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>needed to delete certain</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1273426</link>
    <description> &lt;p&gt;needed to delete certain records as well, and only insert most from the feed I received. I had a script go through the CSV file and, for any record that was going to need to be deleted, set a flag on the record.&lt;/p&gt;
 </description>
     <pubDate>Sat, 25 Feb 2012 16:34:11 +0000</pubDate>
 <dc:creator>solarlight</dc:creator>
 <guid isPermaLink="false">comment 1273426 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>Yes, I feel for you.  Be</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1241990</link>
    <description> &lt;p&gt;Yes, I feel for you.  Be happy though, this board actually supplied a data dictionary - many don&#039;t even go that far...&lt;/p&gt;
 </description>
     <pubDate>Wed, 27 May 2009 14:15:53 +0000</pubDate>
 <dc:creator>Shaggy</dc:creator>
 <guid isPermaLink="false">comment 1241990 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>Have you had the &#039;pleasure&#039;</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1241975</link>
    <description> &lt;blockquote&gt;&lt;p&gt;Have you had the &#039;pleasure&#039; of dealing with (board&#039;s interpretation of) RETS feeds yet?&lt;/p&gt;&lt;/blockquote&gt;
&lt;p&gt;At least that is in XML.  &lt;a href=&quot;http://www.taar.com/refresh/exports/&quot;&gt;This&lt;/a&gt; is what I had to deal with.&lt;/p&gt;
 </description>
     <pubDate>Wed, 27 May 2009 02:17:24 +0000</pubDate>
 <dc:creator>pr0gr4mm3r</dc:creator>
 <guid isPermaLink="false">comment 1241975 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>Also sounds like a good</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1241973</link>
    <description> &lt;p&gt;Also sounds like a good approach.  Another option could be two tables instead of two databases you switch between.&lt;/p&gt;
&lt;p&gt;Ah, real estate data.  It is always interesting what weird and wonderful ways the boards come up with to distribute that stuff eh?  Have you had the &#039;pleasure&#039; of dealing with (board&#039;s interpretation of) RETS feeds yet?&lt;/p&gt;
&lt;p&gt;Cheers,&lt;br /&gt;
Shaggy&lt;/p&gt;
 </description>
     <pubDate>Wed, 27 May 2009 01:55:47 +0000</pubDate>
 <dc:creator>Shaggy</dc:creator>
 <guid isPermaLink="false">comment 1241973 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>Also, something I did to</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1241965</link>
    <description> &lt;blockquote&gt;&lt;p&gt;Also, something I did to majorly speed up the import was instead of issuing SQL statements to insert the data, I use &lt;a href=&quot;http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html&quot; title=&quot;http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html&quot;&gt;http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html&lt;/a&gt; mysqlimport to quickly import a whole CSV file straight to the table.&lt;/p&gt;&lt;/blockquote&gt;
&lt;p&gt;That was my first thought, but the CSV files are not consistent, and the slightest change would cause a big problem.&lt;/p&gt;
&lt;p&gt;I ended up going with Matt&#039;s solution.  The website has three databases.  DB1 holds the consistent data, and DB2 &amp;amp; DB3 hold the daily imported data.  If we are using DB2 today, then when the import script runs, DB3 will be truncated and repopulated while the website is serving DB2.  Once the import is done, the active database is swapped for the updated one by changing a constant.&lt;/p&gt;
&lt;p&gt;Clunky?  Maybe, compared to some of the other solutions posted here, but this company is a bit of a penny-pincher, and they weren&#039;t willing to pay me to come up with something better.  And it&#039;s worked so far.  I support my code free for a year, so the company would get me on the phone in a heartbeat if there were a problem.  I had to iron out a couple of minor glitches after the launch, but I haven&#039;t heard from them on this project since late last year, so it must be working for them.&lt;/p&gt;
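&lt;p&gt;A rough sketch of that two-database swap (Python; the names and the truncate/load hooks are illustrative assumptions, not the site&#039;s actual code):&lt;/p&gt;

```python
def standby_db(active):
    # Whichever import database the site is NOT serving right now.
    return "DB3" if active == "DB2" else "DB2"

def run_nightly_import(active, truncate, load):
    """Rebuild the standby database, then return the new active name.

    truncate/load are caller-supplied hooks (e.g. wrappers around
    TRUNCATE TABLE and the CSV loader); the serving database is
    never touched while they run.
    """
    target = standby_db(active)
    truncate(target)  # empty yesterday's copy
    load(target)      # repopulate from tonight's feed
    return target     # flip the constant: the site now serves this DB
```

&lt;p&gt;Flipping that one value is what keeps the swap near-instant: the slow rebuild happens entirely on the side the site is not reading.&lt;/p&gt;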
 </description>
     <pubDate>Tue, 26 May 2009 22:23:05 +0000</pubDate>
 <dc:creator>pr0gr4mm3r</dc:creator>
 <guid isPermaLink="false">comment 1241965 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>(yes the solution is</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1241959</link>
    <description> &lt;p&gt;(yes the solution is probably already in place by now, but I like posting anyhow for those who may find this thread on a search engine looking for similar solutions)&lt;/p&gt;
&lt;p&gt;Also, something I did to majorly speed up the import was instead of issuing SQL statements to insert the data, I use &lt;a href=&quot;http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html&quot; title=&quot;http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html&quot;&gt;http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html&lt;/a&gt; mysqlimport to quickly import a whole CSV file straight to the table.&lt;/p&gt;
&lt;p&gt;Some tips. &lt;/p&gt;
&lt;p&gt;If you&#039;re working on a local MySQL server (the CSV file is on the same server), use LOAD DATA first, so that if you get any warnings you can run SHOW WARNINGS afterwards.&lt;/p&gt;
&lt;p&gt;I needed to delete certain records as well, and only insert most from the feed I received. I had a script go through the CSV file and, for any record that was going to need to be deleted, set a flag on the record. Then, for any that needed to be inserted (I&#039;m picky), I do a data integrity check to make sure I didn&#039;t get any bad rows (i.e. make sure each row&#039;s field count matches). Also, I get dates in MM/DD/YYYY format, so this is a good time to swap those over to YYYY-MM-DD. Save all the good data into a new CSV file.&lt;/p&gt;
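&lt;p&gt;That cleanup pass might look roughly like this in Python (the column count and date-column index are made-up assumptions; the actual script was not necessarily structured this way):&lt;/p&gt;

```python
import csv
import io

EXPECTED_FIELDS = 5  # illustrative; match your feed's real column count

def reformat_date(mdy):
    # MM/DD/YYYY to YYYY-MM-DD
    m, d, y = mdy.split("/")
    return "%s-%s-%s" % (y, m.zfill(2), d.zfill(2))

def cleanse(raw_csv, date_col):
    """Split rows into (good, bad): rows with the wrong field count are
    rejected, and the date column is rewritten to YYYY-MM-DD."""
    good, bad = [], []
    for row in csv.reader(io.StringIO(raw_csv)):
        if len(row) != EXPECTED_FIELDS:
            bad.append(row)  # integrity check failed: wrong field count
            continue
        row[date_col] = reformat_date(row[date_col])
        good.append(row)
    return good, bad
```

&lt;p&gt;The good rows would then be written back out to the new CSV file for mysqlimport.&lt;/p&gt;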
&lt;p&gt;As this checking and processing is what will take the majority of the time, your data in the database stays intact until you are ready to finally do the update. &lt;/p&gt;
&lt;p&gt;Now you can do a single DELETE FROM tblWhatever WHERE DelFlag=&#039;Y&#039;&lt;br /&gt;
Then immediately do the mysqlimport (you will need to run it via a system() call).&lt;/p&gt;
&lt;p&gt;I have found on two projects that this is much faster than doing inserts/updates. Another method, instead of writing an extra CSV file, would be to insert all the cleansed data into a temporary table and then issue one statement to insert all records from the temp table into the live one.&lt;/p&gt;
&lt;p&gt;I stick with the CSV file, though, for other purposes relevant to the project.&lt;/p&gt;
&lt;p&gt;Also, not sure how you are getting the data from the 3rd party, but if it is like us, where they have their inventory system export about 4 CSV files each night (plus accompanying &quot;long text&quot; .txt files for descriptions in another directory), rsync (on the Linux end) and DeltaCopy (which easily wraps and sets up rsync on the Windows end) are great tools. Extremely fast, and rsync has ways to transfer just what has changed, not everything all over again. It blew away establishing an FTP connection.&lt;/p&gt;
&lt;p&gt;-Greg&lt;/p&gt;
 </description>
     <pubDate>Tue, 26 May 2009 21:27:33 +0000</pubDate>
 <dc:creator>Greg K</dc:creator>
 <guid isPermaLink="false">comment 1241959 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>Shaggy wrote:
I&#039;m (a lot)</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1241956</link>
    <description> &lt;p&gt;&lt;div class=&quot;quote-msg&quot;&gt;&lt;div class=&quot;quote-author&quot;&gt;&lt;em&gt;Shaggy&lt;/em&gt; wrote:&lt;/div&gt;I&#039;m (a lot) late to the party&lt;/div&gt;Yeah, the beer&#039;s all gone mate, looks like you&#039;re on the Lambrini &lt;img src=&quot;https://www.webmaster-forums.net/misc/smileys/grin.png&quot; title=&quot;Laugh&quot; alt=&quot;Laugh&quot; class=&quot;smiley-content&quot; /&gt;&lt;/p&gt;
 </description>
     <pubDate>Tue, 26 May 2009 20:52:53 +0000</pubDate>
 <dc:creator>greg</dc:creator>
 <guid isPermaLink="false">comment 1241956 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>I&#039;m (a lot) late to the</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1241954</link>
    <description> &lt;p&gt;I&#039;m (a lot) late to the party, but here are a couple more ideas:&lt;/p&gt;
&lt;p&gt;1) Wrap everything in a transaction* :&lt;br /&gt;
BEGIN WORK&lt;br /&gt;
delete from tblname&lt;br /&gt;
... bunch of inserts&lt;br /&gt;
if success&lt;br /&gt;
  COMMIT&lt;br /&gt;
otherwise&lt;br /&gt;
  ROLLBACK&lt;/p&gt;
&lt;p&gt;* Never tried this in MySQL, though it apparently supports transactions now...&lt;/p&gt;
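&lt;p&gt;For illustration, the same delete-then-reinsert-inside-a-transaction shape in Python (using the built-in sqlite3 module rather than MySQL, and a made-up &#039;listings&#039; table):&lt;/p&gt;

```python
import sqlite3

def replace_all(conn, rows):
    """Delete everything and reinsert inside one transaction; any
    failure rolls back, leaving yesterday's data untouched."""
    try:
        cur = conn.cursor()
        cur.execute("DELETE FROM listings")
        cur.executemany("INSERT INTO listings (id, price) VALUES (?, ?)", rows)
        conn.commit()    # COMMIT only if every insert succeeded
    except Exception:
        conn.rollback()  # ROLLBACK restores the pre-import state
        raise
```

&lt;p&gt;(In MySQL the table would need a transactional storage engine such as InnoDB for this to work; MyISAM tables do not support transactions.)&lt;/p&gt;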
&lt;p&gt;2) Implement a sort of &#039;row versioning&#039;:&lt;/p&gt;
&lt;p&gt;Add a table &#039;imports&#039; with an import_id, start timestamp, end timestamp, and a &#039;status&#039; field, as well as anything else you might like (number of rows inserted, etc).&lt;/p&gt;
&lt;p&gt;Add a field to your import target, &#039;import_id&#039;, and change your indexes/keys to be unique (compound) for each &#039;import_id&#039; and listing_id.&lt;/p&gt;
&lt;p&gt;Before beginning the import, insert a row in &#039;imports&#039; to obtain a new unique &#039;import_id&#039;.  Have the status = &#039;incomplete&#039;.&lt;/p&gt;
&lt;p&gt;Insert your data, with each row carrying the new &#039;import_id&#039; you&#039;ve obtained.&lt;/p&gt;
&lt;p&gt;When the import is done, and successful, you can update the &#039;imports&#039; table row with a &#039;success&#039; status.&lt;/p&gt;
&lt;p&gt;When successful, you can either delete the previous &#039;version&#039; (import_id) of the rows, or leave them there for historical purposes.&lt;/p&gt;
&lt;p&gt;In your application, you&#039;ll be selecting rows with your current criteria plus &quot;and import_id = (select import_id from imports where status = &#039;success&#039; and end_timestamp = (select max(end_timestamp) from imports where status = &#039;success&#039;))&quot;.&lt;/p&gt;
&lt;p&gt;Cheers,&lt;br /&gt;
Shaggy&lt;/p&gt;
 </description>
     <pubDate>Tue, 26 May 2009 18:26:18 +0000</pubDate>
 <dc:creator>Shaggy</dc:creator>
 <guid isPermaLink="false">comment 1241954 at https://www.webmaster-forums.net</guid>
  </item>
  <item>
    <title>Well, unless it&#039;s a regional</title>
    <link>https://www.webmaster-forums.net/web-database-development/massive-daily-mysql-imports#comment-1234129</link>
    <description> &lt;blockquote&gt;&lt;p&gt;Well, unless it&#039;s a regional site, what&#039;s the middle of the night for you is the middle of the day for others. Smiling&lt;/p&gt;&lt;/blockquote&gt;
&lt;p&gt;It&#039;s a real-estate site for a local area, so there wouldn&#039;t be many international visitors.&lt;/p&gt;
&lt;blockquote&gt;&lt;p&gt;Are you using LOAD DATA LOCAL INFILE to load the data?&lt;/p&gt;&lt;/blockquote&gt;
&lt;p&gt;No...I wish I could, but the third party we are getting the data from has a habit of changing the format of the CSV files frequently, and without telling us.  I parse it using PHP&#039;s fgetcsv(), and then format it all for the database.&lt;/p&gt;
 </description>
     <pubDate>Sat, 04 Oct 2008 15:03:51 +0000</pubDate>
 <dc:creator>pr0gr4mm3r</dc:creator>
 <guid isPermaLink="false">comment 1234129 at https://www.webmaster-forums.net</guid>
  </item>
  </channel>
</rss>
