Import a CSV file to MySQL via phpMyAdmin

I was given access to an existing MySQL instance via phpMyAdmin. My task was to import a CSV file using phpMyAdmin's import feature. First, I went to the main phpMyAdmin page. Next, I clicked on the database I was interested in. Then I clicked on the table I wanted to import my data into and clicked the Import button. On the import form, I clicked the Browse button to choose the CSV file on my local hard drive. Since my CSV file uses "|" to separate each field and "\n" to separate each record, I chose the following options:

  • Chose "CSV using LOAD DATA"
  • Fields terminated by: |
  • Fields enclosed by: [remove the default value and leave it blank]
  • Fields escaped by: [remove the default value and leave it blank]
  • Left everything else at its default value.

Finally, I clicked the Go button. This resulted in the following SQL query. Note that I renamed values specific to my environment, so this is for reference only.

LOAD DATA LOCAL INFILE 'input_file' INTO TABLE `tablename` FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n'
# Affected rows: 1
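Before clicking Go, it can save a round trip to sanity-check the file itself. A minimal sketch in Python (the sample data and field count here are hypothetical, not from my actual file) that confirms every record in a pipe-delimited file has the same number of fields:

```python
import csv
import io

# Check that every record in a pipe-delimited file has the same
# number of fields before handing it to LOAD DATA.
def field_counts(text, delimiter="|"):
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    return {len(row) for row in reader if row}

sample = "1|Alice|alice@example.com\n2|Bob|bob@example.com\n"
print(field_counts(sample))  # {3}: every record has exactly 3 fields
```

If the resulting set contains more than one number, some record has a stray or missing delimiter and LOAD DATA would shift its columns.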

103 thoughts on "Import a CSV file to MySQL via phpMyAdmin"

  1. Your post helped get me to a solution. The flaw in my approach was not looking at my csv file in Notepad after I had exported it from Excel. Once I did that I could see what was after every field and line in the text document and put the same thing in the "fields terminated by" and "lines terminated by" boxes in the CSV using LOAD DATA option of phpMyAdmin (import), and it worked. I also had left the field names in the first row, which probably caused another error. I haven't figured out if there is a way to let the auto increment function of my primary key field populate, but I just added the numbers in Excel (starting from x+1, where x is the last number in the auto increment field) beforehand. Seems obvious now! Thanks.

  2. I have an issue regarding this. Actually I have two fields in my CSV file, i.e. album_id and genre_id, which have the values 8 5 9 8 respectively in their respective fields.
    When I map that data to my table [which already has two fields named album_id, genre_id] it's not mapping properly and not showing exactly those values [8 5 9 8] which are in the CSV file.

    Anyone help me plz

    Regards

  3. The example you provided appears to use only a single delimiter instead of two: one for separating records and one for separating fields. If that is the case, I would guess it might be a starting point to see if the issue can be resolved. Check out a few sample CSV files online.

  4. Thanks so much for your instructions. If I had found this yesterday it would have saved me hours of time. As it stands, you still saved me loads of work. One thing people might want to watch for is that if you have text strings with a "," in the cell, the export will enclose the field (in my case) with quotes, so watch for that.

    Cheers

  5. You, my friend, are wonderful.
    It wasn't working at all for me – I was scared I would have to manually add everything, argg!
    So, thank you!

  6. Hi Dave, I had to do the same and made a post about that in my blog. Now I search the net and find that you had already posted about it. However, as I rechecked, doing exactly what you said didn't work in my case. I had to fill in the 'column names' field too. The details, with an example, are at import csv to sql using phpmyadmin.

    Thanks for the nice effort you are putting to help the internet community.

  7. On the auto-increment field, when uploading a csv worksheet done in excel how do we allow for the auto-increment ID fields and the auto-date field?
    For example in SQL I have a table that asks for
    ID (auto incremented), Date (posts date and time of record creation automatically), name, etc…

    in the CSV file it begins with name, and leaves out of course the auto incremented ID and the date….

    Do we then precede the data with a 0,0, so that we at least have a field with 0 in it, so that the field is then anticipated? I mean, I’m guessing this is what we do… anyone have any ideas?

  8. Just thought I’d add a tip that I found following your instructions. After trying to import, and having it only do the first line, I figured that the “AUTO” setting for newline character must not like Mac branded newlines. I opened TextWrangler and changed the CSV to Windows new lines, and the import went through without any problems.
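    The fix above can also be scripted rather than done in an editor. A minimal Python sketch (not the commenter's actual method, just an illustration of the same transformation) that normalizes classic Mac and Windows line endings before import:

```python
# Normalize classic Mac (\r) and Windows (\r\n) line endings to
# Unix (\n) so a single "Lines terminated by" value matches every record.
def normalize_newlines(text):
    return text.replace("\r\n", "\n").replace("\r", "\n")

mac_csv = "a,b,c\rd,e,f\r"
print(normalize_newlines(mac_csv))
```

The order of the two replacements matters: converting `\r\n` first prevents a Windows ending from becoming two newlines.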

  9. I have a csv file; the field count and type of each field are correct, but the only problem is that the first field in my table is an auto generated number, whereas in the csv it is an integer. Is it possible to import without removing this auto generated field?

    Or is there any way to overwrite this field?

    Plz help

  10. I have been able to import the csv file to my table, but the table has a password column, and when I try to log in using the password mentioned in the imported csv file it says wrong password. When I look at the existing passwords in the mysql table they are in encrypted format, but the imported password is in plain format.

    Any help?
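    A likely explanation: the application hashes passwords before storing them, so plain-text passwords loaded straight from a CSV will never match. Which hash the application actually uses is an assumption you must verify; purely as an illustration, hashing each password with MD5 (common in older PHP apps) before import would look like:

```python
import hashlib

# Hash a plain-text password the way many older PHP apps store it.
# NOTE: MD5 here is an assumption; check what your application
# really uses (it may also add a salt) before importing.
def md5_hash(password):
    return hashlib.md5(password.encode("utf-8")).hexdigest()

print(md5_hash("secret"))  # 5ebe2294ecd0e0f08eab7690d2a6ee69
```

You would apply this to the password column of each CSV row before the import, so the stored values match what the login code computes.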

  11. Thank you for this. I have been working on importing excel to mysql for several days and this was the only good advice that I found.
    Thank you!!!

  12. Computers are picky. So much of this sounds so easy, but the structure and format of the data need to match up precisely. Dealing with computers here. Auto increment fields (like "ID") should be left blank in the imported data. MySQL wants to start where it left off auto-incrementing but doesn't let you know where to start.
    I found it helpful to export the MySQL table to a csv file. Include field names in the export (it's a checkbox on the export form) and open it in Excel.
    At times I changed the file extension to manipulate the text with find and replace in Notepad. I saved the Notepad (.txt) file and then renamed the file extension (.csv) before importing into MySQL.
    Bit of a pain, all this. When manipulating the text file it's easy to add extra quotes or get an extra character where it doesn't belong, which MySQL doesn't like much. I had a text field as the last field and they were all blank (empty), which it didn't like much either. Putting some text in there got the import to work.
    Good luck. Doing all this is good practice for dealing with incorrigible children. Develops patience.

  13. I too am struggling with the csv upload function on phpMyAdmin.

    I’m using the CSV using LOAD DATA option, with the following settings …

    Fields terminated by “,” <—- without the " "
    Fields enclosed by [left empty]
    Fields escaped by [left empty]

    And next is the problem field. I tried all sorts of combinations for getting the application to recognise line termination. I've tried "//r" and "\n" (all without the " ", of course) with no success. Following from @Jp's comments, when I look in the csv file with Smultron, I see the lines are terminated with the usual return symbol. So no joy there either.

    Then suddenly, success! I tried "\r" and the upload worked. Well, sort of. I get an error message. There's no actual message, however, just a white cross in a red box with the text "MySQL said: ?"

    @David, I notice in your script that you terminated the fields with "|". This seems counterintuitive. As it's a comma separated value file, shouldn't the fields be terminated with ","?

  14. Hi Guys, I have successfully imported a bunch of data into MySQL via PhpMyAdmin using import. I have 7 columns with approximately 1,500 records in it.

    Settings I have used that works for me are:

    1. Browse the location of the text file.
    2. Character set of the files to utf-8.
    3. For Partial Import, check "allow interrupt" if the file is not too large; otherwise don't check it.
    4. Select CSV using LOAD DATA
    5. Fields terminated by , {comma}
    6. Fields enclosed by {blank}
    7. Fields escaped by {blank}
    8. Lines terminated by {auto}
    9. Click GO button.

    Hope this will help guys.

  15. Just a short comment from me. I downloaded the updated version of phpMyAdmin, which now contains functionality to upload a variety of files (including .xls and .xlsx). I've had a little more success with this, though sometimes the upload process "hangs" (although the data has been uploaded).

    Thanks @Rolando. Every little bit helps …

  16. Thanks for the commentary. I too have been struggling with the CSV import. The phpMyAdmin doc is useless in this regard.

    Can I presume there is no such thing as field mapping where I can align the fields in my CSV file to columns in my SQL table?

  17. Hi,

    In phpMyAdmin I exported an Excel csv with headings from an osCommerce shop, some changes were made to the data, then I imported as you suggest. Used all defaults but I got this message:

    #1064 – You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'products_id,products_quantity,products_model,products_image,products_price,produ' at line 1

    There is also data showing, such as below.

    Can you offer any suggestions please?

    oz

    products_id,products_quantity,products_model,products_image,products_price,products_date_added,products_last_modified,products_date_available,products_weight,products_status,products_tax_class_id,manufacturers_id,products_ordered
    1,20,M2045G/A-g,No-Image1.gif,150,11/11/2005 21:57,15/01/2006 10:04,NULL,0,1,1,1,0
    2,20,92298A-g,No-Image1.gif,159.0909,11/11/2005 21:57,15/01/2006 9:25,NULL,0,1,1,1,0
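    A possible cause, worth checking: with all defaults, phpMyAdmin's import Format is SQL, so the header row of the CSV gets parsed as a SQL statement and fails with exactly this kind of #1064 error quoting the column names. Selecting "CSV using LOAD DATA" and stripping the header row first may help; a minimal Python sketch of the latter (the sample text is taken from the rows above):

```python
# Drop the first (header) line from CSV text so the column names
# are not imported as a data record.
def drop_header(csv_text):
    lines = csv_text.splitlines(keepends=True)
    return "".join(lines[1:])

sample = "products_id,products_quantity\n1,20\n2,20\n"
print(drop_header(sample))  # "1,20\n2,20\n"
```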

  18. NEWLINES WITHIN FIELDS

    I’m trying to import a dump of a database into MySQL, where the original dump contains \n within some fields. I want this to become an actual newline in the MySQL table (it will be used for textareas).
    Have spent a day now trying everything I can think of (using phpMyAdmin). The \n just appears as a literal in the table.
    Anyone solved this before?
    Thanks.
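    One approach, assuming it really is the literal two-character sequence backslash-n sitting in the fields: convert it to a real newline. Doing that before import would clash with LINES TERMINATED BY '\n', so the safer spot is after import; the transformation itself is just:

```python
# Replace the literal two-character sequence \n (backslash + n)
# with a real newline character.
def unescape_newlines(text):
    return text.replace("\\n", "\n")

field = "first line\\nsecond line"
print(unescape_newlines(field))
```

The post-import equivalent in MySQL would be roughly `UPDATE mytable SET mycol = REPLACE(mycol, '\\n', '\n');` (table and column names hypothetical), since `'\\n'` is a literal backslash-n and `'\n'` a newline in a MySQL string literal.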

  19. Hello Ian and the others,

    I prefer not to use phpMyAdmin to import CSV; I suggest using EMS MySQL Manager or another visual database tool, especially for field mapping.

    csv -> MySQL Manager -> Export as Insert -> Execute SQL from phpMyAdmin.

    And save a lot of time!

  20. Hi Dave,

    I have a question. I have a wordpress.com blog but recently purchased my own domain and hosting from godaddy. I have installed the WordPress Application on my godaddy hosting.

    So now I need to get all of the content (posts, comments, photos etc.) from my wordpress.com blog to my self hosted blog.

    I exported the wordpress.com XML file, but on the self hosted blog you can only upload CSV files in phpMyAdmin.

    How can I get all of my content from the wordpress.com blog to my selfhosted blog?

    I know people do this all the time and yet, I can't find any clear directions on the web, nor on wordpress.org.

    Your help would be greatly appreciated

  21. Thanks, you just saved me a lot of frustration. It’s a pity Excel doesn’t allow you to change the comma to another character when you export. It’s also a pity that phpmyadmin doesn’t have better built in help.

  22. I'm importing a CSV file with Portuguese chars like "António".

    I'm using phpMyAdmin.

    The only part that gets imported is "Ant".

    I need to import several files and I'm spending a lot of time fixing these errors because there are multiple entries with these kinds of chars:

    á à í ê ç …

    Any way around this?

    I changed the Char Map to latin, ascii, utf8 (default) and still nothing.

    Any help?

  23. Thanks, it is very helpful. I just saved my Excel file as a CSV file.

    I did the same, with the following settings …

    Fields terminated by “,”
    Fields enclosed by [blank]
    Fields escaped by [blank]

  24. Hi guys,

    I've got here a nice HowTo on making valid csv file syntax to prevent errors. Hope it's useful.

    Greetings

  25. Thanks. I’m doing a huge project that is requiring me to upload a huge database with about 1.2 million records (which will be individual posts). Each upload has about 6 custom fields, so it’s a slow process using the csv importer plugin (which I’ve been using thus far). The CSV Importer plugin is GREAT…however, not when you have as many posts to upload as I do. I appreciate the help. Thanks.

  26. Thanks Dave. Very helpful article. If your data contains commas in any of the fields (e.g. an address column with 123, Somewhere St, Greatville) then saving as a csv won’t work. Instead you can save as tab delimited text file and use ‘\t’ in the ‘Fields terminated by’ options box. Hope this helps someone.
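    The same tab-delimited trick can be scripted; a small Python sketch using the standard csv module (the sample rows are hypothetical), producing text you would import with '\t' in the "Fields terminated by" box:

```python
import csv
import io

# Write rows tab-delimited so embedded commas pass through untouched.
def to_tab_delimited(rows):
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    writer.writerows(rows)
    return buf.getvalue()

rows = [["1", "123, Somewhere St, Greatville"], ["2", "456 Oak Ave"]]
print(to_tab_delimited(rows))
```

With a tab delimiter, the csv writer no longer needs to quote values just because they contain commas.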

  27. Pingback: Designs Gallery
  28. I know this is a super old thread, but I solved my problem seconds after looking at it, so here is what I did.

    I was importing a CSV file with only one column of email addresses. There are no commas since it is one column, and no quotes since there are no spaces.

    The import kept failing until I specified the column name in the columns field.

  29. Some tips that worked for me:
    If your DB table auto increments, be sure to add an extra comma to the beginning of each line.
    If you have a date that auto populates, you'll have to manually add a column with a pre-filled date and time like "2011-09-15 13:30:10".
    I'm on a Mac and I too had a problem with phpMyAdmin recognizing line breaks, so I just added the pipe "|" to the end of each line and made "Lines terminated by" equal to "|".

    Hope this helps others
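    The first tip above can be automated. A Python sketch that prepends an empty field to each record so MySQL fills the auto-increment column itself (whether an empty value actually yields the next ID can depend on the column type and the server's sql_mode, so verify on a test table first):

```python
# Prepend an empty field so the first (auto-increment) column
# is left for MySQL to assign.
def prepend_empty_field(lines, delimiter=","):
    return [delimiter + line for line in lines]

data = ["Alice,2011-09-15 13:30:10", "Bob,2011-09-16 09:00:00"]
print(prepend_empty_field(data))
```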

  30. It's my first time using phpMyAdmin. I want to import some Excel files as tables. Is there a way to upload all the files automatically, instead of with the import button? Is there a way to append data to an existing table automatically?

  31. Thanks Dude. It's just that easy lol.

    I was trying a bunch of stuff on my own and I was futzing around with it. lol

    =)

  32. — Importing CSV with DATE column in mySQL —

    there aren't a lot of posts about this on Google so I'll add something here with some recent experience.

    mySQL v5.1.53 (from WAMP Server 2.0 – i think it’s actually 2.1i or something… )
    phpmyAdmin 3.3.9 (also from WAMP)

    problem:

    loading dates from CSV was giving me no end of problems.

    If you use the date format of yyyymmdd (e.g. 20111231) there's some issue with phpMyAdmin loading CSV (or maybe the mySQL loader, no idea) where any date with a month higher than 09 only gives you a date of 0000-00-00 in your database – no matter what the heck you do to try to overcome it.

    What I finally found out (by using the phpMyAdmin INSERT capability on the INSERT tab in a table with dates) was that the data *HAS* to be in the format yyyy-mm-dd. That's the only format that's going to load, either via SQL (INSERT INTO … VALUES …) or via CSV Import.

    here’s my test table:

    col 1 called “ID”: an autoincrement ID column (just to add a little challenge)
    col 2 called "DATE": a DATE field (not datetime, just DATE)
    Col 3 called “TEXT”: to emulate a bunch of text fields following

    Sample data in CSV (created in Win Notepad so it has Windows generated line feed/carriage returns; exporting out of Excel seems to do the same):

    ID,DATE,Text
    NULL,2011-11-21,this is my text
    NULL,2012-12-23,some more text
    NULL,2011-08-08,some list txt

    SIDEBAR:

    1. CSV approach: How to get those dates in Excel? Custom format of yyyy-mm-dd seemed to work fine for me…?
    2. SQL Approach: this is a bit trickier since my source data (more than 1000 rows with more coming all the time) was in Excel.
    – the ONLY way I could get INSERT INTO to work in mySQL was to use the "verbose" approach:

    INSERT INTO test_table(`ID`, `Date`, `SomeText`) VALUES
    ( NULL, '2011-11-21', 'this is my text' ) ,
    ( NULL, '2012-12-23', 'some more text' ) ,
    ( NULL, '2011-08-08', 'some list text' )

    THINGS to NOTE:

    — by "verbose" I mean listing the column names immediately following the table name
    — there can be no space between the table name and the opening parenthesis for the col names
    — IN ADDITION (this one drove me nuts for a while), the only delimiter I could get to work to surround the column names was the ACCENT MARK (` not ' – the accent mark is the one you get without shift on the upper left key below ESC on US keyboards). But then *only* the apostrophe (') would work to delimit field values (maybe someone can clue me in on all these issues I ran into??? – my workaround was to be extremely anal about the structure of the SQL statement pasted into phpMyAdmin)

    My data was coming from Excel (CSV mentioned above) so I wanted to use Excel to generate the SQL statements (I just needed each cell in my rows surrounded with the text accepted by mySQL in the "VALUES" clause of INSERT INTO). Excel was a bit trickier to get formulas to work to generate the text of the SQL code. Assuming you have three columns, A1 (no value; this is your ID column, which will be autogenerated by mySQL), B1 and C1 with the data above, my formula (somewhere else in the spreadsheet) to generate *text* SQL looked like this:

    ="(NULL, '"&TEXT(B1,"YYYY-MM-DD")&"', '"&C1&"') , "

    — NOTE the single ticks close to the double quotes… this is to delimit data as shown in the INSERT INTO statement above
    — NOTE the TEXT formula to get a standard date column in Excel to come out as TEXT; otherwise you get a numeric serial date in your text string. The second parameter of the TEXT formula ("YYYY-MM-DD") had to be in ALL CAPS in my experience to get it to spit out the text date string properly.

    Once I had a column of 1000+ "VALUES" rows from the formula above, I could just copy them into the SQL editor in phpMyAdmin and execute.
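    The date-reformatting step above can also be done in a few lines of Python instead of an Excel formula; a sketch (not the commenter's method) converting compact yyyymmdd strings into the yyyy-mm-dd form the comment found reliable:

```python
from datetime import datetime

# Convert a compact yyyymmdd date (e.g. "20111231") into the
# yyyy-mm-dd form that loads cleanly into a MySQL DATE column.
def to_mysql_date(compact):
    return datetime.strptime(compact, "%Y%m%d").strftime("%Y-%m-%d")

print(to_mysql_date("20111231"))  # 2011-12-31
```

strptime also rejects impossible dates (e.g. month 13) instead of silently producing 0000-00-00, which makes bad rows easy to spot before import.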

  33. Biru32 responding. Critical piece of info that helped me and hopefully will help someone else.

    I saved my CSV file as a “Windows Comma Separated” csv file. This made all the difference between success and failure.

    Also I selected “Replace table data with file”.

    Everything I read above is accurate.

  35. Hey Dave, I have 3 sheets in the one file and want them to appear as three separate tables in MySQL. Will this method work OK?

  36. I am using phpMyAdmin and did exactly as you said, but SQL is producing the error "Invalid parameter for CSV import: Fields escaped by". So what should I do?

  38. My problem is this: my db imports data in different shapes and words… why is this?

    load data local infile 'D:\HOSTEL LISTS FROM DEC,12.csv' into table excel
    fields terminated by ','
    enclosed by '"'
    lines terminated by '\n'
    (id,name,father)

  42. I know this is an old article, but found it today.

    I have a local database that I export to CSV and import into my mySQL database about once a week. It has been working fine for a year the way I have been doing it. Exporting from Access, I have | for the separator and ^ for the text. (there are some quote characters inside the text so I used the ^) This has worked just fine for a year.

    Today, when I tried to import new data into an existing db, it created a new table instead.

    I can’t for the life of me figure out what is going on.

    I tried your method above, and omitted the ^ and blanked out the other options on the import, and it tells me that a blank is not a valid value.

    Thanks,

    Kirk

  46. I did the same, with the following settings …
    by selecting CSV using LOAD DATA

    Fields terminated by “,”
    Fields enclosed by [blank]
    Fields escaped by [blank]

    This is working fine BUT I face one problem: the Excel data is inserted twice. Why does this happen?

  50. Hi, I appreciate your blog. I am having problems inserting my tab delimited text file into my table. When I go through the steps the file loads; however, it only shows info in one column.
