Convert a FileMaker database to MySQL without a Mac
A friend of mine studies Chinese medicine and had painstakingly entered a decent amount of data into a FileMaker database on his Mac. He wanted some help deploying a Web-based interface, and I agreed to help on the condition that we move the data to MySQL, as I don't have any skills in the Mac world. Another problem was that he no longer HAD his Mac, so he did not have a copy of the FileMaker software, just the .fp7 files holding the data.
Using a 2-week trial of FileMaker on Windows, and some Python coding with the BeautifulSoup package, I was able to convert the FileMaker database to MySQL. The following is an overview of the steps I followed. I preface this with the fact that I do not really know the FileMaker software, so there may be an easier way to accomplish these steps, but this method worked well enough for a one-time project.
Export each table in FileMaker in XML format
I first downloaded the 2-week free trial of FileMaker Pro for Windows to see what I could do with the files I had. The interface is quite intuitive, so I was able to open and look at any of the tables. My first pass at a solution was to export a table to a CSV file and then import it into MySQL. The problem I encountered was that this captured only the table data (in the CSV) and not the table structure. I could find no way to export a table's structure as SQL, so I would have had to manually build each table definition on the SQL side and then import the CSV. This might not be too bad for a single table, but I was looking at about 50 tables, and that would have been too laborious.
With a little Googling, I realized there was an XML option, which produces an XML formatted file containing both the table structure and the data. Below is the beginning of the XML produced for a table "dx", which has 3 columns and 586 rows; note that the last half contains the data associated with the first two rows of the table.
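The export follows FileMaker's FMPXMLRESULT grammar. What follows is a sketch of its shape rather than the literal file; the column names (dx_id, dx_name, dx_note) and data values are hypothetical stand-ins:

<?xml version="1.0" encoding="UTF-8"?>
<FMPXMLRESULT xmlns="http://www.filemaker.com/fmpxmlresult">
  <ERRORCODE>0</ERRORCODE>
  <DATABASE DATEFORMAT="M/d/yyyy" NAME="dx.fp7" RECORDS="586" TIMEFORMAT="h:mm:ss a"/>
  <METADATA>
    <FIELD EMPTYOK="YES" MAXREPEAT="1" NAME="dx_id" TYPE="NUMBER"/>
    <FIELD EMPTYOK="YES" MAXREPEAT="1" NAME="dx_name" TYPE="TEXT"/>
    <FIELD EMPTYOK="YES" MAXREPEAT="1" NAME="dx_note" TYPE="TEXT"/>
  </METADATA>
  <RESULTSET FOUND="586">
    <ROW MODID="0" RECORDID="1">
      <COL><DATA>1</DATA></COL>
      <COL><DATA>some diagnosis text</DATA></COL>
      <COL><DATA>a longer free form note</DATA></COL>
    </ROW>
    <ROW MODID="0" RECORDID="2">
      <COL><DATA>2</DATA></COL>
      <COL><DATA>more diagnosis text</DATA></COL>
      <COL><DATA></DATA></COL>
    </ROW>
  </RESULTSET>
</FMPXMLRESULT>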
Well, that’s all well and good, but how do we get from here to a MySQL table?
Using Python and Beautiful Soup to create SQL
The XML file contains the information needed to define and create the table as well as the data to populate it, but not in any convenient form for doing so. Using the Beautiful Soup package to parse the XML, I wrote a Python program that creates a .sql file containing the DDL to create the table as well as one INSERT statement for each row represented in the XML file.
I had to make a few assumptions to create the table structures. As the XML above shows, the columns are defined as either TEXT or NUMBER. This was consistent throughout the database; I had no particularly complex datatypes to deal with, and every number I was dealing with was in fact an integer. As a first pass at table creation I mapped to MySQL datatypes as follows (a sketch of the mapping in code appears after the list):
I mapped all ‘NUMBER’ columns in FileMaker to an INT column in MySQL.
Any TEXT column in FileMaker became a VARCHAR(100) in MySQL. There was one exception: if the column name included the word ‘note’ in FileMaker, it was a note field, and I made it a VARCHAR(2000) to accommodate a longer free-form note.
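Expressed as a standalone helper, the mapping boils down to a few lines (a sketch with a hypothetical function name; the full script below inlines the same logic):

def mysql_type(fm_type, fm_name):
    # Map a FileMaker column TYPE (and name) to a MySQL column type
    if fm_type == 'NUMBER':
        return 'int(10)'
    if 'note' in fm_name:
        return 'varchar(2000)'  # note fields get room for free form text
    return 'varchar(100)'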
Here is the SQL corresponding to the XML above, creating the table and INSERTing the first 2 rows shown.
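Using the same hypothetical column names and data as the XML sketch above, the generated .sql file looks roughly like this:

CREATE TABLE IF NOT EXISTS dx ( 
dx_id int(10) DEFAULT NULL,
dx_name varchar(100)  DEFAULT NULL,
dx_note varchar(2000)  DEFAULT NULL 
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

insert into dx values(1,"some diagnosis text","a longer free form note");
insert into dx values(2,"more diagnosis text",Null);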
With a little iteration, I was able to loop through the directory containing all the XML files, create a corresponding SQL file for each one, and build the populated MySQL tables.
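That outer loop is not shown in the listing below, but a wrapper along these lines covers the roughly 50 files (a sketch; build_sql_for_table is a hypothetical function wrapping the per-table logic):

import glob
import os

for xml_path in glob.glob('*.xml'):
    # Derive the table name from the XML file name, e.g. 'dx.xml' -> 'dx'
    table = os.path.splitext(os.path.basename(xml_path))[0]
    build_sql_for_table(table)  # hypothetical wrapper around the script below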
Here’s the code used to produce the SQL for the dx table shown in the examples above:
from bs4 import BeautifulSoup as Soup4
file_table_name = 'dx'
#Literals for building SQL
cre_db_txt = 'CREATE TABLE IF NOT EXISTS '
int_txt = ' int(10)'
vc_txt = ' varchar(100) '
vc_long_txt = ' varchar(2000) '
null_txt = ' DEFAULT NULL'
end_cre_txt = ') ENGINE=InnoDB DEFAULT CHARSET=latin1;'
insert_lit = 'insert into '
# We will use this when building INSERT statements to determine if they need ticks for character data
col_type_list = []
#Open output SQL file
f = open('create' + file_table_name + '.sql', 'w')
f.write(cre_db_txt + file_table_name + ' ( ' + '\n')
# Read entire XML file into one string
with open(file_table_name + '.xml', 'r') as xmlfile:
    xmldata = xmlfile.read()
soup=Soup4(xmldata,"xml")
dbnm = soup.find('DATABASE')
md = soup.find('METADATA')
fd_list = md.find_all('FIELD')
fd_len = len(fd_list)
fd_sep = ','
for i, fd in enumerate(fd_list):
    if fd['TYPE'] == 'NUMBER':
        col_type_list.append('NUMBER')
        fd_type = int_txt
    elif fd['TYPE'] == 'TEXT':
        col_type_list.append('TEXT')
        if 'note' in fd['NAME']:  # a column whose name contains 'note' needs to be long
            fd_type = vc_long_txt
        else:
            fd_type = vc_txt
    # No comma after the last field
    if i == fd_len - 1:
        fd_sep = ' '
    f.write(fd['NAME'] + fd_type + null_txt + fd_sep + '\n')
# Finish up the table create statement
f.write(end_cre_txt + '\n')
# Now parse data and write out INSERTS
rdata = soup.find('RESULTSET')
for rd in rdata.find_all('ROW'):
    in_text = '\n' + insert_lit + file_table_name + ' values('
    rd_list = rd.find_all('COL')
    rd_len = len(rd_list)
    rd_sep = ','
    for j, cd in enumerate(rd_list):
        if j == rd_len - 1:
            rd_sep = ');'
        if cd.find('DATA').string is None:
            tag_val = 'Null'
        else:
            if col_type_list[j] == "NUMBER":
                if cd.find('DATA').string == 'n/a':
                    tag_val = '0'  # treat 'n/a' in a numeric column as 0
                else:
                    tag_val = cd.find('DATA').string
            else:  # Must enclose in quotes if data is a string -- stripping some special characters
                tag_val = '"' + cd.find('DATA').string.replace(u"\u2018", "'").replace(u"\u2019", "'").replace('/', '').replace('"', "'") + '"'
        in_text = in_text + tag_val + rd_sep
    f.write(in_text)
f.close()
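With the .sql files in hand, loading each table is just a matter of feeding the file to the mysql command line client; for example, assuming a target database (here called cm, a hypothetical name) has already been created:

mysql cm < createdx.sql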