This title is a bit misleading. I know what you're thinking...
Great, this guy has a way for me to define a Django model, then copy in an entire database--- constraints, defaults, foreign keys and all!
Well, no, I don't have a great way to do that. Is it possible? Probably. But I don't know how to do it.
I have some tables whose relationship structure I want to preserve. So when I add tables in my Django project (
models.py) and sync up to the database, I need to disable the automatically generated default on the primary key field, so I can insert the records into the database via the
\COPY command. Finally, I need to reinstate the auto-incrementing behavior of the id field by restoring its sequence default.
So, first,
connect to the database using the method described in a previous post.
Then create the database model in
models.py. In a complex data model situation, I like to bring the tables in one at a time. That means creating a class in
models.py, then syncing to the database one table (or collection of related tables) at a time.
Let's add the
States class, which is simply a listing of the United States. These will be used in conjunction with the
previously created Zipcodes class:
class States(models.Model):
    abbr = models.CharField(max_length=2)
    name = models.CharField(max_length=35)
    modifieduser = models.CharField(max_length=50, default='admin')
    modifieddate = models.DateField(auto_now=True)
In the PowerShell window, let's sync this up to the database.
(django-jay) PS C:\Python27\home\jtarlecki\djcode\django-jay\rebates> python manage.py syncdb
This will hopefully print some confirmation text saying that the table has been created.
Creating tables ...
Creating table rebate_states
Installing custom SQL ...
Installing indexes ...
Installed 0 object(s) from 0 fixture(s)
Now, connect to that database with psql and drop the default on the id field. Basically, the idea here is to change it from a SERIAL field to a simple INTEGER field. In PostgreSQL, SERIAL is just an INTEGER column whose default pulls from a sequence, so dropping the default removes the auto-increment behavior (the column stays the primary key) and lets us supply explicit id values.
rebates=> ALTER TABLE rebate_states ALTER COLUMN id DROP DEFAULT;
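As background (this is a PostgreSQL fact, not something specific to Django): SERIAL is not a real column type, just shorthand for an INTEGER column wired to a sequence. A small helper -- the function name is mine, purely for illustration -- that spells out what the shorthand expands to:

```python
def serial_expansion_sql(table, col='id'):
    """Return the statements that declaring `col SERIAL` is shorthand for.

    Dropping the DEFAULT (as above) is what turns the column back into a
    plain INTEGER; the PRIMARY KEY constraint itself is untouched.
    """
    seq = '{0}_{1}_seq'.format(table, col)
    return [
        'CREATE SEQUENCE {0};'.format(seq),
        "ALTER TABLE {0} ALTER COLUMN {1} SET DEFAULT nextval('{2}');".format(table, col, seq),
        'ALTER SEQUENCE {0} OWNED BY {1}.{2};'.format(seq, table, col),
    ]
```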
Great-- with the default dropped, now let's copy in all the data.
jaytarle_rebates=> \COPY rebate_states FROM C:\Users\Dorothy\Desktop\states.csv
WITH CSV HEADER;
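If you'd rather script this step than type it into psql, psycopg2's `copy_expert` does the client-side equivalent of `\COPY` -- it streams the file over the connection, so the server never needs to read the client's disk. A sketch, with the connection details left as placeholders:

```python
def copy_sql(table):
    # COPY ... FROM STDIN is the client-side form that \COPY wraps.
    return 'COPY {0} FROM STDIN WITH CSV HEADER'.format(table)

def load_csv(conn, table, path):
    # conn is an open psycopg2 connection; copy_expert reads the local
    # file object and streams its rows to the server.
    with open(path) as f:
        conn.cursor().copy_expert(copy_sql(table), f)
    conn.commit()

# Usage sketch (placeholder connection string):
#   import psycopg2
#   conn = psycopg2.connect('dbname=... user=...')
#   load_csv(conn, 'rebate_states', r'C:\Users\Dorothy\Desktop\states.csv')
```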
Now, we make the id field auto-increment again by pointing its default back at a nextval() sequence. First, let's look at the syntax nicely formatted.
DROP SEQUENCE rebate_states_id_seq;
CREATE SEQUENCE rebate_states_id_seq
MAXVALUE 2147483647;
ALTER TABLE rebate_states
ALTER COLUMN id TYPE INTEGER;
ALTER TABLE rebate_states
ALTER COLUMN id SET DEFAULT nextval('public.rebate_states_id_seq'::text);
Now, just squish all that into one line so you can execute from the psql shell:
jaytarle_rebates=> DROP SEQUENCE rebate_states_id_seq; CREATE SEQUENCE rebate_states_id_seq MAXVALUE 2147483647; ALTER TABLE rebate_states ALTER COLUMN id TYPE INTEGER; ALTER TABLE rebate_states ALTER COLUMN id SET DEFAULT nextval('public.rebate_states_id_seq'::text);
From which we get the following confirmation:
DROP SEQUENCE
CREATE SEQUENCE
ALTER TABLE
ALTER TABLE
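One caveat that confirmation text hides: a freshly created sequence starts at 1, so the next row Django inserts will try to reuse an id that \COPY already loaded, and the insert will fail with a duplicate-key error. PostgreSQL's setval() fixes this by fast-forwarding the sequence past the current maximum. A sketch of the statement (the helper name is my own):

```python
def resume_sequence_sql(table, col='id'):
    # setval() advances the sequence so the next nextval() lands just
    # past the largest id that \COPY loaded into the table.
    return ("SELECT setval('{0}_{1}_seq', (SELECT MAX({1}) FROM {0}));"
            .format(table, col))
```

Run the resulting statement once in psql after recreating the sequence, before letting Django insert anything new.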
And, voila... our records are in the database and synced back up with our Django project.
Did I copy in a whole database? Well, no. I need to do this process for every table. I know, it sucks. Perhaps there is a better way...
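The "better way" may simply be scripting the repetition: the four statements above are identical for every table except the name, so they can be templated and replayed per table over a psycopg2 connection. A sketch -- the helper and the table list are hypothetical, and it's worth running inside a transaction so a typo doesn't leave a table half-converted:

```python
def rebuild_id_sequence_sql(table):
    # The same four statements run by hand above, templated per table.
    seq = '{0}_id_seq'.format(table)
    return [
        'DROP SEQUENCE {0};'.format(seq),
        'CREATE SEQUENCE {0} MAXVALUE 2147483647;'.format(seq),
        'ALTER TABLE {0} ALTER COLUMN id TYPE INTEGER;'.format(table),
        "ALTER TABLE {0} ALTER COLUMN id SET DEFAULT nextval('{1}');".format(table, seq),
    ]

# Usage sketch: replay for every table you \COPY'd into
# (hypothetical table list; conn is an open psycopg2 connection):
#   cur = conn.cursor()
#   for table in ['rebate_states', 'rebate_zipcodes']:
#       for stmt in rebuild_id_sequence_sql(table):
#           cur.execute(stmt)
#   conn.commit()
```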