<div dir="ltr"><div><div><div>You don't need the PK to add a relation; you can do:<br><br><font face="FreeMono">cust =
Customers(name='GazMyas', city='Chelyabinsk')</font><br></div><font face="FreeMono">cust.licenses.add(License(x=1, y=2))<br>cust.licenses.add(License(x=3, y=4))</font><br><br></div>So: create the customer, then loop over all the licences for that customer, calling customer.licenses.add() for each one. Pony links the objects in memory and fills in the foreign key itself when the transaction commits, so your code never needs the PK.<br><br></div>You can then commit after each iteration, or just once at the end of the script; it's up to you.<br></div><div class="gmail_extra"><br><div class="gmail_quote">On 25 March 2015 at 08:28, Вадим Бажов <span dir="ltr"><<a href="mailto:vadim@ideco.ru" target="_blank">vadim@ideco.ru</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div text="#000000" bgcolor="#FFFFFF">
Hello, happy Pony users and devs!<br>
<br>
I'm importing from an old DB into a new one driven by Pony ORM. The
import iterates over customer records and their related licences
fetched from the old DB, so we have to fill two tables in the new
DB: a Customers table and a Licence table, linked by a one-to-many
relation (one Customer can have many Licences, referenced by PK).
The PK is an autoincrementing integer in both tables.<br>
<br>
Within <b>each iteration</b> of the import I need to write one
Customer record and its Licence records to the new DB. The Licence
records should have their customer_id fields filled with the
corresponding Customer PK. Here is the problem:<br>
<br>
If we create an instance of the Customers entity, <font face="FreeMono">cust =
Customers(name='GazMyas', city='Chelyabinsk')</font>, it won't be
written to the DB at that moment, and we won't be able to use its
ID (the record's PK) to bind Licence records to it. In other words,
there is no 'cust.id' attribute at that point.<br>
<br>
We can call commit() each time we create a Customer object; after
that, 'cust.id' is available. But for monstrous imports with over
20,000 records, committing every record slows the import down to
hours and hammers the server's hard drive. In any case, it's bad
practice to commit after every record while walking a huge array of
incoming data.<br>
<br>
So for now I fill two dicts: one with Customer records, incrementing
their IDs manually (customer_id += 1), and one with Licence records
bound to these pre-calculated IDs. When the dicts are ready, I walk
through them and write them to the new DB.<br>
<br>
Is there a better way to complete the import without filling up
intermediate dicts and calculating autoincrement IDs by hand? Can I
somehow use the records' PKs before committing them to the DB?<br>
</div>
<br>_______________________________________________<br>
ponyorm-list mailing list<br>
<a href="mailto:ponyorm-list@ponyorm.org">ponyorm-list@ponyorm.org</a><br>
<a href="/ponyorm-list" target="_blank">/ponyorm-list</a><br>
<br></blockquote></div><br><br clear="all"><br>-- <br><div class="gmail_signature">Regards,<br><br>Matthew Bell<br></div>
</div>