[PonyORM-list] get PK of the record before committing it to DB
Вадим Бажов
vadim at ideco.ru
Wed Mar 25 09:54:52 UTC 2015
That's what I need! Clean and simple.
Thanks a lot, Matthew!
On 25.03.2015 11:03, Matthew Bell wrote:
> You don't need the PK to add a relation; you can do:
>
> cust = Customers(name='GazMyas', city='Chelyabinsk')
> cust.licenses.add(License(x=1, y=2))
> cust.licenses.add(License(x=3, y=4))
>
> So, create the customer, then loop over all licenses for that
> customer, doing customer.licenses.add
>
> You can then commit after each iteration, or just once at the end of
> the script, up to you.
>
> On 25 March 2015 at 08:28, Вадим Бажов <vadim at ideco.ru> wrote:
>
> Hello, happy Pony users and devs!
>
> I'm doing an import from an old DB to a new one driven by Pony ORM.
> The import iterates over customer records and their related
> licence records fetched from the old DB. So we have to fill two tables
> in the new DB: a Customers table and a Licences table, with a
> one-to-many relation (one Customer can have many Licences, by PK).
> The PK is an auto-incrementing integer in both tables.
>
> Within *each iteration* of the import I need to write one Customer
> record and its Licence records to the new DB. The Licence records
> should have their customer_id fields filled with the corresponding
> Customer PKs. Here we have a problem:
>
> If we create an object of the Customers entity: cust =
> Customers(name='GazMyas', city='Chelyabinsk'), it won't be written
> to the DB at that exact moment, and we won't be able to use its ID
> (the PK of the record) to bind Licence records to it. In other words,
> there is no 'cust.id' attribute at that moment.
>
> We can do a 'commit()' each time we create a Customer object;
> after that, 'cust.id' is ready. But in the case of huge imports
> with over 20,000 records, committing every record to disk slows
> the import down to hours and hammers the server's hard drive.
> In any case, it's bad practice to commit every record while walking
> a huge array of incoming data.
>
> So, for now, I fill two dicts: one with Customer records, whose IDs
> I increment manually (customer_id += 1), and one with Licence
> records bound to those pre-calculated IDs. When the dicts are ready,
> I walk through them and write them to the new DB.
>
> Is there a better way to complete a data import without filling
> intermediate dicts and calculating auto-increment IDs by hand? Can I
> somehow use the PKs of records before committing them to the DB?
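For reference, the staged-dicts workaround described above can be sketched in pure Python like this; the example rows and field names are made up for illustration:

```python
# Manual-ID staging: assign auto-increment PKs by hand, stage both
# tables in memory, then write everything to the new DB in one pass.
old_rows = [
    ("GazMyas", "Chelyabinsk", [(1, 2), (3, 4)]),
    ("NefteKhim", "Omsk", [(5, 6)]),
]

customers = {}   # pre-calculated customer_id -> customer fields
licenses = []    # licence rows carrying the pre-calculated customer_id
customer_id = 0

for name, city, licence_rows in old_rows:
    customer_id += 1                     # hand-rolled auto-increment
    customers[customer_id] = {"name": name, "city": city}
    for x, y in licence_rows:
        licenses.append({"customer_id": customer_id, "x": x, "y": y})

# When the dicts are ready, walk them and bulk-write to the new DB.
```

This works, but it duplicates the database's own auto-increment logic in application code, which is exactly what the reply above avoids.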
>
> _______________________________________________
> ponyorm-list mailing list
> ponyorm-list at ponyorm.com
> /ponyorm-list
>
>
>
>
> --
> Regards,
>
> Matthew Bell
>
>