[PonyORM-list] Pony's Cache Size Limit?

Alexander Kozlovsky alexander.kozlovsky at gmail.com
Sat Nov 8 13:19:06 UTC 2014


Hi Matthew,

If you retrieve these objects to render some HTML page, then I doubt that
you really need so many of them. I think that a typical web page requires
several thousand objects at most, and that amount is not big enough to
cause memory problems.

> each time the inner function is called, it returns a data structure full
of Pony objects (but a lot fewer Pony objects than the function internally
used)

Ideally, your function should retrieve only the objects that are actually
required to render the Jinja template. If the function retrieves many more
objects but then uses only a small fraction of them, it is probably written
inefficiently. You should do the object filtering not in memory, but in
the initial database query.
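For example, something along these lines (just a sketch, since I have not
seen your schema - the Keyword entity and its "active" attribute below are
made up):

from pony.orm import Database, Required, db_session, select

db = Database('sqlite', ':memory:')

class Keyword(db.Entity):
    name = Required(str)
    active = Required(bool, default=True)

db.generate_mapping(create_tables=True)

with db_session:
    # The condition is translated to SQL, so only the matching rows are
    # fetched and materialized as Keyword objects; nothing is loaded
    # just to be thrown away later in Python.
    keywords = select(k for k in Keyword if k.active)[:]

This way the cache only ever contains the objects the template actually needs.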

If you can show me the database query, then I can probably suggest a better
way to do the filtering, so that only the required objects are materialized
in memory.

Regards,
Alexander


On Fri, Nov 7, 2014 at 5:40 PM, Matthew Bell <matthewrobertbell at gmail.com>
wrote:

> I think the problem is that each time the inner function is called, it
> returns a data structure full of Pony objects (but a lot fewer Pony objects
> than the function internally used). I could pull out the attributes of the
> objects that I need (to inject into a Jinja template, but not for a
> webapp), but that seems hacky. I could pickle / just return the IDs of the
> objects, but that also seems hacky.
>
> The relevant part of the Jinja template looks like:
>
> {% for keyword in keywords %}
>    {{ keyword.expensive_method() }}
> {% endfor %}
>
> Do you see any solutions?
>
> Thanks
>
> On 6 November 2014 13:05, Alexander Kozlovsky <
> alexander.kozlovsky at gmail.com> wrote:
>
>> Hi Matthew,
>>
>> > Does Pony's in-memory cache have a size limit?
>>
>> Currently no, but we can think about it. It may not be so easy to
>> implement, because objects in the cache have cross-relations.
>> Maybe we can add some method like db_session.forget(MyEntity) to clear
>> all unmodified entities of this type from the cache. Would this help in your
>> case?
>>
>> >  I have a function which does a lot of queries within one transaction
>>
>> Are you sure that in your case all those queries must be done in the same
>> transaction? If not, wrap the different queries in separate db_sessions,
>> and the old objects will be cleared automatically.
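>>
>> Something like this, reusing the pseudo code from your message (a rough
>> sketch; some_function_which_queries_lots is your function):
>>
>> from pony.orm import db_session
>>
>> results = []
>> for i in range(5):
>>     # Each iteration gets its own db_session, so the identity map
>>     # (cache) built during that iteration is released when the session
>>     # ends, instead of accumulating across the whole loop. Read the
>>     # attributes you need from the returned objects while the session
>>     # is still open.
>>     with db_session:
>>         results.append(some_function_which_queries_lots(i))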
>>
>>
>> On Thu, Nov 6, 2014 at 3:43 PM, Matthew Bell <matthewrobertbell at gmail.com
>> > wrote:
>>
>>> Hi,
>>>
>>> Does Pony's in-memory cache have a size limit? I have a function which
>>> does a lot of queries within one transaction, dereferencing > 99% of the
>>> Pony objects, but the memory used by the process grows and grows as the
>>> function goes through its loops.
>>>
>>> Pseudo code:
>>>
>>> results = []
>>> for i in range(5):
>>>   results.append(some_function_which_queries_lots(i))
>>>
>>> Thanks!
>>>
>>> --
>>> Regards,
>>>
>>> Matthew Bell
>>>
>>
>
>
> --
> Regards,
>
> Matthew Bell
>

