Last modified: 2014-06-30 15:27:26 UTC
When the same module, each calling getEntity() once, is used N times on the same page, the profiler shows the time for getEntity() multiplied by N in the total. This time could be reduced by caching the result of getEntity() across all modules and Wikidata templates on a page. getEntity() also accounts for by far the largest share of the total time.
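The proposed fix is essentially memoization: N invocations on one page should trigger only one expensive entity fetch. A minimal sketch in Python (not the actual Scribunto/Wikibase code; `fetch_entity` is a hypothetical stand-in for the expensive load step):

```python
# Hypothetical per-page cache, keyed by entity ID. In the real system this
# would live for the duration of one parse, shared by all modules on the page.
_entity_cache = {}

def fetch_entity(entity_id):
    """Stand-in for the expensive load-and-serialize step."""
    return {"id": entity_id, "labels": {}, "claims": {}}

def get_entity(entity_id):
    # Only the first call per page pays the full cost; the rest hit the cache.
    if entity_id not in _entity_cache:
        _entity_cache[entity_id] = fetch_entity(entity_id)
    return _entity_cache[entity_id]
```

With this shape, the profiled time for getEntity() stops scaling with the number of modules that ask for the same entity.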
getEntity is provided by one of the Wikidata extensions, not by Scribunto itself, so reassigning accordingly. Be careful not to introduce a regression of the kind tracked in bug 65258 when implementing this cache.
As far as I am aware, getEntity should be handled via a CachingEntity(Revision)Lookup, which does caching (two layers, actually: in-process and memcached). That's how it's supposed to work, anyway. Is there evidence that this is not the case?
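The two-layer arrangement described here can be sketched as follows; this is an illustrative Python model, not the Wikibase PHP implementation, and `shared_cache` / `store` are hypothetical stand-ins for memcached and the backing storage:

```python
class CachingEntityLookup:
    """Two-layer entity lookup: in-process dict first, then a shared
    cache (memcached-like), then the backing store."""

    def __init__(self, shared_cache, store):
        self._local = {}             # in-process layer, per request
        self._shared = shared_cache  # dict-like stand-in for memcached
        self._store = store          # callable: entity_id -> entity

    def get(self, entity_id):
        # Layer 1: in-process.
        if entity_id in self._local:
            return self._local[entity_id]
        # Layer 2: shared cache, falling back to the store on a miss.
        entity = self._shared.get(entity_id)
        if entity is None:
            entity = self._store(entity_id)
            self._shared[entity_id] = entity
        self._local[entity_id] = entity
        return entity
```

A second lookup instance (e.g. a later request on another web server sharing the same cache) would hit the shared layer and never touch the store.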
On second thought: the Lua version of getEntity needs to convert the entity to a Lua object (or meta-table or whatever). I don't know how fast or slow that is. Perhaps we could also cache the Lua version of the entity. Can we cache objects between Lua invocations in a sane way?
(In reply to Daniel Kinzler from comment #3)
> On a second thought: the Lua version of getEntity needs to convert the
> entity to a Lua object (or meta-table or whatever). I don't know how fast or
> slow that is. Perhaps we could also cache the Lua version of the Entity. Can
> we cache objects between Lua invocations in a sane way?

We do in-memory caching in Lua as far as I remember, but there's no way we can store data there any longer than that. Most of the time, when Lua requests an entity it actually only needs a label. I already talked to Aude about this: we should have a cached label lookup for Lua, the parser functions, and the changes-list code, so that labels can be loaded without loading the full entities.
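The cached label lookup suggested here would cache only (entity ID, language) → label pairs, so the common case never deserializes a whole entity. A hedged Python sketch of that idea (`load_label` is a hypothetical backend callable, not an existing Wikibase API):

```python
class CachedLabelLookup:
    """Caches only (entity_id, language) -> label, so callers that need
    just a label never trigger a full entity load."""

    def __init__(self, load_label):
        self._cache = {}
        self._load_label = load_label  # callable: (entity_id, lang) -> str

    def get_label(self, entity_id, lang):
        key = (entity_id, lang)
        if key not in self._cache:
            self._cache[key] = self._load_label(entity_id, lang)
        return self._cache[key]
```

Lua, the parser functions, and the changes list could all share one such lookup, each cache entry being a short string rather than a full entity structure.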
@Marius: we have a caching TermIndex service. The interface isn't pretty, but perhaps it could be polished for use in Lua.