Well, Wi-Fi wrapped in a VPN will be slower than more direct access (local
Ethernet or localhost). I still think breaking the query apart is a good
idea, though, so you can run it in multiple DataContexts and keep the
memory footprint down (which may not be an issue for you).
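
An untested sketch of what I mean, against the 3.0-era API. "MyEntity" and the
"ID" primary key column are placeholders for your own mapping, the batch size
is arbitrary, and it assumes a numeric primary key (adapt the ranges to how
your keys are actually distributed):

import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.exp.Expression;
import org.apache.cayenne.exp.ExpressionFactory;
import org.apache.cayenne.query.SelectQuery;

public class BatchMaintenance {

    public static void main(String[] args) {
        int batchSize = 1000;

        // ~683k rows total; the upper bound here is just illustrative
        for (long start = 0; start < 683000; start += batchSize) {
            // fresh context per batch so the cached object graph stays small
            DataContext context = DataContext.createDataContext();

            // qualify on the PK column directly so it doesn't need to be
            // mapped as an object attribute ("ID" is a placeholder name)
            Expression range = ExpressionFactory.betweenDbExp(
                    "ID", start, start + batchSize - 1);

            SelectQuery query = new SelectQuery(MyEntity.class, range);

            List batch = context.performQuery(query);
            for (Object o : batch) {
                MyEntity e = (MyEntity) o;
                // ... do the maintenance work on e ...
            }

            context.commitChanges();
        }
    }
}
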
mrg
On Sun, Apr 12, 2009 at 11:13 PM, Paul Logasa Bogen II <pl..amu.edu> wrote:
> Well, this helped; I think the problem is my remote dev environment. The same
> query run locally on the MySQL server takes only a couple of seconds. I
> think connecting through Wi-Fi wrapped in a VPN connection is slowing it
> down. I'm going to try running my whole program on the MySQL server and see
> if that solves the problem.
>
> plb
>
> Aristedes Maniatis wrote:
>>
>> On 13/04/2009, at 10:42 AM, Paul Logasa Bogen II wrote:
>>
>>> I have a table with ~683k entries in it. I have a maintenance task that
>>> requires me to hit every entry. Since I know the results are much too big to
>>> return all at once, I've set the page size to 50. However, Cayenne appears
>>> to be attempting the full query first before falling back to paging. The
>>> fallback takes nearly half an hour, at which point everything works
>>> flawlessly. Is there a way to force the query to be paginated or to reduce
>>> the timeout on a query?
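>>
>> For reference, paging is normally just a property set on the query before it
>> runs. A minimal, untested sketch against the 3.0-era API, with "MyEntity"
>> standing in for your mapped class and "context" for an existing DataContext:
>>
>> SelectQuery query = new SelectQuery(MyEntity.class);
>> query.setPageSize(50); // only the IDs should be fetched up front
>>
>> List results = context.performQuery(query);
>> for (Object o : results) {
>>     MyEntity e = (MyEntity) o;
>>     // each page of 50 objects is resolved lazily as the list is iterated
>> }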
>>
>> We use paged queries on tables of 150,000 records, and the first query, which
>> grabs the primary keys, takes only about 2-3 seconds. I think you should
>> increase the logging level and figure out what is happening for those 30
>> minutes.
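>>
>> Assuming a log4j backend, bumping the Cayenne query logger to DEBUG should
>> show the generated SQL and its timing. The logger name below is the 3.0-era
>> one and may differ in other versions; untested:
>>
>> // requires log4j on the classpath
>> org.apache.log4j.Logger
>>         .getLogger("org.apache.cayenne.access.QueryLogger")
>>         .setLevel(org.apache.log4j.Level.DEBUG);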
>>
>> Ari Maniatis
>>
>>
>> -------------------------->
>> ish
>> http://www.ish.com.au
>> Level 1, 30 Wilson Street Newtown 2042 Australia
>> phone +61 2 9550 5001 fax +61 2 9550 4001
>> GPG fingerprint CBFB 84B4 738D 4E87 5E5C 5EFA EF6A 7D2E 3E49 102A
>>
>
>