
Limit Error passing a large table

hoffmanroni
New Contributor III
Hello,

I am trying to pass a large table between q sessions and I am getting a limit error.  I know I need to chunk it into pieces but I can't get the syntax correct.

I have a connection handle h, and I'm trying to retrieve a `trade table from there and insert it into a different session.

i.e. q) trade: h "select from trade where date=2015.05.01"
     q) 'limit


I know I need to cut the trade table into chunks and insert them, but I am not sure how to write this. Any help?

Thanks!

9 REPLIES

Tom_Martin
New Contributor
Hi Roni

You could get the size of the table and then chunk up the indices

q)s:h"exec first x from select count i from product where date=2015.08.06"
q)s
15521604

This function takes the total row count and a chunk size and returns the necessary (start;end) index pairs:

q)c:{.[;0 0;:;0]1_(2#0){1_(1+x),y}\(y*til 1+x div y),$[x mod y;x;()]}
q)inds:c[s;3000000]
q)inds
0        3000000
3000001  6000000
6000001  9000000
9000001  12000000
12000001 15000000
15000001 15521604


Then query for each chunk and insert

q){product,:h(?;`product;((=;`date;2015.08.06);(within;`i;x));0b;())} each inds
q)count product
15521604

Thanks
Tom

Right, yeah, that could work. I thought there could be an easier way using an each loop with cut, but I will try that.

Thanks
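
For reference, a minimal sketch of that cut-based idea, assuming the same handle h and a chunk size of 3,000,000 rows (both arbitrary choices):

q)s: h "exec first x from select count i from trade where date=2015.05.01"
q)inds: {(first x; last x)} each 3000000 cut til s    / (start;end) pair per chunk
q)trade: raze {h (?;`trade;((=;`date;2015.05.01);(within;`i;x));0b;())} each inds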

Another way, say you want chunks of 1,000,000 rows:

c: 15521604

d:{(x;(c-1)&x+999999)} each 1000000*til ceiling c%1000000    / inclusive (start;end) pairs, last one clipped to c-1

raze {[x]h({[x]select from (select from table where date=2015.08.06) where i within x};x)} each d

If you're pushing rather than pulling:

q)neg[h]@/:(insert;`trade;)each chunksize cut trade
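
A minimal usage sketch of that push approach, assuming an arbitrary chunk size of 1,000,000 rows; the final synchronous call is only a chaser to confirm the queued async inserts have been processed:

q)chunksize: 1000000
q)neg[h]@/:(insert;`trade;)each chunksize cut trade
q)h"::"    / sync chaser: returns once the remote has worked through the async messages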

rahul_asati04
Contributor
 Some of the options to get data in chunks are:
1. Fetch data by no. of rows: some solutions for it have already been suggested by other members.

2. Fetch data by symbols: each chunk will contain data for a particular symbol. You can modify it to have data for more than one symbol in each chunk (a sketch of that follows after this list).

  q)   trade: raze {h ( {select from trade where date=2015.05.01,sym=x};x) } each  syms:h "exec sym from select sym from trade where date=2015.05.01"

3. Fetch by column (one column each chunk):
  q)  trade: (,'/) { h ({?[`trade;enlist (=;`date;2015.05.01);0b;(enlist x)!(enlist x)]}; x) } each col: h "cols trade"
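
On option 2's note about putting more than one symbol in each chunk, a rough sketch (the batch size of 50 symbols is an arbitrary assumption):

q)syms: h "exec distinct sym from select sym from trade where date=2015.05.01"
q)trade: raze {h ({select from trade where date=2015.05.01,sym in x}; x)} each 50 cut syms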

Thanks very much, I'll play around and see which is fastest.

rahul_asati04
Contributor
Missed the distinct keyword in option 2.

2. Fetch data by symbols: each chunk will contain data for a particular symbol. You can modify it to have data for more than one symbol in each chunk.

  q)   trade: raze {h ( {select from trade where date=2015.05.01,sym=x};x) } each  syms:h "exec distinct sym from select sym from trade where date=2015.05.01"

LamHinYan
New Contributor
Lists in 32-bit kdb+ are limited to less than 1GB. IPC calls are limited to 2GB. How did you create the list? I would expect to hit the wsfull error before the 'limit error.

q)mylist:til 1023*1024*1024 div 8
q)mylist:til 1024*1024*1024 div 8
wsfull

http://code.kx.com/wiki/Errors

Oh, I have found one way to reproduce it:

q)mylist:til 512*1024*1024 div 8
q)t:(a:mylist; b:mylist; c:mylist; d:mylist)
q)\p 5000
q)-22!t
2147483686
q)

q)h:hopen `::5000
q)t:h "t"
'limit
q)
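
For completeness, a rough sketch of pulling that oversized list back through the same handle in slices that each stay well under the 2GB IPC limit (the slice size of 8,388,608 items per vector is an arbitrary assumption; sublist clips the final slice automatically):

q)slices: {(x;8388608)} each 8388608*til 8    / (start;length) pair per slice
q)t2: (,'/) {h ({x sublist/:t}; x)} each slices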