
SQL Server database to kdb+ database

mymichael88330
New Contributor
Hi everyone,

I'm new to kdb+. What I have now is a huge amount of intraday equity data stored in a SQL Server database, and I would like to store it in kdb+'s database format. I learned that the best way to do this is through partitioned tables, with date or trading code as my partition domain.

My question is: is it really necessary to convert from SQL Server format to kdb+ format? And if so, is it as simple as "query from SQL Server and save the fields into different columns", or am I getting something wrong?

Thanks in advance,

Michael
8 REPLIES

Flying
New Contributor III
Looks like you'll have some studying to catch up on first. You've got to understand the basic conceptual differences between a traditional RDBMS and kdb+ in order to work out the workflow you need effectively.

The short answer is: from kdb+, you can indeed run SQL queries on your existing data in order to extract it into kdb+ for further processing. But to make full use of kdb+'s power, you'd have to use a database in kdb+'s native format.

Recommended starters:
  • https://code.kx.com/q4m3/
  • https://code.kx.com/q/cookbook/splayed-tables/
  • To export from SQL to large CSVs and import into q:
    • https://code.kx.com/q/cookbook/loading-from-large-files/
    • https://code.kx.com/q/cookbook/splaying-large-files/
  • Or, to query SQL from within q:
    • https://code.kx.com/q/interfaces/q-client-for-odbc/
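A minimal sketch of the CSV route, using `.Q.fs` to stream a large file through q in chunks (the file name, column names, and types here are assumptions for illustration, and the file is assumed to have no header row; see the cookbook pages above for the full treatment):

```q
/ Stream a large, headerless trades.csv into a splayed table chunk by chunk.
/ File name, column names, and parse types are assumptions.
cols:`date`sym`time`price`size
.Q.fs[{`:/db/trades/ upsert .Q.en[`:/db;] flip cols!("DSTFJ";",")0:x}]`:trades.csv
```

`.Q.en` enumerates the symbol columns against the database root, which is required before a splayed table can be written to disk.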


Thanks Flying.

The demo demonstrating how to import a large CSV file into q proved quite helpful.

Previously I was trying to connect to SQL Server from q through ODBC, hoping that I could directly create a database in kdb+'s native format from the queried data. It seems to me that loading CSVs is an alternative way to accomplish what I initially intended, right?

-Michael


>csv and odbc

CSV loading is an alternative to ODBC.
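For reference, the ODBC side looks roughly like this, using the q client for ODBC linked earlier in the thread (the DSN name and query are assumptions):

```q
/ Sketch only: DSN name, table, and query are assumptions.
h:.odbc.open "dsn=mysqlserverdsn"               / connect via a configured DSN
t:.odbc.eval[h;"select top 100 * from trades"]  / run SQL on the server, get a q table back
.odbc.close h
```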

Thanks effbiae. After a few days of catching up I've successfully loaded data through ODBC and saved it as a partitioned table.

Yet there comes another problem. The full table I'm working with in SQL Server is too large to load into a q process, which makes me wonder if there is some way, like a while...do... loop, to get part of the table from SQL Server at a time and then upsert to rebuild the full table part by part. I was previously able to do this in MATLAB with a smaller data set, and I guess kdb+ could have a similar facility.


On Sunday, July 8, 2018 at 3:15:38 AM UTC+8, effbiae wrote:

To solve the size problem, why not use a where clause like
  where date>=a and date<b
in your sql for a sequence of dates a,b,...
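In q, that sequence-of-dates idea can be sketched as a loop that pulls one day at a time over ODBC and writes each day to its own partition (the DSN, table name, date column, and the `/db` root are all assumptions for illustration):

```q
/ Sketch: migrate one day per pass into a date-partitioned database at /db.
/ DSN, table name, and date column are assumptions.
h:.odbc.open "dsn=mysqlserverdsn"
loadday:{[d]
  a:ssr[string d;".";"-"];                      / q date -> SQL Server date literal
  b:ssr[string d+1;".";"-"];
  t:.odbc.eval[h;"select * from trades where date>='",a,"' and date<'",b,"'"];
  (hsym`$"/db/",string[d],"/trades/") set .Q.en[`:/db;t]}
loadday each 2018.01.01+til 5                   / e.g. the first five days of 2018
.odbc.close h
```

Each pass stays within memory limits because only one day's rows are ever held in the q process at once.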

Thanks Jack.

I've solved the problem using a "where" clause like you mentioned. At first I was thinking about more precise control, so that my function wouldn't explode when the data in a particular range suddenly becomes too large. I knew "where" could be a solution but wondered if there was a better one.

Michael

hzadonis
New Contributor
There's a video that may be helpful.
https://www.youtube.com/watch?v=sIbKB094rmM

This video is about connecting to kdb+, but the need here is to connect to another database from kdb+.