asked May 28, 2017 by datalab (2,990 points) | 1,353 views

1 Answer


In principle, no. But if your query is expected to return a very large number of rows, it may trigger a timeout, or even an error message saying that your query is not suitable for synchronous mode.

In such a case, please submit your query in asynchronous mode. In a Jupyter notebook, you would write:

from dl import queryClient as qc

query = '...'  # my complex or long query string
jobid = qc.query(token, query, async=True)  # returns a job ID immediately

# ... and after a while, check whether the job has finished:
status = qc.status(token, jobid)
if status == 'COMPLETED':
    result = qc.results(token, jobid)
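In practice you would poll the job status until it finishes rather than checking once. A minimal, self-contained sketch of that polling pattern (here poll() is only a stand-in for qc.status(token, jobid), and the simulated sequence of states is an assumption for illustration):

```python
import time

# Simulated job that reports 'EXECUTING' twice before completing;
# a stand-in for repeated qc.status(token, jobid) calls.
states = iter(['EXECUTING', 'EXECUTING', 'COMPLETED'])

def poll():
    return next(states)

status = poll()
while status == 'EXECUTING':
    time.sleep(0.01)  # in practice, wait a few seconds between polls
    status = poll()

print(status)  # → COMPLETED
```

Once the loop exits with 'COMPLETED', the results can be fetched; any other terminal status (e.g. an error state) should be handled before calling for results.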

Another problem, even in asynchronous mode, is trying to load too many rows into memory (e.g. if you are working on the Jupyter notebook server). Instead, write your query results directly to a file in your VOSpace:

jobid = qc.query(token,query,async=True,out='vos://results.csv')
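The underlying principle is general: stream rows to disk as they arrive instead of accumulating them in a Python list, so memory use stays constant regardless of result size. A self-contained sketch of that pattern (not Data Lab-specific; the generator and file name are hypothetical):

```python
import csv
import os
import tempfile

# Hypothetical stand-in for a large result set delivered row by row.
def result_rows(n):
    for i in range(n):
        yield (i, i * i)

# Stream each row straight to a CSV file rather than building the
# whole result in memory first.
path = os.path.join(tempfile.mkdtemp(), 'results.csv')
with open(path, 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['id', 'value'])
    for row in result_rows(100000):
        writer.writerow(row)

# Count lines to confirm everything was written: header + 100000 rows.
with open(path) as f:
    print(sum(1 for _ in f))  # → 100001
```

Writing to `vos://` as in the answer above achieves the same effect server-side: the results never pass through your notebook's memory at all.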
answered May 30, 2017 by robnik (1,040 points)
