In principle, no. But if your query is expected to return a very large number of rows, it may time out, or you may even get an error message saying that your query is not suitable for synchronous mode.
In that case, submit your query in asynchronous mode instead. In a Jupyter notebook you would write:
from dl import queryClient as qc
query = ...  # my complex or long query string
# note: 'async' is a reserved word in Python 3.7+, so recent versions
# of the Data Lab client use the keyword async_
jobid = qc.query(token, query, async_=True)
# and after a while...
status = qc.status(token, jobid)
if status == 'COMPLETED':
    result = qc.results(token, jobid)
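The "after a while" step is typically a small polling loop. A minimal sketch, where wait_for_job is a hypothetical helper you define yourself (the Data Lab client does not ship it); you would pass it something like lambda: qc.status(token, jobid):

```python
import time

def wait_for_job(get_status, interval=5, max_tries=120):
    """Poll get_status() until the job finishes or we give up.

    get_status: zero-argument callable returning the job status string,
                e.g. lambda: qc.status(token, jobid)
    Returns the last status seen ('COMPLETED', 'ERROR', or whatever the
    final poll reported).
    """
    status = None
    for _ in range(max_tries):
        status = get_status()
        if status in ('COMPLETED', 'ERROR'):
            return status
        time.sleep(interval)
    return status
```

The interval and max_tries values are arbitrary; tune them to how long your queries usually run.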
Another problem, even in asynchronous mode, is trying to load too many rows into memory (e.g. when working on the Jupyter notebook server). Instead, write your query results directly to a file in your VOSpace:
jobid = qc.query(token, query, async_=True, out='vos://results.csv')
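Once the job completes, the file sits in your VOSpace; after copying it locally (e.g. with the Data Lab storeClient), you can read it like any CSV. A minimal parsing sketch using the standard library, with a toy two-row result and hypothetical column names standing in for the real file:

```python
import csv
import io

# Stand-in for the downloaded results.csv; the real columns
# depend on your query.
csv_text = "ra,dec,gmag\n183.91,-0.45,17.2\n184.02,-0.47,18.9\n"

# csv.DictReader yields one dict per row, keyed by the header line.
rows = list(csv.DictReader(io.StringIO(csv_text)))
print(len(rows), rows[0]['ra'])
```

Note that DictReader returns every field as a string, so convert to float as needed.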