
Dear datalab colleagues,

I am retrieving catalogs from ls_dr8.tractor_s, but the query fails because the result set is too large. The error messages are appended below. How can I make it succeed in a single run instead of breaking the query into multiple pieces?

Thanks,

Huanian

Error: ('Connection aborted.', error(104, 'Connection reset by peer'))
Error: HTTPConnectionPool(host='gp01.datalab.noao.edu', port=8080): Max retries exceeded with url: /ivoa-dal/tap/async/b3wzamiamjmsi7a4/phase (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f60dd169d90>: Failed to establish a new connection: [Errno 111] Connection refused',))
Error: HTTPConnectionPool(host='gp01.datalab.noao.edu', port=8080): Max retries exceeded with url: /ivoa-dal/tap/async/b3wzamiamjmsi7a4/phase (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f60dd0ad550>: Failed to establish a new connection: [Errno 111] Connection refused',))
Error: HTTPConnectionPool(host='gp01.datalab.noao.edu', port=8080): Max retries exceeded with url: /ivoa-dal/tap/async/b3wzamiamjmsi7a4/phase (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f60dcfe7710>: Failed to establish a new connection: [Errno 111] Connection refused',))
Error: HTTPConnectionPool(host='gp01.datalab.noao.edu', port=8080): Max retries exceeded with url: /ivoa-dal/tap/async/b3wzamiamjmsi7a4/phase (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f60dd397310>: Failed to establish a new connection: [Errno 111] Connection refused',))
Unable to complete your request: not found: not found: b3wzamiamjmsi7a4

asked Oct 15, 2019 by huanian (130 points)

1 Answer

0 votes
Thank you for your question.  

The query is hitting a limit of our resources at the present time. Unfortunately, there is no way to run a query over this much data for so long without breaking it into smaller chunks. Splitting it up by dec should ensure each query succeeds.
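A minimal sketch of that approach, assuming the astro-datalab Python client (`dl.queryClient`); the column list and the 2-degree declination step are illustrative, not prescribed:

```python
# Split a large ls_dr8.tractor_s query into declination slices and
# submit one slice at a time, instead of one very long query.

def dec_chunks(dec_min=-90.0, dec_max=90.0, step=2.0):
    """Yield (lo, hi) declination ranges covering [dec_min, dec_max)."""
    lo = dec_min
    while lo < dec_max:
        hi = min(lo + step, dec_max)
        yield lo, hi
        lo = hi

def chunk_sql(lo, hi):
    """Build the SQL for one dec slice (columns here are illustrative)."""
    return ("SELECT ra, dec, flux_g, flux_r, flux_z "
            "FROM ls_dr8.tractor_s "
            f"WHERE dec >= {lo} AND dec < {hi}")

if __name__ == '__main__':
    # from dl import queryClient   # pip install astro-datalab
    for lo, hi in dec_chunks(-20.0, -10.0, step=2.0):
        sql = chunk_sql(lo, hi)
        # result = queryClient.query(sql=sql, fmt='csv')  # one chunk per call
        print(sql)
```

Each slice returns a bounded result set, so no single request hits the server's resource limit; the per-chunk results can then be concatenated locally.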
answered Oct 15, 2019 by ascott (560 points)
