SQL queries in Databox currently have a limit of 1000 rows. The request is to lift this limit.
2020/08/12
Activity
Sandra
Hi all,
Great news!
We have implemented an improvement to all SQL integrations (MySQL, PostgreSQL, Microsoft SQL, Microsoft Azure, Amazon Redshift, Snowflake, Google BigQuery): we raised the limit from 1,000 rows to 10,000 rows.
Query results are now limited to 10,000 rows. If you expect your query to return more than 10,000 rows, or you're not sure how many rows it will return, be sure to add a LIMIT clause to the query.
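As a minimal sketch of capping a result set with LIMIT (using SQLite locally; the table and column names here are hypothetical, not part of Databox):

```python
import sqlite3

# Hypothetical example data; the same LIMIT clause works on any SQL source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany(
    "INSERT INTO orders (amount) VALUES (?)",
    [(i * 1.5,) for i in range(25_000)],  # more rows than the 10,000-row cap
)

# Cap the result set explicitly so the query never returns more than 10,000 rows.
rows = conn.execute(
    "SELECT id, amount FROM orders ORDER BY id LIMIT 10000"
).fetchall()
print(len(rows))  # 10000
```

Adding an ORDER BY alongside the LIMIT keeps the capped result deterministic; without it, which 10,000 rows you get is up to the database.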
If you have any follow-up questions or feedback to share, feel free to reply with a comment.
Thanks.
Sandra from Databox
Ziga Potocnik
Status changed to: Live
Sandra
Post moved to this board
William Yeack
This is an absolute must for us!
Andrew Seipp
I'll add a comment on this.
While I understand the rationale for limiting it to 1,000 rows, and while there is a workaround of filtering and appending data, repeatedly changing the date range and re-running the import becomes very tedious when many rows of data need to be added.
But the bigger issue is that if rows are deleted or the data model changes, you have to repeat the whole process over again.
I would love to see a feature that does an initial import with no restriction on the number of rows, with further imports limited to 1,000 rows. This would make importing historical data much easier while still keeping the performance benefits of limiting future imports to 1,000 rows.
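The date-range workaround described above can be automated with keyset pagination: fetch fixed-size batches ordered by a key column and resume each query after the last row seen. A minimal sketch, again using SQLite with a hypothetical table, not a Databox API:

```python
import sqlite3

# Hypothetical historical data that exceeds the per-query row cap.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany(
    "INSERT INTO events (value) VALUES (?)", [(float(i),) for i in range(2500)]
)

BATCH = 1000  # stay under a per-query row limit


def fetch_all_in_batches(conn, batch=BATCH):
    """Yield every row by issuing queries of at most `batch` rows,
    resuming after the last id seen (keyset pagination)."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, value FROM events WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch),
        ).fetchall()
        if not rows:
            return
        yield from rows
        last_id = rows[-1][0]  # resume after the last row of this batch


all_rows = list(fetch_all_in_batches(conn))
print(len(all_rows))  # 2500
```

Keyset pagination avoids the drawbacks of manually sliding a date range: each batch is bounded, and the loop naturally terminates when the source runs out of rows.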