I know this issue was brought up here but there was no resolution in that thread.
I’m using the Bitnami DreamFactory 1.9.1 stack with a service that connects to MSSQL.
I created my first stored procedure the other day - a simple join across two tables whose result set is a little over 10,000 rows. It worked fine at first. Then I assigned one of my roles access to that stored procedure, and it mostly stopped working. I’m not sure the role-access change is related, but the failures started almost immediately after it. Since then the stored procedure has worked less and less: at first about 50% of calls succeeded; now maybe 5% do. The rest of the time it returns an empty 500 response.
The stored procedure runs with no errors when executed directly on the SQL Server. I’ve since tried re-creating the same procedure many times without granting any role access, but it essentially never works. I’ve also tried calling the stored procedure from a server-side script and got the same result.
This morning I created a stored procedure that returns 1,800 rows, and it seems to work fine via the API. So I’m guessing the problem is the 10,000-row result set. But it’s strange that I received all those rows with no problem at least 20 times before it started acting up. Am I overflowing some memory limit somewhere in DF?