What's the issue?
We have deployed Dagster on Kubernetes.
As the number of assets grows, I have noticed that the memory consumed by the Dagster daemon increases; it currently reaches about 20 GB.
Since this seemed like unexpected behavior, I installed memray in the Dagster image to find out why so much memory is being used. The result is as follows.
As the memray flamegraph shows, a method called sync_get_streaming_external_repositorys_data_grpc accounts for a large share of the memory, and a method called raw_decode also uses a lot. My theory is that the gRPC-related methods require a great deal of memory, which leads to the 20 GB used by my Dagster daemon.
Is there any way to reduce the Dagster daemon's memory usage?
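As a lighter-weight cross-check on the memray result, the standard library's tracemalloc can snapshot allocations inside a process and show where memory grew between two points. This is only a sketch: the list comprehension below is a toy workload standing in for the daemon's repository fetch, not Dagster code.

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Toy workload standing in for the daemon's repository data fetch:
# allocate a list of strings to simulate growing asset metadata.
payload = ["asset-%d" % i for i in range(100_000)]

after = tracemalloc.take_snapshot()

# Compare the snapshots, grouped by source line, to see where memory grew.
for stat in after.compare_to(before, "lineno")[:3]:
    print(stat)

current, peak = tracemalloc.get_traced_memory()
print(f"current={current} bytes, peak={peak} bytes")
```

Unlike memray, this only tracks Python-level allocations, so memory held by native gRPC buffers will not show up, but it can confirm whether Python objects (e.g. decoded repository snapshots) are what is growing.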
What did you expect to happen?
Less memory consumed by the Dagster daemon.
How to reproduce?
We use dagster_dbt and have roughly 1,000 assets: about 370 are imported as tables generated by dbt, about 370 are used as sources in dbt, and the rest are defined in Dagster.
Dagster version
1.8.13
Deployment type
Other Docker-based deployment
Deployment details
No response
Additional information
No response
Message from the maintainers
Impacted by this issue? Give it a 👍! We factor engagement into prioritization.
By submitting this issue, you agree to follow Dagster's Code of Conduct.