Hi! Thank you for your great work!
I'd like to know the hardware requirements for running local inference, and also for training.
When I tried to run inference using the provided examples, it took a long time without producing any results. My local setup has only a single RTX 4090 GPU, as shown in the figure.
Thank you very much again!