How much electricity do you consume in your daily local model runs?
24/l2w
No.2983
YMx7B4
No.2984
>>2983(OP)
I am poor, I don't have a dedicated GPU, and the one in my laptop is a GTX 1650 (which is good for general purposes, but not for running LLMs larger than about 2B).
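Rough math on why a 4 GB card tops out around a couple of billion parameters; the bytes-per-weight figures and the 1.2x overhead factor below are assumptions, not measurements:

# Rough VRAM estimate for holding an LLM's weights; KV cache and runtime
# overhead are folded into an assumed 1.2x fudge factor.
BYTES_PER_PARAM = {"fp16": 2.0, "q8_0": 1.0, "q4_0": 0.5}  # approximate

def vram_gb(params_b, quant="q8_0", overhead=1.2):
    """Approximate VRAM in GB for a model with params_b billion weights."""
    return params_b * BYTES_PER_PARAM[quant] * overhead

for size in (2, 3, 7):
    print(f"{size}B: ~{vram_gb(size):.1f} GB at q8_0, ~{vram_gb(size, 'q4_0'):.1f} GB at q4_0")

# A 2B model at 8-bit is roughly 2.4 GB, a 7B is roughly 8.4 GB, so a 4 GB
# GTX 1650 only fits small models unless you quantize hard to 4-bit.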
yvCyyh
No.2985
>>2983(OP)
I don't use local LLMs as they're useless on my limited hardware.
tGbf4Z
No.2986
> No GPU
> No life
tGbf4Z
No.2989
>>2983(OP)
I ran local models on my 13th gen i5. 65 watts peak. OK for LLM use. Not fast. Image generation was not good.
I want to try those new AMD AI CPUs with built-in NPUs.
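If anyone wants to put a number on the thread question, here is a quick back-of-envelope; the wattage, hours per day, and tariff below are assumed values for illustration, not measurements:

# Back-of-envelope daily electricity cost of local inference.
# All three inputs are assumptions for illustration.
PEAK_WATTS = 65        # e.g. a 65 W CPU pinned during generation
HOURS_PER_DAY = 2.0    # assumed time spent generating per day
PRICE_PER_KWH = 8.0    # assumed tariff, roughly 8 INR per kWh

kwh_per_day = PEAK_WATTS * HOURS_PER_DAY / 1000
cost_per_day = kwh_per_day * PRICE_PER_KWH
print(f"{kwh_per_day:.2f} kWh/day -> about {cost_per_day:.2f} per day")

# 65 W for 2 hours is 0.13 kWh, i.e. about one rupee a day at that tariff.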
tGbf4Z
No.2990
>>2983(OP)
And while I'd love a 12 GB graphics card, the prices even for a 3050 are high.