GPUs aren’t just for graphics. They speed up vector operations, including those used in “AI stuff”. I just hadn’t heard of NPUs before, so I imagine they may be hardwired for the graph structure of neural nets rather than general linear algebra, and that’s why they can’t be used as GPUs.
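For a sense of what “vector operations” means here, a minimal CPU sketch with numpy; GPU libraries such as CuPy or PyTorch expose the same style of operation, just spread across thousands of hardware lanes (the sizes and constants below are arbitrary, not from any particular workload):

    import numpy as np

    # A large vector of single-precision floats, the typical GPU data type.
    x = np.linspace(0.0, 1.0, 1_000_000, dtype=np.float32)

    # SAXPY-style multiply-add: one scalar op applied to every element.
    y = np.float32(3.0) * x + np.float32(1.0)

    # Dot-product reduction, the bread and butter of "AI stuff".
    print(np.dot(x, y))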
Initially, x86 CPUs didn’t have an FPU. It cost extra and was delivered as a separate chip.
Later on, a GPU is essentially just an overgrown SIMD FPU.
An NPU is a specialized GPU that operates on low-precision floating-point numbers and mostly does matrix multiply-and-add operations.
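For illustration, this is roughly what that “low-precision matrix multiply-and-add” primitive looks like, sketched on the CPU with numpy (the shapes and precisions are assumptions for the example; a real NPU does this in fixed-function hardware, often with int8 as well as fp16):

    import numpy as np

    # Tiny activation and weight matrices in half precision, the sort of
    # low-precision format mentioned above (shapes are made up).
    A = np.random.rand(4, 8).astype(np.float16)
    W = np.random.rand(8, 3).astype(np.float16)

    # The core primitive: multiply low-precision values and accumulate the
    # partial sums in a wider format (float32) so precision isn't lost.
    acc = A.astype(np.float32) @ W.astype(np.float32)
    print(acc)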
There is zero neural processing going on here; that would mean the chip operates on bursts of encoded analog signals, within a power budget of about 20 W, and could adjust itself on the fly, online, without a few datacenters spending an exorbitant amount of energy to update the model’s weights.
Can the NPU at least stand in as a GPU in case you need it?
No, as it doesn’t compute graphical information and is solely for running computations for “AI stuff”.
Nope. Don’t need it.