It's worth noting that there are very widely deployed chips built primarily for inference (running the network), especially in mobile phones.
Depending on the device and manufacturer, this is sometimes implemented as part of the main CPU/SoC itself rather than as a discrete chip, but functionally it's the same idea.
The Apple Neural Engine is a good example of this. It's separate from the GPU, which also sits on the same SoC as the CPU.
Further information is here: https://machinelearning.apple.com/research/neural-engine-tra...
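For a concrete sense of how an app ends up running on the ANE: developers don't target it directly, they hand a model to Core ML and let it pick where each layer runs. A minimal sketch below, where "MyModel" is a placeholder for the class Xcode generates from a bundled .mlmodel file:

    import CoreML

    // Ask Core ML to schedule work across CPU, GPU, and the Neural Engine.
    // Core ML decides per layer where things actually run and falls back
    // to CPU/GPU for operations the ANE can't handle.
    let config = MLModelConfiguration()
    config.computeUnits = .all

    do {
        // "MyModel" is a placeholder for an Xcode-generated model class.
        let model = try MyModel(configuration: config)
        // ... call model.prediction(...) on input features as usual;
        // dispatching to the Neural Engine is handled transparently.
    } catch {
        print("Model failed to load: \(error)")
    }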
The Google Tensor SoC used in the Pixel has a similar coprocessor, the Edge TPU.