- DR.GEEK
- Feb 24, 2021
- 2 min read
Neural Networks Hardware

• Implementing your Neural Network in special hardware can entail a substantial investment of your time and money:
Ø the cost of the hardware
Ø the cost of the software to run on that hardware
Ø the time and effort to climb the learning curve and master the use of the hardware and software.
• Before making this investment, you would like to be sure it is worth it.
• A scan of applications in a typical NNW conference proceedings will show that many, if not most, use feedforward networks with 10-100 inputs, 10-100 hidden units, and 1-10 output units.
• A forward pass through networks of this size will run in milliseconds on a Pentium.
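To make the size claim concrete, here is a minimal sketch of such a feedforward network at the upper end of the quoted range (100 inputs, 100 hidden units, 10 outputs), timed with NumPy. The tanh activation and random weights are illustrative assumptions, not from the original; on a modern CPU the forward pass is far below even the millisecond figure quoted for a Pentium.

```python
import time
import numpy as np

# Sizes from the text: 100 inputs, 100 hidden units, 10 outputs.
# Weights are random placeholders; a real application would train them.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((100, 100))   # input -> hidden weights
b1 = rng.standard_normal(100)
W2 = rng.standard_normal((10, 100))    # hidden -> output weights
b2 = rng.standard_normal(10)

def forward(x):
    h = np.tanh(W1 @ x + b1)           # hidden-layer activations
    return np.tanh(W2 @ h + b2)        # output-layer activations

x = rng.standard_normal(100)
start = time.perf_counter()
y = forward(x)
elapsed = time.perf_counter() - start
print(f"forward pass: {elapsed * 1e3:.3f} ms, output shape {y.shape}")
```

Two matrix-vector products of this size are on the order of 10^4 multiply-adds, which is why a general-purpose CPU handles them easily.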
• Training may take overnight, but if it is done only once or occasionally, this is not usually a problem.
• Most applications involve a number of steps, many of them not NNW-related, that cannot be parallelized. So Amdahl's law limits the overall speedup obtainable from your special hardware.
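Amdahl's law says that if only a fraction p of the total runtime is accelerated by a factor s, the overall speedup is 1 / ((1 - p) + p/s). A small worked sketch with hypothetical numbers (the 20% fraction and 100x accelerator below are illustrative assumptions, not from the text):

```python
def amdahl_speedup(p, s):
    """Overall speedup when a fraction p of the work is sped up by factor s.

    The remaining (1 - p) of the work - the non-NNW steps - runs at
    its original speed and dominates as s grows.
    """
    return 1.0 / ((1.0 - p) + p / s)

# Hypothetical case: a 100x NNW accelerator, but the network is only
# 20% of the application's runtime.  1 / (0.8 + 0.002) is roughly 1.25,
# so the overall gain is only about 25%.
print(amdahl_speedup(0.2, 100.0))
```

Even an infinitely fast accelerator cannot beat 1 / (1 - p), which is why the serial, non-NNW steps cap the benefit of special hardware.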
• Intel x86-series chips and other von Neumann processors have grown rapidly in speed, and one can take advantage of the huge amount of readily available software.
• One quickly begins to see why the business of Neural Network hardware has not boomed the way some in the field expected back in the 1980s.
Applications of Hardware NNWs
• While not yet as successful as NNWs in software, there are in fact hardware NNWs hard at work in the real world. For example:
• OCR (Optical Character Recognition)
Ø Adaptive Solutions high-volume form and image capture systems.
Ø Ligature Ltd. OCR-on-a-Chip
• Voice Recognition
Ø Sensory Inc. RSC Microcontrollers and ASSP speech recognition specific chips.
• Traffic Monitoring
Ø Nestor TrafficVision Systems
• High Energy Physics
Ø Online data filter at the H1 electron-proton collider experiment in Hamburg, using Adaptive Solutions CNAPS boards.
• However, most NNW applications today are still run as conventional software simulations on PCs and workstations with no special hardware add-ons.