Transputer-based parallel implementation of neural nets for a class of pattern recognition problems
Abstract
Neural networks, inspired by the organizational principles of the human brain, consist of interconnected computing elements. Their large-scale computations pose challenges for conventional serial machines, making parallel implementation a desirable approach for improved performance. This work explores the parallel implementation of Backpropagation Networks and Bidirectional Associative Memory (BAM) using transputer-based architectures with various topologies including hypercube, mesh, linear array, and ring.
Performance metrics such as speedup and utilization are evaluated across different configurations. The hypercube topology demonstrates superior performance for BAM, while the ring and linear array topologies show complementary strengths in learning and recall phases of backpropagation networks. These networks are applied to numeral recognition, with techniques like multiple training and dummy data augmentation enhancing BAM’s storage capacity. Comparative analysis under noisy conditions reveals that backpropagation networks outperform BAM in classification accuracy.
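The speedup and utilization metrics referred to above are, in their standard forms (assuming $p$ processors, with $T(1)$ the serial and $T(p)$ the parallel execution time; the paper's exact definitions may differ):

$$S(p) = \frac{T(1)}{T(p)}, \qquad U(p) = \frac{S(p)}{p}$$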
Additionally, the backpropagation network is applied to medical diagnosis, successfully identifying and classifying various forms of arthritis and related rheumatic disorders. This study highlights the effectiveness of parallel neural network implementations in both pattern recognition and real-world diagnostic applications.
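The BAM recall behavior summarized above can be illustrated with a minimal sketch. This is not the paper's transputer implementation: it is a serial, single-direction (x to y) recall using the standard Hebbian outer-product training rule, with hypothetical toy patterns and a simplifying tie-to-+1 threshold.

```python
import numpy as np

def bipolar(v):
    # Threshold to {-1, +1}; ties resolve to +1 (a simplifying assumption).
    return np.where(v >= 0, 1, -1)

def train_bam(pairs):
    # Hebbian outer-product rule: W = sum_k x_k y_k^T over bipolar pattern pairs.
    return sum(np.outer(x, y) for x, y in pairs)

def recall(W, x):
    # Forward recall x -> y; a full BAM iterates x and y bidirectionally until stable.
    return bipolar(np.array(x) @ W)

# Two orthogonal pattern pairs (illustrative only, not from the paper).
x1, y1 = [1, 1, -1, -1], [1, -1]
x2, y2 = [1, -1, 1, -1], [-1, 1]
W = train_bam([(np.array(x1), np.array(y1)),
               (np.array(x2), np.array(y2))])

print(recall(W, x1))               # clean input recalls y1
print(recall(W, [-1, 1, -1, -1]))  # one flipped bit still recalls y1
```

The noisy-recall line hints at why the abstract compares the two models under noise: associative recall tolerates some corruption, but capacity and accuracy degrade as stored patterns interfere, which is where the multiple-training and dummy-augmentation techniques come in.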