Machine Learning Algorithms Using Classical And Quantum Photonics
Author
Leelar, Bhawani Shankar
Abstract
In the modern day, we are witnessing two complementary trends: exponential growth in data and shrinking of chip size. Data is projected to reach 44 zettabytes by 2020, and chips are now available at the 10 nm technology node. The hyperconnectivity between machine-to-machine and human-to-machine interactions creates multi-dimensional data that is more complex. This thesis addresses a quantum meta layer abstraction that provides the interface for the application layer to design quantum and classical algorithms. The first part of the thesis addresses quantum algorithms, and the second part addresses classical algorithms running on top of the quantum meta layer.

In the first part of the thesis, we explore a quantum stochastic algorithm for ranking quantum webpages, analogous to the classical Google PageRank. The architecture is a six-waveguide photonic lattice that runs a finely tuned quantum stochastic walk, and the evolution of the density matrix yields the ranking of the quantum webpages. We drive the photon stochastic walk towards the quantum PageRank by matching the entries of the Google matrix with the parameters of the Kossakowski-Lindblad master equation. We have performed extensive simulations to observe the evolution of the density matrix under different parameter settings, and we use noise in the Kossakowski-Lindblad master equation to break the symmetry (reciprocity) of the quantum system, which enables a distinguishable measurement of the quantum PageRank.
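To make this mapping concrete, the following is a minimal NumPy sketch of a quantum stochastic walk whose jump rates are taken from the Google matrix of a toy six-page web graph and whose density matrix is evolved under a Kossakowski-Lindblad master equation. The Hamiltonian choice, the mixing parameter omega, the Euler integration, and the example graph are illustrative assumptions, not the thesis's photonic-lattice implementation.

import numpy as np

def google_matrix(adjacency, alpha=0.85):
    """Column-stochastic Google matrix of a small web graph
    (adjacency[i, j] = 1 if page j links to page i)."""
    n = adjacency.shape[0]
    col_sums = adjacency.sum(axis=0)
    col_sums[col_sums == 0] = 1.0              # avoid division by zero
    S = adjacency / col_sums
    return alpha * S + (1.0 - alpha) * np.ones((n, n)) / n

def qsw_pagerank(G, omega=0.9, t_max=50.0, dt=0.01):
    """Evolve a quantum stochastic walk under a Kossakowski-Lindblad master
    equation whose jump rates come from the Google matrix G, then rank the
    nodes by the diagonal (populations) of the density matrix."""
    n = G.shape[0]
    H = 0.5 * (G + G.T)                        # Hermitian part: coherent hopping
    rho = np.eye(n, dtype=complex) / n         # maximally mixed initial state
    jumps = [(np.sqrt(G[i, j]), i, j)          # L_ij = sqrt(G_ij) |i><j|
             for i in range(n) for j in range(n) if G[i, j] > 0]
    for _ in range(int(t_max / dt)):
        coherent = -1j * (H @ rho - rho @ H)
        dissipative = np.zeros_like(rho)
        for g, i, j in jumps:
            L = np.zeros((n, n)); L[i, j] = g
            LdL = L.conj().T @ L
            dissipative += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
        rho = rho + dt * ((1.0 - omega) * coherent + omega * dissipative)
        rho = 0.5 * (rho + rho.conj().T)       # guard Hermiticity against drift
    populations = np.real(np.diag(rho))
    return populations / populations.sum()

# six "quantum webpages" with directed links (toy example)
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 0, 1, 0, 0],
              [0, 1, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 0, 1, 0]], dtype=float)
ranks = qsw_pagerank(google_matrix(A))
print(np.argsort(ranks)[::-1])                 # pages ordered by quantum PageRank

With omega = 1 the evolution is purely dissipative and the populations relax to the classical PageRank vector of G; values of omega below 1 retain a coherent component, and additional dephasing noise (not shown here) plays the symmetry-breaking role described above.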
We next propose a new quantum deep learning approach with a photonic lattice waveguide acting as a feedforward neural network. The proposed deep photonic neural network uses quantum properties for learning: its hidden layers can be designed to learn object representations while maintaining the quantum properties for a longer time, which enables optimal learning.
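The abstract does not detail the network layers, so purely as a conceptual sketch one can model each layer as a unitary transfer matrix, as for a lossless waveguide mesh; the unitarity is what preserves the quantum properties (here, the total photon probability) through the hidden layers. The parameterization, depth, and readout below are illustrative assumptions, not the thesis's device-level design.

import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n, rng):
    """Unitary transfer matrix of one lossless waveguide-lattice layer,
    built as exp(iH) for a random Hermitian H (illustrative parameterization)."""
    X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    H = 0.5 * (X + X.conj().T)
    vals, vecs = np.linalg.eigh(H)
    return vecs @ np.diag(np.exp(1j * vals)) @ vecs.conj().T

def photonic_forward(amplitudes, layers):
    """Feed complex field amplitudes through a stack of unitary layers;
    because every layer is unitary, the squared norm is preserved."""
    a = amplitudes.astype(complex)
    for U in layers:
        a = U @ a
    return np.abs(a) ** 2                      # detected intensities at the output ports

n_modes, n_layers = 6, 4
layers = [random_unitary(n_modes, rng) for _ in range(n_layers)]
x = np.zeros(n_modes); x[0] = 1.0              # inject light into the first waveguide
probs = photonic_forward(x, layers)
print(probs, probs.sum())                      # intensities sum to 1: norm preserved

Training such a network would then amount to optimizing the Hermitian generators of each layer, for example by gradient descent on the detected output intensities.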
The second part of the thesis discusses data-based learning. We use a data graph method that captures the system representation: every node (object) is associated with data, and two related objects are linked by a data edge. The proposed data graph model captures and encodes the data efficiently; the graph is then updated and trained with new data to provide efficient predictions. The model retains previously learned knowledge through transfer learning and improves it with new training, which makes the method highly adaptive and scalable to different real-time scenarios. The proposed algorithm is incremental: it learns hidden objects and hidden relationships from the data patterns observed over time and updates the model accordingly. We use algebraic graph transformation methods to trigger mutations of the data graph, and the updated data graph behaves differently for the data it subsequently observes.

We explore machine learning algorithms further and propose a complete framework to predict the state of the system from the system parameters. We propose discretizing the data points using a symbol algebra and applying a Bayesian machine learning algorithm to select the best model to represent the new data. The symbol algebra provides a unified language platform for different sensor data and can process both discrete and continuous data. The portability of this unified language platform in processing heterogeneous and homogeneous data enlarges the hypothesis space, so the Bayesian machine learning algorithm has more degrees of freedom in choosing the best model, with a high level of confidence in the predicted state.
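The abstract does not specify the symbol algebra or the hypothesis space, so the following is a minimal sketch assuming quantile-based symbolization of a continuous sensor stream and a family of Markov models of different orders compared by their Dirichlet-multinomial marginal likelihood (Bayesian evidence). The function names, alphabet size, and prior are illustrative assumptions.

import numpy as np
from math import lgamma

def symbolize(signal, n_symbols=4):
    """Discretize a continuous sensor stream into symbols via quantile bins,
    giving heterogeneous sensors a common symbolic representation."""
    edges = np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(signal, edges)

def log_evidence_markov(symbols, order, n_symbols, alpha=1.0):
    """Dirichlet-multinomial log marginal likelihood of a Markov model of a
    given order over the symbol sequence (order 0 = i.i.d. symbols)."""
    counts = {}
    for t in range(order, len(symbols)):
        ctx = tuple(symbols[t - order:t])
        counts.setdefault(ctx, np.zeros(n_symbols))[symbols[t]] += 1
    log_ev = 0.0
    for c in counts.values():
        log_ev += lgamma(n_symbols * alpha) - lgamma(c.sum() + n_symbols * alpha)
        log_ev += sum(lgamma(x + alpha) - lgamma(alpha) for x in c)
    return log_ev

# toy sensor stream with temporal structure plus noise
rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.3 * rng.normal(size=2000)
symbols = symbolize(signal, n_symbols=4)

# Bayesian model selection: pick the Markov order with the largest evidence
candidates = {order: log_evidence_markov(symbols, order, 4) for order in (0, 1, 2)}
best = max(candidates, key=candidates.get)
print(candidates, "-> selected Markov order", best)

The same symbolization step can be applied to inherently discrete sensors, which is what allows heterogeneous and homogeneous data to share a single hypothesis space for the Bayesian model selection.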