Quaternion Temporal Convolutional Neural Networks
Date of Award
2019
Degree Name
M.S. in Computer Engineering
Department
Department of Electrical and Computer Engineering
Advisor/Chair
Advisor: Vijayan Asari
Abstract
Sequence processing and modeling is a domain of problems that has recently received significant attention due to advancements in research and technology. Traditionally, sequence processing with neural networks has relied on recurrent architectures such as the long short-term memory (LSTM) cell, but these recurrent networks have notable drawbacks. One issue is that networks have grown increasingly large, and larger networks have been shown to learn features from useless noise in their input data. The Temporal Convolutional Network (TCN) was proposed to address the shortcomings of the LSTM cell. Separately, recent research has explored quaternion neural networks, which dramatically reduce the number of parameters in a network while maintaining comparable performance. This thesis combines these two advancements into a Quaternion Temporal Convolutional Network. The network is evaluated on a wide range of sequence processing and modeling tasks and compared against the base Temporal Convolutional Network. Testing and evaluation show that although the number of learned parameters is reduced by up to 4x relative to the Temporal Convolutional Network, performance stays comparable and actually exceeds the base network on some tasks.
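To illustrate where the roughly 4x parameter reduction comes from, the sketch below shows a quaternion 1D convolution with causal padding of the kind a TCN block uses. This is a minimal illustration assuming PyTorch, not the thesis's implementation; the class name, weight layout (channels grouped as r, i, j, k blocks), and initialization are illustrative assumptions. The key point is that four small real kernels are shared across the quaternion components through the Hamilton product, replacing one full real kernel.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuaternionCausalConv1d(nn.Module):
    """Sketch of a quaternion 1D convolution with causal (left-only) padding.

    Channels are treated as quaternions: the input/output channels are split into
    four equal blocks holding the (r, i, j, k) components. Four real kernels are
    shared across components via the Hamilton product, so the layer stores roughly
    1/4 of the parameters of a real Conv1d with the same channel counts.
    """

    def __init__(self, in_channels, out_channels, kernel_size, dilation=1):
        super().__init__()
        assert in_channels % 4 == 0 and out_channels % 4 == 0
        self.kernel_size = kernel_size
        self.dilation = dilation
        shape = (out_channels // 4, in_channels // 4, kernel_size)
        # Four quarter-size real kernels instead of one full real kernel.
        self.w_r = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_i = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_j = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_k = nn.Parameter(torch.randn(shape) * 0.02)

    def forward(self, x):
        r, i, j, k = self.w_r, self.w_i, self.w_j, self.w_k
        # Assemble the full weight from the Hamilton-product block structure.
        weight = torch.cat([
            torch.cat([r, -i, -j, -k], dim=1),
            torch.cat([i,  r, -k,  j], dim=1),
            torch.cat([j,  k,  r, -i], dim=1),
            torch.cat([k, -j,  i,  r], dim=1),
        ], dim=0)
        # Left-pad so the convolution is causal, as in a TCN residual block.
        pad = (self.kernel_size - 1) * self.dilation
        return F.conv1d(F.pad(x, (pad, 0)), weight, dilation=self.dilation)


if __name__ == "__main__":
    layer = QuaternionCausalConv1d(32, 64, kernel_size=3, dilation=2)
    y = layer(torch.randn(8, 32, 100))  # (batch, channels, time steps)
    print(y.shape)                      # torch.Size([8, 64, 100])
    real = nn.Conv1d(32, 64, kernel_size=3, bias=False)
    print(sum(p.numel() for p in layer.parameters()),  # 1536
          sum(p.numel() for p in real.parameters()))   # 6144 (4x more)
```

Because the assembled weight has the same shape as a standard Conv1d kernel, such a layer can be dropped into a TCN-style residual block in place of its real-valued convolutions, which is the spirit of the QTCN described in the abstract.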
Keywords
Computer Science, Engineering, Quaternion Temporal Convolutional Network, QTCN, Quaternion Neural Network, Sequence Processing, Machine Learning
Rights Statement
Copyright © 2019, author
Recommended Citation
Long, Cameron E., "Quaternion Temporal Convolutional Neural Networks" (2019). Graduate Theses and Dissertations. 6857.
https://ecommons.udayton.edu/graduate_theses/6857