Web Neural Network with Complete DiGraphs
Abstract: This paper introduces a new neural network model that aims to mimic the biological brain more closely by structuring the network as a complete directed graph that processes continuous data at each timestep. Existing neural networks borrow structural features from the brain only loosely, such as neurons, convolutions, and recurrence. The model proposed in this paper adds further structural properties by introducing cycles into the neuron connections and removing the sequential layering common in other architectures. Furthermore, the model has continuous input and output, inspired by spiking neural networks, which allows the network to learn a process of classification rather than simply returning a final result.
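The abstract's core idea can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's exact formulation) of a layer of neurons wired as a complete directed graph: every neuron receives input from every neuron, including itself, so the connection graph contains cycles, and the layer emits a continuous output at every timestep instead of only after the final one. All dimensions and weight scales below are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: N neurons, D_in input features, D_out outputs, T timesteps.
N, D_in, D_out, T = 8, 4, 3, 5

W = rng.normal(scale=0.3, size=(N, N))      # dense N x N weights: complete digraph
W_in = rng.normal(scale=0.3, size=(N, D_in))
W_out = rng.normal(scale=0.3, size=(D_out, N))

h = np.zeros(N)                             # neuron states
x = rng.normal(size=(T, D_in))              # continuous input stream

outputs = []
for t in range(T):
    # Each neuron integrates signals from all neurons (cyclic connections),
    # plus the external input for this timestep.
    h = np.tanh(W @ h + W_in @ x[t])
    outputs.append(W_out @ h)               # continuous output at every timestep

outputs = np.stack(outputs)                 # shape (T, D_out)
```

Reading `outputs[t]` at every step, rather than only `outputs[-1]`, is what lets training shape the classification process over time instead of just its final answer.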