A construction for circulant type dropout designs

Shoko Chisaki, Ryoh Fuji-Hara, Nobuko Miyamoto

Research output: Contribution to journal › Article › peer-review


Dropout is used in deep learning to prevent overfitting. It is a training method that randomly invalidates nodes in each layer of a multi-layer neural network. Let V1, V2, …, Vn be mutually disjoint node sets (layers). A multi-layer neural network can be regarded as a union of the complete bipartite graphs K_{|Vi|, |Vi+1|} on the two consecutive node sets Vi and Vi+1 for i = 1, 2, …, n−1. The dropout method sets a random sample of activations (nodes) to zero during the training process. Random sampling of nodes, however, causes irregular frequencies of dropped edges. A dropout design is a combinatorial design on the dropout nodes of each layer that balances the frequencies of selected edges. The block set of a dropout design is B = {{C1 | C2 | … | Cn} : Ci ⊆ Vi, Ci ≠ ∅, 1 ≤ i ≤ n}, with a balancing condition on every t consecutive sub-blocks Ci, Ci+1, …, Ci+t−1; see [3]. If |Vi| and |Ci| are constant for 1 ≤ i ≤ n, then the dropout design is called uniform. If a uniform dropout design also satisfies the circulant property, it can be extended to a design with as many layers as needed. In this paper, we describe a construction for uniform dropout designs of circulant type using affine geometries.
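The imbalance that motivates dropout designs can be seen in a small simulation. The sketch below (a hypothetical illustration, not the authors' construction) applies standard independent dropout to two consecutive layers of four nodes each and counts how often each edge of the complete bipartite graph K_{4,4} survives; the resulting frequencies are irregular, whereas a dropout design would prescribe the sub-blocks (C1, C2, …) so that every edge between consecutive sub-blocks is selected equally often.

```python
import random
from collections import Counter

random.seed(0)

# Two consecutive layers V1, V2 with four nodes each (illustrative sizes).
V1 = list(range(4))
V2 = list(range(4))

def dropout_sample(nodes, p=0.5):
    """Standard dropout: keep each node independently with probability p."""
    return [v for v in nodes if random.random() < p]

# Count, over many training rounds, how often each edge of K_{4,4}
# connects two surviving nodes. Independent sampling makes these
# frequencies uneven; a dropout design would balance them exactly.
edge_count = Counter()
for _ in range(1000):
    C1, C2 = dropout_sample(V1), dropout_sample(V2)
    for u in C1:
        for v in C2:
            edge_count[(u, v)] += 1

print(min(edge_count.values()), max(edge_count.values()))
```

Each edge is kept with probability 0.25 per round, so all 16 counts hover around 250, but the spread between the least- and most-frequently selected edges is what a (uniform) dropout design eliminates.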

Original language: English
Journal: Designs, Codes and Cryptography
Publication status: Accepted/In press - 2021


  • Affine geometry
  • Deep learning
  • Dropout
  • Dropout design
  • Split block design


