WO2023194438 - METHOD FOR INITIALIZING A NEURAL NETWORK
National phase entry:
Publication Number
WO/2023/194438
Publication Date
12.10.2023
International Application No.
PCT/EP2023/058940
International Filing Date
05.04.2023
Title **
[English]
METHOD FOR INITIALIZING A NEURAL NETWORK
[French]
PROCÉDÉ D'INITIALISATION D'UN RÉSEAU NEURONAL
Applicants **
ROBERT BOSCH GMBH
Postfach 30 02 20
70442 Stuttgart, DE
Inventors
SCHMIDT, Frank
Lindenstrasse 17/1
71229 Leonberg, DE
LONG, Torsten
Venloer Strasse 1
50672 Koeln, DE
Priority Data
22167452.6
08.04.2022
EP
Application details
| Field | Value |
|---|---|
| Total Number of Claims/PCT | * |
| Number of Independent Claims | * |
| Number of Priorities | * |
| Number of Multi-Dependent Claims | * |
| Number of Drawings | * |
| Pages for Publication | * |
| Number of Pages with Drawings | * |
| Pages of Specification | * |
| International Searching Authority | EPO |
| Applicant's Legal Status | Legal Entity |
| Entry into National Phase under | Chapter I |
| Translation | * |
* The data is based on automatic recognition. Please verify and amend if necessary.
** IP-Coster compiles data from publicly available sources. If this data includes your personal information, you can contact us to request its removal.
Quotation for National Phase entry
| Country | Stages | Total (USD) |
|---|---|---|
| China | Filing | 961 |
| EPO | Filing, Examination | 4562 |
| Japan | Filing | 588 |
| South Korea | Filing | 574 |
| USA | Filing, Examination | 2710 |

Total: 9395 USD
The term for entry into the National Phase has expired. This quotation is for informational purposes only.
Abstract
[English]
) and wherein training comprises training parameters of a depth-wise convolutional layer of the neural network (60), wherein the depth-wise convolutional layer is initialized based on values drawn from a predefined probability distribution, wherein a variance of the probability distribution is characterized by a reciprocal of a square root of a number of filters applied at each depth of an input of the depth-wise convolutional layer.
) et l'apprentissage comprenant des paramètres d'apprentissage d'une couche de convolution en profondeur du réseau neuronal (60), la couche de convolution en profondeur étant initialisée sur la base de valeurs tirées d'une distribution de probabilité prédéfinie, une variance de la distribution de probabilité étant caractérisée par une réciproque d'une racine carrée d'un nombre de filtres appliqués à chaque profondeur d'une entrée de la couche de convolution en profondeur.
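The abstract describes drawing the depth-wise convolutional weights from a distribution whose variance equals the reciprocal of the square root of the number of filters applied at each input depth (i.e. the depth multiplier). A minimal sketch of that rule, assuming a zero-mean normal distribution and the common `(in_channels * depth_multiplier, 1, kH, kW)` weight layout — the function name and shape convention are illustrative, not taken from the patent:

```python
import numpy as np

def init_depthwise_weights(kernel_h, kernel_w, in_channels, depth_multiplier, rng=None):
    """Sketch of the initialization described in the abstract.

    Assumptions (not specified by the abstract): zero-mean normal
    distribution; weight tensor laid out as
    (in_channels * depth_multiplier, 1, kernel_h, kernel_w).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Variance is the reciprocal of the square root of the number of
    # filters applied at each input depth (the depth multiplier).
    variance = 1.0 / np.sqrt(depth_multiplier)
    std = np.sqrt(variance)
    shape = (in_channels * depth_multiplier, 1, kernel_h, kernel_w)
    return rng.normal(loc=0.0, scale=std, size=shape)
```

For a depth multiplier of 1 (one filter per input channel, the most common depth-wise setup), the rule gives unit variance; larger multipliers shrink the variance toward zero.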