File:Double descent in a two-layer neural network (Figure 3a from Rocks et al. 2022).png
From Wikimedia Commons, the free media repository
Original file (3,302 × 1,530 pixels, file size: 396 KB, MIME type: image/png)
Summary
Description | Double descent in a two-layer neural network (Figure 3a from Rocks et al. 2022).png
English: A plot illustrating the double descent phenomenon in deep learning [1] ("test error falls, rises, then falls as a ratio of parameters to data"). From the original publication: "we plot the training error, test error, bias, and variance as a function of [...] for fixed [...] (more data points than input features) [... for a] Random nonlinear features model (two-layer neural network). Analytic solutions for the ensemble-averaged [...] training error (blue squares) and test error (black circles) [...are] plotted as a function of [...] for fixed [...]. Analytic solutions are indicated as dashed lines with numerical results shown as points. [...] a black dashed line marks the boundary between the under- and overparameterized regimes at [...]."
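The curve shown in the figure can be reproduced qualitatively in a few lines. The sketch below is not the authors' code; it is a minimal NumPy illustration of the same model class (a two-layer network whose first layer is frozen random weights, with only the linear readout trained by minimum-norm least squares). The data dimensions, activation, and target function are illustrative assumptions. As the number of random features crosses the number of training points (the interpolation threshold), the averaged test error typically spikes and then falls again in the overparameterized regime:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 40 training points in 10 dimensions.
n_train, n_test, d = 40, 200, 10

def target(X):
    # A simple nonlinear teacher function (an assumption for this demo).
    return np.tanh(X @ np.ones(d) / np.sqrt(d))

X_tr = rng.normal(size=(n_train, d))
X_te = rng.normal(size=(n_test, d))
y_tr, y_te = target(X_tr), target(X_te)

def avg_test_error(n_features, n_draws=20):
    """Mean test MSE over random draws of the frozen first layer."""
    errs = []
    for _ in range(n_draws):
        W = rng.normal(size=(d, n_features)) / np.sqrt(d)   # frozen random layer
        F_tr, F_te = np.tanh(X_tr @ W), np.tanh(X_te @ W)   # random features
        a = np.linalg.pinv(F_tr) @ y_tr                     # min-norm readout fit
        errs.append(np.mean((F_te @ a - y_te) ** 2))
    return float(np.mean(errs))

# Sweep model width across the interpolation threshold (n_features == n_train).
widths = [5, 20, 40, 80, 400]
curve = [avg_test_error(p) for p in widths]
```

Plotting `curve` against `widths` shows the characteristic non-monotonic shape: the readout fit becomes ill-conditioned when the feature matrix is nearly square, which drives the variance spike at the threshold; the minimum-norm solution then regularizes implicitly as the width grows past it.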
Source | Jason W. Rocks and Pankaj Mehta: Memorizing without overfitting: Bias, variance, and interpolation in overparameterized models. Phys. Rev. Research 4, 013201 – Published 15 March 2022. https://doi.org/10.1103/PhysRevResearch.4.013201 |
Author | Jason W. Rocks and Pankaj Mehta |
Licensing
This file is licensed under the Creative Commons Attribution 4.0 International license.
- You are free:
- to share – to copy, distribute and transmit the work
- to remix – to adapt the work
- Under the following conditions:
- attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
File history
Click on a date/time to view the file as it appeared at that time.
Date/Time | Dimensions | User | Comment
---|---|---|---
current: 17:57, 4 June 2023 | 3,302 × 1,530 (396 KB) | HaeB (talk, contribs) | Uploaded a work by Jason W. Rocks and Pankaj Mehta from Phys. Rev. Research 4, 013201 (15 March 2022), https://doi.org/10.1103/PhysRevResearch.4.013201, with UploadWizard
File usage on Commons
There are no pages that use this file.
File usage on other wikis
The following other wikis use this file:
- Usage on en.wikipedia.org