Figure 1: Bayes' Power for Explaining In-Context Learning Generalizations
The model is trained only on step functions (left), yet it learns to make smooth predictions (right), just like the true posterior for the step-function prior.
Reference
Bayes' Power for Explaining In-Context Learning Generalizations - Scientific Figure on ResearchGate. Available from: https://www.researchgate.net/figure/The-model-is-only-trained-on-step-functions-left-still-it-learns-to-make-smooth_fig1_384597999 [accessed 24 Apr 2025]
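The caption's claim follows from how Bayesian averaging works: the posterior-mean prediction under a step-function prior is a weighted average of many candidate step functions, and that average transitions gradually even though every individual hypothesis is a hard step. Below is a minimal NumPy sketch of this effect; the uniform prior over the step location, the Gaussian noise level, and all variable names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's setup): a uniform prior
# over the step location, Gaussian observation noise, and a small set of
# observed points. Averaging step functions under the posterior yields a
# smooth posterior-mean prediction.

rng = np.random.default_rng(0)

def step(x, loc, lo=0.0, hi=1.0):
    """A step function that jumps from `lo` to `hi` at location `loc`."""
    return np.where(x < loc, lo, hi)

# Candidate step locations (prior support) and a test grid for predictions.
locs = np.linspace(0.0, 1.0, 501)      # uniform prior over the step location
x_test = np.linspace(0.0, 1.0, 200)

# A few noisy observations from one true step function with its step at 0.5.
x_obs = np.array([0.1, 0.3, 0.7, 0.9])
sigma = 0.1
y_obs = step(x_obs, loc=0.5) + sigma * rng.normal(size=x_obs.shape)

# Gaussian log-likelihood of the observations under each candidate location.
resid = y_obs[None, :] - step(x_obs[None, :], locs[:, None])
log_lik = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
post = np.exp(log_lik - log_lik.max())
post /= post.sum()                      # posterior over step locations

# Posterior-mean prediction: a posterior-weighted average of hard steps,
# which is smooth even though every hypothesis is a step function.
post_mean = post @ step(x_test[None, :], locs[:, None])
print(post_mean[:5], post_mean[-5:])
```

Because the data leave the exact step location uncertain, the posterior spreads mass over many nearby locations, and the resulting mean prediction ramps smoothly through the uncertain region rather than jumping.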