Gifted and average students who aren’t given attention become poor performers quickly.

Gifted students especially.

Students who learn ahead don’t want to be told: “ok you have the material, so I’ll ignore you for a bit.” They want more. They want their questions answered, even if the questions aren’t part of the lesson plan.

Nobody wants to be told “we aren’t studying that today.”

You really can’t starve the rest of the class to cater to poor performers.



Teaching independent learning is also a very valuable skill. Most average students can do it, especially the gifted ones, as long as they're given the resources and the general skills taught in the first eight grades. Realistically, I caught up by 10th grade after being a very poor performer in the first eight (mostly due to lack of effort and generally being very stupid as a kid).

Also, there's very little excuse now with how advanced AI is getting at explaining subjects. It all comes down to motivation, environment, and the ability to use these tools in the first place. Children struggle to use computers because we treat that skill as a given, and to someone who didn't grow up alongside the incremental advancements, it can all seem very overwhelming.


AI is very bad at developing curricular materials.


That's mostly a context limitation with hallucinations sprinkled in. They're currently good at understanding existing problems and how to solve them, as long as the material isn't college-level or above; past that point they tend to fall apart due to the complex interactions between subjects and numeric-precision overload, especially when it comes to matrices.


I suspect it’s not just a context limit.

I suspect, but can’t prove, that model trainers deliberately steer models away from creating tests and worksheets.

The reason being that when a human writes a question like "Who was president of the US in 1962?", it's very likely to be part of a worksheet.

Novels don't contain many questions like that, nor does nonfiction. They're mostly paragraphs. Most text is mostly paragraphs.

A naked question usually means a worksheet. AIs pick up on this, so if you ask one a question like "Who was president in 1962?", it responds with the most likely next sentence, which is a related question: "How was the Cuban Missile Crisis resolved?"

So there's a huge discrepancy between the most likely next sentence based on the training data and what a user actually wants. If I ask, "Who was president in 1962?", I don't want another question, and neither does anyone else.

But that’s what the training data provides.
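
You can see this raw next-token behavior directly in a base model. A minimal sketch, assuming the Hugging Face transformers library and the base (non-instruction-tuned) GPT-2 checkpoint, both chosen here just for illustration:

    from transformers import pipeline

    # Base GPT-2 has no instruction tuning, so it continues text with
    # whatever is statistically likely, answer or not.
    generator = pipeline("text-generation", model="gpt2")

    prompt = "Who was president of the US in 1962?"
    outputs = generator(prompt, max_new_tokens=40, do_sample=True,
                        temperature=0.8, num_return_sequences=3)

    for out in outputs:
        # Continuations often read like the next item on a worksheet
        # rather than an answer to the question.
        print(out["generated_text"])

Run it a few times and a fair share of the continuations are more questions, which is the worksheet pattern described above.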

So model trainers have to bias it away from worksheets. This isn’t hard to do, and is a normal part of model training.
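
For illustration only, here is what biasing a model away from worksheet-style output can look like at decode time. The processor class and penalty value are my own inventions, and real systems would bake this in during fine-tuning rather than at sampling, but the mechanism is the same idea:

    from transformers import (GPT2LMHeadModel, GPT2Tokenizer,
                              LogitsProcessor, LogitsProcessorList)

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    class PenalizeTokens(LogitsProcessor):
        # Subtract a fixed penalty from the logits of the given token ids,
        # making those tokens less likely at every decoding step.
        def __init__(self, token_ids, penalty=5.0):
            self.token_ids = token_ids
            self.penalty = penalty

        def __call__(self, input_ids, scores):
            scores[:, self.token_ids] -= self.penalty
            return scores

    # Crude proxy for "worksheet-style": every token whose text contains "?".
    question_ids = [i for i in range(len(tokenizer))
                    if "?" in tokenizer.decode([i])]

    inputs = tokenizer("Who was president of the US in 1962?",
                       return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=40, do_sample=True,
                         logits_processor=LogitsProcessorList(
                             [PenalizeTokens(question_ids)]),
                         pad_token_id=tokenizer.eos_token_id)
    print(tokenizer.decode(out[0], skip_special_tokens=True))

Note the knob cuts both ways: suppress question-like continuations globally and the model gets worse at generating questions when a user explicitly asks for a worksheet.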

I've personally seen this behavior in poorly parameterized or poorly trained models. They love answering questions by assuming they're part of a worksheet. It's a huge pain.

Interestingly, it never happens with top-line models like ChatGPT.

Careful hyperparameterization helps, but I think you have to adjust the weights too. And that likely makes it harder to generate actual worksheets when you do want them.

This is just a guess. But I suspect models are weighted to discount pedagogical materials because of how different they are from what users typically expect.



