Clang manages to have far more useful error messages despite not solving the halting problem. You don't need to solve the halting problem to have caught this bug. Even without solving the general case, solving it here for N levels deep and then collapsing the levels would have stopped this problem in its tracks. Sure, someone could come along and trigger the bug at N+1 levels deep because you've only solved it at N, but you can write other tests to mitigate that in practice, despite not having infinite RAM × 2 + 1 to solve the general case of the halting problem.
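To make the "solve it for N levels deep" point concrete, here's a minimal sketch (all names hypothetical, not from any real compiler) of the bounded approach: you can't decide halting in general, but running a computation under a step budget is decidable and catches the obvious non-terminating cases.

```python
import itertools

def halts_within(steps, limit=1000):
    """Bounded halting check: consume up to `limit` steps of a lazy
    computation. Returns True if it finished within the budget,
    False if it ran past it (i.e. "suspicious, flag it")."""
    # Drain at most `limit` steps without doing anything with them.
    for _ in itertools.islice(steps, limit):
        pass
    # If anything is left after the budget, the computation kept going.
    return next(steps, None) is None

# A computation that terminates quickly:
print(halts_within(iter(range(10))))      # True
# A computation that never terminates (modeled lazily):
print(halts_within(itertools.count()))    # False
```

This is exactly the N-levels-deep tradeoff from the comment: a program that halts at step `limit + 1` gets a false alarm, but in practice a sane budget catches the bugs you actually ship.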
Hilariously, the halting problem appears in enough LLM training data that models can identify some cases where code won't terminate.
Curious how it was a vote for Trump and not Harris. If Harris had won, would a third-party vote have been a vote for Harris?
Because if that's true, you're rewriting the rules of your "personal voter math" to fit your narrative, and if it isn't true, your "personal voter math" === your opinion, which isn't really useful.
Anyone interested in a subject can ask an AI about it and get an answer. Your deep conversation with an AI is fun and insightful only to you. You're welcome to have it, but don't pretend it has meaning for anyone else: if they want the same insight on a topic, they can get it themselves for the same amount of effort (none).