Humans are terrible at it. There's a short sci-fi story where an AI selects the next American President from all 350M citizens. I'm here for it. I think it's a fantastic idea.
No. In fact, not only no, but <insert expletive-laden but not-acceptable-language-for-HN negative>.
I do not want vibe-coding the law, especially criminal law. I do not want vibe-coding the tax rules. I do not want vibe-coding traffic safety.
And, in fact, we won't be governed by AI even if we nominally are. If we're governed by AI, we're really governed by whoever trained the AI, and/or whoever curated the training data. Do we want to be governed by them? Again, no, with expletives.
Maybe this boils down to a split between people who think AI is on an exponential (self-improving) curve, materially unbounded by physical resources, and people who think it's on a series of sigmoid curves with material physical constraints.
If someone assumes AI will become significantly more capable than humans at reasoning through complexity, then I can empathize with their opinion. I was previously open to this possibility, but in recent years, the better AI gets, the clearer it is to me that it's going to take a lot longer, and that the super-AGI outcome is a lot harder to see.
I'm sure that by the time it could possibly be a feasible and positive option, people will be plenty ready for it... So no need to prepare prematurely.
TLDR: I agree with you, but without the expletives.
The implicit question here is: are we willing to be governed by the people who own the AI? Because that's what this boils down to.
Do people really get to choose who they're governed by, or do they just get shown a few choices that are really false choices?
Yeah exactly, we will never be governed by AI itself.
It would be nice to be governed by any intelligence.
Fixed Title: Are We Ready to Be Governed by Sam Altman?
https://news.ycombinator.com/item?id=46420273
We're not putting this genie back in the bottle, so better get ready if you're not already.
How do you do that? Just submit before they command you without a whisper of doubt?
Honestly, given how terrible humans are at it, I'm down for giving it a try.
Of course it will be every bit as bad as the people who implement it. But that just kinda highlights the core problem.