@Marc.2377 Uh? What makes you think this doesn’t happen?
It’s impossible to tell for sure because votes are anonymous. But there are answers which are, to a topic expert, clearly wrong, yet have a positive score (with many downvotes, but not enough to offset the upvotes). This is fairly common when an answer got a lot of off-site attention, for example via HNQ. And more rarely there are answers which are well-explained and, to a topic expert, clearly correct, and yet have a negative score (with many upvotes, but even more downvotes).
Security.SE has a famous example of an answer which was clearly upvoted by non-experts (or by people who didn’t read the answer): Jeff Atwood’s answer to what is now (and IIRC has been for a long time) the highest-scoring question on the site. Thomas Pornin’s current answer is clearly (to an expert) the best answer for an expert audience, but his original answer wasn’t nearly as informative: the good stuff really came more than a year later. AviD’s answer is a good “TLDR” answer and was the best early answer. Jeff’s answer is plain wrong (you don’t even need to be an expert to figure it out, but if you aren’t you need to read the answer very closely): it starts out alright, but then it goes completely off the rails (“Point 3 is almost unanswerable and I think personally highly unlikely in practice. I expect …” — no, point 3 is definitely answerable, but then Jeff would have had to forgo the conclusion he wanted to reach…) and comes to a wrong conclusion. Yet, for years, Jeff’s answer scored above Thomas’s. It took a concerted effort by the Security.SE community to publicize this question on the site until Thomas’s answer overtook Jeff’s for second place. It’s clear that Jeff got upvotes for writing well and for writing a long answer, not for writing an answer that made sense.
Of course that’s just a single anecdote. But even one anecdote is enough to invalidate “this simply does not happen”.