Currently, when sorting through the available answers to any question (mine or not) on a topic where I'm far from an expert, it can be VERY difficult to separate the wheat from the chaff using up-votes, which are heavily biased towards earlier (even if not better) answers.
Another helpful metric is of course rep - but as noted many times before, rep as it currently stands is often misleading for mid-range users (1k-7k): it heavily penalizes experts in less-popular fields and people who give very few but unusually good answers, while favoring people who ask and answer many newbie-level questions. The latter is not a slander against such people, merely a note that this situation makes it hard to separate merely prolific posters from genuinely competent ones.
A suggestion to help separate out more-competent people is to display the user's "accept rate" - e.g., I'd be more willing to trust someone whose answers have so far been marked "accepted" by 100% of the askers, even if their rep is not in the 10,000s.
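To make the proposed metric concrete, here is a minimal sketch of how a per-tag accept rate could be computed. The data shape (a flat list of `(tag, accepted)` pairs per user) and the function name are my own assumptions for illustration, not the actual SO schema:

```python
# Hypothetical sketch: per-tag "accept rate" for one user.
# Input shape (tag, accepted) is an assumption, not the real SO data model.
from collections import defaultdict

def accept_rate_by_tag(answers):
    """answers: iterable of (tag, accepted: bool) pairs for a single user.
    Returns {tag: fraction of that user's answers on the tag that were accepted}."""
    totals = defaultdict(int)
    accepted = defaultdict(int)
    for tag, was_accepted in answers:
        totals[tag] += 1
        if was_accepted:
            accepted[tag] += 1
    return {tag: accepted[tag] / totals[tag] for tag in totals}

# Example: 2 of 3 "python" answers accepted, 0 of 1 "perl" answers accepted
rates = accept_rate_by_tag([
    ("python", True), ("python", True), ("python", False),
    ("perl", False),
])
```

A rate like `rates["python"] == 2/3` could then be shown next to the user's rep on their answers to questions in that tag.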
NOTE: There was a similar discussion on Meta previously (and here's one more), but it centered on the usefulness of such statistics to the user themselves, as opposed to others who will peruse their answers.
EXAMPLE 1: For a question I asked yesterday, the very best answer was actually up-voted the least (it is last in the list of answers), and was made by a user with the second-lowest rep of all the answerers. BUT his ratio of accepted answers for my tags of interest is VERY high. And the second-best answer, while voted up the most, came from the user with the LOWEST rep - but he also had a decent enough accepted-answer ratio that I'm inclined to trust his info despite the low rep. Yet if I had not known enough about the topic in question to discriminate by content, I'd likely have been more trusting of one of the other two answers, which were very much not the best ones but came from higher-total-rep users.
EXAMPLE 2: My own answer to a question where I didn't have nearly enough expertise - an answer that was very sub-optimal and much worse than the ones given later by others - was not only voted far above the other answers, but actually accepted by the questioner. Go figure. (My answer wasn't bad enough to delete, and I did edit it to improve the info for SO consumption, but still, you get my point.) Yet I had a zero accepted score on that particular tag, as opposed to the other answerers who came after me but were not voted up enough.
As both these examples illustrate, the acceptance rate of someone's answers for specific tags could be a very useful tool when sifting through answers.
What does everyone think? Is it useful? Should it be added as a feature?