
Conflicts in evaluation with confidence values and IoU value #143

Unanswered
Satishrajc asked this question in Q&A

Frame:

Manually labeled data (ground truth): GT1, GT2
Algorithm detections: Det1, Det2

Now, with an IoU threshold of 50%:

• Det1 has a confidence value of 80
• Det1 has an IoU w.r.t. GT1 of 60%
• Det1 has an IoU w.r.t. GT2 of 80%
• Det2 has a confidence value of 60
• Det2 has an IoU w.r.t. GT1 of 60%
• Det2 has an IoU w.r.t. GT2 of 90%
• In the tool we sort detections by confidence value, so Det1 is considered first and then Det2.
• First we take Det1 and compare it with all the GTs in the frame, in this case GT1 and GT2.
• Comparing Det1 with GT1 and GT2: since the IoU with GT2 is higher, we take it as a TP and flag GT2.
• Next we take Det2 and compute the IoU against all the GTs, i.e. GT1 and GT2.
• Det2 has a higher IoU value for GT2, so we would consider that match, but GT2 is already flagged, hence we IGNORE Det2.
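The matching procedure in the steps above can be sketched roughly as follows. This is a minimal illustration, not the tool's actual code: the function and variable names are my own, and it follows the question's walkthrough in *ignoring* a detection whose best GT is already flagged (many evaluation tools would instead either count it as a false positive or let it fall back to the next-best unflagged GT).

```python
def evaluate(detections, iou, iou_thresh=0.5):
    """detections: list of (name, confidence); iou: dict[(det, gt)] -> IoU in [0, 1]."""
    gts = sorted({gt for (_, gt) in iou})
    flagged = set()   # GTs that have already been matched
    results = {}
    # Sort detections by confidence, highest first (Det1 before Det2 here)
    for det, conf in sorted(detections, key=lambda d: -d[1]):
        # Compare this detection with every GT in the frame, keep the best IoU
        best_gt = max(gts, key=lambda gt: iou[(det, gt)])
        best_iou = iou[(det, best_gt)]
        if best_iou >= iou_thresh and best_gt not in flagged:
            flagged.add(best_gt)
            results[det] = ("TP", best_gt)
        else:
            # Per the walkthrough: the best GT is already flagged, so the
            # detection is ignored (other tools may count it as FP instead)
            results[det] = ("ignored", best_gt)
    return results

dets = [("Det1", 80), ("Det2", 60)]
iou = {("Det1", "GT1"): 0.60, ("Det1", "GT2"): 0.80,
       ("Det2", "GT1"): 0.60, ("Det2", "GT2"): 0.90}
print(evaluate(dets, iou))
# → {'Det1': ('TP', 'GT2'), 'Det2': ('ignored', 'GT2')}
```

With these numbers, Det1 claims GT2 (IoU 80%) and Det2's best match (GT2, IoU 90%) is already taken, which reproduces the situation the question asks about. Note that Det2's IoU with GT1 (60%) is also above the threshold, so a matcher that falls back to the next-best unflagged GT would instead match Det2 to GT1.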

Now, in this case, why is Det2 not considered anywhere?

Replies: 0 comments
