
Talk:Mark and recapture

From Wikipedia, the free encyclopedia
This article is rated Start-class on Wikipedia's content assessment scale.
It is of interest to the following WikiProjects:
  • WikiProject Statistics (Low-importance)
  • WikiProject Ecology (Top-importance)
  • WikiProject Biology (Mid-importance; this article has been marked as needing immediate attention)
  • WikiProject Mathematics (Low-priority)


Other methods


Jolly's method should also be mentioned.

Yes, the Cormack–Jolly–Seber method should be described here. It is of extreme importance. 24.147.119.33 17:51, 20 March 2007 (UTC) gukarma[reply ]

Sources


Some sources, references, links and related topics would be nice... Anyone with knowledge of the subject care to do it?

--Lucas Gallindo 22:52, 12 July 2007 (UTC) [reply ]

Lincoln Index


Please see Talk:Lincoln Index to discuss relation of content here with that of the newer article. Melcombe (talk) 12:20, 2 December 2010 (UTC) [reply ]

Merge discussion


I would like to propose that Tag and release be merged with Mark and recapture, for two main reasons:

  1. Tag and release has very little content
  2. Tag and release is just another mark and recapture method

Jamesmcmahon0 (talk) 12:04, 2 May 2013 (UTC) [reply ]

  • Oppose. Tag and release can include Mark and recapture, but it can also be quite different. For example, small archival tags can be attached to marine animals like fish. These archival tags can be equipped with a camera or sensors that monitor and log things like salinity, temperature, depth, acceleration, and pitch and roll. They are designed to detach at a later date and float to the surface, where some method can be used to retrieve the logged data. This has nothing to do with "marking" or recapture. Tag and release is often linked with catch and release, and is a term widely used, particularly in fisheries and by recreational fishermen. --Epipelagic (talk) 04:21, 3 May 2013 (UTC) [reply ]
Also, an example of an organization that refers to the practice as "tag and release" despite not being related to fishing: http://www.thevlm.org/turtle_tag_release.aspx 173.79.218.246 (talk) 09:36, 29 May 2013 (UTC) [reply ]

Statistical treatment


I was requested here to improve this article. However, my expertise is in Bayesian statistics, and so any contribution of mine on Mark and Recapture might be considered original research.

Assume that K animals out of a population of unknown size N have been marked. Later, n animals are captured, of which k turn out to be marked.

N−K animals are unmarked. N−n animals are uncaptured. n−k captured animals are unmarked. K−k marked animals are uncaptured. N−K−n+k uncaptured animals are unmarked. As all these numbers are non-negative, the following inequalities result: (N ≥ K+n−k) and (n ≥ k) and (K ≥ k) and (k ≥ 0).

Knowing K, n and k, the problem is to estimate N.

Probability distribution


The conditional probability (k|N) of observing k, knowing N (and n and K), is the hypergeometric distribution:

(k|N) = \frac{\binom{K}{k} \binom{N-K}{n-k}}{\binom{N}{n}}
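Numerically, this pmf is straightforward to evaluate with exact binomials; a small Python sketch of my own (the function name is just illustrative):

```python
from math import comb

def pmf_k_given_N(k, N, K, n):
    """P(k | N): hypergeometric chance of k marked animals among the
    n captured, when K of the N animals carry a mark."""
    if k < 0 or k > min(K, n) or N < K + n - k:
        return 0.0  # infeasible by the inequalities above
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Over all feasible k the probabilities sum to one (Vandermonde's identity).
total = sum(pmf_k_given_N(k, 50, 11, 10) for k in range(0, 11))
```

The feasibility guard encodes exactly the inequalities N ≥ K+n−k, K ≥ k and n ≥ k stated above.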

But we were interested in estimating N knowing k.

Credibility distribution


When there is no prior knowledge regarding N and k, the credibility distribution (N|k) is proportional to the likelihood function (k|N).

(N|k) = \frac{(k|N)}{\sum_{N=K+n-k}^{\infty} (k|N)}

Inserting the expression for (k|N) and cancelling the common factor:

(N|k) = \frac{\binom{N-K}{n-k} / \binom{N}{n}}{\sum_{N=K+n-k}^{\infty} \binom{N-K}{n-k} / \binom{N}{n}}

The denominator series is convergent for k ≥ 2.
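As a sanity check, the normalized credibility distribution can be computed by truncating the series at a large N. A Python sketch (my own; N_max is an arbitrary truncation point, safe here because the tail falls off like N^(−k)):

```python
from math import comb

def credibility(K, n, k, N_max=20000):
    """Posterior weights over N, proportional to C(N-K, n-k) / C(N, n),
    truncated at N_max; the infinite series converges for k >= 2."""
    N_min = K + n - k
    w = [comb(N - K, n - k) / comb(N, n) for N in range(N_min, N_max + 1)]
    s = sum(w)
    return N_min, [x / s for x in w]

# K=11 marked, n=10 captured, k=4 recaptured, as in the graphs below
N_min, post = credibility(11, 10, 4)
mode = N_min + post.index(max(post))  # maximum-credibility value of N
```

The mode reproduces the maximum-likelihood value N=27 quoted for k=4 below, since the likelihood and the flat-prior posterior share the same shape.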

Graphs

load 'plot' NB. plotting software 
LF =: 4 : 0 NB. Likelihood Function
'K n k'=:x
(N>:K+n-k)*((n-k)!N-K)%n!N=:i.y
)
g =: [: 'dot; labels 1 0 ; pensize 4' & plot LF
 11 10 0 g 701
 11 10 1 g 501
 11 10 2 g 301
 11 10 3 g 101
 11 10 4 g 71
(Figure captions: Likelihood for Mark and Recapture, K=11, n=10, for k = 0, 1, 2, 3 and 4.)

This J program created the five graphs to the right, showing likelihood functions for the total number of animals, N, for K=11 marked animals and n=10 captured animals, and for k = 0, 1, 2, 3 and 4 recaptured marked animals.

When k=0 we didn't recapture any marked animals, and the observation places no upper bound on how many animals there are. The likelihood function has no maximum, and so the maximum likelihood estimate is infinite.

When k=1 we recaptured a single marked animal, and the likelihood function has a maximum at N=110, but the median is infinite.

When k=2 we recaptured two marked animals. The maximum likelihood is at N=55, and the median is at N=137. A 95% confidence interval is 19≤N≤1367, but the mean value is infinite.

When k=3 we recaptured three marked animals. The maximum likelihood is at N=36, and the median is at N=57. A 95% confidence interval is 18≤N≤236. The mean value is finite but the standard deviation is infinite.

When k=4 the maximum likelihood is at N=27. The median is at N=36. The 95% confidence interval is 17≤N≤ 96.

The frequentist formulas from the article give the estimates N ≈ Kn/k = 27.5 and N ≈ (K+1)(n+1)/(k+1) − 1 = 25.4.
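Those two point estimates are a one-line check in Python (a quick arithmetic verification, nothing more):

```python
K, n, k = 11, 10, 4

# Lincoln-Petersen estimator: Kn/k
lincoln_petersen = K * n / k
# Chapman estimator: (K+1)(n+1)/(k+1) - 1
chapman = (K + 1) * (n + 1) / (k + 1) - 1
```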

 b=.6000 LF~a=.11 10 4
 +/0>:2-/\b
27
 *`%/a
27.5
 *`%/&.:>:a
25.4
 +/0 0.95 0.5<:/~(%{:)+/\b
17 96 36

Bo Jacoby (talk) 06:14, 6 March 2014 (UTC).[reply ]

Order of magnitude and statistical uncertainty


Knowing the credibility distribution function, (N|k), one can compute the order of magnitude, μ, and the statistical uncertainty, σ, of the unknown number N.

N \approx \mu \pm \sigma

where

\mu = \sum_{N=K+n-k}^{\infty} (N|k) \, N
\sigma^2 + \mu^2 = \sum_{N=K+n-k}^{\infty} (N|k) \, N^2

Bo Jacoby (talk) 14:10, 27 February 2014 (UTC).[reply ]


Summation


A closed form for the above sums can be found using Gosper's algorithm. However, WolframAlpha does not immediately do it [1], but the following detour does the trick.

Define the sums

S_Q = \sum_{N=K+n-k}^{\infty} \frac{\binom{N-K}{n-k} \binom{N}{Q}}{\binom{N}{n}} \quad \text{for } Q = 0, 1, 2

so

\mu = \frac{S_1}{S_0}

and

\frac{\sigma^2}{\mu} + \mu - 1 = 2 \, \frac{S_2}{S_1}

The sums are evaluated [2]

\sum_{N=K+n-k}^{m-2} \frac{\binom{N-K}{n-k} \binom{N}{Q}}{\binom{N}{n}} = \frac{\binom{n+K-k}{Q}}{\binom{n+K-k}{n}} A_Q - \frac{\binom{m-1}{Q}}{\binom{m-1}{n}} \binom{m-K-1}{n-k} B_Q

where

A_Q = \,_2F_1(1+K-k,\; 1+n-k;\; 1+K+n-k-Q;\; 1)

and

B_Q = \,_3F_2(1,\; m-K,\; m-n;\; m-Q,\; m-(K+n-k);\; 1)

are generalized hypergeometric functions.

The limiting case for m → ∞ is

S_Q = \frac{\binom{K+n-k}{Q}}{\binom{K+n-k}{n}} A_Q

and so

Q \, \frac{S_Q}{S_{Q-1}} = (K+n-k-Q+1) \, \frac{A_Q}{A_{Q-1}}

Gauss's theorem

_2F_1(a,b;c;1) = \frac{(c-1)!}{(c-a-1)!} \, \frac{(c-a-b-1)!}{(c-b-1)!}

gives the simplification

A_Q = \frac{(K+n-k-Q)!}{(K-Q-1)!} \, \frac{(k-Q-2)!}{(n-Q-1)!}

so

\frac{A_Q}{A_{Q-1}} = \frac{K-Q}{K+n-k-Q+1} \cdot \frac{n-Q}{k-Q-1}

and

Q \, \frac{S_Q}{S_{Q-1}} = \frac{(K-Q)(n-Q)}{k-Q-1}

So μ and σ are given by

\mu = \frac{(K-1)(n-1)}{k-2} \quad \text{for } k \geq 3

and

\frac{\sigma^2}{\mu} + \mu - 1 = \frac{(K-2)(n-2)}{k-3} \quad \text{for } k \geq 4.

and the final result is [3]

N \approx \frac{(K-1)(n-1)}{k-2} \pm \sqrt{\frac{(K-1)(n-1)}{k-2} \cdot \frac{K-k+1}{k-2} \cdot \frac{n-k+1}{k-3}}

Bo Jacoby (talk) 09:51, 25 July 2014 (UTC).[reply ]

Lately I abandoned fraction bars and square root signs, using negative exponents and fractional exponents instead.

N \approx ABC^{-1} \left( 1 \pm (1-A^{-1}C)^{2^{-1}} (1-B^{-1}C)^{2^{-1}} (C-1)^{-2^{-1}} \right)

where A = K−1, B = n−1 and C = k−2.

Bo Jacoby (talk) 20:45, 10 July 2017 (UTC).[reply ]

Example

 ]a=.11 10 4, 11 10 5,: 11 10 6
11 10 4
11 10 5
11 10 6
 MR=.[:({.,.[:%:{.*[:>:-~/)[:(*`%`:3"1)0 1-~/1 1 2-~"1/]
 MR a
 45 35.4965
 30 14.4914
22.5 7.5

This calculation for K = 11 and n = 10 shows that if k = 4 then N ≈ 45 ± 35.5. If k=5 then N ≈ 30 ± 14.5. If k=6 then N ≈ 22.5 ± 7.5.
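The closed-form μ and σ above can be cross-checked in Python against direct summation of the credibility series (my own sketch; N_max truncates the infinite sums, which is harmless here since the tail decays like N^(−k)):

```python
from math import comb, sqrt

def closed_form(K, n, k):
    """mu = (K-1)(n-1)/(k-2); sigma via sigma^2/mu + mu - 1 = (K-2)(n-2)/(k-3).
    Valid for k >= 4 (k >= 3 for mu alone)."""
    mu = (K - 1) * (n - 1) / (k - 2)
    second = (K - 2) * (n - 2) / (k - 3)
    return mu, sqrt(mu * (second - mu + 1))

def by_summation(K, n, k, N_max=100000):
    """Same moments by direct truncated summation of the posterior weights."""
    N_min = K + n - k
    w = [comb(N - K, n - k) / comb(N, n) for N in range(N_min, N_max + 1)]
    s = sum(w)
    mu = sum((N_min + i) * x for i, x in enumerate(w)) / s
    m2 = sum((N_min + i) ** 2 * x for i, x in enumerate(w)) / s
    return mu, sqrt(m2 - mu * mu)

mu4, sig4 = closed_form(11, 10, 4)
mu5, sig5 = closed_form(11, 10, 5)
mu6, sig6 = closed_form(11, 10, 6)
mu_s, sig_s = by_summation(11, 10, 4)
```

This reproduces the table from the J session: 45 ± 35.5, 30 ± 14.5 and 22.5 ± 7.5 for k = 4, 5, 6.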

Bo Jacoby (talk) 20:07, 8 July 2014 (UTC).[reply ]


Thanks for this, I think I mostly followed what you have here; the stuff I've been looking at uses frequentist MLE approximations. Seeing it done from a Bayesian perspective is very interesting though. Have you seen any references, or is it purely your own work? Jamesmcmahon0 (talk) 11:18, 4 March 2014 (UTC) [reply ]

Hi James.
I recently wrote this paper on Mark and Recapture.
https://www.dropbox.com/scl/fi/1ch767qo87qr2qzazsfkl/duction.pdf?rlkey=opm0x4byn6xqg6yoj6mden6j7&dl=0
There is also a text in Open Office format, such that you may play with the spreadsheets.
https://www.dropbox.com/scl/fi/h1kpdzfmjp0qi17dxi6et/duction.odt?rlkey=0iei7sc1df7jdth8sejqo3jpf&dl=0
The method is much simpler than what I did 10 years ago, but the results are not the same.
Now I get: for K = 11 and n = 10 , if k = 4 then N ≈ 35 ± 18. If k=5 then N ≈ 26 ± 10. If k=6 then N ≈ 21 ± 6. I am not sure which result is most wrong.
Bo Jacoby (talk) 06:25, 29 May 2024 (UTC) [reply ]

Reference to 2013 paper in The American Statistician (arXiv) for a Bayesian analysis, which demonstrates the above. — Preceding unsigned comment added by 194.81.223.66 (talk) 13:11, 15 December 2014 (UTC) [reply ]

Tagging may alert predators


Just read this; the linked article will probably have contributions for this article. I don't have time for the foreseeable future, unfortunately. 86.179.58.222 (talk) 18:54, 20 November 2014 (UTC) [reply ]

External links modified

Hello fellow Wikipedians,

I have just modified 2 external links on Mark and recapture. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 15:04, 18 January 2018 (UTC) [reply ]

Confidence Interval


The formula K+n-k+\frac{(K-k+0.5)(n-k+0.5)}{k+0.5}\exp(\pm z_{\alpha/2}\hat{\sigma}_{0.5}) should, I think, be K+n-k-0.5+\frac{(K-k+0.5)(n-k+0.5)}{k+0.5}\exp(\pm z_{\alpha/2}\hat{\sigma}_{0.5}). See (6) in [1]. I have made this change. Nick Mulgan (talk) 01:53, 1 February 2019 (UTC) [reply ]
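For what it's worth, here is a Python sketch of the corrected interval. The variance term σ̂_0.5 is written out as I read it from the article (an assumption on my part, not verified against the paper), so treat this as illustrative only:

```python
from math import exp, sqrt

def transformed_logit_ci(K, n, k, z=1.96):
    """Transformed-logit interval for N with the -0.5 correction discussed
    above. sigma_0.5 follows my reading of the article's definition."""
    a = K - k + 0.5  # marked but not recaptured, continuity-corrected
    b = n - k + 0.5  # captured but unmarked, continuity-corrected
    c = k + 0.5      # recaptured, continuity-corrected
    sigma = sqrt(1 / c + 1 / a + 1 / b + c / (a * b))
    base = K + n - k - 0.5
    point = base + a * b / c
    lo = base + (a * b / c) * exp(-z * sigma)
    hi = base + (a * b / c) * exp(z * sigma)
    return lo, point, hi

lo, point, hi = transformed_logit_ci(11, 10, 4)
```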

Maybe because of this change, but when I work out the example confidence interval I get 23 to 51, not the values shown on the web page. 2601:…:7E:43C0:53E:E0C2:D4C7:8CAC (talk) 15:22, 9 September 2024 (UTC) [reply ]
  1. ^ Sadinle, Mauricio (1 October 2009). "Transformed Logit Confidence Intervals for Small Populations in Single Capture–Recapture Estimation". Communications in Statistics – Simulation and Computation. 38 (9): 1909–1924. doi:10.1080/03610910903168595. ISSN 0361-0918.
