I want to implement a simulation: there are 1000 objects, and over a period of 1800 seconds each object is selected at random (the action itself doesn't matter). The number of objects selected over time should follow a rough distribution: 30% are selected within the first 60 seconds, 40% after 60 seconds but within 300 seconds, 20% after 300 seconds but within 600 seconds, and 10% after 600 seconds.
So what is the probability of each object being selected in any given second?
- Matt Ball (Dec 29, 2011 at 5:01): This seems like a theoretical question which doesn't actually have anything to do with Java.
- Robot Woods (Dec 29, 2011 at 5:03): Is the distribution within each time section also random, or evenly distributed? For instance, in the 300 < seconds < 600 period you have 200 objects being selected: does that mean one every 1.5 seconds, or could ALL of them be selected in the very first second?
1 Answer
This might be more appropriate for the Programmers section of Stack Exchange: Programmers Exchange
But just taking a quick swipe at this: you select 300 objects in the first 60 seconds (5 per second), 400 objects in the next 240 seconds (about 1.67 per second), 200 objects in the next 300 seconds (about 0.67 per second), and 100 objects in the last 1200 seconds (about 0.083 per second). That gives you the selection rate for each second of your simulation.
So, for example, during the first 60 seconds you select 5 objects per second, which gives a 5/1000 = 0.5% probability of any specific object being selected in each of those seconds.
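Since the question is tagged Java, here is a quick sketch of that arithmetic, assuming the selections are spread evenly within each phase (the class name `SelectionRates` is just for illustration):

```java
public class SelectionRates {
    public static void main(String[] args) {
        int totalObjects = 1000;
        // duration of each phase in seconds, and the share of all
        // objects selected during that phase (from the question)
        int[] durations = {60, 240, 300, 1200};
        double[] shares = {0.30, 0.40, 0.20, 0.10};

        for (int i = 0; i < durations.length; i++) {
            double objectsInPhase = shares[i] * totalObjects;
            double perSecond = objectsInPhase / durations[i];
            // probability that one specific object is picked in a given second
            double probPerObject = perSecond / totalObjects;
            System.out.printf("Phase %d: %.3f objects/s, p = %.5f per object per second%n",
                    i + 1, perSecond, probPerObject);
        }
    }
}
```

For the first phase this prints 5 objects per second and p = 0.005, matching the 0.5% figure above.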
I think that should lead you to the answer if I understand your question correctly.
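And if Robot Woods's follow-up is answered with "evenly distributed within each phase", one way to sketch the whole simulation is to shuffle the objects once and then, each second, select however many are needed to stay on the cumulative schedule. The names `SelectionSimulation` and `targetSelectedBy` are my own, and the linear interpolation within each phase is an assumption:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class SelectionSimulation {
    public static void main(String[] args) {
        int total = 1000;
        // shuffle once so selection order is random; fixed seed for repeatability
        List<Integer> unselected = new ArrayList<>();
        for (int i = 0; i < total; i++) unselected.add(i);
        Collections.shuffle(unselected, new Random(42));

        int selectedSoFar = 0;
        for (int t = 1; t <= 1800; t++) {
            int target = targetSelectedBy(t);
            while (selectedSoFar < target) {
                int obj = unselected.remove(unselected.size() - 1);
                // "select" obj at time t (do whatever the action is here)
                selectedSoFar++;
            }
        }
        System.out.println("selected " + selectedSoFar + " objects");
    }

    // cumulative schedule: 300 by t=60, 700 by t=300, 900 by t=600, 1000 by t=1800,
    // interpolated linearly within each phase
    static int targetSelectedBy(int t) {
        if (t <= 60)  return (int) Math.round(300.0 * t / 60);
        if (t <= 300) return 300 + (int) Math.round(400.0 * (t - 60) / 240);
        if (t <= 600) return 700 + (int) Math.round(200.0 * (t - 300) / 300);
        return 900 + (int) Math.round(100.0 * (t - 600) / 1200);
    }
}
```

Driving the loop from a cumulative target rather than a per-second probability guarantees you hit the 30/40/20/10 split exactly; a purely probabilistic per-second draw would only hit it in expectation.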