Hi,
This is my first time here and I hope that I am asking for help in the correct place.
This is the problem I need to figure out:
This is the given set of data:
Over a 1000-second interval:
Duration of event P = 12 seconds
Chance of event P occurring = 11.25%
Period of event F = 2.8325 seconds
In other words: every 2.8325 seconds event F occurs, and 11.25% of the time that F occurs, event P will also occur. Event P then lasts for 12 seconds.
The problem:
On average, how many times will event P have occurred?
What is the average amount of time that event P will be active?
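In case it helps, here is a rough Python sketch of how I picture the process. This is just my own interpretation, not part of the original problem: I'm assuming each F is an independent 11.25% chance, and that when two P events overlap the active time is merged rather than counted twice.

```python
import random

INTERVAL = 1000.0   # total observation window, in seconds
F_PERIOD = 2.8325   # event F fires once every F_PERIOD seconds
P_CHANCE = 0.1125   # probability that a given F triggers event P
P_LENGTH = 12.0     # once triggered, P stays active for 12 seconds


def simulate():
    """One run: count P triggers and total time P is active (overlaps merged)."""
    triggers = 0
    active_time = 0.0
    active_until = 0.0   # time at which the current P activity ends
    t = 0.0
    while t < INTERVAL:
        if random.random() < P_CHANCE:
            triggers += 1
            start = max(t, active_until)        # don't double-count overlap
            end = min(t + P_LENGTH, INTERVAL)   # clip to the 1000 s window
            if end > start:
                active_time += end - start
                active_until = end
        t += F_PERIOD
    return triggers, active_time


# Average the two quantities over many runs
runs = 10_000
results = [simulate() for _ in range(runs)]
print("average P triggers:", sum(r[0] for r in results) / runs)
print("average active time:", sum(r[1] for r in results) / runs)
```

If my reading of the problem is wrong, I'd appreciate a correction on that as well.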
Thank you for your time and help!
Edit: Sorry! I made a typo and wrote K in the problem instead of P.