math question: I have two processes, A and B, which each take time. I can measure A a bunch of times and determine that it takes 1 second on average. I can also measure AB (process A followed by process B) a bunch of times, and let's say that it takes 3 seconds on average. How long does B take? The answer seems like it would be 2 seconds on average.. but let's say that the times for A and B are log-normally distributed, as process times tend to be (I think).. is it still fine to subtract averages?
..let's simulate this..
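a minimal sketch of that simulation, assuming log-normal times with made-up parameters (mean of A = 1s, mean of B = 2s, a shared shape parameter sigma): draw A and B, form AB = A + B, and check whether mean(AB) - mean(A) recovers the mean of B.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumed parameters: for lognormal(mu, sigma) the mean is exp(mu + sigma**2 / 2),
# so solve for mu to get the desired means (A ~ 1s, B ~ 2s).
sigma = 0.5
mu_a = np.log(1.0) - sigma**2 / 2   # mean of A = 1.0 s
mu_b = np.log(2.0) - sigma**2 / 2   # mean of B = 2.0 s

a = rng.lognormal(mu_a, sigma, n)
b = rng.lognormal(mu_b, sigma, n)
ab = a + b  # process A followed by process B

est_b = ab.mean() - a.mean()
print(f"mean(A)  ~= {a.mean():.3f}")
print(f"mean(AB) ~= {ab.mean():.3f}")
print(f"mean(AB) - mean(A) ~= {est_b:.3f}  (true mean of B is 2.0)")
```

this comes out right, and it has to: expectation is linear, so E[AB] = E[A] + E[B] and therefore E[B] = E[AB] - E[A], regardless of the shape of the distributions. (The median or mode of B would NOT subtract out like this.)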
hm.. I thought process times tended to be Poisson distributed.. is Poisson distributed the same as log-normal?.. simulate..
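a quick side-by-side sketch, matching the two distributions on their mean (an arbitrary 3.0 here) and comparing samples and variances:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Match means: Poisson(lam) has mean lam; pick a lognormal with the same mean.
lam = 3.0
sigma = 1.0
mu = np.log(lam) - sigma**2 / 2  # lognormal mean = exp(mu + sigma^2/2) = lam

pois = rng.poisson(lam, n)
logn = rng.lognormal(mu, sigma, n)

# Poisson is a discrete count distribution (non-negative integers);
# lognormal is continuous and strictly positive.
print("Poisson samples:   ", pois[:5])
print("Lognormal samples: ", np.round(logn[:5], 3))

# Even with matched means, the variances differ:
# Poisson variance = lam; lognormal variance = lam^2 * (exp(sigma^2) - 1).
print(f"Poisson   mean={pois.mean():.3f} var={pois.var():.3f}")
print(f"Lognormal mean={logn.mean():.3f} var={logn.var():.3f}")
```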
hm.. it appears not — Poisson is a discrete count distribution (non-negative integers), while log-normal is continuous, and even with matched means their spreads differ.