#1

Senior Member
Join Date: 2019-11-21
Posts: 3,006
Reputation: 66
So I wanted to record data on how long my computer takes to perform a calculation. However, when I record the time taken to generate 1,000 uniform random numbers, the measured times tend to cluster around certain values, i.e. the distribution is multimodal. Is there a reason why? Here is my MATLAB code:
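(The original code block did not come through; below is a minimal sketch of what was described above — timing `rand(1000,1)` with tic/toc over many iterations. The variable names and plotting calls are illustrative, not the original code.)

```matlab
% Time how long it takes to generate 1,000 uniform random numbers,
% repeated many times so the distribution of timings can be examined.
nIter = 1e6;            % number of timing samples
t = zeros(nIter, 1);    % preallocate storage for elapsed times

for k = 1:nIter
    tic;
    x = rand(1000, 1);  % the operation being timed
    t(k) = toc;         % elapsed time in seconds
end

% Look at the distribution and the "banding" across iterations
figure; histogram(t);   % multimodal distribution of timings
figure; plot(t, '.');   % banding visible when plotted against iteration
```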
Here's a plot of the computation time for 1 million iterates. You can clearly see the banding. This is not really a time series per se, but it also illustrates the correlation between successive iterations, where the computer must have started on some other task and thus changed the time taken for this particular one. Zooming in near t = 0 further illustrates the banding. I suppose MATLAB's tic/toc might have limited resolution, and that could be a factor? Should I record computation times with another method, or maybe use R instead? Suggestions for another venue for this question, or for ways to improve its clarity, are appreciated as well.
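For reference, one alternative I was thinking of trying within MATLAB itself is `timeit`, which (as I understand it) runs a function handle several times and reports a single median-based estimate rather than one raw tic/toc sample; a minimal sketch:

```matlab
% Alternative timing method: timeit estimates the cost of one call
% by running the function handle several times and returning a
% median-based measurement, which smooths over scheduling noise.
f = @() rand(1000, 1);
singleCallTime = timeit(f)   % estimated seconds per call
```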