2019-11-26, 08:42   #1
poster
Senior Member

Join Date: 2019-11-21
Posts: 3,006

Why is computation time clustered?

I wanted to record how long my computer takes to perform a calculation. However, when I record the time taken to generate 1,000 uniform random numbers, the measurements tend to be clustered around certain values, i.e. the distribution is multimodal.



Is there a reason why?



Here is my MATLAB code:



N = 10^6;                 % number of timing samples
x = zeros(1, N);          % preallocate storage for the elapsed times
for n = 1:N
    tic;
    for k = 1:1000
        rand;             % generate one uniform random number
    end
    x(n) = toc;           % elapsed time for this batch of 1000 calls
end
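To see the banding without a scatter plot, one could also histogram the recorded times. This is just a minimal sketch assuming the vector x from the code above (histogram requires R2014b or newer; older releases would use hist):

hist_edges = 200;                            % number of bins (arbitrary choice)
histogram(x*1e6, hist_edges);                % times converted to microseconds
xlabel('time per 1000 rand calls (\mus)');   % each sample times 1000 calls
ylabel('count');                             % distinct peaks = the observed bands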


Here's a plot of the computation time for 1 million iterates. You can clearly see the banding. It is not really a time series per se, but it also shows correlation between successive iterations, presumably where the computer started on some other task and changed the time this particular task took. Zooming in near t = 0 reveals further banding. I suppose MATLAB's tic/toc might have limited resolution, and that could be a factor. Should I try recording computation times with another method, or maybe use R instead?
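One way to check the resolution hypothesis before switching to R would be to estimate the granularity of tic/toc directly and compare against timeit, which runs a function handle many times and returns a robust estimate. A rough sketch, where rand(1,1000) is a vectorized stand-in for the inner loop above:

% Estimate the effective granularity of tic/toc by timing an essentially
% empty operation and taking the smallest nonzero reading.
M = 1e5;
t = zeros(1, M);
for n = 1:M
    tic;
    t(n) = toc;
end
granularity = min(t(t > 0));      % rough lower bound on timer resolution

% timeit calls the handle repeatedly and averages, smoothing over
% scheduler noise; rand(1,1000) stands in for 1000 separate rand calls.
f = @() rand(1, 1000);
avgTime = timeit(f);
fprintf('tic/toc granularity ~ %.3g s, timeit estimate %.3g s\n', ...
        granularity, avgTime);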





Suggestions for another venue for this question, or for ways to improve its clarity, are appreciated as well.




