This is a static copy of a profile report

Function details for fungrad_minibatch

fungrad_minibatch (Calls: 11, Time: 1547.467 s)
Generated 19-Jun-2021 04:39:10 using performance time.
function in file /nfs/inm_phd/07/d07944009/2021/0618-proj6/simpleNN/MATLAB/cnn/fungrad_minibatch.m

Parents (calling functions)

Function Name   Function Type   Calls
newton          function           11
Lines where the most time was spent

Line Number  Code                                       Calls  Total Time  % Time
19           [net, loss] = lossgrad_subset(...            649  1547.278 s  100.0%
25           grad.dfdW{m} = grad.dfdW{m} + ...           1416     0.095 s    0.0%
26           grad.dfdb{m} = grad.dfdb{m} + ...           1416     0.023 s    0.0%
18           range = (i-1)*bsize + 1 : min(...            649     0.011 s    0.0%
23           if strcmp(task, 'fungrad')                   649     0.009 s    0.0%
All other lines                                                   0.051 s    0.0%
Totals                                                         1547.467 s  100%
Children (called functions)

Function Name    Function Type  Calls  Total Time  % Time
lossgrad_subset  function         649  1547.243 s  100.0%
gpu              function          48     0.005 s    0.0%
Self time (built-ins, overhead, etc.)     0.219 s    0.0%
Totals                                 1547.467 s  100%
Code Analyzer results
No Code Analyzer messages.
Coverage results
Total lines in function:                 40
Non-code lines (comments, blank lines):   8
Code lines (lines that can run):         32
Code lines that did run:                 31
Code lines that did not run:              1
Coverage (did run/can run):          96.88 %
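The coverage percentage is simply the ratio of executed lines to executable lines (31/32); a quick check of the arithmetic:

```python
code_lines = 32  # lines that can run
ran = 31         # lines that did run
coverage = 100 * ran / code_lines
print(f"{coverage:.2f} %")
```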
Function listing
    time    calls  line
                      1  function [net, f, grad] = fungrad_minibatch(prob, param, model, net, task)
                      2
 < 0.001       11     3  if ~any(strcmp(task, {'funonly', 'fungrad'}))
                      4  	error('Unknown task.');
 < 0.001       11     5  end
                      6
 < 0.001       11     7  grad = [];
 < 0.001       11     8  if strcmp(task, 'fungrad')
 < 0.001        6     9  	for m = 1 : model.L
   0.008       24    10  		grad.dfdW{m} = gpu(@zeros,size(model.weight{m}));
   0.005       24    11  		grad.dfdb{m} = gpu(@zeros,size(model.bias{m}));
 < 0.001       24    12  	end
 < 0.001       11    13  end
                     14
 < 0.001       11    15  f = 0;
 < 0.001       11    16  bsize = param.bsize;
   0.001       11    17  for i = 1 : ceil(prob.l/bsize)
   0.011      649    18  	range = (i-1)*bsize + 1 : min(prob.l, i*bsize);
1547.278      649    19  	[net, loss] = lossgrad_subset(prob, model, net, range, task);
                     20
 < 0.001      649    21  	f = f + loss;
                     22
   0.009      649    23  	if strcmp(task, 'fungrad')
   0.002      354    24  		for m = 1 : model.L
   0.095     1416    25  			grad.dfdW{m} = grad.dfdW{m} + net.dlossdW{m};
   0.023     1416    26  			grad.dfdb{m} = grad.dfdb{m} + net.dlossdb{m};
 < 0.001     1416    27  		end
 < 0.001      649    28  	end
   0.003      649    29  end
                     30
                     31  % Obj function value and gradient vector
 < 0.001       11    32  reg = 0.0;
   0.002       11    33  for m = 1 : model.L
   0.006       44    34  	reg = reg + norm(model.weight{m}, 'fro')^2 + norm(model.bias{m})^2;
   0.001       44    35  	if strcmp(task, 'fungrad')
   0.008       24    36  		grad.dfdW{m} = model.weight{m}/param.C + grad.dfdW{m}/prob.l;
   0.004       24    37  		grad.dfdb{m} = model.bias{m}/param.C + grad.dfdb{m}/prob.l;
 < 0.001       44    38  	end
 < 0.001       44    39  end
   0.002       11    40  f = (1.0/(2*param.C))*reg + f/prob.l;

Other subfunctions in this file are not included in this listing.
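The listing accumulates the data loss and gradient over minibatches (lines 17-29) and then adds L2 regularization (lines 32-40), so essentially all of the 1547 s is spent in the lossgrad_subset calls on line 19. A minimal NumPy sketch of the same accumulation pattern, assuming a single weight matrix and a hypothetical loss_grad_subset callback standing in for lossgrad_subset (the bias terms and simpleNN's prob/model/net structures are omitted for brevity):

```python
import numpy as np

def fungrad_minibatch(X, y, W, C, bsize, loss_grad_subset):
    """Accumulate objective value and gradient over minibatches,
    mirroring the MATLAB listing above (single weight matrix only).

    loss_grad_subset(X_sub, y_sub, W) -> (loss, dW) is a hypothetical
    stand-in for simpleNN's lossgrad_subset."""
    l = X.shape[0]                 # number of training instances (prob.l)
    f = 0.0
    grad = np.zeros_like(W)
    for i in range(int(np.ceil(l / bsize))):
        rng = slice(i * bsize, min(l, (i + 1) * bsize))   # cf. line 18
        loss, dW = loss_grad_subset(X[rng], y[rng], W)    # cf. line 19 (dominant cost)
        f += loss                                         # cf. line 21
        grad += dW                                        # cf. line 25
    reg = np.linalg.norm(W, 'fro') ** 2                   # cf. line 34
    grad = W / C + grad / l                               # cf. line 36
    f = reg / (2 * C) + f / l                             # cf. line 40
    return f, grad
```

Because the loss is additive over instances, accumulating per-minibatch losses and gradients gives the same result as one full-batch evaluation, which is why the batch size only affects memory use and per-call overhead, not the computed objective.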