This is a static copy of a profile report

Function details for lossgrad_subset
lossgrad_subset (Calls: 649, Time: 1529.172 s)
Generated 19-Jun-2021 04:01:57 using performance time.
function in file /nfs/inm_phd/07/d07944009/2021/0618-proj6/simpleNN/MATLAB/cnn/lossgrad_subset.m

Parents (calling functions)

Function Name        Function Type   Calls
fungrad_minibatch    function        649
Lines where the most time was spent

Line   Code                                 Calls   Total Time   % Time
13     net = feedforward(prob.data(:,...    649     883.644 s    57.8%
19     v = JTv(model, net, v);              354     645.136 s    42.2%
8      Y = gpu(@zeros, [model.nL, num...    649     0.115 s      0.0%
21     net.dlossdW{m} = v{m}(:, 1:end...    1416    0.108 s      0.0%
14     loss = norm(net.Z{L+1} - Y, 'f...    649     0.064 s      0.0%
All other lines                                     0.105 s      0.0%
Totals                                              1529.172 s   100%
Children (called functions)

Function Name   Function Type   Calls   Total Time   % Time
feedforward     function        649     881.294 s    57.6%
JTv             function        354     645.118 s    42.2%
gpu             function        649     0.082 s      0.0%
Self time (built-ins, overhead, etc.)           2.678 s      0.2%
Totals                                          1529.172 s   100%
Code Analyzer results
Line   Message
4      The value assigned to variable 'LC' might be unused.
9      If you intend to specify expression precedence, use parentheses () instead of brackets [].
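The bracket warning refers to line 9 of the listing, which builds a one-hot label matrix by adding a per-column offset to the class indices and assigning through MATLAB's column-major linear indexing. A minimal NumPy sketch of the same construction (function name is hypothetical; 0-based indices here, versus MATLAB's 1-based):

```python
import numpy as np

def one_hot(y_mapped, n_classes):
    # Mirrors Y(prob.y_mapped(batch_idx) + model.nL*[0:num_data-1]') = 1.
    # MATLAB computes column-major linear indices (class + n_classes*column);
    # NumPy can index the row and column arrays directly instead.
    num_data = len(y_mapped)
    Y = np.zeros((n_classes, num_data))
    Y[np.asarray(y_mapped), np.arange(num_data)] = 1.0
    return Y
```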
Coverage results
Total lines in function                24
Non-code lines (comments, blank lines) 9
Code lines (lines that can run)        15
Code lines that did run                15
Code lines that did not run            0
Coverage (did run/can run)             100.00 %
Function listing

   time    calls   line
                      1  function [net, loss] = lossgrad_subset(prob, model, net, batch_idx, task)
                      2
  0.002      649      3  L = model.L;
  0.001      649      4  LC = model.LC;
                      5
< 0.001      649      6  num_data = length(batch_idx);
                      7
  0.115      649      8  Y = gpu(@zeros, [model.nL, num_data]);
  0.045      649      9  Y(prob.y_mapped(batch_idx) + model.nL*[0:num_data-1]') = 1;
                     10  %Y = prob.label_mat(:, batch_idx);
                     11
                     12  % fun
883.644      649     13  net = feedforward(prob.data(:, batch_idx), model, net);
  0.064      649     14  loss = norm(net.Z{L+1} - Y, 'fro')^2;
                     15
  0.010      649     16  if strcmp(task, 'fungrad')
                     17      % grad
  0.013      354     18      v = 2*(net.Z{L+1} - Y);
645.136      354     19      v = JTv(model, net, v);
  0.001      354     20      for m = 1 : L
  0.108     1416     21          net.dlossdW{m} = v{m}(:, 1:end-1);
  0.016     1416     22          net.dlossdb{m} = v{m}(:, end);
  0.001     1416     23      end
  0.005      649     24  end

Other subfunctions in this file are not included in this listing.
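For reference, the pieces of the listing outside the two hot calls (feedforward and JTv) are the squared Frobenius-norm loss on line 14, its gradient seed on line 18, and the weight/bias split on lines 21-22. A minimal NumPy sketch of those steps (function names are hypothetical; feedforward and JTv themselves are not reproduced here):

```python
import numpy as np

def squared_loss_and_top_grad(Z, Y):
    # Lines 14 and 18 of the listing:
    #   loss = norm(Z - Y, 'fro')^2   and   v = 2*(Z - Y),
    # where v seeds the Jacobian-transpose product computed by JTv.
    diff = Z - Y
    loss = float(np.sum(diff * diff))
    v = 2.0 * diff
    return loss, v

def split_weight_bias(v_m):
    # Lines 21-22: in each per-layer block, the last column is the
    # bias gradient and the remaining columns are the weight gradient.
    return v_m[:, :-1], v_m[:, -1]
```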