lossgrad_subset (Calls: 649, Time: 1529.172 s)
Generated 19-Jun-2021 04:01:57 using performance time.
Function in file /nfs/inm_phd/07/d07944009/2021/0618-proj6/simpleNN/MATLAB/cnn/lossgrad_subset.m
Parents (calling functions)

| Function Name     | Function Type | Calls |
|-------------------|---------------|-------|
| fungrad_minibatch | function      | 649   |
Lines where the most time was spent

| Line Number     | Code                              | Calls | Total Time | % Time |
|-----------------|-----------------------------------|-------|------------|--------|
| 13              | net = feedforward(prob.data(:,... | 649   | 883.644 s  | 57.8%  |
| 19              | v = JTv(model, net, v);           | 354   | 645.136 s  | 42.2%  |
| 8               | Y = gpu(@zeros, [model.nL, num... | 649   | 0.115 s    | 0.0%   |
| 21              | net.dlossdW{m} = v{m}(:, 1:end... | 1416  | 0.108 s    | 0.0%   |
| 14              | loss = norm(net.Z{L+1} - Y, 'f... | 649   | 0.064 s    | 0.0%   |
| All other lines |                                   |       | 0.105 s    | 0.0%   |
| Totals          |                                   |       | 1529.172 s | 100%   |
Children (called functions)

| Function Name                         | Function Type | Calls | Total Time | % Time |
|---------------------------------------|---------------|-------|------------|--------|
| feedforward                           | function      | 649   | 881.294 s  | 57.6%  |
| JTv                                   | function      | 354   | 645.118 s  | 42.2%  |
| gpu                                   | function      | 649   | 0.082 s    | 0.0%   |
| Self time (built-ins, overhead, etc.) |               |       | 2.678 s    | 0.2%   |
| Totals                                |               |       | 1529.172 s | 100%   |
Code Analyzer results

| Line number | Message |
|-------------|---------|
| 4           | The value assigned to variable 'LC' might be unused. |
| 9           | If you intend to specify expression precedence, use parentheses () instead of brackets []. |
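Both Code Analyzer messages point at cosmetic rather than functional problems, and can be addressed with small edits. A minimal sketch, assuming `LC` really is unused in this function (if it is kept for symmetry with other files, a `%#ok<NASGU>` pragma silences the warning instead):

```matlab
% Line 4: the assignment is never read, so it can simply be removed,
% or annotated to suppress the analyzer message:
LC = model.LC;  %#ok<NASGU>  % kept deliberately; value unused here

% Line 9: parentheses make the colon-range grouping explicit; the
% brackets [0:num_data-1] build the same row vector, so behavior is
% unchanged, only the analyzer warning goes away:
Y(prob.y_mapped(batch_idx) + model.nL*(0:num_data-1)') = 1;
```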
Coverage results

| Metric                                 | Value    |
|----------------------------------------|----------|
| Total lines in function                | 24       |
| Non-code lines (comments, blank lines) | 9        |
| Code lines (lines that can run)        | 15       |
| Code lines that did run                | 15       |
| Code lines that did not run            | 0        |
| Coverage (did run/can run)             | 100.00 % |
Function listing

    time | Calls | line |
         |       |    1 | function [net, loss] = lossgrad_subset(prob, model, net, batch_idx, task)
         |       |    2 |
   0.002 |   649 |    3 | L = model.L;
   0.001 |   649 |    4 | LC = model.LC;
         |       |    5 |
 < 0.001 |   649 |    6 | num_data = length(batch_idx);
         |       |    7 |
   0.115 |   649 |    8 | Y = gpu(@zeros, [model.nL, num_data]);
   0.045 |   649 |    9 | Y(prob.y_mapped(batch_idx) + model.nL*[0:num_data-1]') = 1;
         |       |   10 | %Y = prob.label_mat(:, batch_idx);
         |       |   11 |
         |       |   12 | % fun
 883.644 |   649 |   13 | net = feedforward(prob.data(:, batch_idx), model, net);
   0.064 |   649 |   14 | loss = norm(net.Z{L+1} - Y, 'fro')^2;
         |       |   15 |
   0.010 |   649 |   16 | if strcmp(task, 'fungrad')
         |       |   17 |     % grad
   0.013 |   354 |   18 |     v = 2*(net.Z{L+1} - Y);
 645.136 |   354 |   19 |     v = JTv(model, net, v);
   0.001 |   354 |   20 |     for m = 1 : L
   0.108 |  1416 |   21 |         net.dlossdW{m} = v{m}(:, 1:end-1);
   0.016 |  1416 |   22 |         net.dlossdb{m} = v{m}(:, end);
   0.001 |  1416 |   23 |     end
   0.005 |   649 |   24 | end
Other subfunctions in this file are not included in this listing.
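Line 9 of the listing builds the one-hot label matrix `Y` by linear indexing: since `Y` is `nL x num_data` and MATLAB stores arrays column-major, element `(r, c)` has linear index `r + nL*(c-1)`, so adding `model.nL*[0:num_data-1]'` to the class labels marks exactly one entry per column. A minimal standalone sketch with hypothetical sizes (`nL = 3` classes, 4 samples; labels assumed to lie in `1..nL`, as `prob.y_mapped` appears to guarantee):

```matlab
% Small illustration of the linear-indexing trick on line 9.
nL = 3;                           % number of classes (hypothetical)
y  = [2; 1; 3; 2];                % class label of each sample (hypothetical)
num_data = numel(y);

Y = zeros(nL, num_data);
Y(y + nL*(0:num_data-1)') = 1;    % sets Y(y(c), c) = 1 for each column c
% Y =
%   0 1 0 0
%   1 0 0 1
%   0 0 1 0
```

The same indices could be produced with `sub2ind([nL, num_data], y, (1:num_data)')`, but the explicit offset form avoids the extra function call inside this hot path.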