Abstract
For piecewise linear functions $f : \mathbb{R}^n \to \mathbb{R}$ we show how their abs-linear representation can be extended to yield simultaneously their decomposition into a convex part $\check{f}$ and a concave part $\hat{f}$, including a pair of generalized gradients $\check{g} \in \mathbb{R}^n \ni \hat{g}$. The latter satisfy strict chain rules and can be computed in the reverse mode of algorithmic differentiation, at a small multiple of the cost of evaluating $f$ itself. It is shown how $\check{f}$ and $\hat{f}$ can be expressed as a single maximum and a single minimum of affine functions, respectively. The two subgradients $\check{g}$ and $-\hat{g}$
are then used to drive DCA algorithms, where the (convex) inner problem can be solved in finitely many steps, e.g., by a Simplex variant or the true steepest descent method. Using a reflection technique to update the gradients of the concave part, one can ensure finite convergence to a local minimizer of f, provided the Linear Independence Kink Qualification holds. For piecewise smooth objectives the approach can be used as an inner method for successive piecewise linearization.
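
To make the decomposition concrete, a minimal illustration of the polyhedral DC form asserted above is sketched below; the index bounds $k$, $l$ and the coefficients $a_i, \alpha_i, b_j, \beta_j$ are illustrative notation rather than the paper's own:
\[
f(x) \;=\; \check{f}(x) + \hat{f}(x),
\qquad
\check{f}(x) \;=\; \max_{1 \le i \le k} \bigl( a_i^\top x + \alpha_i \bigr),
\qquad
\hat{f}(x) \;=\; \min_{1 \le j \le l} \bigl( b_j^\top x + \beta_j \bigr),
\]
so that $\check{g} \in \partial \check{f}(x)$ can be chosen as the gradient $a_{i^*}$ of an affine piece attaining the maximum, and analogously $\hat{g} = b_{j^*}$ for a piece attaining the minimum.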
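
The DCA iteration driven by these subgradients can then be stated in one line; this is the generic DCA update for a decomposition $f = \check{f} + \hat{f}$, a sketch rather than the paper's exact variant with the reflection technique:
\[
x_{k+1} \;\in\; \operatorname*{arg\,min}_{x \in \mathbb{R}^n}\;
\check{f}(x) \;+\; \hat{f}(x_k) \;+\; \hat{g}_k^\top (x - x_k),
\qquad \hat{g}_k \text{ a generalized gradient of } \hat{f} \text{ at } x_k.
\]
Since $\hat{f}$ is concave, its linearization majorizes it, so each inner problem is a convex, piecewise linear majorant of $f$ that is tight at $x_k$ and, as stated above, can be minimized in finitely many steps.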