New sensitivity analysis theory for parametric NLPs is presented; generalized derivative information is obtained for solutions of NLPs exhibiting active set changes. Two distinct approaches are considered. First, a nonsmooth implicit function theorem is applied to a nonsmooth reformulation of the NLP's KKT system to yield sensitivity information under the LICQ and SSOSC regularity conditions. This approach furnishes a nonsmooth sensitivity system that admits the primal and dual sensitivities as its unique solution and recovers the classical results of Fiacco and McCormick in the absence of active set changes. Second, multiparametric programming theory is applied to furnish limiting Jacobian elements of the primal variable solution. Here, the required regularity conditions are relaxed to MFCQ, CRCQ, and GSSOSC: the set of multipliers is then nonempty and bounded, while linear dependence of the active constraint gradients is permitted.
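The phenomenon at issue can be illustrated with a minimal example that is not from the source: for the parametric NLP min_x x² s.t. x ≥ p, the constraint is inactive for p < 0 and active for p > 0, so the primal solution map x*(p) = max(p, 0) is nonsmooth at p = 0, where the active set changes. The sketch below approximates the one-sided derivatives numerically and collects the resulting limiting Jacobian (B-subdifferential) elements, which here form the set {0, 1}.

```python
# Illustrative sketch (not the source's method): the parametric NLP
#   min_x  x^2   s.t.  x >= p
# has closed-form primal solution x*(p) = max(p, 0). The active set
# changes at p = 0, so x*(p) is nonsmooth there and the classical
# Fiacco-McCormick sensitivity derivative does not exist at p = 0.

def solution(p: float) -> float:
    """Closed-form primal solution x*(p) of min x^2 s.t. x >= p."""
    return max(p, 0.0)

def forward_derivative(p: float, h: float = 1e-7) -> float:
    """Forward finite-difference approximation of dx*/dp at p."""
    return (solution(p + h) - solution(p)) / h

# Away from p = 0 the solution map is smooth and classical sensitivity
# theory applies: derivative 0 where the constraint is inactive,
# derivative 1 where it is active.
left = forward_derivative(-1.0)
right = forward_derivative(1.0)

# At p = 0 the limiting Jacobian (B-subdifferential) collects the limits
# of derivatives taken at nearby points of differentiability.
limiting_jacobian_elements = {round(forward_derivative(-1e-3)),
                              round(forward_derivative(1e-3))}
print(left, right, limiting_jacobian_elements)
```

Either element of {0, 1} is a valid generalized derivative in the sense used above, and supplying any one such element to a dedicated nonsmooth Newton-type method is what makes these objects computationally relevant.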
Both methods express generalized derivative information in the form of elements of the limiting Jacobian (also called the B-subdifferential), which are computationally relevant objects in the sense that they imply attractive convergence properties for dedicated nonsmooth numerical methods. Consequently, the new theory is amenable to tractable numerical implementation. This work is placed in the context of state-of-the-art sensitivity analysis theory for parametric NLPs and is motivated by nonlinear model predictive control problems.