oh god... I haven't touched that stuff for a few years now... lemme crack open my adv. engineering maths text to refresh my memory...
..flip..flip..
short answer... I don't believe so...
I will just let you know what I found, and maybe we can work this out together...
let A be an n x n matrix of real or complex numbers..
A real or complex number (lambda) is an eigenvalue of A if and only if, for some non-zero n x 1 matrix E,
AE = (lambda)E
Any non-zero n x 1 matrix E satisfying AE = (lambda)E is an eigenvector associated with (or corresponding to) the eigenvalue (lambda)
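That defining property AE = (lambda)E is easy to check numerically. Here's a quick sketch with a toy 2 x 2 matrix of my own choosing (not from the book), using numpy's eigenvalue routine:

```python
import numpy as np

# A toy symmetric 2x2 matrix (my own example, not from the text)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# COLUMNS are the corresponding eigenvectors
lams, vecs = np.linalg.eig(A)

# Verify the defining property A E = lambda E for each pair
for lam, E in zip(lams, vecs.T):
    assert np.allclose(A @ E, lam * E)

print(lams)  # the eigenvalues of A (1 and 3, in some order)
```

For this particular A the eigenvalues work out to 3 and 1, and each column of `vecs` satisfies the equation above.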
So, eigenvalues and eigenvectors are about the directions a matrix only scales without rotating. The book here says they have ramifications for the solutions of linear algebraic and differential equations.
As for the Lagrange multipliers... They are used in constrained optimization... in other words, to find the extrema of f(x1,x2,...,xn) subject to the constraint g(x1,x2,...,xn)=C, where f and g are functions with continuous first partial derivatives on an open set containing the constraint surface, and the gradient of g is non-zero at every point on that surface.
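To make that concrete, here's a little worked example of my own (not from the book): extremize f = x*y subject to x + y = 4. The Lagrange condition is grad f = (lambda) grad g together with the constraint, which sympy can solve symbolically:

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)

# Toy problem (my own choice): extremize f = x*y subject to g: x + y = 4
f = x * y
g = x + y - 4

# Lagrange condition: each partial of f equals lam times the
# matching partial of g, plus the constraint itself
eqs = [sp.diff(f, x) - lam * sp.diff(g, x),   # y - lam = 0
       sp.diff(f, y) - lam * sp.diff(g, y),   # x - lam = 0
       g]                                     # x + y - 4 = 0
sol = sp.solve(eqs, [x, y, lam], dict=True)

print(sol)  # x = 2, y = 2, lam = 2, so the extremum is f = 4
```

The multiplier lam = 2 falls out alongside the critical point (2, 2); lam is just the scale factor between the two gradients there.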
so it doesn't seem like the two are related.
I am just doing a quick google on lagrange multipliers and eigenvalues... let's see what it turns up..
nope... couldn't find anything. So, unfortunately Randall, it doesn't look like finding a Lagrange multiplier is the same as finding the eigenvalues/vectors.
Sorry bud!