
Favourite Mathematician

Bah. I passed trig with flying colours and then sank like a stone in calc.
 
Maybe so... but you've probably calculated the sin or cos of an angle more recently than you've taken a partial derivative of something in cylindrical coordinates..

hey? can I get a show of hands? lemme hear ya!

mumble mumble...
 
Partial differential equations... *shudders*

OK- favourite mathematician? Honourable mention to Andrew Wiles, for proving Fermat's Last Theorem (even though he certainly didn't find the proof that Fermat claimed to have).

But my vote goes to my Mom. (Seriously- she's got a degree in Mathematics!)
 
Tyrnagog, thanks for the link on the Higgs boson; it made for some fascinating reading. Hey, you sound like the kind of guy who could answer my question. Lagrange multipliers: are they equivalent to finding eigenvalues/eigenvectors? This wasn't covered in my class, but having taken Linear Algebra I thought I saw a connection.

Cheers,

RG
 
oh god... I haven't touched that stuff for a few years now... lemme crack open my advanced engineering maths text to refresh my memory...

..flip..flip..

short answer... I don't believe so... :D

I will just let you know what I found, and maybe we can work this out together...

let A be an n × n matrix of real or complex numbers..

A real or complex number λ is an eigenvalue of A if and only if, for some non-zero n × 1 matrix E,

AE = λE

Any non-zero n × 1 matrix E satisfying AE = λE is an eigenvector associated with (or corresponding to) the eigenvalue λ.
So, eigenvalues and eigenvectors only have to do with the directions along which a matrix acts by pure scaling. The book here says this has ramifications for the solutions of linear algebraic and differential equations.
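If it helps, here's a minimal sketch of that defining property, assuming Python with NumPy; the 2 × 2 matrix A is just a made-up example of mine, not anything from the book:

```python
import numpy as np

# Made-up 2x2 example matrix A (any square matrix works here).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and, as columns, matching eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    E = eigenvectors[:, i]                    # non-zero n x 1 eigenvector
    print(lam, np.allclose(A @ E, lam * E))   # True: AE = (lambda)E holds
```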

As for the Lagrange multipliers... They are used in constrained optimization.. in other words, to find the extrema of f(x1,x2,...,xn) subject to the constraint g(x1,x2,...,xn)=C, where f and g are functions with continuous first partial derivatives on an open set containing the constraint surface, and the gradient of g is non-zero at every point on it.
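Here's a quick sketch of the method in Python with SymPy; the function f = xy and the constraint x + y = 10 are made-up examples of mine, just to show the stationarity conditions grad(f) = λ·grad(g):

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)

f = x * y        # made-up function to optimize
g = x + y - 10   # made-up constraint g(x, y) = C, rewritten as g = 0

# Method of Lagrange multipliers: solve grad(f) = lam * grad(g)
# together with the constraint itself.
equations = [
    sp.diff(f, x) - lam * sp.diff(g, x),
    sp.diff(f, y) - lam * sp.diff(g, y),
    g,
]
print(sp.solve(equations, [x, y, lam]))  # [(5, 5, 5)]: extremum at x = y = 5
```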

so it doesn't seem like the two are related.

I am just doing a quick google on Lagrange multipliers and eigenvalues... let's see what it turns up..


nope... couldn't find anything. So, unfortunately Randall, it doesn't look like finding a Lagrange multiplier is the same as finding the eigenvalues/vectors.

Sorry bud!
 
Tyrnagog, thanks for your explanation. My experience of the method of Lagrange multipliers was always subject to a constraint. For unconstrained optimization of functions of several variables, the second-derivative test uses the Hessian matrix (the matrix of second partials) at points where grad(f)=0, and it is the eigenvalues of this matrix which determine local extrema. Or so I discovered today when searching the net. I've not reached that level yet, so it is still a mystery to me. Thanks, I appreciate your efforts.
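For what it's worth, here's a small sketch of that second-derivative test in Python with NumPy; the function f(x, y) = x² + xy + y², with a critical point (grad(f) = 0) at the origin, is my own made-up example:

```python
import numpy as np

# Hessian of f(x, y) = x**2 + x*y + y**2 at the origin:
# f_xx = 2, f_xy = f_yx = 1, f_yy = 2.
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigs = np.linalg.eigvalsh(H)  # Hessian is symmetric, so eigvalsh applies
print(eigs)                   # [1. 3.]: all positive -> local minimum
# All negative would mean a local maximum; mixed signs, a saddle point.
```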
 