Linear Algebra: uses, in mathematics and otherwise?

JeffM

I hope that this question will not be thought captious or obscurantist. My math education was a bit hit or miss because my academic field was history. My sole knowledge of linear algebra comes from courses I took in abstract algebra and in numerical methods. What I gathered from that is that linear algebra provides a succinct notation for systems of linear equations and an efficient method to solve such systems. Is that it? Is there something I am missing that I would learn from studying linear algebra?
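A minimal sketch of what "succinct notation plus an efficient method" looks like in practice, assuming Python with NumPy purely as an illustration (no particular tool is implied by the thread): the whole system is written once as Ax = b and handed to a library routine.

[CODE]
import numpy as np

# The system  2x + 3y = 8,  5x - y = 3  in the matrix form Ax = b.
A = np.array([[2.0,  3.0],
              [5.0, -1.0]])
b = np.array([8.0, 3.0])

# One call (LU factorization under the hood) solves it.
x = np.linalg.solve(A, b)
print(x)                      # [1. 2.]
print(np.allclose(A @ x, b))  # True: the solution checks out
[/CODE]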
 
In addition, I was taught matrix algebra - eigenvalues, eigenvectors, and so on.
 
Thanks Subhotosh.

I think I learned matrix algebra long ago in order to do problems in abstract algebra; matrix multiplication is non-commutative (non-abelian), if I remember correctly. During a career of now 52 years, I never saw anyone use matrix algebra for a practical purpose except to say something in an admittedly very concise notation. (I recognize the benefit of good notation.)
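That recollection is easy to confirm numerically; a tiny demonstration, again assuming Python with NumPy as the illustration language:

[CODE]
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

print(A @ B)   # [[2 1]
               #  [4 3]]
print(B @ A)   # [[3 4]
               #  [1 2]]
# A @ B differs from B @ A: matrix multiplication is not commutative.
[/CODE]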

What is the purpose of an eigenvalue?

I am still in the same boat. Linear algebra seems to be about systems of linear equations, which can already be solved by elementary algebra. In fact, I remember writing computer programs to solve quite extensive systems of linear equations back when I studied numerical methods.

Is there any practical purpose to the field at all? Or is it a branch of pure mathematics still awaiting functional application?
 
In vibration analysis - structural mechanics (lumped mass analysis) - it is used extensively. It is also used in electrical circuit analysis. And some more very practical stuff that I cannot think of right now....
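To make the eigenvalue question concrete: in lumped-mass vibration analysis, the natural frequencies of a structure are the square roots of the eigenvalues of a stiffness/mass problem, and the eigenvectors are the mode shapes. A small sketch of the idea, assuming Python with SciPy and made-up masses and spring constants:

[CODE]
import numpy as np
from scipy.linalg import eigh

# Two masses joined by three springs (wall - m1 - m2 - wall); values invented.
m1, m2 = 2.0, 1.0                   # kg
k1, k2, k3 = 100.0, 50.0, 100.0     # N/m

M = np.diag([m1, m2])               # mass matrix
K = np.array([[k1 + k2, -k2],
              [-k2,      k2 + k3]]) # stiffness matrix

# Generalized eigenvalue problem  K v = w^2 M v.
w2, modes = eigh(K, M)
print(np.sqrt(w2))   # natural frequencies in rad/s
print(modes)         # mode shapes: how the two masses move in each mode
[/CODE]

Each eigenvalue answers the question "at what frequency does this structure naturally vibrate, and in what pattern?", which is part of why eigenvalues show up throughout vibration and circuit analysis.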
 
Subhotosh, I apologize for having inadvertently misled you. I am fundamentally asking what type of mathematical problem can be solved by linear algebra but not by elementary algebra. Is linear algebra merely a very concise notation for representing systems of linear equations, which must eventually be solved by elementary algebra and arithmetic? Most of reality is not linear, and, though solving linear equations may involve a lot of computation, it is quite mechanical. Thus, solving systems of linear equations seems to present nothing that was not addressed in theory during first-year algebra and nothing that can apply to non-linear functions. So I really wonder whether, at my age, devoting any time to learning linear algebra makes sense.
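That "quite mechanical" procedure is exactly what gets automated. A bare-bones Gaussian elimination with partial pivoting, written in Python purely as an illustration, fits in a couple of dozen lines:

[CODE]
import numpy as np

def solve_linear_system(A, b):
    """Gaussian elimination with partial pivoting, then back-substitution."""
    A = np.array(A, dtype=float)   # work on copies
    b = np.array(b, dtype=float)
    n = len(b)
    for k in range(n - 1):
        # Pivot: bring the largest entry in column k up to the diagonal.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]] = A[[p, k]]
        b[[k, p]] = b[[p, k]]
        # Eliminate column k below the pivot.
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back-substitution.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = [[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]]
b = [8.0, -11.0, -3.0]
print(solve_linear_system(A, b))   # [ 2.  3. -1.]
[/CODE]

Library routines do essentially this, plus blocking and refinements for speed and numerical stability; much of a linear algebra course is about why such procedures work, when they break down, and how to interpret the results.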
 
I was introduced to linear algebra as an undergraduate. I barely passed the course. At some point, the concepts became too abstract. Yet the takeaway for me was that linear algebra offers short-cuts which augment regular algebra (by reducing the workload). Linear algebra can find solutions to certain problems that cannot be solved in a useful way otherwise. In other cases, linear algebra is the only practical way to a solution -- not because other methods don't exist, but because computers can't handle the task in a reasonable amount of time using regular algebra.

I remember a statement (paraphrased): The importance of linear algebra is directly proportional to increases in computing power.

A large airline uses the solution to a linear system to guide certain management decisions. Such a system may contain over 10,000 equations and variables. Without linear algebra, the fastest computers available to the airline might require 17 hours to solve the system (likely too late to be of use), whereas linear algebra provides a means to get the solution in 23 minutes. If the airline pays a third party for the CPU time, then linear algebra has saved them money in two ways: less CPU-time expense and timely management decisions. (All numbers in this example are fictional.)
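The scale in that (fictional) example is well within reach of standard library routines; a rough way to see it, assuming Python with NumPy and a random dense system standing in for the airline's real one:

[CODE]
import time
import numpy as np

n = 2_000                       # stand-in size; real systems may be 10,000+
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

t0 = time.perf_counter()
x = np.linalg.solve(A, b)       # LU factorization via LAPACK
print(f"{n} x {n} system solved in {time.perf_counter() - t0:.2f} s")
print("residual norm:", np.linalg.norm(A @ x - b))
[/CODE]

A dense solve costs on the order of n^3 arithmetic operations, so the choice of algorithm and how it exploits the structure of the matrix matters at least as much as raw hardware speed.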

Also, there are real matrices today that are too big to fit in a computer's high-speed memory. Linear algebra provides ways to partition such matrices, so that current hardware and software can handle the original matrix (two or three partitions at a time). I found a reference to such a system, albeit from 1992. The original problem contained 837 equations involving over 12,750,000 variables. After the matrix was partitioned into 51 block columns, each containing about 250,000 individual columns, a Cray supercomputer solved the problem in four minutes. Nearly 10 million of the more than 10 billion entries in the original matrix were non-zero. Without linear algebra, the solution could not have been found (in 1992, at least). 8-)
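The other half of that example, a huge matrix with relatively few non-zero entries, is what sparse-matrix storage exploits: only the non-zero entries are kept. A small sketch, assuming Python with SciPy and a synthetic matrix (nothing here reproduces the 1992 system):

[CODE]
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import cg

n = 100_000                    # a dense n x n matrix of floats would need ~80 GB
rng = np.random.default_rng(0)

# Keep only the non-zero entries, roughly 0.005% of the matrix.
A = sparse.random(n, n, density=5e-5, format="csr", random_state=rng)
# Symmetrize and shift the diagonal so this artificial system is (almost
# surely) positive definite and friendly to an iterative solver.
A = (A + A.T + 20.0 * sparse.identity(n)).tocsr()
b = rng.standard_normal(n)

x, info = cg(A, b)             # conjugate-gradient iterative solve
print("stored non-zeros:", A.nnz, "of", n * n, "entries")
print("converged:", info == 0, " residual norm:", np.linalg.norm(A @ x - b))
[/CODE]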
 
Mark

Thanks, that's helpful. A technique for making tractable a class of computationally massive problems. I can see why that is valuable, but I am rather sure it is not relevant to my interests.
 
Jeff, I received the following from Quora today. The mention of linear algebra brought this thread to mind. You may find something interesting in it, or not. :cool:


Question: Will there ever be another great discoverer in maths or physics, i.e. Gauss, Maxwell, Newton, or Einstein?

Answered (07-31-2017) by Steven Lehar, Independent researcher with radical ideas; recently upvoted by Dale Gray.

Yes, it's already happened; it's just that nobody knows about it. I’m talking about William Kingdon Clifford, inventor of Clifford Algebra, the most extraordinary advance in mathematics that (virtually) nobody has ever heard of!

It is a simplification / reformulation of vector / matrix / linear algebra that suddenly makes them more intuitive, and it merges seamlessly with the rotational multiplication of complex numbers, providing a single scheme for both rotational and scaling operations. And there are many formulae that are far simpler when expressed in Clifford Algebra. For example, in Clifford Algebra the four Maxwell's Equations of electromagnetism reduce to a single equation; the rest is implicit in the math!

But the most extraordinary thing about Clifford Algebra, a.k.a. Geometric Algebra, is that it reveals that all of algebra is really a branch of geometry, and thus that mathematics is geometrical in nature, fundamentally.

And that in turn suggests a spatial computational principle in the brain that we experience as mathematics.

https://slehar.wordpress.com/2014/03/18/clifford-algebra-a-visual-introduction/
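For what it is worth, the "single equation" the answer alludes to is usually written, in the spacetime form of Clifford algebra and in suitable units, as

\nabla F = J

where F is a single bivector field combining the electric and magnetic fields, J is the spacetime current, and \nabla is the vector derivative; separating the equation by grade recovers the four familiar Maxwell equations.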
 