Anti-derivatives: Human ingenuity vs. computer algorithms

rogerstein · New member · Joined Apr 13, 2008 · Messages: 16
Is it worthwhile for mere human beings to struggle to find anti-derivatives, using methods that are as much an art as a science, when Mathematica can do it so quickly, so effortlessly--but using methods that, even for simple integrals, often involve millions of calculations (according to their own web site), and thus bear no kinship to what we humans do when we integrate? If Mathematica and its cousins, by however alien a technique, ultimately produced anti-derivatives identical to the human creations, or at least equally simple ones, then perhaps only mathematical purists would deem continued human pursuit of anti-derivatives worthwhile. But do they?
Recently at this site, someone posted a request for help integrating Sqrt(1+sin(x)). Galactus, working from scratch (i.e., unaided by "special case" knowledge), produced this economical result: -2*Sqrt(1-sin(x)). In examining his solution (by all means check it out), if you're a newcomer to calculus like me, your synapses will start firing erratically when first confronted with this terse little tidbit:
Let u=1+sin(x)
dx = du/Sqrt(2u - u^2)
dx equals what??? Has galactus taken leave of his senses? Actually, no. A little effort and imagination reveals the intermediate steps, and makes you appreciate the cleverness of galactus's path to a solution.
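For fellow newcomers, here is my reconstruction of those intermediate steps (taking \(\cos x \ge 0\), as on \((-\frac{\pi}{2}, \frac{\pi}{2})\)):

\(\displaystyle u = 1 + \sin x \quad\Rightarrow\quad du = \cos x\,dx, \qquad \cos x = \sqrt{1-\sin^2 x} = \sqrt{(2-u)\,u} = \sqrt{2u - u^2}\)

\(\displaystyle \text{so }\;dx = \frac{du}{\sqrt{2u-u^2}}\;\text{ and }\;\int\sqrt{1+\sin x}\,dx \;=\;\int\frac{\sqrt{u}\,du}{\sqrt{u}\,\sqrt{2-u}} \;=\;\int\frac{du}{\sqrt{2-u}} \;=\;-2\sqrt{2-u} + C \;=\;-2\sqrt{1-\sin x} + C\)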
On the other hand, Mathematica's online integrator (which uses Mathematica's actual integrating tool though it limits itself to those functions which can be integrated in a few seconds) uses algorithms, not ingenuity, and produced this fine gargoyle:

2*(sin(x/2)-cos(x/2))*Sqrt(sin(x)+)/(cos(x/2)+sin(x/2))

It's Sasquatch compared to galactus's cute little baby of an anti-derivative!!

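(For the skeptical: both expressions, ungainly or not, really do differentiate back to Sqrt(1+sin(x)). Here's a little Python check--my own sketch, with the obviously missing 1 in Mathematica's output restored, and x kept in (-pi/2, pi/2), where both formulas are valid:)

```python
import math

def f(x):
    """The integrand, sqrt(1 + sin(x))."""
    return math.sqrt(1 + math.sin(x))

def human(x):
    """galactus's compact anti-derivative."""
    return -2 * math.sqrt(1 - math.sin(x))

def machine(x):
    """Mathematica's bulky form (with the missing 1 restored)."""
    s, c = math.sin(x / 2), math.cos(x / 2)
    return 2 * (s - c) * math.sqrt(math.sin(x) + 1) / (c + s)

def deriv(F, x, h=1e-6):
    """Central-difference numerical derivative."""
    return (F(x + h) - F(x - h)) / (2 * h)

# On (-pi/2, pi/2), both candidates differentiate back to f.
for x in (-1.0, 0.0, 0.5, 1.2):
    assert abs(deriv(human, x) - f(x)) < 1e-6
    assert abs(deriv(machine, x) - f(x)) < 1e-6
```

So the two anti-derivatives agree in substance; the complaint here is purely about their relative size and transparency.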
So, if special relativity's consequences had required integration to derive (they didn't), and Einstein had used a worm-holed-back-to-1905 Mathematica to do it, and Mathematica had produced its apparently fairly typical ungainly result and e=mc^2 had been a 20-term monstrosity, would Einstein still have become Time Magazine's "Man of the Century" or just a slightly famous eccentric scientist with crazy hair?
 
Wow, thanks for the accolades. :oops:

That particular integral you mention has something about it that scares the computers. My TI-92 will not give an indefinite closed form at all. It just spits back what you entered. Yet, it isn't that bad. I don't know why the computers struggle with it so.
 

Yes indeed . . .
Well done, galactus!

I had my own approach to it . . . (is anyone surprised?)


\(\displaystyle \text{We have: }\;\sqrt{1 + \sin x} \;=\;\sqrt{\frac{1 + \sin x}{1}\cdot \frac{1-\sin x}{1-\sin x}} \;=\;\sqrt{\frac{1-\sin^2\!x}{1-\sin x}} \;=\;\sqrt{\frac{\cos^2\!x}{1-\sin x}}\)

\(\displaystyle \text{The integral becomes: }\;\int\frac{\cos x\,dx}{\sqrt{1-\sin x}}\)

\(\displaystyle \text{Let }\,u \:=\:1-\sin x\quad\Rightarrow\quad du \:=\:-\cos x\,dx\)

\(\displaystyle \text{Substitute: }\;\int \frac{-du}{u^{\frac{1}{2}}} \;=\;-\int u^{-\frac{1}{2}}du \;=\;-2u^{\frac{1}{2}} + C\)

\(\displaystyle \text{Back-substitute: }\;-2\sqrt{1-\sin x} + C\quad\ldots\) ta-DAA!



 
and my approach (similar to Maple - I suppose)

\(\displaystyle \text{We have: }\;\sqrt{1 + \sin x} \;=\;\sqrt{\sin^2 \frac{x}{2} + \cos^2 \frac{x}{2} +2\cdot \sin \frac{x}{2}\cdot \cos\frac{x}{2}}\; = \sin \frac{x}{2} + \cos\frac{x}{2}\)

\(\displaystyle \text{The integral becomes: }\;\int(\sin \frac{x}{2} + \cos\frac{x}{2})\,dx\)

\(\displaystyle = 2\cdot( \sin \frac{x}{2} \, - \,\cos\frac{x}{2}) \, + \, C\)
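Note that this agrees with galactus's \(-2\sqrt{1-\sin x}\) wherever \(\cos\frac{x}{2} \ge \sin\frac{x}{2}\), since

\(\displaystyle 1 - \sin x \;=\; \left(\cos\tfrac{x}{2} - \sin\tfrac{x}{2}\right)^2 \quad\Rightarrow\quad -2\sqrt{1-\sin x} \;=\; 2\cdot\left(\sin\tfrac{x}{2} - \cos\tfrac{x}{2}\right)\)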

I think "rogerstein" misquoted maple.
 
I think he was using the online Integrator which is powered by Mathematica.

I checked it and it does indeed give this form:
 

Attachments

  • MSP22417803395311089206_1025.gif (3.7 KB)
I see - he missed "1" in the (sin(x) + 1) term - thus I was confused.
 
First, to Subhotosh Khan: I noticed my typo immediately upon reading over my post after submission, but since my point only concerned the gargantuan size of Mathematica's output, not its content, I felt I needn't bother correcting it. What kind of fool am I?! In lay circles, a typo or two makes you a "regular guy". In mathematical circles--a pariah!! So, I heartily apologize for leading you astray, Subhotosh (if I may call you that). Incidentally, with your correction, Mathematica's Atrocity is now one digit more atrocious!
Now to the main purpose of this post. Probably surprisingly, I deliberately omitted from my first post the reason that actually impelled me to write about this subject! I did so because to have included it would have rendered it simply too long and complex. That first post raised essentially aesthetic objections to Mathematica's Monstrosities--contrasting one of their typical lumbering mammoths with an archetypal human creation, sleek and elegant like a Hitchcock blonde. And while mathematicians (and scientists too) justifiably greatly prize beauty and elegance in their equations--to the point of feeling that if they're not beautiful they must be wrong--in this case I have a more significant concern. I believe it's possible (I won't state it more strongly) that these Mathematica Anti-derivatives of Ill Repute will eventually have a detectable corrosive effect on math and, even more so, science. An absurd contention on the surface (and perhaps at its core, too!) but I want to present a brief argument and then get any feedback this intelligent and informed mathematical community might wish to provide.
First, I'm making two assumptions, both of which appear reasonable. 1) Due to the nature of the method Mathematica and its equivalents use to integrate, vastly-larger-than-human-produced anti-derivatives are routine, not anomalous, even with the simplifying algorithms I know they employ. 2) Mathematicians and scientists have been and will be increasingly reliant on Mathematica et al., so that over time there will be a loss of "institutional memory" of how to integrate very complicated functions "the human way". The heuristics, the intuitions, the artistry that develops only through frequent practice from early on will fade away with the current generation's passing--the rookies coming up will have supposedly more urgent and marketable mathematical skills to develop. Take a quick, savoring glance, while you still can, at the three different human (i.e. inventive) approaches of galactus, soroban, and subhotosh as each deftly harpooned the elusive integral that would so easily have wriggled away from lesser men--those lesser men will largely dominate our very near mathematical future. Skeptical that such precipitous erosion of a whole community's skills is possible? When the proposal to return to the moon was made a few years ago, a NASA scientist was asked why this second series of missions would take as long to consummate as the first, and he said that all the little unwritten tricks of the trip-to-the-moon trade had been institutionally forgotten and would have to be painstakingly relearned. Of course, there will always be a special few who almost mystically perceive the route to an anti-derivative, but most math and almost all day-to-day science will be performed by Saint Stephen of Mathematica's disciples.
So if these two assumptions are valid, the result will be that mathematicians and scientists will be "stuck" with Mathematica's Ungodly Octopi for functions that COULD have been integrated to compact anti-derivatives by humans. So how might this harm both math and science (especially the latter), as I so implausibly claim? In two ways.
1) Paul Dirac, creator of an equation as famous among physicists as it is unknown to the general public, said of his creation: "My equation is smarter than I am." He didn't elaborate, but I believe what he meant was that sometimes scientists ASSEMBLE equations--perfectly understanding all the concepts involved and the relationships among them, the scientists symbolically represent what they already know, and poof!, there's the equation. But other times, scientists DERIVE equations mathematically, having only the most tenuous grasp of what's really going on physically, and not a glimmer of where it will end up mathematically. These mysterious new equations hold secrets about nature that can be pried loose for the scientists' edification--THESE equations are smarter than they are. (And sometimes the prying takes decades--Alan Guth is said to have discovered Big Bang Inflation lurking deep in the bowels of General Relativity about 60 years after its debut.) But the discerning of these hidden truths may well depend on the transparency of the equations, on their being as simple as possible a rendering of the relationship among the components--not exactly a description of an equation with the perhaps 20 extra terms of a Mathematica Monstrosity. The smartest equation in the world would have difficulty communicating with scientists through that!!
2) The second way Mathematica's Anti-derivatives Not Even Wolfram's Mother Could Love might do harm concerns those situations where you integrate the anti-derivative a second time, as from acceleration to velocity and then to distance. Or, alternatively, when you have what I'll call amnesic anti-derivatives, ones living in a new location, as just another algebraic term in a new larger equation, with no memory of their past life as an anti-derivative, and you must integrate that new equation. In either of these cases (and I'm sure there are others), Mathematica's Sprawling Swamps may well make the task impossible when it would be eminently doable if you had a neat human anti-derivative--everyone reading this has observed how quickly a little complexity renders a function impossible to integrate in closed form, so that numerical procedures are necessary. And while numerical procedures are fine for solving a particular problem, they deny you the opportunity of creating and interrogating a new equation that's smarter than you are, and so the advance of science is again thwarted. Now, will the truly paradigm-doing-somersaults-not-merely-shifting science be affected by this? Will Ed Witten of the Institute for Advanced Study fail to perfect String Theory for want of a compact anti-derivative? No. But there will be countless instances when little insights in everyday science are missed because final integrations couldn't be achieved or equations spoke but couldn't be heard, muffled by Mathematica's ruffians. Meanwhile, the mathematicians who could have created the necessary anti-derivatives will be busy testing non-trivial zeros of Bernie's zeta function or God knows what.
By the way, regarding this second line of argument, I freely acknowledge my calculus is not yet sophisticated enough to do more than tentatively raise a few of many possibilities--that's why I'm eager to see responses from the knowledgeable, frequent posters, all you Sovereigns of the Infinitesimal Realm, who can affirm or pitilessly savage my argument, as its validity dictates.
In fact, I welcome comments from ANYONE who sees fit to comment. No one who visits a site like this would be unworthy of being heard.
 
Hey rogerstein. First let me say I enjoy reading your eloquent posts, but it might help to space out such a long one.

While it's possible that technology such as Mathematica might not help people learn the material "properly" (or develop intuition), it is still a very useful tool. Without such advanced tools, science might progress more slowly, left solely to the human brain. In this light, computers save time and are a blessing. That said, there will always be a need for both pure and applied mathematicians to drive the creativity and research that make this technology possible. Not all theorists will use this technology either. In the books I've read, even though I've barely touched on advanced topics, the authors are (or at least seemed to be) more concerned about solving general problems, with the occasional concrete construction. Most of the assignments are not calculations, so such tools won't affect how pure mathematicians learn in a negative way.

Also, there are ways in which "easy" mathematical concepts, when written as "complicated" equations and definitions, might be beneficial. On more than one occasion I've heard my professors say something of the form, "We'll redefine this in that way and you'll see why later" (of course also showing by proof that the definitions agree). I think I've seen at least 4 or 5 definitions of the number e (depending on the subject and use).

I remember when I started learning a particular subject, there were very awkward and unintuitive ways in which things were defined. I found out later that they were defined this way to make the harder things in the field easier to work with. So, the simplest solution is not always the best, and/or your simple solution may be more complicated than mine in my eyes. There might be a good reason why Mathematica uses a particular algorithm to solve something, but who knows. If the solution is correct, that's still good, right? Way before these Integrators were around, there were integral tables... large books full of solved integrals. Maybe the way an integral was solved in the tables is not a pretty thing and has a "simpler" (that is, "simpler" for me) solution. Same idea.

Mathematica, as an example, is a very powerful tool, and in what use I've made of it (not much at all, really) it has usually given "simple enough" answers to work with. Sometimes, however, the results are not fun to look at. If anything, the programmers are very intelligent, and I'm sure they will work to make it as simple as possible to use. If or when things become too complicated there will be a sort of revolution, I'm sure, or at least there will be a niche available and room for real competition between mathematics software developers.
 
First, thanks daon for the kind words, and you are absolutely right about the spacing. As I gaze now (in horror!) at my post, I'm uncomfortably reminded of the dense and forbidding pages of a Henry James novel. (Although I hope I was a little spicier. And at least I avoided the paragraph-long sentences.)

Re your point about Mathematica's being a very useful tool despite its drawbacks, you're completely correct. And never for a picosecond (by the way, have you noticed how the non-mathematical/scientific world has appropriated our lingo? When Lindsay Lohan (FOR HEAVEN'S SAKE!) says nanosecond, I think self-respect demands we move on to pico, femto, or beyond!)--never for a zeptosecond--have I thought that Mathematica IN GENERAL is a bad thing--only bad in the way it might hinder THEORETICAL scientific progress, as opposed to solving concrete, work-a-day problems. For the latter, computer software, from the TI-89 to Mathematica, is great. But for maximum THEORETICAL progress, you need NEW equations, and CLEAR, TRANSPARENT equations, and the path to both of these is strewn with the obstacles that are the needless extra terms of a Mathematica anti-derivative.


You're also correct that the human way of doing anti-derivatives is still being taught. And always will be, I'm sure. But how deeply? And when very deeply, to how many very deeply? Oh, if only the universe were filled with functions that could be integrated with the few deft strokes that everyone will always learn. But the instant true complexity arises (which happens constantly in real life), you're drawn into an asphyxiating maelstrom from which only those studying and practicing integration techniques for a lifetime will emerge clutching an anti-derivative--and even they will be soaking wet and gasping! And so, Mathematica promotes loss of integration skills, and loss of integration skills promotes Mathematica, among the typical scientists in Baltimore and Butte, including those doing theoretical work, with perhaps the dire consequences I suggest.


(Notice the lavish spacing? Who says I'm uneducable!!) You also reasonably point out that there will always be pure mathematicians who will be as adept at integrating as Euler. I once heard a just-below-major Hollywood star say that he's always amused when people come up to him and ask, "What is Madonna (or George Clooney, etc.) really like?" as if all celebrities are intimately acquainted with one another. He said he has to laugh at the idea that he could just give Meryl Streep a buzz, or invite de Niro to dinner. He said, "I have no more access to them than you do!" (meaning the fans asking him). And similarly, those Euler-caliber integrators in Gottingen would be as inaccessible to the typical scientist doing lower level theoretical work as Julia Roberts is to Pauly Shore.


You are dramatically correct when you say that complexity often is necessary in definitions. Was it Einstein who said (I'm acting now on the principle that when in doubt about the author of an exceptionally pithy statement, if you cite Einstein, Churchill, or Mark Twain, depending on the subject, you'll probably be right), "Make things as simple as possible, but not simpler"? So yes, definitions can seem irritatingly complex, and only later on do we see the justification for all those annoying, seemingly superfluous terms. But do you think this extends to equations? Ockham's Razor, universally embraced, says no, but you'd be correct if you said that's just an intuitive rule of thumb, not a proven principle. Its kernel, which is that if you have two models that account equally well for your observations, choose the simpler, has not been PROVEN to yield greater and faster progress, to spur additional insights at a quicker pace than choosing the more complex model would, but daon, do you think any sane scientist would defy that deeply intuitive notion? Yet if the Mathematica Borg continues to extend its domain, they soon won't have a choice.


And there remains this point: In my long post, where I advance the main thrust of my argument, I suggest that not only will equations of greater (and unnecessary) complexity result from Mathematica's Elephant Man Anti-derivatives, but that there will also be Equations That Could Have Been But Never Were, because of the inability to integrate obscenely large anti-derivatives a second time. It's on this very mathematical point (and any other, of course) that I would love to see responses from the dazzling Calculus Mavens who frequent this site, but who seem to be reluctant to offer their views. To me, you are like virtuoso pianists who are content to help awkward novices with the fingering of difficult chromatic passages, but who don't voice their insights into the deeper structure and themes of the music. I want, we all want, your insights, gentlemen (and ladies)! Even if it means my argument gets pummeled into submission as a result.
 
Hey Roger:

There is a contributor to this site who contends that integration by hand and partial fractions are an obsolete art, due to the fact that we have technology to perform the integration. The trick is setting up the integral, which is something a computer cannot do.....yet. If for no other reason, I look at finding antiderivatives as solving a puzzle, even if a computer can solve them in a few seconds. It keeps one sharp. For instance, find the integral of:

\(\displaystyle \int_{0}^{1}(1-x^{4})^{\frac{1}{4}}dx\)

This cannot be done easily by elementary means (substitution, parts, etc.), but using the beta function it isn't bad. When it is run through Maple or Mathematica, they give a horrendous answer in the form of hypergeometric series, which are tremendously complicated.
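For anyone curious, here is one way the beta-function route can go (my sketch; galactus may have done it differently): substituting t = x^4 turns the definite integral into (1/4)*B(1/4, 5/4), which a few lines of Python confirm numerically:

```python
import math

# Closed form via the beta function: with t = x**4, dx = (1/4) t**(-3/4) dt,
#   integral_0^1 (1 - x**4)**(1/4) dx = (1/4) * B(1/4, 5/4)
#                                     = gamma(1/4) * gamma(5/4) / (4 * gamma(3/2)).
closed = math.gamma(0.25) * math.gamma(1.25) / (4 * math.gamma(1.5))

# Crude midpoint-rule check of the same definite integral.
n = 200_000
numeric = sum((1 - ((k + 0.5) / n) ** 4) ** 0.25 for k in range(n)) / n

assert abs(closed - numeric) < 1e-5
print(round(closed, 6))  # about 0.927037
```

The closed form is compact; Mathematica's hypergeometric answer, whatever its internal merits, buries this same number under far more machinery.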

I see your point and tend to agree. At least to a point. I must confess, once I set up an integral I solve it using my calculators depending on what it is. A solid of revolution, for instance. The trick is setting up the integral. Once that is done, by all means, use tech.

By the way, I like the way you write. That must be your forte.
 
Hiya galactus,

It's a pleasure and an honor (by jiminy, did I say an honor? I meant an HONOR!) to have the inimitable galactus saunter into my saloon, sheriff's badge flashing, fingers lightly touching his TI-92 in its holster, ready for any integrals up to no good.

Now, there IS the little matter of your saying, "I see your point and tend to agree. At least to a point." My unfortunate natural tendency (which I'm trying to curb, I swear it!) would be to begin preparing to apply the rhetorical equivalent of all the holds I learned watching wrestling on TV, including some illegal ones, to force out of you something more whole-hearted than that.

But actually, I think we're already in complete concord because what you then add as though it's a minor disagreement is really something I completely endorse, i.e. you said you always use a calculator after setting up the integral--what you didn't state explicitly is key here, namely that in these cases you're simply trying to solve a practical problem, not do theoretical science, and it's only in the theoretical science circumstance that I feel it is harmful. I don't care how integration is done in all other situations.

But here's WHY I do object when scientists, seeking theoretical advances, even tiny ones--like an equation modeling the relationship between temperature, humidity, and the pitch of frog-croaking in the Everglades--rely (or are forced to rely) on Mathematica for their integration. Actually, let's just look at my second reason (in my long post where I lay out the basic argument, it's the second) so we can keep it right up your line--purely mathematical. Take the integral you present in your post and integrate it the two ways you describe, with the results you describe: the human way produces a reasonable anti-derivative, but Mathematica produces one that, if you said it out loud in front of your grandmother, she'd wash your mouth out with soap. Now, suppose the function you just integrated was originally some kind of acceleration that became a velocity, and to do your THEORETICAL work you HAD TO HAVE an equation for distance, necessitating a second integration. Isn't it likely that you'd succeed with your human-created anti-derivative but fail with the Mathematica one, and with that failure, your research progress would halt (which is a stipulation of this question)? You wouldn't disagree with that, would you? And if you wouldn't, then really we're in total harmony, so long as you also share my view that if you take the emphasis on integration skills in the 1940 calculus community--when you needed anti-derivatives not just for theoretical-advance equations but for ALL quick evaluations of integrals, since computers weren't yet available for numerical integration, and this great need caused a massive focus on integration skills--and compare this with the relatively modest emphasis on integration skills today, the difference is stark and getting starker.
Which means that some interesting but not terribly important theoretical research, say at Northern Alabama Tech if there were such a place, will fail because all AVAILABLE mathematicians (NOT all those alive in the world but all AVAILABLE to this low level theoretician) will fail to have the skill to be able to produce that very-hard-to-get-but-still-gettable anti-derivative. Hence no new equation, and hence a failure to make a THEORETICAL advance.

So I think we really are in perfect accord, which is a very happy occurrence since now I won't have to use the debating equivalent of a certain little maneuver that should have gotten Ric "Nature Boy" Flair seven to ten of hard time at Folsom.

Oh, by the way, galactus, I've been meaning to mention this for a while: Frankly, I don't care WHAT my fellow earthlings may think, NO I REALLY DON'T!!-- I for one am very glad you survived the Cosmic Egg.

And one very brief, serious, non-mathematical aside to ANYONE who happens to read this: If life at the moment is just too much (or painfully, not enough) and you wish to be mystically transported for a few minutes, consider going to YouTube and typing in the search terms "Mary Black Heart Like A Wheel". Even if your body is encased in 50 pounds of bitterness, cynicism, disgust, and misanthropy (which, in the city I come from, would be regarded as a light load), the exquisite beauty of this song, and this performance, will pierce right through to your soul.
 
To rogerstein and galactus:

Gentlemen: There's a chance that this issue can be definitively resolved, and quickly!

rogerstein, you proposed a thought experiment in your last post. But there's no reason it has to be a mere thought experiment--galactus is in a position to easily carry it out!!

In galactus's last post he described what happened when he integrated (1-x^4)^.25 the human way, using the beta function (whatever that is), which apparently produced something manageable. And then he put it through Mathematica, which produced what rogerstein described as "if you said it out loud in front of your grandmother, she'd wash your mouth out with soap."

rogerstein then hypothesized (I'm condensing it, forgive me rogerstein) that if you tried re-integrating each, you'd probably succeed with the human-created one but fail with the Mathematica product, and that would prove his argument.

Well, galactus just has to take that beta-function anti-derivative, representing human ingenuity, and see if it can be integrated, by any means (it doesn't have to be by human means, you can just put it through Mathematica galactus). And then of course take that wild hypergeometric Mathematica-produced anti-derivative and try to integrate THAT a second time through Mathematica. Here's a breakdown of the 4 possible outcomes and their meanings.

1)The beta-function, human-ingenuity-produced anti-derivative will be re-integrated but the Mathematica one won't be, proving rogerstein's thesis. Congratulations, rogerstein.

2)Both will be re-integrated. Issue unresolved.

3)Neither will be re-integrated. Issue unresolved.

4)(And you won't like this rogerstein) The Mathematica hypergeometric giant will be re-integrated but the beta-function human one will fail to be re-integrated, suggesting that the OPPOSITE of rogerstein's hypothesis is true and Mathematica is actually superior to human ingenuity.

This is interesting--a scientific experiment carried out solely by mathematical means. rogerstein has a theory and galactus goes into his mathematical laboratory to test it.

They say that in the 19th century people waited anxiously on the docks for the ship bringing the next installment of the newest serialized Dickens novel. Well, those of us who have been following this discussion are waiting anxiously on the docks for galactus to steam in with his experimental results. Godspeed, galactus!
 