
By the 1500s, trigonometry was getting pretty well established. And in the mid-1500s, it was pretty crucial to Copernicus's big book De Revolutionibus. Which, starting a long-continued tradition, effectively had an appendix about mathematical functions. Lots of the trigonometry there is just like today. There are a few noticeable differences, though. Like a lot of use of the versine. Anyone know what that is?



It's actually 1 - Cos[x]—a sine "versed" through 90 degrees. And you'll find it in books of tables up to quite recently. But today it's so easy to do the extra arithmetic that it's not worth talking about this function. Well, after trigonometry, the next big thing was logarithms. Which burst onto the scene in 1614. Within a few years, there were many tables of logarithms. It had become quite an industry—which actually lasted more than 300 years. It took a while for natural logarithms and exponentials to get their modern form.

But then by the mid-1700s all of the ordinary elementary functions were firmly in place. And from then until now they've been essentially the only explicit mathematical functions that most people ever learn about. Well, in the late 1600s calculus arrived. And that's when today's special functions started appearing. Most of it actually happened rather quickly. Around 1700 one of the Bernoullis had the idea that perhaps the integral of any elementary function would be an elementary function. But that didn't pan out.

Still, within a couple of years people were talking about elliptic integrals. At least in terms of series. And they'd seen Bessel functions. And by the 1730s, Euler was starting his major industrial drive into the world of calculus. And he started talking about lots of our standard special functions. He found the gamma function as a continuation of factorial. He defined Bessel functions for investigating circular drums.


He looked systematically at elliptic integrals. He introduced the zeta function. He looked at polylogs. But gradually more and more of the functions he talked about started being used by several people. And usually after a few iterations they started having definite notations, and definite names. There were a few more bursts of special-function activity. In the late 1700s, there was potential theory and celestial mechanics.

And for example Legendre functions—also for a long time called Laplace functions—came in around 1780. Then in the 1820s, complex analysis was all the rage, and the various doubly periodic functions came in. There wasn't terribly good communication between people in the field. So there ended up being quite a few incompatible notations. Which have persisted to the present day, and regularly lead to Mathematica tech support calls. A few years later, harmonic analysis got big, and the various orthogonal polynomials—Hermite, Laguerre, and so on—came in. Well, even quite early in the 1800s, it was already fairly clear that a serious zoo of special functions was developing.

And that got Gauss in particular thinking about how to unify it. He investigated hypergeometric series—which had in fact already been defined and named by Wallis in the 1650s. And he noted that the 2F1 or Gauss hypergeometric function actually covered a lot of known special functions. Well, by the mid-1800s there was a serious industry of scholarship about special functions—particularly in Germany—with many erudite books being written.

So by the time Maxwell was writing his books on electromagnetic theory in the 1870s, he didn't think he even needed to say much in his books about special functions; he just quoted standard books. Along with scholarly works listing properties of functions, there were also tables of values being created. Sometimes by people one's never heard of. And sometimes by people one has—like Jacobi or Airy or Maxwell. So, well before the end of the 1800s, essentially all the special functions we deal with today were established. There were some extras too.

Like does anyone know what a Gudermannian is? I remember seeing it in books of tables when I was a kid. It's named after Christof Gudermann, a student of Gauss's. And it gives a relation between ordinary trig functions and hyperbolic functions. And it's also relevant to Mercator projections of maps. But somehow it hasn't really made it to modern times.
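Just to make the relation concrete, here's a minimal Mathematica sketch (the definition gd below is my own illustrative one, not anything built in at the time of this talk):

    (* The Gudermannian links circular and hyperbolic functions with no
       complex numbers: gd(x) = 2 ArcTan[Tanh[x/2]] = ArcTan[Sinh[x]]. *)
    gd[x_] := 2 ArcTan[Tanh[x/2]]
    gd[1.5] - ArcTan[Sinh[1.5]]  (* ~ 0. -- the two standard forms agree *)
    FullSimplify[Tan[gd[x]] == Sinh[x], Element[x, Reals]]  (* should give True *)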


Well, there was lots of erudition poured into special functions in the last few decades of the 1800s. And I suppose they could have gone the way of invariant theory, or syzygies, or those other somehow characteristically Victorian mathematical pursuits. And indeed in most of pure mathematics, the push towards abstract generality made special functions look rather irrelevant and arbitrary. Like studying strange specific animals in a zoo, rather than looking at general biochemistry. But physics kept on reenergizing special functions. Elasticity theory.

Electromagnetic theory. Then in the 1920s quantum mechanics, where even the most basic problems involved special functions—like Laguerre polynomials and Hermite polynomials.


And then there was scattering theory, which seemed to use almost the whole zoo of special functions. Well, one of the things that happened from all this was that people got the idea that any nice clean problem could somehow always be solved in terms of special functions. And certainly the textbooks encouraged that idea. Because the problems they discussed were somehow always ones that came out nicely in terms of special functions. There were cracks, to be sure. Quintic equations. The three-body problem. But these were somehow thought to be unusually messy.

Not really the norm. And not what one needed to look at in modern probabilistic theories. But special functions were a big business. Table making had become a major government activity—particularly in England. And was thought strategically important. Particularly for things like navigation. And there were lots of tables. Like here's a particularly fine set from the late 1700s. When I first saw them, I thought it was a time warp.

Actually, that Wolfram was a Belgian artillery officer. Probably no more of a relation than the Saint Wolfram in the 7th century AD. Tables were sufficiently important that Babbage had for example come up with his difference engine in the 1820s specifically for the purpose of printing accurate tables. And by the late 1800s, special functions were a common subject of tables that were being made.

Mechanical calculators were becoming ever more widespread, and in Britain and the U.S. there were big government-funded table-making projects. Like the WPA project in the 1930s—to give people jobs during the Depression computing mathematical functions. There were gradually starting to be systematic reference works on the properties of special functions. Each one was a lot of work, so there really weren't very many. Though people thought they were pretty important. Like here's the cover of the U.S. one.

And there were books with pictures of functions—like Jahnke and Emde's—emulating the plaster and wood models of functions that arrived in so many math departments in the early 1900s. And in fact, I got the idea for the picture of the zeta function that I used on the first edition of The Mathematica Book from Jahnke and Emde. Somehow a lot of special-function reference work came to a head right at the time of the Second World War. I'd sort of assumed it was for military reasons. But actually I think it was a coincidence. But the potential connection with strategic stuff made it not get closed down.

In 1946, Harry Bateman died, having collected a dozen shoeboxes of cards with facts about special functions. Which eventually got published as the Bateman Manuscript Project. The Manhattan Project, and later the H-bomb development project, were important consumers of special functions. And from this gradually grew the Abramowitz-Stegun book published in 1964, which became the staple for American special-function users. In the 1960s and 70s there was beginning to be lots of effort put into numerical algorithms for computers. And special-function evaluation was a favorite area. The work was in most cases painfully specific—with huge amounts of time being spent on a particular Bessel function of a particular order to a particular precision.

But there gradually emerged subroutine libraries with collections of specific special-function-evaluation algorithms. Though really most people still used books of tables, which were to be found in prominent places in the reference sections of science libraries. Well, my own involvement with special functions started when I was a mid-teenager, in the mid-1970s. The official math that I learned in school in England pointedly avoided special functions. It tended to be all about finding clever tricks to squeak through to an answer just using elementary functions.

I wasn't very happy with that. I wanted something more systematic, more technological. Less tricky. And I liked the idea of special functions. They seemed like bigger hammers to hit problems with. Though the discussion of them in mathematical physics books never seemed terribly systematic. Yes, they were more-powerful functions. But they still seemed quite arbitrary—a kind of zoo of curious creatures with impressive-sounding names. I think I was 16 when I first actually used a special function for real.


It was a dilogarithm. And it was in a particle physics paper. And I'm embarrassed to say that I just called it f. But my excuse is that polylogarithms were actually pretty unknown then. The standard mathematical physics books had Bessels and elliptic integrals and orthogonal polynomials, even some hypergeometric functions. But no polylogarithms. As it turns out, Leibniz had already talked about dilogarithms. But somehow they hadn't made it into the usual historical zoo of special functions. And the only real information about them that I could find in the mid-1970s was a book from 1958 by a microwave engineer named Leonard Lewin.

Soon thereafter, I started computing lots of integrals for Feynman diagrams. And I realized that polylogarithms were absolutely the key functions needed there. They became my friends, and I started collecting lots of properties of them. And there was something that I noticed then, that I didn't really realize the significance of until much later. I should explain that in those days doing integrals well was sort of a hallmark of a good theoretical physicist.

Well, I had never thought of myself as algebraically talented. But somehow I discovered that by using fancy special functions, I could do integrals like the best of them. Lots of people even started coming to me to do integrals. Which I found kind of amazing. Especially since it seemed like every integral I did, I did just by writing it as a parametric derivative of a beta function integral.
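Here's a minimal sketch of that Beta-integral trick in Mathematica (my own illustrative example, not one of the actual integrals from back then): differentiate the Euler Beta integral under the integral sign with respect to a parameter, and log-weighted integrals come out for free.

    (* Euler's Beta integral: *)
    Integrate[x^(a - 1) (1 - x)^(b - 1), {x, 0, 1}, Assumptions -> a > 0 && b > 0]
    (* Beta[a, b] *)

    (* Differentiating with respect to a under the integral sign inserts Log[x],
       so a harder integral is just a parametric derivative of Beta: *)
    D[Beta[a, b], a]
    (* Beta[a, b] (PolyGamma[0, a] - PolyGamma[0, a + b]) *)
    Integrate[x^(a - 1) (1 - x)^(b - 1) Log[x], {x, 0, 1}, Assumptions -> a > 0 && b > 0]
    (* the same expression *)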

And I think from this I had some dim awareness that the way to do lots of integrals might be by going to higher functions, and then back again. A bit like one has to go to complex numbers to solve real cubics. And lots of other things like that in math. Well, by the late 1970s, I decided I really needed to codify how to do these integrals. And here for example was a block diagram of a Macsyma program that I made for it. A combination of algorithms and table lookup.

That actually did work, and was quite useful. Well, in addition to polylogs, I kept on running into other special functions. K-type Bessel functions in cosmology calculations. QCD calculations that became a whole festival of zeta functions. I made a theory of "event shapes"—that still gets used in many particle physics experiments today—that was completely based on spherical harmonics and Legendre polynomials.

I'd never met those in the wild before. You know, the culture around special functions was interesting. It was always very satisfying when a problem came out in terms of a special function. It was sort of stylish. It was sort of funny showing particularly an older physicist some mathematical problem. They'd look at it a bit like an archaeologist looks at a pot shard. They'd stroke their beard if they had one.



Somehow, though, the tables don't really bear the marks of their human creators. So we have the tables, but we really don't quite know where they came from. Well, OK, so in the late 1970s I had become a fairly serious special-functions enthusiast. So in 1979, when I started building SMP, which was in some ways a precursor to Mathematica, it was only natural for it to have good coverage of special functions. It just seemed like that was clearly a part of doing mathematics by computer. Well, here is a very early design document for SMP, written in the first couple of weeks of the project.

And already with evidence of special functions. And by the time SMP Version 1 was released in 1981, it had a pretty complete list of special functions. The support for some of them was quite thin. But people were already quite excited to see their favorite functions in there. The numerics for the functions were somewhat primitive. But one thing that helped a lot was that we spent a lot of effort on the evaluation of fairly general hypergeometric functions.

And then we treated many other functions as special cases. Which was OK, except for the fact that quite often the most interesting cases of functions end up being sort of degenerate limits. So in practice even the numerics ended up being a bit spotty. Well, a few years passed. And in 1986 I started designing Mathematica. And this time, I wanted to be sure to do a definitive job, and have good numerics for all functions, for all values of parameters, anywhere in the complex plane, to any precision.

At first I thought perhaps I should try talking to experts. And I remember very distinctly a phone call I had with someone at a government lab. I explained what I wanted to do. And there was a silence. And then he said: "Look, you have to understand that by the end of the 1980s we hope to have the integer-order Bessel functions done to quad precision."

OK, so we needed another approach. See, the problem was that the guy with the Bessel functions was thinking that he was going to be the one deriving all the numerical methods. But what I realized was that to do what we wanted to do, we needed automation. And what happened is that Jerry Keiper developed a whole automated system for searching for algorithms for function evaluation. It's in some ways a very NKSish thing: we set up general forms of rational approximations to functions.
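To give a feel for the underlying idea—this is just a sketch using the standard FunctionApproximations` package in current Mathematica, not Keiper's actual system—one can ask for an optimized rational approximation to a function on an interval:

    (* Minimax rational approximation: a degree-(2,2) rational function
       minimizing the maximum relative error to Exp[x] on [0, 1]. *)
    Needs["FunctionApproximations`"]
    MiniMaxApproximation[Exp[x], {x, {0, 1}, 2, 2}]
    (* returns {abscissas, {rational approximant, max error}} *)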

Then we just did big searches to optimize the parameters. And it worked really well. And it let us grind through getting good numerics for all our functions. You know, it's actually quite a difficult thing to put a special function into Mathematica. You don't just have to do the numerics.

For all parameter values and so on. But you also have to support all sorts of symbolic properties. Like derivatives, and series, and asymptotic expansions. With respect to arguments, and parameters. And everything. Often we have to derive completely new formulas. That have never been in the literature.

And of course we have to hook the functions into integrals, and differential equations, and sums, and integral transforms. And FunctionExpand, and FullSimplify. There's a long list. Often the very trickiest issues come because of branch cuts. Most continuous special functions are in effect defined implicitly, usually from differential equations. And that means that—just like with something like a square root—they can in general have multiple possible values.

Corresponding to different sheets on their Riemann surface.



Well, to make genuine functions out of them one has to pick a principal sheet. And that means there end up being seams—or branch cuts—where there are discontinuities. And representing those branch cuts symbolically is a really really tricky business. Even for elementary functions. They don't teach that in school. And actually we only derived it fairly recently. But it's what you need if you want to get symbolic manipulations with these functions in the complex plane correct.
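Here's a tiny example of the kind of branch-cut trap this is about, even for elementary functions (a sketch of mine, not one of the actual derived formulas):

    (* Sqrt[x y] == Sqrt[x] Sqrt[y] looks harmless, but fails across the cut: *)
    Sqrt[(-1) (-1)]    (* 1 *)
    Sqrt[-1] Sqrt[-1]  (* -1 *)

    (* Simplify applies the identity only under assumptions that keep
       everything on the principal sheet: *)
    Simplify[Sqrt[x y] == Sqrt[x] Sqrt[y], Assumptions -> x > 0]  (* True *)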

Well, OK, we've put a lot of effort into special functions in Mathematica. With algorithms. And now our whole Wolfram Functions Site. And by immense amounts of human effort—and clever automation—we've multiplied by a large factor the special function knowledge that's in the world. I think it's ended up having quite a bit of an effect.

It's hard to measure, but somehow I think having all these functions as accessible as sine and cosine has made them vastly more popular. They are no longer obscure things nobody's ever seen. They're things anyone might get out of The Integrator. Not just conserved like an endangered mathematical species. But developed and made mainstream and useful. Well, OK, I said I would say something about the future of special functions. It's interesting to see special-function trends. There are lots of special functions that are always in style. Like the Bessels.

Or the orthogonal polynomials. Then there are things like polylogs, that languished for a long time, but ended up coming to prominence fairly recently through several quite separate applications. There are special functions that are still languishing. From a hundred years ago.

That people occasionally ask about in Mathematica. Well, in developing Mathematica, we regularly talk about the future of special functions. Which functions will really be useful for the kinds of directions people seem to be taking. Which ones are a huge amount of effort to support, but will go the way of the Gudermannian. One obvious current trend—that recent Mathematica versions have obviously been part of—is getting functions from discrete difference equations as well as continuous differential equations. Of course, like so many ideas, this one isn't really new.

George Boole, for example, had already worked out a lot of it in the 1800s. And in Mathematica 5 we exhumed a lot of his results—and, as it happens, Babbage's—and made algorithms out of them. Well, for Boole and Babbage, it didn't really count as a "solution" if it didn't come out in terms of elementary functions. But one can perfectly well have discrete special functions—and I wouldn't be surprised if, once we invent them correctly, we find they were already seen in the 1800s.
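In current Mathematica the discrete analogue shows up in RSolve, which solves difference equations the way DSolve solves differential ones. A small sketch (my own examples):

    (* The factorial/Gamma appears as the basic "discrete special function": *)
    RSolve[a[n + 1] == (n + 1) a[n], a[n], n]
    (* {{a[n] -> n! C[1]}} *)

    (* A second-order difference equation whose solutions come out in terms
       of Fibonacci-type functions: *)
    RSolve[a[n + 2] == a[n + 1] + a[n], a[n], n]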

Well, OK, thinking about all this makes one really want a more general theory of special functions. Some kind of more fundamental framework. The qualitative picture—ever since the Babylonians—is that special functions are sort of organizing centers for mathematical calculations. In the sea of possible functions, they're definite anchor points. Functions with fairly few arguments, that somehow can be used as primitives in a lot of useful calculations. In a sense, it's a little like built-in functions in the Mathematica language.

There's a sea of possible computations people want to do. And our job in designing Mathematica is to find certain good primitives—that we can name and implement—from which those computations can be built up. So what makes a special function good? Well, we can start thinking about that question empirically. Asking what's true about the special functions we normally use. And of course, from what we have in Mathematica and in our Wolfram Functions Site, we should be in the best position in the world to answer this. Well, here are a few facts.

First, most special functions have a small number of arguments. Two is the most common. In old-fashioned printed tables there was a big deal about "single entry" tables—functions with one argument. And beyond that they became very awkward. Well, of course we don't have exactly the same constraints now.

But as the number of arguments goes up, there are sort of more nooks and crannies for a function. It's harder to support it well, and one somehow loses the integrity of talking about something as being a particular special function. Here's another thing one can ask: how connected is a particular special function?


How many relations does it have? To itself? To other functions? Well, one can answer this rather quantitatively from the Wolfram Functions Site. Here's the complete sparse matrix of what function in the Wolfram Functions Site refers to what other function. One might have thought that the more relations a function has, the more useful it would be. But the Gamma function—which is one of the most useful special functions—has very few relations. Whereas the Weierstrass functions—which aren't so useful—have oodles of relations. And Zeta is like a very separate prize peacock. OK, but let's look a bit more carefully at the continuous special functions that we typically use.

Ever since the beginning of calculus, power series have been one of the main ways one looks at functions. And indeed essentially all the special functions that we use have a rather special feature in their power series: the ratio of successive coefficients is a rational function of the index. And that means that the functions are hypergeometric. They can be expressed in terms of p F q generalized hypergeometric functions.

And at least to some extent the values of p and q for a particular function are correlated with how exotic the function is. There are a few functions that we use that aren't hypergeometric. Like the Mathieu functions , for example. But so maybe this is our answer: that for some reason what we mostly need are things with rational power series, and the special functions are ways to package those things. But there's an easy experiment that shows that that's not the whole story. We can just enumerate possible rational power series, and ask what functions they are.
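Here's a minimal version of that experiment (the particular coefficient choices are mine, for illustration):

    (* Enumerate power series with simple rational coefficients and ask Sum
       for closed forms: *)
    Sum[x^n/n!^2, {n, 0, Infinity}]       (* BesselI[0, 2 Sqrt[x]] -- friendly *)
    Sum[x^n/(2 n + 1), {n, 0, Infinity}]  (* ArcTanh[Sqrt[x]]/Sqrt[x] -- elementary *)
    Sum[x^n/(3 n + 1), {n, 0, Infinity}]  (* Hypergeometric2F1[1/3, 1, 4/3, x] -- a raw pFq *)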

It's an easy thing to do with Sum in Mathematica—as in the sketch above. Now of course everything is some kind of p F q hypergeometric function. But what's notable is how rarely friendly functions like Gamma or Bessels show up. It's mostly raw p F q's. So there's clearly some other selection process going on to get to the common special functions. Well, one important characteristic of a hypergeometric function is that it satisfies a differential equation. So here's another theory: perhaps the special functions that occur are just solutions to a certain class of differential equations.

And that's definitely much closer. Let's just imagine enumerating second-order differential equations—constructing integer-polynomial coefficients, say, from digit sequences of successive integers. Then using DSolve on them. Well, here's the result. It's kind of interesting. From just this one class of differential equations, we're seeing lots of our favorite special functions. So this is definitely part of the answer. But again this isn't the whole story. Take a look at the whole array of results from DSolve.
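A couple of representative cases (my own picks, standing in for the full array shown in the talk):

    (* Integer-polynomial-coefficient second-order ODEs already produce
       classic special functions: *)
    DSolve[y''[x] - x y[x] == 0, y[x], x]
    (* {{y[x] -> AiryAi[x] C[1] + AiryBi[x] C[2]}} *)
    DSolve[x^2 y''[x] + x y'[x] + (x^2 - 1) y[x] == 0, y[x], x]
    (* {{y[x] -> BesselJ[1, x] C[1] + BesselY[1, x] C[2]}} *)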

There are holes. Are the holes just deficiencies in DSolve? No, not most of the time. Most of the time they're "missing" special functions. It's actually rather easy to set up equations that need these. Even integrals can do it. Like the integrals of Sin[Sin[x]]. You might think that'd have to be a fairly straightforward special function.

And indeed lots of that kind of nested trig integral are. But this one isn't. To express it as a special function, you'd have to go to much more complicated two-variable hypergeometric functions. It's kind of fun these days: Integrate in Mathematica is good enough that one can use it to get a fairly accurate map. Just trying out different integrals, and seeing where the boundaries of being able to do them with special functions are. And here's sort of a very fundamental issue: just what kinds of computations can one expect to give "exact solutions" in terms of special functions?
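For instance, here's one hedged probe of that boundary (results may vary by Mathematica version):

    (* The definite integral comes out in terms of a Struve function... *)
    Integrate[Sin[Sin[x]], {x, 0, Pi}]
    (* Pi StruveH[0, 1] *)

    (* ...but the indefinite integral has no standard one-variable
       special-function form, and Integrate returns it unevaluated: *)
    Integrate[Sin[Sin[x]], x]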

In theoretical physics, there's tended to be a sort of implicit view that for anything that's worthy of being called a physics problem—fluid turbulence, say—it should be possible with enough effort to "solve" it. And effectively to come up with a formula for what happens in the system. And in a sense that's pivotal to the whole self-image of theoretical physics.