[Math] Classes of functions with specific product integration property

Quote:Original post by staaf
Quote:Original post by Winograd
I believe you meant int{g(x)h(x)dx} = int{g(x)dx}int{h(x)dx}?

Yes, my mistake. I didn't think about the fact that it wouldn't be the same thing, so I used y to distinguish the two integrals.

Anyway, you are right about polynomials not being eligible as g(x) or h(x), but there are lots of other kinds of continuous functions that behave differently when integrated. I think it is wrong to rule out the existence of such functions based only on the result for polynomials. Probably the functions that actually have this property are quite few in number, but I'm curious about them nevertheless.


Then you should restrict your search to functions that do not satisfy the conditions of Taylor's theorem. That rules out all continuous functions with infinitely many continuous derivatives (the derivatives may vanish, of course).

Analysis may get quite involved...
Quote:Original post by Winograd

Then you should restrict your search to functions that do not satisfy the conditions of Taylor's theorem. That rules out all continuous functions with infinitely many continuous derivatives (the derivatives may vanish, of course).


Except for g(x) = h(x) = 0
Quote:Original post by Winograd
Then you should restrict your search to functions that do not satisfy the conditions of Taylor's theorem. That rules out all continuous functions with infinitely many continuous derivatives (the derivatives may vanish, of course).
You're right. I didn't consider that. Thanks.

Quote:Original post by Anonymous Poster
Except for g(x) = h(x) = 0
Obviously, but it'd be hard to approximate any other function with that.
Quote:Original post by staaf
Quote:Original post by Anonymous Poster
Except for g(x) = h(x) = 0
Obviously, but it'd be hard to approximate any other function with that.


You're just not trying hard enough then [smile]
Let fk(x) = k for all x.

Then
int {fk fj} = k * j * [x-x0]
int {fk} int {fj} = k * j * [x-x0]

In other words, it works with constant functions. =)
No, it doesn't work with constant functions.

j*k*x != j*x * k*x
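
A quick symbolic check bears this out (a minimal sketch of my own in Python with sympy, taking all integration constants as zero): for constant functions one side grows linearly in x while the other grows quadratically, so they cannot agree.

import sympy as sp

x, j, k = sp.symbols('x j k')

# constant functions f_k(x) = k and f_j(x) = j
fk = k
fj = j

lhs = sp.integrate(fk * fj, x)                   # j*k*x
rhs = sp.integrate(fk, x) * sp.integrate(fj, x)  # (k*x)*(j*x) = j*k*x**2

print(lhs)                     # j*k*x
print(rhs)                     # j*k*x**2
print(sp.simplify(lhs - rhs))  # nonzero: the two sides differ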
If I didn't make any mistakes, then your functions should also satisfy the following equation:

int{h(x)dx}/h(x) + int{g(x)dx}/g(x) = 1

You can arrive at the result yourself by differentiating both sides of the original equation, which gives

h(x)g(x) = h(x)*int{g(x)dx} + g(x)*int{h(x)dx}

and then dividing through by h(x)g(x).

I'm not sure if this is useful at all, but at least it gives some insight into the problem, provides a necessary identity, and IMHO it looks beautiful ;).
EDIT: There was an error in some equations. The lhs was ln(h(x)), while it should have been ln(int{h(x)dx}).
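
The differentiation step can also be reproduced mechanically. Here is a minimal sketch of my own in Python with sympy (not part of the original post): differentiating both sides of int{g(x)h(x)dx} = int{g(x)dx}int{h(x)dx} and dividing by g(x)h(x) gives exactly the identity above.

import sympy as sp

x = sp.symbols('x')
g = sp.Function('g')(x)
h = sp.Function('h')(x)

G = sp.Integral(g, x)  # unevaluated antiderivative of g
H = sp.Integral(h, x)  # unevaluated antiderivative of h

# differentiate both sides of  int{g h dx} = int{g dx} * int{h dx}
lhs = sp.diff(sp.Integral(g*h, x), x)  # -> g(x)*h(x)
rhs = sp.diff(G*H, x)                  # -> g(x)*Integral(h, x) + h(x)*Integral(g, x)
print(lhs)
print(rhs)

# dividing the differentiated equation by g(x)*h(x) yields the identity
print(sp.Eq(sp.expand(rhs / (g*h)), 1))  # Integral(h, x)/h + Integral(g, x)/g = 1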

OK, I have a single set of solutions (there are perhaps more). From the previous formulation, with a little work, we can get:

h(x) / int{ h(x)dx } = g(x) / ( g(x) - int{ g(x)dx } )

Integrating this on both sides gives:

ln(int{h(x)dx}) + C0 = int{ g(x) / (g(x) - int{ g(x)dx } )dx } (1)

OK, if I want an easy solution for the rhs I would demand that

g(x) = diff{ g(x) - int{g(x)dx} }

If this were satisfied, then the rule applied to the lhs could be used for the rhs too.

So we get that g(x) = g'(x) - g(x) => g'(x) = 2g(x)
I came up with only one function (up to a constant factor) that satisfies this: g(x) = exp(2*x) (2)

Now substituting (2) into (1) we get

ln(int{h(x)dx}) + C0 = ln(exp(2*x)) + C1 => ln(int{h(x)dx}) = 2*x + C => int{h(x)dx} = A*exp(2*x), where A is some arbitrary constant.

Thus g(x) = h(x) = exp(2*x) will do. Also h(x) = -2 * exp(2*x) and g(x) = 2*exp(2*x) will do and so on...

Unfortunately this is not very useful for approximation, and even more unfortunately it pretty much voids my Taylor polynomial argument :(

Generally h(x) = diff(exp( int{ g(x) / (g(x) - int{ g(x)dx } + C)dx } ),x), so the trick is to be able to calculate the outer integral. Sometimes it is easy when C is 0, but this doesn't always yield a correct solution. Note that I'm not sure if C is the only constant that has to be introduced; such constants appear from the integrations.
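
As a sanity check (again a minimal Python/sympy sketch of my own, with every antiderivative taken with zero integration constant), the exponential solutions above really do satisfy the original equation, while a plain polynomial does not:

import sympy as sp

x = sp.symbols('x')

def check(g, h):
    # returns simplify(int{g*h} - int{g}*int{h}), antiderivatives with zero constant
    lhs = sp.integrate(g*h, x)
    rhs = sp.integrate(g, x) * sp.integrate(h, x)
    return sp.simplify(lhs - rhs)

print(check(sp.exp(2*x), sp.exp(2*x)))       # 0: g = h = exp(2*x) works
print(check(2*sp.exp(2*x), -2*sp.exp(2*x)))  # 0: the scaled versions work too
print(check(x, x))                           # nonzero: polynomials fail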

[Edited by - Winograd on March 6, 2006 2:49:48 AM]
Quote:Original post by Winograd
If I didn't make any mistakes, then your functions should also satisfy the following equation:

int{h(x)dx}/h(x) + int{g(x)dx}/g(x) = 1

...

I'm not sure if this is useful at all, but at least it gives some insight into the problem, provides a necessary identity, and IMHO it looks beautiful ;).

I believe it is very useful. It states a reasonable condition to satisfy the original equation.
Quote:Original post by Winograd
Thus g(x) = h(x) = exp(2*x) will do. Also h(x) = -2 * exp(2*x) and g(x) = 2*exp(2*x) will do and so on...

Yes, and this can be generalized to all functions

g(x) = A*exp(p*x)
h(x) = B*exp(q*x)

as long as

1/p + 1/q = 1 [1] and (A, B, p, q != 0)

since for g(x)

int{g(x)dx}/g(x) = int{ A*exp(p*x) dx } / (A*exp(p*x)) = (A/p)*exp(p*x) / (A*exp(p*x)) = 1/p

and the same goes for h(x).

Moreover for the original equation:
If we use the same g(x) and h(x) as above we have at the left hand:

int{g(x)h(x)dx} = int{A*B*exp((p+q)*x)dx} = (A*B)/(p+q) * exp((p+q)*x)

and at the right hand:

int{g(x)dx}int{h(x)dx} = (A/p * exp(p*x))(B/q * exp(q*x)) = (A*B)/(p*q) * exp((p+q)*x)

Now if we rearrange [1] we get that p+q = p*q, and by substituting this into the left or right hand side, the equation is satisfied.

Still, as you say, a single A*exp(p*x) is not very useful for approximation. A linear combination of such functions would do better. Hmm...
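
This family is easy to check mechanically as well. A minimal sympy sketch of my own (p = 3 and q = 3/2 are an arbitrary pair satisfying 1/p + 1/q = 1, the amplitudes are arbitrary, and integration constants are taken as zero):

import sympy as sp

x = sp.symbols('x')
A, B = 5, -7             # arbitrary nonzero amplitudes
p = sp.Integer(3)
q = p / (p - 1)          # enforces 1/p + 1/q = 1, so q = 3/2 here

g = A * sp.exp(p*x)
h = B * sp.exp(q*x)

lhs = sp.integrate(g*h, x)                     # A*B/(p+q) * exp((p+q)*x)
rhs = sp.integrate(g, x) * sp.integrate(h, x)  # A*B/(p*q) * exp((p+q)*x)
print(sp.simplify(lhs - rhs))                  # 0, since p+q = p*q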
Spotted the error today. The lhs was missing an integral. I corrected it in my previous post.

Using Maple I found that when g is a first order polynomial h has the following solution:
g(x) = a + b*x
h(x) = -2*(a+b*x)*exp((-ln(-2*a-2*b*x+2*a*x+b*x^2)*(b^2+a^2)^(1/2)+2*b*arctanh((-b+a+b*x)/(b^2+a^2)^(1/2)))/(b^2+a^2)^(1/2))/(-2*a-2*b*x+2*a*x+b*x^2)


Note also that the condition or identity I gave earlier isn't the whole story: it can be satisfied by g and h for which the original equation isn't satisfied. It is a necessary but not a sufficient condition.
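
To see the general recipe in action on a polynomial g, here is a minimal sympy sketch of my own. It uses the simpler special case a = 0, b = 2 rather than the general Maple expression, takes the inner integration constant as zero, and compares derivatives so that the remaining constant of integration (the issue mentioned above) drops out:

import sympy as sp

x = sp.symbols('x')
g = 2*x                 # first-order polynomial with a = 0, b = 2
G = sp.integrate(g, x)  # x**2, integration constant taken as 0

# general recipe: h(x) = diff(exp( int{ g/(g - int{g}) dx } ), x)
H = sp.exp(sp.integrate(g / (g - G), x))  # candidate antiderivative of h
h = sp.simplify(sp.diff(H, x))            # equivalent to 2/(2 - x)**3 here

# compare derivatives, since the two sides may differ by a constant of integration
lhs = sp.integrate(g*h, x)
rhs = sp.integrate(g, x) * sp.integrate(h, x)
print(sp.simplify(sp.diff(lhs - rhs, x)))  # 0: the sides agree up to a constant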
