The Right Way to Formalise Mathematics
Carefully! This always pays off. Just watching the first few minutes of this, you will see how it makes Tensor Calculus much clearer and more intuitive.
Now read How To Be A Genius Part II and then watch this:
Which will allow you to put Charles Dodgson's beautiful An Elementary Treatise on Determinants into a modern setting.
See Minimalist Mathematics Lectures and also my comments on YouTube here:
I wrote "8:01 this gives you a nice connection with Euler-Lagrange functional analysis, see https://youtu.be/oW4jM0smS_E which essentially approximates functions by successive terms in their Taylor series expansions and fits nicely into an abstract framework of vector spaces and scalar fields. Your preceding sliding-ladder explanation of differentials as functionally related rates of change also connects nicely with De Casteljau-Bezier splines. See https://youtu.be/xwn1UmaorCw Then, if you use the functional analysis on a space over finite fields, see e.g. http://web.mit.edu/wisdom/www/AIM-2005-003.pdf I think you would have a nice framework in which you could use techniques from optimal control theory to build adaptive stochastic systems for calculating numerical solutions to some quite general classes of systems of differential equations, and where exact analytic solutions appear as successive moments in moment generating functions, as expressions for expectation, variance, and central moments. These exact solutions would also fit nicely into abstract frameworks of vector spaces and fields, such as Clifford algebras, see https://arxiv.org/abs/0907.5356 and would be amenable to analysis by Fourier and Laplace transforms. This is urgent, because we need to address this problem in climate models: https://youtu.be/-fkCo_trbT8?t=1400 explained in detail at https://youtu.be/-fkCo_trbT8?t=1087 and https://youtu.be/-fkCo_trbT8?t=1209"
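The De Casteljau construction mentioned in that comment is simple enough to sketch in a few lines: a Bezier curve is evaluated by repeatedly linearly interpolating between adjacent control points. This is a minimal sketch (the function name is my own, not from the video):

```python
def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by
    repeated linear interpolation between adjacent control points."""
    points = [list(p) for p in control_points]
    while len(points) > 1:
        # Replace each adjacent pair (p, q) with (1-t)*p + t*q,
        # shrinking the list by one each pass.
        points = [
            [(1 - t) * a + t * b for a, b in zip(p, q)]
            for p, q in zip(points, points[1:])
        ]
    return points[0]
```

For example, the quadratic Bezier with control points (0, 0), (1, 2), (2, 0) evaluated at t = 0.5 gives the apex of the arch, (1, 1); at t = 0 it returns the first control point.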
And another comment: "12:42 this is a rather complicated way of explaining the fact that the derivative of a function is the reciprocal of the derivative of its inverse (because the graph of the inverse of f is the graph of f reflected about the line y=x). This relation turns up in the Laplace transform where differentiation and integration correspond to multiplication and division in the s-domain. See e.g. https://youtu.be/2FZlz4-pf-M?t=595 So there may be some mileage to be gained by considering stability of solutions in terms of inverse functions relating differentials. See https://youtu.be/CfW845LNObM"
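The reciprocal relation in that comment is easy to check numerically: if f is invertible, then f'(x) times the derivative of f⁻¹ at f(x) should equal 1. A minimal sketch, using exp and log as the function/inverse pair (my choice of example, not the video's):

```python
import math

def deriv(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# f = exp, f^{-1} = log: f'(x) * (f^{-1})'(f(x)) should be very close to 1
x = 1.3
product = deriv(math.exp, x) * deriv(math.log, math.exp(x))
```

The same check works for any differentiable bijection with nonzero derivative, e.g. tan and atan on a suitable interval.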
Norman Wildberger with an outline of an algebraic calculus course:
If you want to plug in some application-specific details here, then watch these three videos, and write some code to allow you to experiment with graphs and computations:
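As a starting point for that experimentation, here is a minimal, dependency-free sketch (the function names are my own) for evaluating and tabulating polynomials; the (x, y) pairs it produces can be fed to any plotting tool:

```python
def horner(coeffs, x):
    """Evaluate a polynomial with coefficients [a0, a1, a2, ...]
    (lowest degree first) at x, using Horner's rule."""
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + c
    return result

def sample(f, lo, hi, n):
    """Tabulate f at n+1 evenly spaced points on [lo, hi],
    returning (x, y) pairs ready for plotting."""
    step = (hi - lo) / n
    return [(lo + i * step, f(lo + i * step)) for i in range(n + 1)]
```

For example, `horner([1.0, 2.0, 3.0], 2.0)` evaluates 1 + 2x + 3x² at x = 2, and `sample(lambda t: horner([1.0, 2.0, 3.0], t), -1.0, 1.0, 100)` gives a table you can graph.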
Then:
And to get your basic Taylor Series approximations to transcendental functions see:
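Those basic Taylor approximations are themselves a pleasant exercise to code up. A minimal sketch for exp and sin about 0, accumulating each term from the previous one rather than recomputing factorials (names are my own):

```python
import math

def taylor_exp(x, n_terms=15):
    """Partial sum of the Taylor series for exp about 0:
    sum of x^k / k! for k = 0 .. n_terms-1."""
    total, term = 0.0, 1.0
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)  # next term: x^(k+1) / (k+1)!
    return total

def taylor_sin(x, n_terms=10):
    """Partial sum of the Taylor series for sin about 0:
    alternating odd powers x - x^3/3! + x^5/5! - ..."""
    total, term = 0.0, x
    for k in range(n_terms):
        total += term
        # step from x^(2k+1)/(2k+1)! to -x^(2k+3)/(2k+3)!
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total
```

Comparing these partial sums against `math.exp` and `math.sin` as you vary `n_terms` gives a concrete feel for how the approximation converges near, and degrades away from, the expansion point.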
Also my comment here:
Which was: 3:12 "Assume that each seller has some underlying probable success rate." This position, that "the probabilities are out there in the world, and it is the job of the scientist to learn what they are", leads to a world of pain, in my experience. And not just my personal experience. For example, you would want a machine learning system to be able to suggest to suppliers optimal ways in which they could improve their service, or improve their review rating, which are two distinct things. Or in terms of public health, you would want to be able to reduce avoidable coronavirus fatalities, which entails building a better model of the causes of coronavirus deaths, and you would want to optimise the data sets you collect to achieve that goal. Thinking about the death rate of people infected with coronavirus as an "objective probability" out there to be discovered makes this a hard problem! The truth is that these probabilities are all conditioned on a certain degree of ignorance about what is really going on "out there", and it is that ignorance we should be exploring. More precisely, it is the structure of that ignorance that we are interested in.
See Measuring Ignorance.