math

/math300

Place to talk about anything related to mathematics

I actually really recommend listening to Scott Aaronson on quantum computing, and, if you have the math chops, buying his book "Quantum Computing Since Democritus" (I haven't read it yet, but I do have it).

There's a lot of stuff going around about Google's new Willow chip, but what I'm hearing is some pretty interesting stuff about cats, which is not at all what quantum is about. This video should shed some light on the capabilities of quantum computing and what's in store over the next 20 years.

https://www.youtube.com/watch?v=fuSZoh7EURI
😂😂😂😂😂😂
Not the easiest thing you'll watch today, but well worth it: a decent intro to martingales

https://youtu.be/sqrgqhl8C-A?si=mHEe5ut-CA7bcln_
New SIAM book recommendation is out on nonlinear dynamics.

https://epubs.siam.org/doi/10.1137/1.9781611978162
Hey @degenveteran.eth,

If a grenade is thrown at me in an open area, the move is to get flat with my feet towards the grenade, right?

Then the flux of the shrapnel wrt the surface area I've exposed is just my feet, right?
No better demonstration of how limited human imagination can be: even with better learning resources and tools, people are still hyper-fixated on excelling at outdated and largely nonsensical curricula

https://x.com/Austen/status/1860780913380262223
spotted a little cardioid in the wild at dinner on Friday night
Interesting recent SIAM introductory article on stochastic rounding.

Looks pretty interesting: in numerical computing, deterministic rounding can introduce bias, since floating-point arithmetic discards the low-order bits of a result at every step.

In AI contexts with long chains of accumulated multiplications, deterministic rounding errors can compound into large propagated error. Stochastic rounding instead produces errors with mean 0.

https://www.siam.org/publications/siam-news/articles/stochastic-rounding-20-with-a-view-towards-complexity-analysis
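A toy sketch of the mean-zero idea (my own example, not from the article): round up with probability equal to the fractional part, so the expected rounded value equals the true value, and biases cancel out over many accumulations.

```python
import math
import random

def stochastic_round(x):
    """Round x to an integer, rounding up with probability equal to
    the fractional part, so that E[stochastic_round(x)] == x."""
    floor = math.floor(x)
    return floor + (1 if random.random() < x - floor else 0)

random.seed(0)
n = 10_000
# round(0.3) is always 0, so the deterministic bias accumulates:
det_sum = sum(round(0.3) for _ in range(n))
# stochastic rounding is unbiased, so the sum tracks the true total 0.3 * n:
sto_sum = sum(stochastic_round(0.3) for _ in range(n))
print(det_sum, sto_sum)
```

Here det_sum stays at 0 while sto_sum lands near the true total of 3000, which is the whole point: the per-step error is random with mean zero instead of systematically one-sided.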
yes eigenvalues
With the right large algorithm the entire works of Shakespeare can be extracted from any random data.

With the right small algorithm, e.g. understanding English, the entire works of Shakespeare can be extracted from the entire works of Shakespeare.

In each case the entropy of the system is the same. Either you need an algorithm that can extract the works of Shakespeare from randomness, in which case the algorithm will have entropy equal to the works of Shakespeare itself, or the algorithm will be small (understand English) and the data will be patterned (Shakespeare).

So

entropy(massive_algorithm(randomness)) = entropy(understand_english(works_of_shakespeare))

This is why the factoid that the entire works of Shakespeare can be found within the digits of pi is not as awesome as it initially seems. This is true for literally any data set in the universe, given the right algorithm.

Either the information is in the data set and the algorithm is small, or the information is stored in the algorithm.
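One crude way to see "patterned data compresses, random data doesn't" concretely is with an off-the-shelf compressor as an entropy proxy (my own toy illustration, using zlib):

```python
import os
import zlib

n = 100_000
random_bytes = os.urandom(n)  # near-maximal entropy, no pattern to exploit
# English-like repetitive data: the structure can live in the "algorithm"
patterned = (b"To be, or not to be, that is the question. " * n)[:n]

rand_size = len(zlib.compress(random_bytes))
patt_size = len(zlib.compress(patterned))
print(rand_size, patt_size)
```

The random bytes come out roughly the same size as the input (incompressible), while the patterned bytes shrink to a tiny fraction: the compressor's model plus a short description carries the information instead of the data.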
Easily the best slides for quickly understanding fat tails, also beautifully motivates the Hill Estimator

https://adamwierman.com/wp-content/uploads/2020/09/2013-SIGMETRICS-heavytails.pdf
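For anyone who wants to play with it, here's a quick sketch of the Hill estimator on synthetic Pareto data (my own toy code, not from the slides): it estimates the tail index alpha from the top-k order statistics.

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimate of the tail index alpha: the reciprocal of the
    mean of log(X_(i) / X_(k+1)) over the top-k order statistics."""
    xs = sorted(data, reverse=True)
    logs = [math.log(xs[i] / xs[k]) for i in range(k)]
    return 1.0 / (sum(logs) / k)

# Pareto(alpha) samples via inverse transform: X = U ** (-1 / alpha)
random.seed(42)
alpha = 2.0
data = [random.random() ** (-1.0 / alpha) for _ in range(50_000)]

est = hill_estimator(data, k=1_000)
print(est)  # should land close to alpha = 2.0
```

The usual caveat from the heavy-tails literature applies: the estimate is sensitive to the choice of k, so in practice you'd look at a Hill plot over a range of k values rather than a single point.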
You take some transport phenomena classes and you can never enjoy these moments.

Maybe a dragon's body is 80% lungs and one absolutely massive heart with arteries the size of fire hoses?
even chatgpt recommends 3Blue1Brown videos for learning math
More geometry than math... but still a nice one.
TIL Wald's equation, derived from the martingale property
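Quick sanity check with the classic example (my own simulation): roll a fair die until the first 6, where the number of rolls N is a stopping time, so Wald's equation says E[sum of rolls] = E[N] * E[X] = 6 * 3.5 = 21.

```python
import random

def sum_until_six():
    """Roll a fair die until a 6 appears (inclusive) and return the
    running sum. The number of rolls N is a stopping time."""
    total = 0
    while True:
        roll = random.randint(1, 6)
        total += roll
        if roll == 6:
            return total

random.seed(7)
trials = 100_000
avg = sum(sum_until_six() for _ in range(trials)) / trials
print(avg)  # close to E[N] * E[X] = 6 * 3.5 = 21
```

Nice part is that N depends on the rolls themselves (you stop *because* you saw a 6), and Wald's equation still applies since N is a stopping time with finite expectation.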