EGH


Total Posts: 59 
Joined: Nov 2014 


Good, yes, I downloaded Clark two days ago actually; I need to read more on it, and possibly we even misquoted him on what he found... In my models-and-models book I think I speculated on another type of deterministic relativistic clock combination: combining several different clocks (worlds) running at different rates and mixing their uncertainties. What would one get? Well, in this scenario one ends up asking how to interpret the many SR results.
I will look into some code, possibly this weekend.




pj


Total Posts: 3440 
Joined: Jun 2004 


IMHO, there are much better ways than the CLT to check whether a sequence is truly i.i.d. (Knuth, Donald E., The Art of Computer Programming, vol. 2 comes to mind).
But generating fat tails with some simple algorithm is interesting.
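One of the classical empirical checks from Knuth vol. 2 is the runs test; here is a minimal stdlib sketch (my own illustration of the idea, not a specific recipe from the thread):

```python
import math
import random

def runs_test_z(bits):
    """Wald-Wolfowitz runs test: z-score for the number of runs
    (maximal blocks of equal symbols) under the i.i.d. null."""
    n1 = sum(bits)
    n0 = len(bits) - n1
    runs = 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)
    mu = 2.0 * n0 * n1 / (n0 + n1) + 1.0           # expected number of runs
    var = (mu - 1.0) * (mu - 2.0) / (n0 + n1 - 1)  # variance of the run count
    return (runs - mu) / math.sqrt(var)

rng = random.Random(0)
fair = [rng.randint(0, 1) for _ in range(10_000)]
print(runs_test_z(fair))            # roughly within +/-2 for an i.i.d. source
print(runs_test_z([0, 1] * 5_000))  # alternating sequence: huge z, clearly not i.i.d.
```

A sequence can pass a CLT-style histogram check and still fail a test like this, which is the point of preferring such tests over eyeballing a bell shape.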
The older I grow, the more I distrust the familiar doctrine that age brings wisdom
Henry L. Mencken 


EGH


Total Posts: 59 
Joined: Nov 2014 


The fat-tailed distribution was simply generated by a binary random variable (also using photons for this) deciding between 2000 and 21000. If 21000, then 21000 photons are flipped and their sum is one observation.
A few minor details: for example, the 2000 case is close to normally distributed, but not normal; it is discrete and finite. The longest possible tail that can be observed is 2000 heads in a row, and the probability of this is of course incredibly low. A run of 2001 has probability exactly zero, while in a normal distribution it would have positive probability. The shortest observable value above 0 is 1 (the flip gap :). This is a minor detail with likely minimal significance here, but it is related to something else I will likely write about later.
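A minimal simulation of this recipe, under assumptions the description leaves open (flips counted as +/-1, the 2000 branch also summed, and a 50/50 switch):

```python
import random

def one_observation(rng):
    """One draw from the mixture: a binary switch picks 2000 or 21000
    fair flips, and the sum of the +/-1 outcomes is the observation."""
    n = 21000 if rng.random() < 0.5 else 2000
    return sum(1 if rng.random() < 0.5 else -1 for _ in range(n))

rng = random.Random(42)
samples = [one_observation(rng) for _ in range(300)]
# Support is bounded: no observation can ever exceed 21000 in magnitude,
# unlike a Gaussian, which puts (tiny) mass on arbitrarily large values.
print(max(abs(s) for s in samples))
```

Mixing two very different flip counts is what produces tails fatter than either Gaussian-like component alone.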
On another note, not directly related to the above but still linked to flipping many coins or photons, is entropy. I strongly recommend reading
https://www.amazon.com/Entropy-Demystified-Second-Reduced-Common/dp/9812832254/
Of the many books I bought on entropy to try to understand its basics, this one is by far the best in my view (at least for a novice like me). I see he also has a new book (great title):
Entropy: The Truth, the Whole Truth, and Nothing But the Truth




EGH


Total Posts: 59 
Joined: Nov 2014 


Concerning infinities: atomism gives exact boundaries on a long series of concepts in relativistic physics. The foundation for this is in my book. If the smallest and only truly fundamental particle has spatial dimension, then there is a limitation on the maximum velocity of all subatomic masses (and naturally all masses, as they are composites of subatomic ones) that is directly linked to the diameter of the indivisible particle.
In my book I had no idea what the diameter of the indivisibles should be; it is now clear it must be the Planck length. My theory gives all the mathematical SR end results, but shows that SR and Lorentz symmetry break down at the Planck scale. As I understand it, quantum gravity theories also tend to predict breakdown of Lorentz symmetry at the Planck scale.
Work in progress:
Where Standard Physics Runs into Infinite Challenges, Atomism Predicts Exact Limits
The most interesting prediction from my theory, in addition to these "exact" limits, is likely that a Planck mass particle is at absolute rest: it is the collision point between light particles, the same across any reference frame. This is directly related to the breakdown of Lorentz symmetry at the Planck scale. Pure energy (light) is a unique frame. Pure mass is also unique, and the only pure mass is the Planck mass particle (or more precisely the mass gap).
To claim that something can get as close as one wants to infinity but never reach it is, in my view, too vague, and it even leads to absurd predictions in physics.
The Planck length can easily be measured without any knowledge of big G. It has exactly half the measurement error (in %) compared to the measurement error in big G. It is the diameter of the only truly fundamental particle. This again gives us exact limitations on all SR equations and leads to breakdown of Lorentz symmetry at the Planck scale.




"To claim that something can get as close as one wants to infinity but never reach it is in my view too vague"
It's very precise: for every bound M there is a point beyond which the quantity exceeds M, yet the quantity itself is always finite.





EGH


Total Posts: 59 
Joined: Nov 2014 


catastrophic when applied to certain predictions



 

EGH


Total Posts: 59 
Joined: Nov 2014 


"You're vague"
Yes, naturally. Me too.



pj


Total Posts: 3440 
Joined: Jun 2004 


The generation method described below simply generates a Gaussian distribution according to the CLT.
@EGH I dare you to show the error intervals. 
The older I grow, the more I distrust the familiar doctrine that age brings wisdom
Henry L. Mencken 



EGH


Total Posts: 59 
Joined: Nov 2014 


You mean we will not get figure 3 with the procedure I described?
"The older I grow, the more I distrust the familiar doctrine that the old age of the CLT brings wisdom." Well, I am sure the CLT works within its assumptions.
"The generation method described below simply generates a Gaussian distribution according to the CLT."
This is impossible: the Gaussian has infinitely long tails given an infinite number of observations (though yes, thin Gaussian tails). For my 2000 sample space there can never be a tail longer than 2000 (heads or tails in a row), even if flipped an infinite number of times. In particular, as the number of observations goes to infinity, it must certainly differ from the Gaussian. Do you assume the cut-off tails are Gaussian because of the CLT? But perhaps that was not your point here; you just meant that, due to the CLT, my blue chart in fig 3 should look much closer to Gaussian than it does, based on the procedure I described?
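The bounded-support argument can be stated exactly with stdlib rationals (coding heads as 1 and the particular normal fit are my own framing):

```python
import math
from fractions import Fraction

N = 2000
# Exact probability of the longest possible tail: 2000 heads in a row.
p_all_heads = Fraction(1, 2 ** N)   # incredibly small, but strictly positive
# 2001 heads has probability exactly zero: the support ends at N.
# A normal fit (mean N/2, sd sqrt(N)/2 for 0/1 flips) instead assigns
# positive probability beyond N; the endpoint sits this many sds out:
z = (N - N / 2) / (math.sqrt(N) / 2)
print(p_all_heads > 0)  # True
print(round(z, 1))      # 44.7 standard deviations
```

So the disagreement is only about the extreme tail: the discrete distribution and its Gaussian fit differ exactly where neither has any practical mass.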



pj


Total Posts: 3440 
Joined: Jun 2004 


> you just meant my blue chart fig 3 should be much closer to Gauss looking than it looks like based on the procedure I described, due to the CLT?
Yes, maybe. Your method generates i.i.d. variables with finite moments, and is thus squarely in the domain of the CLT. By the Berry-Esseen theorem, the difference looked quite surprising to me.
But since no statistical tests such as the Kolmogorov-Smirnov test were done, it might be just a fluke. An artifact.
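A stdlib-only sketch of the Kolmogorov-Smirnov distance pj refers to, on hypothetical example data (not the thread's actual samples):

```python
import math
import random

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ks_statistic(samples):
    """KS distance between the empirical CDF of the standardized
    samples and the standard normal CDF."""
    n = len(samples)
    m = sum(samples) / n
    s = (sum((x - m) ** 2 for x in samples) / (n - 1)) ** 0.5
    z = sorted((x - m) / s for x in samples)
    d = 0.0
    for i, x in enumerate(z):
        c = normal_cdf(x)
        d = max(d, abs(c - i / n), abs((i + 1) / n - c))
    return d

rng = random.Random(1)
gauss = [rng.gauss(0.0, 1.0) for _ in range(2000)]
mix = [rng.gauss(-3.0, 1.0) if rng.random() < 0.5 else rng.gauss(3.0, 1.0)
       for _ in range(2000)]
print(ks_statistic(gauss))  # small: consistent with normality
print(ks_statistic(mix))    # much larger: the mixture is visibly non-normal
```

Note that estimating the mean and sd from the data (as here) changes the critical values (Lilliefors correction), so this is a sketch of the statistic, not a full test.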

The older I grow, the more I distrust the familiar doctrine that age brings wisdom
Henry L. Mencken 



EGH


Total Posts: 59 
Joined: Nov 2014 


2 and 90 or so, binary random switch. "i.": yes!! "i.i.d.": hmm, I would not say so. Only 10,000 runs, so you just have to wait for the CLT to turn it into a Gaussian. It looks like this Gaussian curve has some kind of high peak? Probably just an artifact; it might be just a fluke.
The chart seems to fit my intuition at least.
For short runs it is not even important whether the randomness is pseudo or true. For a very long run one can distinguish several crappy pseudo-random generators from true randomness at a glance. But yes, some pseudo-random generators are very good imitators.




pj


Total Posts: 3440 
Joined: Jun 2004 


10000 observations, eh? You should be normalizing by sqrt(N), the standard deviation of a sum of N fair +/-1 flips.
Therefore the normalization should be of the order of sqrt(10000) = 100. If your peak is over 5000, your normalization seems to be seriously off.
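A quick sanity check of that sqrt(N) scaling (my own sketch; the counts are illustrative):

```python
import math
import random

# A sum of N fair +/-1 flips has mean 0 and standard deviation sqrt(N),
# so for N = 10_000 flips per run the natural normalization is 100.
rng = random.Random(7)
N = 10_000
sums = [sum(rng.choice((-1, 1)) for _ in range(N)) for _ in range(300)]
sd = (sum(s * s for s in sums) / len(sums)) ** 0.5
print(sd)            # close to 100
print(math.sqrt(N))  # 100.0
```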

The older I grow, the more I distrust the familiar doctrine that age brings wisdom
Henry L. Mencken 



EGH


Total Posts: 59 
Joined: Nov 2014 


Not 10,000 flips, 10,000 runs ("Only 10 000 runs"). The number of flips is naturally massively higher than 10,000 based on the recipe; that can even be seen at a glance from the figure itself.
Yes, when comparing with, for example, a Gaussian curve, one must naturally scale and make sure the total probability space matches. The sum must be 1, as long as we avoid negative probabilities, but that is off topic here.
A question for you: to get something as close as possible to a normal distribution from perfect coin flips, should, for example, five heads in a row count only as one run of 5, or as runs of 1, 2, 3, 4 and 5? This is critical, and could it be the reason for confusion about our figures (fig 3, for example)? We worked through several alternatives here and think we got it right, but small changes here in the code give very different results and "interpretations". It is all quite logical once you are into it. I am not so into it now, but I worked considerably with coin-flipping machines some years ago.
I think programming and playing around with coin-flip machines (both pseudo- and true-random) gives a lot of intuition; describing everything to others in a simple way, consistent with the existing literature, is naturally another matter.
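The two counting conventions in the question can be made concrete (my own sketch; the thread's actual code is not shown):

```python
from collections import Counter

def maximal_runs(bits):
    """Each maximal block of heads counts once, at its full length
    (five heads in a row -> one run of length 5)."""
    counts, length = Counter(), 0
    for b in bits + [0]:          # sentinel flushes the last run
        if b:
            length += 1
        elif length:
            counts[length] += 1
            length = 0
    return counts

def cumulative_runs(bits):
    """Every prefix of a block counts (five heads in a row ->
    runs of length 1, 2, 3, 4 and 5 are all tallied)."""
    counts, length = Counter(), 0
    for b in bits + [0]:
        if b:
            length += 1
        elif length:
            for k in range(1, length + 1):
                counts[k] += 1
            length = 0
    return counts

print(maximal_runs([1, 1, 1, 1, 1]))     # Counter({5: 1})
print(cumulative_runs([1, 1, 1, 1, 1]))  # Counter({1: 1, 2: 1, 3: 1, 4: 1, 5: 1})
```

Which convention is tallied changes the resulting run-length distribution drastically, which could indeed explain figures that disagree.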



pj


Total Posts: 3440 
Joined: Jun 2004 


Apologies for the confusion; I totally misunderstood what you were doing. You were randomly generating sums of either 21000 or 2000 flips, and of course the distribution thus obtained isn't normal. Thank you for your patience.

The older I grow, the more I distrust the familiar doctrine that age brings wisdom
Henry L. Mencken 



EGH


Total Posts: 59 
Joined: Nov 2014 


"I was totally misunderstanding what you were doing."
When writing papers it is important to leave some room for misunderstandings (for example by avoiding the language of precision: math), because there is so much focus on impact factor these days. What is the best measure of impact factor? A former hedge fund quant recently suggested
impact factor = (number of pissed-off people)^2
At least some (all) politicians seem to follow this "success" formula. Imagine if all political statements had to be backed by math and coin-flipping machines.



EGH


Total Posts: 59 
Joined: Nov 2014 

 


You needed *help* editing a 1-page note?




"Critics and comments as always welcome." You're ugly and your mother dresses you funny. 




EGH


Total Posts: 59 
Joined: Nov 2014 

 


If you want to be taken seriously by the physics community, you should quote someone other than yourself, too. You won't get anywhere by ignoring the existing literature. Your lack of familiarity with existing research has led you to make this bogus statement:
"modern physics abandoned the study of minimum spatial dimensions"
A quick search will reveal some papers on the topic:
https://arxiv.org/pdf/gr-qc/0601097.pdf
https://arxiv.org/ftp/gr-qc/papers/0304/0304032.pdf (speculative)
https://www.amazon.co.uk/dp/B000USUDNG/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1
+ work on micro black holes...




EGH


Total Posts: 59 
Joined: Nov 2014 


Thanks! Yes, I need to add more references.
Yes, it should likely be changed to "modern physics abandoned the study of minimum spatial dimensions for particles". Space itself is continuous in atomism; the indivisibles, on the other hand, have spatial dimension. One could argue that space then to some degree can/should be modeled as discrete, since one can only measure something with the help of indivisibles. Well, it is more complex than that, and something I am working further on.
Yes, I will add some criticism of micro black holes, but likely in a separate paper; I touched upon micro black holes in several of my other working papers. Micro black holes interestingly share some of the same mathematical aspects as indivisible particles and the collisions between them, but indivisibles seem to give more logic.
Interesting topics, and only an early working paper from my side; it needs a lot of improvement.


