
T±¡: 5 Ðimotions & worldcycle

±∞ ¬∆@ST:


Dark Forces in time:




Some basic principles of epistemology and truth.

Before we enter the exploration of the ‘invisible scales’ of forces with which our ∆ø humind interacts to achieve the simplest action of motion (for which perception is not needed, as pure motion and pure static perceived form have inverse properties), we shall clarify a few ‘first concepts’ of great importance that often come up on this blog in the argument on the foundations of physics and the limiting realms of the humind, which are the quantum and cosmological scales:

  1. The equal value of all scales of Nature. It is obvious that the more removed we are in 5D scale, the more uncertainty we obtain in our analysis; hence the relative unimportance of physics in those realms for understanding a Universe with likely infinite scales. Those scales are NOT more important and are MORE distorted to human observation, so they are less relevant for a philosophy of science. Why then do physicists have so much intellectual prestige? There is the worldly power – they make our technological machines, which today are more important than life to this planet, as its company-mothers of machines dominate our eco(nomic)system. But that is not the theme of this post. The theory behind it is. It is called the ‘constructionist hypothesis’, which pretends that the lower scales CAUSE the upper scales. This is NOT true, as we have seen in many posts, since scalar causality is dual: information flows from lower, faster time scales with more cyclical form=information, and energy flows from larger scales. So causality is dual. And further on, all scales are self-similar – facts defined long ago by system scientists.
  2. This said, in this post we shall deal with the smallest scales, which belong to quantum physics. Two important questions on those scales concern the difference between the quantum scale of particles and the larger thermodynamic scale of molecules. In physics it matters more to understand the quite evident scale of thermodynamics, our scale. And when we do so, it becomes obvious, as Einstein put it, that ‘the statistical quantum theory would … take an approximately analogous position to the statistical mechanics within the framework of classical mechanics’. How this happens is easy to understand in 5D metric, and, time permitting, we shall include a more ‘pro’ mathematical analysis for specialists. SINCE in 5D metric there IS no DISPUTE between Einstein and Bohr. First, S=T means we always have equivalences between space-form-statistical populations and time-motions-probabilities. The 0-1 time description IS BETTER FOR the faster ‘clocks of smaller particles’, according to the 5D scalar metric: smaller scales in space run faster time clocks – time, not space, thus becomes dominant in quantum physics, hence better described probabilistically. The Universe is thus probabilistic in smallish, time-dominant scales (the 0-1 mathematical unit sphere, after normalization of parameters) vs. STATISTICAL in larger SPACE (the 1-∞ thermodynamic plane). Yet BOTH ARE EQUIVALENT MATHEMATICAL FORMULATIONS (a fundamental theorem of measure theory – the 0-1 sphere and all its laws are equivalent to the 1-∞ plane, which is better for slower, LARGE thermodynamic ensembles that occupy more space).
  3. Finally, it is essential to accept that in 5D scales the c-speed limit is ONLY the speed limit of the light space-time scale, as S x T = Constant. So in smaller scales speeds ARE faster, both in rotary clocks, as we have just shown, and in lineal inertia. This means gravitational scales have c as their lower speed limit and are often, outside the realm of high density of light space-time (that is, between galaxies), faster, non-local. So there IS a quantum potential (Bohm), NOT a hidden variable, but the invisible ∆-4 non-local = faster-than-light scale we shall study in this post. Non-locality IS experimentally proved both in intergalactic space (or else galaxies would not interact) and at the under-light, smallish level of invisible particles (Bohm’s model, which we accept in 5D as the closer to reality). This brings a series of important consequences both for human stience and reality, which we study here.
  4. Finally, among the key insights of 5D applied to those lower scales there is their invisibility, natural to the relativistic perception of reality centered in the inner ∆ø, which has its limits beyond which all becomes invisible; and, on the other hand, the fact that faster motions in space are seen as larger distances. So it is precisely the faster-than-c speed of the ∆-4 scale, WHICH IS INVISIBLE, that we see as ‘expansion of space-distances’ between galaxies (balanced by the implosion of form within galaxies that creates mass from vacuum space, making the Universe a wobbling, immortal, infinite entity). It is also obvious, abounding in the uncertainty of measure, that because time is quantic, discontinuous, cyclical, it can only be measured when a ‘cycle is completed’ as a unit of time (in humans the second)… But as 5D metrics allow different speeds of time, for a being with a measuring mind that has a huge length in its time quanta, all that happens within that time quanta of measure at faster speeds will seem simultaneous, non-local, even if the phenomenon takes some time to ‘travel’ from A to B, or from a, b, c… points into the synchronous knot of communication of them all – the mind that absorbs those ultra > c speeds as pixels of form. So BECAUSE WE SEE the Universe of invisible gravitation simultaneously, it must be, between galaxies, faster than light, and all that we perceive as simultaneous distance is caused by that faster-than-light speed perceived with our slower light eyes.
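The scalar metric invoked throughout the list above (S x T = K) can be sketched numerically: spatial size times time-clock speed stays co-invariant, so halving a system's size doubles its clock frequency. A minimal sketch, in which all numeric values are illustrative placeholders rather than measured constants:

```python
# Sketch of the 5D metric S x T = K described in the text:
# spatial size times time-clock speed (frequency) stays co-invariant,
# so halving a scale's size doubles its clock frequency.
# All numbers here are illustrative placeholders, not measured constants.

K = 1.0  # the co-invariant product for one family of scales

def clock_speed(size):
    """Time-clock speed (frequency) of a scale with the given spatial size."""
    return K / size

sizes = [1.0, 0.5, 0.25, 0.125]          # each scale half the size of the last
freqs = [clock_speed(s) for s in sizes]  # faster clocks as size shrinks

for s, f in zip(sizes, freqs):
    # the product S x T is the same at every scale
    print(f"size={s:<6} clock speed={f:<6} S*T={s * f}")
```

Reading the output top to bottom reproduces the claim of point 3: smaller scales in space run faster time clocks while the product stays constant.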


The faster than light – expansive distances – invisibility of the ∆-4 plane.

Let us try to fully grasp this essential property of time, mind and measure. When a wheel turns fast you see it as solid, because you see simultaneously in your eye-mind all that happens within a second, and so all the points of the turning wheel might seem to be in the same place at the same time (within that second), even if IN A FASTER TIME SCALE they would be recognisable, pictured in slow motion, as clearly different spokes. This concept of simultaneity of measure, embedded in special relativity, is very relevant for many phenomena of perception and stientific description of reality, and a source of much confusion when humans study in detail those very fast, small scales of 5D.
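In signal-processing terms the spinning-wheel example is temporal aliasing: an observer sampling once per time quantum cannot distinguish spoke positions that recur within a single quantum. A hedged sketch, where the sampling quantum and wheel speed are made-up values:

```python
import math

# Temporal-aliasing sketch for the spinning-wheel example in the text:
# a mind that samples once per 'time quantum' cannot distinguish spoke
# positions that repeat within a single quantum. Values are illustrative.

def spoke_angle(t, rev_per_quantum):
    """Angle (radians) of one spoke at time t, for a given rotation rate."""
    return (2 * math.pi * rev_per_quantum * t) % (2 * math.pi)

quantum = 1.0        # the observer's time quantum (one 'second' of mind time)
fast_wheel = 10.0    # revolutions per quantum (whole turns between samples)

# Sampled exactly once per quantum, a wheel completing an integer number of
# revolutions per quantum shows the same angle every sample: it "looks solid".
samples = [spoke_angle(n * quantum, fast_wheel) for n in range(5)]
print(samples)  # all (numerically) the same angle
```

A faster "camera" (smaller quantum) would resolve the intermediate spoke positions, which is the slow-motion picture the text describes.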

The most obvious example is human thought: what we see as ‘simultaneous’, non-local, IS the entire planet, because the speed of light is so great that everything within a second, including the moon, seems to us simultaneous, non-local, co-existing in the same time-quanta.

The light from that distant mountain you see below the horizon TAKES some time to arrive here, but it seems simultaneous, non-local in its ‘infinite speed’ to our one-second senses, to the point that until the XVII century humans thought light was ‘instantaneous’, non-local, as they could not even measure its speed. Galileo tried with lanterns on faraway mountains, but the reaction time of the person who had to emit the ray and of the one who received it and stopped his clock – which should have allowed a time difference between both locations to be found, measuring the delay due to the motion of light between both points – was much larger (normally two seconds: one to perceive, one to act) than the tiny fraction of a second it took light to go from mountain A to mountain B. Only when measurements were made from Jupiter’s faraway satellites could we get a meaningful measure.
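The reason Jupiter's satellites worked where Galileo's mountain experiment failed is plain arithmetic: over mountain distances the light delay is tens of microseconds, hopelessly below a roughly one-second human reaction time, while across Earth's orbit it grows to minutes. A quick order-of-magnitude check with standard values (the 10 km mountain gap is an assumed figure):

```python
# Order-of-magnitude check: why terrestrial mountains could not reveal
# the speed of light, while timing Jupiter's satellites (Romer's method) could.
c = 299_792_458                       # speed of light, m/s
mountain_gap = 10_000                 # ~10 km between two mountains (assumed)
earth_orbit_diameter = 2 * 1.496e11   # 2 AU in metres

delay_mountain = mountain_gap / c         # ~3.3e-5 s: drowned by human reaction
delay_orbit = earth_orbit_diameter / c    # ~998 s: about 16.6 minutes

print(f"mountain-to-mountain delay: {delay_mountain * 1e6:.1f} microseconds")
print(f"across Earth's orbit:       {delay_orbit / 60:.1f} minutes")
```

The two-second human reaction time quoted in the text is about 60,000 times the mountain-to-mountain delay, which is why only the astronomical baseline gave a measurable signal.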

Non-local quantum fields below light space-time – the next scale down, which Bohm formalised – must therefore be much faster than light, to the point of seeming non-local; or else causality, essential to time processes (even if it is sometimes co-causality, multiple causal rays joined in a point), would not happen.

Normally those parameters that quantify the difference of speed (s/t), density (t/s) and momentum (s x t) between scales and their species run on decametric, ternary potencies, as the 5D scales do. So ±1, 10, 100, 1.000, 10.000 are the commonest differences between scales. I.e. the fine structure constant measures the difference between the light-scale speed and the next upper ∆+1 electronic scale: around 100 (137).

The scaling difference between photons and electrons and the lower ∆-1 scale, though, is larger.

In the graph, experimental evidence of faster-than-light intergalactic scales: a 10c quasar jet at ∆+4, and quantum non-locality at ∆-4 at 10.000 c.

After all, dense photons are just the ‘cellular’ level of electrons. So, for more complex, detailed reasons concerning the parameters of GST scaling, action at distance should be on the α² bidimensional speed scaling – compared to the larger light world, around 10.000 times faster – and all that happens within 10 thousand times more distance within the minimal quanta of human observation should seem to us NON-LOCAL. Actually, a Chinese lab recently measured non-locality, which turned out to have a speed of 10.000 c for its ‘thinner’ minimal messages – an upper boundary for the maximal speed of non-local effects. And so we do observe all kinds of non-local c < V < 10.000 c PHENOMENA, which of course theorists then vehemently deny, to uphold their pretension of absolute truths under c speed – which is only the speed of information in the scale of light space-time, as the substrata of that scale is the background light radiation.
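The two scaling factors quoted above can at least be checked against the measured fine-structure constant: 1/α is about 137 (the "around 100" jump), and 1/α² is about 18,800, i.e. of order 10,000 (the claimed bidimensional jump). A quick computation with the CODATA value:

```python
# The scaling factors quoted in the text, computed from the measured
# fine-structure constant (CODATA: alpha ~ 1/137.036).
alpha = 7.2973525693e-3   # fine-structure constant, dimensionless

linear_scaling = 1 / alpha        # ~137: the "around 100" jump in the text
bidimensional = 1 / alpha ** 2    # ~18,779: the "order 10,000" jump

print(f"1/alpha   = {linear_scaling:.1f}")
print(f"1/alpha^2 = {bidimensional:.0f}")
```

Note that 1/α² comes out closer to 19,000 than to exactly 10,000, so the match to the quoted 10.000 c figure is an order-of-magnitude one.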

So, as you cannot travel faster than a plane when you are inside of it, you cannot travel faster than light when you are inside a galaxy sustained by a light space-time substrata. But outside, in the dark world, you do have the possibility to go faster.

Thus c-speed (v=s/t) and 0 K temperature (the limit of friction, the disorder that impedes faster speeds) are JUST the limits of our light space-time world. And we reach this conclusion in many parts of this blog in many ways, from the different ∆ºST perspectives, as is customary in GST – this perspective being that of the ‘mind paradoxes’ of perception.

Let us consider the other perspectives on faster than light.

• mind: The 3 dimensions of light space-time. Its functions in all systems made of it.

Let us consider the meaning of faster-than-light speeds from the perspective of reproduction and perception of information. The mind’s quanta of time and speed simply cannot see ‘gravitational, smallish, faster-than-light carriers of information’. But smaller particles do, as the dark world is ∆±4 while light, electrons and atoms exist in the neighbourhood: ∆±3,2.

So the dark world IS the ∆-1 ‘energy-feeding level’ for a photon… which is why it follows, as prey-predators do, the ‘scattering’ rays of Bohm’s pilot wave theory.

Don’t roll your eyes. We can explain all phenomena from all languages and povs. So, as everything has topo-bio-logical properties, we can always make an abstract mathematical explanation in ‘detail’, but also a biological, organic explanation in ‘whole vital terms’.

Thus light merely warps≈feeds≈evolves the entropy of the lower scale-field, ∆-1 – which to us is invisible gravitation – into visible information; and for that reason, as information must be copied and imprinted on the formless quantum potential field, it takes time and reduces speed to c-light.

Indeed, gravitation from our mind’s pov must have less information and more speed, as experiments prove (gravitation’s non-local, invisible action at distance, due to the lack of humanly detectable information).

And if we plug this fact into the equation of ‘cyclical time-speed’, it gives us an ∞ speed for the quantum potential field/gravitational field:

v = $/ð, with ð → 0 ⇒ v = s/0 = ∞
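The limit written above is just v = s/t with the time quantum sent towards zero; numerically, shrinking t makes v grow without bound, which is how the text reads "infinite, non-local speed". A trivial sketch with illustrative units:

```python
# The equation v = s/t from the text, with the time quantum t -> 0:
# speed diverges, matching the 'v = s/0 = infinity' limit above.
s = 1.0  # a fixed spatial step (illustrative units)

for t in [1.0, 1e-3, 1e-6, 1e-9]:
    v = s / t
    print(f"t={t:<8} v={v:.0e}")
# v grows as 1/t: in the limit t -> 0, v -> infinity
```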

Below we see the humind that best understood this paradox – Monet’s first ‘impression(ist): sunrise’ painting, in which space is no longer painted as a ‘background’ Cartesian canvas, but as the frozen view of light rays by the human eye. Physicists, though, still don’t get what artists’ eyes saw intuitively. We shall in that sense consider the relationship of art with physics as a pioneer of its space-time analysis: from the work of Alberti on perspective, way before Desargues found projective geometry and Descartes the self-frame of reference; to the work of Leonardo, under his lemma of a living Universe of organic forms we must learn how to see (saper vedere), who certainly would have enjoyed immensely this blog and the part on topological vital space-time; to the impressionist realisation that light is the mind of space; to Picasso’s analysis of imaginary worlds made only with ‘lineal motions’ (cubism) or cyclic, reproductive yin-female ‘forms’ (post-war period).

In the graph, the infinite speed of the quantum potential of gravitation that feeds light, which follows the ‘string’ of gravitation stretched between the emitter and perceiver particles, which lock each other in entanglement (neutrino theory of light).

And so, when both particles have connected through non-local action at distance in the dark world, they can guide their motions; they can assess their relative distance (which is the main information a single line provides); and, as a secondary effect – since they are ‘locked’, joined by the gravitational string, regardless of the external speed of their world – the speed of light communication between both will always be constant, as they are in relative stillness to each other (an explanation of the constancy of c-light speed from the @mind perspective).

All this has in fact been explored by physicists in their ‘fringe theories’, notably Feynman’s absorber theory, wherein the two solutions of electromagnetism, one with a negative sign, are considered two rays simultaneously produced by the A and B particles. But, as usual in physics, while all has been discovered ‘mathematically’ by the mere pedestrian process of manipulating algebras, its deepest meaning is not understood – this is the man who said the ‘why’ is what he never questions, an extraordinary mathematical physicist with the usual conceptual peanut brain of his practitioners.

The mind of man, as Kant understood, is EUCLIDEAN, because our space is Euclidean, and it is so because it is light space-time, made of three perpendicular fields able to carry information: the electric field, the magnetic field and the c-speed direction, the longest direction of view. This explains why we indeed look ahead in the horizontal plane, but have an informative head in the height direction of information, as does the electron that produces the ray of light, moving up and down (left, bottom); and finally store ‘energy’ in the width, MAGNETIC ‘BELLY’ direction (: ALL HAS changed in the evolution of light through 5D scales till creating man, but all has ultimately remained the same: a game of vital dimensions of space-time.

The consequences of the previous graph for the understanding of both the humind (human mind) and the external Universe of space-time are multiple: from a proper understanding of special relativity, to the analysis of magnetism as an independent force, NOT an observer’s effect as modern physics thinks, to the fundamental analysis of light and photons as the minimal organism of our known-known Universe – each of those themes exploring an element of the ∆@st light supœrganism.

Back to the understanding of motion as reproduction of information.

What this means is that motion is not only relative, but related to the density of information of a system: as mass-information grows in density, the system takes longer to reproduce the Ti element of v=s/Ti, and so it slows down. At the other extreme, when information tends to zero, speed increases.

So as we humans perceive information with light, which is our limit of speed of transmission of information, we can perceive c=s/Ti. But this wave of light is really impressing a quantum field of action at distance, faster than light (Bohm’s discovery in the pilot-wave theory, which we have to marry with the non-observance of the particle during the motion to fully grasp the process).

It is then obvious that, from the proper endophysical perspective of the human visual mind, faster-than-light speeds cannot transmit information among us. But – and this is again a topic egotist error of the humind – this doesn’t mean, as physicists claim, that other species, specifically those on the ‘verge’ of ∆±4 (galactic black holes and quantum particles), which are connected directly to their ∆±1 scales, cannot communicate information through (to us dark) gravitational waves.

It is then the humind that sees those flows of information as ‘invisible’ actions at distance – but, as we predicted, even action at distance has a limit of speed… which seems to be in the 10.000 c range.

The Dark World. E<cc>M and Beyond: the c-t limits. 

The dark world is, according to 5D metric (S x T = C), ‘faster than light’ both in its lineal inertia (S), which is seen as expansive space between galaxies – since its motion is invisible – and in its temporal speed or frequency, 1/t=ƒ, in the rotary motion of black holes, which therefore store maximal information beyond the c-speed event horizon – a fact which, in the complex mathematical model, we formalize as t<0. Thus 0 K and c-speed become merely the limits of our light space-time plane in terms of lineal speed and cyclical/angular momentum – the singularity center and the clock-like membrane that encloses and controls a vital energy – and have their maximal example in the black hole, an accelerated vortex of gravitation that therefore should go faster than c past the event horizon; and should move ‘faster than light’ if we adopt the point of view of the black hole, as it ‘creates’ distance-speed by ejecting at faster than c (Kerr metric) the absorbed light energy, which becomes a jet of dark entropy at its poles. 5D in that sense has no hang-ups with postulates such as Einstein’s c-speed limit or the concept of entropy as the single arrow of time, which are just born out of the ego and naive realism of huminds, but are neither logically nor experimentally sound.

Humans confuse things they don’t know with dogma… Top quarks and black holes cannot be fully explained considering a single continuous spacetime plane. Because for them to work as quark stars able to eject dark entropy and dark matter through their vortex, they NEED a ‘gravitational plane’ at faster-than-c speeds, beyond the light space-time plane we ‘see’, limited by Einstein’s relativity to c-speed.

But, while respecting Einstein, ‘beyond’ the c-speed plane there is immense evidence of the galactic plane of ‘dark entropy’ and ‘dark matter’ going faster, hence allowing the black hole to emit dark entropy seen as expanding space between galaxies. A few proofs:

– The gravitational vortex (graph) IS accelerated. This is the basic TRUTH of Einstein’s general relativity: the Principle of Equivalence between gravitation and acceleration.

So obviously BEYOND THE EVENT HORIZON THE SAME PROCESS – LIGHT SPACE-TIME ACCELERATED BEYOND C – happens, and LIGHT must become something else, as when it is accelerated in accelerators at c-speed. What does it become in accelerators? Obviously, QUARKS! SO THAT IS WHAT IT WILL BECOME IN BLACK HOLES: LIGHT BECOMES QUARKS! EASY, THEORETICALLY SOUND, PROVED EXPERIMENTALLY – just that Mr. Hawking, Mr. Penrose and Mr. Wheeler – the guys of the mathematical singularity – didn’t see it. But we don’t see gravitation either, and yet infer its laws. Nobody denies gravitation because we don’t see it. Nobody should deny the obvious conversion, in a vortex of c-speed spacetime beyond the event horizon, into quarks, as happens at CERN in its accelerators.

GRAVITATION IS AN ACCELERATED VORTEX OF TIME, which follows the same laws as A VORTEX OF THERMODYNAMIC time (hurricane, eddy) or a vortex of quantum time (charge). The 3 are in fact easily UNIFIED by a simple 5D metric equation (which physicists have been trying to find for 100 years, but cannot in a single plane of space-time – AN ABERRATION OF CREATIONIST MATHEMATICS THAT ONLY EXISTS IN THE CARTESIAN CONTINUOUS SINGLE GRAPH THAT NEWTON USED TO CREATE THE CONCEPT OF A BACKGROUND SPACETIME).

We live – and on this even the most retarded physicist will agree – in a relational space-time: we ARE MADE OF VITAL space-energy and cyclical time membranes (energy and angular momentum, the conserved quantities)… Read the central page on unification, which explains it in perfectly simple terms.


Below you see the scales of the big bang; we have skipped the lesser chemical and thermodynamic big bangs. So you are left with the non-existence of the cosmic one: the background radiation, as we explained in other articles, CAN ONLY REALISTICALLY BE PRODUCED BY A SINGLE SPECIES IN THE COSMOS – A BLACK HOLE THE SIZE OF THE MOON, PROVED EXPERIMENTALLY.

SO again, WHEN we APPLY REALISM to physics and abandon the belief that all inflationary mathematics is REAL, we are always left with a single candidate in an economical Nature to be the substance of those beings: A SOUP OF heavy QUARKS IN BLACK HOLES, and A BLACK HOLE MOON MACHO FOR THE BACKGROUND RADIATION (a black hole that eats a moon will have the temperature required to reproduce the redshifted background radiation at 2.7 K).
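The moon-MACHO claim can be confronted with the standard Hawking temperature formula, T = ħc³/(8πGMk_B): solving it for the mass whose Hawking temperature is 2.7 K gives roughly 4.5×10²² kg, within a factor of two of the Moon's mass (7.3×10²² kg) – a same-order-of-magnitude coincidence rather than an exact one. A quick check with CODATA/IAU values:

```python
import math

# Hawking temperature T = hbar * c^3 / (8 * pi * G * M * k_B),
# solved for the mass M whose temperature equals the 2.725 K background,
# then compared with the Moon's mass. Standard CODATA/IAU values.
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 299_792_458          # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.380649e-23       # Boltzmann constant, J/K
moon_mass = 7.342e22     # kg

T_target = 2.725         # K, the measured background temperature

M = hbar * c**3 / (8 * math.pi * G * k_B * T_target)
print(f"mass for a 2.7 K Hawking temperature: {M:.2e} kg")
print(f"Moon mass:                            {moon_mass:.2e} kg")
print(f"ratio: {moon_mass / M:.2f}")  # same order of magnitude, not exact
```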


To the point:

In the graph, the ∆±4 scales of physical systems, self-centred in the human @-pov, and the 5D metric equivalence between their time speed and space distance, which gives us their co-invariant energy in all scales (S x T = K).

Once these CORRECTIONS are done, and light space-time is kept for those regions where we see light, we find, above, a slower thermodynamic scale and, below, a faster intergalactic scale, which I would simply call the ‘dark scale’ of faster-than-light rotary top quarks, cb-atoms and dark entropy expanding the vacuum.


NOW, physicists think this is not happening BECAUSE THE INTERACTION OF TOP QUARKS AND CB-ATOMS IS SO FAST – their rotary motions are so incredibly fast, billions of interactions in a second – that they think those quarks cannot interact through the strong force, whose coupling constant is slower, IF they rotate at less than c-speed.

But IF THEY ROTATE faster than c-speed – which is proved by the billions-of-times-per-second frequency at which they switch on-off from quarks into antiquarks at FASTER THAN C, and by the accelerated vortex beyond the event horizon of the EFE equations – heavy quarks can become stable cb-atoms and top quark bosons.

So, by symmetry with pulsars, black holes will be dark stars with a center of top quarks and a cover of cb-atoms.

WHY DO I THINK THEY ARE LIKE PULSARS – which have STRANGE QUARKS IN THE CENTER AND light atoms on the surface – hence with top quarks in the center and CB-ATOMS ON THE SURFACE? BECAUSE THE UNIVERSE IS PERFECTLY SYMMETRIC IN SCALES AND FORMS, and it is ECONOMIC, iterative; and so pulsars and black holes are symmetric frozen stars, as Einstein wanted.



In the graph, galaxies are fractals of stars and dark quark matter built with 3 topologies: a reproductive body of stars sandwiched between an informative nucleus of black holes and an external halo of dark matter, probably strangelets and other dense stars. The closest self-similarity in our world scale is a cell, and in the quantum scale an atom.


In the graph, the ‘gala cell’, an organism of stars joined by a network of ‘nervous’, gravitational information, composed of the 3 relative families of mass of increasing density, which act as the ‘DNA-informative centre’ (top quark stars, aka black holes, as top quarks are the only ∆-1 ‘points’ with the same density as black holes), the hard protein membrane (the strangelet halo of dark matter – Witten’s hypothesis), and a visible electromagnetic network of ‘stars’, the energetic mitochondria that become the food that reproduces both strangelets and black holes. It is a simple organic scheme that explains the whys of physical particles and the ternary structure of galaxies, similar to that of any organic system of the Universe. Yet such models are not even explored by astrophysicists, as they are based on organic concepts, which physicists by ‘dogma’ cannot accept.

ALL THIS IS WHAT EINSTEIN ASKED FOR in everything he wrote about philosophy of science – the limits of mathematical equations, the hypothesis of frozen stars, the relational nature of space-time, the use of c-speed limits only in phenomena related to light, you name it: ‘Leibniz is right, there are infinite time clocks in the Universe, but if so we have to rebuild physics from its principles’…

Which is what I do in these texts. But I don’t DENY EINSTEIN, as the singularity guys do, who deny his affirmation that black holes will only be real when they have a cut-off substance, and believe, as creationist mathematicians do, that if they ‘talk’ numbers, alas, reality appears.

Fact is, the black hole is a frozen star, and the dark world goes beyond c-speed between galaxies, since special relativity and c-speed apply only to the luminous space-time of galaxies, which is also the substance of the background light radiation. Beyond galaxies and within black holes, OBVIOUSLY the light limit doesn’t work, BECAUSE THERE IS NO LIGHT THERE TO IMPOSE THE LIMIT.

ST-perspective: travels in time.

Now, the most beautiful perspective on faster-than-light speeds comes from a proper understanding of the quoted Feynman absorber theory – an interpretation of electrodynamics derived from the assumption that the solutions of the electromagnetic field equations must be invariant under time-reversal transformation. Indeed, those equations do not single out a preferential time direction and thus make no distinction between past and future, but consider both rays – one from past to future and one from future to past – to happen at the same time.

Why then do physicists discard one solution? Essentially because they do not understand the logic of the 3 arrows of time, and its local past-to-future and future-to-past converging flows, which create a present simultaneous event in space.

But that is exactly what those two solutions show. 

Maxwell’s equations, and the equations for electromagnetic waves in general, have two possible solutions: a retarded (delayed) solution and an advanced one. Accordingly, any charged particle generates waves, say at time t0=0 and point x0=0, which will arrive at point x1 at the instant t1=x1/c, and other waves which will arrive at the same place at the instant t2=−x1/c, before the emission (advanced solution).

The latter, however, violates the causality principle: advanced waves could be detected before their emission. Thus the advanced solutions are usually discarded in the interpretation of electromagnetic waves.
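That both solutions are mathematically legitimate is easy to verify: any u(x,t) = f(t ∓ x/c) satisfies the one-dimensional wave equation u_tt = c²·u_xx, whichever sign is chosen. A numerical finite-difference check with f = sin (wave speed and test point are arbitrary illustrative values):

```python
import math

# Both the retarded solution u = f(t - x/c) and the advanced solution
# u = f(t + x/c) satisfy the wave equation u_tt = c^2 * u_xx.
# Verified here by central finite differences with f = sin.
c = 2.0    # wave speed (illustrative units)
h = 1e-3   # finite-difference step

def residual(sign, x=0.3, t=0.7):
    """|u_tt - c^2 u_xx| for u = sin(t + sign * x / c); ~0 means 'solves it'."""
    u = lambda x_, t_: math.sin(t_ + sign * x_ / c)
    u_tt = (u(x, t + h) - 2 * u(x, t) + u(x, t - h)) / h**2
    u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / h**2
    return abs(u_tt - c**2 * u_xx)

print("retarded residual:", residual(-1))  # ~0: solves the wave equation
print("advanced residual:", residual(+1))  # ~0: so does the advanced wave
```

The discarding described above is therefore a causality argument, not a mathematical one: the equation itself accepts both signs.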

In the absorber theory, instead charged particles are considered as both emitters and absorbers, and the emission process is connected with the absorption process as follows: Both the retarded waves from emitter to absorber and the advanced waves from absorber to emitter are considered. The sum of the two, however, results in causal waves, although the anti-causal (advanced) solutions are not discarded a priori.

Feynman and Wheeler obtained this result in a very simple and elegant way. They considered all the charged particles (emitters) present in our universe and assumed all of them to generate time-reversal symmetric waves.

It is exactly in this manner that the ‘present space-time light background’ of our perceived light Universe – the eternal present underlying reality – is formed. Since we live in a galaxy with a background radiation substrata of present, constant light space-time.

So what GST does, as usual with the awesome mathematical work of Mr. Feynman, is to add ‘consistency, logic’ and its whys (: even if he also had a life-long laugh at philosophers of science, with the usual egotist view of the ‘entropy-only’ makers of weapons, who think all comes from big-bang bombs ):

So this is what we add to the simple mathematical equations to explain action at distance with the absorber theory, just by moving the sign in t2=−x1/c to the side of time: −t2=x1/c.

If A sends to B, at c speed, a ray of light from the future to the past, it will take t1=x1/c time; but A is in the relative past, −t2. As the ray moves towards the future and A also moves to the future, both meet in the present.

Since the absolute values |t1|=|−t2| cancel each other, the time t1 it takes the ray to travel from A to B in the −t2 past is cancelled by the fact that the ray was emitted an equal amount of time back in the past; so both sum 0 and the ray arrives in the relative present for both beings – seen in our Universe as infinite, non-local speed.
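The cancellation argued above is the bare arithmetic of the two solutions: the travel time t1 = x1/c of the retarded ray and the emission offset t2 = −x1/c of the advanced ray have equal magnitude, so their sum is zero and both parties meet "in the present". Spelled out (the separation x1 is an arbitrary illustrative figure):

```python
# The sign-shuffling of the absorber argument in the text:
# retarded travel time t1 = x1/c, advanced emission offset t2 = -x1/c.
c = 299_792_458   # speed of light, m/s
x1 = 1.0e9        # separation of A and B in metres (illustrative)

t1 = x1 / c       # time the retarded ray takes, A -> B
t2 = -x1 / c      # the advanced solution: emission displaced into the past

assert abs(t1) == abs(t2)     # equal magnitudes...
print("t1 + t2 =", t1 + t2)   # ...so the sum is 0: the 'relative present'
```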

This is also implicit in Feynman’s sum of both fields, as he writes:

The assumption that the free field is identically zero is the core of the absorber idea. It means that the radiation emitted by each particle is completely absorbed by all other particles present in the universe, as they feed them.

If the incoming wave is absorbed, the result is a zero outgoing field. In the absorber theory the same concept is used for both retarded and advanced waves – the meal of the two electrons that emit the photons, which feed on the neutrino-string that guides the pilot wave.

The resulting wave then appears to have a preferred time direction, as Feynman discovered when adding both solutions to Maxwell’s equations.

So at the macroscopic level of huminds it respects causality.

However, this is only an illusion. Indeed, it is always possible to reverse the time direction by simply exchanging the labels of emitter and absorber. Thus the apparently preferred time direction results from the arbitrary labeling (the objective view), or from the role of the particle, which will feel itself to be both in the relative past of its causality-time as emitter, and in the relative future as absorber – hence ultimately in an eternal present (for us).

Why is that? From a different, psychological perspective on time – essential to understand how the human mind perceives time, regardless of the more objective ∆st perspectives on it – we deal with the logic of actions vs. the passivity of perceptors:

The ‘active part’, the emitter, is always looking at the future of his actions, which ‘move forwards’; but the passive perceptor is actually looking at the past, from where he receives the information or action.

And if you have not understood any of it, well, JUST remember Einstein’s quip: the separation between past, present and future is an illusion.

You live indeed in an astoundingly simple Universe in terms of its first substances, which the enormous number of spatial parts makes complex enough to ‘fog’ the forest. Science today cares only to describe each tree with its parameters of measure, and that is what it calls knowledge. We are interested in the absolute, synthetic knowledge provided by the organic whys, far simpler and more intuitive.

∆-perspective: The deepest truths: time reversals.

So why does light travel to the past from A to B? In organic terms, because it feeds entropically on the lower scale of the dark world of neutrinos≈strings≈gravitons.

The ‘formal’ answer is then more complicated than the intuitive vital one. And it brings us a key equation of ¬Æ:

It is called the reversal of time between scales of 5D. And it plugs into the dark world, theme of this post. It was first formulated, as usual, by a forgotten genius, Mr. De Broglie, and it is called the neutrino theory of light. Neutrinos are the ∆-4 singularity, which therefore becomes the entropic food of light. And entropy is literally, not only figuratively, a past motion, as in death. So when light eats neutrinos≈strings (same weak scattering angle≈size; likely the same concepts – gravitons≈neutrinos≈strings – explained from three different perspectives=functions=arrows), it follows a flow of the past and finally meets the entangled ‘other particle’, hunting the neutrino together. Now, for two neutrinos to create a ray of light, Jordan proved they just need to be emitted in perpendicular, inverse but parallel form, which is exactly what they do in Maxwell’s equations, in absorber theory, in neutrino-light theory, in GST, all over the Universe.

But alas, physicists’ wishful thinking decided such precision was not possible for lowly photons (which is why the usual suspect, Mr. Pauli, who also busted De Broglie – he, along with Bohr, had to be the genius – basically stole the material of the humble French aristocrat, who had also discovered the particle-wave duality today ascribed to the power broker, a banker’s son with political, financial and military clout, Mr. Bohr, also busy bullying Einstein – what a bore!)

So the dark world is faster than light; it is made of neutrino strings of gravitation, and it is the lower ∆-1 relative scale of light. We have now properly defined the invisible ∆-4 scale, which in the inverse ∆+4 world corresponds to the dark entropy traveling faster than c between galaxies (expansion of the Universe), produced by ∆+4 ‘stringy black holes’.

As usual, the maths of it are all over the place, scattered in physics papers not even physicists understand (: conceptually :)

So we shall not play the pedantic role of writing specialised equations here – physicists know what I am talking about and can plug in the articles if they so wish. Our goal is that any university major who has seriously studied his first-year courses, or even last-year high school physics, can understand the Universe conceptually better than any specialist does today, by adding on the main discoveries and the 5 Disomorphisms written in a simplified language.

Why we do this is obvious: there is an overgrowth of computer-generated maths that makes physicists think they (their machines, in fact) are so smart, exactly inverse to the degradation of conceptual thought among lazy, plugged-into-chips scientists. The real task left to mankind, though, is to upgrade its conceptual logic chip, and that is what we do here.



The invisible world of gravitation.

“I had, for a good many years earlier, been of the opinion that the space-time continuum picture of reality would prove inadequate on some small scale.” Penrose on twistors

The pretension of physicists to know a scale at which perception with our light>electronic instruments is ideally (π-3)/π, roughly 4%, leaving 96% of dark entropy and dark matter – which does NOT have energetic=present, evident effects on us – is a waste of the time and resources of stience, better applied to the ∆≤1 planes of the human Universe.

This said, we can hint at that vast Universe, which would descend down to the limit of theoretical Planck strings, of which the only experimentally sound particle of similar character might be the neutrino; as such, the unit of this first scale of reality would be the neutrino, ν or ƒ, as we shall consider it in our units of time – the ‘ideal particle’ of time of Planck size/frequency. This ideal particle, akin to a string (obviously so, as we have NO experimental evidence beyond the 4% an ideal ‘pi circle’ perceives of an opaque Universe), would be the vast reflection of a far larger universe made of superstrings of cosmic size; and/or would stretch to the ∆-4,5 potential scales of reality, playing somewhat different but similar physical games.




Theory on the invisible Planck scale – String theory

Those, though, are limits for the ‘social growth’ of those Tiƒ forms in networks, which physicists cannot ‘humanly’ resolve for the lowest scales… and it would be preposterous to pretend to understand, beyond theory, the dark-cosmological-Planck scale, which will always remain speculative.

Those scales below the Planck scale of strings have little evidence, and so while it is very interesting mathematical stuff to develop them theoretically – the more so with ¬Æ – it is also true that there is not so much interest, if we accept the ∞ of scales in the Universe, according to the most likely truths of the fractal Universe. It is also a fact that there will be variations in different scales and regions of the cosmos, and parallel universes in each ‘scale’ with different ‘dna-codes’, to put it in biological terms. So while the ‘eideos’, or ‘ideal’, would be a 10ˆ11 scale, it is obvious that in this universe scales are not that perfect. Still, it is remarkable to observe the clear scales down from man to atom (through cell and molecule), and then from atom to ‘dark matter’ (hence with 2 intervals of scales of particles and scales of photons)… This would be the ‘neutrino’ scale and/or that of the micro-particles of a neutrino wave… Below the neutrino scale, therefore, we can consider one of strings (open and closed – entropic and temporal strings, to call them properly).

I tried, back in my 20s (90s), during my years studying around American universities, to start an encyclopaedia of stiences to fully reorder all this stuff. But it was impossible to interest the ‘dogma’ in the new ‘paradigm’.

So, as in the case of the cosmos, we are not going to make much of a fuss about it. Neutrinos are an obvious gravitational background: invisible, below the light scale and its c, h constants. And the cosmological constant fits too well within canonical 5D theory, with its 3 space-time topologies according to its variation around 0 or 1 (in the reduced Ω version), to ignore them.

Now, the whole thing should be the ‘labor’ of professional ‘relativist’ and ‘particle’ physicists. I do have a lot of work done from the years I was fighting nuclear physicists in courts for ethical reasons, studying all those models in depth, but this will really be the last part of my work to be ‘filled’ with equations, when I finish all other chapters.

Humans, though, being what they excel at more than anything else (their ego and belief that they are above heaven and earth), will simply not give up and accept an objective, ‘limiting’ theory of our role within the Universe. So it is really a matter of beliefs and paper publishing. We can just ‘basically’ construct a parallel theoretical world in those lower scales, to make sense of them. And that is about all we can do, and what we will sketch in this post regarding neutrinos, dark energy and matter, and the cosmological scale.

So what scientists of the -4th scale – today mostly string and neutrino theorists, and theorists of the strong and weak forces, which act at ∆-4 (strong force) and at the ∆-4><∆-3 transition (weak force) – should do to complete the translation of their disciplines to stience is to beat their brains to mix neutrinos, the real stuff, with string theory: make string-neutrinos, make them faster, and then connect them with the quantum potential field of emergence in our scale, and so many other fancy things, in a background-independent field of time-space where strings are closed time strings and open space strings; and in a fractal universe, where strings do follow the same mathematical dimensional concept of 2 dimensions of space and 2 of time for each plane, emerging as a point on the surface of the larger ∆+1 scale, adapting all its maths – shake it, shake it, and pour it all over again. Good luck: a lot of algebra involved.


The first scales of the Universe – the bottom line of our perception of energy and information (light space-time, or 1st scale) and beyond, into the invisible world of gravitation (the 0, non-perceivable scale) and beyond, into the -∞ probable scales (string theory and beyond) of non-observable minimal Spe size – are considered, from a ‘physical perspective’, scales beyond experimental evidence, where theoretical analysis reigns supreme. Yet inasmuch as we consider experimental evidence the beginning of stience, before physics comes metaphysics, which therefore in ¬Æ stiences acquires a new meaning, as the ‘physical theories’ of the uncertain, unobservable scales of the Universe.

In that sense metaphysics and cosmology – the study of the ∆≥|4| scales beyond the ∆±3 scale of the atomic galaxy, clearly observed from our ∆o human point of view – are closely related. In fact there are cosmological strings just as there are smaller ones. But when analysing those ∆+∞ scales we shall consider the more ‘logic’ concept of a god or superorganism, the relative infinite element for the parts of a superorganism.

And so we reserve the more semantic name of the stience of metaphysics for the study of forces which have a great degree of uncertainty (invisible forces, dark matter and energy, and gravitation), or of the structure of the human mind, where scientific analysis has clearly lacked perspective.

We also use the term metaphysics consciously, because it implies a certain imaginary analysis, not necessarily truth, as is the case with many of the theories physicists develop on metaphysical scales beyond our perception (black hole evaporation, supersymmetry, string theory, etc.)

Consider the simplest form of the Universe: a unit interval, I, (0,1). It is a bit of entropy or motion, distance or space (both perceptions are correct).

This open string I can convert itself into a circle of perimeter π, with a diameter of 1, the original string. The circle will be made of 3 diameters turning around, with an aperture of 0.14; and so this entity, with a zero point at x=0.5, which can perceive through its 0.14 apertures some 4% of the external Universe, with 96% of dark space blinded by the 3 i-strings that turn around it, is the simplest time-space organism: the pi point. The pi point, as it turns out, reflects quite well the ultimate web of gravitational space-time over which the electromagnetic space-time we see exists.
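The aperture arithmetic of the pi point can be checked in a few lines. A minimal sketch, only verifying the figures the text uses (strictly, (π−3)/π ≈ 4.5% and 95.5%, which the text rounds to 4% and 96%):

```python
import math

# A 'pi point': a closed circle of perimeter pi formed by 3 unit 'diameter'
# strings, leaving pi - 3 of the perimeter open as perceptive apertures.
apertures = math.pi - 3                  # ~0.1416, the '0.14' of the text
perceived = apertures / math.pi          # fraction of the world it can 'see'

print(f"aperture length : {apertures:.4f}")
print(f"perceived       : {perceived:.1%}")      # ~4.5%, rounded to 4%
print(f"dark space      : {1 - perceived:.1%}")  # ~95.5%, rounded to 96%
```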

The Universe is bidimensional both at small and large scales (Holographic Principle), but its units, non-Euclidean points, reproduce in fractal patterns until emerging as a 4-dimensional social network of space-time, the Universe we live in. This discovery of the most successful model of quantum gravity (causal dynamical triangulation, at galactic scales) shows that even the simplest space-time membrane of gravitation is built with the isomorphisms of social networks. But in its deepest meaning, the fact that the Universe at the limits of human perception has ONLY 2 SPATIAL DIMENSIONS, WITH NO TIME-HEIGHT DIMENSION, means we are ‘touching’ two immortal limits of reality that might be the absolute scalar limits of the Universe – a philosophical question which belongs to the metaphysics of i-logic time.

Ultimately we are made of pi circles. This simple bidimensional form has therefore a lineal |-entropy space configuration and an O-closed one, and from the Planck scale these strings of space and time can create our 4D Universe, as causal triangulation has shown:

The scales of physical space departing from that bidimensional, gravitational, invisible world of strings are studied by astrophysics: the scale of strings of space and time; of quarks and gluons of strong forces; of ¥-light and electrons of electromagnetic forces; the human scale; and the gravitational, cosmic scale.



The limits of knowledge.

A consequence of the reversed entropy of information and energy crossing through 5D planes – which goes beyond the idol-ogical shortcomings of the metal-based cults of western thought – is the limit of any human theory of the Universe that pretends to explain scales beyond those with which our superorganism interacts: namely, theories on the scales that are invisible to our perception, below the limits of gravitational energy and informative light, which we use to exist.

Thus from the ∆o human point of view, beyond the ∆±4 scales of the local Universe and the interior of atoms (regions with huge uncertainties of perception, but from which some information is still perceived), reality is and will always be uncertain. Of course, physicists will tell you that by making big-bangs on Earth with high-speed accelerators they can observe those scales – at the risk of extinguishing the Earth into a strangelet or black hole, and eliminating all knowledge, as an extinct scientist knows nothing. There are also absurd plans for faster-than-light flights in search of the beyond-the-galaxy regions, obviously megalomaniac concepts, as we cannot even go beyond the Oort cloud due to radiation (we would need a kilometric iron shield on our starships).

Thus all analysis of those scales is theoretical, and the best we can do is to ‘project’, as Mendeleyev did with his table of elements, the isomorphic properties of known scales onto the scales beyond and below. This is the proposal of 5D metaphysics, fully aware that there will never be experimental proofs of those scales worth mentioning.

Let us then briefly consider the lower scale limit of ∆-4, string theory, from this perspective, trying to accommodate it to the isomorphisms that should define its properties in the metaphysical realm.

In that regard, we accept the correspondence between the lowest atomic and the highest known-unknown scale of galaxies, as there is clear theoretical evidence for it in the equation of unification of forces (provided later in this post with the metric of the 5th dimension) and in the uncertainties, due to reversed entropy, that we have on those 2 scales: uncertainty of information on the larger galaxy (unknown to experience are the 90% of dark matter in the halo of galaxies and the internal matter of the black hole nuclei), and uncertainty of energy measures on the atom due to motion entropy (with the added unknown of lineal-time physics theory; that is, the proper interpretation of the laws of quantum systems, which however can be solved with the right models of cyclical time, as this blog will show when completed).

This would imply that the same duality exists in the non-perceivable scales of string theory, between the minimal strings and the cosmological strings, when properly modeled within the restrictions of 5D fractal space-time as the lowest and uppermost membranes of reality.

They would be the two scales above and below the local universe and the quark-gluon systems.

These unknown-unknown theoretical strings are of course just mathematical functions; hence, without evidence, they should remain linguistic fictions, not different from a novel or a Jewish book of history talking about the creator of the Universe. The consistency of a literary work of art is by no means proof of its truth, as Gödel showed with his incompleteness theorem, studied in depth in ‘the future’, in section ∆±∞; since, as Einstein quipped, ‘I know when mathematics is truth, but not when it is real’. Still, inasmuch as the best fictions ARE linguistic mirrors of human reality, which do allow knowledge of that reality (so the best novel, War and Peace, does help to understand the Napoleonic wars, likely much better than any treatise of history), what string theory should do is follow ‘Tolstoy’s method’ of limiting to the maximum ‘imagination’ and ‘excess of formalism’ in its writing, and stick to the bare bones of 5D isomorphic laws, restricting the mathematical formalism to its likely true minimum. We thus give here just some basic advice to convert string theory into a workable, likely ‘War and Peace’ model of the unknown scales beyond chromodynamics and local-Universe astrophysics.

So it is a waste of time to deal with string theory, as promising as it might look, especially as long as its dominant forms remain Newtonian (using an absolute space-time frame) on the lower scales – and we shall not do so, beyond recommending to the ‘quixotic worshippers of mathematical fictions’ that they model those scales at least with the simple parameters of cyclical time, in the following terms:

  • Closed strings can be modeled as the minimal time cycles of the Universe, with spin 2 – which means a closed, ‘immortal’ time loop – and 10c tachyon velocity; they might be used to model hyper-gravitons as a force of repulsive dark energy between galaxies.
  • Lineal strings can be modeled as the minimal lines and planes of energetic space of the Universe. As their energy is proportional to their length, they are easily converted into units of spatial energy at the Planck scale, and their conversion into relative background space is immediate.
  • Neither of them can be modeled with more than the canonical dimensions of the Universe. This means they must be modeled with the 3 x 3 ± Œ symmetries of the Universe: in space, Se≤ST≤Tiƒ; in time, Sp>ST>Tƒ; and in 5D planes, œ-1, Œ, Œ+1. So the way to do it is to construct with them 3 x 3 + ∆ dimensional scales to make up a 5D plane. In brief, 3 space and 3 time symmetric dimensions create a whole Œ string, which can then be considered a point-particle with which to construct a new scale, for a total of 3 x 3 + ∆ dimensions of 5D space-time.
  • And this can be done both with atomic strings, to define the strong force below the level of gluons, and with cosmological graviton-tachyon strings, to model an upper scale over that of galaxies (black holes and dark energy). This duality is of interest as it would put in correspondence top quarks, with their hyper-strong force, and black holes, since everything seems to indicate that top quarks, with their parameters of information density (mass) and rotary speed, are the ‘atoms of black holes’ in 5D physics. And the dualities of cosmic strings, graviton strings and boson strings can do the trick.


Foreword: GiST Dimensions, Planes and Isomorphisms.

The goal of GiST is to study the 10 i-somorphisms (equal sets of laws) that define the similar forms, events and actions of all the entities that exist.

Those ‘i-somorphic’ laws derive from their common nature as ‘Scalar Space-time beings’ made of:

The same ‘3 topological finite dimensions of space’ that configure its ‘3 organic parts’, as all systems have lineal fields/limbs of energy and spherical heads/particles of information, which combine in their toroid bodies/waves that iterate the system.

The same ages of time, dominated sequentially by those topologies: an energetic youth, a reproductive maturity and an informative old age that define the finite duration of its life-death worldcycle.

And the same ‘scalar planes’ of organic existence, divided into, the closer 3 i±1 scales:

– The i-1 existential cellular/atomic plane, the i-ndividual plane and the i+1 social, gravitational plane (physical/biological systems).

– And the larger ‘ecosystemic’ scales, with whom the i-being inter-acts to get its energy, information and force (Its i±2 world and energy quanta, its i±3 micro and macrocosms and its information bits and its i±4 Universe and its motion forces).

And so, as a result of that common structure of all beings, we can consider anything that exists a variety of fractal scalar space-time: part of an infinite isomorphic reality defined formally by a ‘Fractal Generator equation’, written with the symbols of those 3+3+9 dimensional planes of space-time, which in its simplest generic form would be written:

[STe≤ STr≥STo]i±4

Representing the 3×2 bidimensional space-time topological organs of all species in space – STe limbs/fields of energy, STr repetitive bodies/waves of reproduction and STo particles/heads of information – which exchange in time flows of energy (<) and information (>), or repeat them (=), through 5 types of actions, across 9 i±4 planes of existence: increasing its acceleration with the help of ∆i-4 forces, its ∆i-3 informative bits, its ∆i-2 energy quanta and its ∆i-1 reproductive seeds, and evolving socially with ∆i parallel clones to create a bigger social i+1 superorganism.

This is the Universe in a ‘nutshell’, and so we shall study in this 3rd line each specific species – its dimensions of space and time, its 2-manifold topologies, its i±4 actions and social planes – through the analysis of the 10 isomorphic elements common to all those species.


standard model

In the graph we can see how this dark gluon-top quark soup (toplet liquid) of a quasar explosion of the black hole neatly reclassifies the 3 families of increasing mass in the Universe:

In the graph, a first look at the reordering of the strange and top quark triangles of mass, to define the symmetry between the ∆-1 and ∆+1 scales of ‘atoms and galaxies’, in terms of its ∆-1 components (quarks).

To fully understand the previous graph, you should consult the next post, on atoms, where quarks fully belong. We have already treated, in the introduction to this post, the analysis of the ternary symmetry between ∆-1 quarks and the ∆+1 parts of the galaxy:

∆+1: top quarks->To: Galactic black holes, ∆-strange quark: Sæ: galactic halo, ∆-1: Ud quarks (our matter): ST-galactic stars.

Now, in this system, it is necessary to understand the role of the Higgs field and top quarks, which conform the dark matter and dark energy outside the galaxy.


Now it is fundamental to understand that gravitons do NOT exist in the sense given by physicists – as the attractive gauging particle between physical masses – but gravitomagnetic transversal waves of neutrinos do exist, and are essential to the structure of galaxies and solar systems.

THIS IS THE SOLUTION to the conundrum raised before on the neutrino’s ½ spin vs. the graviton’s 2 spin. Simple: neutrinos are the equivalent, in the next cosmic scale of the 5th dimension, of an electromagnetic repulsive wave of ‘dark energy=repulsive gravitation’.

The neutrino theory of light, like the pilot-wave theory and the particle-wave duality, is the work of De Broglie, brutally assaulted by the Bohr gang, who took his work as theirs and misinterpreted it. A neutrino is the ’emerging’ flow of entropic communication between two atoms, or atom-galaxies, which either goes the entropic way, dissolving-stretching space-time into action at distance, and disappears without a trace of information: V=s/t=s/0=∞.

Or it is not in an open ‘domain’, but constrained at its two limiting ends by the two atoms/galaxies it communicates; and then this condition – to be guided and controlled by two point-particles at its ends – is the only condition needed for two inverse neutrinos, with inverse motion (one emitted by particle A and the other by particle B in an act of communication):

Fermion < neutrino+Neutrino>Fermion

to create a light boson that ‘warps’ pure gravitational action-at-a-distance space-time, giving it form, information, to exist. So neutrinos do form light beams and do dissolve them. And that is another beautiful thought of De Broglie on the path of truth in science: economy, simplicity and causality.

As we have shown ad nauseam the symmetry of scale between atoms, with positive quarks and negative electrons, and galaxies, with positive black-hole top-quark stars and negative strangelet halos, it follows that dark-energy waves will be just the equivalent: dark-energy flows which acquire form between galaxies. The maths of it are more or less complex, but as we REPEAT ad nauseam, the beauty of the fractal, organic, vital, topological=mathematical universe is that we can explain it much more simply and intuitively, using the concept of symmetries between scales, organic properties and the laws of truth of the scientific method (economy, simplicity, etc.), as long as in the background we use sound mathematical theories (such as the neutrino theory of light, provided the neutrinos are indeed, as Jordan proved, exactly inverse and constrained at their ends by the particles that use them to entangle and communicate). So you do not need to be a math addict; just trust me: I read physics, love maths and only use sound theories of the Universe, unlike, we must say, many of our fantaphysicists these days.

So there must be G-waves of dark entropy and dark, faster-than-c gravitation in the upper scale, between galaxy-atoms; and, at light speed constrained by the galaxy’s inner structure, between stars and planets, and black holes and stars.


And thus, with those pre-conclusions, we can now deal with gravitational waves, which in 5D we call G-waves: likely made of neutrinos, either constrained and able to reproduce light, or unconstrained and escaping constantly into the dark-energy lower scale of action at distance and null information – for our electronic perceivers. Consider for example an earlier use I made of them, to calculate the Titius law of distances between planets:


One of the hypotheses of 5D physics is the fundamental role that neutrinos must play, by sheer rational evidence, as the gravitational quanta of the transversal gravitational wave structure of the universe. As it is simple enough, we can bring it here just for fun.

The theoretical difficulty is the 2-spin prediction of quantum gravitational theories, based on 2 assumptions not completely held in 5D physics: the conservation of spin (which in 5D physics might be transformed into upper or lower forms of rotational momentum, conserved only through 3 planes of the 5th dimension, as they become inverted on emergence in upper planes), which allows one to create a model of 2-spin neutrinos as gravitons; or, inversely, the assumption that gravitons do NOT have spin 2, but rather are made of 4 neutrinos – lineal strings of one dimension – which become the 4 components of a field that forms a complex doublet of spin ½, with the observed peculiarity that both neutrinos and antineutrinos have longitudinal polarization of positive spin.

The mathematics of the model are somewhat complex, but essentially they mean that neutrinos ARE very important, almost as multi-faceted as light space-time is – it couldn’t be otherwise, as they are the other, 2nd background space-time network of the galactic universe. So, as nervous and blood systems have multiple roles in your body, neutrinos and photons share all the network roles of the galaxy.

They therefore have both boson and fermion ‘nature’, like the weak force they mediate.

This means that we can put neutrinos and antineutrinos together in 4-doublets, as if they were bosons. Then:

– 2 gravitational tachyon neutrino strings, when put together in pairs of opposite direction form the up and down, particle and antiparticle sides of the magnetic light wave-field.

– When communicated between particles, they provide the fixed distance in gravitational space that allows the sharing of information between the particles at fixed c-speed.

– When colliding with neutrons, they catalyze beta decay, as well as other weak-force transformations, in a role similar to the one mediated by the Higgs boson on the faster triplet of heavier quarks, topped by the top quark.

– When emitted massively and constantly by all types of cosmic bodies, they are the origin of the 3 ‘G’, ‘giant electromagnetic-like waves’ of the gravitational scale, which form the different transversal wavelengths that order, in ternary symmetries, the structure of solar systems and galaxies.

Those transversal waves are only one of the many roles of neutrino ‘fantaphysics’ (: I confess neutrino gravitational string physics is the only speculative part of 5D physics that, for lack of evidence, I have not yet signed in blood), and the one we will discuss here.

Essentially there are 3 neutrinos, which constantly oscillate as they abandon the cosmic bodies that constantly reproduce them, in proportions often higher than the photon radiation, forming 3 basic types of transversal gravitational waves, whose combinations allow us to model galaxies, their spiral arms and the rotary orbits of stars in the wells of those waves. The same neutrinos structure the flows of planetary waves, and so we can clearly recognize, in ternary symmetries, a 0.33 short wave of higher-energy neutrinos that crosses the planets’ crystal centers, provoking the well-known Fe-Co-Ni chain of reactions that sets on ‘fire’ the internal energy engine of planets. A longer 5 A.U. wave also clearly exists, as it allows the main ‘Jupiter-like planets’ to sit on its nodes.

And finally, there should be an even longer neutrino wave that connects stars among themselves.

In the next graph we see the 3 oscillatory neutrino waves observed together coming out of the sun:


Gravitational, transversal, fractal waves shape the structure of galaxies and solar systems, transferring form and entropy between cosmological bodies, in a process self-similar to the transfer of information and entropy between atoms through electromagnetic waves. In the graph, the Titius law of distances between planets reflects their position at the nodal points of those transversal gravitational waves. In the core of planets there could be a crystalline or superfluid zone where those flows of dark, gravitational entropy are processed, causing flows of heat and matter that make planets ‘grow’.

Below we observe the oscillations of neutrino waves.

Now, the 5D model of astrophysics implies that there are also transversal repulsive gravitational waves, which account for many of the structures of the Universe, starting with the expansion perceived in intergalactic space-time. This is just the expansion produced by any lineal wave of any range, now in the cosmic scale. Moreover, black holes produce two types of wave shift, which compensate each other and explain why the total space-time of the Universe remains constant:

  • When a wave of light enters the black hole, it ‘blue-shifts’, imploding space into higher frequency/density of energy-mass. But this effect is obviously NOT seen from far away, as it is the inner production of mass within the galaxy.
  • When the wave leaves a black hole region, it redshifts, which explains the initial redshifting of light as it leaves the galaxy towards us.

So the hypothesis is clear.

Somehow, neutrino waves of the hypo-universe – which have as a lower limit the speed of light and an unknown upper limit as tachyons, theoretically, if they are the field force between galaxies up to 100c – gather in ginormous numbers, allowing the stretching of the symmetric ∆+4 hyper-universe.

According to this ∆±4,5 reality above us, we can then consider that the stretching reveals either gravitation as having a ‘slower’ speed in the ‘glass of light’ of the galaxy, as light has slower speeds in the ‘glass’ of solids, with neutrinos becoming the stretching particle of the ∆-4 intergalactic field up to 100 c-distances/speeds; or neutrinos becoming something ‘else’ (dark entropy) at the halo of strangelets (dark matter, Witten’s hypothesis, candidate to the ternary symmetry between quark masses and regions of the galaxy: top-quark stars=‘dna’ black holes, strangelets=‘protein’ MACHOs, ud-matter=internal star ‘mitochondria’).

Titius law. A local proof of transversal gravitational waves.

While transversal gravitational waves cannot be detected directly, we shall bring here a couple of proofs, at the local and cosmic levels. At the local level it is very easy to create a model of the Titius law (the exact positions of the planets) using 3 waves. The planets will then sit on the nodes of those wave frequencies, just as in quantum physics the electrons ARE at the nodes of the waves that restrict their positions (multiples of h-frequencies). In the next graph we see the short, high-energy wave that locates at its gravito-magnetic nodes (as the magnetic number locates the orbits of electrons) the orbits of the ferromagnetic planets. A longer ±5 AU wave locates the outer giant gaseous planets:

In the graph, we see one of those waves, which between ‘galaxies’ – with no light warping it to create light space-time at the c-limit – SHOULD move in a new ‘decametric scale’ at 10z. As it happens, what we find at this cosmic scale is the outer ‘region’ of the galactic atom, ruled by transversal, expansive, intergalactic giant gravitational waves of MATTER gushing out of black holes, which produce those waves at 10z=c speed.

Thus their existence within the galaxy – albeit reduced to c-speeds, unlike outside the galaxy, where they are spotted routinely at z=10c – would define a new dual game of implosive and attractive gravitation, similar to the scalar atomic game. Many laws can be solved and deduced from this parallelism. Below, the Titius law of planetary distances as caused by G-waves.
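Whatever its cause, the Titius-Bode rule itself is easy to state and check: a ≈ 0.4 + 0.3·2^k AU, with Mercury as the limiting 0.4 AU term. A minimal sketch comparing the rule with observed semi-major axes (the observed values are standard astronomical figures, not taken from this text):

```python
# Titius-Bode rule: a = 0.4 + 0.3 * 2**k AU; Mercury is the limiting 0.4 AU term.
def titius_bode(k):
    return 0.4 if k is None else 0.4 + 0.3 * 2 ** k

# (name, k, observed semi-major axis in AU)
bodies = [("Mercury", None, 0.39), ("Venus", 0, 0.72), ("Earth", 1, 1.00),
          ("Mars", 2, 1.52), ("Ceres", 3, 2.77), ("Jupiter", 4, 5.20),
          ("Saturn", 5, 9.58), ("Uranus", 6, 19.2)]

for name, k, observed in bodies:
    print(f"{name:8s} rule: {titius_bode(k):6.2f} AU   observed: {observed:6.2f} AU")
```

Note that the rule famously fails for Neptune, which is why mainstream astronomy treats it as an empirical regularity rather than an accepted law.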

This is a necessary introduction to the most fascinating finding of 5D physics: the unification of charges and masses, the two limiting scales of the Universe perceived by man. We shall expose this finding as the Linguistic Method prescribes.

First, with a conceptual understanding of the scientific method of truth (simplicity, correspondence); then from the organic perspective (the topological and scalar structure of the Universe and its 3 relative symmetries of topological parts and scales);

And only at the end in its quantifying methods, since T.OE works with inverse causality to a mere use of how-mathematics.

We constantly use self-similarities, based on the 3 isomorphisms of invariance of the Universe (scalar, formal and motion invariance) and the ternary differentiations in time (Entropy, reproduction, information) and their symmetric functions in space (planar and spherical membranes; toroid bodies and hyperbolic centers).

With those simple isomorphisms we can describe the galaxy as we did with the atom. Yet self-similarity is not equality, which means we cannot use the exact formalism of quantum physics, but we can use those self-similarities for gravitational systems we do not perceive except through their secondary effects.

This is the case of gravitational waves, which are self-similar to electromagnetic waves. They organize the structure of stars and galaxies, as electromagnetic waves organize the orbits of electrons. Both respond to the same morphological equations that relate 2 particles through a lineal force field defined by the ratio between the informative density of masses or charges and their distance.
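The claim that both force laws share one 'morphological' inverse-square equation can be made concrete with a short numerical sketch. The constants below are standard reference values, and the 1-metre proton pair is purely illustrative, not taken from the text:

```python
# Newton's and Coulomb's laws share the same inverse-square "morphology":
# F = K * a * b / r^2, with the constant K and the "charges" a, b swapped.
G = 6.674e-11        # m^3 kg^-1 s^-2, gravitational constant
k = 8.988e9          # N m^2 C^-2, Coulomb constant
m_p = 1.6726e-27     # kg, proton mass
e = 1.6022e-19       # C, elementary charge

def inverse_square(K, a, b, r):
    """Generic two-body inverse-square force: F = K * a * b / r**2."""
    return K * a * b / r**2

r = 1.0  # metres (illustrative separation)
F_grav = inverse_square(G, m_p, m_p, r)   # gravitational force, two protons
F_coul = inverse_square(k, e, e, r)       # electrostatic force, two protons

# Same functional form; only the constant and the kind of "charge" differ.
print(F_grav, F_coul, F_coul / F_grav)
```

The final ratio printed is the well-known disparity of roughly 10³⁶ between the two couplings for a proton pair, which is the gap any unification of charges and masses has to explain.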

Yet even if form remains invariant across scales, as it is an essential topological property that defines the why of the Universe, the metric space changes, as the space-time ratios/constants that define the size, speed, frequency and range of those waves change.

We observed a self-similar change when studying the cyclical informative vortices of both scales (the G-constant of Newton's gravitational vortices and Coulomb's equation of an electronic vortex, unified by their invariance of topological form and scale).

Thus galaxies and solar systems show a gravitational, morphological, spatial structure similar to that of an electromagnetic atom in the cosmological scale, a fact that Einstein predicted, establishing 2 kinds of gravitational waves, parallel to the 2 types of electromagnetic fields we know:

– Static waves that create the gravitational bi-dimensional fields over which galaxies form.

– Discontinuous, transversal, quantized waves, which shape the orbits of stars and galaxies, in the same manner photons control the orbital distance of electrons in atoms (l numbers).

Thus, those gravitational waves should have the same functions in galaxies and solar systems that electromagnetic waves have in the world of atoms, explaining cosmological structures and becoming, by self-similarity with electromagnetic waves, the fundamental force of interaction between celestial bodies.

We know that the gravitational activity of black holes sets up star orbits and probably influences their evolution, growth and formation, determining the basic properties of magnetic fields, ecliptic orbits and distances between stars in a galaxy and planets in a solar system. So even if gravitational waves are invisible, using their morphological self-similarity with light waves, the equations of Einstein's relativity and the indirect proofs provided by the orbital distances and rotational fields of stars and planets, we can explain many 'whys' of the structure of those celestial bodies:

– Astronomers have always wondered what rules the distances between the planets of the solar system. The existence of regularities in the distribution of planets in the Solar System was recognized long ago. This was Kepler’s main motivation in his search for planetary isomorphisms.

The Titius-Bode law (r(n) = 0.4 + 0.3 × 2^n, in AU) was the first empirical attempt at describing these regularities, and was followed by several other proposals. The discovery of similar structures in the distribution of the satellites of the great planets led to a revival of interest for such studies, and to the hope that indeed a physical mechanism was at work. Now we can add a topological why to the how and when of metric space measure:
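The Titius-Bode rule quoted above is simple enough to check directly. A minimal sketch; the observed semi-major axes are standard astronomical values added here only for comparison:

```python
def titius_bode(n):
    """Titius-Bode rule r(n) = 0.4 + 0.3 * 2**n, in astronomical units.
    Mercury is conventionally assigned the bare 0.4 AU term (n = -infinity)."""
    return 0.4 + 0.3 * 2**n

# n = 0..5 correspond to Venus, Earth, Mars, Ceres, Jupiter, Saturn.
predicted = [round(titius_bode(n), 1) for n in range(6)]
observed  = [0.72, 1.00, 1.52, 2.77, 5.20, 9.58]   # semi-major axes, AU

for p, o in zip(predicted, observed):
    print(f"predicted {p:5.1f} AU   observed {o:5.2f} AU")
```

The rule tracks the classical planets (and Ceres) to within a few percent, and breaks down at Neptune, which is the usual objection to it as a physical law rather than a numerical regularity.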

Those planets are in the nodes between gravitational waves of different frequency/amplitude and the solar system's orbital plane in which planets feed, 'deforming' space-time as they follow their static gravitational orbits; as electrons are in the nodes of their quantum waves, fine-tuned by the secondary levels they access according to the strength of the electromagnetic waves they exchange with their environment. For the same reasons stars should be in the nodes of gravitational waves caused by galactic black holes.

In the graph we draw the 2 fundamental wavelengths that could explain the distances between planets: a high-frequency, short gravitational wave of 0.33 AU could explain the positions of the ferromagnetic, inner planets on its nodes, while 2 low-frequency long wavelengths at 5 and 10 AU could explain the positions of the bigger, lighter gaseous planets. Since Jupiter is located at 5 AU, Saturn at 10 AU, Uranus at 20 AU, Neptune at 30 AU, Pluto at 40 AU; and, as I predicted a decade ago, we have found a new planet, which I called then Chronos, 'the last of the titans', at 100 AU, in the limit of the solar system, 'renamed' Selma (-;.
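The node picture above can be turned into a toy numerical check. The 0.33 AU and 5 AU wavelengths are the ones proposed in the text; the observed semi-major axes and the nearest-node assignment are illustrative assumptions, not a fit:

```python
# Toy check of the wave-node picture: inner planets near multiples of a
# 0.33 AU wavelength, outer giants near multiples of a 5 AU wavelength.
# Semi-major axes (AU) are standard values, listed here for illustration.
inner = {"Mercury": 0.39, "Venus": 0.72, "Earth": 1.00, "Mars": 1.52}
outer = {"Jupiter": 5.2, "Saturn": 9.6, "Uranus": 19.2, "Neptune": 30.1}

def nearest_node(r, wavelength):
    """Closest integer node k * wavelength to orbital radius r, and its offset."""
    k = max(1, round(r / wavelength))
    node = k * wavelength
    return node, abs(r - node)

for name, r in inner.items():
    node, off = nearest_node(r, 0.33)
    print(f"{name:8s} r={r:5.2f}  nearest 0.33 AU node={node:5.2f}  offset={off:.2f}")

for name, r in outer.items():
    node, off = nearest_node(r, 5.0)
    print(f"{name:8s} r={r:5.1f}  nearest 5 AU node={node:5.1f}  offset={off:.1f}")
```

With these assumed wavelengths most planets land within a fraction of the wavelength of a node (Earth within 0.01 AU of the third 0.33 AU node), though Mars and Uranus show the largest offsets; the sketch only shows how such a node hypothesis would be tested, not that it holds.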

– G-waves explain why planets have ecliptic orbits with an inclination of their axes, which is a natural orientation if they are receiving curved G-waves at a certain angle through their polar axes. In that regard, the rings of gaseous planets at the points of maximal activity of those waves (Jupiter and Saturn), and the spiral vortices of galaxies, could act as 'antennae' for those waves at the star and galactic level.

– Those waves might cause, as all lineal movements do, a cyclical vortex around them, originating the condensation of planetary nebulae. While in galaxies their wave structure seems to originate the different densities of stars in their nodal zones.

– Planets suffer catastrophic changes in their magnetic fields, probably produced by changes in the directionality of those waves, emitted through the tropical dark spots of the sun.

There are advanced mathematical models of gravito-magnetism that have unified both types of waves, departing from Einstein's work. Thus energetic 'gravito-magnetic' waves might cause a change in planetary magnetic fields, as a magnetic field changes the spin of an atom, which aligns itself with the field. For example, Uranus is tumbled and has lost most of its magnetic field: perhaps it was knocked out and relocated by a G-wave.

Lineal magnetism is in fact, in the complex physics of multiple planes of space-time, the intermediate SæxTo force that 'transcends' from the gravitational to the electromagnetic scale: for example, electromagnetic light or ferromagnetic atoms like iron should absorb gravitational Entropy through their magnetic fields.

– Solar spots are the probable source of those waves. Yet their origin might be the central core of the star or the activity of the central black hole, whose G-waves might be absorbed and re-emitted by the star. We cannot perceive G-waves directly; but magnetic storms, solar winds and the highly energetic electromagnetic flows and particles that come from the sun's spots might be their secondary effects. In the same way, we only perceive indirectly the waves of dark Entropy emitted by black holes that position the stars of spiral galaxies, by observing the mass and radiation dragged by those waves.

– Those catastrophes might cause the climatic changes that modulate the evolution of life on Earth, since we already know that the activity of sun spots affects the temperature of the Earth.

– G-waves could structure the galaxy and its stars in the way electromagnetic impulses structure a crystalline atomic network, ordering the distance between its molecules: electromagnetic waves also feed those crystal webs with Entropy and information. For example, electromagnetic waves cause the vibration of quartz crystals, which absorb Entropy from light and vibrate, emitting 'maser-like', highly ordered discharges of electromagnetism. We observe similar maser beams in neutron stars, called for that reason pulsars.

Recap. Gravitational waves produced by black holes control the location of stars and planets, and their spin/orientation through smaller gravito-magnetic waves.

Organic Patterns in the Galaxy. The why of G-waves.

The closest homology of the 2 dual networks of the galaxy is with an atom in which the central nucleon with max. density of gravitational information and the external electronic membrane interact in a middle space-time vacuum through gravitational forces and electromagnetic photons.

Another self-similarity between scales of multiple space-times might be established in complex analysis between the galaxy and a simple 'cellular' organism (which introduces elements of complex biology in astronomy, obviously more difficult to accept from a mechanist perspective).

Following the cellular or physiological homologies, the network of dark, informative matter and gravitational Entropy, connected to black holes, surrounds and controls the stars' electromagnetic Entropy. We know that it was formed first and then guided the creation of electromagnetic Entropy, so we can observe it indirectly and deduce its form from the highly quantized shape of the filaments of light-galaxies that were formed around dark matter (right graph). Thus dark matter acts in a similar way to the RNA that shapes and controls the Golgi membranes of the cell, or the nervous system that guides and builds the morphology of the body; while the network of stars and electromagnetic Entropy, the slower Entropy that produces the substances of galaxies, surrounds those strands of dark matter. In the cell's homology, the ribosomes that create most products of the cell are pegged to those membranes.

We shall consider briefly here 2 of those controversial hypotheses: the possibility that black holes perceive gravitation, and the chains of causality between the different scales of the Universe, self-similar to the chains of causality between cells and bodies.

– The most controversial element of a cosmological model based on G-waves is the existence of gravitational information that allows strange and neutron stars and black holes (and maybe, in the future, evolved planets such as the Earth, through their machine systems) to perceive and move at will within a static field of gravitation.

On the Earth animals use light as information and dominate plants, which use it as Entropy. The hypothesis of complex cosmology is that stars are ‘gravitational plants’ that merely feed and curve gravitational space-time, while Worm Holes are ‘gravitational animals’, which are able to process gravitation as information and control and shape with gravitational waves the form of galaxies, their territorial space-time. They are in that sense extremely simple plants and animals. A more proper comparison would be with a cell, where the DNA molecules are the Worm Holes, the informative masses of physical space; and the mitochondria that produce energetic substances, the stars.

Thus frozen, quark stars could be ‘gravitational perceivers’ in the cosmological realm, as animals are light perceivers in the Earth’s crust and DNA perceives van der Waals forces in the cellular realm.

On the other hand, stars would be plant-like, floating in the sea of gravitation, which they use as Entropy for their motion, feeding on interstellar gas as plankton does, floating in the sea of water.

Do black holes perceive gravitation as complex animals perceive light, instinctively or mechanically, as DNA perceives the forces of the cell? They probably gauge gravitation in very simple ‘forms’, as a cellular DNA-system, much simpler than the brain of animals, perceives its territorial cell.

That is the supremacy of man in a relative Universe where size is less important than form: while all systems process information, man is a summit of form and hence one of the most conscious species. Yet black holes have enough quark complexity to act/react to informative flows, as they seek Entropy to feed on (our electronic Entropy). This hypothesis has experimental proofs, since pulsars and black holes emit gravitational waves and we have observed many black holes following erratic paths through the galaxy, which defy the tidal, regular orbits of stars.

In that sense, multiple space times theory considers that in the same way light waves are the Entropy of plants and the information of animals, gravitational waves move stars and inform black holes, the most evolved celestial bodies, which emit or feed on the Entropy and information provided by those gravitational waves.

Further on, gravitational waves emitted by black holes might reproduce matter on the cores of stars and planets:

If those gravitational waves degenerate easily into quark matter, as the jets of quasars show, they could also become converted into matter in the superfluid cores of stars and the crystalline centers of planets, in a process inverse to the Lorentz transformations; since tachyons acquire more mass when they slow down, trapped by those superfluid and crystalline vortices.

Thus, as light is converted into Entropy in plants, stars and planets will create their 'amino acids', quark matter, in those processes. Thus dark Entropy, tachyon strings, would decelerate into c-speed gluons that would reproduce quarks (as electromagnetic waves become photons and electrons).

There is also a mechanism by which those planets and stars 'jump' or change their spin position under the effect of gravitational waves, as electrons do under a magnetic field: the cores of stars are made of superfluid helium and the cores of planets of iron crystals, which are the only atoms that can absorb the Entropy of a gravito-magnetic field to change their motion.

Finally, all those events will have a 'why' in the 4 arrows of the organic Universe; since they would represent the feeding, matter reproduction, informative perception and social location of celestial bodies within the galactic or planetary network, equivalent to the 4 whys of the 4 quantum numbers described before.

Recap. Gravitational waves accomplish the 6 arrows of time for celestial bodies, as the 4 quantum numbers describe those arrows for electrons.

Intermediate Space. Gravitational Waves and Solar Systems.

Now, gravitons are considered massless just because gravitation is considered a force with infinite range, but this is not the case. The gravitation we study works within galaxies; outside galaxies the field should be the dark energy field, which is repulsive. So gravitons do not need to be massless in 5D. They might have the neutrino mass. It remains therefore to accommodate somehow the rank-2 tensor of gravitation, which determined spin-2 gravitons, to convert them into spin ½, as the neutrino's ½ spin seems to be proved. In the 4th line, charged with mathematical physics, I will venture my solution; but at this stage those are questions which require far more serious maths than this 2nd line can hold.

We thus will consider the following proposition: gravitons have ½ spin; they do not have infinite range (nothing does in 5D); they 'are' neutrinos; they create the gravitational waves that communicate galaxies and maintain the structure of galaxies as single organisms; they form the second background of the galaxy along the light background, representing the duality of any organic system; and, when used as strings that connect two atoms in opposite directions, they originate a photon of light.

As I said the mathematical analysis goes to the 4th line, but it cannot be otherwise within the organic structure of the galaxy.

The reader in that sense must understand how T.Œ works: the 4 possible analyses of an event (the mathematical, most detailed one; the mental, action-oriented one from the zero point of view, which survives and apperceives automatically the Universe, seeking energy and information, iteration and social evolution; the organic, spatial one of 3 space-time elements communicated through the rules of ¬Æ geometry; and the temporal, causal one of the 3±∆ planes of existence) ARE all necessary to satisfy and qualify a system, event or form as truth.

And vice versa, each of them can partially define an event. And so we give 0.25 of truth to a theory with mathematical consistency, 0.25 when the theory satisfies the organic paradigm, another 0.25 when it defines the 5 actions of a system, and another 0.25 when it defines the 3±∆ ages of the system.

This of course will be contested for decades if not centuries till T.Œ is accepted, if ever, by mankind. It does not matter to me. The point here is that at this stage of 'inflation' in mathematical physics one can easily find a theory for everything. So the way this post is constructed is simple, unlike that of the classic physics scholar, who has no intention of defining his systems organically, temporally and from the perspective of actions, but will merely work out equations of physics, twist them (today not even coherence is required, especially when dealing with the internal space of black holes, the external dark energy and any other region not experimentally evident), and then seek with expensive machines to find parallel Universes, muinos or whatever.

T.Œ does not work like that. What I do is start with the organic structure of a system, then define it in space as a simultaneous organic system, then in time through its 3±∆ ages, and then, normally as a third stage, work out its mathematics, because the maths are closely related, believe it or not, with the 4th element, the space-time actions of each 'mental point of view'. So basically we work in 2 blocks: the spatial and temporal symmetry on one side, and the space-time actions of the 0-point of view and its mathematical equations on the other.

In this 2nd line I will work out the space-time symmetry and refer to the mathematical constants and space-time actions only briefly. We are making an exception in this introduction, for the sake of presenting to the specialist the full concept of T.Œ, by considering in more detail the neutrino 'affair'.

During the 1930s there was great interest in the neutrino theory of light, and Pascual Jordan, Max Born and others worked on the theory. We are talking here of 2 of the 6 foremost fathers of quantum theory, so it is not to be taken lightly (Jordan, the better mathematician, worked out Heisenberg's matrices, and Born the probabilistic theory of which Bohr only gave the idea). De Broglie and Fermi also worked on it, but there were some obvious problems with it. The objection most often mentioned, that a photon is not a composite particle, is silly: to start with, it is composed of a myriad of h-plancktons, and if we could see it in detail it would grow, as any fractal point, to cosmic dimensions, to become a quasi-star of the string world.

Now, this was understood first by Dirac, with many of the ambivalences of quantum theory, only made reasonable with the 3 different perspectives of 5D physics (scale, space or time), since any point seen from an upper scale is a fractal of populations (space view) or probabilities (time view). So we should choose the description that better fits a phenomenon, as it relates the OBSERVER's experiment and the OBSERVABLE through one of those different perspectives, all of which add truth to the analysis (but only all together describe the multiplicity of the Universe):

Some time before the discovery of quantum mechanics people realized that the connection between light waves and photons must be of a statistical character. What they did not clearly realize, however, was that the wave function gives information about the probability of one photon being in a particular place and not the probable number of photons in that place. The importance of the distinction can be made clear in the following way. Suppose we have a beam of light consisting of a large number of photons split up into two components of equal intensity. On the assumption that the intensity of a beam is connected with the probable number of photons in it, we should have half the total number going into each component.

If the two components are now made to interfere, we should require a photon in one component to be able to interfere with one in the other. Sometimes these two photons would have to annihilate one another and other times they would have to produce four photons. This would contradict the conservation of energy. The new theory, which connects the wave function with probabilities for one photon gets over the difficulty by making each photon go partly into each of the two components. Each photon then interferes only with itself. Interference between two different photons never occurs.

So indeed, photons do split into their h-plancktons as electrons split into their photon cells, and so on. It is just the point of view we adopt. From our ∆o perspective, obviously, both electrons and photons are point particles but also waves smearing their components. The same happens with neutrinos: we see a ∆-3 beta-decay neutrino as a single particle-point, but we do see a neutron star 'beta decay' at ∆+1 as 10⁵⁹ neutrinos rushing out.

So how can we manage all that? Through parameters which are 'integrals' of space or time quanta, such as energy. So we find that the proportion of neutrino/quark mass-energy in beta decay is the same as the proportion of neutrino/star mass-energy in novae, about 10%, which incidentally is once more the decametric scale of 3 × 3 + 0 elements of 5D theory.

(Strictly speaking, 5D would be a 10-dimensional theory of 3 spatial components, 3 time ages, 3 scales of size-time speed and a 0-point pegging it all together; but, not to scare too much the 'primitives', I decided, given the idolatry of Einstein and his 4D concept, to call it 5D and to call the 3 dimensions of time 3 ages, whereas 4D is the present age, and so on.

But if there were some respect for it, of course I would use the 10-dimensional analysis, which is much better for the full ® model; anyway, that is how it was originally written 24 years ago. So one can imagine, if we are still here by 2140, some Chinese congress on 10D rewriting the whole thing in the proper 10D formalism. In that sense 10D means 10 parameters to fully describe a system: 10 parameters in General Relativity, 10 parameters in string theory, 10 systems in physiology; 10 is then the number of the game.)

We have so far settled for just 10 isomorphisms.

Now that it is clear that light is NOT a point particle but can be treated in many ways, what the neutrino theory of light means is merely the 'feeding of light' on its lower gravitational scale: the use of neutrinos and then light by other particles, Fermion < Boson > Fermion, to communicate; and ultimately the beginning of the creation of a network between particles and/or stars that 'cements' the internal structure of galaxies to make them 'galacells', not mere rotary systems with no connection between their parts.

The more difficult point is how to match the polarization, which is different, and how to find a mechanism that makes 2 neutrinos couple together into the photon in 'lineal fashion', to avoid the problems of obtaining its Bose statistics from Fermi-Dirac ones.

Jordan's hypothesis that the neutrinos are emitted in exactly the same direction eliminated the need to theorize an unknown interaction, but his hypothesis seemed rather artificial and was ignored. However, it is precisely because neutrinos are the first communication act between two particles, entangling them, that this is a must in T.Œ. Jordan obtained exact Bose-Einstein commutation relations for the composite photon (a longitudinally polarized photon), as the commutation relations for pairs of fermions were similar to those for bosons:

Bosons are defined as the particles that adhere to the commutation relations:

[a(p), a†(p′)] = δ(p − p′),   [a(p), a(p′)] = 0

while the composite (fermion-pair) operators obey instead:

[a(p), a†(p′)] = δ(p − p′) − Δ(p, p′)

The difference is minimal: a mere 'delta term'.

Specifically, the size of the deviation from pure Bose behavior, Δ(p, p′), depends on the degree of overlap of the fermion wave functions and the constraints of the Pauli exclusion principle; it is cancelled by a Raman effect in which a neutrino with momentum p + k is absorbed while another one with opposite spin and momentum k is emitted.
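The deviation from pure Bose behavior can be exhibited in a finite toy model. This is only a sketch, assuming two fermionic modes built as 4×4 matrices via the Jordan-Wigner construction; the pair operator b = c1·c2 then fails to be an exact boson by precisely an occupation-dependent 'delta term':

```python
import numpy as np

# Two fermionic modes as 4x4 matrices (Jordan-Wigner construction).
I2 = np.eye(2)
sm = np.array([[0., 1.], [0., 0.]])   # sigma-minus (annihilates |1>)
sz = np.diag([1., -1.])

c1 = np.kron(sm, I2)                  # annihilation operator, mode 1
c2 = np.kron(sz, sm)                  # annihilation operator, mode 2 (JW string)

b  = c1 @ c2                          # composite "photon-like" pair operator
bd = b.conj().T
n1 = c1.conj().T @ c1                 # number operator, mode 1
n2 = c2.conj().T @ c2                 # number operator, mode 2

comm = b @ bd - bd @ b                # the commutator [b, b†]
delta_term = np.eye(4) - comm         # deviation from the pure Bose value 1

# A true boson would give [b, b†] = 1; the fermion pair gives 1 - n1 - n2,
# i.e. a deviation that grows with the occupation of the constituent modes.
assert np.allclose(comm, np.eye(4) - n1 - n2)
print(np.diag(delta_term))            # → occupation-dependent "delta term"
```

In the empty state the pair behaves exactly as a boson (the delta term vanishes), and the deviation appears only as the fermion modes fill up, which is the finite-dimensional analogue of Jordan's overlap-dependent Δ(p, p′).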

But again, that is precisely what 5D T.Œ requires: a particle communicates with a second particle, which absorbs the neutrino and emits a new one in the inverse direction.


Neutrinos as tachyons.

Now we must understand that humans, simply speaking, will not easily accept any correction to their dogmas: 'humans are slaves, they believe, they don't reason'. So physicists will NOT question the dogma that c-speed is the limit of speed, as it is the absolute modern dogma of their idol-god Mr. Einstein. Trust me: my first work on T.Œ was called 'the error of Einstein', on the part of physics his theory did not account for, and that was enough (young and bold) to get 300 rejections for the book (-;

I say this because in 5D, outside galaxies, outside light space-time, there are faster-than-light speeds, quite likely in the 'inflaton' field of the Higgs 0-scalar, and maybe in neutrinos, if they have the properties some experiments seem to show; but at this stage there is so much 'ad hoc' arrangement of data to favor 'theory' that frankly I am skeptical, and this particular part of 5D theory, neutrino physics, is the only one for which I haven't yet settled on definitive conclusions.

Now, for faster-than-light particles all you need is negative mass squared, as the Higgs has above its minimal energy. The same happens with neutrinos; though recently, assailed by all the data against it, c-believers have scared off anyone attempting to measure faster-than-light neutrinos.

Measurements for electron neutrino mass

The mass of the electron neutrino is measured in tritium beta decay experiments. The decay results in a helium-3 nucleus, an electron and an electron antineutrino. If neutrinos have non-zero mass, the spectrum of the electrons is deformed at the high-energy end, i.e. the neutrino mass determines the maximum energy of the emitted electrons.
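The endpoint argument can be sketched numerically. The factor below is the textbook phase-space shape of the beta spectrum near its endpoint, and the tritium Q-value of roughly 18.6 keV is a standard figure, not taken from the text:

```python
import math

Q = 18_600.0   # eV, approximate tritium beta-decay Q-value (standard figure)

def spectrum_tail(T, m_nu):
    """Phase-space factor of the beta spectrum near the endpoint:
    dN/dT ~ (Q - T) * sqrt((Q - T)^2 - m_nu^2), zero beyond T = Q - m_nu.
    T is the electron kinetic energy and m_nu the neutrino mass, both in eV."""
    e_nu = Q - T                      # energy carried away by the neutrino
    if e_nu < m_nu:                   # no phase space left: spectrum has ended
        return 0.0
    return e_nu * math.sqrt(e_nu**2 - m_nu**2)

# A non-zero neutrino mass cuts the spectrum off m_nu below Q:
print(spectrum_tail(Q - 1.0, m_nu=0.0))   # → 1.0  (massless: still non-zero)
print(spectrum_tail(Q - 1.0, m_nu=2.0))   # → 0.0  (2 eV mass: already cut off)
```

Fitting this tail is exactly how such experiments extract the mass squared, which is why a distorted fit can return a formally negative value, as discussed next.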

To be exact, the experiments measure the neutrino mass squared. Curiously, when taken at face value, all results during the last century pointed to a negative mass squared, before the last scandal of faster-than-light neutrinos, which ended the careers of a few researchers till they gave up and luckily 'found' a loose cable:

[Figures: tabulated electron-neutrino mass-squared measurements, negative when taken at face value.]

Thus, precisely inasmuch as the neutrino goes first back and then forward between two particles that entangle each other through them, all 3 problems of the neutrino theory of light are solved, miraculously, by postulating what T.Œ requires in its organic and perceptive-action elements.

The 4 elements: the organic structure, in which each new layer of reality builds upon the previous layer (photons are created upon the previous layer of neutrinos); the mental structure, which requires the event to correspond to one of the 5 possible actions of the Universe (neutrinos entangle particles over which waves of light communicate information); the temporal structure (neutrinos die, are absorbed, giving birth to light at both extremes of the communication); and finally the mathematical structure (which can only be resolved precisely by a lineal communication of neutrinos and antineutrinos).

It remains now to consider the transversal vs. longitudinal polarization, which is the final concept that truly solves the question.

In the graph, V > c is the front wave of neutrinos; B and A are the particles communicated by the neutrino wave front, which start a non-local communication: the neutrino wave locates both and locks them to start the building up of the layers of the 5th dimension that will give birth to light.

In simple terms:

– The spins of the 2 neutrinos going back and forth add up to form a single spin-1 unit, as the photon has. And here makes sense that wonderful discovery that neutrinos are always left-handed (-: so we can put them together to add up to spin 1.

– The lineal back-and-forth motion makes them both follow a lineal form, and it polarizes them in the same direction as light. So a 'new entity', light, appears, which as an emerging new form does have, obviously, new properties, but is born from the neutrino on which it feeds. This is an essential characteristic that imposes a certain order in the galaxy.

And so it allows us to go further and consider, in general, that the neutrino waves between galaxies that carry energy interact constantly with the light waves and are the minimal information quanta of the Universe, which only give us information on the location along their path, but good enough to form the necessary background for light to maintain its constant c-speed. And we will go back to its details later.

So the task for a top theoretical physicist today would be to consider the possibility of a neutrino which acts as the graviton of transversal gravitational waves between galaxies, a theme we will outline 'grosso modo' in this post and develop with more finesse, but by no means exhaust, in the 4th line. In brief, there must be transversal gravitational waves with a fundamental role in the Universe: to connect galaxies and keep the inner structure of galaxies in place, as is the case (galaxies act as a solid structure, with no difference of rotary speed between stars regardless of position).

So neutrinos should play roles similar to the graviton's, as the 'photon quanta', or more precisely the h-quanta, of gravitational cosmic waves. This role in most hypothetical models of physics today is played by a string with spin 2, like that of gravitons, and tachyon speed in some cases, which is also a feature of intergalactic gravitational waves in 5D (not within galaxies, where light is the dominant force that slows down speed and maintains c as the limit, as Relativity well considers). So here is the surprising fact which makes neutrinos so likely to be the graviton 'string':

Standard Model neutrinos are fundamental point-like particles, but an effective size can be defined using their electroweak cross section (their apparent size in the electroweak interaction). The average electroweak characteristic size is ⟨r²⟩ = n × 10⁻³³ cm² (i.e. n nanobarns), where n = 3.2 for the electron neutrino, n = 1.7 for the muon neutrino and n = 1.0 for the tau neutrino. As it happens, 1.7 nanobarns is the area of a string. So both coincide.
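The quoted sizes are mostly unit bookkeeping: 1 barn = 10⁻²⁴ cm², so a coefficient times 10⁻³³ cm² is the same coefficient in nanobarns. A trivial sketch (the three coefficients are the ones given in the text):

```python
# Unit check for the effective neutrino cross-sections quoted above:
# 1 barn = 1e-24 cm^2, hence 1 nanobarn = 1e-33 cm^2, so a value of
# n * 1e-33 cm^2 is simply n nanobarns.
BARN_CM2 = 1e-24
NANOBARN_CM2 = 1e-9 * BARN_CM2

coeff = {"electron": 3.2, "muon": 1.7, "tau": 1.0}   # coefficients from the text
for flavour, n in coeff.items():
    area_cm2 = n * 1e-33
    print(f"{flavour:8s} {area_cm2:.1e} cm^2 = {area_cm2 / NANOBARN_CM2:.1f} nb")
```

The numerical coincidence with a claimed 'string area' of 1.7 nanobarns is therefore a statement about the muon-neutrino coefficient alone, not about the unit conversion.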

Both can also be modeled as hyperluminal open strings/neutrinos (outside galaxies, obviously). And so the only reason I do NOT yet affirm directly that the present neutrino IS the h-Planck constant of quantum gravity is the question of the neutrino, string and graviton spin, which is supposed to be ½; a question on which I would recommend top theoretical physicists to work, as it is the unknown piece of 5D astrophysics that requires more analysis. Of course, the strings to be used should be background independent, which is the other area that should be worked deeply.

