Entropy, scattering, death, war, the Big Bang, disorder, the gaseous state… all are similar terms for the same phenomenon at different scales.
They all mean the dissolution of the whole, and of the ‘informative networks’ that put it together, back into its ∆-1 scale. Entropy is thus the erasing of a plane of existence, that of the whole, which we identified with the still linguistic mind mapping.
Entropy tends to lie in the future, since 5D generation from the seed to the emergent whole, ∆-1>∆0, happens at the origin of the being, if we are to use the specific sequential time of the human mind. But it is a local future, which can be seen as the beginning of a new generation. So whether information or entropy is the future in cyclical time matters not: ∆=∇; going up and down the planes of the fifth dimension, both dimensions finally balance into a zero sum.
As minds are still order, and we are a mind talking to minds, we have chosen wholes and informative minds as the fifth dimension of our relative perception of the future. But if I were a Nietzschean lion eating the camel’s load, and called Shiva, the god of death, the future of the Universe, in its interstellar vacuum space, as physicists, masters of entropy weapons, do, I would promote entropy=death as the future and believe in the relative second law of thermodynamics.
Of course the law is not absolute; it only applies to gaseous states – the entropic state of matter, reordered by crystals – or to dark entropy expanding the interstellar vacuum between galaxies, balanced by the warping of vacuum space into matter under gravitation in galactic vortices.
In a logic sense, though, to call the mind-whole the fifth dimension makes sense: the fact is that we can ONLY go upwards and create wholes if the parts are there. So the parts must be the fourth dimension of relative past, to which entropy returns.
Entropy and probability.
In physics entropy is studied in terms of potential degrees of freedom, which means the more degrees of freedom, the more entropy; and that is so.
The fourth dimension has then ‘potential qualities’, and it is a potential in physics, which can be used to give birth to different wholes in the fifth. As humans we become different complex superorganisms; and once ‘made’ we remain with a fixed form, all other potential paths gone. But when we die and become mere atoms, again the number and range of possible pyramids of wholes different from the old whole ‘humind’ reappear; and so the fourth dimension of entropy holds all potential future wholes, in the form of probability. And so it is the entrance to the renewal of infinite parallel worlds of superorganisms of space-time.
It is then really the fourth dimension, the sum of smaller, faster time cycles, which can create many envelopes, many particle covers of static mind spaces surrounding them, self-centered into a central singularity. Then entropy becomes vital energy, enclosed and ordered, able to move as a ‘wave’ without scattering the whole.
Entropy, though, is free; it is expansive, fast and Markovian, memoryless. As when we die: entropy erases two co-existing scales, ∆+1<<∆-1, devolving the being twice, jumping down two scales and expanding enormously in instantaneous space – the time quanta, now dying quanta, crossing in a single moment of time the thin barrier between being and not being, the inner and outer world, the concave and convex side of ± curvature…
We shall indeed describe entropy in many ways under the Rashomon effect, the Disomorphic method and the 9 planes of exi≈st¡ences.
ENTROPY IN THE ∆∞ SCALES OF THE 5TH DIMENSION.
In the graph, entropy, often confused with energy, is pure expansive motion that erases space: the limbs/fields of all systems. When entropy has 0 information it disappears from perception and becomes pure distance, which seems not to have motion, but is pure simultaneous, infinite relative motion: V=S/Tiƒ=S/0=∞.
The word entropy has a restricted meaning in Physics, which we widen in GST to all the scales of the Universe, as one of the 3 fundamental arrows of time. We call with the generic name of entropy or space, ‘S’ (a symbol which physicists coincidentally use for both), one of the two extreme poles of the Universe: lineal motion-distance, translation in space with minimal form, even negative absorption of information. Yet as all systems are relative to the p.o.v. that describes them, pure entropy only exists in the absolute Universe, in those scales we do not perceive – for human beings, in the gravitational pure space-distance with no form, no information, and hence a relative infinite distance-speed: V=S/t=S/0=∞. In the next, quantum scale, pure entropy would then be non-local action at a distance – Bohm’s quantum field, which the particle uses to displace its wave and which acts at faster than c speeds – not a problem for the scalar field dimension.
Next, we arrive at the thermodynamic ∆º scale, the human scale in which entropy finally meets its meaning in classic physics, as the expansive, disordered tendency of an atomic or molecular ensemble to increase its spatial distance and hence its disorder; so it corresponds to the gaseous state among the 3 states of matter (Spe-gas, ST-liquid, Tiƒ-solid).
How this specific scale and state of matter has become, among physicists first and then among all humans, the ‘only arrow of time’ of the Universe defies even the ‘imagination’ of the most unreal dreamer of absolute theories about the Universe. And of course it is the biggest reductionism of scientific models of reality, carried on later into the entropy-only big-bang theory of the Universe.
We shall thus debunk those absurd ‘totalitarian’ ideas about the wholeness in the posts on specific physical scales (Cosmology, Thermodynamics). It is though in this scale, the human ∆º scale, where most of the great questions about entropy as an arrow of time can be answered and then generalised to each scale, but only as a single arrow of the 3 in existence, ascribing to human error – that is, to the worldly profession of physicists, making weapons of mass destruction – the absurd idea that it is all that the future holds for us; or in the words of Helmholtz: ‘the Universe is dying’. LOL, the Universe is ∞ in scales, immortal in time. Only Mr. H is dead, and so should be his theory of time.
Entropy at this stage however becomes interesting for modelling all other ∆º±1 scales of reality because of the great detail of thermodynamic equations; and so we shall introduce in this post the mathematical physics of entropy, with special interest in the relationship, well studied by statistical mechanics, between entropy and disorder in the different scales of the fifth dimension.
Since it will allow us to argue the inverse arrows of time as they apply not only in a single scale but in several scales of the fifth dimension:
In the graph, we deduce from thermodynamic studies, and generalise to any 3 ∆º±1 5D dimensions, the concept of a growing entropy in the ‘upward’ arrow of the wholeness, and a diminishing entropy in the lower arrow, which physicists have found in the study of heat: the order of microscopic particles does not travel upwards without ‘friction’, yet the synchronous order of the whole travels mechanically downwards without friction, as you do not suffer ‘entropy’ when moving.
Next come the scales of life, and again we generalise the fact that in what we might consider Tiƒ states, or solid states, there is negentropy, that is, growth of information. This happens not only in humans, which can be considered a liquid-solid state (cells act as a DNA crystal for what matters in terms of order, growth of information and reproduction), but in all scales, where entropy will always be superseded by a higher ordered arrow of creation of form (the reproductive liquid wave, or body wave) and further on by the tightening of cloned herds by an informative, branching, simultaneous network, departing from a Tiƒ, still focused mind-seed of information.
So we arrive at the cosmic scale, where entropy is played by dark energy, action at a distance, expansive space, which the simplex physical models consider to be all there is.
∂. ENTROPY AS A FUNCTION OF TIME
The function of entropy is self-evident in all scales of nature: disorder and death; hence the arrow of past, inverse to the creation of information. This, physicists got right, so we can again consider it for all scales as a function of time, more than as a function of form (as we have done in the previous paragraph):
Death is entropy, a big bang is entropy, decelerated expansive motions are entropy, the arrow of disorder and chaos is entropy, the past is the relative motion of entropy. Entropy coincides broadly in the thermal scale of physics with the concept of disorder, heat and gas expansion, but that is the ‘limit’ of the concept in a single scale of the Universe.
In the Fractal Universe entropy is the Spe-expansive spatial ‘arrow of time’, one of the 3 arrows of the fractal generator, and it has different equations and parameters in each scale.
So here we shall deal with ‘entropy only’ as a process that expands space by erasing a tighter whole configuration of the 5th dimension and eliminating its informative warping, liberating the ∆ being 1 ‘scale below’ in the 5th dimension (partial entropy) or 2 scales, ∆<<∆-2 (death). As humans perceive several scales, the ‘intensity’ of an entropic process of destruction will thus vary.
So the final absolute entropy of infinite motion and zero form (relative to a p.o.v. of perception) will be for man the dark energy of intergalactic space, the absolute/relative past of a small island-universe:
– Dark energy is expansive gravitational entropy between galaxies
-Neutrino quantum potential waves are the entropic field of the v:∆-4 scale.
-E (=mc²) is the entropic ‘explosion of mass’ into an expansive ‘big-bang field’. And any field in general is the entropic substrate that informative systems (in an inverse fashion: M=e/c²) contract and form.
-Entropy is the heat expansion of a thermodynamic ∆-3 scale.
-Entropy is the moment of death (in general terms): Max. Spe x T=0, which is especially relevant to explain living systems.
-Entropy is war in the ∆+1 scale of super organisms of history.
So entropy is death, erasing of information, expansion of form into space in any scale of the Universe:
Entropy is death in all scales: disorder of information for human biological beings or physical beings (big-bang events). As such it is the E in E=mc², often confused with Energy (in that equation the energy is the bidimensional c² field), so properly written it would be:
Entropy < Energy: c² > Time vortex: Tiƒ
Entropy is maximal at death, where the arrow of entropy is pure, but we can consider that it grows also in youth, the first age, albeit with less pure intensity. As we write death as ∆+1<<∆-1:
Which means to descend two scales of the fifth dimension and fully erase the form of a system. While youth, after a fast reproductive growth truly increases towards maturity its information. So youth is the relative ‘entropy age’ when compared to the other 2 ages of more energy and information.
This means entropy is in past-lower scales (ST∆-i), which are always happening together – entropy with lesser information is the relative past state of any system, a loose herd of expanding motions of micro-points from the p.o.v. of a higher scale – and is defined, in terms of expansive, open space, as one of the 3 ‘functions-forms’ of space: ‘ENTROPY’ only.
Finally, notice that as the future is a relative choice among 3 arrows for every logic being (Non-Æ logic), entropic death can also be in the future.
So in 5D entropy is both a more reduced time arrow (as it shares the definition of future with the arrows of information and repetitive, conservative world cycles of energy) and a more expanded one, as it applies also to other scales.
Thus we talk of two levels of entropy: pure entropic motions or death processes – and, by extension, the perception of a lower ∆-2 scale of reality from the ∆º point of view – and feeding processes on ∆-1, which can be considered partially energy events, as they will end up reforming the field into a higher energetic quantum of the ∆º being.
In mathematical terms, each of those processes means the extraction from an ∆-1 scale of an integral or derivative quantity (depending on which direction we measure), and so when we study mathematical physics, we shall notice that pure entropic processes imply 2 derivatives as in Lagrangians and one single derivative carries us to an energetic analysis (as in Hamiltonians).
S: ENTROPY AS AN ORGAN OF SPACE: FIELD/LIMBS
In the same manner, limbs are the entropic, moving element of the being; but they are complex systems which can be decomposed into a head that attaches them to the body, a main organic part (the cilia, or dual part of the leg), and the more properly entropic part – the feet, which disorder the electronic matter over which they walk, or any other ‘organic tail of a comet, cilia of a paramecium in water, Alcubierre engine in gravitation’, etc. – which moves the closed system by disordering selectively the ∆-n scales over which it transits.
It often happens that the entropic field/limb is not even attached to the body/wave & head/particle; the entropy is truly disordered, because it feeds the body-wave, and by definition food is always entropic, killed and lowered ∆-2 scales. So when we eat, we lower the food to the ∆-2 amino acid scale; and when an electron moves, it does so, according to Bohm’s interpretation of the Schrödinger equation, over the quantum field (of dark energy/expansive gravitation).
The inverse arrow of information: Spe: entropy as one of the 3 arrows of the generator of time-space.
In continuous physics, entropy is a constant, K, of the thermodynamic scale; and the concept of an expansive Universe, taken from thermodynamics, is a sweeping generalisation of it that must be corrected.
In the graph, entropy is the expansive, lineal, planar element of all systems of reality, which dominates in the processes of death, but normally is connected in complementary forms to informative particle/heads:
Thus we write entropy in the GST generator equation of space-time beings in any of its particular cases as:
|-Spe (entropy-youth/death) < St-wave/body of energy > Tiƒ (informative particle/head)
In the fourth line we will break the parallel ST•∆ elements of space-time into its parts and simplex one-dimensional concepts. So we shall study each specific entropic element in more detail (time permitting; if not, future humans or robots should do it):
Entropy from the point of view of the system belongs to the external time membrane, which cycles as a clock of time in tune with the singularity processing of logic information, synchronising in this manner the whole system.
As the membrane of an open ball, the entropic dimension tends, from a Cartesian view, to infinity; but if we change coordinates to polar ones it maps to within a circle of radius 1, a transformation that is often mathematically easier in the complex plane.
In the graph, the equation of diffusion, discovered in heat, explains a process of ∆+1<<∆-1, observed in the equality of the first time derivative and the second spatial derivative, or gradient of the field state, of an initial point particle emitting energy soon degraded into entropy: ∂Φ(r,t)/∂t = D∇²Φ(r,t). As such, the equality between the initial state, ∂Φ(r,t)/∂t, and its second gradient as a field implies the complete degeneration of the system into stochastic entropy, in which all particles are disconnected, no longer under the Tiƒ, ∆º form missing in the second state. The first state thus normally represents a wave of confined cyclical present energy.
In mathematics, many phenomena in various science fields are expressed by using the well-known evolution equations. The diffusion equation is one of them and mathematically corresponds to the Markov process in relation to the normal distribution rule.
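As a minimal numerical sketch of the diffusion equation just discussed – ∂Φ/∂t = D∇²Φ in one dimension, where D, the grid and the time step are all illustrative choices, not values from the text – an initial point concentration can be watched flattening into a wide, disordered profile:

```python
# 1D diffusion sketch: dPhi/dt = D * d2Phi/dx2, advanced with an explicit
# finite-difference (Euler) step. All parameters are illustrative.

def diffuse(phi, D=1.0, dx=1.0, dt=0.2, steps=50):
    """Advance the concentration profile `phi` by `steps` explicit steps."""
    phi = list(phi)
    r = D * dt / dx**2          # stability of this scheme requires r <= 0.5
    for _ in range(steps):
        nxt = phi[:]
        for i in range(1, len(phi) - 1):
            nxt[i] = phi[i] + r * (phi[i+1] - 2*phi[i] + phi[i-1])
        phi = nxt
    return phi

# Start with all the 'substance' concentrated at a single point.
profile = [0.0] * 41
profile[20] = 1.0
spread = diffuse(profile)

# The peak flattens while the total quantity stays essentially constant:
# order at a point degrades into an extended, entropic distribution.
print(round(sum(spread), 4), round(max(spread), 4))
```

The conserved sum with a falling peak is the point: the ordered initial state (a Tiƒ-like point) degenerates into a stochastic spread of disconnected quanta.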
In physics, the motion of diffusing particles corresponds to the well-known Brownian motion, satisfying the parabolic law.
It is widely accepted that the Brownian motion problem is a general term for investigating subjects in various science fields relevant to the Markov process, such as material science, information science, life science, social science, and so on. Extended diffusion equations are used in various science fields. In that case, they sometimes have a sink and source of their concerned elements – for example, a local equilibrium relation between native defects in a silicon crystal in material science, or between predator and prey in life science.
In probability theory and related fields, a Markov process (or Markoff process), named after the Russian mathematician Andrey Markov, is a stochastic process that satisfies the Markov property (sometimes characterized as “memorylessness”). Loosely speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process’s full history, hence independently from such history; i.e., conditional on the present state of the system, its future and past states are independent.
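The Markov property can be sketched with a toy two-state chain – the state names and transition probabilities below are illustrative inventions, not taken from the text. Each step is drawn only from the current state, with no memory of the path taken, yet a stable long-run distribution emerges:

```python
import random

# Toy two-state Markov chain ('ordered' vs 'dispersed'); memoryless: the
# next state depends only on the current one. Probabilities are illustrative.
P = {
    'ordered':   {'ordered': 0.7, 'dispersed': 0.3},
    'dispersed': {'ordered': 0.1, 'dispersed': 0.9},
}

def step(state, rng):
    return 'ordered' if rng.random() < P[state]['ordered'] else 'dispersed'

rng = random.Random(42)
state = 'ordered'
counts = {'ordered': 0, 'dispersed': 0}
for _ in range(100_000):
    state = step(state, rng)
    counts[state] += 1

# The stationary distribution solves pi('ordered')*0.3 = pi('dispersed')*0.1,
# giving pi('ordered') = 0.25; the empirical fraction converges to it.
frac = counts['ordered'] / 100_000
print(round(frac, 3))
```

The long-run fractions depend only on the transition probabilities, not on the starting history, which is exactly the memorylessness the definition above states.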
In probability theory, the normal (or Gaussian) distribution is a very common continuous probability distribution. Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known.
The normal distribution is useful because of the central limit theorem. In its most general form, under some conditions (which include finite variance), it states that averages of random variables independently drawn from independent distributions converge in distribution to the normal, that is, become normally distributed when the number of random variables is sufficiently large. Physical quantities that are expected to be the sum of many independent processes (such as measurement errors) often have distributions that are nearly normal.
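The central-limit behaviour described above can be checked numerically: averages of n independent uniform(0,1) draws cluster around 0.5 with standard deviation √(1/12)/√n. The sample sizes below are illustrative choices:

```python
import random
import statistics

# Central limit sketch: distribution of averages of n uniform(0,1) draws.
# n and the number of samples are illustrative.
rng = random.Random(1)
n, samples = 30, 20_000
means = [sum(rng.random() for _ in range(n)) / n for _ in range(samples)]

mu = statistics.fmean(means)
sd = statistics.stdev(means)

# Theory: mean -> 0.5 and std dev -> sqrt(1/12)/sqrt(n) ~ 0.0527 for n = 30.
print(round(mu, 3), round(sd, 4))
```

Even though each underlying draw is flat (uniform), the averages are already nearly bell-shaped at n = 30, which is why sums of many independent processes come out approximately normal.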
Moreover, many results and methods (such as propagation of uncertainty and least squares parameter fitting) can be derived analytically in explicit form when the relevant variables are normally distributed.
The normal distribution is sometimes informally called the bell curve. However, many other distributions are bell-shaped (such as the Cauchy, Student’s t, and logistic distributions). Even the term Gaussian bell curve is ambiguous, because it may be used to refer to some function defined in terms of the Gaussian function which is not a probability distribution, as it is not normalized and does not integrate to 1.
The probability density of the normal distribution is: f(x) = (1/√(2πσ²)) e^(−(x−μ)²/(2σ²)), where μ is the mean and σ the standard deviation.
A random variable with a Gaussian distribution is said to be normally distributed and is called a normal deviate.
We can thus consider that the first information or modelling by an ∆º mind of a field of micro-points will be to qualify them with a Gaussian distribution. Yet the system will in itself have such a distribution only for qualities which are not correlated in time-networks with form.
The element we can construct further here is the existence of a higher potency, which in exponential decay shows more clearly the fundamental property of the independent free particles of a field of decaying energy being transformed into entropic radiation:
A quantity is subject to exponential decay if it decreases at a rate proportional to its current value. Symbolically, this process can be expressed by the following differential equation, where N is the quantity and λ (lambda) is a positive rate called the exponential decay constant: dN/dt = −λN.
The solution to this equation is: N(t) = N0 e^(−λt),
where N(t) is the quantity at time t, and N0 = N(0) is the initial quantity, i.e. the quantity at time t = 0.
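As a check on that closed form, a crude Euler integration of dN/dt = −λN can be compared against N(t) = N0·e^(−λt); the values of N0, λ, the time step and the horizon are all illustrative:

```python
import math

# Euler integration of dN/dt = -lam * N compared with the closed-form
# solution N(t) = N0 * exp(-lam * t). All numbers are illustrative.
N0, lam, dt, T = 1000.0, 0.5, 0.001, 4.0

N = N0
for _ in range(round(T / dt)):
    N += dt * (-lam * N)        # dN/dt = -lam * N

exact = N0 * math.exp(-lam * T)
half_life = math.log(2) / lam   # t_1/2 = ln 2 / lam

print(round(N, 2), round(exact, 2))   # numerical vs closed form
print(round(half_life, 3))
```

The small step size keeps the Euler result within a fraction of a percent of the exact solution; the half-life ln 2/λ is the time for any starting quantity to halve, independent of N0.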
The decay of a form, as in E=mc², is in a certain sense a similar case: here a mass, an entity of the ∆+1 physical scale, reduces to an expansive electromagnetic wave two scales below, past the next, electronic scale of the mass. How does it happen? Likely through the theory of the neutrino as the origin of light waves: two neutrinos, between the quarks and the electron elements of the atom, produce them, with different topological interpretations.
Exponential decay occurs in a wide variety of situations. Most of these fall into the domain of the natural sciences.
Many decay processes that are often treated as exponential are really only exponential so long as the sample is large and the law of large numbers holds. For small samples, a more general analysis is necessary, accounting for a Poisson process.
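The small-sample caveat can be sketched by simulating each 'atom' as an independent Bernoulli decay per time step (all numbers below are illustrative): a large ensemble tracks the exponential law closely, while small ensembles fluctuate widely around it.

```python
import random
import math

# Per-atom stochastic decay: each surviving atom decays with probability
# lam*dt per step, so the surviving fraction is only exponential on average.
def simulate(n_atoms, lam=0.1, dt=0.1, steps=100, rng=None):
    rng = rng or random.Random(0)
    alive = n_atoms
    for _ in range(steps):
        alive -= sum(1 for _ in range(alive) if rng.random() < lam * dt)
    return alive

rng = random.Random(7)

# 200 small samples of 20 atoms: surviving fractions scatter widely.
small = [simulate(20, rng=rng) / 20 for _ in range(200)]

# One large sample: close to (1 - lam*dt)^steps ~ e^(-lam*dt*steps) = e^-1.
big = simulate(20_000, rng=rng) / 20_000
print(round(big, 3))
```

The large-sample fraction lands near e⁻¹ ≈ 0.368, while the 20-atom runs spread over a broad range: the deterministic exponential law is the large-numbers limit of an underlying Poisson-like counting process.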
From the net we extract a few cases:
Chemical reactions: The rates of certain types of chemical reactions depend on the concentration of one or another reactant. Reactions whose rate depends only on the concentration of one reactant (known as first-order reactions) consequently follow exponential decay. For instance, many enzyme-catalyzed reactions behave this way.
Electrostatics: The electric charge (or, equivalently, the potential) contained in a capacitor (capacitance C) changes exponentially if the capacitor experiences a constant external load (resistance R). The exponential time constant τ for the process is RC, and the half-life is therefore RC ln 2. This applies to both charging and discharging, i.e. a capacitor charges or discharges according to the same law. The same equations can be applied to the current in an inductor. (Furthermore, the particular case of a capacitor or inductor discharging through several parallel resistors makes an interesting example of multiple decay processes, with each resistor representing a separate process. In fact, the expression for the equivalent resistance of two resistors in parallel mirrors the equation for the half-life with two decay processes.)
Fluid dynamics: A fluid emptying from a tube with an opening at the bottom will empty at a rate which depends on the pressure at the opening (which in turn depends on the height of the fluid remaining). Thus the height of the column of fluid remaining will follow an exponential decay.
Geophysics: Atmospheric pressure decreases approximately exponentially with increasing height above sea level, at a rate of about 12% per 1000m.
Heat transfer: If an object at one temperature is exposed to a medium of another temperature, the temperature difference between the object and the medium follows exponential decay (in the limit of slow processes; equivalent to “good” heat conduction inside the object, so that its temperature remains relatively uniform through its volume). See also Newton’s law of cooling.
Luminescence: After excitation, the emission intensity – which is proportional to the number of excited atoms or molecules – of a luminescent material decays exponentially. Depending on the number of mechanisms involved, the decay can be mono- or multi-exponential.
Pharmacology and toxicology: It is found that many administered substances are distributed and metabolized (see clearance) according to exponential decay patterns. The biological half-lives “alpha half-life” and “beta half-life” of a substance measure how quickly a substance is distributed and eliminated.
Physical optics: The intensity of electromagnetic radiation such as light or X-rays or gamma rays in an absorbent medium, follows an exponential decrease with distance into the absorbing medium. This is known as the Beer-Lambert law.
Radioactivity: In a sample of a radionuclide that undergoes radioactive decay to a different state, the number of atoms in the original state follows exponential decay as long as the remaining number of atoms is large. The decay product is termed a radiogenic nuclide.
Thermoelectricity: The decline in resistance of a Negative Temperature Coefficient Thermistor as temperature is increased.
Vibrations: Some vibrations may decay exponentially; this characteristic is often found in damped mechanical oscillators, and used in creating ADSR envelopes in synthesizers. An overdamped system will simply return to equilibrium via an exponential decay.
Finance: a retirement fund will decay exponentially being subject to discrete payout amounts, usually monthly, and an input subject to a continuous interest rate. A differential equation dA/dt = input – output can be written and solved to find the time to reach any amount A, remaining in the fund.
In simple glottochronology, the (debatable) assumption of a constant decay rate in languages allows one to estimate the age of single languages. (To compute the time of split between two languages requires additional assumptions, independent of exponential decay).
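Several of the cases above reduce to the same law. Taking the electrostatics entry as a worked sketch (the component values below are illustrative), the time constant τ = RC and the half-life RC ln 2 follow directly from V(t) = V0·e^(−t/RC):

```python
import math

# RC discharge sketch: V(t) = V0 * exp(-t / (R*C)).
# Component values are illustrative: 10 kOhm, 100 uF, 5 V.
R, C, V0 = 10_000.0, 1e-4, 5.0

tau = R * C                     # time constant: 1.0 s here
half_life = tau * math.log(2)   # RC * ln 2, as stated in the list above

def v(t):
    """Capacitor voltage t seconds after discharge begins."""
    return V0 * math.exp(-t / tau)

print(round(half_life, 3))
print(round(v(half_life) / V0, 3))   # exactly half the initial voltage
```

The same three lines, with R, C and V0 renamed, describe the cooling object, the decaying radionuclide or the absorbed beam: one entropic law, many ∆-scale instances.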