ABSTRACT
The asymptotic structure of null and spatial infinities of asymptotically flat spacetimes plays an essential role in discussing gravitational radiation, the gravitational memory effect, and conserved quantities in General Relativity (GR). Bondi, Metzner and Sachs (BMS) established that the asymptotic symmetry group for asymptotically simple spacetimes is the infinite-dimensional BMS group. Given that null infinity is divided into two components, past null infinity $\mathscr{I}^-$ and future null infinity $\mathscr{I}^+$, one can identify two independent symmetry groups: $\mathrm{BMS}^-$ at $\mathscr{I}^-$ and $\mathrm{BMS}^+$ at $\mathscr{I}^+$. Associated with these symmetries are the so-called BMS charges. A recent conjecture by Strominger suggests that the generators of $\mathrm{BMS}^+$ and $\mathrm{BMS}^-$ and their associated charges are related via an antipodal reflection map near spatial infinity. To verify this matching, an analysis of the gravitational field near spatial infinity is required. This task is complicated by the singular nature of spatial infinity for spacetimes with non-vanishing ADM mass. Different frameworks have been introduced in the literature to address this singularity, e.g. Friedrich's cylinder, Ashtekar-Hansen's hyperboloid and Ashtekar-Romano's asymptote at spatial infinity. This paper reviews the role of Friedrich's formulation of spatial infinity in the investigation of the matching of the spin-2 charges on Minkowski spacetime and in the full GR setting. This article is part of a discussion meeting issue 'At the interface of asymptotics, conformal methods and analysis in general relativity'.
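For orientation, the conjectured matching can be written schematically (standard notation assumed here, not reproduced from the paper): a supertranslation charge evaluated at the past boundary of $\mathscr{I}^+$ equals the corresponding charge at the future boundary of $\mathscr{I}^-$ with an antipodally related generator,

\[
Q^{+}_{f}\big|_{\mathscr{I}^{+}_{-}} \;=\; Q^{-}_{f\circ\Upsilon}\big|_{\mathscr{I}^{-}_{+}},
\]

where $\Upsilon$ denotes the antipodal map on the celestial sphere. Verifying such statements is precisely what requires control of the field near spatial infinity.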
ABSTRACT
This paper describes conservation laws in general relativity (GR) dating back to the mass-energy conservation of Bondi and Sachs in the early 1960s but using 2-spinor techniques. The notion of conformal infinity is employed, and the highly original ideas of E. T. Newman are discussed in relation to twistor theory. The controversial NP constants are introduced, and their meaning is considered in a new light related to the problem of equations of motion in GR. This article is part of a discussion meeting issue 'At the interface of asymptotics, conformal methods and analysis in general relativity'.
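As a hedged pointer (conventions and normalizations differ between references; this form is an assumption, not taken from the article), the NP constants at $\mathscr{I}^{+}$ are often written as the five complex integrals

\[
G_{m} \;=\; \oint_{S^{2}} {}_{2}\bar{Y}_{2,m}\, \psi_{0}^{1}\, \mathrm{d}S, \qquad m=-2,\dots,2,
\]

where $\psi_{0}^{1}$ is the coefficient of $r^{-6}$ in the asymptotic expansion of the Weyl spinor component $\psi_{0}$; their values are absolutely conserved along $\mathscr{I}^{+}$, which is what makes their physical interpretation both intriguing and controversial.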
ABSTRACT
Gravitational waveforms play a crucial role in comparing observed signals with theoretical predictions. However, obtaining accurate analytical waveforms directly from general relativity (GR) remains challenging. Existing methods involve a complex blend of post-Newtonian theory, effective-one-body formalism, numerical relativity and interpolation, introducing systematic errors. As gravitational wave astronomy advances with new detectors, these errors gain significance, particularly when testing GR in the nonlinear regime. A recent development proposes a novel approach to address this issue. By deriving precise constraints, or balance laws, directly from full nonlinear GR, this method offers a means to evaluate waveform quality, detect template weaknesses and ensure internal consistency. Before delving into the intricacies of balance laws in full nonlinear GR, we illustrate the concept using a detailed mechanical analogy. We will examine a dissipative mechanical system as an example, demonstrating how mechanical balance laws can gauge the accuracy of approximate solutions in capturing the complete physical scenario. While mechanical balance laws are straightforward, deriving balance laws in electromagnetism and GR demands a rigorous foundation rooted in mathematically precise concepts of radiation. Following the analogy with electromagnetism, we derive balance laws in GR. As a proof of concept, we employ an analytical approximate waveform model, showcasing how these balance laws serve as a litmus test for the model's validity. This article is part of the theme issue 'The particle-gravity frontier'.
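A minimal sketch of the mechanical analogy (our toy example in Python, not the paper's actual system): for a damped oscillator, the exact balance law states that the energy lost equals the integrated frictional dissipation, and the residual of that law grades the quality of an approximate waveform.

```python
import numpy as np

# Damped oscillator: m x'' + c x' + k x = 0.
# Exact balance law: E(t) - E(0) + integral of c x'(s)^2 ds = 0.
# We test an *approximate* waveform against it, mirroring how balance
# laws in GR can grade approximate gravitational waveforms.
m, c, k = 1.0, 0.1, 4.0
gamma = c / (2 * m)                       # decay rate
omega = np.sqrt(k / m - gamma**2)         # damped angular frequency

t = np.linspace(0.0, 20.0, 20001)
# Weak-damping ansatz (assumed approximate solution, exact only as c -> 0):
x = np.exp(-gamma * t) * np.cos(omega * t)
v = np.gradient(x, t)

E = 0.5 * m * v**2 + 0.5 * k * x**2               # mechanical energy
flux = np.cumsum(c * v**2) * (t[1] - t[0])        # dissipated energy

# Residual of the balance law; zero (up to discretization) for the exact solution.
residual = (E - E[0]) + flux
print("max |balance-law residual| =", np.max(np.abs(residual)))
```

The size of the residual quantifies how badly the ansatz misses the full physics, which is exactly the litmus-test role the paper assigns to balance laws for waveform models.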
ABSTRACT
This is an introductory article for the proceedings associated with the Royal Society Hooke discussion meeting of the same title which took place in London in May 2023. We review the history of Penrose's conformal compactification, null infinity and a number of related fundamental developments in mathematical general relativity from the last 60 years. This article is part of a discussion meeting issue 'At the interface of asymptotics, conformal methods and analysis in general relativity'.
ABSTRACT
This paper is about two important trends of scattering theory in general relativity: time-dependent spectral analytic scattering and conformal scattering. The former was initiated by Jonathan Dimock and Bernard Kay in the mid-1980s and is based on spectral and functional analysis. The latter was proposed by Roger Penrose in 1965 and then constructed for the first time by Gerard Friedlander in 1980 by putting together Penrose's conformal method and another analytic approach to scattering: the Lax-Phillips theory due to Peter Lax and Ralph Phillips. We shall review the history of the two approaches and explain their general principles. We shall also explore an important question: 'can the tools of one approach be used to obtain a complete construction in the other?' This article is part of a discussion meeting issue 'At the interface of asymptotics, conformal methods and analysis in general relativity'.
ABSTRACT
The problem of formulating thermodynamics in a relativistic scenario remains unresolved, although many proposals exist in the literature. The challenge arises due to the intrinsic dynamic structure of spacetime as established by the general theory of relativity. With the discovery of the physical nature of information, which underpins Landauer's principle, we believe that information theory should play a role in understanding this problem. In this work, we contribute to this endeavour by considering a relativistic communication task between two partners, Alice and Bob, in a general Lorentzian spacetime. We then assume that the receiver, Bob, reversibly operates a local heat engine powered by information, and seek to determine the maximum amount of work he can extract from this device. As Bob cannot extract work for free, by applying both Landauer's principle and the second law of thermodynamics, we establish a bound on the energy Bob must spend to acquire the information in the first place. This bound is a function of the spacetime metric and the properties of the communication channel.
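For reference, the flat, static anchor of the argument is Landauer's bound (a standard result; the paper's metric- and channel-dependent generalization is not reproduced here): erasing one bit in a reservoir at temperature $T$ costs at least

\[
W_{\min} = k_{B} T \ln 2,
\]

and one expects the spacetime metric to enter through the rescaling of the energies exchanged between Alice and Bob along the channel.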
ABSTRACT
Gravitational redshift effects undoubtedly exist; moreover, the experimental setups which confirm the existence of these effects, the most famous being the Pound-Rebka experiment, are extremely well known. Nonetheless, and perhaps surprisingly, there remains a great deal of confusion in the literature regarding what these experiments really establish. Our goal in the present article is to clarify these issues, in three concrete ways. First, although (i) Brown and Read (2016) are correct to point out that, given their sensitivity, the outcomes of experimental setups such as the original Pound-Rebka configuration can be accounted for using solely the machinery of accelerating frames in special relativity (barring some subtleties due to the Rindler spacetime necessary to model the effects rigorously), nevertheless (ii) an explanation of the results of more sensitive gravitational redshift experiments does in fact require more. Second, although typically this 'more' is understood as the invocation of spacetime curvature within the framework of general relativity, in light of the so-called 'geometric trinity' of gravitational theories, curvature is in fact not necessary to explain even these results. Thus (a) one can often explain the results of these experiments using only the resources of special relativity, and (b) even when one cannot, one need not invoke spacetime curvature. And third: while one might think that the absence of gravitational redshift effects would imply that spacetime is flat (indeed, Minkowskian), this can be called into question given the possibility of gravitational redshift effects being cancelled by charge in the context of the Reissner-Nordström metric. This argument is shown to be valid: both attractive forces and redshift effects can be effectively shielded (and can even become repulsive or blueshifted, respectively) in the charged setting. Thus, it is not the case that the absence of gravitational redshift effects implies a Minkowskian spacetime setting.
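For concreteness (standard textbook formulas, not specific to this article): to first order, the frequency shift over a height $h$ in uniform acceleration or uniform gravity is

\[
z \;\simeq\; \frac{gh}{c^{2}},
\]

which suffices at the original Pound-Rebka sensitivity, while in a static spacetime the exact ratio of received to emitted frequency is

\[
\frac{\nu_{\mathrm{rec}}}{\nu_{\mathrm{em}}} \;=\; \sqrt{\frac{g_{tt}(x_{\mathrm{em}})}{g_{tt}(x_{\mathrm{rec}})}},
\]

whose higher-order content is what the more sensitive experiments probe.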
ABSTRACT
We carry out a systematic study of the motion of test particles in the region interior to the naked singularity of a quasi-hyperbolically symmetric γ-metric. The geodesic equations are written down and analyzed in detail. The results obtained are contrasted with the corresponding results for the axially symmetric γ-metric and the hyperbolically symmetric black hole. As in this latter case, it is found that test particles experience a repulsive force within the horizon (naked singularity), which prevents them from reaching the center. However, in the present case, this behavior is affected by the parameter γ, which measures the departure from hyperbolic symmetry. These results are obtained for radially moving particles as well as for particles moving in the θ-r subspace. The possible relevance of these results to the explanation of extragalactic jets is discussed.
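The analysis rests on the geodesic equation, quoted here only in its general form (the explicit Christoffel symbols of the quasi-hyperbolic γ-metric are not reproduced):

\[
\frac{\mathrm{d}^{2}x^{\mu}}{\mathrm{d}\tau^{2}} + \Gamma^{\mu}{}_{\alpha\beta}\, \frac{\mathrm{d}x^{\alpha}}{\mathrm{d}\tau}\frac{\mathrm{d}x^{\beta}}{\mathrm{d}\tau} = 0,
\]

with the reported repulsion showing up, in such studies, as a turning point in the effective radial motion.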
ABSTRACT
The paper re-examines the principal methodological questions arising in the debate over the cosmological standard model's postulate of Dark Matter vs. rival proposals that modify standard (Newtonian and general-relativistic) gravitational theory, the so-called Modified Newtonian Dynamics (MOND) and its subsequent extensions. What to make of such seemingly radical challenges to cosmological orthodoxy? In the first part of our paper, we assess MONDian theories through the lens of key ideas of major 20th-century philosophers of science (Popper, Kuhn, Lakatos, and Laudan), thereby rectifying widespread misconceptions and misapplications of these ideas common in the pertinent MOND-related literature. None of these classical methodological frameworks, which render precise and systematise the more intuitive judgements prevalent in the scientific community, yields a favourable verdict on MOND and its successors, contrary to claims by some of these theories' advocates; the respective theory appraisals are largely damning. Drawing on these insights, the paper's second part zooms in on the most common complaint about MONDian theories: their ad-hocness. We demonstrate how the recent coherentist model of ad-hocness captures, and fleshes out, the often insufficiently articulated hunches underlying this critique. MONDian theories indeed come out as severely ad hoc: they do not cohere well with either theoretical or empirical-factual background knowledge. In fact, as our complementary comparison with the cosmological standard model's Dark Matter postulate shows, with respect to ad-hocness MONDian theories fare worse than the cosmological standard model.
Subjects
Gravitation, Judgment, Time, Knowledge

ABSTRACT
In this discourse, we would like to discuss some issues of concept and principle in the context of the following three aspects. One, how the cosmological constant Λ arises as a constant of space-time structure on the same footing as the velocity of light. These are the two constants innate to space-time, without reference to any force or dynamics whatsoever, and they are interwoven in the geometry of 'free' homogeneous space-time. Two, how does the vacuum energy gravitate? Could its gravitational interaction in principle be included in general relativity, or would a new theory of quantum space-time/gravity be required? Finally, we would like to raise the fundamental question: how does the Universe physically expand? Since there does not lie anything outside into which it can expand, it has instead to expand on its own, maybe by creating new space-time out of nothing at each instant and at every location! Thus not only was the Universe created at some instant in the past, marking the beginning in the Big Bang, it is in fact being created continuously at each epoch as it expands. We thus need a quantum theory of space-time/gravity to fully understand the working of the Universe. This article is part of the theme issue 'The future of mathematical cosmology, Volume 2'.
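The first point can be anchored by the field equations written with Λ on the geometric side (standard form):

\[
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu},
\]

where Λ accompanies the metric itself, with no reference to any matter source; this is the sense in which it is innate to space-time structure in the same way as $c$.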
ABSTRACT
Following Smolin, we pursue the unification of general relativity and quantum theory by operating solely with events, i.e., without appealing to physical systems and space-time. The universe is modelled as a dendrogram (a finite tree) expressing the hierarchic relations between events. This is the observational (epistemic) model; the ontic model is based on p-adic numbers (infinite trees). Hence we use novel mathematics: not only space-time but even the real numbers are not in use. Here the p-adic space (which is zero-dimensional) serves as the base for the holographic image of the universe. In this way our theory is connected with p-adic physics, in particular p-adic string theory and complex disordered systems (the p-adic representation of the Parisi matrix for spin glasses). Our Dendrogramic-Holographic (DH) theory matches perfectly with Mach's principle and Brans-Dicke theory. We find a surprising informational interrelation between the fundamental constants h, c, G, and their DH analogues h(D), c(D), G(D). DH theory is part of Wheeler's project on the information restructuring of physics. It is also a step towards a Unified Field theory. The universal potential V is nonlocal, but this is a relational DH nonlocality. V can be coupled to the Bohm quantum potential by moving to the real representation. This coupling enhances the role of the Bohm potential.
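A minimal illustration (ours, not the paper's code) of why p-adic numbers carry a tree hierarchy: the p-adic norm is ultrametric, so p-adic "balls" nest rather than overlap, which is exactly the structure of a dendrogram.

```python
from fractions import Fraction

def p_adic_valuation(x: Fraction, p: int) -> int:
    """v_p(x): exponent of p in x (negative when p divides the denominator)."""
    if x == 0:
        raise ValueError("v_p(0) is +infinity by convention")
    v, num, den = 0, x.numerator, x.denominator
    while num % p == 0:
        num //= p
        v += 1
    while den % p == 0:
        den //= p
        v -= 1
    return v

def p_adic_norm(x: Fraction, p: int) -> Fraction:
    """|x|_p = p^(-v_p(x)); numbers divisible by high powers of p are 'small'."""
    v = p_adic_valuation(x, p)
    return Fraction(1, p**v) if v >= 0 else Fraction(p**(-v))

# Ultrametric inequality |x + y|_p <= max(|x|_p, |y|_p):
x, y, p = Fraction(8), Fraction(24), 2
print(p_adic_norm(x, p), p_adic_norm(y, p), p_adic_norm(x + y, p))
# -> 1/8 1/8 1/32  (strictly below the max, as ultrametricity allows)
```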
ABSTRACT
As an extension of Gabor signal processing, the covariant Weyl-Heisenberg integral quantization is implemented to transform functions on the eight-dimensional phase space (x, k) into Hilbertian operators. The x = xµ are space-time variables, and the k = kµ are their conjugate frequency-wave vector variables. The procedure is first applied to the variables (x, k) themselves and produces essentially canonically conjugate self-adjoint operators. It is next applied to the metric field gµν(x) of general relativity and yields regularized semi-classical phase space portraits ǧµν(x). The latter give rise to a modified tensor energy density. Examples are given with the uniformly accelerated reference system and the Schwarzschild metric. Interesting probabilistic aspects are discussed.
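Schematically (our paraphrase of covariant Weyl-Heisenberg integral quantization; the normalization and the choice of probe are assumptions, not quoted from the paper), a phase-space function is mapped to an operator by

\[
f(x,k) \;\mapsto\; A_{f} \;=\; \int_{\mathbb{R}^{8}} f(x,k)\, U(x,k)\,\rho\, U(x,k)^{\dagger}\, \frac{\mathrm{d}^{4}x\, \mathrm{d}^{4}k}{(2\pi)^{4}},
\]

where $U(x,k)$ is the Weyl-Heisenberg displacement operator and $\rho$ a fixed probe operator (the analogue of a Gabor window); the semi-classical portrait is then recovered by taking expectation values of $A_{f}$ back on phase space.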
ABSTRACT
I take non-locality to be the Michelson-Morley experiment of the early 21st century, assume its universal validity, and try to derive its consequences. Spacetime, with its locality, cannot be fundamental, but must somehow be emergent from entangled coherent quantum variables and their behaviors. There are, then, two immediate consequences: (i) if we start with non-locality, we need not explain non-locality; we must instead explain an emergence of locality and spacetime. (ii) There can be no emergence of spacetime without matter. These propositions flatly contradict General Relativity, which is foundationally local, can be formulated without matter, and in which there is no "emergence" of spacetime. If these be true, then quantum gravity cannot be a minor alteration of General Relativity but must demand its deep reformulation. This will almost inevitably lead to: matter not only curves spacetime, but "creates" spacetime. We will see independent grounds for the assertion that matter both curves and creates spacetime, which may invite a new union of quantum gravity and General Relativity. This quantum creation of spacetime consists of: (i) fully non-local entangled coherent quantum variables; (ii) the onset of locality via decoherence; (iii) a metric in Hilbert space among entangled quantum variables, given by the sub-additive von Neumann entropy between pairs of variables; (iv) a mapping from metric distances in Hilbert space to metric distances in classical spacetime by episodic actualization events; (v) discrete spacetime as the relations among these discrete actualization events; (vi) "Now" as the shared moment of actualization of one among the entangled variables, when the amplitudes of the remaining entangled variables change instantaneously; (vii) the discrete, successive, episodic, irreversible actualization events constituting a quantum arrow of time; (viii) the arrow-of-time history of these events recorded in the very structure of the spacetime constructed; (ix) Actual Time as a succession of two or more actual events. The theory inevitably yields a UV cutoff of a new type. The cutoff is a phase transition between continuous spacetime before the transition and discontinuous spacetime beyond the phase transition. This quantum creation of spacetime modifies General Relativity and may account for Dark Energy, Dark Matter, and the possible elimination of the singularities of General Relativity. Relations to Causal Set Theory, faithful Lorentzian manifolds, and past and future light cones joined at "Actual Now" are discussed. Possible observational and experimental tests based on (i) the existence of sub-Planckian photons, (ii) knee and ankle discontinuities in the high-energy gamma-ray spectrum, and (iii) possible experiments to detect a creation of spacetime in the Casimir system are discussed. A quantum actualization enhancement of the repulsive Casimir effect would be anti-gravitational and of possible practical use. The ideas and concepts discussed here are not yet a theory, but at most the start of a framework that may be useful.
ABSTRACT
The present paper revisits conventionalism about the geometry of classical and relativistic spacetimes. By means of critically examining a recent evaluation of conventionalism, we clarify key themes of, and rectify common misunderstandings about, conventionalism. Reichenbach's variant is demarcated from conventionalism simpliciter, associated primarily with Poincaré. We carefully outline the latter's core tenets, understood as a selective anti-realist response to a particular form of theory underdetermination. A subsequent double defence of geometric conventionalism is proffered: one line of defence employs (and thereby, to some extent, rehabilitates) a plausible reading of Reichenbach's idea of universal forces; another consists in independent support for conventionalism, unrelated to Reichenbach. Conventionalism, we maintain, remains a live option in contemporary philosophy of spacetime physics, worthy of serious consideration.
Subjects
Philosophy, Philosophy/History

ABSTRACT
The problem of observables and their supposed lack of change has been significant in Hamiltonian quantum gravity since the 1950s. This paper considers the unrecognized variety of ideas about observables in the thought of constrained Hamiltonian dynamics co-founder Peter Bergmann, who trained many students at Syracuse and invented observables. Whereas initially Bergmann required a constrained Hamiltonian formalism to be mathematically equivalent to the Lagrangian, in 1953 Bergmann and Schiller introduced a novel postulate, motivated by facilitating quantum gravity. This postulate held that observables were invariant under transformations generated by each individual first-class constraint. While modern works rely on Bergmann's authority and sometimes speak of "Bergmann observables," he had much to say about observables, generally interesting and plausible but not all mutually consistent and much of it neglected. On occasion he required observables to be locally defined (not changeless and global); at times he wanted observables to be independent of the Hamiltonian formalism (implicitly contrary to a definition involving separate first-class constraints). But typically he took observables to have vanishing Poisson bracket with each first-class constraint and took this result to be justified by the example of electrodynamics. He expected observables to be analogous to the transverse true degrees of freedom of electromagnetism. Given these premises, there is no coherent concept of observables which he reliably endorsed, much less established. A revised definition of observables is proposed that satisfies the requirement that equivalent theories have equivalent observables, using the Rosenfeld-Anderson-Bergmann-Castellani gauge generator G, a tuned sum of first-class constraints that changes the canonical action ∫dt(pq̇ - H) by a boundary term. Bootstrapping from theory formulations with no first-class constraints, one finds that the "external" coordinate gauge symmetry of GR calls for covariance (a transformation rule, and hence a 4-dimensional Lie derivative for the Poisson bracket), not invariance (a vanishing Poisson bracket), under G (not under each first-class constraint separately).
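The electromagnetic benchmark can be made explicit (a standard result, with sign conventions assumed here): for free Maxwell theory, with primary constraint $\pi^{0}\approx 0$ and Gauss constraint $\partial_{i}\pi^{i}\approx 0$, the tuned sum

\[
G[\epsilon] = \int \mathrm{d}^{3}x \left( \dot{\epsilon}\, \pi^{0} - \epsilon\, \partial_{i}\pi^{i} \right)
\]

generates exactly the gauge transformation $\delta A_{\mu} = \partial_{\mu}\epsilon$, whereas neither first-class constraint does so on its own.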
Subjects
Gravitation, Humans, Male, Time

ABSTRACT
Since 2015 the gravitational-wave observations of LIGO and Virgo have transformed our understanding of compact-object binaries. In the years to come, ground-based gravitational-wave observatories such as LIGO, Virgo, and their successors will increase in sensitivity, discovering thousands of stellar-mass binaries. In the 2030s, the space-based LISA will provide gravitational-wave observations of massive black hole binaries. Between the ∼10-10³ Hz band of ground-based observatories and the ∼10⁻⁴-10⁻¹ Hz band of LISA lies the uncharted decihertz gravitational-wave band. We propose a Decihertz Observatory to study this frequency range, and to complement observations made by other detectors. Decihertz observatories are well suited to observation of intermediate-mass (∼10²-10⁴ M⊙) black holes; they will be able to detect stellar-mass binaries days to years before they merge, providing early warning of nearby binary neutron star mergers and measurements of the eccentricity of binary black holes, and they will enable new tests of general relativity and the Standard Model of particle physics. Here we summarise how a Decihertz Observatory could provide unique insights into how black holes form and evolve across cosmic time, improve prospects for both multimessenger astronomy and multiband gravitational-wave astronomy, and enable new probes of gravity, particle physics and cosmology.
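A back-of-the-envelope check (our sketch; the quantitative science case is in the article) of why intermediate-mass black holes point at the decihertz band uses the standard ISCO gravitational-wave frequency estimate:

```python
import numpy as np

# Rough ISCO gravitational-wave frequency, f ~ c^3 / (6^{3/2} * pi * G * M):
# a standard order-of-magnitude estimate (assumed here, not from the paper)
# for a binary of total mass M.
G, c, Msun = 6.674e-11, 2.998e8, 1.989e30

def f_isco_hz(M_solar: float) -> float:
    M = M_solar * Msun
    return c**3 / (6**1.5 * np.pi * G * M)

for M in (10.0, 1e2, 1e3, 1e4):
    print(f"M = {M:>8.0f} Msun  ->  f_ISCO ~ {f_isco_hz(M):.3f} Hz")
```

Systems of ∼10⁴ M⊙ merge near a few tenths of a hertz, while lighter stellar-mass binaries spend days to years sweeping through the band before reaching the ground-based detectors; this is the early-warning role described above.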
ABSTRACT
We study fluid distributions endowed with hyperbolic symmetry, which share many common features with Lemaître-Tolman-Bondi (LTB) solutions (e.g., they are geodesic, shearing, and nonconformally flat, and the energy density is inhomogeneous). As such, they may be considered as hyperbolically symmetric versions of LTB, with spherical symmetry replaced by hyperbolic symmetry. We start by considering pure dust models, and afterwards we extend our analysis to dissipative models with anisotropic pressure. In the former case, the complexity factor is necessarily nonvanishing, whereas in the latter case, models with a vanishing complexity factor are found. The remarkable fact is that all solutions satisfying the vanishing complexity factor condition are necessarily nondissipative and satisfy the stiff equation of state.
ABSTRACT
A neural network is a dynamical system described by two different types of degrees of freedom: fast-changing non-trainable variables (e.g., the states of neurons) and slow-changing trainable variables (e.g., weights and biases). We show that the non-equilibrium dynamics of trainable variables can be described by the Madelung equations, if the number of neurons is fixed, and by the Schrödinger equation, if the learning system is capable of adjusting its own parameters such as the number of neurons, step size and mini-batch size. We argue that Lorentz symmetries and curved space-time can emerge from the interplay between stochastic entropy production and entropy destruction due to learning. We show that the non-equilibrium dynamics of non-trainable variables can be described by the geodesic equation (in the emergent space-time) for localized states of neurons, and by the Einstein equations (with cosmological constant) for the entire network. We conclude that the quantum description of trainable variables and the gravitational description of non-trainable variables are dual in the sense that they provide alternative macroscopic descriptions of the same learning system, defined microscopically as a neural network.
ABSTRACT
Super-substantivalism (of the type we'll consider) roughly comprises two core tenets: (1) the physical properties which we attribute to matter (e.g. charge or mass) can be attributed to spacetime directly, with no need for matter as an extraneous carrier "on top of" spacetime; (2) spacetime is more fundamental than (ontologically prior to) matter. In the present paper, we revisit a recent argument in favour of super-substantivalism, based on General Relativity. A critique is offered that highlights the difference between (various accounts of) fundamentality and (various forms of) ontological dependence. This affords a metaphysically more perspicuous view of what super-substantivalism's tenets actually assert, and how it may be defended. We tentatively propose a re-formulation of the original argument that not only seems to apply to all classical physics, but also chimes with a standard interpretation of spacetime theories in the philosophy of physics.
ABSTRACT
This paper discusses some philosophical aspects related to the recent publication of the experimental results of the 2017 black hole experiment, namely the first image of the supermassive black hole at the center of galaxy M87. In this paper I present a philosophical analysis of the 2017 Event Horizon Telescope (EHT) black hole experiment. I first present Hacking's philosophy of experimentation: Hacking gives a taxonomy of the elements of laboratory science and distinguishes a list of such elements. I show that the EHT experiment conforms to major elements from Hacking's list. I then describe, with the help of Galison's Philosophy of the Shadow, how the EHT Collaboration created the famous black hole image. Galison outlines three stages for the reconstruction of the black hole image: Socio-Epistemology, Mechanical Objectivity, after which there is an additional Socio-Epistemology stage. I subsequently present my own interpretation of the reconstruction of the black hole image and discuss model fitting to data. I suggest that the main method used by the EHT Collaboration to assure trust in the results of the EHT experiment is what philosophers call the Argument from Coincidence, and I show that using this method for that purpose is problematic. I present two versions of the Argument from Coincidence, Hacking's Coincidence and Cartwright's Reproducibility, by which I analyse the EHT experiment. The same estimation of the mass of the black hole is reproduced in four different procedures, from which the EHT Collaboration concludes that 'the value we have converged upon is robust'. I analyse the mass measurements of the black hole with the help of Cartwright's notion of robustness. I show that the EHT Collaboration construes Coincidence/Reproducibility as Technological Agnosticism, and I contrast this interpretation with van Fraassen's scientific agnosticism.