Random Articles (Page 81)

Take a deep look into what people are curious about.

πŸ”— Aleatoric Music

πŸ”— Music theory πŸ”— Classical music πŸ”— Music/Music genres

Aleatoric music (also aleatory music or chance music; from the Latin word alea, meaning "dice") is music in which some element of the composition is left to chance, and/or some primary element of a composed work's realization is left to the determination of its performer(s). The term is most often associated with procedures in which the chance element involves a relatively limited number of possibilities.

The term became known to European composers through lectures by acoustician Werner Meyer-Eppler at the Darmstadt International Summer Courses for New Music in the early 1950s. According to his definition, "a process is said to be aleatoric ... if its course is determined in general but depends on chance in detail". Through a confusion of Meyer-Eppler's German terms Aleatorik (noun) and aleatorisch (adjective), his translator created a new English word, "aleatoric" (rather than using the existing English adjective "aleatory"), which quickly became fashionable and has persisted. More recently, the variant "aleatoriality" has been introduced.
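Meyer-Eppler's definition, a course "determined in general" that "depends on chance in detail", can be illustrated with a toy sketch. This is a hypothetical example, not drawn from any actual score: the overall form is fixed by the composer, while each section's realization is left to a chance choice among a limited set of possibilities.

```python
import random

# Toy aleatoric "score" (hypothetical fragment names): the overall
# form A-B-A is fixed, but each section is realized by a chance
# choice among a small number of composed fragments.
FRAGMENTS = {
    "A": ["a1", "a2", "a3"],
    "B": ["b1", "b2"],
}
FORM = ["A", "B", "A"]  # determined in general ...

def realize(form, fragments, rng=random):
    # ... but decided by chance in detail, per performance
    return [rng.choice(fragments[section]) for section in form]

performance = realize(FORM, FRAGMENTS)
print(performance)  # e.g. ['a2', 'b1', 'a3'], differing at each performance
```

The limited pool of fragments per section mirrors the article's point that the chance element usually involves "a relatively limited number of possibilities".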

πŸ”— Artificial Intelligence Act (EU Law)

πŸ”— International relations πŸ”— Technology πŸ”— Internet πŸ”— Computing πŸ”— Computer science πŸ”— Law πŸ”— Business πŸ”— Politics πŸ”— Robotics πŸ”— International relations/International law πŸ”— Futures studies πŸ”— European Union πŸ”— Science Policy πŸ”— Artificial Intelligence

The Artificial Intelligence Act (AI Act) is a European Union regulation concerning artificial intelligence (AI).

It establishes a common regulatory and legal framework for AI in the European Union (EU). Proposed by the European Commission on 21 April 2021, and then passed in the European Parliament on 13 March 2024, it was unanimously approved by the Council of the European Union on 21 May 2024. The Act creates a European Artificial Intelligence Board to promote national cooperation and ensure compliance with the regulation. Like the EU's General Data Protection Regulation, the Act can apply extraterritorially to providers from outside the EU, if they have users within the EU.

It covers all types of AI in a broad range of sectors; exceptions include AI systems used solely for military, national security, research and non-professional purposes. As a piece of product regulation, it does not confer rights on individuals but regulates the providers of AI systems and entities using AI in a professional context. The draft Act was revised following the rise in popularity of generative AI systems, such as ChatGPT, whose general-purpose capabilities did not fit the main framework. More restrictive regulations are planned for powerful generative AI systems with systemic impact.

The Act classifies AI applications by their risk of causing harm. There are four levels – unacceptable, high, limited, minimal – plus an additional category for general-purpose AI. Applications with unacceptable risks are banned. High-risk applications must comply with security, transparency and quality obligations and undergo conformity assessments. Limited-risk applications only have transparency obligations and those representing minimal risks are not regulated. For general-purpose AI, transparency requirements are imposed, with additional evaluations when there are high risks.
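The four-tier risk logic described above can be sketched as a simple lookup. The tier names follow the text; the obligation strings are paraphrases of this summary, not the legal text, and the function name is an illustrative assumption.

```python
# Sketch of the AI Act's risk tiers as summarized above.
# Obligation strings paraphrase this excerpt, not the regulation itself.
OBLIGATIONS = {
    "unacceptable": "banned",
    "high": "security, transparency and quality obligations; conformity assessment",
    "limited": "transparency obligations only",
    "minimal": "not regulated",
    "general-purpose": "transparency requirements; extra evaluations if high risk",
}

def obligations_for(tier: str) -> str:
    """Return the summarized obligations for a given risk tier."""
    return OBLIGATIONS[tier]

print(obligations_for("limited"))  # transparency obligations only
```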

La Quadrature du Net (LQDN) stated that the adopted version of the AI Act would be ineffective, arguing that the role of self-regulation and exemptions in the act rendered it "largely incapable of standing in the way of the social, political and environmental damage linked to the proliferation of AI".

πŸ”— History of the Monte Carlo method

πŸ”— Computing πŸ”— Computer science πŸ”— Mathematics πŸ”— Physics πŸ”— Statistics

Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches. Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
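A minimal example of the numerical-integration use case is the classic estimate of π: sample points uniformly in the unit square and count the fraction that land inside the quarter circle.

```python
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    """Monte Carlo estimate of pi: the fraction of uniform points in the
    unit square falling inside the quarter circle approaches pi/4."""
    rng = random.Random(seed)
    inside = sum(
        1
        for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # approaches 3.14159... as n_samples grows
```

The error of such an estimate shrinks like 1/√n, which is why the method shines when deterministic quadrature is impractical, e.g. in high dimensions.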

In physics-related problems, Monte Carlo methods are useful for simulating systems with many coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and cellular structures (see cellular Potts model, interacting particle systems, McKean–Vlasov processes, kinetic models of gases).

Other examples include modeling phenomena with significant uncertainty in inputs such as the calculation of risk in business and, in mathematics, evaluation of multidimensional definite integrals with complicated boundary conditions. In application to systems engineering problems (space, oil exploration, aircraft design, etc.), Monte Carlo–based predictions of failure, cost overruns and schedule overruns are routinely better than human intuition or alternative "soft" methods.

In principle, Monte Carlo methods can be used to solve any problem having a probabilistic interpretation. By the law of large numbers, integrals described by the expected value of some random variable can be approximated by taking the empirical mean (a.k.a. the sample mean) of independent samples of the variable. When the probability distribution of the variable is parameterized, mathematicians often use a Markov chain Monte Carlo (MCMC) sampler. The central idea is to design a judicious Markov chain model with a prescribed stationary probability distribution. That is, in the limit, the samples being generated by the MCMC method will be samples from the desired (target) distribution. By the ergodic theorem, the stationary distribution is approximated by the empirical measures of the random states of the MCMC sampler.
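The MCMC idea above can be shown with the simplest such sampler, a random-walk Metropolis chain. This is a generic textbook sketch, here targeting the standard normal distribution: proposals are accepted with probability min(1, target(proposal)/target(current)), which makes N(0, 1) the stationary distribution.

```python
import math
import random

def metropolis_normal(n_samples: int, step: float = 1.0, seed: int = 0):
    """Random-walk Metropolis sampler whose stationary (target)
    distribution is the standard normal N(0, 1)."""
    rng = random.Random(seed)

    def log_target(t: float) -> float:
        return -0.5 * t * t  # log density of N(0, 1), up to a constant

    x, samples = 0.0, []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)
        # Accept with probability min(1, target(proposal) / target(x)),
        # done in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

chain = metropolis_normal(50_000)
mean = sum(chain) / len(chain)
second_moment = sum(t * t for t in chain) / len(chain)
print(mean, second_moment)  # both approach 0.0 and 1.0 by the ergodic theorem
```

As the excerpt says, the empirical measure of the chain's states approximates the target: here the sample mean and second moment converge to 0 and 1.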

In other problems, the objective is generating draws from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability distributions can always be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states (see McKean–Vlasov processes, nonlinear filtering equation). In other instances we are given a flow of probability distributions with an increasing level of sampling complexity (path spaces models with an increasing time horizon, Boltzmann–Gibbs measures associated with decreasing temperature parameters, and many others). These models can also be seen as the evolution of the law of the random states of a nonlinear Markov chain. A natural way to simulate these sophisticated nonlinear Markov processes is to sample multiple copies of the process, replacing in the evolution equation the unknown distributions of the random states by the sampled empirical measures. In contrast with traditional Monte Carlo and MCMC methodologies, these mean-field particle techniques rely on sequential interacting samples. The terminology mean field reflects the fact that each of the samples (a.k.a. particles, individuals, walkers, agents, creatures, or phenotypes) interacts with the empirical measures of the process. When the size of the system tends to infinity, these random empirical measures converge to the deterministic distribution of the random states of the nonlinear Markov chain, so that the statistical interaction between particles vanishes.

Despite its conceptual and algorithmic simplicity, the computational cost associated with a Monte Carlo simulation can be staggeringly high. In general the method requires many samples to get a good approximation, which may incur an arbitrarily large total runtime if the processing time of a single sample is high. Although this is a severe limitation in very complex problems, the embarrassingly parallel nature of the algorithm allows this large cost to be reduced (perhaps to a feasible level) through parallel computing strategies in local processors, clusters, cloud computing, GPU, FPGA, etc.
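The "embarrassingly parallel" structure can be sketched as follows: each worker draws its own independent samples with its own seed, and the partial counts are simply summed at the end. A thread pool stands in here for the clusters, GPUs and FPGAs named above; CPU-bound Python code would in practice use processes or native code instead.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def chunk_hits(args) -> int:
    """One worker's share of samples, fully independent of other chunks."""
    n, seed = args
    rng = random.Random(seed)
    return sum(
        1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )

def parallel_pi(total_samples: int = 100_000, workers: int = 4) -> float:
    """Estimate pi by splitting the sampling across independent workers
    and merging the counts; no communication is needed until the end."""
    n = total_samples // workers
    with ThreadPoolExecutor(max_workers=workers) as ex:
        hits = list(ex.map(chunk_hits, [(n, seed) for seed in range(workers)]))
    return 4.0 * sum(hits) / (n * workers)

print(parallel_pi())
```

Because the chunks never interact, the speedup is limited only by the available hardware, which is exactly why Monte Carlo maps so well onto clusters and GPUs.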

πŸ”— Fractional calculus

πŸ”— Mathematics

Fractional calculus is a branch of mathematical analysis that studies the several different possibilities of defining real number powers or complex number powers of the differentiation operator D

$$ Df(x) = \frac{d}{dx}f(x), $$

and of the integration operator J

$$ Jf(x) = \int_0^x f(s)\,ds, $$

and developing a calculus for such operators generalizing the classical one.

In this context, the term powers refers to iterative application of a linear operator D to a function f, that is, repeatedly composing D with itself, as in D²(f) = (D ∘ D)(f) = D(D(f)).

For example, one may ask for a meaningful interpretation of:

$$ \sqrt{D} = D^{1/2} $$

as an analogue of the functional square root for the differentiation operator, that is, an expression for some linear operator that when applied twice to any function will have the same effect as differentiation. More generally, one can look at the question of defining a linear functional

$$ D^a $$

for every real number a in such a way that, when a takes an integer value n ∈ β„€, it coincides with the usual n-fold differentiation D if n > 0, and with the βˆ’nth power of J when n < 0.
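As a standard worked example (a classical Riemann–Liouville computation with lower limit 0, not spelled out in this excerpt), the half-derivative of f(x) = x, applied twice, reproduces the ordinary derivative:

```latex
% Half-derivative of f(x) = x, then applied a second time:
D^{1/2} x = \frac{\Gamma(2)}{\Gamma(3/2)}\, x^{1/2} = \frac{2\sqrt{x}}{\sqrt{\pi}},
\qquad
D^{1/2}\!\left(\frac{2\sqrt{x}}{\sqrt{\pi}}\right)
  = \frac{2}{\sqrt{\pi}}\,\Gamma\!\left(\tfrac{3}{2}\right) x^{0} = 1 = \frac{d}{dx}\,x .
```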

One of the motivations behind the introduction and study of these sorts of extensions of the differentiation operator D is that the sets of operator powers { D^a | a ∈ ℝ } defined in this way are continuous semigroups with parameter a, of which the original discrete semigroup of { D^n | n ∈ β„€ } for integer n is a denumerable subgroup: since continuous semigroups have a well-developed mathematical theory, they can be applied to other branches of mathematics.
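The semigroup property can be checked numerically on monomials, for which the Riemann–Liouville fractional derivative has a standard closed form (the formula is classical; the function name is illustrative):

```python
import math

def frac_deriv_monomial(k: float, a: float, x: float) -> float:
    """Riemann-Liouville fractional derivative of x**k of order a,
    via the standard formula D^a x^k = Gamma(k+1)/Gamma(k-a+1) * x**(k-a)."""
    return math.gamma(k + 1) / math.gamma(k - a + 1) * x ** (k - a)

x = 2.0
# D^{1/2} x = Gamma(2)/Gamma(3/2) * x**0.5 = 2*sqrt(x/pi):
print(frac_deriv_monomial(1, 0.5, x))
# Applying D^{1/2} again to that coefficient times x**0.5 recovers
# the ordinary derivative Dx = 1 -- the semigroup property:
coeff = math.gamma(2) / math.gamma(1.5)
print(coeff * frac_deriv_monomial(0.5, 0.5, x))  # 1.0 up to rounding
```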

Fractional differential equations, also known as extraordinary differential equations, are a generalization of differential equations through the application of fractional calculus.

πŸ”— The Theory of American Decline

πŸ”— United States πŸ”— Economics πŸ”— Politics

American decline is a term used by various analysts to describe the diminishing power of the United States geopolitically, militarily, financially, economically, socially, and in health and the environment. There has been a debate between declinists, those who believe America is in decline, and exceptionalists, those who feel America is immortal.

Some analysts say that the U.S. was in decline long before Donald Trump ran for the presidency, and that he was merely the first presidential candidate to campaign on the idea that the U.S. was in decline. Others suggest the decline either stems from or has accelerated with Trump's foreign policy and the "country's ongoing withdrawal from the global arena." According to Noam Chomsky, America's decline started at the end of World War II, and he dismisses the "remarkable rhetoric of the several years of triumphalism in the 1990s" as "mostly self-delusion".

Gallup pollsters reported that worldwide approval of U.S. leadership plunged from 48% in 2016 to a record low of 30% in 2018, in part due to the increasingly isolationist stances of Donald Trump. This drop places the U.S. a notch below China's 31% and leaves Germany as the most popular power, with an approval rating of 41%. Michael Hudson describes the financial pillar as paramount, identifying bank-created money bearing compound interest, and the built-in refusal to forgive debts, as the fatal flaw.

China's challenge to the U.S. for global predominance is the core issue in the debate over American decline.

πŸ”— Honeywell 316

πŸ”— Computing πŸ”— Computing/Computer hardware

The Honeywell 316 was a popular 16-bit minicomputer built by Honeywell starting in 1969. It is part of the Series 16, which includes the Models 116 (1965, discrete logic), 316 (1969), 416 (1966), 516 (1966) and DDP-716 (1969). They were commonly used for data acquisition and control, remote message concentration, clinical laboratory systems, remote job entry and time-sharing. The Series 16 computers are all based on the DDP-116 designed by Gardner Hendrie at Computer Control Company, Inc. (3C) in 1964.

The 516 and later the 316 were used as Interface Message Processors (IMP) for the American ARPANET and the British NPL Network.

πŸ”— Herd Immunity

πŸ”— Medicine πŸ”— Statistics πŸ”— Microbiology πŸ”— Game theory

Herd immunity (also called herd effect, community immunity, population immunity, or social immunity) is a form of indirect protection from infectious disease that occurs when a large percentage of a population has become immune to an infection, whether through previous infections or vaccination, thereby providing a measure of protection for individuals who are not immune. In a population in which a large proportion of individuals possess immunity, such individuals are unlikely to contribute to disease transmission, so chains of infection are more likely to be disrupted, which either stops or slows the spread of disease. The greater the proportion of immune individuals in a community, the smaller the probability that non-immune individuals will come into contact with an infectious individual, helping to shield non-immune individuals from infection.

Individuals can become immune by recovering from an earlier infection or through vaccination. Some individuals cannot become immune due to medical reasons, such as an immunodeficiency or immunosuppression, and in this group herd immunity is a crucial method of protection. Once a certain threshold has been reached, herd immunity gradually eliminates a disease from a population. This elimination, if achieved worldwide, may result in the permanent reduction in the number of infections to zero, called eradication. Herd immunity created via vaccination contributed to the eventual eradication of smallpox in 1977 and has contributed to the reduction of the frequencies of other diseases. Herd immunity does not apply to all diseases, just those that are contagious, meaning that they can be transmitted from one individual to another. Tetanus, for example, is infectious but not contagious, so herd immunity does not apply.
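The threshold mentioned above is commonly approximated by the standard epidemiological formula 1 βˆ’ 1/Rβ‚€, where Rβ‚€ is the basic reproduction number (a result not spelled out in this excerpt). A minimal sketch:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Standard approximation: the fraction of a population that must be
    immune so that each case infects fewer than one susceptible person
    on average, given basic reproduction number r0."""
    return 1.0 - 1.0 / r0

# Illustrative R0 values, assumed for the example rather than taken
# from the text above:
for disease, r0 in [("flu-like", 1.5), ("smallpox-like", 5.0), ("measles-like", 15.0)]:
    print(disease, round(herd_immunity_threshold(r0) * 100, 1), "%")
```

The formula explains why highly contagious diseases such as measles demand very high vaccination coverage before herd immunity takes hold.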

The term "herd immunity" was first used in 1923. It was recognized as a naturally occurring phenomenon in the 1930s when it was observed that after a significant number of children had become immune to measles, the number of new infections temporarily decreased, including among susceptible children. Mass vaccination to induce herd immunity has since become common and proved successful in preventing the spread of many infectious diseases. Opposition to vaccination has posed a challenge to herd immunity, allowing preventable diseases to persist in or return to communities that have inadequate vaccination rates.

πŸ”— Peak car

πŸ”— Climate change πŸ”— Environment πŸ”— Transport πŸ”— Urban studies and planning

Peak car (also peak car use or peak travel) is a hypothesis that motor vehicle distance traveled per capita, predominantly by private car, has peaked and will now fall in a sustained manner. The theory was developed as an alternative to the prevailing market saturation model, which suggested that car use would saturate and then remain reasonably constant, or to GDP-based theories which predict that traffic will increase again as the economy improves, linking recent traffic reductions to the Great Recession of 2008.

The theory was proposed following reductions in driving that have now been observed in Australia, Belgium, France, Germany, Iceland, Japan (early 1990s), New Zealand, Sweden, the United Kingdom (many cities from about 1994) and the United States. A study by Volpe Transportation in 2013 noted that average miles driven per individual in the United States declined from 900 miles (1,400 km) per month in 2004 to 820 miles (1,320 km) in July 2012, and that the decline had continued despite the recent upturn in the US economy.

A number of academics have written in support of the theory, including Phil Goodwin, formerly Director of the transport research groups at Oxford University and UCL, and David Metz, a former Chief Scientist of the UK Department of Transport. The theory is disputed by the UK Department for Transport, which predicts that road traffic in the United Kingdom will grow by 50% by 2036, and by Professor Stephen Glaister, Director of the RAC Foundation, both of whom say traffic will start increasing again as the economy improves. Unlike peak oil, a theory based on a reduction in the ability to extract oil due to resource depletion, peak car is attributed to more complex and less well understood causes.

πŸ”— Sortition

πŸ”— Greece πŸ”— Politics πŸ”— Elections and Referendums

In governance, sortition (also known as selection by lot, allotment, demarchy, or stochocracy) is the selection of political officials as a random sample from a larger pool of candidates, used to fill individual posts or, more usually in its modern applications, to fill collegiate chambers. The system intends to ensure that all competent and interested parties have an equal chance of holding public office. It also minimizes factionalism: there is no point making promises to win over key constituencies if officials are chosen by lot, whereas elections, by contrast, foster it. In ancient Athenian democracy, sortition was the traditional and primary method for appointing political officials, and its use was regarded as a principal characteristic of democracy.

Today, sortition is commonly used to select prospective jurors in common law-based legal systems and is sometimes used in forming citizen groups with political advisory power (citizens' juries or citizens' assemblies).
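The selection procedure itself is just uniform random sampling without replacement, as in this sketch (the pool and seat count are hypothetical):

```python
import random

def sortition(pool, seats, seed=None):
    """Fill a chamber by lot: draw `seats` distinct members uniformly at
    random from `pool`, giving every candidate an equal chance."""
    rng = random.Random(seed)
    return rng.sample(pool, seats)

# Hypothetical candidate pool of 500 eligible citizens, 25 seats:
candidates = [f"citizen_{i}" for i in range(500)]
assembly = sortition(candidates, seats=25, seed=42)
print(len(assembly))  # 25 distinct members
```

A fixed seed makes the draw reproducible for auditing; a real lottery would of course use a publicly verifiable source of randomness instead.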
