# Frequency Format Hypothesis

The frequency format hypothesis is the idea that the brain understands and processes information better when it is presented in frequency formats rather than numerical or probability formats. According to the hypothesis, presenting information as "1 in 5 people" rather than "20%" leads to better comprehension. The idea was proposed by the German psychologist Gerd Gigerenzer, after compiling and comparing data collected between 1976 and 1997.

# 1999 Loomis Truck Robbery

California Crime

The 1999 Loomis truck robbery was a robbery of a Loomis, Fargo & Co. semi-trailer truck on March 24, 1999, as it transported money from Sacramento, California to San Francisco. At some point during the transit, one or more robbers boarded the truck, cut a hole in the roof, removed approximately $2.3 million, and exited the truck with the money, completely evading detection by the truck's driver and guards. The robbery was not discovered until after the truck arrived at its destination. No suspects were ever identified by authorities, and the robbery is now a cold case. Even the exact tools and methods used by the robber or robbers were never conclusively determined.

# Wittgenstein's Ladder

Philosophy Education Philosophy/Analytic philosophy

In philosophy, Wittgenstein's ladder is a metaphor set out by Ludwig Wittgenstein about learning. In what may be a deliberate reference to Søren Kierkegaard's Concluding Unscientific Postscript to Philosophical Fragments, the penultimate proposition of the Tractatus Logico-Philosophicus (translated from the original German) reads:

6.54

My propositions serve as elucidations in the following way: anyone who understands me eventually recognizes them as nonsensical, when he has used them—as steps—to climb beyond them. (He must, so to speak, throw away the ladder after he has climbed up it.)

He must transcend these propositions, and then he will see the world aright.

Given the problematic at work in the preceding sections of the Tractatus, this passage suggests that if a reader understands Wittgenstein's aims in the text, then the reader will recognize the propositions they have just read as nonsense. From propositions 6.4–6.54, the Tractatus shifts its focus from primarily logical considerations to what may be considered more traditionally philosophical topics (God, ethics, meta-ethics, death, the will) and, less traditionally along with these, the mystical. The philosophy presented in the Tractatus attempts to demonstrate just what the limits of language are, and what it is to run up against them. For Wittgenstein, what can be said comprises the propositions of natural science; the nonsensical, or unsayable, comprises the subjects traditionally associated with philosophy, such as ethics and metaphysics.

Curiously, the penultimate proposition of the Tractatus, proposition 6.54, states that once one understands the propositions of the Tractatus, one will recognize that they are nonsensical (unsinnig) and must be thrown away. Proposition 6.54, then, presents a difficult interpretative problem. If the so-called picture theory of language is correct, and it is impossible to represent logical form, then the theory, by trying to say something about how language and the world must be for there to be meaning, is self-undermining. That is, the picture theory of language itself requires that something be said about the logical form sentences must share with reality for meaning to be possible, which is precisely what the picture theory precludes. It would appear, then, that the metaphysics and the philosophy of language endorsed by the Tractatus give rise to a paradox: for the Tractatus to be true, it must be nonsense by self-application; but for this self-application to render the propositions of the Tractatus nonsense (in the Tractarian sense), the Tractatus must be true.

Other philosophers before Wittgenstein, including Zhuang Zhou, Schopenhauer and Fritz Mauthner, had used a similar metaphor.

In his notes of 1930 Wittgenstein returns to the image of a ladder with a different perspective:

I might say: if the place I want to get could only be reached by way of a ladder, I would give up trying to get there. For the place I really have to get to is a place I must already be at now.
Anything that I might reach by climbing a ladder does not interest me.

# Cadillac Ranch

United States United States/Texas U.S. Roads/U.S. Route 66

Cadillac Ranch is a public art installation and sculpture in Amarillo, Texas, US. It was created in 1974 by Chip Lord, Hudson Marquez and Doug Michels, who were a part of the art group Ant Farm.

The installation consists of ten Cadillacs (1949–1963) buried nose-first in the ground. The cars were older running, used, or junk cars, together spanning the successive generations of the car line and the defining evolution of their tailfins. The cars are inclined at the same angle as the pyramids at Giza.

# Loess Regression

Mathematics Statistics

Local regression or local polynomial regression, also known as moving regression, is a generalization of the moving average and of polynomial regression. Its most common methods, initially developed for scatterplot smoothing, are LOESS (locally estimated scatterplot smoothing) and LOWESS (locally weighted scatterplot smoothing), both pronounced /ˈloʊɛs/. They are two strongly related non-parametric regression methods that combine multiple regression models in a k-nearest-neighbor-based meta-model. In some fields, LOESS is known and commonly referred to as the Savitzky–Golay filter (proposed 15 years before LOESS).

LOESS and LOWESS thus build on "classical" methods, such as linear and nonlinear least squares regression. They address situations in which the classical procedures do not perform well or cannot be effectively applied without undue labor. LOESS combines much of the simplicity of linear least squares regression with the flexibility of nonlinear regression. It does this by fitting simple models to localized subsets of the data to build up a function that describes the deterministic part of the variation in the data, point by point. In fact, one of the chief attractions of this method is that the data analyst is not required to specify a global function of any form to fit a model to the data, only to fit segments of the data.

The trade-off for these features is increased computation. Because it is so computationally intensive, LOESS would have been practically impossible to use in the era when least squares regression was being developed. Most other modern methods for process modeling are similar to LOESS in this respect. These methods have been consciously designed to use our current computational ability to the fullest possible advantage to achieve goals not easily achieved by traditional approaches.

A smooth curve through a set of data points obtained with this statistical technique is called a loess curve, particularly when each smoothed value is given by a weighted quadratic least squares regression over the span of values of the y-axis scattergram criterion variable. When each smoothed value is given by a weighted linear least squares regression over the span, this is known as a lowess curve; however, some authorities treat lowess and loess as synonyms.
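The point-by-point local fitting described above can be sketched in a few lines. This is a minimal illustration rather than a production implementation: it uses the tricube weight function and a plain weighted linear fit over each point's nearest neighbors, and omits the robustness iterations of the full LOWESS procedure (the `frac` parameter and function names are choices made for this sketch).

```python
import numpy as np

def lowess(x, y, frac=0.5):
    """Minimal LOWESS sketch: a weighted linear fit over each point's neighborhood."""
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))            # points per local neighborhood
    smoothed = np.empty(n)
    for i, x0 in enumerate(x):
        d = np.abs(x - x0)
        idx = np.argsort(d)[:k]                   # k nearest neighbors of x0
        dmax = d[idx].max() or 1.0                # guard against a zero radius
        w = (1.0 - (d[idx] / dmax) ** 3) ** 3     # tricube weights in [0, 1]
        # np.polyfit applies weights to the residuals, so pass sqrt(w)
        slope, intercept = np.polyfit(x[idx], y[idx], deg=1, w=np.sqrt(w))
        smoothed[i] = slope * x0 + intercept      # evaluate the local fit at x0
    return smoothed
```

Because each smoothed value comes from its own weighted fit, no global functional form is ever specified, which is exactly the attraction noted above.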

# Barbados 4–2 Grenada

On January 27, 1994, the national football teams of Barbados and Grenada played against each other as part of the qualification round for the 1994 Caribbean Cup. Barbados won 4–2 in extra time. In the last minutes of regular time, both teams attempted to score own goals. The result has been described as "one of the strangest matches ever".

The outcome of the match was criticised by Grenadian coach James Clarkson, who felt that his team had been unfairly prevented from advancing to the finals. However, since the unusual tournament rules had not been broken, FIFA cleared Barbados of any wrongdoing.

# Miller test

Law Sexology and sexuality Pornography

The Miller test, also called the three-prong obscenity test, is the United States Supreme Court's test for determining whether speech or expression can be labeled obscene, in which case it is not protected by the First Amendment to the United States Constitution and can be prohibited.

# Spermaceti

Cetaceans Energy

Spermaceti is a waxy substance found in the head cavities of the sperm whale (and, in smaller quantities, in the oils of other whales). Spermaceti is created in the spermaceti organ inside the whale's head. This organ may contain as much as 1,900 litres (500 US gal) of spermaceti. It has been extracted by whalers since the 17th century for human use in cosmetics, textiles, and candles.

Theories for the spermaceti organ's biological function suggest that it may control buoyancy, may act as a focusing apparatus for the whale's sense of echolocation, or possibly both. There is no conclusive evidence for either theory. The buoyancy theory holds that the sperm whale is capable of heating the spermaceti, lowering its density and thus allowing the whale to float; for the whale to sink again, it must take water into its blowhole, which cools the spermaceti into a denser solid. This claim has been called into question by recent research, which indicates a lack of biological structures that could support such heat exchange, as well as the fact that the change in density is too small to be meaningful until the organ grows to a huge size. Measuring the proportion of wax esters retained in a harvested sperm whale accurately indicated the age and future life expectancy of a given individual: the proportion of wax esters in the spermaceti organ increases with the age of the whale, from 38–51% in calves to 58–87% in adult females and 71–94% in adult males.

Spermaceti wax is extracted from sperm oil by crystallisation at 6 °C (43 °F), when treated by pressure and a chemical solution of caustic alkali. Spermaceti forms brilliant white crystals that are hard but oily to the touch, and are devoid of taste or smell, making it very useful as an ingredient in cosmetics, leatherworking, and lubricants. The substance was also used in making candles of a standard photometric value, in the dressing of fabrics, and as a pharmaceutical excipient, especially in cerates and ointments.

The whaling industry of the 17th and 18th centuries developed to find, harvest, and refine the contents of the head of a sperm whale. Crews seeking spermaceti routinely left on three-year tours across several oceans. Cetacean lamp oil was a commodity that created many maritime fortunes. The light produced by a single pure spermaceti candle became the standard measurement of "candlepower" for the next century. Candlepower, a photometric unit defined in the United Kingdom's Metropolitan Gas Act 1860 and adopted at the International Electrotechnical Conference of 1883, was based on the light produced by a pure spermaceti candle.

# Perturbation Theory

Physics

In mathematics and applied mathematics, perturbation theory comprises methods for finding an approximate solution to a problem, by starting from the exact solution of a related, simpler problem. A critical feature of the technique is a middle step that breaks the problem into "solvable" and "perturbative" parts. In perturbation theory, the solution is expressed as a power series in a small parameter ${\displaystyle \varepsilon }$. The first term is the known solution to the solvable problem. Successive terms in the series at higher powers of ${\displaystyle \varepsilon }$ usually become smaller. An approximate 'perturbation solution' is obtained by truncating the series, usually by keeping only the first two terms, the solution to the known problem and the 'first order' perturbation correction.
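As a concrete illustration (the example equation here is our own, not taken from the text), consider the positive root of x² + εx − 1 = 0 near x = 1. Expanding x = x₀ + εx₁ + ε²x₂ + … and matching powers of ε gives x ≈ 1 − ε/2 + ε²/8. A short script compares this truncated series with the exact root:

```python
import math

def exact_root(eps):
    # positive root of x**2 + eps*x - 1 = 0, from the quadratic formula
    return -eps / 2 + math.sqrt(1 + eps**2 / 4)

def perturbative_root(eps):
    # series solution x = x0 + eps*x1 + eps**2*x2, found by matching powers of eps:
    # order eps**0: x0**2 - 1 = 0            -> x0 = 1   (the solvable problem)
    # order eps**1: 2*x0*x1 + x0 = 0         -> x1 = -1/2
    # order eps**2: x1**2 + 2*x0*x2 + x1 = 0 -> x2 = 1/8
    return 1 - eps / 2 + eps**2 / 8

eps = 0.1
print(exact_root(eps), perturbative_root(eps))
```

For ε = 0.1 the two values agree to roughly one part in a million, illustrating how quickly a truncated series converges when the perturbation parameter is small.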

Perturbation theory is used in a wide range of fields, and reaches its most sophisticated and advanced forms in quantum field theory. Perturbation theory (quantum mechanics) describes the use of this method in quantum mechanics. The field in general remains actively and heavily researched across multiple disciplines.

# Jaccard Index

Computer science Statistics

The Jaccard index, also known as the Jaccard similarity coefficient, is a statistic used for gauging the similarity and diversity of sample sets. It was developed by Grove Karl Gilbert in 1884 as his ratio of verification (v) and is now frequently referred to as the Critical Success Index in meteorology. It was later developed independently by Paul Jaccard, who originally gave it the French name coefficient de communauté, and was formulated again independently by T. Tanimoto, so the names Tanimoto index and Tanimoto coefficient are also used in some fields. All of these names refer to the same statistic: the ratio of Intersection over Union. The Jaccard coefficient measures similarity between finite sample sets, and is defined as the size of the intersection divided by the size of the union of the sample sets:

${\displaystyle J(A,B)={{|A\cap B|} \over {|A\cup B|}}={{|A\cap B|} \over {|A|+|B|-|A\cap B|}}.}$

Note that by design, ${\displaystyle 0\leq J(A,B)\leq 1.}$ If A intersection B is empty, then J(A,B) = 0. The Jaccard coefficient is widely used in computer science, ecology, genomics, and other sciences, where binary or binarized data are used. Both the exact solution and approximation methods are available for hypothesis testing with the Jaccard coefficient.
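The definition translates directly into code on Python sets. A minimal sketch (returning 1.0 when both sets are empty is a convention chosen here, since the formula is 0/0 in that case):

```python
def jaccard(a: set, b: set) -> float:
    # |A ∩ B| / |A ∪ B|; for two empty sets the ratio is 0/0, so return 1.0 by convention
    union = a | b
    if not union:
        return 1.0
    return len(a & b) / len(union)

def jaccard_distance(a: set, b: set) -> float:
    # d_J = 1 - J, the complementary dissimilarity
    return 1.0 - jaccard(a, b)

print(jaccard({1, 2, 3}, {2, 3, 4}))  # 0.5: intersection has 2 elements, union has 4
```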

Jaccard similarity also applies to bags, i.e., multisets. The formula is similar, but the symbols denote bag intersection and bag sum (not union); the maximum value is 1/2:

${\displaystyle J(A,B)={{|A\cap B|} \over {|A\uplus B|}}={{|A\cap B|} \over {|A|+|B|}}.}$
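With Python's `collections.Counter`, bag intersection (elementwise minimum of counts) and bag sum come built in, so the multiset variant is a short sketch:

```python
from collections import Counter

def jaccard_bags(a: Counter, b: Counter) -> float:
    # Counter '&' is the bag intersection (min of counts); '+' is the bag sum.
    inter = sum((a & b).values())
    total = sum((a + b).values())
    return inter / total if total else 0.0

# even identical bags only reach the maximum value of 1/2,
# because the denominator is the bag sum rather than the union
print(jaccard_bags(Counter("aab"), Counter("aab")))  # 0.5
```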

The Jaccard distance, which measures dissimilarity between sample sets, is complementary to the Jaccard coefficient and is obtained by subtracting the Jaccard coefficient from 1, or, equivalently, by dividing the difference of the sizes of the union and the intersection of two sets by the size of the union:

${\displaystyle d_{J}(A,B)=1-J(A,B)={{|A\cup B|-|A\cap B|} \over |A\cup B|}.}$

An alternative interpretation of the Jaccard distance is as the ratio of the size of the symmetric difference ${\displaystyle A\triangle B=(A\cup B)-(A\cap B)}$ to the union. Jaccard distance is commonly used to calculate an n × n matrix for clustering and multidimensional scaling of n sample sets.

This distance is a metric on the collection of all finite sets.

There is also a version of the Jaccard distance for measures, including probability measures. If ${\displaystyle \mu }$ is a measure on a measurable space ${\displaystyle X}$, then we define the Jaccard coefficient by

${\displaystyle J_{\mu }(A,B)={{\mu (A\cap B)} \over {\mu (A\cup B)}},}$

and the Jaccard distance by

${\displaystyle d_{\mu }(A,B)=1-J_{\mu }(A,B)={{\mu (A\triangle B)} \over {\mu (A\cup B)}}.}$

Care must be taken if ${\displaystyle \mu (A\cup B)=0}$ or ${\displaystyle \infty }$, since these formulas are not well defined in these cases.

The MinHash min-wise independent permutations locality sensitive hashing scheme may be used to efficiently compute an accurate estimate of the Jaccard similarity coefficient of pairs of sets, where each set is represented by a constant-sized signature derived from the minimum values of a hash function.
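A sketch of the idea follows; the salted `blake2b` hash standing in for a family of random permutations is an illustrative choice, not the canonical MinHash construction. Each set is reduced to a constant-sized signature of per-seed minimum hash values, and because the probability that two sets share a minimum under a random hash equals their Jaccard similarity, the fraction of matching signature components estimates the coefficient.

```python
import hashlib

def _h(seed: int, x) -> int:
    # a salted 64-bit hash; each seed plays the role of one random permutation
    digest = hashlib.blake2b(f"{seed}:{x}".encode(), digest_size=8).digest()
    return int.from_bytes(digest, "big")

def minhash_signature(items, num_hashes=256):
    # one minimum hash value per seed; signature size is constant in len(items)
    return [min(_h(seed, x) for x in items) for seed in range(num_hashes)]

def estimate_jaccard(sig_a, sig_b):
    # P(minimum hashes agree) equals the Jaccard similarity, so average the matches
    matches = sum(a == b for a, b in zip(sig_a, sig_b))
    return matches / len(sig_a)
```

For example, for A = {0, …, 99} and B = {50, …, 149} the true coefficient is 50/150 ≈ 0.33, and with a few hundred hash functions the estimate typically lands within a few percentage points of that, while each signature stays a fixed size regardless of how large the sets grow.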