Random Articles (Page 16)

Take a deep dive into what people are curious about.

🔗 Basel Problem

🔗 Mathematics

The Basel problem is a problem in mathematical analysis with relevance to number theory, first posed by Pietro Mengoli in 1650 and solved by Leonhard Euler in 1734, and read on 5 December 1735 in The Saint Petersburg Academy of Sciences. Since the problem had withstood the attacks of the leading mathematicians of the day, Euler's solution brought him immediate fame when he was twenty-eight. Euler generalised the problem considerably, and his ideas were taken up years later by Bernhard Riemann in his seminal 1859 paper "On the Number of Primes Less Than a Given Magnitude", in which he defined his zeta function and proved its basic properties. The problem is named after Basel, hometown of Euler as well as of the Bernoulli family who unsuccessfully attacked the problem.

The Basel problem asks for the precise summation of the reciprocals of the squares of the natural numbers, i.e. the precise sum of the infinite series:

$$\sum_{n=1}^{\infty}\frac{1}{n^{2}}=\frac{1}{1^{2}}+\frac{1}{2^{2}}+\frac{1}{3^{2}}+\cdots.$$

The sum of the series is approximately equal to 1.644934. The Basel problem asks for the exact sum of this series (in closed form), as well as a proof that this sum is correct. Euler found the exact sum to be π²/6 and announced this discovery in 1735. His arguments were based on manipulations that were not justified at the time, although he was later proven correct, and it was not until 1741 that he was able to produce a truly rigorous proof.
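Euler's closed form is easy to check numerically. A minimal sketch (the function name is ours, not standard):

```python
import math

def basel_partial_sum(n):
    """Partial sum 1/1^2 + 1/2^2 + ... + 1/n^2 of the Basel series."""
    return sum(1.0 / k**2 for k in range(1, n + 1))

# The tail of the series beyond n is bounded above by 1/n, so the
# partial sum for n = 1_000_000 agrees with pi^2/6 to roughly six
# decimal places.
print(basel_partial_sum(1_000_000), math.pi**2 / 6)
```

The slow, roughly 1/n convergence of the partial sums is part of why the exact value resisted direct numerical attack in Euler's day.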

🔗 Zip Bomb

🔗 Computing 🔗 Computer Security 🔗 Computer Security/Computing

A zip bomb, also known as a zip of death or decompression bomb, is a malicious archive file designed to crash or render useless the program or system reading it. It is often employed to disable antivirus software, in order to create an opening for more traditional viruses.

Rather than hijacking the normal operation of the program, a zip bomb allows the program to work as intended, but the archive is carefully crafted so that unpacking it (e.g. by a virus scanner in order to scan for viruses) requires inordinate amounts of time, disk space or memory.

Most modern antivirus programs can detect whether a file is a zip bomb, to avoid unpacking it.
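Metadata-based detection can be sketched as follows; the thresholds are illustrative assumptions, not values any particular scanner uses:

```python
import zipfile

MAX_RATIO = 100      # declared uncompressed/compressed ratio (assumed threshold)
MAX_TOTAL = 1 << 30  # 1 GiB total declared uncompressed size (assumed threshold)

def looks_like_zip_bomb(path):
    """Flag suspicious archives from metadata alone, without extracting."""
    with zipfile.ZipFile(path) as zf:
        infos = zf.infolist()
        # Total declared size across all members.
        if sum(i.file_size for i in infos) > MAX_TOTAL:
            return True
        # Any single member with an extreme expansion ratio.
        return any(
            i.compress_size > 0 and i.file_size / i.compress_size > MAX_RATIO
            for i in infos
        )
```

A real scanner must also bound recursive extraction, since classic bombs such as 42.zip nest archives inside archives so that no single layer looks extreme.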

🔗 Radio Yerevan Jokes

🔗 Soviet Union 🔗 Armenia

The Radio Yerevan jokes, also known as the Armenian Radio jokes, have been popular in the Soviet Union and other countries of the former Communist Eastern bloc since the second half of the 20th century. These question-and-answer jokes pretended to come from the Q&A series of the Armenian Radio. A typical joke opened with "Radio Yerevan was asked: ..." and continued with "Radio Yerevan answered: ...".

🔗 Jaccard Index

🔗 Computer science 🔗 Statistics

The Jaccard index, also known as the Jaccard similarity coefficient, is a statistic used for gauging the similarity and diversity of sample sets. It was first developed by Grove Karl Gilbert in 1884 as his ratio of verification (v), and is now frequently referred to as the Critical Success Index in meteorology. It was developed independently by Paul Jaccard, who originally gave it the French name coefficient de communauté, and formulated again independently by T. Tanimoto, so the names Tanimoto index and Tanimoto coefficient are also used in some fields. All of these measures are essentially the same: the ratio of the size of the intersection to the size of the union. The Jaccard coefficient measures similarity between finite sample sets, and is defined as the size of the intersection divided by the size of the union of the sample sets:

$$J(A,B)=\frac{|A\cap B|}{|A\cup B|}=\frac{|A\cap B|}{|A|+|B|-|A\cap B|}.$$

Note that by design, $0 \leq J(A,B) \leq 1$. If $A \cap B$ is empty, then $J(A,B) = 0$. The Jaccard coefficient is widely used in computer science, ecology, genomics, and other sciences, where binary or binarized data are used. Both the exact solution and approximation methods are available for hypothesis testing with the Jaccard coefficient.
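The definition translates directly into code. A minimal sketch (the convention that two empty sets have similarity 1 is an assumption some implementations make, not part of the definition above):

```python
def jaccard(a, b):
    """Jaccard similarity |A ∩ B| / |A ∪ B| of two finite sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # convention for two empty sets (an assumption)
    return len(a & b) / len(a | b)

# Example: {1,2,3} and {2,3,4} share 2 elements out of 4 in the union.
print(jaccard({1, 2, 3}, {2, 3, 4}))  # 0.5
```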

Jaccard similarity also applies to bags, i.e., multisets. The formula is similar, but the symbols denote bag intersection and bag sum (not union). The maximum value is 1/2.

$$J(A,B)=\frac{|A\cap B|}{|A\uplus B|}=\frac{|A\cap B|}{|A|+|B|}.$$
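The bag variant can be sketched with `collections.Counter`, whose `&` operator takes element-wise minimum multiplicities (the empty-input convention below is our assumption):

```python
from collections import Counter

def jaccard_bags(a, b):
    """Multiset Jaccard: min-multiplicity intersection over bag sum |A| + |B|."""
    ca, cb = Counter(a), Counter(b)
    inter = sum((ca & cb).values())          # element-wise minimum counts
    total = sum(ca.values()) + sum(cb.values())
    return inter / total if total else 0.0   # empty-input convention (assumed)
```

Identical non-empty bags score exactly 1/2, the maximum noted above, because the intersection counts each shared element once while the bag sum counts it twice.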

The Jaccard distance, which measures dissimilarity between sample sets, is complementary to the Jaccard coefficient and is obtained by subtracting the Jaccard coefficient from 1, or, equivalently, by dividing the difference of the sizes of the union and the intersection of two sets by the size of the union:

$$d_{J}(A,B)=1-J(A,B)=\frac{|A\cup B|-|A\cap B|}{|A\cup B|}.$$

An alternative interpretation of the Jaccard distance is as the ratio of the size of the symmetric difference $A \triangle B = (A \cup B) - (A \cap B)$ to the union. Jaccard distance is commonly used to calculate an n × n matrix for clustering and multidimensional scaling of n sample sets.
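The two formulations, one minus the similarity and symmetric difference over union, can be checked against each other in a small sketch:

```python
def jaccard_distance(a, b):
    """Jaccard distance: |A △ B| / |A ∪ B|, i.e. 1 minus the similarity."""
    a, b = set(a), set(b)
    union = a | b
    if not union:
        return 0.0  # two empty sets are at distance 0 (assumed convention)
    return len(a ^ b) / len(union)  # ^ is symmetric difference
```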

This distance is a metric on the collection of all finite sets.

There is also a version of the Jaccard distance for measures, including probability measures. If $\mu$ is a measure on a measurable space $X$, then we define the Jaccard coefficient by

$$J_{\mu}(A,B)=\frac{\mu(A\cap B)}{\mu(A\cup B)},$$

and the Jaccard distance by

$$d_{\mu}(A,B)=1-J_{\mu}(A,B)=\frac{\mu(A\triangle B)}{\mu(A\cup B)}.$$

Care must be taken if $\mu(A \cup B) = 0$ or $\infty$, since these formulas are not well defined in these cases.

The MinHash min-wise independent permutations locality sensitive hashing scheme may be used to efficiently compute an accurate estimate of the Jaccard similarity coefficient of pairs of sets, where each set is represented by a constant-sized signature derived from the minimum values of a hash function.
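A toy version of the idea can be sketched in Python. Real MinHash uses carefully chosen min-wise independent hash families; this sketch just salts Python's built-in `hash`, and all names here are ours:

```python
import random

def minhash_signature(items, k=256, seed=0):
    """Constant-size signature: the minimum of k salted hashes over a set."""
    rng = random.Random(seed)
    salts = [rng.getrandbits(64) for _ in range(k)]
    return [min(hash((salt, x)) for x in items) for salt in salts]

def estimate_jaccard(sig_a, sig_b):
    """The fraction of matching minima estimates |A ∩ B| / |A ∪ B|."""
    return sum(p == q for p, q in zip(sig_a, sig_b)) / len(sig_a)
```

The estimate works because, for each hash function, the element achieving the minimum over A ∪ B is equally likely to be any element of the union, so the two minima agree exactly when that element lies in A ∩ B, an event with probability J(A, B).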

🔗 Paradox of tolerance

🔗 Philosophy 🔗 Philosophy/Social and political philosophy

The paradox of tolerance states that if a society is tolerant without limit, its ability to be tolerant is eventually seized or destroyed by the intolerant. Karl Popper described it as the seemingly paradoxical idea that, "In order to maintain a tolerant society, the society must be intolerant of intolerance." The paradox of tolerance is an important concept for thinking about which boundaries can or should be set.

🔗 Flocken Elektrowagen

🔗 Environment 🔗 Automobiles 🔗 Environment/Green vehicle

The Flocken Elektrowagen is a four-wheeled electric car designed by Andreas Flocken (1845–1913), manufactured in 1888 by Maschinenfabrik A. Flocken in Coburg. It is regarded as the first real electric car.

🔗 The repetitive and boring gameplay in WoW is probably intentional.

🔗 Video games

In video games, grinding is performing repetitive tasks, usually for a gameplay advantage or loot but in some cases for purely aesthetic or cosmetic benefits. Many video games use different tactics to implement, or reduce, the amount of grinding in the gameplay. Grinding is most commonly done to earn experience points and raise a character's level. The behavior is also sometimes referred to as pushing the bar (leveling up), farming (acquiring loot repeatedly from one source), or catassing.

🔗 Chewbacca defense

🔗 United States 🔗 Television 🔗 Law 🔗 Comedy 🔗 Animation 🔗 Popular Culture 🔗 South Park 🔗 United States/American television 🔗 United States/Colorado 🔗 Star Wars 🔗 United States/Animation - American animation

In a jury trial, a Chewbacca defense is a legal strategy in which a criminal defense lawyer tries to confuse the jury rather than refute the case of the prosecutor. It is an intentional distraction or obfuscation.

As a Chewbacca defense distracts and misleads, it is an example of a red herring. It is also an example of an irrelevant conclusion, a type of informal fallacy in which one making an argument fails to address the issue in question. Often an opposing counsel can legally object to such arguments by declaring them irrelevant, character evidence, or argumentative.

The name Chewbacca defense comes from "Chef Aid", an episode of the American animated series South Park. The episode, which premiered on October 7, 1998, satirizes the O. J. Simpson murder trial—particularly attorney Johnnie Cochran's closing argument for the defense. In the episode, Cochran (voiced by Trey Parker) bases his argument on a false premise about the 1983 film Return of the Jedi. He asks the jury why a Wookiee like Chewbacca would want to live on Endor with the much smaller Ewoks when "it does not make sense". He argues that if Chewbacca living on Endor does not make sense—and if even mentioning Chewbacca in the case does not make sense—then the jury must acquit.

In the Simpson murder trial, the real Johnnie Cochran tried to convince jurors that a glove found at the crime scene, alleged to have been left by the killer, could not be Simpson's because it did not fit Simpson's hand. Because the prosecution relied on the glove as evidence of Simpson's presence at the scene, Cochran argued that the lack of fit proved Simpson's innocence: "It makes no sense; it doesn't fit; if it doesn't fit, you must acquit." "If it doesn't fit, you must acquit" was a refrain that Cochran also used in response to other points of the case.

🔗 Gray Goo

🔗 Technology 🔗 Science Fiction 🔗 Transhumanism

Gray goo (also spelled grey goo) is a hypothetical global catastrophic scenario involving molecular nanotechnology in which out-of-control self-replicating machines consume all biomass on Earth while building more of themselves, a scenario that has been called ecophagy ("eating the environment", more literally "eating the habitation"). The original idea assumed machines were designed to have this capability, while popularizations have assumed that machines might somehow gain this capability by accident.

Self-replicating machines of the macroscopic variety were originally described by mathematician John von Neumann, and are sometimes referred to as von Neumann machines or clanking replicators. The term gray goo was coined by nanotechnology pioneer K. Eric Drexler in his 1986 book Engines of Creation. In 2004 he stated, "I wish I had never used the term 'gray goo'." Engines of Creation mentions "gray goo" in two paragraphs and a note, while the popularized idea of gray goo was first publicized in a mass-circulation magazine, Omni, in November 1986.

🔗 Permian–Triassic Extinction Event

🔗 Palaeontology 🔗 Geology 🔗 Extinction

The Permian–Triassic extinction event, also known as the P–Tr extinction, the P–T extinction, the End-Permian Extinction, and colloquially as the Great Dying, formed the boundary between the Permian and Triassic geologic periods, as well as between the Paleozoic and Mesozoic eras, approximately 252 million years ago. It is the Earth's most severe known extinction event, with up to 96% of all marine species and 70% of terrestrial vertebrate species becoming extinct. It was the largest known mass extinction of insects. Some 57% of all biological families and 83% of all genera became extinct.

There is evidence for one to three distinct pulses, or phases, of extinction. Potential causes for those pulses include one or more large meteor impact events, massive volcanic eruptions (such as the Siberian Traps), and climate change brought on by large releases of underwater methane or methane-producing microbes.

The speed of the recovery from the extinction is disputed. Some scientists estimate that it took 10 million years (until the Middle Triassic), due both to the severity of the extinction and because grim conditions returned periodically for another 5 million years. However, studies in Bear Lake County, near Paris, Idaho, showed a relatively quick rebound in a localized Early Triassic marine ecosystem, taking around 2 million years to recover, suggesting that the impact of the extinction may have been felt less severely in some areas than others.
