Random Articles (Page 3)
Have a deep view into what people are curious about.
The Machine Stops
"The Machine Stops" is a science fiction short story (12,300 words) by E. M. Forster. After initial publication in The Oxford and Cambridge Review (November 1909), the story was republished in Forster's The Eternal Moment and Other Stories in 1928. After being voted one of the best novellas up to 1965, it was included that same year in the populist anthology Modern Short Stories. In 1973 it was also included in The Science Fiction Hall of Fame, Volume Two.
The story, set in a world where humanity lives underground and relies on a giant machine to provide its needs, predicted technologies similar to instant messaging and the Internet.
Discussed on
- "The Machine Stops" | 2015-05-14 | 93 Upvotes 26 Comments
Fitts's law
Fitts's law (often cited as Fitts' law) is a predictive model of human movement primarily used in human–computer interaction and ergonomics. This scientific law predicts that the time required to rapidly move to a target area is a function of the ratio between the distance to the target and the width of the target. Fitts's law is used to model the act of pointing, either by physically touching an object with a hand or finger, or virtually, by pointing to an object on a computer monitor using a pointing device.
Fitts's law has been shown to apply under a variety of conditions; with many different limbs (hands, feet, the lower lip, head-mounted sights), manipulanda (input devices), physical environments (including underwater), and user populations (young, old, special educational needs, and drugged participants).
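The relationship described above is commonly written in the Shannon formulation, MT = a + b · log2(D/W + 1), where D is the distance to the target, W its width, and a, b are empirically fitted constants. A minimal sketch (the constant values below are illustrative placeholders, not measured ones):

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds using the Shannon formulation
    of Fitts's law: MT = a + b * log2(D/W + 1).

    a and b are device- and user-specific constants obtained by
    regression on measured pointing data; the defaults here are
    illustrative placeholders.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A small, distant target takes longer to acquire than a large, near one.
near_large = fitts_movement_time(distance=100, width=50)
far_small = fitts_movement_time(distance=800, width=10)
assert far_small > near_large
```

The logarithmic term, called the index of difficulty, is what lets a single pair of fitted constants predict movement times across many distance/width combinations for a given device and user population.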
Discussed on
- "Fitts's law" | 2018-03-18 | 83 Upvotes 2 Comments
- "Fitt's Law" | 2016-03-02 | 70 Upvotes 43 Comments
Assassination of Boris Nemtsov
The assassination of Boris Nemtsov, a Russian politician opposed to the government of Vladimir Putin, occurred in central Moscow on Bolshoy Moskvoretsky Bridge at 23:31 local time on 27 February 2015. An unknown assailant fired seven or eight shots from a Makarov pistol; four of them hit Boris Nemtsov in the head, heart, liver and stomach, killing him almost instantly. He died hours after appealing to the public to support a march against Russia's war in Ukraine. Nemtsov's Ukrainian partner Anna Duritskaya survived the attack as its sole eyewitness.
The assassination was met with worldwide condemnation and concern for the situation of the Russian opposition. Russian authorities also condemned the murder and vowed to conduct a thorough investigation.
On 8 March 2015, Russian authorities charged Anzor Gubashev and Zaur Dadaev, both originating from the Northern Caucasus, with involvement in the crime. According to Russian authorities, Dadaev confessed to involvement in the murder. However, he later retracted his statement, saying it had been extracted under torture. Three more suspects were arrested around the same time and, according to Russian media, another suspect blew himself up in Grozny when Russian police forces surrounded his apartment block.
Discussed on
- "Assassination of Boris Nemtsov" | 2022-03-05 | 110 Upvotes 27 Comments
Prophetic Perfect Tense
The prophetic perfect tense is a literary technique used in the Bible that describes future events so certain to happen that they are referred to in the past tense, as if they had already happened.
Discussed on
- "Prophetic Perfect Tense" | 2023-09-28 | 154 Upvotes 96 Comments
Blue iceberg
A blue iceberg is visible after the ice from above the water melts, causing the smooth portion of ice from below the water to overturn. The rare blue ice is formed from the compression of pure snow, which then develops into glacial ice.
Icebergs may also appear blue due to light refraction and age. Older icebergs reveal vivid hues of green and blue, resulting from a high concentration of color, microorganisms, and compacted ice. One of the better known blue icebergs rests in the waters off Sermilik fjord near Greenland. It is described as an electric blue iceberg and is known to locals as "blue diamond".
Discussed on
- "Blue iceberg" | 2023-07-29 | 82 Upvotes 14 Comments
18XX Train Games
18XX is the generic term for a series of board games that, with a few exceptions, recreate the building of railroad corporations during the 19th century. Individual games within the series use particular years in the 19th century as their title (usually the date of the start of railway development in the area of the world they cover), or "18" plus a two-or-more-letter geographical designator (such as 18EU for a game set in the European Union). The games 2038 (set in the future) and Poseidon and Ur, 1830 BC (both set in ancient history) are also regarded as 18XX titles, as their game mechanics and titling nomenclature are similar despite variance from the common railroad/stock-market theme.
The 18XX series has its origins in the game 1829, first produced by Francis Tresham in the mid-1970s. 1829 was chosen as it was the year of the Rainhill Trials. 1830 was produced by Avalon Hill in 1986, and was the first game of the series widely available in the United States; it is seen as the basic 18XX game by the U.S. audience.
In addition to traditionally published games, the 18XX series has spawned self-published variants and games published by low-volume game companies.
With few exceptions (such as 2038), 18XX titles are multiplayer board games without random variables in their game mechanics.
Discussed on
- "18XX Train Games" | 2020-12-25 | 101 Upvotes 18 Comments
V-2 No. 13
The V-2 No. 13 was a modified V-2 rocket that became the first object to take a photograph of the Earth from outer space. Launched on 24 October 1946, at the White Sands Missile Range in White Sands, New Mexico, the rocket reached a maximum altitude of 65 mi (105 km).
Discussed on
- "V-2 No. 13" | 2024-04-12 | 34 Upvotes 9 Comments
Internet 0
Internet 0 is a low-speed physical layer designed to route "IP over anything". It was developed at MIT's Center for Bits and Atoms by Neil Gershenfeld, Raffi Krikorian, and Danny Cohen. When it was invented, a number of other proposals were being labelled as "internet 2". The name was chosen to emphasize that this was designed to be a slow but very inexpensive internetworking system, and to forestall "high-performance" comparison questions such as "how fast is it?"
Effectively, it would enable a platform for pervasive computing: everything in a building could be on the same network to share data gathering and actuation. A light switch could turn on a light bulb by sending a packet to it; the two could be linked together by the user.
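The switch-and-bulb idea above can be sketched with ordinary sockets. This is a hypothetical illustration only: the use of UDP, the port choice, and the "TOGGLE" payload are assumptions for demonstration, not part of any published Internet 0 specification.

```python
import socket

# Hypothetical sketch: a "light switch" node toggles a "bulb" node by
# sending it a small IP datagram, both on the same network.
bulb = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
bulb.bind(("127.0.0.1", 0))          # let the OS pick a free port
bulb_addr = bulb.getsockname()       # address the switch is linked to

switch = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
switch.sendto(b"TOGGLE", bulb_addr)  # the user-configured "link"

payload, _ = bulb.recvfrom(64)       # bulb acts on the received packet
light_on = payload == b"TOGGLE"

switch.close()
bulb.close()
```

The point of Internet 0 is that such per-device IP addressing can ride over very cheap, very slow physical layers, so even a light bulb can afford a network interface.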
Discussed on
- "Internet 0 " | 2010-01-21 | 25 Upvotes 7 Comments
Zooming User Interface (ZUI)
In computing, a zooming user interface or zoomable user interface (ZUI, pronounced zoo-ee) is a type of graphical user interface (GUI) where users can change the scale of the viewed area in order to see more or less detail, and browse through different documents. Information elements appear directly on an infinite virtual desktop (usually created using vector graphics), instead of in windows. Users can pan across the virtual surface in two dimensions and zoom into objects of interest. For example, as a user zooms into a text object, it may be represented first as a small dot, then as a thumbnail of a page of text, then as a full-sized page, and finally as a magnified view of the page.
ZUIs use zooming as the main metaphor for browsing through hyperlinked or multivariate information. Objects present inside a zoomed page can in turn be zoomed themselves to reveal further detail, allowing for recursive nesting and an arbitrary level of zoom.
When zooming changes the level of detail shown so that the relevant information fits the current size, rather than presenting a proportionally scaled view of the whole object, it is called semantic zooming.
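Semantic zooming amounts to selecting a representation by scale rather than scaling one rendering. A minimal sketch of the dispatch logic, using the text-object example above (the threshold values are illustrative assumptions; a real ZUI would tune them per object type):

```python
def semantic_representation(scale):
    """Choose how to render a text object at a given zoom scale.

    Instead of scaling one bitmap, each range of scales gets a
    qualitatively different representation of the same object.
    """
    if scale < 0.05:
        return "dot"             # too small to render content
    elif scale < 0.3:
        return "thumbnail"       # page-shaped preview
    elif scale < 1.5:
        return "full page"       # readable layout
    else:
        return "magnified view"  # zoomed-in detail

# Zooming in walks through the representations in order.
stages = [semantic_representation(s) for s in (0.01, 0.1, 1.0, 3.0)]
assert stages == ["dot", "thumbnail", "full page", "magnified view"]
```

Because each zoomed-in object can itself contain further zoomable objects, the same dispatch recurses, which is what allows the arbitrary nesting depth described above.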
Some consider the ZUI paradigm, as a post-WIMP interface, to be a flexible and realistic successor to the traditional windowing GUI.
Discussed on
- "Zooming User Interface (ZUI)" | 2024-04-15 | 68 Upvotes 50 Comments
Kernel Embedding of Distributions
In machine learning, the kernel embedding of distributions (also called the kernel mean or mean map) comprises a class of nonparametric methods in which a probability distribution is represented as an element of a reproducing kernel Hilbert space (RKHS). A generalization of the individual data-point feature mapping done in classical kernel methods, the embedding of distributions into infinite-dimensional feature spaces can preserve all of the statistical features of arbitrary distributions, while allowing one to compare and manipulate distributions using Hilbert space operations such as inner products, distances, projections, linear transformations, and spectral analysis. This learning framework is very general and can be applied to distributions over any space on which a sensible kernel function (measuring similarity between elements of that space) may be defined. For example, various kernels have been proposed for learning from data which are: vectors in Euclidean space, discrete classes/categories, strings, graphs/networks, images, time series, manifolds, dynamical systems, and other structured objects. The theory behind kernel embeddings of distributions has been primarily developed by Alex Smola, Le Song, Arthur Gretton, and Bernhard Schölkopf, and reviews of recent work on the topic are available in the literature.
The analysis of distributions is fundamental in machine learning and statistics, and many algorithms in these fields rely on information theoretic approaches such as entropy, mutual information, or KullbackβLeibler divergence. However, to estimate these quantities, one must first either perform density estimation, or employ sophisticated space-partitioning/bias-correction strategies which are typically infeasible for high-dimensional data. Commonly, methods for modeling complex distributions rely on parametric assumptions that may be unfounded or computationally challenging (e.g. Gaussian mixture models), while nonparametric methods like kernel density estimation (Note: the smoothing kernels in this context have a different interpretation than the kernels discussed here) or characteristic function representation (via the Fourier transform of the distribution) break down in high-dimensional settings.
Methods based on the kernel embedding of distributions sidestep these problems and also possess the following advantages:
- Data may be modeled without restrictive assumptions about the form of the distributions and relationships between variables
- Intermediate density estimation is not needed
- Practitioners may specify the properties of a distribution most relevant for their problem (incorporating prior knowledge via choice of the kernel)
- If a characteristic kernel is used, then the embedding can uniquely preserve all information about a distribution, while thanks to the kernel trick, computations on the potentially infinite-dimensional RKHS can be implemented in practice as simple Gram matrix operations
- Dimensionality-independent rates of convergence for the empirical kernel mean (estimated using samples from the distribution) to the kernel embedding of the true underlying distribution can be proven.
- Learning algorithms based on this framework exhibit good generalization ability and finite sample convergence, while often being simpler and more effective than information theoretic methods
Thus, learning via the kernel embedding of distributions offers a principled drop-in replacement for information theoretic approaches and is a framework which not only subsumes many popular methods in machine learning and statistics as special cases, but also can lead to entirely new learning algorithms.
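The "simple Gram matrix operations" mentioned above can be made concrete with the maximum mean discrepancy (MMD), the RKHS distance between the empirical kernel means of two samples. A minimal sketch with a Gaussian RBF kernel (a characteristic kernel; the bandwidth value is an illustrative choice):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)

def mmd_squared(X, Y, gamma=0.5):
    """Squared RKHS distance between the empirical kernel means of X and Y.

    ||mu_X - mu_Y||^2 expands, via the kernel trick, into three Gram-matrix
    averages -- no density estimation is ever performed.
    """
    return (rbf_kernel(X, X, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean())

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=(200, 2))
same_dist = rng.normal(0.0, 1.0, size=(200, 2))   # same distribution
shifted = rng.normal(3.0, 1.0, size=(200, 2))     # mean-shifted distribution

# Embeddings of samples from the same distribution lie close together;
# a mean shift pushes the embeddings apart.
assert mmd_squared(sample, shifted) > mmd_squared(sample, same_dist)
```

With a characteristic kernel, the population MMD is zero if and only if the two distributions are identical, which is what makes this estimator usable as a nonparametric two-sample test.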
Discussed on
- "Kernel Embedding of Distributions" | 2014-02-15 | 13 Upvotes 3 Comments