Random Articles (Page 173)
Take a deeper look at what people are curious about.
🔗 Bovril
Bovril is the trademarked name of a thick and salty meat extract paste similar to a yeast extract, developed in the 1870s by John Lawson Johnston. It is sold in a distinctive, bulbous jar, and also as cubes and granules. Bovril is owned and distributed by Unilever UK.
Bovril can be made into a drink by diluting with hot water or, less commonly, with milk. It can be used as a flavouring for soups, broth, stews or porridge, or as a spread, especially on toast in a similar fashion to Marmite and Vegemite.
Discussed on
- "Bovril" | 2015-11-17 | 29 Upvotes 15 Comments
🔗 Josephson Voltage Standard
A Josephson voltage standard is a complex system that uses a superconducting integrated circuit chip operating at a temperature of 4 K to generate stable voltages that depend only on an applied frequency and fundamental constants. It is an intrinsic standard in the sense that it does not depend on any physical artifact. It is the most accurate method to generate or measure voltage and, by international agreement in 1990, is the basis for voltage standards around the world.
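The "applied frequency and fundamental constants" reduce to the Josephson relation V = n·f/K_J, where K_J = 2e/h is the Josephson constant. The sketch below uses the conventional 1990 value K_J-90 mentioned in the summary; the 75 GHz drive frequency and 20,000-junction array size are illustrative figures, not a description of any specific instrument.

```python
# Voltage from the Josephson relation V = n * f / K_J, where K_J = 2e/h.
# K_J-90 is the conventional value adopted by international agreement in 1990.
K_J90 = 483597.9e9  # Hz/V, conventional Josephson constant (1990)

def josephson_voltage(frequency_hz, step_number=1, num_junctions=1):
    """Voltage across a series array of junctions all biased on the same step."""
    return num_junctions * step_number * frequency_hz / K_J90

per_junction = josephson_voltage(75e9)                      # ~155 microvolts
array_volts = josephson_voltage(75e9, num_junctions=20000)  # a few volts
```

Because the frequency can be locked to an atomic clock, the output voltage inherits its stability, which is why no physical artifact is needed.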
🔗 Kernel Embedding of Distributions
In machine learning, the kernel embedding of distributions (also called the kernel mean or mean map) comprises a class of nonparametric methods in which a probability distribution is represented as an element of a reproducing kernel Hilbert space (RKHS). A generalization of the individual data-point feature mapping done in classical kernel methods, the embedding of distributions into infinite-dimensional feature spaces can preserve all of the statistical features of arbitrary distributions, while allowing one to compare and manipulate distributions using Hilbert space operations such as inner products, distances, projections, linear transformations, and spectral analysis. This learning framework is very general and can be applied to distributions over any space Ω on which a sensible kernel function (measuring similarity between elements of Ω) may be defined. For example, various kernels have been proposed for learning from data which are: vectors in ℝⁿ, discrete classes/categories, strings, graphs/networks, images, time series, manifolds, dynamical systems, and other structured objects. The theory behind kernel embeddings of distributions has been primarily developed by Alex Smola, Le Song, Arthur Gretton, and Bernhard Schölkopf. Reviews of recent work on kernel embeddings of distributions are available in the literature.
The analysis of distributions is fundamental in machine learning and statistics, and many algorithms in these fields rely on information theoretic approaches such as entropy, mutual information, or Kullback–Leibler divergence. However, to estimate these quantities, one must first either perform density estimation, or employ sophisticated space-partitioning/bias-correction strategies which are typically infeasible for high-dimensional data. Commonly, methods for modeling complex distributions rely on parametric assumptions that may be unfounded or computationally challenging (e.g. Gaussian mixture models), while nonparametric methods like kernel density estimation (Note: the smoothing kernels in this context have a different interpretation than the kernels discussed here) or characteristic function representation (via the Fourier transform of the distribution) break down in high-dimensional settings.
Methods based on the kernel embedding of distributions sidestep these problems and also possess the following advantages:
- Data may be modeled without restrictive assumptions about the form of the distributions and relationships between variables
- Intermediate density estimation is not needed
- Practitioners may specify the properties of a distribution most relevant for their problem (incorporating prior knowledge via choice of the kernel)
- If a characteristic kernel is used, then the embedding can uniquely preserve all information about a distribution, while thanks to the kernel trick, computations on the potentially infinite-dimensional RKHS can be implemented in practice as simple Gram matrix operations
- Dimensionality-independent rates of convergence for the empirical kernel mean (estimated using samples from the distribution) to the kernel embedding of the true underlying distribution can be proven.
- Learning algorithms based on this framework exhibit good generalization ability and finite sample convergence, while often being simpler and more effective than information theoretic methods
Thus, learning via the kernel embedding of distributions offers a principled drop-in replacement for information theoretic approaches and is a framework which not only subsumes many popular methods in machine learning and statistics as special cases, but also can lead to entirely new learning algorithms.
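As a concrete instance of the "Gram matrix operations" mentioned above, the following sketch estimates the maximum mean discrepancy (MMD), the RKHS distance between two kernel mean embeddings, directly from samples using a Gaussian RBF kernel (which is characteristic, so the embedding is injective). The bandwidth, sample sizes, and distributions are illustrative choices, not prescribed by the theory.

```python
import numpy as np

def rbf_gram(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * |x - y|^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=0.5):
    """Biased estimate of ||mu_X - mu_Y||^2 in the RKHS:
    mean k(x, x') + mean k(y, y') - 2 * mean k(x, y).
    No density estimation is needed -- only Gram matrix averages."""
    return (rbf_gram(X, X, gamma).mean()
            + rbf_gram(Y, Y, gamma).mean()
            - 2 * rbf_gram(X, Y, gamma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 1))
Y_same = rng.normal(0.0, 1.0, size=(500, 1))  # same distribution as X
Y_diff = rng.normal(3.0, 1.0, size=(500, 1))  # shifted mean
# Embeddings of samples from the same distribution are close in the RKHS,
# while the shifted distribution is far from both.
```

The biased estimator is a squared RKHS norm, so it is always nonnegative; for samples from the same distribution it shrinks toward zero as the sample size grows, at a rate independent of the data dimension.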
Discussed on
- "Kernel Embedding of Distributions" | 2014-02-15 | 13 Upvotes 3 Comments
🔗 Rhind Mathematical Papyrus
The Rhind Mathematical Papyrus (RMP; also designated as papyrus British Museum 10057 and pBM 10058) is one of the best known examples of ancient Egyptian mathematics. It is named after Alexander Henry Rhind, a Scottish antiquarian, who purchased the papyrus in 1858 in Luxor, Egypt; it was apparently found during illegal excavations in or near the Ramesseum. It dates to around 1550 BC. The British Museum, where the majority of the papyrus is now kept, acquired it in 1865 along with the Egyptian Mathematical Leather Roll, also owned by Henry Rhind. There are a few small fragments held by the Brooklyn Museum in New York City and an 18 cm (7.1 in) central section is missing. It is one of the two well-known Mathematical Papyri along with the Moscow Mathematical Papyrus. The Rhind Papyrus is larger than the Moscow Mathematical Papyrus, while the latter is older.
The Rhind Mathematical Papyrus dates to the Second Intermediate Period of Egypt. It was copied by the scribe Ahmes (i.e., Ahmose; Ahmes is an older transcription favoured by historians of mathematics), from a now-lost text from the reign of king Amenemhat III (12th dynasty). Written in the hieratic script, this Egyptian manuscript is 33 cm (13 in) tall and consists of multiple parts which in total make it over 5 m (16 ft) long. The papyrus began to be transliterated and mathematically translated in the late 19th century. The mathematical translation aspect remains incomplete in several respects. The document is dated to Year 33 of the Hyksos king Apophis and also contains a separate later historical note on its verso likely dating from the period ("Year 11") of his successor, Khamudi.
In the opening paragraphs of the papyrus, Ahmes presents the papyrus as giving "Accurate reckoning for inquiring into things, and the knowledge of all things, mysteries ... all secrets". He continues with:
This book was copied in regnal year 33, month 4 of Akhet, under the majesty of the King of Upper and Lower Egypt, Awserre, given life, from an ancient copy made in the time of the King of Upper and Lower Egypt Nimaatre. The scribe Ahmose writes this copy.
Several books and articles about the Rhind Mathematical Papyrus have been published, and a handful of these stand out. The Rhind Papyrus was published in 1923 by Peet and contains a discussion of the text that followed Griffith's Book I, II and III outline. Chace published a compendium in 1927–29 which included photographs of the text. A more recent overview of the Rhind Papyrus was published in 1987 by Robins and Shute.
Discussed on
- "Rhind Mathematical Papyrus" | 2023-04-02 | 42 Upvotes 30 Comments
🔗 Brouwer–Hilbert controversy
In a foundational controversy in twentieth-century mathematics, L. E. J. Brouwer, a supporter of intuitionism, opposed David Hilbert, the founder of formalism.
Discussed on
- "Brouwer–Hilbert controversy" | 2018-11-08 | 95 Upvotes 32 Comments
🔗 Los Alamos Chess
Los Alamos chess (or anti-clerical chess) is a chess variant played on a 6×6 board without bishops. This was the first chess-like game played by a computer program. This program was written at Los Alamos Scientific Laboratory by Paul Stein and Mark Wells for the MANIAC I computer in 1956. The reduction of the board size and the number of pieces from standard chess was due to the very limited capacity of computers at the time.
The program was very simple, containing only about 600 instructions. It was mostly a minimax tree search and could look four plies ahead. To score the board at the end of the four-ply lookahead, it estimated a score for material and a score for mobility, then added them up. Pseudocode for the chess program is described in Figure 11.4 of Newell, 2019. In 1958, a revised version was written for MANIAC II for full 8×8 chess, though its pseudocode was never published. There is a record of a single game by it, circa November 1958 (Table 11.2 of Newell, 2019).
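The search described above can be sketched generically. This is a modern illustration, not a reconstruction of the MANIAC I code: the two-ply toy tree and its leaf scores are invented for the example, with the leaf values standing in for the material-plus-mobility sum the program computed at its four-ply horizon.

```python
def minimax(node, depth, maximizing, children, evaluate):
    """Depth-limited minimax: the side to move maximizes the score,
    the opponent minimizes it; horizon nodes are scored by evaluate()."""
    kids = children(node)
    if depth == 0 or not kids:
        return evaluate(node)
    values = (minimax(c, depth - 1, not maximizing, children, evaluate)
              for c in kids)
    return max(values) if maximizing else min(values)

# Toy two-ply tree; each leaf score stands in for a
# "material + mobility" evaluation of the resulting position.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
scores = {"a1": 3, "a2": 5, "b1": -1, "b2": 9}

best = minimax("root", 2, True,
               lambda n: tree.get(n, []),
               lambda n: scores.get(n, 0))
# best = max(min(3, 5), min(-1, 9)) = 3
```

The mover picks branch "a": its worst case (3) beats branch "b"'s worst case (-1), which is exactly the guarantee minimax provides at each ply.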
Discussed on
- "Los Alamos Chess" | 2024-04-02 | 265 Upvotes 107 Comments
🔗 30 years ago, Doom was released
Doom is a first-person shooter game developed and published by id Software. Released on December 10, 1993, for DOS, it is the first installment in the Doom franchise. The player assumes the role of a space marine, later unofficially referred to as Doomguy, fighting through hordes of undead humans and invading demons. The game begins on the moons of Mars and finishes in hell, with the player traversing each level to find its exit or defeat its final boss. It is an early example of 3D graphics in video games, with enemies and objects rendered as 2D images, a technique sometimes referred to as 2.5D graphics.
Doom was the third major independent release by id Software, after Commander Keen (1990–1991) and Wolfenstein 3D (1992). In May 1992, id started developing a darker game focused on fighting demons with technology, using a new 3D game engine from the lead programmer, John Carmack. The designer Tom Hall initially wrote a science fiction plot, but he and most of the story were removed from the project, with the final game featuring an action-heavy design by John Romero and Sandy Petersen. Id published Doom as a set of three episodes under the shareware model, marketing the full game by releasing the first episode free. A retail version with an additional episode was published in 1995 by GT Interactive as The Ultimate Doom.
Doom was a critical and commercial success, earning a reputation as one of the best and most influential video games of all time. It sold an estimated 3.5 million copies by 1999, and up to 20 million people are estimated to have played it within two years of launch. It has been termed the "father" of first-person shooters and is regarded as one of the most important games in the genre. It has been cited by video game historians as shifting the direction and public perception of the medium as a whole, as well as sparking the rise of online games and communities. It led to an array of imitators and clones, as well as a robust modding scene and the birth of speedrunning as a community. Its high level of graphic violence led to controversy from a range of groups. Doom has been ported to a variety of platforms both officially and unofficially and has been followed by several games in the series, including Doom II (1994), Doom 3 (2004), Doom (2016), and Doom Eternal (2020), as well as the films Doom (2005) and Doom: Annihilation (2019).
Discussed on
- "30 years ago, Doom was released" | 2023-12-10 | 64 Upvotes 36 Comments
🔗 Lakes of Wada
In mathematics, the lakes of Wada (和田の湖, Wada no mizuumi) are three disjoint connected open sets of the plane or open unit square with the counterintuitive property that they all have the same boundary. In other words, for any point selected on the boundary of one of the lakes, the other two lakes' boundaries also contain that point.
More than two sets with the same boundary are said to have the Wada property; examples include Wada basins in dynamical systems. This property is rare in real-world systems.
The lakes of Wada were introduced by Kunizō Yoneyama (1917, page 60), who credited the discovery to Takeo Wada. His construction is similar to the construction by Brouwer (1910) of an indecomposable continuum, and in fact it is possible for the common boundary of the three sets to be an indecomposable continuum.
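A standard dynamical-systems illustration of the Wada property (not Yoneyama's original construction) is Newton's method for z³ = 1: the three basins of attraction of the cube roots of unity share a single common boundary, so any small circle around a boundary point, such as the origin, meets all three basins. A quick numeric check, with the circle radius, sample count, and iteration count chosen arbitrarily:

```python
import numpy as np

# The three cube roots of unity: the attractors of Newton's method for z^3 = 1.
roots = np.exp(2j * np.pi * np.arange(3) / 3)

def basin(z, steps=60):
    """Index of the root that Newton's iteration approaches from z."""
    for _ in range(steps):
        z = z - (z**3 - 1) / (3 * z**2)
    return int(np.argmin(np.abs(roots - z)))

# Sample a small circle around the origin, which lies on the common boundary.
# Starting points converging to each of the three roots all appear, as the
# Wada property predicts for any neighbourhood of a boundary point.
angles = np.linspace(0, 2 * np.pi, 48, endpoint=False)
labels = {basin(0.2 * np.exp(1j * a)) for a in angles}
```

The Newton map commutes with rotation by 120°, so rotating a starting point that converges to one root yields starting points converging to the other two, which is why all three basins must appear on the circle.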
Discussed on
- "Lakes of Wada" | 2018-10-07 | 66 Upvotes 15 Comments
🔗 Father of the PDP-1: The TX-0, Transistorized EXperimental Computer Zero (1956)
The TX-0, for Transistorized Experimental computer zero, but affectionately referred to as tixo (pronounced "tix oh"), was an early fully transistorized computer and contained a then-huge 64K of 18-bit words of magnetic core memory. Construction of the TX-0 began in 1955 and ended in 1956. It was used continually through the 1960s at MIT. The TX-0 incorporated around 3600 Philco high-frequency surface-barrier transistors, the first transistor suitable for high-speed computers. The TX-0 and its direct descendant, the original PDP-1, were platforms for pioneering computer research and the development of what would later be called computer "hacker" culture.
Discussed on
- "Father of the PDP-1: The TX-0, Transistorized EXperimental Computer Zero (1956)" | 2021-02-28 | 30 Upvotes 12 Comments
🔗 20 years ago Far Cry was released
Far Cry is a 2004 first-person shooter game developed by Crytek and published by Ubisoft. It is the first installment in the Far Cry franchise. Set on a mysterious tropical archipelago, the game follows Jack Carver, a former American special operations forces operative, as he searches for journalist Valerie Constantine, who accompanied him to the islands but went missing after their boat was destroyed by mercenaries. As Jack explores the islands, he begins to discover the horrific genetic experiments being conducted on the local wildlife and must confront the mad scientist behind them.
The game was the first to use Crytek's CryEngine, and was designed as an open-ended first-person shooter, though it lacks most of the freedom its successors would offer the player. While players can freely explore the game's world as in later Far Cry titles, the linear structure of missions and the lack of side content often discourage them from doing so. Despite this, the gameplay formula established in Far Cry—placing the player in a foreign environment occupied by enemy forces, where they must use various weapons and tools, as well as their surroundings, to overcome any threat—would prove essential in defining the series' identity going forward.
Far Cry was released for Microsoft Windows in March 2004 to generally positive reviews, being praised for its visuals, gameplay mechanics, and the level of freedom given to players. The game was also a commercial success, selling over 730,000 units within four months of release and over 2.5 million units in its lifetime. The success of Far Cry led to a series of standalone sequels developed by Ubisoft, starting with Far Cry 2 in 2008. A remake of the game with a different storyline and new mechanics, Far Cry Instincts, was released for the Xbox in 2005, and for the Xbox 360 in 2006 as part of the Far Cry Instincts: Predator compilation. A loose film adaptation was released in 2008. The original version of Far Cry, updated with HD graphics, was re-released under the title Far Cry Classic for the PlayStation 3 and Xbox 360 in 2014.
Discussed on
- "20 years ago Far Cry was released" | 2024-03-30 | 70 Upvotes 63 Comments