Random Articles (Page 287)

Take a deep look into what people are curious about.

πŸ”— Computational sociology

πŸ”— Sociology

Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex statistical methods, and analytic approaches like social network analysis, computational sociology develops and tests theories of complex social processes through bottom-up modeling of social interactions.

It involves understanding social agents, the interactions among those agents, and the effect of these interactions on the social aggregate. Although the subject matter and methodologies of social science differ from those of natural science or computer science, several of the approaches used in contemporary social simulation originated in fields such as physics and artificial intelligence. Conversely, some approaches that originated in computational sociology have been imported into the natural sciences, such as measures of network centrality from social network analysis and network science.

In the relevant literature, computational sociology is often linked to the study of social complexity. Social complexity concepts such as complex systems, non-linear interconnections between macro and micro processes, and emergence have entered the vocabulary of computational sociology. A practical and well-known example is the construction of a computational model in the form of an "artificial society", by which researchers can analyze the structure of a social system.
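For a concrete sense of what such an "artificial society" looks like, here is a minimal sketch in the spirit of Schelling's segregation model, a canonical agent-based model in computational sociology. The grid size, vacancy rate and tolerance threshold below are illustrative choices, not values from any particular study:

```python
import random

SIZE, VACANCY, SIMILAR_WANTED = 20, 0.1, 0.5   # illustrative parameters

# Each cell holds an agent of group 1 or 2, or None if empty.
grid = [[random.choice([1, 2]) if random.random() > VACANCY else None
         for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(r, c):
    """An agent is unhappy if too few of its occupied neighbours share its group."""
    me = grid[r][c]
    neighbours = [grid[(r + dr) % SIZE][(c + dc) % SIZE]
                  for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                  if (dr, dc) != (0, 0)]
    occupied = [n for n in neighbours if n is not None]
    if not occupied:
        return False
    return sum(n == me for n in occupied) / len(occupied) < SIMILAR_WANTED

# Bottom-up dynamics: unhappy agents relocate to random empty cells.
for _ in range(20_000):
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    if grid[r][c] is not None and unhappy(r, c):
        er, ec = random.randrange(SIZE), random.randrange(SIZE)
        if grid[er][ec] is None:
            grid[er][ec], grid[r][c] = grid[r][c], None
```

Even though each agent is content in a 50/50 mixed neighbourhood, repeated local moves drive the grid toward visibly clustered groups: the macro-level pattern is an emergent property of the interactions, not something any individual agent aims at.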

Discussed on

πŸ”— Cow Tools

πŸ”— Comics

"Cow tools" is a cartoon from The Far Side by American cartoonist Gary Larson, published in October 1982. It depicts a cow standing behind a table of bizarre, misshapen implements with the caption "cow tools". The cartoon confused many readers, who wrote or phoned in seeking an explanation of the joke. In response to the controversy, Larson issued a press release clarifying that the thrust of the cartoon was simply that, if a cow were to make tools, they would "lack something in sophistication". It has been described as "arguably the most loathed Far Side strip ever" while also becoming a popular internet meme.

Discussed on

πŸ”— Zero Knowledge Proofs

πŸ”— Cryptography πŸ”— Cryptography/Computer science

In cryptography, a zero-knowledge proof or zero-knowledge protocol is a method by which one party (the prover) can prove to another party (the verifier) that they know a value x, without conveying any information apart from the fact that they know the value x. The essence of zero-knowledge proofs is that it is trivial to prove that one possesses knowledge of certain information by simply revealing it; the challenge is to prove such possession without revealing the information itself or any additional information.

If proving a statement requires that the prover possesses some secret information, then the verifier will not be able to prove the statement to anyone else without possessing the secret information. The statement being proved must include the assertion that the prover has such knowledge, but not the knowledge itself. Otherwise, the statement would not be proved in zero-knowledge because it provides the verifier with additional information about the statement by the end of the protocol. A zero-knowledge proof of knowledge is a special case when the statement consists only of the fact that the prover possesses the secret information.

Interactive zero-knowledge proofs require interaction between the individual (or computer system) proving their knowledge and the individual validating the proof.

A protocol implementing zero-knowledge proofs of knowledge must necessarily require interactive input from the verifier, usually in the form of one or more challenges, such that the responses from the prover will convince the verifier if and only if the statement is true, i.e., if the prover does possess the claimed knowledge. If this were not the case, the verifier could record the execution of the protocol and replay it to convince someone else that they possess the secret information. The new party's acceptance would then either be justified because the replayer does possess the information (which implies that the protocol leaked information and thus is not zero-knowledge), or spurious, i.e., granted to someone who does not actually possess the information.
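For concreteness, here is a minimal sketch of one such challenge-response protocol: the Schnorr identification scheme, a classic interactive zero-knowledge proof of knowledge of a discrete logarithm. The group parameters are toy values chosen for readability, far too small for any real security:

```python
import secrets

# Toy parameters (illustrative only): p = 2q + 1 is a safe prime
# and g generates the order-q subgroup of Z_p*.
p, q, g = 23, 11, 2

x = secrets.randbelow(q - 1) + 1   # prover's secret
y = pow(g, x, p)                   # public value: prover claims to know x with g^x = y (mod p)

# --- one round of the interactive protocol ---
r = secrets.randbelow(q)           # prover picks a random nonce
t = pow(g, r, p)                   # prover -> verifier: commitment

c = secrets.randbelow(q)           # verifier -> prover: random challenge

s = (r + c * x) % q                # prover -> verifier: response

# verifier's check: g^s = g^(r + cx) = t * y^c (mod p)
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

Replaying a recorded transcript (t, c, s) convinces no one who picks their own fresh challenge; conversely, anyone who could answer two different challenges for the same commitment could be used to extract x, which is why a valid response really does demonstrate knowledge.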

Some forms of non-interactive zero-knowledge proofs exist, but the validity of the proof relies on computational assumptions (typically the assumption of an ideal cryptographic hash function).
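The best-known route to removing the interaction is the Fiat–Shamir heuristic: the verifier's random challenge is replaced by a hash of the public values and the commitment, with the hash modelled as ideal. A minimal sketch, continuing the toy Schnorr example above:

```python
import hashlib
import secrets

p, q, g = 23, 11, 2                # same toy group as above (not secure)
x = secrets.randbelow(q - 1) + 1   # prover's secret
y = pow(g, x, p)

r = secrets.randbelow(q)
t = pow(g, r, p)
# Fiat-Shamir: derive the challenge by hashing, so no verifier
# needs to be online during proof generation.
c = int.from_bytes(hashlib.sha256(f"{g},{y},{t}".encode()).digest(), "big") % q
s = (r + c * x) % q

# Anyone can later check the proof (t, s) on their own.
c_check = int.from_bytes(hashlib.sha256(f"{g},{y},{t}".encode()).digest(), "big") % q
assert pow(g, s, p) == (t * pow(y, c_check, p)) % p
```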

Discussed on

πŸ”— How to get bias into a Wikipedia article

Tilt! How to get bias into a Wikipedia article

To all you budding propagandists in Wikiland: too many of you are working like a bunch of amateurs. Sorry to be so negative, but you have to understand that getting bias into the Wikipedia is a skill; it requires practice, finesse and imagination. It has to be learned; it is not a natural thing, though some have more talent for it than others.

I have been following the Middle East Wikipedia battleground for a few years now, and have been very impressed with the skill of some editors in introducing bias into articles. To ingenuous editors, some of these techniques may seem innocuous enough; in many cases, it is hard to see how proposed edits are biasing an article one way or another. It is, in fact, only in the last few months that I have been able to define what these techniques are and how they work to introduce bias.

The first thing you need to know as a budding propagandist is this: there are two levels at which bias is introduced into the Wikipedia: at the article level, and at the topic level. You need to set your sights high: you don't want to merely bias a single article, you want the entire Wikipedia on your side. Without understanding the importance of topic bias, it is hard to understand many of the article-level techniques, so I will start with the topic level.

Discussed on

πŸ”— Transputer

πŸ”— Computing πŸ”— Computing/Computer hardware

The transputer is a series of pioneering microprocessors from the 1980s, featuring integrated memory and serial communication links, intended for parallel computing. They were designed and produced by Inmos, a semiconductor company based in Bristol, United Kingdom.

For a time in the late 1980s, many considered the transputer to be the next great design for the future of computing. While Inmos and the transputer did not live up to this expectation, the transputer architecture was highly influential in provoking new ideas in computer architecture, several of which have re-emerged in different forms in modern systems.

Discussed on

πŸ”— Wikipedia tests a new UI design

πŸ”— Religion πŸ”— Biology πŸ”— History of Science πŸ”— Science πŸ”— Evolutionary biology πŸ”— Molecular Biology πŸ”— Creationism πŸ”— Tree of Life πŸ”— Molecular Biology/Genetics

Evolution is change in the heritable characteristics of biological populations over successive generations. These characteristics are the expressions of genes that are passed on from parent to offspring during reproduction. Different characteristics tend to exist within any given population as a result of mutation, genetic recombination and other sources of genetic variation. Evolution occurs when evolutionary processes such as natural selection (including sexual selection) and genetic drift act on this variation, resulting in certain characteristics becoming more common or rare within a population. The evolutionary pressures that determine whether a characteristic is common or rare within a population constantly change, so heritable characteristics keep shifting over successive generations. It is this process of evolution that has given rise to biodiversity at every level of biological organisation, including the levels of species, individual organisms and molecules.

The theory of evolution by natural selection was conceived independently by Charles Darwin and Alfred Russel Wallace in the mid-19th century and was set out in detail in Darwin's book On the Origin of Species. Evolution by natural selection was first demonstrated by the observation that more offspring are often produced than can possibly survive. This is followed by three observable facts about living organisms: (1) traits vary among individuals with respect to their morphology, physiology and behaviour (phenotypic variation), (2) different traits confer different rates of survival and reproduction (differential fitness) and (3) traits can be passed from generation to generation (heritability of fitness). Thus, in successive generations members of a population are more likely to be replaced by the progeny of parents with favourable characteristics that have enabled them to survive and reproduce in their respective environments. In the early 20th century, other competing ideas of evolution such as mutationism and orthogenesis were refuted as the modern synthesis reconciled Darwinian evolution with classical genetics, which established adaptive evolution as being caused by natural selection acting on Mendelian genetic variation.

All life on Earth shares a last universal common ancestor (LUCA) that lived approximately 3.5–3.8 billion years ago. The fossil record includes a progression from early biogenic graphite, to microbial mat fossils, to fossilised multicellular organisms. Existing patterns of biodiversity have been shaped by repeated formations of new species (speciation), changes within species (anagenesis) and loss of species (extinction) throughout the evolutionary history of life on Earth. Morphological and biochemical traits are more similar among species that share a more recent common ancestor, and can be used to reconstruct phylogenetic trees.

Evolutionary biologists have continued to study various aspects of evolution by forming and testing hypotheses as well as constructing theories based on evidence from the field or laboratory and on data generated by the methods of mathematical and theoretical biology. Their discoveries have influenced not just the development of biology but numerous other scientific and industrial fields, including agriculture, medicine, and computer science.

Discussed on

πŸ”— Bicameralism (Psychology)

πŸ”— Philosophy πŸ”— Skepticism πŸ”— Psychology πŸ”— Philosophy/Contemporary philosophy πŸ”— Philosophy/Philosophy of mind πŸ”— Alternative Views πŸ”— Neuroscience

Bicameralism (the condition of being divided into "two chambers") is a hypothesis in psychology arguing that the human mind once operated in a state in which cognitive functions were divided between one part of the brain which appears to be "speaking" and a second part which listens and obeys: a bicameral mind. The term was coined by Julian Jaynes, who presented the idea in his 1976 book The Origin of Consciousness in the Breakdown of the Bicameral Mind, wherein he made the case that a bicameral mentality was the normal and ubiquitous state of the human mind as recently as 3,000 years ago, near the end of the Mediterranean Bronze Age.

Discussed on

πŸ”— Airbus Beluga

πŸ”— Aviation πŸ”— Aviation/aircraft

The Airbus A300-600ST (Super Transporter), or Beluga, is a version of the standard A300-600 wide-body airliner modified to carry aircraft parts and outsize cargo. It received the official name Super Transporter early on, but the nickname Beluga, after the whale it resembles, gained popularity and has since been officially adopted. Its successor, the Beluga XL, based on the Airbus A330 with similar modifications and dimensions, entered service in January 2020 to replace the type.

Discussed on

πŸ”— Refal programming language

πŸ”— Computing

Refal ("Recursive functions' algorithmic language") is a functional programming language oriented toward symbolic computation, including string processing, language translation, and artificial intelligence. It is one of the oldest members of this family, first conceived in 1966 as a theoretical tool, with the first implementation appearing in 1968. Refal was intended to combine mathematical simplicity with practicality for writing large and sophisticated programs.

Unlike the Lisp of its day, Refal is based on pattern matching, making it one of the first functional programming languages to take this approach. Its pattern matching works in conjunction with term rewriting.

The basic data structure of Lisp and Prolog is a linear list built by the cons operation in a sequential manner, giving O(n) access to the list's nth element. Refal's lists are built and scanned from both ends, with pattern matching working on nested lists as well as the top-level one. In effect, the basic data structure of Refal is a tree rather than a list. This gives freedom and convenience in creating data structures while using only the mathematically simple control mechanisms of pattern matching and substitution.
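To illustrate the idiom, here is a sketch in Python of a classic Refal-style definition, palindrome recognition, where a single pattern binds the first and last symbols of a sequence simultaneously. (In Refal the function reads roughly `Pal { = True; s.1 = True; s.1 e.2 s.1 = <Pal e.2>; e.1 = False; }`; the Python below is only a rendering of that idiom, not Refal itself.)

```python
# Python rendering of the Refal palindrome pattern:
#   s.1 e.2 s.1  ->  both ends match; rewrite to the middle expression e.2
def pal(seq):
    if len(seq) <= 1:            # patterns "" and "s.1": trivially palindromes
        return True
    if seq[0] == seq[-1]:        # pattern "s.1 e.2 s.1": both ends bound at once
        return pal(seq[1:-1])    # term rewriting: reduce to <Pal e.2>
    return False                 # catch-all pattern "e.1": not a palindrome

assert pal("level") and not pal("refal")
```

Because Refal sequences can be opened at either end, each step of this bidirectional match is cheap, whereas simulating it on a cons list would mean walking to the last element on every step.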

Refal also includes a feature called the freezer to support efficient partial evaluation.

Refal can be applied to the processing and transformation of tree structures, similarly to XSLT.

Discussed on