Topic: Computing/Computer science (Page 2)

You are looking at all articles with the topic "Computing/Computer science". We found 35 matches.


🔗 Fitts's law

🔗 Computing 🔗 Cognitive science 🔗 Computing/Computer science 🔗 Human–Computer Interaction

Fitts's law (often cited as Fitts' law) is a predictive model of human movement primarily used in human–computer interaction and ergonomics. This scientific law predicts that the time required to rapidly move to a target area is a function of the ratio between the distance to the target and the width of the target. Fitts's law is used to model the act of pointing, either by physically touching an object with a hand or finger, or virtually, by pointing to an object on a computer monitor using a pointing device.

Fitts's law has been shown to apply under a variety of conditions: with many different limbs (hands, feet, the lower lip, head-mounted sights), manipulanda (input devices), physical environments (including underwater), and user populations (young, old, special educational needs, and drugged participants).
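
As a rough illustration, the sketch below encodes the Shannon formulation commonly used in HCI, MT = a + b·log2(D/W + 1). The article itself only states the ratio relationship, and the coefficients a and b here are illustrative placeholders that would normally be fitted to pointing data.

```python
import math

def movement_time(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
    """Predicted movement time (seconds) under the Shannon form of Fitts's law.

    `a` and `b` are device- and user-dependent constants fitted from experiments;
    the defaults here are placeholders for illustration only.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # difficulty in bits
    return a + b * index_of_difficulty

# A large, nearby target is predicted to be faster to hit than a small, distant one.
print(movement_time(distance=100, width=50))   # low index of difficulty
print(movement_time(distance=800, width=10))   # high index of difficulty
```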


🔗 Karatsuba Algorithm

🔗 Computing 🔗 Mathematics 🔗 Computing/Software 🔗 Computing/Computer science

The Karatsuba algorithm is a fast multiplication algorithm. It was discovered by Anatoly Karatsuba in 1960 and published in 1962. It reduces the multiplication of two n-digit numbers to at most $n^{\log_2 3} \approx n^{1.58}$ single-digit multiplications in general (and exactly $n^{\log_2 3}$ when n is a power of 2). It is therefore faster than the classical algorithm, which requires $n^2$ single-digit products. For example, the Karatsuba algorithm requires $3^{10}$ = 59,049 single-digit multiplications to multiply two 1024-digit numbers (n = 1024 = $2^{10}$), whereas the classical algorithm requires $(2^{10})^2$ = 1,048,576 (a speedup of 17.75 times).
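
To make the divide-and-conquer structure concrete, here is a minimal base-10 sketch in Python (an illustration, not an optimized library routine): each level replaces four sub-multiplications with three, which is where the log2 3 exponent comes from.

```python
def karatsuba(x: int, y: int) -> int:
    """Multiply two non-negative integers with Karatsuba's divide-and-conquer scheme."""
    # Base case: single-digit factors are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    # Split both numbers around the middle of the longer one.
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    # Three recursive multiplications instead of four.
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

assert karatsuba(1234, 5678) == 1234 * 5678
```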

The Karatsuba algorithm was the first multiplication algorithm asymptotically faster than the quadratic "grade school" algorithm. The Toom–Cook algorithm (1963) is a faster generalization of Karatsuba's method, and the Schönhage–Strassen algorithm (1971) is even faster, for sufficiently large n.


🔗 Flashsort

🔗 Computing 🔗 Computing/Software 🔗 Computing/Computer science

Flashsort is a distribution sorting algorithm showing linear computational complexity $O(n)$ for uniformly distributed data sets and relatively low additional memory requirements. The original work was published in 1998 by Karl-Dietrich Neubert.
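
A hedged sketch of the idea follows: keys are classified into roughly 0.43·n buckets by linear interpolation of their value between the minimum and maximum, distributed into their class regions, and finished with insertion sort. For clarity this version uses an auxiliary array instead of Neubert's in-place cycle permutation, and the class-count constant is a common heuristic rather than a fixed part of the algorithm.

```python
def flashsort(a: list) -> list:
    """Distribution sort along the lines of flashsort (simplified sketch)."""
    n = len(a)
    if n <= 1:
        return a
    lo, hi = min(a), max(a)
    if lo == hi:
        return a
    m = max(int(0.43 * n), 2)          # number of classes (common heuristic)
    scale = (m - 1) / (hi - lo)
    # Count how many keys fall into each class.
    counts = [0] * m
    for x in a:
        counts[int(scale * (x - lo))] += 1
    # Turn counts into end indices of each class region.
    bounds, total = [0] * m, 0
    for k in range(m):
        total += counts[k]
        bounds[k] = total
    # Distribute keys into their class regions (auxiliary array instead of the
    # in-place cycle permutation of the original algorithm).
    out = [None] * n
    for x in reversed(a):
        k = int(scale * (x - lo))
        bounds[k] -= 1
        out[bounds[k]] = x
    a[:] = out
    # Finish with insertion sort, which is fast on the nearly classified array.
    for i in range(1, n):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

assert flashsort([5, 3, 9, 1, 4]) == [1, 3, 4, 5, 9]
```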


🔗 List of unsolved problems in computer science

🔗 Computing 🔗 Computer science 🔗 Computing/Computer science

This article is a list of notable unsolved problems in computer science. A problem in computer science is considered unsolved when no solution is known, or when experts in the field disagree about proposed solutions.


🔗 Hofstadter's Law

🔗 Computing 🔗 Systems 🔗 Business 🔗 Computing/Software 🔗 Computing/Computer science 🔗 Engineering 🔗 Systems/Systems engineering

Hofstadter's law is a self-referential adage, coined by Douglas Hofstadter in his book Gödel, Escher, Bach: An Eternal Golden Braid (1979) to describe the widely experienced difficulty of accurately estimating the time it will take to complete tasks of substantial complexity:

Hofstadter's Law: It always takes longer than you expect, even when you take into account Hofstadter's Law.

The law is often cited by programmers in discussions of techniques to improve productivity, such as The Mythical Man-Month or extreme programming.


🔗 AI Winter

🔗 United States/U.S. Government 🔗 United States 🔗 Technology 🔗 Computing 🔗 Systems 🔗 Cognitive science 🔗 Linguistics 🔗 Computing/Computer science 🔗 Robotics 🔗 Transhumanism 🔗 Linguistics/Applied Linguistics 🔗 Systems/Cybernetics

In the history of artificial intelligence, an AI winter is a period of reduced funding and interest in artificial intelligence research. The term was coined by analogy to the idea of a nuclear winter. The field has experienced several hype cycles, followed by disappointment and criticism, followed by funding cuts, followed by renewed interest years or decades later.

The term first appeared in 1984 as the topic of a public debate at the annual meeting of AAAI (then called the "American Association for Artificial Intelligence"). The winter is a chain reaction that begins with pessimism in the AI community, followed by pessimism in the press, followed by a severe cutback in funding, followed by the end of serious research. At the meeting, Roger Schank and Marvin Minsky, two leading AI researchers who had survived the "winter" of the 1970s, warned the business community that enthusiasm for AI had spiraled out of control in the 1980s and that disappointment would certainly follow. Three years later, the billion-dollar AI industry began to collapse.

Hype is common in many emerging technologies, such as the railway mania or the dot-com bubble. The AI winter was a result of such hype, due to over-inflated promises by developers, unnaturally high expectations from end-users, and extensive promotion in the media. Despite the rise and fall of AI's reputation, it has continued to develop new and successful technologies. AI researcher Rodney Brooks would complain in 2002 that "there's this stupid myth out there that AI has failed, but AI is around you every second of the day." In 2005, Ray Kurzweil agreed: "Many observers still think that the AI winter was the end of the story and that nothing since has come of the AI field. Yet today many thousands of AI applications are deeply embedded in the infrastructure of every industry."

Enthusiasm and optimism about AI have increased since its low point in the early 1990s. Beginning about 2012, interest in artificial intelligence (and especially the sub-field of machine learning) from the research and corporate communities led to a dramatic increase in funding and investment.


🔗 Post-quantum cryptography: just in case

🔗 Computing 🔗 Computing/Software 🔗 Computing/Computer science 🔗 Cryptography 🔗 Cryptography/Computer science 🔗 Computing/Computer Security

Post-quantum cryptography (sometimes referred to as quantum-proof, quantum-safe or quantum-resistant) refers to cryptographic algorithms (usually public-key algorithms) that are thought to be secure against an attack by a quantum computer. As of 2019, this is not true for the most popular public-key algorithms, which can be efficiently broken by a sufficiently powerful quantum computer. The problem with currently popular algorithms is that their security relies on one of three hard mathematical problems: the integer factorization problem, the discrete logarithm problem or the elliptic-curve discrete logarithm problem. All of these problems can be easily solved on a sufficiently powerful quantum computer running Shor's algorithm. Even though current, publicly known, experimental quantum computers lack the processing power to break any real cryptographic algorithm, many cryptographers are designing new algorithms to prepare for a time when quantum computing becomes a threat. This work has gained greater attention from academics and industry through the PQCrypto conference series since 2006 and more recently through several workshops on Quantum Safe Cryptography hosted by the European Telecommunications Standards Institute (ETSI) and the Institute for Quantum Computing.

In contrast to the threat quantum computing poses to current public-key algorithms, most current symmetric cryptographic algorithms and hash functions are considered to be relatively secure against attacks by quantum computers. While the quantum Grover's algorithm does speed up attacks against symmetric ciphers, doubling the key size can effectively block these attacks. Thus post-quantum symmetric cryptography does not need to differ significantly from current symmetric cryptography. See section on symmetric-key approach below.
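
The key-doubling argument can be restated numerically: Grover's algorithm searches an unstructured space of 2^k keys in on the order of 2^(k/2) queries, so a k-bit symmetric key retains roughly k/2 bits of security against it. The toy calculation below (not from the article) simply spells that out.

```python
def grover_security_bits(key_bits: int) -> float:
    """Approximate security level, in bits, against a Grover-style key search."""
    # Grover reduces brute force over 2**k keys to about 2**(k/2) quantum queries.
    return key_bits / 2

for cipher, bits in [("AES-128", 128), ("AES-256", 256)]:
    print(f"{cipher}: ~{grover_security_bits(bits):.0f} bits of security against Grover's algorithm")
```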


🔗 Wirth's Law

🔗 Computing 🔗 Computing/Software 🔗 Computing/Computer science

Wirth's law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster.

The adage is named after Niklaus Wirth, who discussed it in his 1995 article "A Plea for Lean Software".


🔗 M4 (computer language)

🔗 Computing 🔗 Computing/Computer science

m4 is a general-purpose macro processor included in all UNIX-like operating systems, and is a component of the POSIX standard.

The language was designed by Brian Kernighan and Dennis Ritchie for the original versions of UNIX. It is an extension of an earlier macro processor m3, written by Ritchie for an unknown AP-3 minicomputer.

The macro preprocessor operates as a text-replacement tool. It is employed to re-use text templates, typically in computer programming applications, but also in text editing and text-processing applications. Most users require m4 as a dependency of GNU autoconf.
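
A minimal sketch of that text-replacement behaviour, driving the system m4 binary from a short Python script; it assumes an m4 executable is on PATH, which the POSIX requirement above makes a reasonable bet on UNIX-like systems.

```python
import subprocess

# Define a macro, then let m4 rewrite every later occurrence of its name.
# `dnl` discards the rest of its line, including the trailing newline.
source = """\
define(`GREETING', `Hello, world')dnl
GREETING from m4.
"""

result = subprocess.run(["m4"], input=source, capture_output=True, text=True, check=True)
print(result.stdout)  # prints: Hello, world from m4.
```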


🔗 Curry–Howard correspondence

🔗 Computing 🔗 Computer science 🔗 Mathematics 🔗 Computing/Software 🔗 Computing/Computer science

In programming language theory and proof theory, the Curry–Howard correspondence (also known as the Curry–Howard isomorphism or equivalence, or the proofs-as-programs and propositions- or formulae-as-types interpretation) is the direct relationship between computer programs and mathematical proofs.

It is a generalization of a syntactic analogy between systems of formal logic and computational calculi that was first discovered by the American mathematician Haskell Curry and logician William Alvin Howard. It is the link between logic and computation that is usually attributed to Curry and Howard, although the idea is related to the operational interpretation of intuitionistic logic given in various formulations by L. E. J. Brouwer, Arend Heyting and Andrey Kolmogorov (see Brouwer–Heyting–Kolmogorov interpretation) and Stephen Kleene (see Realizability). The relationship has been extended to include category theory as the three-way Curry–Howard–Lambek correspondence.
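
A small Lean 4 sketch (an illustration, not drawn from the article) shows the correspondence at its simplest: the same lambda term can be read as a polymorphic program and as a proof of the matching implication, and type-checking the term is checking the proof.

```lean
-- The K combinator, read twice under propositions-as-types.

-- As a program: a function that takes an α and a β and returns the α.
def const {α β : Type} (a : α) (b : β) : α := a

-- As a proof: the same term shape inhabits the proposition P → Q → P.
theorem p_implies_q_implies_p (P Q : Prop) : P → Q → P :=
  fun hp _ => hp
```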
