Topic: Systems
You are looking at all articles with the topic "Systems". We found 44 matches.
Abelian sandpile model
The Abelian sandpile model, also known as the Bak–Tang–Wiesenfeld model, was the first discovered example of a dynamical system displaying self-organized criticality. It was introduced by Per Bak, Chao Tang and Kurt Wiesenfeld in a 1987 paper.
The model is a cellular automaton. In its original formulation, each site on a finite grid has an associated value that corresponds to the slope of the pile. This slope builds up as "grains of sand" (or "chips") are randomly placed onto the pile, until the slope exceeds a specific threshold value, at which point that site collapses, transferring sand into the adjacent sites and increasing their slope. Bak, Tang, and Wiesenfeld considered the process of successive random placement of sand grains on the grid; each such placement of sand at a particular site may have no effect, or it may cause a cascading reaction that will affect many sites.
The model has since been studied on the infinite lattice, on other (non-square) lattices, and on arbitrary graphs (including directed multigraphs). It is closely related to the dollar game, a variant of the chip-firing game introduced by Biggs.
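As a minimal sketch, the standard two-dimensional toppling rule can be written in a few lines; the grid size, the number of random placements, and the function names below are illustrative choices, not a reference implementation.

```python
import numpy as np

def topple(grid, threshold=4):
    """Relax the pile: any site holding >= threshold grains topples, sending one
    grain to each of its four neighbours; grains pushed past the edge of the
    finite grid are lost."""
    grid = grid.copy()
    while np.any(grid >= threshold):
        for r, c in np.argwhere(grid >= threshold):
            grid[r, c] -= threshold
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < grid.shape[0] and 0 <= nc < grid.shape[1]:
                    grid[nr, nc] += 1
    return grid

# Successive random placements, as in the Bak-Tang-Wiesenfeld setup: each drop
# either does nothing visible or triggers an avalanche of topplings.
rng = np.random.default_rng(0)
pile = np.zeros((20, 20), dtype=int)
for _ in range(5000):
    pile[rng.integers(20), rng.integers(20)] += 1
    pile = topple(pile)
```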
Discussed on
- "Abelian sandpile model" | 2018-08-31 | 66 Upvotes 5 Comments
AI Winter
In the history of artificial intelligence, an AI winter is a period of reduced funding and interest in artificial intelligence research. The term was coined by analogy to the idea of a nuclear winter. The field has experienced several hype cycles, followed by disappointment and criticism, followed by funding cuts, followed by renewed interest years or decades later.
The term first appeared in 1984 as the topic of a public debate at the annual meeting of AAAI (then called the "American Association for Artificial Intelligence"). It is a chain reaction that begins with pessimism in the AI community, followed by pessimism in the press, followed by a severe cutback in funding, followed by the end of serious research. At the meeting, Roger Schank and Marvin Minsky—two leading AI researchers who had survived the "winter" of the 1970s—warned the business community that enthusiasm for AI had spiraled out of control in the 1980s and that disappointment would certainly follow. Three years later, the billion-dollar AI industry began to collapse.
Hype is common in many emerging technologies, such as the railway mania or the dot-com bubble. The AI winter was a result of such hype, due to over-inflated promises by developers, unnaturally high expectations from end-users, and extensive promotion in the media. Despite the rise and fall of AI's reputation, the field has continued to develop new and successful technologies. AI researcher Rodney Brooks would complain in 2002 that "there's this stupid myth out there that AI has failed, but AI is around you every second of the day." In 2005, Ray Kurzweil agreed: "Many observers still think that the AI winter was the end of the story and that nothing since has come of the AI field. Yet today many thousands of AI applications are deeply embedded in the infrastructure of every industry."
Enthusiasm and optimism about AI have generally increased since its low point in the early 1990s. Beginning around 2012, interest in artificial intelligence (and especially the sub-field of machine learning) from the research and corporate communities led to a dramatic increase in funding and investment.
Discussed on
- "AI Winter" | 2019-11-25 | 55 Upvotes 41 Comments
- "AI winter" | 2007-10-25 | 16 Upvotes 1 Comments
Ant Colony Optimization Algorithms
In computer science and operations research, the ant colony optimization algorithm (ACO) is a probabilistic technique for solving computational problems which can be reduced to finding good paths through graphs. Artificial ants represent multi-agent methods inspired by the behavior of real ants. The pheromone-based communication of biological ants is often the predominant paradigm used. Combinations of artificial ants and local search algorithms have become a method of choice for numerous optimization tasks involving some sort of graph, e.g., vehicle routing and internet routing. The burgeoning activity in this field has led to conferences dedicated solely to artificial ants, and to numerous commercial applications by specialized companies such as AntOptima.
As an example, ant colony optimization is a class of optimization algorithms modeled on the actions of an ant colony. Artificial 'ants' (e.g. simulation agents) locate optimal solutions by moving through a parameter space representing all possible solutions. Real ants lay down pheromones directing each other to resources while exploring their environment. The simulated 'ants' similarly record their positions and the quality of their solutions, so that in later simulation iterations more ants locate better solutions. One variation on this approach is the bees algorithm, which is more analogous to the foraging patterns of the honey bee, another social insect.
This algorithm is a member of the ant colony algorithms family, in swarm intelligence methods, and it constitutes a family of metaheuristic optimizations. Initially proposed by Marco Dorigo in 1992 in his PhD thesis, the first algorithm aimed to find an optimal path in a graph, based on the behavior of ants seeking a path between their colony and a source of food. The original idea has since diversified to solve a wider class of numerical problems, and as a result, several variants have emerged, drawing on various aspects of the behavior of ants. From a broader perspective, ACO performs a model-based search and shares some similarities with estimation of distribution algorithms.
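As an illustration of the pheromone mechanism described above, here is a minimal sketch applied to a tiny travelling-salesman instance; the parameter values, distance matrix, and function name are illustrative assumptions rather than Dorigo's original Ant System.

```python
import numpy as np

def ant_colony_tsp(dist, n_ants=20, n_iters=100, alpha=1.0, beta=2.0,
                   evaporation=0.5, q=1.0, seed=0):
    """Toy ACO for the symmetric TSP: alpha weights pheromone, beta weights the
    heuristic desirability 1/distance, evaporation is the per-iteration decay."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    pheromone = np.ones((n, n))
    heuristic = 1.0 / (dist + np.eye(n))   # eye() avoids division by zero on the diagonal
    best_tour, best_len = None, np.inf

    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:                      # build a tour city by city
                i = tour[-1]
                choices = np.array(sorted(unvisited))
                w = pheromone[i, choices] ** alpha * heuristic[i, choices] ** beta
                nxt = int(rng.choice(choices, p=w / w.sum()))
                tour.append(nxt)
                unvisited.remove(nxt)
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length

        pheromone *= 1.0 - evaporation            # pheromone evaporation
        for tour, length in tours:                # deposit pheromone on used edges
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                pheromone[a, b] += q / length
                pheromone[b, a] += q / length

    return best_tour, best_len

# Four cities with a symmetric distance matrix; the shortest tour has length 4.
dist = np.array([[0, 1, 2, 1],
                 [1, 0, 1, 2],
                 [2, 1, 0, 1],
                 [1, 2, 1, 0]], dtype=float)
print(ant_colony_tsp(dist))
```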
Discussed on
- "Ant Colony Optimization Algorithms" | 2023-07-20 | 13 Upvotes 1 Comments
- "Ant Colony Optimization Algorithms" | 2014-05-03 | 56 Upvotes 14 Comments
Bang Bang Control
In control theory, a bang–bang controller (2-step or on–off controller), also known as a hysteresis controller, is a feedback controller that switches abruptly between two states. These controllers may be realized in terms of any element that provides hysteresis. They are often used to control a plant that accepts a binary input, for example a furnace that is either completely on or completely off. Most common residential thermostats are bang–bang controllers. The Heaviside step function in its discrete form is an example of a bang–bang control signal. Due to the discontinuous control signal, systems that include bang–bang controllers are variable structure systems, and bang–bang controllers are thus variable structure controllers.
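A minimal sketch of such a controller in the style of a residential thermostat; the setpoint, the width of the hysteresis band, and the toy room model are illustrative assumptions.

```python
def bang_bang_step(temperature, heater_on, setpoint=20.0, hysteresis=0.5):
    """Two-state controller with a hysteresis band: the heater switches on below
    (setpoint - hysteresis) and off above (setpoint + hysteresis); inside the band
    it keeps its previous state, which prevents rapid on/off chatter."""
    if temperature < setpoint - hysteresis:
        return True
    if temperature > setpoint + hysteresis:
        return False
    return heater_on

# Crude simulation of a room that warms while the furnace is on and cools otherwise.
temp, heater = 15.0, False
for _ in range(200):
    heater = bang_bang_step(temp, heater)
    temp += 0.3 if heater else -0.1
```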
Discussed on
- "Bang Bang Control" | 2016-04-07 | 30 Upvotes 24 Comments
Bremermann's limit
Bremermann's limit, named after Hans-Joachim Bremermann, is a limit on the maximum rate of computation that can be achieved in a self-contained system in the material universe. It is derived from Einstein's mass–energy equivalence and the Heisenberg uncertainty principle, and is c²/h ≈ 1.36 × 10⁵⁰ bits per second per kilogram. This value is important when designing cryptographic algorithms, as it can be used to determine the minimum size of encryption keys or hash values required to create an algorithm that could never be cracked by a brute-force search.
For example, a computer with the mass of the entire Earth operating at Bremermann's limit could perform approximately 10⁷⁵ mathematical computations per second. If one assumes that a cryptographic key can be tested with only one operation, then a typical 128-bit key could be cracked in under 10⁻³⁶ seconds. However, a 256-bit key (which is already in use in some systems) would take about two minutes to crack. Using a 512-bit key would increase the cracking time to roughly 10⁷² years, without increasing the time for encryption by more than a constant factor (depending on the encryption algorithms used).
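The figures quoted above follow from straightforward arithmetic; a quick sketch, assuming one operation per key tested and an Earth mass of roughly 5.97 × 10²⁴ kg:

```python
c = 2.998e8            # speed of light, m/s
h = 6.626e-34          # Planck constant, J*s
earth_mass = 5.972e24  # kg, for "a computer with the mass of the entire Earth"

limit_per_kg = c**2 / h                      # ~1.36e50 bits per second per kilogram
ops_per_second = limit_per_kg * earth_mass   # ~8e74, i.e. on the order of 1e75 operations/s

for bits in (128, 256, 512):
    seconds = 2**bits / ops_per_second       # one operation per key tested
    print(f"{bits}-bit key: {seconds:.3g} s  (~{seconds / 3.156e7:.3g} years)")
```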
The limit has been further analysed in later literature as the maximum rate at which a system with energy spread ΔE can evolve into a state orthogonal to, and hence distinguishable from, its initial state. In particular, Margolus and Levitin have shown that a quantum system with average energy E takes at least time h/(4E) to evolve into an orthogonal state. However, it has been shown that access to quantum memory in principle allows computational algorithms that require an arbitrarily small amount of energy/time per elementary computation step.
Discussed on
- "Bremermann's limit" | 2016-04-22 | 57 Upvotes 21 Comments
Burning Ship Fractal
The Burning Ship fractal, first described and created by Michael Michelitsch and Otto E. Rössler in 1992, is generated by iterating the function:

z_{n+1} = (|Re(z_n)| + i |Im(z_n)|)² + c

in the complex plane, which will either escape or remain bounded. The difference between this calculation and that for the Mandelbrot set is that the real and imaginary components are set to their respective absolute values before squaring at each iteration. The mapping is non-analytic because its real and imaginary parts do not obey the Cauchy–Riemann equations.
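A minimal escape-time sketch of that iteration; the viewing window, image resolution, and escape radius of 2 are conventional but illustrative choices.

```python
import numpy as np

def burning_ship(width=600, height=400, max_iter=100,
                 re_min=-2.5, re_max=1.5, im_min=-2.0, im_max=1.0):
    """Escape-time counts for z_{n+1} = (|Re z_n| + i|Im z_n|)^2 + c."""
    re = np.linspace(re_min, re_max, width)
    im = np.linspace(im_min, im_max, height)
    c = re[np.newaxis, :] + 1j * im[:, np.newaxis]   # one value of c per pixel
    z = np.zeros_like(c)
    counts = np.zeros(c.shape, dtype=int)            # 0 means "never escaped"
    for n in range(1, max_iter + 1):
        active = counts == 0
        # The only difference from the Mandelbrot iteration: take absolute values
        # of the real and imaginary parts before squaring.
        z[active] = (np.abs(z[active].real) + 1j * np.abs(z[active].imag)) ** 2 + c[active]
        escaped = active & (np.abs(z) > 2.0)
        counts[escaped] = n
    return counts

counts = burning_ship()   # e.g. feed into matplotlib's imshow to render the image
```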
Discussed on
- "Burning Ship Fractal" | 2016-09-26 | 252 Upvotes 65 Comments
Capability Immaturity Model
Capability Immaturity Model (CIMM) in software engineering is a parody acronym, a semi-serious effort to provide a contrast to the Capability Maturity Model (CMM). The Capability Maturity Model is a five point scale of capability in an organization, ranging from random processes at level 1 to fully defined, managed and optimized processes at level 5. The ability of an organization to carry out its mission on time and within budget is claimed to improve as the CMM level increases.
The "Capability Im-Maturity Model" asserts that organizations can and do occupy levels below CMM level 1. An original article by Capt. Tom Schorsch USAF as part of a graduate project at the Air Force Institute of Technology provides the definitions for CIMM. He cites Prof. Anthony Finkelstein's ACM paper as an inspiration. The article describes situations that arise in dysfunctional organizations. Such situations are reportedly common in organizations of all kinds undertaking software development, i.e. they are really characterizations of the management of specific projects, since they can occur even in organizations with positive CMM levels.
Kik Piney, citing the original authors, later adapted the model into a somewhat satirical version that attracted a number of followers who felt that it was quite true to their experience.
Discussed on
- "Capability Immaturity Model" | 2020-03-02 | 124 Upvotes 19 Comments
Claude Shannon
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory". Shannon is noted for having founded information theory with a landmark paper, "A Mathematical Theory of Communication", that he published in 1948.
He is also well known for founding digital circuit design theory in 1937, when—as a 21-year-old master's degree student at the Massachusetts Institute of Technology (MIT)—he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct any logical numerical relationship. Shannon contributed to the field of cryptanalysis for national defense during World War II, including his fundamental work on codebreaking and secure telecommunications.
Discussed on
- "Claude Shannon" | 2009-12-03 | 25 Upvotes 12 Comments
Clifford A. Pickover
Clifford Alan Pickover (born August 15, 1957) is an American author, editor, and columnist in the fields of science, mathematics, science fiction, innovation, and creativity. For many years, he was employed at the IBM Thomas J. Watson Research Center in Yorktown Heights, New York, where he was Editor-in-Chief of the IBM Journal of Research and Development. He has been granted more than 500 U.S. patents, is an elected Fellow of the Committee for Skeptical Inquiry, and is the author of more than 50 books, translated into more than a dozen languages.
Discussed on
- "Clifford A. Pickover" | 2019-09-22 | 17 Upvotes 1 Comments
Computer
A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A "complete" computer including the hardware, the operating system (main software), and peripheral equipment required and used for "full" operation can be referred to as a computer system. This term may also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster.
Computers are used as control systems for a wide variety of industrial and consumer devices. This includes simple special purpose devices like microwave ovens and remote controls, factory devices such as industrial robots and computer-aided design, and also general purpose devices like personal computers and mobile devices such as smartphones. The Internet is run on computers and it connects hundreds of millions of other computers and their users.
Early computers were only conceived as calculating devices. Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically ever since then, with MOS transistor counts increasing at a rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the late 20th to early 21st centuries.
Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a metal-oxide-semiconductor (MOS) microprocessor, along with some type of computer memory, typically MOS semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joystick, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source and they enable the result of operations to be saved and retrieved.