Topic: Computing (Page 25)

You are looking at all articles with the topic "Computing". We found 496 matches.

Hint: To view all topics, click here. To see the most popular topics, click here instead.

πŸ”— Rule of Three (Computer Programming)

πŸ”— Computing πŸ”— Computing/Software

Rule of three ("Three strikes and you refactor") is a code refactoring rule of thumb to decide when similar pieces of code should be refactored to avoid duplication. It states that two instances of similar code don't require refactoring, but when similar code is used three times, it should be extracted into a new procedure. The rule was popularised by Martin Fowler in Refactoring and attributed to Don Roberts.

Duplication is considered a bad practice in programming because it makes the code harder to maintain. When the rule encoded in a replicated piece of code changes, whoever maintains the code has to find and change every copy consistently.

However, choosing an appropriate design to avoid duplication benefits from having more examples in which a pattern can be seen. Refactoring prematurely risks selecting the wrong abstraction, which can produce worse code as new requirements emerge and will itself eventually need to be refactored again.

The rule implies that with three copies the cost of maintenance certainly outweighs the cost of refactoring (and of a potentially poor abstraction), whereas with only two copies it may or may not.
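
As a minimal, hypothetical Python sketch of the rule (the function names and the 10% discount are invented for illustration), the same pricing logic is tolerated in two places but extracted once a third copy appears:

```python
# Before: the same "apply a 10% discount and round" rule, copied three times.
def book_price(base):
    return round(base * 0.9, 2)

def ebook_price(base):
    return round(base * 0.9, 2)

def audiobook_price(base):   # third copy: time to refactor
    return round(base * 0.9, 2)

# After: the rule lives in one procedure, so a change is made exactly once.
def discounted_price(base, discount=0.10):
    """Shared pricing rule extracted once the third duplicate appeared."""
    return round(base * (1 - discount), 2)
```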

πŸ”— Host Protected Area

πŸ”— Computing πŸ”— Computer Security πŸ”— Computer Security/Computing

The host protected area (HPA) is an area of a hard drive or solid-state drive that is not normally visible to an operating system. It was first introduced in the ATA-4 standard CXV (T13) in 2001.

πŸ”— Delphi

πŸ”— Computing πŸ”— Computing/Software

Delphi is an event-driven programming language based on Object Pascal and an associated integrated development environment (IDE) for rapid application development of desktop, mobile, web, and console software, currently developed and maintained by Embarcadero Technologies.

Delphi's compilers generate native code for Microsoft Windows, macOS, iOS, Android and Linux (x64 only). Since 2016, there have been new releases of Delphi every six months, with new platforms being added approximately every second release.

Delphi includes a code editor, a visual designer, an integrated debugger, a source code control component, and support for third-party plugins. The code editor features Code Insight (code completion), Error Insight (real-time error-checking), and refactoring. The visual forms designer has traditionally used Visual Component Library (VCL) for native Windows development, but the FireMonkey (FMX) platform was later added for cross-platform development. Database support in Delphi is very strong. A Delphi project of a million lines of code can compile in a few seconds – one benchmark compiled 170,000 lines per second.

Delphi was originally developed by Borland as a rapid application development tool for Windows and as the successor of Turbo Pascal. Delphi added full object-oriented programming to the existing language, and the language has since grown to include generics, anonymous methods, and native Component Object Model (COM) support. In 2006, Borland's developer tools section was transferred to a wholly owned subsidiary known as CodeGear, which was sold to Embarcadero Technologies in 2008. In 2015, Embarcadero was purchased by Idera Software, but the Embarcadero mark was retained for the developer tools division.

Delphi and its C++ counterpart, C++Builder, are interoperable. They share many core components, notably the IDE, VCL, and much of the runtime library. In addition, they can be used jointly in a project. For example, C++Builder 6 and later can consume Delphi source code and C++ in one project, while packages compiled with C++Builder can be used from within Delphi. In 2007, the products were released jointly as RAD Studio, a shared host for Delphi and C++Builder, which can be purchased with either or both.

πŸ”— Modelica

πŸ”— Computing

Modelica is an object-oriented, declarative, multi-domain modeling language for component-oriented modeling of complex systems, e.g., systems containing mechanical, electrical, electronic, hydraulic, thermal, control, electric power or process-oriented subcomponents. The free Modelica language is developed by the non-profit Modelica Association. The Modelica Association also develops the free Modelica Standard Library that contains about 1360 generic model components and 1280 functions in various domains, as of version 3.2.1.

πŸ”— Program slicing

πŸ”— Computing

In computer programming, program slicing is the computation of the set of program statements, the program slice, that may affect the values at some point of interest, referred to as a slicing criterion. Program slicing can be used in debugging to locate the source of errors more easily. Other applications of slicing include software maintenance, optimization, program analysis, and information flow control.

Slicing techniques have seen rapid development since the original definition by Mark Weiser. At first, slicing was only static, i.e., applied to the source code with no information other than the source code. Bogdan Korel and Janusz Laski introduced dynamic slicing, which works on a specific execution of the program (for a given execution trace). Other forms of slicing exist, for instance path slicing.
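
As an illustration, here is a toy static backward slicer in Python (hypothetical code, not tied to any particular slicing tool): each statement of a straight-line program is recorded as the variable it defines and the variables it uses, and the slice for a criterion variable is computed by walking backwards over the data dependences.

```python
# Toy static backward slicer for straight-line code (data dependences only).
# Each statement is (line_no, variable_defined, variables_used).
program = [
    (1, "n",     set()),                 # n = int(input())
    (2, "total", set()),                 # total = 0
    (3, "count", set()),                 # count = 0
    (4, "total", {"total", "n"}),        # total = total + n
    (5, "count", {"count"}),             # count = count + 1
    (6, "avg",   {"total", "count"}),    # avg = total / count
]

def backward_slice(stmts, criterion_var):
    """Return the lines that may affect criterion_var at the end of the program."""
    relevant = {criterion_var}
    slice_lines = set()
    for line, defined, used in reversed(stmts):
        if defined in relevant:
            slice_lines.add(line)
            relevant.discard(defined)    # this definition satisfies the demand
            relevant |= used             # ...and its inputs become relevant
    return sorted(slice_lines)

print(backward_slice(program, "count"))  # [3, 5]  - lines 1, 2, 4 and 6 are sliced away
print(backward_slice(program, "avg"))    # [1, 2, 3, 4, 5, 6]
```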

πŸ”— BΓ©lΓ‘dy's Anomaly

πŸ”— Computing

In computer storage, BΓ©lΓ‘dy's anomaly is the phenomenon in which increasing the number of page frames results in an increase in the number of page faults for certain memory access patterns. The phenomenon is commonly experienced with the first-in first-out (FIFO) page replacement algorithm: under FIFO, the number of page faults may or may not increase as page frames are added, whereas under the optimal algorithm and stack-based algorithms such as LRU, the number of page faults never increases as page frames are added. LΓ‘szlΓ³ BΓ©lΓ‘dy demonstrated this in 1969.

In common computer memory management, information is loaded in fixed-size chunks, each of which is referred to as a page. Main memory can hold only a limited number of pages at a time and requires a frame for each page it holds. A page fault occurs when a requested page is not in main memory and must be loaded from disk.

When a page fault occurs and all frames are in use, one must be cleared to make room for the new page. A simple algorithm is FIFO: whichever page has been in the frames the longest is the one that is cleared. Until BΓ©lΓ‘dy's anomaly was demonstrated, it was believed that an increase in the number of page frames would always result in the same number of or fewer page faults.
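
The anomaly is easy to reproduce. The following Python sketch (an illustrative simulation, not from the article) runs FIFO replacement on the classic reference string and shows the fault count rising from 9 to 10 when a fourth frame is added:

```python
from collections import deque

def fifo_page_faults(reference_string, num_frames):
    """Count page faults under FIFO replacement with a fixed number of frames."""
    frames = deque()              # oldest resident page sits at the left
    faults = 0
    for page in reference_string:
        if page not in frames:
            faults += 1
            if len(frames) == num_frames:
                frames.popleft()  # evict the page that has been resident longest
            frames.append(page)
    return faults

refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
print(fifo_page_faults(refs, 3))  # 9 page faults with 3 frames
print(fifo_page_faults(refs, 4))  # 10 page faults with 4 frames: more memory, more faults
```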

πŸ”— Phreaking

πŸ”— Computing πŸ”— Telecommunications πŸ”— Computer Security πŸ”— Computer Security/Computing

Phreaking is a slang term coined to describe the activity of a culture of people who study, experiment with, or explore telecommunication systems, such as equipment and systems connected to public telephone networks. The term phreak is a sensational spelling of the word freak with the ph- from phone, and may also refer to the use of various audio frequencies to manipulate a phone system. Phreak, phreaker, or phone phreak are names used for and by individuals who participate in phreaking.

The term first referred to groups who had reverse engineered the system of tones used to route long-distance calls. By re-creating these tones, phreaks could switch calls from the phone handset, allowing free calls to be made around the world. To ease the creation of these tones, electronic tone generators known as blue boxes became a staple of the phreaker community. This community included future Apple Inc. cofounders Steve Jobs and Steve Wozniak.
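
As a rough illustration of how simple the in-band signaling was to reproduce, the following Python sketch (hypothetical; the file name, duration, and amplitude are arbitrary choices) synthesizes the 2600 Hz tone that signaled an idle long-distance trunk:

```python
import math
import struct
import wave

SAMPLE_RATE = 8000   # telephone-quality sampling rate
DURATION = 1.0       # seconds of tone
FREQ = 2600.0        # in-band supervisory frequency on old long-distance trunks

# Generate a half-amplitude 16-bit sine wave at 2600 Hz.
samples = [
    int(32767 * 0.5 * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE))
    for n in range(int(SAMPLE_RATE * DURATION))
]

# Write the tone to a mono WAV file.
with wave.open("tone_2600hz.wav", "w") as wav:
    wav.setnchannels(1)      # mono
    wav.setsampwidth(2)      # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(b"".join(struct.pack("<h", s) for s in samples))
```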

The blue box era came to an end with the ever-increasing use of computerized phone systems which allowed telecommunication companies to discontinue the use of in-band signaling for call routing purposes. Instead, dialing information was sent on a separate channel which was inaccessible to the telecom customer. By the 1980s, most of the public switched telephone network (PSTN) in the US and Western Europe had adopted the SS7 system which uses out-of-band signaling for call control (and which is still in use to this day). Phreaking has since become closely linked with computer hacking.

πŸ”— Harvard Mark I

πŸ”— Computing πŸ”— Computing/Early computers

The IBM Automatic Sequence Controlled Calculator (ASCC), called Mark I by Harvard University's staff, was a general-purpose electromechanical computer that was used in the war effort during the last part of World War II.

One of the first programs to run on the Mark I was initiated on 29 March 1944 by John von Neumann. At that time, von Neumann was working on the Manhattan Project, and needed to determine whether implosion was a viable choice to detonate the atomic bomb that would be used a year later. The Mark I also computed and printed mathematical tables, which had been the initial goal of British inventor Charles Babbage for his "analytical engine".

The Mark I was disassembled in 1959, but portions of it are displayed in the Science Center as part of the Harvard Collection of Historical Scientific Instruments. Other sections of the original machine were transferred to IBM and the Smithsonian Institution.

πŸ”— N8VEM – Homebrew Computing Project

πŸ”— Computing πŸ”— Computing/Computer hardware

N8VEM was a homebrew computing project. It featured a variety of free and open hardware and software. N8VEM builders made their own homebrew computer systems for themselves and shared their experiences with other homebrew computer hobbyists. N8VEM homebrew computer components were made in the style of vintage computers of the mid-to-late 1970s and early 1980s, using a mix of classic and modern technologies, and were designed with ease of amateur assembly in mind.

In November 2015 the N8VEM project was ended by its creator Andrew Lynch and the community reconvened under the new name of Retrobrew Computers.

πŸ”— WarGames was released today 40 years ago

πŸ”— United States πŸ”— Video games πŸ”— Computing πŸ”— Film πŸ”— Military history πŸ”— Military history/North American military history πŸ”— Military history/United States military history πŸ”— Film/American cinema πŸ”— United States/Film - American cinema πŸ”— Science Fiction πŸ”— Computer Security πŸ”— Computer Security/Computing πŸ”— Military history/Cold War πŸ”— Cold War πŸ”— United States/Washington - Seattle πŸ”— United States/Washington πŸ”— Film/War films πŸ”— Military history/War films

WarGames is a 1983 American science fiction techno-thriller film written by Lawrence Lasker and Walter F. Parkes and directed by John Badham. The film, which stars Matthew Broderick, Dabney Coleman, John Wood, and Ally Sheedy, follows David Lightman (Broderick), a young hacker who unwittingly accesses a United States military supercomputer programmed to simulate, predict and execute nuclear war against the Soviet Union.

WarGames was a critical and commercial success, grossing $125 million worldwide against a $12 million budget. The film was nominated for three Academy Awards.
