Topic: computing (Page 30)

You are looking at all articles with the topic "computing". We found 496 matches.

Hint: To view all topics, click here. To see the most popular topics, click here instead.

πŸ”— Halt and Catch Fire

πŸ”— United States πŸ”— Computing πŸ”— Television πŸ”— United States/American television

Halt and Catch Fire is an American period drama television series created by Christopher Cantwell and Christopher C. Rogers. It aired on the cable network AMC in the United States from June 1, 2014, to October 14, 2017, spanning four seasons and 40 episodes. Taking place over a period of more than ten years, the series depicts a fictionalized insider's view of the personal computer revolution of the 1980s and the growth of the World Wide Web in the early 1990s. The show's title refers to the machine code instruction Halt and Catch Fire (HCF), the execution of which would cause the computer's central processing unit to stop working (catch fire being a humorous exaggeration).

In season one, the company Cardiff Electric makes its first foray into personal computing, with entrepreneur Joe MacMillan (Lee Pace) running a project to build an IBM PC clone with the help of computer engineer Gordon Clark (Scoot McNairy) and prodigy programmer Cameron Howe (Mackenzie Davis). Seasons two and three shift focus to a startup company, the online community Mutiny, that is headed by Cameron and Gordon's wife Donna (Kerry BishΓ©), while Joe ventures out on his own. The fourth and final season focuses on competing web search engines involving all the principal characters.

Halt and Catch Fire marked Cantwell's and Rogers's first jobs in television. They wrote the pilot hoping to use it to secure jobs as writers in the industry but instead landed a series of their own from AMC. The story was inspired by Cantwell's childhood in the Silicon Prairie of Dallas–Fort Worth, where his father worked as a software salesman, and the creators' subsequent research into Texas's role in personal computing innovations of the 1980s. Filmed in the Atlanta, Georgia, area and produced by the network, the series is set in the Silicon Prairie for its first two seasons and Silicon Valley for its latter two.

Halt and Catch Fire debuted to generally favorable reviews, though many reviewers initially found it derivative of other series such as Mad Men. In each subsequent season, the series grew in acclaim, and by the time it concluded, critics considered it among the best shows of the 2010s. Despite its critical reception, the series experienced low viewership ratings throughout its run, with only the first episode surpassing oneΒ million viewers for its initial broadcast.

πŸ”— Mark V. Shaney

πŸ”— Computing πŸ”— Computing/Software πŸ”— Linguistics πŸ”— Computing/Computer science πŸ”— Linguistics/Applied Linguistics

Mark V. Shaney is a synthetic Usenet user whose postings in the net.singles newsgroup were generated by Markov chain techniques, based on text from other postings. The username is a play on the words "Markov chain". Many readers were fooled into thinking that the quirky, sometimes uncannily topical posts were written by a real person.

The system was designed by Rob Pike with coding by Bruce Ellis. Don P. Mitchell wrote the Markov chain code, initially demonstrating it to Pike and Ellis using the Tao Te Ching as a basis. They chose to apply it to the net.singles netnews group.
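The Markov chain technique described above can be sketched as follows. This is a minimal illustrative model, not Mitchell's original code; it assumes the classic two-word-prefix scheme, where each pair of consecutive words maps to the list of words seen to follow that pair, and generation walks the table by picking random successors.

```cpp
// Minimal sketch of a word-pair Markov chain text generator in the
// style of Mark V. Shaney (illustrative, not the original code).
#include <map>
#include <random>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

using State = std::pair<std::string, std::string>;
using Chain = std::map<State, std::vector<std::string>>;

// Build the successor table: each two-word prefix maps to the list
// of words observed to follow it in the training text.
Chain train(const std::string& text) {
    Chain chain;
    std::istringstream in(text);
    std::string w1, w2, w3;
    if (in >> w1 >> w2) {
        while (in >> w3) {
            chain[{w1, w2}].push_back(w3);
            w1 = w2;
            w2 = w3;
        }
    }
    return chain;
}

// Emit up to maxWords additional words, starting from the given
// two-word state and following randomly chosen successors.
std::string generate(const Chain& chain, State state, int maxWords,
                     unsigned seed = 0) {
    std::mt19937 rng(seed);
    std::string out = state.first + " " + state.second;
    for (int i = 0; i < maxWords; ++i) {
        auto it = chain.find(state);
        if (it == chain.end()) break;   // dead end: no known successor
        const auto& successors = it->second;
        std::uniform_int_distribution<size_t> pick(0, successors.size() - 1);
        const std::string& w = successors[pick(rng)];
        out += " " + w;
        state = {state.second, w};
    }
    return out;
}
```

Trained on a large enough corpus (Shaney's used other net.singles postings), the output is locally plausible because every adjacent word pair occurred in the source, while longer spans drift in the uncanny way readers found convincing.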

πŸ”— IBM Parallel Sysplex

πŸ”— Computing πŸ”— Computing/Computer hardware πŸ”— Computing/Software

In computing, a Parallel Sysplex is a cluster of IBM mainframes acting together as a single system image with z/OS. Used for disaster recovery, Parallel Sysplex combines data sharing and parallel computing to allow a cluster of up to 32 systems to share a workload for high performance and high availability.

πŸ”— Wetware computer

πŸ”— Computing

A wetware computer is an organic computer (also known as an artificial organic brain or a neurocomputer) composed of organic material such as living neurons. Wetware computers composed of neurons differ from conventional computers in that, owing to the dynamic nature of neurons, they are thought to be capable of "thinking for themselves" in a limited way. While wetware is still largely conceptual, there has been limited success with construction and prototyping, which has served as a proof of the concept's potential application to computing in the future. The most notable prototypes stem from the research of biological engineer William Ditto during his time at the Georgia Institute of Technology. His construction in 1999 of a simple neurocomputer, built from leech neurons and capable of basic addition, was a significant milestone for the concept and a primary example driving interest in these artificially constructed, but still organic, brains.

πŸ”— Kensington Security Slot

πŸ”— Computing πŸ”— Computing/Computer hardware πŸ”— Computing/Computer Security

A Kensington Security Slot (also called a K-Slot or Kensington lock) is part of an anti-theft system designed in the early 1990s and patented by Kryptonite in 1999–2000, assigned to Schlage in 2002, and since 2005 owned and marketed by Kensington Computer Products Group, a division of ACCO Brands.

πŸ”— Ted Nelson

πŸ”— Biography πŸ”— Internet πŸ”— Computing πŸ”— Philosophy πŸ”— Biography/science and academia πŸ”— Philosophy/Philosophers πŸ”— Sociology πŸ”— University of Oxford

Theodor Holm Nelson (born June 17, 1937) is an American pioneer of information technology, philosopher, and sociologist. He coined the terms hypertext and hypermedia in 1963 and published them in 1965. Nelson also coined the terms transclusion, virtuality, intertwingularity (in Literary Machines), and teledildonics. According to a 1997 Forbes profile, Nelson "sees himself as a literary romantic, like a Cyrano de Bergerac, or 'the Orson Welles of software.'"

πŸ”— 86-DOS

πŸ”— Computing πŸ”— Computing/Software

86-DOS is a discontinued operating system developed and marketed by Seattle Computer Products (SCP) for its Intel 8086-based computer kit. Initially known as QDOS (Quick and Dirty Operating System), the name was changed to 86-DOS once SCP started licensing the operating system in 1980.

86-DOS had a command structure and application programming interface that imitated that of Digital Research's CP/M operating system, which made it easy to port programs from the latter. The system was licensed and then purchased by Microsoft and developed further as MS-DOS and PCΒ DOS.

πŸ”— C++0x (upcoming C++ standard, includes lambda functions)

πŸ”— Computing πŸ”— Computing/Software πŸ”— C/C++ πŸ”— C/C++/C++

C++11 is a version of the standard for the programming language C++. It was approved by the International Organization for Standardization (ISO) on 12 August 2011, replacing C++03; it was superseded by C++14 on 18 August 2014 and later by C++17. The name follows the tradition of naming language versions by the publication year of the specification, though it was formerly named C++0x because it was expected to be published before 2010.

Although one of the design goals was to prefer changes to the libraries over changes to the core language, C++11 does make several additions to the core language. Areas of the core language that were significantly improved include multithreading support, generic programming support, uniform initialization, and performance. Significant changes were also made to the C++ Standard Library, incorporating most of the C++ Technical Report 1 (TR1) libraries, except the library of mathematical special functions.
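As a brief illustration (a sketch of my own, not an example from the standard text), the following function combines a few of the core-language additions mentioned above and in the article title: uniform (brace) initialization, a lambda expression, `auto` type deduction, and the range-based for loop.

```cpp
#include <vector>

// Illustrative C++11 snippet: brace initialization, a lambda stored
// via auto, and a range-based for loop.
int sum_of_squares() {
    std::vector<int> v{1, 2, 3, 4};             // uniform (brace) initialization
    auto square = [](int x) { return x * x; };  // lambda expression
    int total = 0;
    for (int x : v)                             // range-based for loop
        total += square(x);
    return total;                               // 1 + 4 + 9 + 16 = 30
}
```

None of these constructs compile under C++03; all four were core-language changes introduced by C++11.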

C++11 was published as ISO/IEC 14882:2011 in September 2011 and is available for a fee. The working draft most similar to the published C++11 standard is N3337, dated 16 January 2012; it has only editorial corrections from the C++11 standard.

πŸ”— MONIAC – Monetary National Income Analogue Computer

πŸ”— Computing πŸ”— Economics πŸ”— Computing/Early computers

The MONIAC (Monetary National Income Analogue Computer), also known as the Phillips Hydraulic Computer and the Financephalograph, was created in 1949 by the New Zealand economist Bill Phillips (William Phillips) to model the national economic processes of the United Kingdom, while Phillips was a student at the London School of Economics (LSE). The MONIAC was an analogue computer which used fluidic logic to model the workings of an economy. The MONIAC name may have been suggested by an association of money and ENIAC, an early electronic digital computer.

πŸ”— Dadda Multiplier

πŸ”— Computing πŸ”— Computing/Computer hardware

The Dadda multiplier is a hardware binary multiplier design invented by computer scientist Luigi Dadda in 1965. It uses a selection of full and half adders to sum the partial products in stages (the Dadda tree or Dadda reduction) until two numbers are left. The design is similar to the Wallace multiplier, but the different reduction tree reduces the required number of gates (for all but the smallest operand sizes) and makes it slightly faster (for all operand sizes).

Dadda and Wallace multipliers have the same three steps for two bit strings w₁ and wβ‚‚ of lengths ℓ₁ and β„“β‚‚ respectively:

  1. Multiply (logical AND) each bit of w₁ by each bit of wβ‚‚, yielding ℓ₁ Β· β„“β‚‚ results, grouped by weight in columns.
  2. Reduce the number of partial products by stages of full and half adders until we are left with at most two bits of each weight.
  3. Add the final result with a conventional adder.

As with the Wallace multiplier, the multiplication products of the first step carry different weights reflecting the magnitude of the original bit values in the multiplication. For example, the product of bits aβ‚™bβ‚˜ has weight n + m.

Unlike Wallace multipliers that reduce as much as possible on each layer, Dadda multipliers attempt to minimize the number of gates used, as well as input/output delay. Because of this, Dadda multipliers have a less expensive reduction phase, but the final numbers may be a few bits longer, thus requiring slightly bigger adders.
