Topic: Transhumanism (Page 2)

You are looking at all articles with the topic "Transhumanism". We found 18 matches.

Hint: To view all topics, click here. To see the most popular topics, click here instead.

πŸ”— Drexler–Smalley debate on molecular nanotechnology

πŸ”— History of Science πŸ”— Transhumanism

The Drexler–Smalley debate on molecular nanotechnology was a public dispute between K. Eric Drexler, the originator of the conceptual basis of molecular nanotechnology, and Richard Smalley, a recipient of the 1996 Nobel Prize in Chemistry for the discovery of the nanomaterial buckminsterfullerene. The dispute concerned the feasibility of constructing molecular assemblers: molecular machines that could robotically assemble molecular materials and devices by manipulating individual atoms or molecules. The concept of molecular assemblers was central to Drexler's conception of molecular nanotechnology, but Smalley argued that fundamental physical principles would prevent them from ever being possible. Each also accused the other of advancing a conception of nanotechnology that harmed public perception of the field and threatened continued public support for nanotechnology research.

The debate was carried out from 2001 to 2003 through a series of published articles and open letters. It began with a 2001 article by Smalley in Scientific American, which was followed by a rebuttal published by Drexler and coworkers later that year, and two open letters by Drexler in early 2003. The debate was concluded in late 2003 in a "Point–Counterpoint" feature in Chemical & Engineering News in which both parties participated.

The debate has often been cited in the history of nanotechnology because of the fame of its participants and its commentary on both the technical and social aspects of nanotechnology. It has also been widely criticized for its adversarial tone, with Drexler accusing Smalley of publicly misrepresenting his work and Smalley accusing Drexler of failing to understand basic science, leading some commentators to characterize the tone of the debate as similar to "a pissing match" and "reminiscent of [a] Saturday Night Live sketch".

πŸ”— Possible explanations for the slow progress of AI research

πŸ”— Computing πŸ”— Computer science πŸ”— Science Fiction πŸ”— Cognitive science πŸ”— Robotics πŸ”— Transhumanism πŸ”— Software πŸ”— Software/Computing πŸ”— Futures studies

Artificial general intelligence (AGI) is the hypothetical intelligence of a machine that has the capacity to understand or learn any intellectual task that a human being can. It is a primary goal of some artificial intelligence research and a common topic in science fiction and futures studies. AGI can also be referred to as strong AI, full AI, or general intelligent action. (Some academic sources reserve the term "strong AI" for machines that can experience consciousness.)

Some authorities emphasize a distinction between strong AI and applied AI (also called narrow AI or weak AI): the use of software to study or accomplish specific problem-solving or reasoning tasks. Weak AI, in contrast to strong AI, does not attempt to perform the full range of human cognitive abilities.

As of 2017, over forty organizations were doing research on AGI.

πŸ”— Indefinite lifespan

πŸ”— Medicine πŸ”— Biology πŸ”— Skepticism πŸ”— Transhumanism πŸ”— Alternative Views πŸ”— Guild of Copy Editors πŸ”— Alternative medicine πŸ”— Longevity

Life extension is the idea of extending the human lifespan, either modestly – through improvements in medicine – or dramatically, by increasing the maximum lifespan beyond its generally accepted limit of 125 years. The ability to achieve such dramatic changes, however, does not currently exist.

Some researchers in this area, and "life extensionists", "immortalists" or "longevists" (those who wish to achieve longer lives themselves), believe that future breakthroughs in tissue rejuvenation, stem cells, regenerative medicine, molecular repair, gene therapy, pharmaceuticals, and organ replacement (such as with artificial organs or xenotransplantations) will eventually enable humans to have indefinite lifespans (agerasia) through complete rejuvenation to a healthy youthful condition. The ethical ramifications, if life extension becomes a possibility, are debated by bioethicists.

The sale of purported anti-aging products such as supplements and hormone replacement is a lucrative global industry. In the United States, for example, the industry promoting hormones as a treatment to slow or reverse the aging process generated about $50Β billion of revenue in 2009. The use of such products has not been proven to be effective or safe.

πŸ”— Engines of Creation, by K. Eric Drexler (1986)

πŸ”— Books πŸ”— Transhumanism πŸ”— Alternative Views

Engines of Creation: The Coming Era of Nanotechnology is a 1986 molecular nanotechnology book written by K. Eric Drexler with a foreword by Marvin Minsky. An updated version was released in 2007. The book has been translated into Japanese, French, Spanish, Italian, Russian, and Chinese.

πŸ”— A Fire Upon the Deep

πŸ”— Novels πŸ”— Science Fiction πŸ”— Transhumanism

A Fire Upon the Deep is a 1992 science fiction novel by American writer Vernor Vinge. It is a space opera involving superhuman intelligences, aliens, variable physics, space battles, love, betrayal, genocide, and a communication medium resembling Usenet. A Fire Upon the Deep won the Hugo Award in 1993, sharing it with Doomsday Book by Connie Willis.

Besides the normal print editions, the novel was also included on a CD-ROM sold by ClariNet Communications along with the other nominees for the 1993 Hugo Awards. The CD-ROM edition included numerous annotations by Vinge on his thoughts and intentions about different parts of the book, and was later released as a standalone e-book (no longer available).

πŸ”— Utility Fog

πŸ”— Robotics πŸ”— Transhumanism

Utility fog (coined by Dr. John Storrs Hall in 1989) is a hypothetical collection of tiny robots that can replicate a physical structure. As such, it is a form of self-reconfiguring modular robotics.

πŸ”— Self-Replicating Machine

πŸ”— Science Fiction πŸ”— Robotics πŸ”— Transhumanism

A self-replicating machine is a type of autonomous robot that is capable of reproducing itself autonomously using raw materials found in the environment, thus exhibiting self-replication in a way analogous to that found in nature. The concept of self-replicating machines has been advanced and examined by Homer Jacobson, Edward F. Moore, Freeman Dyson, John von Neumann, and Konrad Zuse, and in more recent times by K. Eric Drexler in his book on nanotechnology, Engines of Creation (which coined the term clanking replicator for such machines), and by Robert Freitas and Ralph Merkle in their review Kinematic Self-Replicating Machines, which provided the first comprehensive analysis of the entire replicator design space. The future development of such technology is an integral part of several plans involving the mining of moons and asteroid belts for ore and other materials, the creation of lunar factories, and even the construction of solar power satellites in space. The von Neumann probe is one theoretical example of such a machine.

Von Neumann also worked on what he called the universal constructor, a self-replicating machine that would be able to evolve, which he formalized in a cellular automata environment. Notably, von Neumann's self-reproducing automata scheme posited that open-ended evolution requires inherited information to be copied and passed to offspring separately from the self-replicating machine itself – an insight that preceded Watson and Crick's discovery of the structure of the DNA molecule, which is likewise separately translated and replicated in the cell.
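Von Neumann's point that a replicator must carry and copy its own description has a well-known software analogue: a quine, a program whose output is exactly its own source code. The sketch below (an illustrative aside, not from the article; the helper name `quine_source` is hypothetical) shows a minimal Python quine, where the string `s` plays the role of the inherited description that is both interpreted and copied.

```python
import contextlib
import io


def quine_source() -> str:
    """Return the source of a minimal Python quine: a two-line program
    whose output is exactly its own source code."""
    # The template string is both the "instructions" and the "data":
    # %r re-inserts the string's own repr, mirroring how a replicator
    # copies its description rather than inspecting its finished body.
    s = 's = %r\nprint(s %% s)'
    return s % s


# Executing the quine reproduces its own source verbatim.
src = quine_source()
out = io.StringIO()
with contextlib.redirect_stdout(out):
    exec(src, {})
assert out.getvalue() == src + "\n"
```

The self-reference is indirect, just as in von Neumann's scheme: the program never reads its own finished text, it regenerates it from the stored description.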

A self-replicating machine is an artificial self-replicating system that relies on conventional large-scale technology and automation. Although von Neumann proposed the concept as early as the late 1940s, no physical self-replicating machine has yet been demonstrated. Certain idiosyncratic terms are occasionally found in the literature. For example, the term clanking replicator was once used by Drexler to distinguish macroscale replicating systems from the microscopic nanorobots or "assemblers" that nanotechnology may make possible, but the term is informal and is rarely used by others in popular or technical discussions. Replicators have also been called "von Neumann machines" after John von Neumann, who first rigorously studied the idea. However, the term "von Neumann machine" is less specific, and also refers to an entirely unrelated computer architecture that von Neumann proposed, so its use is discouraged where accuracy is important. Von Neumann himself used the term universal constructor to describe such self-replicating machines.

Historians of machine tools, even before the numerical control era, sometimes figuratively said that machine tools were a unique class of machines because they have the ability to "reproduce themselves" by copying all of their parts. Implicit in these discussions is that a human would direct the cutting processes (later planning and programming the machines), and would then assemble the parts. The same is true for RepRaps, which are another class of machines sometimes mentioned in reference to such non-autonomous "self-replication". In contrast, machines that are truly autonomously self-replicating (like biological machines) are the main subject discussed here.