
The history of computer science began long before the modern discipline of computer science, usually appearing in forms like mathematics or physics. Developments in previous centuries alluded to the discipline that we now know as computer science.[1] This progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to the development of a major academic field, massive technological advancement across the Western world, and the basis of massive worldwide trade and culture.[2]

Prehistory

John Napier (1550–1617), the inventor of logarithms

The earliest known tool for use in computation was the abacus, developed in the period between 2700 and 2300 BCE in Sumer.[3] The Sumerians' abacus consisted of a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system.[4]: 11  Originally it was used by drawing lines in sand with pebbles. Abaci of a more modern design are still used as calculation tools today, such as the Chinese abacus.[5]

In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursions.[6]

The Antikythera mechanism is believed to be an early mechanical analog computer.[7] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.[7]

Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world. They were developed by Muslim astronomers, such as the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[8] and the torquetum by Jabir ibn Aflah.[9] According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[10][11] Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers.[12]

Technological artifacts of similar complexity appeared in 14th century Europe, with mechanical astronomical clocks.[13]

When John Napier discovered logarithms for computational purposes in the early 17th century,[14] there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1623, Wilhelm Schickard designed a calculating machine, which he named the Calculating Clock, as a commission for Johannes Kepler, but abandoned the project when the prototype he had started building was destroyed by fire in 1624.[15] Around 1640, Blaise Pascal, a leading French mathematician, constructed a mechanical adding device based on a design described by the Greek mathematician Hero of Alexandria.[16] Then in 1672, Gottfried Wilhelm Leibniz invented the Stepped Reckoner, which he completed in 1694.[17]

In 1837 Charles Babbage first described his Analytical Engine, which is accepted as the first design for a modern computer. The Analytical Engine had expandable memory, an arithmetic unit, and logic processing capabilities that enabled it to interpret a programming language with loops and conditional branching. Although never built, the design has been studied extensively and is understood to be Turing-equivalent. The Analytical Engine would have had a memory capacity of less than 1 kilobyte and a clock speed of less than 10 hertz.[18]

Considerable advancement in mathematics and electronics theory was required before the first modern computers could be designed.

Binary logic


Gottfried Wilhelm Leibniz

Gottfried Wilhelm Leibniz (1646–1716) developed logic in a binary number system and has been called the "founder of computer science".[19]

In 1702, Gottfried Wilhelm Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. Leibniz simplified the binary system and articulated logical properties such as conjunction, disjunction, negation, identity, inclusion, and the empty set.[20] He anticipated Lagrangian interpolation and algorithmic information theory. His calculus ratiocinator anticipated aspects of the universal Turing machine. In 1961, Norbert Wiener suggested that Leibniz should be considered the patron saint of cybernetics.[21] Wiener is quoted as saying, "Indeed, the general idea of a computing machine is nothing but a mechanization of Leibniz's Calculus Ratiocinator."[22] But it took more than a century before George Boole published his Boolean algebra in 1854, with a complete system that allowed computational processes to be mathematically modeled.[23]
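
As a concrete illustration of the two ideas this section traces, Leibniz's binary notation and Boole's algebra of logic, the following sketch (a modern illustration in Python, not anything from Leibniz or Boole) writes a number in binary and evaluates the basic connectives:

    n = 13
    print(bin(n))        # '0b1101': 13 = 1*8 + 1*4 + 0*2 + 1*1

    # Boole's system restricts quantities to 0 and 1 and combines them with
    # conjunction (and), disjunction (or), and negation (not).
    p, q = True, False
    print(p and q, p or q, not p)   # False True False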

By this time, the first mechanical devices driven by a binary pattern had been invented. The Industrial Revolution had driven forward the mechanization of many tasks, including weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems and could store binary information.[23]

Emergence of a discipline

Charles Babbage (1791–1871), one of the pioneers of computing

Charles Babbage and Ada Lovelace


Charles Babbage is often regarded as one of the first pioneers of computing. Beginning in the 1810s, Babbage had a vision of mechanically computing numbers and tables. Putting this into reality, he designed a calculator that could compute numbers up to eight decimal places. Building on the success of this idea, he worked to develop a machine that could compute numbers with up to 20 decimal places. By the 1830s, Babbage had devised a plan for a machine that would use punched cards to perform arithmetical operations. The machine would store numbers in memory units, and operations would proceed under a form of sequential control, one carried out after another, so that the machine could bring a computation through to an answer without failing. This machine was to be known as the "Analytical Engine", the first true forerunner of the modern computer.[24]

Ada Lovelace (1815–1852) predicted the use of computers in symbolic manipulation

Ada Lovelace (Augusta Ada Byron) is credited as the pioneer of computer programming and is regarded as a mathematical genius. Lovelace began working with Charles Babbage as an assistant while Babbage was working on his "Analytical Engine", the first mechanical computer.[25] During her work with Babbage, Lovelace designed what is often described as the first computer algorithm, a program that could compute Bernoulli numbers,[26] although the claim is arguable: Babbage had earlier designed the difference engine and, with it, the corresponding difference-based algorithms, which would make him the first designer of a computer algorithm. Moreover, Lovelace's work with Babbage led her to predict that future computers would not only perform mathematical calculations but would also manipulate symbols, mathematical or not.[27] While she was never able to see the results of her work, as the "Analytical Engine" was not built in her lifetime, her efforts, beginning in the 1840s, did not go unnoticed.[28]

Early post-Analytical Engine designs

Leonardo Torres Quevedo (1852–1936) proposed a consistent manner to store floating-point numbers

Following Babbage, although at first unaware of his earlier work, was Percy Ludgate, a clerk to a corn merchant in Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work that was published in 1909.[29][30]

Two other inventors, Leonardo Torres Quevedo and Vannevar Bush, also conducted follow-on research based on Babbage's work. In his Essays on Automatics (1914), Torres designed an analytical electromechanical machine that was controlled by a read-only program and introduced the idea of floating-point arithmetic.[31][32][33] In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, he presented in Paris the Electromechanical Arithmometer, which consisted of an arithmetic unit connected to a (possibly remote) typewriter, on which commands could be typed and the results printed automatically.[34] Bush's paper Instrumental Analysis (1936) discussed using existing IBM punch card machines to implement Babbage's design. In the same year he started the Rapid Arithmetical Machine project to investigate the problems of constructing an electronic digital computer.[35]
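
Torres's floating-point idea, storing a number as a significand scaled by an exponent of the base, is the same scheme modern machines use. A minimal decimal sketch of the representation (our illustration, not Torres's notation):

    def to_float(significand, exponent, base=10):
        """Represent a number as significand * base**exponent."""
        return significand * base ** exponent

    # 0.00712 stored compactly as the pair (712, -5):
    # three significant digits plus a scale factor.
    print(to_float(712, -5))   # 0.00712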

Charles Sanders Peirce and electrical switching circuits

Charles Sanders Peirce (1839–1914) described how logical operations could be carried out by electrical switching circuits

In an 1886 letter, Charles Sanders Peirce described how logical operations could be carried out by electrical switching circuits.[36] During 1880–81 he showed that NOR gates alone (or alternatively NAND gates alone) can be used to reproduce the functions of all the other logic gates, but this work remained unpublished until 1933.[37] The first published proof was by Henry M. Sheffer in 1913, so the NAND logical operation is sometimes called the Sheffer stroke; the logical NOR is sometimes called Peirce's arrow.[38] Consequently, these gates are sometimes called universal logic gates.[39]
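
Peirce's result can be checked directly: every basic gate is a composition of NAND alone, and a symmetric construction works for NOR. A minimal sketch in Python, ours rather than anything in Peirce's or Sheffer's notation:

    def NAND(a, b):
        return not (a and b)

    def NOT(a): return NAND(a, a)
    def AND(a, b): return NOT(NAND(a, b))
    def OR(a, b): return NAND(NOT(a), NOT(b))

    # Verify against the built-in operators over all inputs.
    for a in (False, True):
        for b in (False, True):
            assert AND(a, b) == (a and b)
            assert OR(a, b) == (a or b)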

Eventually, vacuum tubes replaced relays for logic operations. Lee De Forest's 1907 modification of the Fleming valve could be used as a logic gate. Ludwig Wittgenstein introduced a version of the 16-row truth table as proposition 5.101 of Tractatus Logico-Philosophicus (1921). Walther Bothe, inventor of the coincidence circuit, received part of the 1954 Nobel Prize in Physics for creating the first modern electronic AND gate in 1924. Konrad Zuse designed and built electromechanical logic gates for his computer Z1 (from 1935 to 1938).
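
Wittgenstein's 16 rows correspond to the 16 possible truth functions of two propositions: each of the four input combinations gets one of two outputs, giving 2^4 = 16 functions. A short enumeration (an illustration, not drawn from the Tractatus):

    from itertools import product

    rows = list(product((False, True), repeat=2))   # the four (p, q) inputs
    # Each 4-bit pattern of outputs defines one of the 16 truth functions.
    for n in range(16):
        outputs = [bool((n >> i) & 1) for i in range(4)]
        print(f"f{n:02d}:", dict(zip(rows, outputs)))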

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with switching circuit theory in the 1930s. From 1934 to 1936, Akira Nakashima, Claude Shannon, and Victor Shestakov published a series of papers showing that two-valued Boolean algebra can describe the operation of switching circuits.[40][41][42][43] This concept of using the properties of electrical switches to implement logic is the basic idea that underlies all electronic digital computers. Switching circuit theory provided the mathematical foundations and tools for digital system design in almost all areas of modern technology.[43]

While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. His thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.[44]
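
The core of Shannon's observation is a direct correspondence between circuit topology and Boolean algebra: switches in series conduct only when all are closed (conjunction), while switches in parallel conduct when any is closed (disjunction). A minimal sketch of the correspondence (our illustration, not Shannon's notation):

    # Model a switch as a boolean: True means closed (conducting).
    def series(*switches):     # current flows only if every switch is closed: AND
        return all(switches)

    def parallel(*switches):   # current flows if any switch is closed: OR
        return any(switches)

    # Switch a in series with the parallel pair (b, c): a AND (b OR c).
    def circuit(a, b, c):
        return series(a, parallel(b, c))

    print(circuit(True, False, True))   # True: a closed, c closed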

Alan Turing and the Turing machine

Alan Turing, English computer scientist, mathematician, logician, and cryptanalyst. (circa 1930)

Before the 1920s, computers (sometimes computors) were human clerks who performed computations. They usually worked under the direction of a physicist. Many thousands of computers were employed in commerce, government, and research establishments, and many of these clerks who served as human computers were women.[45][46][47][48] Some performed astronomical calculations for calendars, others ballistic tables for the military.[49]

After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially one that worked in accordance with the effective methods of the Church–Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions that a human clerk with paper and pencil can follow, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or difference in electrical potential.

Digital machinery, in contrast to analog, was able to represent the state of a numeric value and store each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.

After the late 1940s, the phrase computing machine gradually gave way to just computer as electronic digital machinery became common. These computers were able to perform the calculations that had previously been performed by human clerks.

Since the values stored by digital machines were not bound to physical properties the way values in analog devices were, a logical computer based on digital equipment was able to do anything that could be described as "purely mechanical". The theoretical Turing machine, created by Alan Turing, is a hypothetical device theorized in order to study the properties of such hardware.

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.[50]

In 1936 Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing.[51] This became the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis states that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.[51]

In 1936, Alan Turing also published his seminal work on Turing machines, an abstract digital computing machine now simply referred to as the universal Turing machine. This machine embodied the principle of the modern computer and was the birthplace of the stored-program concept that almost all modern-day computers use.[52] These hypothetical machines were designed to determine formally, in mathematical terms, what can be computed, taking into account limitations on computing ability. If a Turing machine can complete the task, the task is considered Turing computable.[53]
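
To make the formalism concrete, the following is a minimal Turing-machine simulator (a didactic sketch, not Turing's own formulation). A machine is a table mapping (state, symbol) to (symbol to write, head move, next state); the example machine flips the bits of its input and halts at the first blank:

    def run(program, tape, state="start", halt="halt"):
        """program maps (state, symbol) -> (write, move, next_state); move is -1 or +1."""
        cells = dict(enumerate(tape))          # sparse tape; "_" is the blank symbol
        head = 0
        while state != halt:
            symbol = cells.get(head, "_")
            write, move, state = program[(state, symbol)]
            cells[head] = write
            head += move
        return "".join(cells[i] for i in sorted(cells))

    flip = {
        ("start", "0"): ("1", +1, "start"),
        ("start", "1"): ("0", +1, "start"),
        ("start", "_"): ("_", +1, "halt"),
    }
    print(run(flip, "10110"))   # -> 01001_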

The Los Alamos physicist Stanley Frankel described John von Neumann's view of the fundamental importance of Turing's 1936 paper in a letter:[52]

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936… Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing...

John V. Atanasoff (1903–1995) created the first electric digital computer, known as the Atanasoff–Berry computer

Kathleen Booth and the first assembly language


Kathleen Booth wrote the first assembly language and designed the assembler and autocode for the Automatic Relay Calculator (ARC) at Birkbeck College, University of London.[54] She helped design three different machines including the ARC, SEC (Simple Electronic Computer), and APE(X)C.
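
An assembler's essential job, then as now, is to translate mnemonic instructions into the numeric codes the machine executes. The sketch below uses an invented three-instruction machine with made-up mnemonics and opcodes (not Booth's actual ARC instruction set, which is described in [54]) purely to illustrate that translation step:

    # Hypothetical opcodes, invented for illustration only.
    OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3}

    def assemble(source):
        """Translate 'MNEMONIC operand' lines into machine words."""
        words = []
        for line in source.strip().splitlines():
            mnemonic, operand = line.split()
            words.append((OPCODES[mnemonic] << 8) | int(operand))
        return words

    program = "LOAD 10\nADD 11\nSTORE 12"
    print([hex(w) for w in assemble(program)])   # ['0x10a', '0x20b', '0x30c']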

Early computer hardware


The world's first electronic digital computer, the Atanasoff–Berry computer, was built on the Iowa State campus from 1939 through 1942 by John V. Atanasoff, a professor of physics and mathematics, and Clifford Berry, an engineering graduate student.

Konrad Zuse, inventor of the modern computer[55][56]

In 1941, Konrad Zuse developed the world's first functional program-controlled computer, the Z3. In 1998, it was shown to be Turing-complete in principle.[57][58] Zuse also developed the S2 computing machine, considered the first process control computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. In 1946, he designed the first high-level programming language, Plankalkül.[59]

In 1948, the Manchester Baby was completed; it was the world's first electronic digital computer that ran programs stored in its memory, like almost all modern computers.[52] Turing's influence on Max Newman, through his seminal 1936 paper on Turing machines and through his logico-mathematical contributions to the project, was crucial to the successful development of the Baby.[52]

In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.[52][60] Turing's design for ACE had much in common with today's RISC architectures, and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer, which was enormous by the standards of his day.[52] Had Turing's ACE been built as planned and in full, it would have been in a different league from the other early computers.[52]

Later in the 1950s, the first operating system, GM-NAA I/O, supporting batch processing to allow jobs to be run with less operator intervention, was developed by General Motors and North American Aviation for the IBM 701.

In 1969, research teams at UCLA and Stanford conducted an experiment to create a network between two computers. Although the system crashed during the initial attempt to connect to the other computer, the experiment was a huge step towards the Internet.

Claude Shannon (1916–2001) created the field of information theory

The first actual computer bug was a moth, stuck between the relays of the Harvard Mark II.[61] The invention of the term 'bug' is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed the 'incident', along with the insect and the notation "First actual case of bug being found" (see software bug for details).[61]

Shannon and information theory


Claude Shannon went on to found the field of information theory with his 1948 paper titled A Mathematical Theory of Communication, which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.[62]
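
Central to the 1948 paper is the source entropy H = -Σ p(x) log2 p(x), the average number of bits needed per symbol to encode messages from a source. A small sketch of the formula (our illustration, not Shannon's code):

    from math import log2

    def entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
        return -sum(p * log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))    # 1.0 bit: a fair coin
    print(entropy([0.9, 0.1]))    # ~0.47 bits: a biased coin is more predictable
    print(entropy([0.25] * 4))    # 2.0 bits: four equally likely symbols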

Norbert Wiener (1894–1964) created the term cybernetics

Wiener and cybernetics


From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman". He published Cybernetics in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, and memory devices with the workings of the brain, drawing on his analysis of brain waves to point out cognitive similarities.[63]

John von Neumann (1903–1957) introduced the computer architecture known as Von Neumann architecture

John von Neumann and the von Neumann architecture


In 1946, a model for computer architecture was introduced that became known as the von Neumann architecture. Since 1950, the von Neumann model has provided uniformity in subsequent computer designs. The von Neumann architecture was considered innovative as it introduced the idea of allowing machine instructions and data to share memory space.[citation needed] The von Neumann model is composed of three major parts: the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In the von Neumann machine design, the IPU passes addresses to memory, and memory, in turn, is routed either back to the IPU if an instruction is being fetched or to the ALU if data is being fetched.[64]

Von Neumann's machine design uses a RISC (reduced instruction set computing) architecture,[dubious – discuss] meaning the instruction set uses a total of 21 instructions to perform all tasks. (This is in contrast to CISC, complex instruction set computing, whose instruction sets have more instructions from which to choose.) In the von Neumann architecture, main memory and the accumulator (the register that holds the result of logical operations)[65] are the two memories that are addressed. Operations can be carried out as simple arithmetic (performed by the ALU: addition, subtraction, multiplication, and division), as conditional branches (more commonly seen now as if statements or while loops; the branches serve as go to statements), and as logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa. The von Neumann architecture accepts fractions and instructions as data types. Finally, as the von Neumann architecture is a simple one, its register management is also simple. The architecture uses a set of seven registers to manipulate and interpret fetched data and instructions. These registers include the "IR" (instruction register), "IBR" (instruction buffer register), "MQ" (multiplier quotient register), "MAR" (memory address register), and "MDR" (memory data register).[64] The architecture also uses a program counter ("PC") to keep track of where in the program the machine is.[64]
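
The routing described above is easiest to see as a fetch-decode-execute loop over a single shared memory. The sketch below is a deliberately tiny accumulator machine in the von Neumann style, with an invented three-instruction set rather than any historical one: the program counter selects the next instruction, and data flows through the accumulator:

    # Minimal von Neumann-style machine: code and data share one memory.
    LOAD, ADD, HALT = 0, 1, 2   # invented opcodes, for illustration only

    def run(memory):
        acc, pc = 0, 0                      # accumulator and program counter
        while True:
            op, addr = memory[pc]           # fetch the instruction at PC
            pc += 1
            if op == LOAD:
                acc = memory[addr]          # a data fetch routed to the ALU side
            elif op == ADD:
                acc += memory[addr]
            elif op == HALT:
                return acc

    # Instructions at addresses 0-2 and data at addresses 3-4, in one memory.
    memory = [(LOAD, 3), (ADD, 4), (HALT, 0), 7, 35]
    print(run(memory))   # -> 42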

John McCarthy (1927–2011) is considered one of the founding fathers of artificial intelligence

John McCarthy, Marvin Minsky and artificial intelligence


The term artificial intelligence was coined by John McCarthy to describe the research that he and his colleagues proposed for the Dartmouth Summer Research Project; the naming of artificial intelligence also marked the birth of a new field in computer science.[66] On August 31, 1955, a research project was proposed by John McCarthy, Marvin L. Minsky, Nathaniel Rochester, and Claude E. Shannon. The official project began in 1956 and consisted of several significant parts that they felt would help them better understand artificial intelligence's makeup.

McCarthy and his colleagues' idea behind automatic computers was that if a machine is capable of completing a task, then the same should be achievable with a computer by compiling a program to produce the desired results. They also discovered that the human brain was too complex to replicate, not by the machine itself but by the program; the knowledge needed to produce a program that sophisticated did not yet exist.

The concept behind this was to look at how humans understand our own language and the structure of how we form sentences, with different meanings and rule sets, and to compare that to a machine process. What a computer can understand is at the hardware level: machine language, written in binary (1s and 0s), in a specific format that gives the computer the rule set for running a particular piece of hardware.[67]

Minsky's work examined how artificial neural networks could be arranged to have qualities similar to the human brain. However, he could only produce partial results, and the research into this idea needed to be carried further.

McCarthy and Shannon's idea behind this theory was to develop a way to use complex problems to determine and measure the machine's efficiency through mathematical theory and computations.[68] However, they received only partial test results.

The idea behind self-improvement is that a machine would use self-modifying code to make itself smarter. This would allow a machine to grow in intelligence and increase calculation speeds.[69] The group believed they could study this if a machine could improve upon the process of completing a task in the abstractions part of their research.

The group thought that research in this category could be broken down into smaller groups, dealing with sensory and other forms of information about artificial intelligence. Abstractions in computer science can refer to mathematics and programming languages.[70]

Their idea of computational creativity concerned how a program or a machine could be seen as having ways of thinking similar to a human's.[71] They wanted to see whether a machine could take a piece of incomplete information and improve upon it, filling in the missing details as the human mind can do. If a machine could do this, they needed to think about how the machine determined the outcome.


References

  1. ^ Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Chapman Hall.
  2. ^ "History of Computer Science". uwaterloo.ca.
  3. ^ Boyer, Carl B.; Merzbach, Uta C. (1991). A History of Mathematics (2nd ed.). John Wiley & Sons, Inc. pp. 252–253. ISBN 978-0-471-54397-8.
  4. ^ Ifrah, Georges (2001). The Universal History of Computing: From the Abacus to the Quantum Computer. John Wiley & Sons. ISBN 978-0-471-39671-0.
  5. ^ Bellos, Alex (2025-08-07). "Abacus adds up to number joy in Japan". The Guardian. London. Retrieved 2025-08-07.
  6. ^ Sinha, A. C. (1978). "On the status of recursive rules in transformational grammar". Lingua. 44 (2–3): 169–218. doi:10.1016/0024-3841(78)90076-1.
  7. ^ a b "Project Overview". The Antikythera Mechanism Research Project. Archived from the original on 2025-08-07. Retrieved 2025-08-07.
  8. ^ "Islam, Knowledge, and Science". Islamic Web. Retrieved 2025-08-07.
  9. ^ Lorch, R. P. (1976), "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum", Centaurus, 20 (1): 11–34, Bibcode:1976Cent...20...11L, doi:10.1111/j.1600-0498.1976.tb00214.x
  10. ^ Simon Singh. The Code Book. pp. 14–20.
  11. ^ "Al-Kindi, Cryptography, Codebreaking and Ciphers". 9 June 2003. Retrieved 2025-08-07.
  12. ^ Koetsier, Teun (2001), "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory, 36 (5): 589–603, doi:10.1016/S0094-114X(01)00005-2.
  13. ^ Marchant, Jo (November 2006). "In search of lost time". Nature. 444 (7119): 534–538. Bibcode:2006Natur.444..534M. doi:10.1038/444534a. PMID 17136067.
  14. ^ "John Napier and the Invention of Logarithms, 1614 . E. W. Hobson". Isis. 3 (2): 285–286. October 1920. doi:10.1086/357925.
  15. ^ "1.6 Shickard's Calculating Clock | Bit by Bit". Retrieved 2025-08-07.
  16. ^ "History of Computing Science: The First Mechanical Calculator". eingang.org.
  17. ^ Kidwell, Peggy Aldritch; Williams, Michael R. (1992). The Calculating Machines: Their history and development. MIT Press., p.38-42, translated and edited from Martin, Ernst (1925). Die Rechenmaschinen und ihre Entwicklungsgeschichte. Germany: Pappenheim.
  18. ^ "CS History". everythingcomputerscience.com. Retrieved 2025-08-07.
  19. ^ "2021: 375th birthday of Leibniz, father of computer science". people.idsia.ch.
  20. ^ Lande, Daniel R. (December 2014). "Development of the Binary Number System and the Foundations of Computer Science". The Mathematics Enthusiast. 11 (3): 513–540. doi:10.54870/1551-3440.1315. ProQuest 1646358626.
  21. ^ Wiener, Norbert (1961). Cybernetics Or Control and Communication in the Animal and the Machine. MIT Press. p. 12. ISBN 978-0-262-73009-9.
  22. ^ Wiener, Norbert (1948). "Time, Communication, and the Nervous System". Annals of the New York Academy of Sciences. 50 (4 Teleological): 197–220. Bibcode:1948NYASA..50..197W. doi:10.1111/j.1749-6632.1948.tb39853.x. PMID 18886381.
  23. ^ a b Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. CRC Press.
  24. ^ "Charles Babbage". Encyclop?dia Britannica Online Academic Edition. Encyclop?dia Britannica In. 3 July 2023. Retrieved 2025-08-07.
  25. ^ Evans 2018, p. 16.
  26. ^ Evans 2018, p. 21.
  27. ^ Evans 2018, p. 20.
  28. ^ Isaacson, Betsy (2025-08-07). "Ada Lovelace, World's First Computer Programmer, Celebrated With Google Doodle". The Huffington Post. Retrieved 2025-08-07.
  29. ^ "The John Gabriel Byrne Computer Science Collection" (PDF). Archived from the original on 2025-08-07. Retrieved 2025-08-07.
  30. ^ "1907: was the first portable computer design Irish?". Ingenious Ireland. 17 October 2012.
  31. ^ L. Torres Quevedo (1914). "Ensayos sobre Automática – Su definicion. Extension teórica de sus aplicaciones". Revista de la Academia de Ciencias Exacta, Revista 12: 391–418.
  32. ^ Torres Quevedo, Leonardo (19 November 1914). "Automática: Complemento de la Teoría de las Máquinas" (PDF). Revista de Obras Públicas. LXII (2043): 575–583.
  33. ^ Kneusel, Ronald T (2025). Numbers and Computers. Texts in Computer Science. pp. 84–85. doi:10.1007/978-3-031-67482-2. ISBN 978-3-031-67481-5.
  34. ^ Torres y Quevedo, Leonardo (1982). "Electromechanical Calculating Machine". The Origins of Digital Computers. pp. 109–120. doi:10.1007/978-3-642-61812-3_7. ISBN 978-3-642-61814-7.
  35. ^ Randell, Brian. "From Analytical Engine to Electronic Digital Computer: The Contributions of Ludgate, Torres, and Bush" (PDF). Archived from the original (PDF) on 21 September 2013. Retrieved 9 September 2013.
  36. ^ Peirce, C. S., "Letter, Peirce to A. Marquand", dated 1886, Writings of Charles S. Peirce, v. 5, 1993, pp. 421–23. See Burks, Arthur W., "Review: Charles S. Peirce, The new elements of mathematics", Bulletin of the American Mathematical Society v. 84, n. 5 (1978), pp. 913–18, see 917. PDF.
  37. ^ Peirce, C. S. (manuscript winter of 1880–81), "A Boolian Algebra with One Constant", published 1933 in Collected Papers v. 4, paragraphs 12–20. Reprinted 1989 in Writings of Charles S. Peirce v. 4, pp. 218–21, Google [1]. See Roberts, Don D. (2009), The Existential Graphs of Charles S. Peirce, p. 131.
  38. ^ Hans Kleine Büning; Theodor Lettmann (1999). Propositional logic: deduction and algorithms. Cambridge University Press. p. 2. ISBN 978-0-521-63017-7.
  39. ^ John Bird (2007). Engineering mathematics. Newnes. p. 532. ISBN 978-0-7506-8555-9.
  40. ^ Yamada, Akihiko (2004). "History of Research on Switching Theory in Japan". IEEJ Transactions on Fundamentals and Materials. 124 (8): 720–726. Bibcode:2004IJTFM.124..720Y. doi:10.1541/ieejfms.124.720.
  41. ^ "Switching Theory/Relay Circuit Network Theory/Theory of Logical Mathematics". IPSJ Computer Museum. Information Processing Society of Japan.
  42. ^ Stanković, Radomir S.; Astola, Jaakko T.; Karpovsky, Mark G. Some historical remarks about the development and applications of switching (PDF) (Report).
  43. ^ a b Stanković, Radomir S.; Astola, Jaakko Tapio, eds. (2008). Reprints from the Early Days of Information Sciences: TICSP Series On the Contributions of Akira Nakashima to Switching Theory (PDF). Tampere International Center for Signal Processing (TICSP) Series. Vol. 40. Tampere University of Technology, Tampere, Finland. ISBN 978-952-15-1980-2. Archived from the original (PDF) on 2025-08-07.
  44. ^ Shannon, Claude (2021). "A Symbolic Analysis of Relay and Switching Circuits (1938)". Ideas That Created the Future. pp. 71–78. doi:10.7551/mitpress/12274.003.0010. ISBN 978-0-262-36317-4.
  45. ^ Light, Jennifer S. (1999). "When Computers Were Women". Technology and Culture. 40 (3): 455–483. doi:10.1353/tech.1999.0128. Project MUSE 33396.
  46. ^ Kiesler, Sara; Sproull, Lee; Eccles, Jacquelynne S. (December 1985). "Pool Halls, Chips, and War Games: Women in the Culture of Computing". Psychology of Women Quarterly. 9 (4): 451–462. doi:10.1111/j.1471-6402.1985.tb00895.x.
  47. ^ Fritz, W.B. (Fall 1996). "The women of ENIAC". IEEE Annals of the History of Computing. 18 (3): 13–28. doi:10.1109/85.511940.
  48. ^ Gürer, Denise (June 2002). "Pioneering women in computer science". ACM SIGCSE Bulletin. 34 (2): 175–180. doi:10.1145/543812.543853.
  49. ^ Grier 2013, p. 138.
  50. ^ "G?del and the limits of logic". plus.maths.org. 2025-08-07. Retrieved 2025-08-07.
  51. ^ a b Copeland, B. Jack (2019). "The Church-Turing Thesis". In Zalta, Edward N. (ed.). Stanford Encyclopedia of Philosophy (Spring 2019 ed.). Metaphysics Research Lab, Stanford University. Retrieved 2025-08-07.
  52. ^ a b c d e f g "Turing's Automatic Computing Engine". The Modern History of Computing. Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. 2017.
  53. ^ Barker-Plummer, David (2025-08-07). "Turing Machines". Stanford Encyclopedia of Philosophy. Retrieved 2025-08-07.
  54. ^ Booth, Kathleen HV, "Machine language for Automatic Relay Computer", Birkbeck College Computation Laboratory, University of London
  55. ^ Bellis, Mary (15 May 2019) [First published 2006 at inventors.about.com/library/weekly/aa050298.htm]. "Biography of Konrad Zuse, Inventor and Programmer of Early Computers". thoughtco.com. Dotdash Meredith. Archived from the original on 13 December 2020. Retrieved 3 February 2021. Konrad Zuse earned the semiofficial title of 'inventor of the modern computer'[who?]
  56. ^ "Who is the Father of the Computer?". ComputerHope.
  57. ^ Rojas, R. (1998). "How to make Zuse's Z3 a universal computer". IEEE Annals of the History of Computing. 20 (3): 51–54. doi:10.1109/85.707574.
  58. ^ Rojas, Raúl. "How to Make Zuse's Z3 a Universal Computer". Archived from the original on 2025-08-07.
  59. ^ Talk given by Horst Zuse to the Computer Conservation Society at the Science Museum (London) on 18 November 2010
  60. ^ "BBC News – How Alan Turing's Pilot ACE changed computing". BBC News. May 15, 2010.
  61. ^ a b "The First "Computer Bug"". CHIPS. 30 (1). United States Navy: 18. January–March 2012. Retrieved 2025-08-07.
  62. ^ Shannon, Claude Elwood (1964). The mathematical theory of communication. Warren Weaver. Urbana: University of Illinois Press. ISBN 0-252-72548-4. OCLC 2654027.
  63. ^ Xiong, Aiping; Proctor, Robert W. (2018). "Information Processing: The Language and Analytical Tools for Cognitive Psychology in the Information Age". Frontiers in Psychology. 9: 1270. doi:10.3389/fpsyg.2018.01270. PMC 6092626. PMID 30135664.
  64. ^ a b c Cragon, Harvey G. (2000). Computer Architecture and Implementation. Cambridge: Cambridge University Press. pp. 1–13. ISBN 978-0-521-65168-4.
  65. ^ "Accumlator" Def. 3. Oxford Dictionaries. Archived from the original on May 18, 2013.
  66. ^ Moor, James (15 December 2006). "The Dartmouth College Artificial Intelligence Conference: The Next Fifty Years". AI Magazine. 27 (4): 87–91. ProQuest 208119658.
  67. ^ Prudhomme, Gerard (December 2018). Introduction to Assembly Language Programming. Arcler Education Incorporated. ISBN 978-1-77361-470-0. OCLC 1089398724.
  68. ^ McCarthy, John; Lifschitz, Vladimir (1991). Artificial intelligence and mathematical theory of computation : papers in honor of John McCarthy. Academic Press. ISBN 0-12-450010-2. OCLC 911282256.
  69. ^ Haenlein, Michael; Kaplan, Andreas (August 2019). "A Brief History of Artificial Intelligence: On the Past, Present, and Future of Artificial Intelligence". California Management Review. 61 (4): 5–14. doi:10.1177/0008125619864925.
  70. ^ Baeten, Jos C. M.; Ball, Tom; Boer, Frank S., eds. (2012). Theoretical Computer Science. Lecture Notes in Computer Science. Vol. 7604. doi:10.1007/978-3-642-33475-7. ISBN 978-3-642-33474-0.[page needed]
  71. ^ "The Creativity Post | What is Computational Creativity?". The Creativity Post. Retrieved 2025-08-07.
