1.1 Historical Overview of Computer Science and the Internet
Although computer science is a relatively new scientific field, it stems from more than two centuries of fundamental work. First and foremost, mathematicians paved the way for the rise of computing technology. The very first computer scientist may have been German mathematician and philosopher Gottfried Leibniz.
Leibniz was one of the very rare scientists who rose to fame both as a philosopher and a mathematician. Around 1672 Leibniz invented the first mechanical digital calculator, which he called the Stepped Reckoner (Keates, 2012). The term referred to the calculator’s operating mechanism, the Staffelwalze (stepped drum). The Stepped Reckoner was the very first calculator that could perform all four arithmetic operations (Beeson, 2004). Leibniz’s invention influenced the development of calculators for at least two centuries.
The British mathematician, philosopher, and inventor Charles Babbage invented the first mechanical computer back in the 1820s, with an upgraded design following in 1834. Babbage’s invention was named the Difference Engine (with the later design called the Analytical Engine; Collier, 1970). The original machine’s name referred to the method of divided differences. This method (originated by Newton) enabled the interpolation of functions through the use of polynomial coefficients. While Babbage’s Difference Engine depended on a human being acting as the algorithm, the Analytical Engine was a general-purpose, program-controlled digital computer designed to work without human intervention (Hyman, 1982). It was designed to store 1,000 numbers of 50 digits each – more than any computer before the 1960s could store (Collier, 1970). Thus, the Analytical Engine is widely considered the world’s first programmable computer. It’s worth noting that Charles Babbage was a true pioneer of computer science, since no contemporary envisioned anything remotely comparable to his vision of a programmable machine.
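As a brief aside on the method behind the machine’s name, the small Python sketch below (with illustrative function names and sample points of our own choosing, not taken from the cited sources) shows how Newton’s divided differences yield the coefficients of an interpolating polynomial.

# Minimal sketch of Newton's divided differences (illustrative example).
# Given sample points (x_i, y_i), build the divided-difference coefficients
# and evaluate the interpolating polynomial in Newton form.

def divided_differences(xs, ys):
    """Return the coefficients of Newton's interpolating polynomial."""
    coeffs = list(ys)
    n = len(xs)
    for level in range(1, n):
        # Update from the bottom up so lower-order entries stay intact.
        for i in range(n - 1, level - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - level])
    return coeffs

def newton_eval(xs, coeffs, x):
    """Evaluate the Newton-form polynomial at x using Horner-like nesting."""
    result = coeffs[-1]
    for i in range(len(coeffs) - 2, -1, -1):
        result = result * (x - xs[i]) + coeffs[i]
    return result

# Example: interpolate f(x) = x^2 from three tabulated points, evaluate at 2.5.
xs = [1.0, 2.0, 3.0]
ys = [1.0, 4.0, 9.0]
print(newton_eval(xs, divided_differences(xs, ys), 2.5))  # prints 6.25

The Difference Engine mechanized tabular calculations of this kind, evaluating polynomials by repeated addition of differences.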
Ada Lovelace (1815–1852)
However, the inception of formal informatics, in particular programming, has been credited to Augusta Ada King, Countess of Lovelace and daughter of poet Lord Byron and mathematician Lady Byron. Better known as Ada Lovelace, the young mathematician and writer would later be regarded as the world’s first programmer (Collier, 1970). A frequent collaborator of Charles Babbage, Lovelace took a more general view of his engines, considering them to be much more than mere automated calculators: she also envisioned the processing of music scores, letters, and pictures (Fuegi & Francis, 2003). In other words, Lovelace discovered the computer’s potential as a universal tool.
Konrad Zuse (1910–1995)
The German engineer and inventor Konrad Zuse has been credited with creating the first universally programmable computer system, the “Z3”, back in 1941 (Rojas, 1997). Zuse was one of the founding fathers of computer science and left behind an impressive and versatile body of work. In one of his books, Zuse even speculated on the possibility that the universe is a giant computer program, a controversial theory that gave rise to the movement of digital physics. Remarkably, Zuse’s Z3 was confirmed to be Turing-complete in 1998, three years after his death (Rojas, 1998).
Alan Turing (1912–1954)
The British mathematician and logician Alan Turing influenced and shaped the rise of computer science itself by formalizing the emerging science of programming. He is the creator of the Turing machine and of the concept of Turing completeness. By creating the Turing machine, the inventor paved the way for universal computing. Turing is also considered one of the early founding fathers of artificial intelligence (AI) and invented the Turing test (the inventor himself preferred “imitation game”) (Turing, 1950). The Turing test is still regarded as a crucial threshold an AI has to overcome to demonstrate a human level of intelligence (Russell & Norvig, 2016). During the Turing test, the testers must decide whether their conversation partner is a human or a machine. Despite being regarded as a national hero – Turing was the leading scientist of the team that broke the codes of Nazi Germany’s famous ENIGMA cipher machine – he was later sentenced to chemical castration due to his homosexuality. In 1954, Turing committed suicide.
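To give a concrete, if highly simplified, impression of what a Turing machine is, the following Python sketch (our own toy example, not taken from Turing’s paper) runs a three-rule machine that increments a binary number written on its tape.

# A minimal Turing machine sketch (illustrative toy machine).
# The transition table maps (state, symbol) -> (symbol to write, head move, new state).

def run_turing_machine(tape, rules, state, head, blank="_"):
    """Run until the machine reaches the 'halt' state, then return the tape."""
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    while state != "halt":
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# Binary increment: scan from the rightmost digit, turning trailing 1s into 0s
# until a 0 (or a blank cell) absorbs the carry.
rules = {
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_turing_machine("1011", rules, state="carry", head=3))  # prints 1100

Nothing more than such a table of state transitions is needed, which is precisely what makes the model so useful as a universal definition of computation.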
John von Neumann (1903–1957)
Another founding father of modern-day computer science and artificial intelligence was the Hungarian-American mathematician John von Neumann. Von Neumann is most notably credited as the inventor of the von Neumann architecture, a computer design in which both data and program are binary coded and held in the same memory (Goldstine, 1980). He is probably most famous for his conception of an AI-driven technological revolution, which he discussed along with the Polish-American scientist Stanislaw Ulam (1909–1984); von Neumann and Ulam are also the founding fathers of the hypothetical concept of the technological singularity, which is a major theory of transhumanism (Goldstine, 1980).
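The core idea of the von Neumann architecture (instructions and data sharing one memory, processed by a fetch-decode-execute cycle) can be illustrated with a deliberately tiny toy machine; the instruction set in the Python sketch below is hypothetical and exists only for this illustration.

# A toy sketch of the von Neumann idea (hypothetical instruction set):
# program and data live in the SAME memory, read by one fetch-decode-execute loop.

def run(memory):
    pc = 0   # program counter
    acc = 0  # accumulator register
    while True:
        op, arg = memory[pc], memory[pc + 1]   # fetch instruction and operand
        pc += 2
        if op == "LOAD":    acc = memory[arg]          # read a data cell
        elif op == "ADD":   acc += memory[arg]
        elif op == "STORE": memory[arg] = acc          # write a data cell
        elif op == "HALT":  return memory

# Cells 0-7 hold the program, cells 8-10 hold data, all in one memory.
memory = ["LOAD", 8, "ADD", 9, "STORE", 10, "HALT", 0, 2, 3, 0]
print(run(memory)[10])  # prints 5

Storing the program in the same memory as the data is what allows software to be loaded, modified, and even treated as data itself.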
Irving John Good (1916–2009)
The British mathematician Irving John Good, a fellow cryptologist who worked with Alan Turing, envisioned a possible intelligence explosion that could lead to an automated but also uncontrollable trajectory of progress. Good’s controversial ideas are still debated today, particularly the idea of an ultra-intelligent machine that creates ever better machines which exceed the intelligence of humans (Good, 1965).
The evolution of computer science also traces a historical progression that unfolded as the discipline matured. Even the step from Babbage’s Difference Engine to his Analytical Engine was something of a quantum leap. The former was a highly developed calculation apparatus, while the latter more closely resembled a modern computer due to its automated functions and independence from human intervention. Arguably, the design of the Analytical Engine back in 1834 marks the very start of computer science, since this was the moment when the subject of the science came into being.
From there, the art of programming began to evolve, seeking sound solutions through optimization processes. Most progress was incremental and took years, if not decades, but in the Turing era the development appeared to accelerate tangibly. The phase of rapid progress that began in the 1960s motivated all kinds of hyperbolic claims of technological potential, even leading to the belief that human-like artificial intelligence might be a matter of years away. However, two so-called “AI winters” seemed to deny AI’s announced potential and sowed doubts as to whether the promised powers were feasible at all.
On the other hand, industrial applications revolutionized the business world from the late 1980s, expanding and solidifying the general trust in electronic data processing and increasing its influence as a major driving force of digitization. In particular, ERP systems gained major relevance. Enterprise Resource Planning (ERP) is a method that utilizes software to manage major business processes. Typically, ERP is regarded as a type of business management software, combined with an organizational unit that collects and interprets data from various business activities.
The Advent of the Internet
The internet (a composite of “interconnected networks”) is a global digital platform that interconnects a vast number of computers and provides access to World Wide Web resources through the use of a web browser. In principle, the internet was installed back in the 1960s as a means of discreet information exchange, mostly for government institutions. After being a) separated from its old context of intelligence affairs and b) substantially expanded, the internet embarked on the unparalleled rise to relevance and influence that we regard as entirely self-evident nowadays.
Originally created on the platform of a packet-switched network, its first iteration was the ARPANET (Advanced Research Projects Agency Network). Back in 1969, ARPANET interconnected the first computers, and one year later the Network Control Program (NCP) was established (Bidgoli, 2004). The usage of these first network systems was almost exclusively military, with a focus on intelligence and espionage applications. Installed as a joint venture of ARPA (the Advanced Research Projects Agency) and the Defense Communications Agency, funded by the former and operated by the latter, the internet’s origins were the opposite of civilian (Bidgoli, 2004).
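Packet switching, the transmission principle underlying ARPANET, can be pictured with a small toy model in Python (our own simplification, not an actual protocol): a message is cut into numbered packets that may arrive in any order and are reassembled at the destination.

# Toy illustration of packet switching (simplified model, not a real protocol).
import random

def to_packets(message, size=4):
    """Split a message into (sequence_number, payload) packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Restore the original message regardless of arrival order."""
    return "".join(payload for _, payload in sorted(packets))

packets = to_packets("packets can arrive out of order")
random.shuffle(packets)     # simulate independent routes and arrival order
print(reassemble(packets))  # prints the original message

Real networks add addressing, routing, and error handling on top of this idea, but reassembly by sequence number is the essence of packet switching.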
From 1985, the advent of the National Science Foundation Network (NSFNET) served as another precursor of the modern-day internet. Commercial websites, networks, and enterprises were integrated into the web structure from the early 1990s, laying the foundations of today’s well-known World Wide Web (WWW). During the course of this transformative process, a great many innovative and often also disruptive technologies have been implemented. From email correspondence to internet-based telephony (as pioneered by Skype), from services that enable music streaming (as pioneered by Spotify) and video streaming (YouTube) to blogs, vlogs, and news forums, the internet has accumulated and bundled all information-related enterprises and provides democratic access to the world’s vast pool of information.
In a world of daily growing mountains of data, the means to find the relevant piece of information one is looking for becomes more and more crucial. The advent of the search engine Yahoo in 1994 demonstrates impressively the dynamics and unforeseeable reactions of the free market. While Yahoo entered the market as the frontrunner (fueled by the so-called first-mover advantage), it quickly became clear that being first isn’t enough.