Moore's Law and More

Chapter 6 · Hardware, Computing Trends, and Managerial Implications

Vocabulary

eWaste

eWaste is discarded electronic equipment such as laptops, phones, tablets, batteries, monitors, and other digital devices. It matters in this chapter because faster hardware improvement makes devices feel outdated sooner, so firms and consumers replace them more often. That creates both an environmental problem and a managerial responsibility around disposal, recycling, and supply-chain sustainability.

Sources: MIS 301 Ch. 6 slides; U.S. EPA - Electronics Donation and Recycling

Internet of Things (IoT)

The Internet of Things is a network of physical objects that contain sensors, processors, and connectivity so they can collect, send, and sometimes act on data. In business terms, IoT turns ordinary products or spaces into data-generating systems. Moore's Law helped make IoT practical because chips became small, cheap, and energy-efficient enough to embed in everyday products.

Sources: MIS 301 Ch. 6 slides; IBM - What is the Internet of Things?

Konana's Model of the Software Ecosystem

Konana's model describes technology as a stack of dependent layers: hardware at the bottom, then the operating system, then database and middleware layers, and then enterprise and consumer applications on top. The key managerial idea is lock-in. Because each layer depends on the layers beneath it, switching one layer can force costly or inconvenient changes in the layers around it.

Source: MIS 301 Ch. 6 slides: 10 - Moore's Law and Hardware

Memory

Memory is the computer's fast working area where instructions and data are held while the machine is actively using them. In this chapter, memory is best understood as temporary workspace for the CPU rather than as long-term file storage. When you compare machines, more memory usually helps with speed, multitasking, and handling larger programs at once.

Sources: Professor study guide; MIS 301 Ch. 6 slides: 10 - Moore's Law and Hardware

Microprocessor

A microprocessor is a processor built onto a single integrated circuit chip. It performs the calculations and decision-making steps that let a computer run software. This term matters because Moore's Law tracks the increasing transistor density on chips like microprocessors, which is what made computing dramatically faster and cheaper over time.

Sources: MIS 301 Ch. 6 slides; Britannica - Microprocessor

Moore's Law

Moore's Law is the long-running observation that the number of transistors that can be placed on an integrated circuit at a given cost tends to double roughly every two years. It is not a law of nature; it became an industry target that pushed companies to keep improving chip capability while driving down the cost of computing. Managers care because this steady improvement keeps enabling new business models, new software, and new customer expectations.

Sources: MIS 301 Ch. 6 slides; Intel - Moore's Law Press Kit
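The doubling described above is simple arithmetic, and a quick sketch makes the pace concrete. The starting transistor count and time horizon below are hypothetical, chosen only for illustration:

```python
# Sketch of the arithmetic behind Moore's Law: transistor count doubles
# roughly every two years. Starting count and horizon are hypothetical.

def projected_transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# From a hypothetical 1 billion transistors, ten years is five doublings (2^5 = 32x):
print(int(projected_transistors(1_000_000_000, 10)))  # 32000000000
```

The exponential shape is the managerial point: five doublings is not five times better, it is thirty-two times better, which is why planning assumptions built on today's hardware costs go stale quickly.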

Non-volatile

Non-volatile means data remains stored even after power is turned off. This property is what makes storage devices useful for keeping files, applications, and operating systems over time. A hard drive or solid-state drive is valuable precisely because it preserves information after shutdown.

Sources: Professor study guide; Textbook: 6.1 Moore’s Law and How Managers Interpret It

Price Elasticity

Price elasticity describes how strongly demand changes when price changes. In the context of computing, demand is highly elastic because when computing power becomes cheaper, people and firms quickly find more uses for it. That is why falling chip costs did not shrink the market - they expanded it by creating more demand for devices, cloud services, wearables, and AI tools.

Sources: MIS 301 Ch. 6 slides; Textbook: 6.1 Get Out Your Crystal Ball
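The standard formula behind this term is the percentage change in quantity demanded divided by the percentage change in price; a magnitude greater than 1 means demand is elastic. A minimal sketch with made-up numbers:

```python
# Sketch of price elasticity of demand: % change in quantity demanded
# divided by % change in price. All numbers here are hypothetical.

def price_elasticity(q_old, q_new, p_old, p_new):
    pct_change_quantity = (q_new - q_old) / q_old
    pct_change_price = (p_new - p_old) / p_old
    return pct_change_quantity / pct_change_price

# Chip prices fall 20% (10 -> 8) while units demanded rise 60% (100 -> 160):
print(price_elasticity(100, 160, 10, 8))  # about -3.0: highly elastic demand
```

A magnitude of 3 means demand grew three times faster than price fell, so total revenue rose even as unit prices dropped, which is the pattern the chapter describes for computing.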

Quantum Computing

Quantum computing is a computing approach that uses qubits rather than ordinary bits. Because qubits can take advantage of superposition and other quantum effects, a quantum computer can represent and process some kinds of problems very differently from a classical computer. The promise is enormous, but the technology is still hard to commercialize because the machines are fragile, expensive, and difficult to operate reliably.

Sources: MIS 301 Ch. 6 slides; IBM - What is Quantum Computing?

Storage

Storage is the long-term place where data, files, applications, and the operating system are kept when they are not actively being processed. Unlike memory, storage is designed for persistence, not moment-to-moment speed. When buying a computer, storage tells you how much data you can keep, while memory tells you how much work the machine can comfortably handle at one time.

Sources: Professor study guide; MIS 301 Ch. 6 slides: 10 - Moore's Law and Hardware

Volatile

Volatile means the contents disappear when electrical power is removed. RAM is the standard example: it is fast enough for active work, but it cannot be trusted for permanent retention. That is why a power loss can wipe out unsaved work that was sitting only in memory.

Sources: Professor study guide; Textbook: 6.1 The End of Moore’s Law, but Not Really an End to Fast/Cheap Computing

Semiconductor from slides

A semiconductor is a material whose electrical behavior can be controlled so that it sometimes conducts electricity and sometimes blocks it. That controllability is what makes semiconductors useful for building modern chips. They form the physical foundation of CPUs, GPUs, memory chips, and many other electronic components.

Source: MIS 301 Ch. 6 slides: 10 - Moore's Law and Hardware

Transistor from slides

A transistor is a tiny electronic switch that can represent binary states such as 0 and 1. Billions of transistors are packed onto a chip so the machine can perform logic operations and calculations. The reason Moore's Law matters so much is that it tracks how many of these switches can be put on a chip at low cost.

Sources: MIS 301 Ch. 6 slides; Intel - Moore's Law Press Kit

CPU (Central Processing Unit) from slides

The CPU is the main processor that carries out instructions and coordinates the computer's work. In the "data kitchen" analogy from class, it is the chef using software recipes, working with memory as workspace and storage as the pantry. Even though many devices now contain specialized chips, the CPU remains the general-purpose center of computation.

Source: Textbook: 6.2 Buying Time

Multicore CPU from slides

A multicore CPU places two or more processing cores on the same chip so multiple tasks can be handled at the same time. This matters because simply making a single core faster became harder as chips ran into heat and power limits. Multicore design improved performance by adding parallelism rather than relying only on higher clock speed.

Source: MIS 301 Ch. 6 slides: 10 - Moore's Law and Hardware

Parallel Computing from slides

Parallel computing means splitting work across multiple processors, cores, or machines so pieces of a problem can be solved simultaneously. It is used in supercomputing, cloud computing, graphics processing, and AI workloads. The advantage is speed, but the challenge is coordination because the pieces of work still need to fit together correctly.

Source: Textbook: 6.3 The Power of Parallel: Supercomputing, Grids, Clusters, and Putting Smarts in the Cloud
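The split-then-combine idea can be sketched in a few lines. This is a minimal illustration, not a production pattern; the chunking scheme and worker count are arbitrary choices:

```python
# Sketch of parallel computing: split one big job into chunks, process the
# chunks simultaneously in separate worker processes, then combine results.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    return sum(chunk)

def parallel_sum(numbers, workers=4):
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # The coordination step: partial answers must be combined correctly.
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(list(range(1, 101))))  # 5050, the sum of 1..100
```

For a job this small the coordination overhead outweighs any speedup; parallelism pays off on large workloads, which is exactly the trade-off the definition above describes.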

Fab from slides

A fab is a semiconductor fabrication plant where chips are manufactured. Fabs are strategically important because they are extraordinarily expensive to build and require huge amounts of water, electricity, purification systems, and highly controlled cleanroom conditions. That makes chip production not just a technical issue but also a supply-chain and geopolitical issue.

Source: Textbook: 6.2 The End of Moore’s Law? Not the End of Fast/Cheap Computing!

Bandwidth from slides

Bandwidth is the amount of data that can be moved through a connection in a given amount of time. It is like the number of lanes on a highway: more lanes allow more traffic to move at once. High-bandwidth connections matter for activities such as cloud backups, video streaming, and transferring large files.

Source: MIS 301 Ch. 6 slides: 10 - Moore's Law and Hardware

Latency from slides

Latency is the delay before data begins moving from one point to another or before a system responds. If bandwidth is highway width, latency is closer to the travel delay for the first car. Low latency matters most in tasks like gaming, live polling, and interactive apps where responsiveness matters more than moving huge amounts of data.

Source: MIS 301 Ch. 6 slides: 10 - Moore's Law and Hardware
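The highway analogy for bandwidth and latency can be put into numbers. Assuming the simple model total time ≈ latency + size / bandwidth (real links add protocol overhead, so treat this as a sketch):

```python
# Sketch of how bandwidth and latency combine into total transfer time:
# time ≈ latency + (data size / bandwidth). Numbers are hypothetical.

def transfer_time_seconds(size_megabytes, bandwidth_mbps, latency_ms):
    size_megabits = size_megabytes * 8          # 1 byte = 8 bits
    return latency_ms / 1000 + size_megabits / bandwidth_mbps

# A 100 MB cloud backup on a 100 Mbps link with 50 ms latency:
print(round(transfer_time_seconds(100, 100, 50), 2))    # 8.05 s: bandwidth dominates

# A 1 KB game update on the same link:
print(round(transfer_time_seconds(0.001, 100, 50), 4))  # 0.0501 s: latency dominates
```

The contrast shows why the two terms answer different buying questions: big transfers care about lanes (bandwidth), while interactive apps care about the first car's travel delay (latency).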

Key Questions

Practice Quiz

Instructions: Select the best answer for each question. These questions are designed to test application of concepts, not just memorization.

1. A retailer moves its inventory planning from weekly spreadsheets to a real-time forecasting system that constantly recalculates demand. Which chapter concept best explains why this kind of shift becomes more practical over time?

2. A student says, "My laptop has 512 GB of memory, so I can keep a huge number of programs open at once." What is the best correction?

3. Which example best illustrates high bandwidth but potentially poor latency?

4. Why did moving from single-core chips to multicore chips become an important industry strategy?

5. Which statement best connects Moore's Law, price elasticity, and the growth of IoT?