A Byte-Sized History of Technology at Stanford GSB
Nine decades of upgrades, from punch cards to PCs and vacuum tubes to GPUs
For its first few decades, Stanford Graduate School of Business was anything but high tech. That changed with the dawn of the Digital Age, as faculty learned to integrate computers into their teaching and research and increasingly tech-savvy students raced to keep up with ever-bigger technological leaps. Read on for a look back at how Stanford GSB has adopted, adapted, and upgraded over the years.
1933
The Stanford GSB library installs a projector to show slides and “opaque objects such as charts, forms, and book pages.”

1952
The Burroughs Adding Machine Company demos 26 of its models in the Stanford GSB lounge.
1957
IBM gives the school a three-year, $25,000 grant to hire a professor of data processing.
1958
Stanford GSB gets its first computer, an IBM Type 610. The size of a desk, it has enough memory to store 84 numbers. Students taking Electronic Data Processing use it to analyze the results of an alumni survey.

1960

All Stanford GSB students, the Alumni Bulletin states, “should learn that computers exist.” Case problems involving computers are introduced in classes.
1962
The Western Data Processing Center donates an IBM 1401 to the school.

1967
The school installs an IBM 360/67, which can do 1 million computations per second and can be linked to “remote typewriters” in labs, offices, and “even homes.”
1968
Management and the Computer becomes a requirement for first-year MBAs. “We do not expect students to become super-programmers,” says Professor Norman R. Nielsen. “Rather, we expect them to understand the computer as a tool, a lightning-fast calculator with an instant memory which can help them arrive at business decisions.”
1969
The Stanford GSB library rolls out “automated computer printouts” of its periodical holdings. A two-week pre-enrollment class is introduced to prepare students for computer programming.

Before magnetic tape or floppy disks, there were punch cards; each card could hold up to 80 characters. 1971 | Claudio Divizia
1971
Stanford GSB obtains an HP Model 2000C for student use. The “compact” machine measures 6 by 11 by 2.5 feet and is linked to 32 typewriter terminals. “The new system has changed the beginning computer experience from what was for some an emotionally negative experience to one that is at least neutral or positive,” says Professor J. R. Miller. The Continuing Education program advertises a course on management models with the headline: “Don’t be replaced by your computer.”
1972
The new core curriculum includes a four-week computer course for incoming MBAs.
1978
Stanford GSB buys a $750,000 DEC System 20 computer as well as five Apple IIs to run VisiCalc, the first spreadsheet program.


1981
The computer facility is expanded to include a new DEC-20 and more terminals for students — bringing the total number of terminals on campus to 150.
1982
Stanford GSB has 12 personal computers. Seventy percent of incoming students report having some experience with computers.
1983
Users log 1.5 million hours on the school’s two mainframes, nicknamed How and Why. The library installs its first computers for student use. A faculty task force recommends building a network of “microcomputers” from multiple vendors.

1984
Former Stanford GSB director of computing Sandy Lerner co-founds Cisco Systems. The school begins to shift from mainframes to PCs. “Our students, many of whom are using micros in high school and college, are going to demand that they be used as teaching tools here,” says Professor Myron Scholes.
1987
The library offers Compact Disclosure, a CD-ROM database of financial information from SEC filings.
1989
The school has about 440 computers, including 130 Apple Macs and 140 IBM PCs. Its annual computing budget is $2 million.


1991
A refurbished student computer facility is dedicated in the basement. The school now operates more than 500 “microcomputers.”
1992
The Rosenberg Corporate Research Center opens in the library.
1994
Stanford GSB launches its website and an email list for alums.
1996
Course materials are put online and videoconferencing equipment is added to two classrooms.
1999
The GSB library is connected to the internet. Professors Garth Saloner and Haim Mendelson found the Center for Electronic Business and Commerce. Students are advised to buy computers with 4 GB of hard drive storage.


2007
Stanford GSB manages over 1,000 computers and has over 14 TB of network storage.
2012
The IT team launches Cloud Forest, a cloud-based supplement to its three onsite Yen servers.
2013
The school’s first massive open online course (MOOC) is taught by Professor Joshua Rauh.
2014
Stanford GSB introduces its online LEAD certificate program.
2018
The school introduces new courses on artificial intelligence and machine learning.
2019
The Data, Analytics, and Research Computing (DARC) team is established to assist faculty with data analysis and machine learning and to run dedicated research servers. A new AI-focused Executive Education course is introduced. “AI is quickly becoming a critical tool in every industry,” says program director and economics professor Paul Oyer.


2020
As the COVID-19 pandemic spreads and campus shuts down, Stanford GSB switches to virtual instruction in 60 hours. The GSB Research Hub launches Redivis, a platform that will process 13,000 TB of data by 2025. Dean Jonathan Levin says that data analysis is “the new big thing” and that “we have tried to build an engine that will make Stanford the best place to do this kind of data-oriented research.”
2022
DARC has fielded 110,000 research jobs in its first three years. Its computing cluster has 282 TB of storage, and its cloud platform stores 119 billion records.
2024
GSB-GPT, a dedicated generative AI bot designed for secure faculty and staff use, launches. The Research Hub runs 12 Nvidia A40 graphics processing units (GPUs) on its server cluster, enabling it to run 70-billion-parameter large language models.
2025
Stanford GSB researchers get access to Marlowe, Stanford’s new “superpod” of 248 Nvidia H100 GPUs. A survey of faculty finds more than 80% are using AI in some form. Twenty-two courses mention AI or machine learning in their descriptions.

A Little Help From LLMs
How one instructor has folded AI into his teaching
Over the past two years, artificial intelligence has come crashing into the classroom at Stanford GSB.
“My life was made a lot more interesting by having to deal with this,” jokes Dan Iancu, an associate professor of operations, information, and technology, describing the pedagogical challenges created by AI tools, specifically large language models (LLMs).
In highly technical courses like Advanced Optimization and Simulation Modeling, Iancu aims to equip MBA students with the know-how to determine when to turn to an LLM for an immediate answer as well as the savvy to apply its power thoughtfully and effectively.
“What’s important is to develop a process where students still learn the fundamental concepts, which is a way to think about problems broadly,” Iancu explains. “Whatever the problem is, how do you quantify the important things and estimate these from data to answer the question or solve the problem?”
Iancu treats LLMs as part of a workflow, embedding them in course elements such as coding and modeling. The programming language Python, for example, “was a bottleneck for some of our students, and it was something one had to understand in order to be able to move to more interesting concepts in class — how to capture fairness, for example. But LLMs are really good at Python, and one can now outsource some of this work.”
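What does it look like to “capture fairness” in a model? Here is a minimal sketch, assuming a toy budget-allocation problem solved with the open-source SciPy library; the numbers, variables, and formulation are illustrative, not drawn from Iancu’s course materials.

from scipy.optimize import linprog

# Toy model: split a $100 budget between two groups.
# Group A yields 3 units of benefit per dollar; group B yields 2.
# Fairness floor: each group must receive at least $40.
result = linprog(
    c=[-3, -2],                       # linprog minimizes, so benefits are negated
    A_ub=[[1, 1]], b_ub=[100],        # total spending cannot exceed the budget
    bounds=[(40, None), (40, None)],  # per-group fairness floors
)
print(result.x)  # [60. 40.]: the fairness floor binds for group B

An unconstrained solver would pour the entire budget into group A; the bounds line is where the fairness requirement enters the model, and it is exactly the kind of constraint a student needs to recognize and verify in LLM-generated code.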
While Iancu still devotes class time to teaching Python, his students can now work with an LLM to write code, a shift that requires them to develop prompting skills and the ability to assess the results AI delivers. “To be able to verify that what they’re getting from the tool is correct, they have to understand the coding and the model,” he says. “I encourage them to not immediately go to the tool but to think first on their own, because I want to make sure that they have the correct model and are solving the right problem in the right way. I want to make sure students actually engage critically with the LLM.”
For students who are struggling to keep up, LLMs are no replacement for working through problems with an instructor. They’re also no substitute for conceptual mastery, and Iancu says evaluating comprehension remains a moving target. “Ideally, I’d assess students without access to the tools, because you just want to test the concepts. But that feels a little duplicitous because it doesn’t emulate the real world, and whenever I teach, I want to be as close as possible to actual use cases. I don’t think I have a full recipe just yet.”
Iancu understands why some of his colleagues prohibit the use of LLMs in their classrooms, though he believes that policy is feasible only in the short term. “It’s not trivial how to strike the right balance,” he adds. “But students have access to these tools and they are becoming more powerful, so if we don’t allow the tools, we’re adding guardrails and making them less productive than they could be.”
It’s an equilibrium that may prove increasingly difficult to maintain as the technology continues its rapid progress. “If you’d asked me five years ago where we’d be today, I would’ve said this was thirty years out.” — Michael McDowell