April 23, 2026
by Michael McDowell

In the second week of class, Yuyan Wang had about 35 MBA students experiment with the finer points of tuning a neural network, looking under the hood of a model similar to those that make recommendations on platforms like YouTube and Amazon.
Back to Class
In this ongoing series, we bring you inside the classroom to experience a memorable Stanford GSB course.
“It’s an art, not a science,” she said as students fiddled with algorithmic configurations in a simulated classification problem (distinguishing a cat from a dog, for example). For most, it was likely their first experience adjusting the settings that control how a machine learning model “learns,” or optimizes itself based on data to improve its predictions.
“You do not know what neural network configuration is going to be the best one for a data set,” Wang explained. “This is why it’s called hyperparameter tuning. You have to basically kind of trial and error, try out a few values, and then look at the model’s performance in the predictive scenario and see which type of parameter configuration gives you the best performance.”
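The trial-and-error loop Wang describes can be sketched in a few lines of Python. This is a hypothetical illustration, not code from the course: the `train_and_evaluate` function is a stand-in for actually fitting a small classifier (cat vs. dog, say) and scoring it on held-out data.

```python
# Hypothetical sketch of hyperparameter tuning by trial and error:
# try a few configurations, score each on validation data, keep the best.
import itertools
import random

random.seed(0)

def train_and_evaluate(learning_rate, hidden_units):
    """Stand-in for training a classifier and returning validation
    accuracy. A real version would fit a neural network and score it
    on a held-out validation split."""
    # Fake a noisy score that happens to peak at mid-range settings.
    base = 0.9 - abs(learning_rate - 0.01) * 5 - abs(hidden_units - 64) / 1000
    return base + random.uniform(-0.02, 0.02)

# A small grid of candidate hyperparameter values to try.
learning_rates = [0.001, 0.01, 0.1]
hidden_sizes = [16, 64, 256]

best_score, best_config = float("-inf"), None
for lr, h in itertools.product(learning_rates, hidden_sizes):
    score = train_and_evaluate(lr, h)
    if score > best_score:
        best_score, best_config = score, (lr, h)

print(f"best config: lr={best_config[0]}, hidden={best_config[1]}")
```

The key point is that nothing tells you in advance which cell of the grid will win; you run them all and compare measured performance, exactly as Wang's students did in class.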
Wang, an assistant professor of marketing, began teaching Understanding AI Technology for Business Problems — one of the school’s first technical courses on AI — in January 2025. Before joining Stanford Graduate School of Business in 2023, she spent nearly seven years in industry, working at Uber and Google’s DeepMind as a machine learning scientist and engineer.
That experience informs her approach to the material. “I found that the biggest challenge of launching a project or making some real-world product impact is actually not technical, but rather in the communication: convincing leadership, operations, or product teams that this is the right thing to do,” she says. “We speak different languages when it comes to AI.”
Wang’s aim is not to immerse students in the intricacies of algorithmic design but to nurture future leaders who are conversant in the vocabulary of AI and attuned to the potential of agentic AI. She uses a mix of formats, from lectures and mini-case studies to practitioner visits. Anthropic’s Jascha Sohl-Dickstein, a co-inventor of the diffusion models that power image generation, visited her classroom during the winter quarter, as did experts from OpenAI and startups like Neo4j and LangChain. Hands-on lessons give MBAs a better sense of the nuts and bolts.
“I had a question around model construction hyperparameters,” one student said, referring to the number of layers of artificial neurons that compose a model. “How do you decide between having just, for example, one neural network layer, or, like, 100?”
“That’s a great question,” Wang replied. “The rule of thumb is, usually the larger the data set, the more dimensions of the input, then the larger the neural network should be.” A neural network’s ideal size depends on how it will be used, she emphasized. “You can’t throw data and compute at your problem and hope it will magically spit out a model that will maximize your objective.”
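One way to see why that choice matters is to count trainable parameters. The sketch below is an illustration of my own (not from the course): it tallies the weights and biases of a fully connected network for two hypothetical layer choices, showing how quickly model size, and with it the data and compute needed to train well, grows with width and depth.

```python
def mlp_param_count(layer_sizes):
    """Number of trainable parameters (weights plus biases) in a fully
    connected network with the given layer widths, input layer first."""
    return sum(
        in_dim * out_dim + out_dim  # weight matrix plus bias vector
        for in_dim, out_dim in zip(layer_sizes, layer_sizes[1:])
    )

# A 100-dimensional input and a single output (cat vs. dog), with
# either one small hidden layer or three wide ones.
small = mlp_param_count([100, 16, 1])             # 1,633 parameters
large = mlp_param_count([100, 512, 512, 512, 1])  # 577,537 parameters
print(small, large)
```

The larger network has roughly 350 times as many parameters, which is why, as Wang notes, it only makes sense when the data set and input dimensionality are large enough to support it.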
The day’s homework would involve further model training on a much larger data set.
“The most difficult part of applying AI to solve a business problem is not building a fancy model but translating a business objective,” Wang says. “That translation part is not something technical people are capable of doing. But if you only have the business insights, then it’s often not that helpful. I want my students to have the ability to translate a business problem into something that AI can solve.”