By Folu Opaleye
My journey through IFS740 has recently shifted from just memorizing definitions of “strategy” to actually experiencing how it operates in a state of permanent volatility. Strategy isn’t a static plan you print out; it’s a living, breathing response to a world that won’t stop changing. This post is my deep dive connecting our group fieldwork with informal food vendors in Belhar and Bellville to the slightly terrifying ethical frontier of AI, and the leadership mindsets we actually need to survive it.
Week 5: Learning to Fail “Rough and Simple”
During our Week 5 “Maker Week,” I had to have a serious talk with my perfectionist tendencies. We were diving into Strategic Experimentation, and the lecture material was essentially shouting at me that prototypes should be “rough, simple, and quick.”
In our group, we really struggled with “deferring judgment.” When we started brainstorming ideas for our informal food vendors, our solutions were so rigid! We were obsessing over how “realistic” or “technically perfect” an idea was before we even understood whether it had value. It was a total rookie mistake. I learned that in a volatile environment, uncertainty isn’t a wall you hit; it’s the room where strategy is born.
We had to learn to fail fast. We realized that for a vendor in a township, a “frugal innovation” approach is infinitely more strategic than some high-end, over-engineered system. By applying the Sachs and Kundu mindset shift of moving from Control to Empowering, we stopped trying to “manage” the vendor and started trying to “equip” them. Prototyping taught me that failure is just a strategic capability; every time an idea didn’t work, it pointed us closer to the ones that would: our voice-ledger and QR scanner.

Week 6: The “Cape Town Outlier” and the Empathy Gap
Week 6 was when the theory finally got a face. Heading out into the Belhar and Bellville communities completely shattered my “techno-optimist” goggles. I went in assuming that these vendors just needed “better tech” to succeed. I was wrong.
Our empathy maps revealed a heartbreaking “dual reality.” One vendor told us she feels completely “invisible” the second the university students leave for the holidays. We dubbed this the “December Vacuum.” Another vendor was literally trying to serve 100 hungry students a day using a single domestic stove in her home kitchen. Her “physical infrastructure” was a ceiling she couldn’t break through, no matter how much software we gave her.
These real-world constraints changed my entire understanding of strategy. I realized that Digital Adoption isn’t a choice of ‘will’; it’s a choice of ‘affordability’ and ‘trust.’ If a vendor has to choose between a data bundle and a loaf of bread, the bread wins every time. We had to pivot our thinking to the Society 5.0 framework, which is all about an “Imagination Society” where technology actually solves social disparity rather than just creating fancy gadgets for people who already have everything.

Week 7: Weapons of Math Destruction and the AI Frontier
By Week 7, we were deep into our specific solutions: a voice-ledger and a QR code scanner. But my “innovation high” was replaced by what I call a “responsibility hangover” after reading Cathy O’Neil’s Weapons of Math Destruction. Her argument hit me like a ton of bricks: algorithms are just “opinions embedded in code.”
This is a massive ethical dilemma for our project. If we design a voice-ledger to help a vendor in Belhar track their manual records, but we train the AI behind that voice recognition on “formal” business English or Western dialects, we will systematically exclude every single entrepreneur we met during our fieldwork. That is Algorithmic Bias in its purest, most damaging form.
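One concrete way to catch this kind of bias early is to measure the speech model’s word error rate (WER) separately for each dialect group instead of reporting a single average. This is a minimal sketch of that idea; the transcripts and group names below are hypothetical placeholders, not output from any real model we built:

```python
# Minimal per-group fairness check for a speech-to-text model:
# compare word error rate (WER) across dialect groups. A large gap
# between groups flags possible algorithmic bias.

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via Levenshtein edit distance over word tokens."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

def group_wer(samples):
    """Average WER per group from (group, reference, transcript) triples."""
    scores = {}
    for group, ref, hyp in samples:
        scores.setdefault(group, []).append(wer(ref, hyp))
    return {g: sum(v) / len(v) for g, v in scores.items()}

# Hypothetical evaluation samples: (dialect group, what the vendor said,
# what the model transcribed).
samples = [
    ("formal_english", "sold ten loaves of bread", "sold ten loaves of bread"),
    ("kaaps", "ek het tien brode verkoop", "ek het tien brood gekoop"),
]
print(group_wer(samples))  # the gap between groups is the warning sign
```

If the model scores near zero for “formal” speech but badly for everyone we actually met in Belhar, the average hides exactly the exclusion O’Neil warns about; disaggregating by group makes it visible.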
I’ve learned that I have to be the one fighting for transparency. We can’t let the ‘mysterious’ nature of AI become a new way to sideline the very people it’s meant to serve. For me, ethics aren’t just something to list on a CV to look good; they are the only way to make sure our tech does what it’s supposed to do: help people, not harm them.

Week 8: Digital Leadership in “Permanent Beta”
Finally, Week 8 synthesized everything. I’ve realized that Digital Leadership in 2026 isn’t about being a tech guru who knows the most code. It’s about being a Synthesist: being able to take the messy, lived realities of an entrepreneur who works alone to cut staff costs and synthesize those needs into a tool that actually works.
Reflecting on those Sachs and Kundu mindset shifts, I’ve realized that to lead in the South African context, I have to move from Privacy to Transparency. My job isn’t just to build a platform; it’s to lead with an Inclusive Vision.
To lead responsibly in a complex environment, you need the People, Process, Technology triangle to be balanced by Purpose. Our project isn’t just about a QR code; it’s about empowering a human being to survive until February when the students come back. That kind of leadership requires empathy, ethical reasoning, and a willingness to always learn, adjust and improve.

My Evolving Stance
So, what’s actually changed in my head?
At the start of this semester, I viewed Strategic IS Management as a corporate toolkit that’s used to get a competitive edge in a fancy boardroom. Now I see it as a moral compass.
By mixing design thinking with real empathy for the resource-constrained, and keeping a skeptical eye on how AI handles data, my professional identity has been flipped. I’m no longer just an analyst; I’m an advocate for an AI-Ready workforce that actually values diversity and ethical reasoning over simple automation. I’m ready to build tech that fits the world we live in, not just the one we see from a laptop screen.