Podcast: Breaking Barriers in Tech with Rana el Kaliouby


Photo: Rana el Kaliouby with her book, Girl Decoded.

For this episode of Shaping the Future™, we are joined by Rana el Kaliouby, founder of Affectiva and author of the book Girl Decoded, now out in paperback. Rana is an AI thought leader, a World Economic Forum Young Global Leader and a Young Presidents' Organization member, and co-hosted a PBS NOVA series on artificial intelligence. You can keep up with Rana's latest innovations on her company's website, check out her TED talk, or follow her on Twitter, Instagram and Facebook.

A full transcript of the episode appears below; it has been edited for clarity.

You can follow Shaping the Future wherever you listen to podcasts. Subscribe on Apple Podcasts, Spotify, Google Podcasts, Stitcher, or iHeartRadio.

The views expressed in this podcast are those of the guest and do not necessarily represent those of HMH.

Matthew Mugo Fields: Today, we're speaking with Rana el Kaliouby, a pioneer in artificial intelligence, about the future of AI, particularly how emotion and increased diversity in technology can advance this field. Rana is co-founder of Affectiva and the author of the book Girl Decoded.

Rana's path led her from a life in Egypt to her doctorate and post-doctorate studies at Cambridge University and MIT. A passionate advocate for humanizing technology, ethics in AI, and equity, Rana has been recognized on Fortune's 40 Under 40 list and on Forbes' Top 50 Women in Tech.

At Affectiva, Rana and her team attempt to, as they say, "humanize technology before it dehumanizes us." The results of their work are programs that can detect and respond to emotions as users interact with a wide array of technology products. We sat down and talked about her path as a student, the importance of mentors, and her role as one of the few women founders of a VC-backed technology startup.

Now here's my conversation with Rana.

Welcome to the Shaping the Future podcast. It's so, so great to have you.

Rana el Kaliouby: Thank you for having me, Matthew.

Matthew: Now, before we jump in, I have two confessions to make.

Rana: Uh oh.

Matthew: Exactly. Ever since reading your book, Girl Decoded, I have been super excited, geeking out actually, to have this conversation with you for two big reasons. Number one, I have worked in the area of AI in the education field for about the last 10 years. And in doing that work, I had the pleasure of working with some of the early pioneers of what we call traditional AI. You are a pioneer of a whole new branch of AI, Emotion AI. And it is not that frequent that folks get to talk to people like you, who are pioneering a whole new thing. And I'm so excited to explore that with you.

But the other reason I'm excited to have you on the podcast is as both an educator and a #GirlDad. I'm super excited to explore your journey. You have such a potent and powerful journey that has so many lessons for all of us, but particularly for young women. And I'm really excited to have you with us. We always like to start with a discussion of your sort of early years, your early academic life. And so I would ask you to start by telling us a little bit about young Rana, who you describe in the book as a nice Egyptian girl.

Rana: Totally. So I am originally from Egypt. I was born in Cairo, actually to two parents who are technologists. My dad taught COBOL programming, which is an extinct programming language at this point, but he taught it in the seventies. And my mom signed up to take this computer programming class. She must have been one of the very first women in the Middle East to be in tech. And they met and hit it off and got married. And so, I grew up very much around technology, and both my parents were big supporters of our education, which I'm super grateful for. I have two younger sisters, and they invested in our education. And I really think that's the biggest investment they've made for us. And I'm so grateful. And I try to pay it forward with my kids. But yeah, I grew up surrounded by technology, and had this realization: as a nice Egyptian girl from a pretty conservative family, I was not allowed to date until after college. And I had a very strict curfew growing up.

But there was this commitment to education and being a go-getter. It shaped who I am. And being surrounded by technology made me realize the power of technology in helping people connect with one another. And that's been a common thread in my work.


Matthew: Your mother must have been a pioneer in her own right. Back in, I guess, in those days, taking your dad's class, was she a pioneer and did that inspire your journey?

Rana: She was definitely a pioneer. I will say she worked her entire career, but it was always clear that her career came second. So she was never allowed to talk about work at home. I also now realize she was never allowed to travel for work, which is interesting. So, growing up, I knew she was a career woman, but I didn't really understand the sacrifices she had to make. She's definitely a role model for me and my sisters, because she was one of the very first kind of techies in the Middle East.

Matthew: Oftentimes, even when people have the kind of family support or family inspiration that you have, they receive support from, encouragement from, mentorship from educators in their lives that help add some rocket fuel, if you will, to them. Were there any educators that kind of stand out in your journey? And if so, who were they? What'd they do for you?

Rana: Absolutely. So, the first one is my math teacher in high school, and she was smart, and she was also super hip and cool and cute. And I was like, "Oh my God, you can be a mathematician, and you can also be super cool." And she just really was a big cheerleader for me. And I credit her for my interest in science and STEM, basically because she made me believe in myself, and she was also a great role model. So she was definitely instrumental in my choosing computer science as an undergraduate. But then fast forward many years to this MIT professor, Rosalind Picard. I ended up reading her book, Affective Computing, in Egypt in 1998, where she posits that computers need to have emotions and understand emotions. And I was so inspired by her work, and I followed her work for years, and eventually got to meet her in person in 2004, while I was doing my doctorate at Cambridge.

And we hit it off, and she became my mentor and my sponsor. She brought me over to the United States as a post-doc at MIT. And then we later co-founded our company together. She's definitely somebody I admire, and I look up to, and she took a risk on me. She took a risk on this weirdo Egyptian girl, right?

Matthew: I want to complete the exploration of your academic journey, and then we'll get into your professional journey. So, tell us a little bit about what took this, again as you say in the book, nice Egyptian girl to Cambridge University, miles and miles away in the UK? What drew you there? What was that all about?

Rana: My initial dream, if you like, was to become faculty at the American University in Cairo, where I did my undergrad, and I knew to do that, I had to go abroad and get a PhD. So, at the time, I was a new bride, I just literally got married, and I knew I couldn't come to the United States because it's too far. My husband at the time had to stay based in Cairo because he had a company there. So I was like, okay, the UK is close. So I applied and got a scholarship to go do my PhD at Cambridge University. And I basically left my family behind, moved to Cambridge, and I focused on building the very first emotionally intelligent machine. I was so fascinated by how technology changes the way people connect and communicate with one another. Yet all of our machines are so oblivious to our emotions. They're emotion blind. And so that set me on this journey to ask the question, "What if computers could understand human emotions? What would that look like? And what would that unlock in terms of applications?"

And yeah, that was, that was the beginning of kind of my PhD work. And then, of course, I got so into the research and building the actual technology that I kind of moved on from my initial dream of becoming faculty. And I wanted to do more.


Matthew: So based on what you just said, I want to explore this idea of connecting something like emotion and AI, which you were deeply curious about and interested in, with an academic course of study. Now, you were talking about it in the context of a PhD and advanced study, but I think there's something about that that applies even to K–12 education as well. So, what was it for you that got you super curious about marrying together emotion and AI and making that a focus of your studies?

Rana: I think it is really important to find something that you're passionate about. And as you said, that's almost age-independent, because when you're intrinsically motivated, that's when you're driven. That's what gets you out of bed. That's what makes you go the extra mile. I actually use that philosophy with my team at Affectiva, too. It's so important that people feel connected to the work they're doing. Otherwise, it's just like a job, and it's onerous, right? You want people to be excited about what they're doing. So for me, I grew up really interested in people's nonverbal communication. I attribute it back to being in middle school and high school and not being able to date, but watching other people. Watching all the eye-gaze exchange and all the non-verbal [cues]. So I was always fascinated by that parallel language that humans employ.

And actually, it's the majority of how we communicate; 90% of our communication is non-verbal. It's facial expressions, vocal intonations, hand gestures, body movement. It's not the words you're saying. The words you're saying are only about 10%. So, I was really interested in that, and I was just also intrigued by the role technology has continued to play in our lives, but it was missing this understanding, and I wanted to marry these two universes. So yeah, that was what got me started on this over 20 years ago now, which is crazy.

Matthew: Wow.

Rana: And it was way before we had smart phones. It was way before cameras were ubiquitous, but I kind of saw that this was coming, and I wanted to play a role in building that future, shaping the future.

Matthew: Yeah, exactly. Shaping the future; that's amazing. Guys and girls making googly eyes at each other led to a whole new branch of AI. Yes, you heard that here.

Rana: Exactly.

Matthew: That's great. That's great. So, you take on the sort of courageous journey, newly married, off to Cambridge University. That had to be sort of a culture shock to your system in some way. What was that like?

Rana: It was definitely a culture shock. It was also challenging because it was right after September 11th. It was literally like 10 days after September 11th happened, because that was when the semester started. And at the time, I used to wear a hijab. So, I was very visibly Muslim, and my parents and my in-laws were against this. First, I was married. You know, I wasn't really supposed to leave my husband behind and just go off and do my thing. Second, they were concerned for my safety. So I decided to take off my hijab and put a hat on. I would literally show up to class in these British hats, which was really awkward. But that was also the time when I realized my smile is my secret weapon.

I realized that when I smile, it's like me saying, "Hey people, I come in peace." And I built really amazing relationships at Cambridge and friendships and grew a lot. Became very independent as a human, which of course, ultimately the trade-off was, I think it hurt my marriage because we had this long-distance relationship, and we were apart. But yeah, it was a really transformative time in my life. And again, it's the power of education, right? Like it opened my brain and my eyes to a whole new world that I wasn't aware of.

Matthew: That resonates deeply. You were able to, in some ways it sounds like you're describing a process of self-discovery and an awakening of sorts for you, even though you were in this culturally foreign environment.

Rana: Did you have a similar experience?

Matthew: Yeah. Yeah, I did. Mine was a lot younger. I came to the United States when I was 10 and a half. And so it began then but really stuck for me in college. And that awakening had a lot to do with figuring out my place in the world and how I was going to do for others what had been done for me. So that was my version of emotion and AI for you. The thing I was obsessing over is how I can do for other kids, what the village, if you will, of people that kind of helped me get over, what they did for me. And so that's been the sort of theme of my career. But we're talking about you.

Rana: But all these universal themes, right. They're universal. Yeah.

Matthew: Yes, absolutely. Absolutely. So, it's while you're at Cambridge that you then discover that there's this kind of burgeoning field of research around this thing that you have been thinking about and how does that happen and how do you get involved?

Rana: It was interesting because it was very new. So, MIT was on it already; they were building all of this stuff. But Cambridge didn't really have anybody working on Emotion AI. I was the first PhD student to explore the space, and my PhD advisor was Professor Peter Robinson. He's your typical older British professor, right? He did not at all get why emotions had any place in technology. So he was very skeptical. And I remember walking into his office, and he'd have this very kind of stiff British face, no expressions at all. And here I am, talking about facial expressions and the power of the face. And nothing, right? Poker face in return. And I'd walk out of his office in tears. And I'd be like, "What have I done with my life? I'm doing a PhD about emotions with somebody who doesn't care about emotions."

But over the course of four years, we just built an incredible relationship. And he now runs a whole team of researchers who specialize in Emotion AI. So I'm kind of proud that I got him over to the other side where he believes that emotions matter in our lives.


Matthew: There's a lesson in there about your own conviction and perseverance, and that helping to transform the perspective of someone who was a mentor or someone who professionally was ahead of you. And I think that's powerful. And that's a powerful lesson for many, many people.

Rana: Yeah. You run into a lot of naysayers and a lot of skeptics, and you have to either decide that you are going to forge ahead anyway, with or without their buy-in, or you want to get their buy-in. If you want to get their buy-in, it's a process, and you have to build a lot of trust. You have to work on it. And I think that has definitely been the case in my career, especially with the company.

Matthew: Hey, Shaping the Future™ listeners, do you know where you can get the best digital curriculum all in one place? If you're a school or district leader, team up with HMH today to help all students and teachers reach their full potential with our new HMH Anywhere solution. HMH Anywhere offers educators seamless access to core curriculum for language arts, math, science, and social studies, paired with AI-driven, personalized practice, and online professional development for teachers. And all supported by a unified assessment system that precisely tracks students' growth. HMH Anywhere also integrates with tools like Zoom, Google Meet, and Microsoft Teams for easy remote teaching and learning. Check it out at Now, back to the episode.

So I mentioned at the outset, you're a pioneer of a whole new branch of AI. We have mentioned casually this thing, Emotion AI, but for our listeners, could you just do the sort of basic explanation? What is Emotion AI, how did it come to be, and how is it used?

Rana: Sure. So I'll start with human intelligence. In human intelligence, you have your IQ, your cognitive intelligence, but you also have your EQ, your emotional intelligence. And we know from over 60 years of research that people who have higher EQs (and certainly all educators would resonate with that) tend to be more productive and personable and persuasive in their work and in their lives. So, I believe this is true for technology as well, especially technology that is so deeply ingrained in our everyday lives like AI is. So my hypothesis is that for all these devices we interact with on a daily basis, it's not enough to have IQ. Siri has a little bit of IQ, because you can talk to Siri and talk to Alexa. But they have no EQ. They have no emotional intelligence.

Emotion AI, or artificial emotional intelligence, is this idea that we build technology that can quantify and capture these rich nonverbal signals like facial expressions. Or sometimes, we do vocal intonations: how fast you're speaking, how much energy is in your voice. All of these signals, we're now able to capture them using technology. And it unlocks a lot of use cases.

Matthew: And the face is important in the expression of emotion. Why?

Rana: Ooh, the face is one of the most powerful canvases for expressing one's...not just emotions, but your cognitive states or social state. There's about 45 facial muscles that drive what expressions we make, and in the seventies, this guy called Paul Ekman—there's a show that was modeled after his work. It's called Lie to Me. And basically, he mapped every single facial muscle to a code. So, when you smile, it's code 12. When you do a brow furrow—angry—it's like action unit four. And he built a system so that people could become face readers, like certified face readers. But it's still laborious. It takes a hundred hours of training, and it's just so time-consuming.

So instead, we use machine learning and computer vision to train machines to do that automatically. So when it sees your face, Matthew, it can very quickly say, "Ooh, I can see the 12 plus four plus three plus 17," and just map out your expressions. And then there's a different level of coding that takes it to an emotional state or a cognitive state: Do you look tired? Do you look confused? Do you look excited, interested, happy, sad, et cetera?

Matthew: And so, you really took this idea and your research here. You went with Rosalind to the MIT Media Lab, where you started as a researcher, but then you two got together, co-founded this company Affectiva that you're now the CEO of, and spun it out of the university. So, you're not only a pioneer in this whole new branch of AI; the fact that you are a founder CEO of a venture capital-financed tech startup is also a pioneering part of your journey. And I'd love for you to talk a little bit more about that aspect of it: taking this thing from being a research project to becoming a company.

Rana: Yeah. So when I got to the MIT Media Lab in 2006, the Media Lab was very unique as an academic institution in that it's very interdisciplinary. It's where the misfits are, but also, we were very tied to industry. So twice a year, all of these Fortune 500 companies that sponsored the lab would be invited in. It was actually called "Demo or Die." You couldn't just show up with a PowerPoint or just talk; you had to show a prototype of what you were building. So, leading up to these weeks, twice a year, we would just work overnight. We would spend the nights in the lab building these prototypes. For a few years in a row, all these companies wanted to buy the technology. Procter & Gamble wanted to test their new products using this. Bank of America wanted to track customer experience.

We had Pepsi that wanted to test their ads with this, right? We just had this list. Toyota wanted to track driver drowsiness. And I literally kept a log of all these different use cases. And I had no way of giving them the technology, because in an academic setting, there's no mechanism to do that. And when the list got to about 20 different companies, Roz and I went to the Media Lab director at the time, Frank Moss, and we said, "Hey, Frank, we need budget. We need more researchers on this because we're ignoring our sponsors." And he said, "This is not research anymore. This is a commercialization opportunity." My knee-jerk reaction was like, "Wait, Frank. I'm about to apply for a faculty position at MIT. Don't mess with my plan." But then I thought about it. And it was a really unique opportunity, as you said, to take something I'm deeply passionate about and bring it to the world at scale, which often, academia isn't really set up to do well.

Yeah. So we started Affectiva in 2009. We are venture-backed. We've raised over $15 million of venture and strategic funding. And it has been a roller coaster, an emotional roller coaster, for sure.


Matthew: Yeah. I can imagine having gone through a similar process of commercializing research. In my case, it was Stanford, and it was AI in education. That's not an easy process by any stretch, but in your case, I think, I don't know what the latest numbers are, but it's something like less than 5% of venture capital funding goes to women founders. And you're a part of that very small number. Tell us a little bit about that part of the journey and about that process for you.

Rana: Yeah, it definitely was not easy. So, when Roz and I decided to spin out, because of our connections to MIT, we were able to get a lot of meetings, but we also got a lot of "no's." I remember we did the whole Sand Hill Road show, where you've got all these back-to-back VC offices, and you have meetings with a lot of them. But I want everybody to picture this: two women scientists, and I was visibly Muslim because I used to wear the hijab at the time. And we were pitching an emotion company, but we never met with a woman funder. It was all like older white dudes. So it was hard. We did not look like anything that they saw or invested in. So that was another example where we had deep conviction, and we just persevered.

And I have to give Roz a lot of credit for that, because she really taught me that if you believe in this, you don't take no for an answer; you just keep going. And we kept going, found our initial funder, got that first check, and went on from there. We've raised money from top-tier investors like Kleiner and others.

Matthew: For those of you who don't know, Sand Hill Road is a famous stretch of road in the heart of Silicon Valley, where many of the top venture capital firms in the world have offices. And so it is not uncommon for entrepreneurs who are trying to start companies and get financing to do a tour and go from office to office. I've been on that same tour myself. And it is all about having perseverance. To your point, once the first domino dropped, you were able to get others on board. And again, another lesson in your own perseverance. I want to ask you, as you think about and reflect on your journey, both as a scientist and as a CEO, what advice do you have, particularly for young women out there who may be encountering some of those harsh no's?

Rana: I think my biggest advice would be to not let yourself be your own biggest obstacle. I'll share an example around that. So when we started the company, we decided, nudged by some of our investors, to hire a seasoned executive to run the company. So my role was chief technology officer, and we hired the CEO. And he was there for a few years. He was great. And then he decided to move on; he was commuting back and forth from the Bay Area to Boston, and it was really tough on his family. And the question became, "All right, who's going to be the next CEO?" And a few members of our board said, "Rana should be. It's her baby. It's her technology." Roz, my co-founder, stayed at MIT. She's a tenured professor. So, she never left MIT.

And I went back home, and I thought about it. I was like, "I've never been CEO before. I don't want to fail." And so, I went back the next day, and I said, "I'm sorry, I can't do it." At the same time, our head of sales, who had also never been CEO before and had only been with the company for a year and a half, was like, "I'll take it. I'll take the job." So, he became CEO, and he was CEO between 2013 and 2015. Then, literally on my way back from my TED talk, I was googling, "What are the roles and responsibilities of a CEO?" And it hit me that I was doing it. I was raising money for the company. I was the face of the company and the tech category, very involved in our product and product market strategy. I was like, "Wow, I'm doing the job."

It took me convincing myself that I could do it before I was able to convince the world. Because I went to him and we negotiated it. We took it to the board. It was a unanimous vote for me to step into the CEO role. But in a way, I was my own biggest obstacle. And I tell young people, especially young women, don't be that, right? Don't wait until you check 150% of all the boxes; men typically check just 30% of the requirements and they're in. I honestly still struggle with that voice of inner doubt. I still have to navigate my way through that.


Matthew: That's very honest. There's nothing wrong with being aspirational. And if you have the conviction and you are willing to work hard, we all learn on the job. Right?

Rana: Right.

Matthew: Yeah. We talked a bit about this feature of your journey as, in your own way, confronting gender bias, or under-representation, which is notorious in areas of technology, right? I've asked you a couple of times to share lessons and thoughts for young women. What advice do you have for men, so we can ensure that we are less biased as we move around in the world?

Rana: Oh, I love that. I will say honestly, I've had so many amazing male mentors and male allies, and I'm so grateful to them. So be a male ally, be a male mentor, right? That's number one. Number two, especially in AI, and you've just alluded to that, if you're not intentional about fighting bias, bias is going to creep into your algorithm. And then you're going to deploy this technology at scale around the world. And before we know it, we've accentuated the biases that exist in society, and now we've just deployed them at scale.

Matthew: That's right.

Rana: I think it's so important that as leaders, if you're a thought leader or in a position to be a hiring manager, you make sure your team is diverse. I don't mean just gender diversity. I mean ethnic diversity, age diversity, diversity of backgrounds and experiences. Because we each have our blind spots, and we come to the table with our prior experiences, and we bake them into how we design solutions. So, the more diverse voices you have around the table, the less biased these technologies are going to be. I'm very passionate about that. I see this as the biggest risk we have with AI right now. The only solution is to bring diverse voices around the table, in my opinion.

Matthew: I'm glad you went there. And by the way, thank you for the advice. Part of the challenge with the biases is this idea that many of us don't even consciously realize how much pattern recognition plays a role in our decision-making and even our value system. We're looking for matches. I'm using that language very deliberately, and you're shaking your head because you know why, because that's the essence of AI. It's about recognizing patterns and then matching it when new data presents itself.

Rana: Totally.

Matthew: And that is inherently biased by whatever came before. So, if you're a venture capitalist and all the successful startups you funded were by a certain kind of person, whether you're conscious of it or not, you're looking for that, whether you're hiring or thinking about who is truly the next AI disruptor, right? And how you program the algorithms that are in your products can carry these biases, unintentionally. It doesn't have to be nefarious. And the point you made, which I think is super powerful, is we've got to be mindful and conscientious and proactive about addressing this stuff, both in terms of who we have at the table and who we build solutions to support.

Rana: 100%. Absolutely. It's often not done intentionally, but that's not an excuse.

Matthew: Yeah, yeah, check those default settings, people.

Rana: I love it, yeah.

Matthew: Change those default settings. Yeah. Great.

Rana: That's a great way to actually capture the bias problem. I love it. I might, what's the word...Borrow it.

Matthew: Borrow it, borrow it. You're free to borrow it. So what advice would you give to the educators out there about how to create and help create more Ranas?

Rana: I will say like over the course of my career, I've been so fortunate to have educators that have played a critical role in my life, either as supporters or sponsors or mentors, or challenging my thinking and expanding my brain and my horizon. So I'm so grateful. I see that with my kids too. There's a number of faculty members at their school who have played a really key role in shaping who they are. So don't forget that as an educator, believe and know that you are changing at least one person's life, and there's real power to that.


Matthew: Even when it can get frustrating and exhausting. Certainly this last year of COVID-19 teaching and learning has been exhausting for many educators out there. You're saying, "Hey, don't ever forget that you've got a young Rana in the midst there."

Rana: Absolutely, one who's listening and watching and taking it all in. And you never know, right? You never know; that little thing you do or say can inspire a young person and put them on a path that changes the trajectory of their life. Don't underestimate that.

Matthew: That's great advice. Hey Rana, so you mentioned that you were on your way to being a faculty member, and that was your sort of path. And then you got sucked into what became an entrepreneurial journey. When you were on that faculty path, did you have a guiding philosophy of teaching? Guiding philosophy of what it means to be a teacher? And if so, what was it?

Rana: Ooh, that's a great question. I did actually end up teaching at the American University in Cairo and then at MIT for a couple of semesters. And I'm back at the Harvard Business School as an executive fellow. I think my philosophy is, again, to tap into this intrinsic motivation: put people on things that they really care about. Causes that they care about. Questions that they're intrigued or curious about. So, I try to leverage this intellectual curiosity and the drive and passion, and we then take the core or basic principles and customize them depending on what that passion area is. Because I think it's really important to leverage that intrinsic motivation if possible.

Matthew: That makes a ton of sense. And it certainly resonates with me. What does "shaping the future" mean to you? When you hear those words? What does that conjure for you? What do you think about?

Rana: Oh wow. So, in the AI space, and in Emotion AI in particular, I really see a future where the way we interact with machines is exactly the way we interact with humans: through conversation, through perception, through empathy. And there are many applications of this technology, including in education and how educators engage with their students and vice versa. Definitely, throughout the last year, we've seen how digital communication plays a key role in learning. So, I'm excited about weaving in Emotion AI and making it ubiquitous. I think it's going to shape not only how we interact with our machines but ultimately how we connect and communicate with one another. And that's the golden opportunity. That's what I find really exciting.

Matthew: An example I've heard you mention in the past is a student's learning application that could detect, based on their facial expression, if they were frustrated or disengaged somehow, or ready to move faster. Those are the kinds of things you're talking about.

Rana: Exactly. Or imagine if you're an instructor and you're addressing, I don't know, 20 students, and you can't really see all their faces in real-time. So, what if this technology could capture the gist of the energy in the room or the energy of an audience and give you real-time feedback? That's another one that I would have loved to have over the last year.

Matthew: Yep. Yep. Give you some feedback on, "Hey, you need to pep up the Zoom." So, it has been great to have you. We have a mostly educator audience, and I wanted to give you the last word to say anything you wanted to the educators out there.

Rana: I mean, as I look back on my career, it's all of these key educators that have shaped my life, either challenging the way I think or supporting me on my path. And I see that with my kids. A number of faculty members have played an instrumental role in my kids' lives as we moved from Egypt to the United States. And I'm ever so grateful. So just know that your job can really make a difference to your students. And that's amazing. Thank you for having me.

Matthew: Thank you. This was awesome.

Thanks for listening and learning with us. You can be the first to hear new episodes of Shaping the Future by subscribing on Apple Podcasts, Spotify, or wherever you listen to podcasts. If you enjoyed today's show, please rate, review, and share with your network. Give us a shout out on your favorite social media platforms.

We value our listeners' support and feedback. Email us at Shaping the Future is produced by HMH. Thanks for listening.


SHAPING THE FUTURE is a trademark of Houghton Mifflin Harcourt Publishing Company.
