Ever wondered how artificial intelligence is reshaping our world and what it means for your future career? I'm certainly following this topic along with my university colleagues.
Join me for an enlightening conversation with TIM HAYDEN, CEO of Brain+Trust Partners and Texas State University alumnus, as we dissect the rapid evolution of AI. Discover how intuitive tools like Perplexity are transforming the way we access information and why hands-on experimentation with AI tools is crucial to understanding their full potential. We also explore how students and faculty can harness AI to enhance their academic endeavors while maintaining the integrity of their educational experiences.
In this episode, Tim draws a fascinating parallel between the historical impact of the shipping container on global trade and the revolution AI promises to bring to various sectors. We discuss the importance of adopting a growth mindset and empathy to successfully navigate technological advancements.
And, as I do with all my guests, I asked Tim about his strengths and natural talents. I hope you enjoy our conversation.
Resources:
Contact Tim Hayden on LinkedIn
The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger by Marc Levinson
The Signals Are Talking: Why Today's Fringe Is Tomorrow's Mainstream by Amy Webb
Quantitative Futurist Amy Webb
Thanks for listening! Please send me your feedback in a text message.
00:00 - Exploring Artificial Intelligence and Impact
17:08 - Implications of Artificial Intelligence
23:45 - Empathy and Growth Mindset in AI
WEBVTT
00:00:00.960 --> 00:00:06.713
Do you want to know about artificial intelligence and the impact it might have on your life?
00:00:06.713 --> 00:00:14.814
Well, I certainly do, and that's why I reached out to Texas State University grad Tim Hayden.
00:00:14.814 --> 00:00:17.987
Welcome to Stories of Change and Creativity.
00:00:17.987 --> 00:00:19.131
I'm Judy Oskam.
00:00:19.131 --> 00:00:26.289
I'm a university professor and administrator at Texas State, and on this episode, I talked to Tim Hayden.
00:00:26.289 --> 00:00:28.783
He is the CEO of Brain+Trust.
00:00:28.783 --> 00:00:33.012
He advises clients about media, data and technology.
00:00:33.012 --> 00:00:49.844
He's been in this space for more than two decades and during our conversation, we talk about AI search, what students and faculty should know, and what we should all know about how to embrace AI and technology.
00:00:49.844 --> 00:00:52.149
I hope you enjoy our conversation.
00:00:52.149 --> 00:00:58.323
Let's start with a quick introduction, and then I'll get into some questions.
00:00:58.484 --> 00:01:15.412
Okay, well, I'm a native Texan, I'm a committed husband, father, son, friend to many and, I think, for us talking today, I'm a devoted Bobcat.
00:01:15.412 --> 00:01:17.566
I'm a forever Bobcat.
00:01:17.828 --> 00:01:18.088
Love it.
00:01:20.206 --> 00:01:24.180
And for the last 20 years, I've owned marketing agencies.
00:01:24.180 --> 00:01:25.620
I've worked for marketing agencies.
00:01:25.620 --> 00:01:46.659
I've started software companies and I've worked for them, and since 2016, I've been in the big data business, really helping companies come around to everything from compliance with data privacy to cybersecurity and personalization of customer experiences.
00:01:46.659 --> 00:01:51.150
And here we are using that data for artificial intelligence.
00:01:51.471 --> 00:01:52.472
That's right, I love it.
00:01:52.472 --> 00:01:56.828
Well, and we hear so much about AI and I mean where do we start?
00:01:56.828 --> 00:02:01.951
Where does the average listener or viewer really start with AI?
00:02:01.951 --> 00:02:05.510
What should they know and how should they go about that?
00:02:06.859 --> 00:02:16.294
Well, you know, back at the beginning of 2010 and 2011,
00:02:16.294 --> 00:02:33.733
I worked for Edelman, the world's largest PR firm, and I had a colleague there (we're still friends), Steve Rubel, who used to say about social media that there's news that finds you and news that you find.
00:02:33.733 --> 00:02:40.247
And AI is very similar in that respect.
00:02:40.247 --> 00:02:52.092
As we start to carry our smartphones everywhere we go, right, we are basically training machine learning and artificial intelligence to be more empathetic with us.
00:02:52.092 --> 00:02:59.614
Right, in terms of the apps we use, in terms of some of the functionality on the phones themselves.
00:02:59.614 --> 00:03:04.907
You know, it watches what we do and it learns from it.
00:03:04.907 --> 00:03:07.433
Right, and that's a generalization.
00:03:07.433 --> 00:03:51.832
But where I'm going with that is just that a whole lot has already happened over the last decade with machine learning and artificial intelligence to make our lives better, or at least more fluid, for better or worse. I promise you, I could probably do a pros and cons of where things are headed and where they've already taken us. But at the end of the day, I think for folks right now that are really curious about artificial intelligence, one of the quickest ways to really play around with it is to start to investigate companies like OpenAI, or Anthropic and Claude, which is their large language model.
00:03:52.933 --> 00:04:21.223
You know, using Perplexity for search as an example, there are things happening right now that are the precursors to challenging and replacing things that we've always done. And when I mentioned Perplexity, I'll try to end on that, but this answer, at least, is that Perplexity is built on the premise that when you search for things, you want answers.
00:04:21.223 --> 00:04:23.249
You don't want lists, right?
00:04:23.249 --> 00:04:36.293
So the $27 billion business that Google has, owning the majority of the search market, is currently being disrupted.
00:04:36.293 --> 00:04:51.690
It is currently being challenged by companies like Perplexity (they're not the only one) that are saying, hey, we need to challenge what search is and what it does, how it functions, right?
00:04:51.690 --> 00:04:57.211
So it's important, I think, for folks to get out and play around with these things right now.
00:04:57.211 --> 00:05:02.473
You're only going to find what the value is if you actually do it.
00:05:02.473 --> 00:05:04.841
You're not going to find it through somebody giving you best practices.
00:05:04.841 --> 00:05:07.067
I think that's the short answer.
00:05:07.728 --> 00:05:18.163
Well, and I think that's a great answer, and, as you know, our faculty at Texas State are really engaged with AI and we're connecting it with the content.
00:05:18.163 --> 00:05:20.427
And you mentioned your start.
00:05:20.427 --> 00:05:22.892
One of your early starts was with Edelman.
00:05:22.892 --> 00:05:33.112
How do you think students can benefit from this as they're getting out into the workforce, and how should they really navigate this?
00:05:34.639 --> 00:05:56.413
Well, you know, I think there's a lot of discourse out there right now about AI in the classroom, and let's just consider, when the student leaves the teaching theater and goes back to their apartment or a coffee shop or anywhere, they have the ability now to ask ChatGPT for an answer.
00:05:56.413 --> 00:06:01.105
They have the ability to upload some questions and say answer these questions.
00:06:01.105 --> 00:06:14.290
For me, I think the important thing here is not to just replace what is your responsibility or, better yet, your opportunity as a student to learn.
00:06:14.290 --> 00:06:24.305
It is to understand that AI like this will be available in the workplace ultimately, and if anything, it's here to help you get a head start.
00:06:24.305 --> 00:06:25.742
You know it's here to write a draft.
00:06:26.302 --> 00:06:30.581
It's here to give you a complementary point of view that maybe you don't have.
00:06:31.324 --> 00:06:47.882
I think if you treat it that way, with some respect, students will find that it shouldn't replace, and I'm being very careful not to use the word cheat.
00:06:47.983 --> 00:06:54.963
It is just to say let's not replace what it means to study, what it means to read, what it means to recall with the notes that you took in class.
00:06:54.963 --> 00:07:05.894
You know, there are technologies that are allowing us to do that by transcribing what is said in the classroom or what is said on a Zoom call, right?
00:07:05.894 --> 00:07:10.911
But this can help us as humans, just to have better recall.
00:07:10.911 --> 00:07:15.992
It can help us with, you know, more thorough note-taking.
00:07:15.992 --> 00:07:22.673
You know we can only move our hands so much when we're writing or typing, as the case may be.
00:07:27.459 --> 00:07:35.593
And I think there's just incredible opportunities for AI to improve the engagement that students are having with their coursework, right?
00:07:35.593 --> 00:07:42.326
I just think that's what's fascinating to me right now, even in my daily grind.
00:07:42.326 --> 00:08:07.281
You know, it's not university, but it's not different in terms of how I talk to clients and strategic partners all day, and I learn new things from them all the time. I'm starting to use artificial intelligence tools just to help me understand and translate what it is I'm engaged with in the discourse I have with those folks.
00:08:07.281 --> 00:08:20.713
So I think, fundamentally, it's the same opportunity that students have now with their coursework, with their group projects, with those one-on-one conversations they may have with faculty.
00:08:20.713 --> 00:08:35.695
I just think there's going to be a number of ways that AI just comes in and really complements or even, as far as I could see, optimizes what needs to happen in those situations.
00:08:36.801 --> 00:08:41.191
I love that term optimize because that, to me, is part of what it's doing.
00:08:41.191 --> 00:08:43.201
It's bringing it to the next level.
00:08:43.201 --> 00:08:46.543
Saving time, but bringing things to the next level.
00:08:47.865 --> 00:08:48.264
That's right.
00:08:48.264 --> 00:08:55.908
I mean, just think about the experience of going away to a four-year college.
00:08:55.908 --> 00:09:07.274
Right? I've always bragged on San Marcos.
00:09:07.274 --> 00:09:23.302
It is a very unique and special campus experience and as you start to dial in your study, there is a unique assortment of both the coursework and the professors and your fellow students that you have.
00:09:23.302 --> 00:09:26.484
That's just extremely unique to what happens there.
00:09:26.484 --> 00:09:35.427
Not to say it's better; it's just that it's different than anywhere else.
00:09:35.447 --> 00:09:45.436
And I think, you know, this is where we have to be careful, because when you think about the optimization, it is: how do we double down on that which is unique and special and valuable?
00:09:45.436 --> 00:09:48.024
Why are we investing our time?
00:09:48.024 --> 00:09:54.147
Why are we paying tuition to be in the moment, in the location, in the classroom, where we are right now?
00:09:54.147 --> 00:10:08.355
Right, AI can help us optimize, or even, I would say, amplify what that investment is, and do so in a non-commoditized way.
00:10:08.355 --> 00:10:22.452
I mean, I think that's the part where we have to be careful: when you think about people that are out there today on LinkedIn or YouTube, and they're saying, you know, be an AI expert.
00:10:22.933 --> 00:10:42.386
Let me show you how to prompt engineer, let me show you how to do these things. And I'm always quick to say: what's fascinating about this technology is that what works for some people will not work for everybody, and that's how malleable AI is.
00:10:42.386 --> 00:10:55.506
It's extremely malleable, letting you do things and build functionality that I think is very unique to you as an individual, and to whatever your cause or your opportunity is.
00:10:56.308 --> 00:11:00.947
Well, and that goes to the strategy, right? Without the strategy, you just have the tool.
00:11:00.947 --> 00:11:02.789
You've got to have strategy and the tool.
00:11:04.113 --> 00:11:04.614
I think so.
00:11:04.614 --> 00:11:24.980
I mean, we talk a lot at Brain+Trust about the notion of the human in the loop, which is to say: if we're looking at a specific business process that we'd like to automate, let's quantify that.
00:11:24.980 --> 00:11:39.357
Let's quantify that it takes five clicks, five steps to do that today, and maybe we're going to get it down to two and allow AI to do three of those steps for us, but let's spread the two to be the first and the last.
00:11:39.357 --> 00:11:55.168
So the human is setting things up for the direction or the success in terms of what they're trying to achieve and, lastly, the human is the one who is going to confirm that AI completed the tasks the way that was needed in that situation, right?
00:11:55.168 --> 00:12:12.236
So I think, when you say that, just borrowing what you just said about strategy, it has to be where we're being extremely thoughtful and intentional about the key performance indicator,
00:12:12.236 --> 00:12:22.277
you know, the KPI, or the goal, the objective: what is it we are trying to achieve?
00:12:22.277 --> 00:12:25.008
And that's a strategic conversation, right?
00:12:25.328 --> 00:12:38.688
And this is where good, old-fashioned whiteboards and people in a room, being able to, you know, leverage each other's body language and communicate, these are things that computers are not going to be able to do, right?
00:12:38.688 --> 00:13:35.587
This is not what AI is going to be able to do: to capitalize on that sensory side of how we communicate and how we think about things critically, strategically, and then back into what it is that we can build or subscribe to that helps us do this on a replicable basis, from a continuity standpoint, over and over again, if that's what we're trying to do in terms of automating something or just grabbing an efficiency with artificial intelligence or other automation technology. Well, and that's where, like you said, the human element, and, in my case, the teacher, the educator, the leader, the facilitator, whatever your environment is, that's where that human being comes in, right, to make it all work, right?
00:13:35.607 --> 00:13:41.019
Well, I mean, you think about it in the context of instructors and professors: you have the textbooks.
00:13:41.019 --> 00:13:41.782
Right?
00:13:41.782 --> 00:13:43.426
Let's just go there real quick.
00:13:43.426 --> 00:13:50.450
You have the textbooks and the syllabus, the general syllabus, which are your constants, right?
00:13:51.897 --> 00:13:54.687
These are the things that are not super malleable, right?
00:13:54.807 --> 00:14:29.846
But the best professors since the earth cooled were the ones who brought in personal experiences, brought in guest speakers, or had a side gig doing some consulting work out in the real world and then brought that back into the classroom, being able to talk about things that are happening here and now, which sometimes will challenge what's in the textbook and sometimes will certainly be different and be complementary, if you're intentional about it, to what else is there in the syllabus and the textbook.
00:14:29.846 --> 00:14:49.356
So I think, you know, if you can think about things in those terms, then you can easily start to understand the ways that you could leverage automation or artificial intelligence to help you do things a little bit differently.
00:14:49.356 --> 00:14:52.510
You know and again it goes back to what we were talking about just before.
00:14:52.510 --> 00:14:56.025
That was strategy, and being able to understand what you're trying to achieve.
00:14:56.547 --> 00:14:58.311
Yeah, well, and has there?
00:14:58.311 --> 00:15:00.616
AI has been around for a while.
00:15:00.616 --> 00:15:05.847
Has there been a shift like this before?
00:15:05.847 --> 00:15:07.931
Is there a parallel here?
00:15:09.192 --> 00:15:10.673
I think there's a few of them, right.
00:15:10.673 --> 00:15:39.961
I mean, my favorite one. When people ask that question, right, it usually starts off with something like, you know, what jobs are going to be the first to go? And I say, well, let's back into this, right? We're humans, and it doesn't matter what happens, whether it's disease or war or, you know, some other type of political thing that's happening.
00:15:39.961 --> 00:15:45.258
We persevere. You know, this is the way we're wired.
00:15:45.258 --> 00:15:47.027
We'll figure out a way.
00:15:47.027 --> 00:16:05.373
But the one example that I love to really put out there: in the late 1950s, the standardization of the shipping container was a big thing, and there's a book called The Box.
00:16:05.373 --> 00:16:18.378
The author's name is Marc Levinson, that's M-A-R-C L-E-V-I-N-S-O-N, and he tells the story of the shipping container.
00:16:18.378 --> 00:16:50.818
I had a chance to see him speak and then I read his book. It used to be that when cargo was shipped, it was shipped in bulk, and it came into, obviously, coastal cities, because that's where the ports were. It would take as many as 10,000 humans to unload the cargo from a boat, and it might take them as long as a month to do it. Once the shipping container came along and finally got standardized in the mid-60s, it only took four people to do this.
00:16:50.818 --> 00:17:06.567
You only needed a person operating the crane, a person on the boat putting the hooks on the containers, and two people down below to direct the train or the truck and make sure that the container came down and got on it.
00:17:06.567 --> 00:17:11.894
That right there did a few things.
00:17:11.894 --> 00:17:16.121
It displaced tens of millions of Americans from their jobs.
00:17:16.121 --> 00:17:23.893
Basically, we saw the birth of suburbanization, right?
00:17:23.893 --> 00:17:42.008
We saw the real front edge of what everybody talks about right now as sprawl, in terms of economic development and real estate development, and the shipping container was the one thing that enabled that to happen.
00:17:42.008 --> 00:17:49.768
And then, right alongside it too, we had the maturation of the interstate highways right.
00:17:49.768 --> 00:17:54.949
So as that happened, you can think about what we have today with big box retail.
00:17:54.949 --> 00:17:59.178
You can think about right now what we have with processed foods.
00:17:59.178 --> 00:18:10.251
I mean, you can look at all of this and it really came down to the standardization of the supply chain, which was enabled by the shipping container.
00:18:11.694 --> 00:18:14.038
AI is going to do the same thing.
00:18:14.038 --> 00:18:19.834
It is going to reassign humans to do other jobs.
00:18:19.834 --> 00:18:29.597
It is going to require us to be skilled and re-skilled to do something different in the jobs we already have.
00:18:35.988 --> 00:19:16.867
It is just going to fundamentally change business operations again. And the wonderful thing about this is that, you know, probably five to eight years is when we'll start seeing the bleeding edge of this, where humans are actually able to invest and put the time into solving incredibly important human challenges that are out there. That can be disease, it could be politics, it could be global economic or sociopolitical issues that are happening.
00:19:16.887 --> 00:19:23.579
AI is going to help us use and leverage information from disparate sources as it does that.
00:19:23.579 --> 00:19:28.905
You know, it's not going to be far-fetched to think that we're going to be able to cure some diseases.
00:19:28.905 --> 00:19:41.236
It's not going to be far-fetched that we're going to be able to do some things that otherwise we just couldn't do because we didn't have the connectivity and we certainly didn't have the added intelligence at our side that we will have.
00:19:41.236 --> 00:19:47.068
So that's my really long way of saying: you don't know.
00:19:47.068 --> 00:19:52.438
We have no idea what's going to happen, but some great things are certainly plausible.
00:19:53.118 --> 00:20:05.144
Yeah. Well, Tim, what is it in our nature, and what is it in your nature, if you will, or your approach, that allows you to really have such a growth mindset?
00:20:05.144 --> 00:20:13.365
Let's talk about that, because I think that's part of it is that you hear people say AI is bad, AI is terrible and I don't want anything to do with it.
00:20:13.365 --> 00:20:15.430
What do we need to?
00:20:15.430 --> 00:20:17.193
What approach do we need to have?
00:20:17.193 --> 00:20:22.376
And I'm a real believer; we at the university really love students who come in with a growth mindset.
00:20:22.376 --> 00:20:23.701
It just works.
00:20:23.701 --> 00:20:27.317
But talk about your personal journey in that area, if you would.
00:20:28.599 --> 00:20:33.624
Well, you know, I am going back to when I worked at Edelman.
00:20:33.624 --> 00:20:45.172
You know, that was in the middle of my career as a mobile strategist. I talked about shipping containers, but mobile, too, has changed the way we live.
00:20:45.172 --> 00:21:03.035
You know, another great book to read is by Amy Webb.
00:21:04.442 --> 00:21:04.762
Love her.
00:21:04.762 --> 00:21:06.268
Yes, I know Amy.
00:21:06.808 --> 00:21:19.994
The Signals Are Talking, right. It's another one of those books everybody should read, because it talks about the things that she observed on the streets of Tokyo 20 years ago.
00:21:19.994 --> 00:21:30.933
She could start to really piece them together after she observed that and started to understand how it was connected to some other development or something else that was happening.
00:21:34.118 --> 00:21:55.731
And when I was doing things as a mobile strategist, I got really interested in things like the development of M-Pesa in sub-Saharan Africa as a way for farmers to be able to get their produce to market before it perished or before pirates could get it.
00:21:55.731 --> 00:22:08.588
And get the best dollar per pound for whatever their bushels were, and have all of the commerce take place via text messaging, right? And this was something that was happening 12 years ago.
00:22:08.588 --> 00:22:18.413
So just start to look at the smallest little things that happen around you. You don't have to go to Japan, you don't have to go to Africa.
00:22:18.413 --> 00:22:38.471
But for me it's always been: how do I try to be as empathetic as I can? Which sometimes is as simple as, you know, having a smile on your face, saying please and thank you, and asking people how their day is going.
00:22:38.471 --> 00:22:45.784
Just to be able to get what comes after that, right? Which is somebody asking a question about what you do and how you do it.
00:22:45.784 --> 00:22:51.849
Or, hey, did you see that over there? It's things that other people observe and then point out to you.
00:22:54.499 --> 00:23:43.977
Right, this may sound like nonsense, this may sound like, you know, just the capture of information that you might not ever need. But this is something I tell students all the time when I guest lecture at Texas State and at other universities. I say: go for a long walk and take your AirPods out, right, take your earphones out, and just go for a walk and listen to the birds, listen to the traffic, look around you, and think a little bit about what you're seeing. Just know that the sensory capture of the experience is going to be unique to you, and it's going to enrich how you see something else.
00:23:45.501 --> 00:23:52.994
And backing into this: a growth mindset is also having confidence that you'll always find a way, right?
00:24:20.644 --> 00:24:21.586
It's that, hey, I will always watch the peripheral.
00:24:21.586 --> 00:24:22.367
I need to understand the peripheral.
00:24:22.367 --> 00:24:23.150
I need to keep my eye on that.
00:24:23.150 --> 00:24:25.114
No matter what happens, I need to find a way around it or a way straight to it.
00:24:25.114 --> 00:24:27.540
So to me, that's just about being observant.
00:24:27.540 --> 00:24:36.335
It's just about building empathy with the world around you, which actually just improves everything.
00:24:36.335 --> 00:24:38.367
It just improves your critical thinking.
00:24:38.367 --> 00:24:44.772
It improves your ability to grasp concepts that otherwise were just foreign to you.
00:24:44.772 --> 00:24:48.301
Maybe they're coming from somebody you don't agree with, right, you know.
00:24:48.301 --> 00:24:51.010
It just helps in all those capacities.
00:24:54.182 --> 00:24:56.730
I think that's a great point to bring up.
00:24:56.730 --> 00:25:08.432
You might have differing views from colleagues, but be open to listening to that. And then, to me, I would see that AI could be very empowering.
00:25:08.432 --> 00:25:14.800
It could give you even more superpowers if you have that empathetic growth mindset.
00:25:15.682 --> 00:25:16.345
Well, absolutely.
00:25:16.345 --> 00:25:30.028
I mean, there's a lot of talk in Silicon Valley right now that within our lifetimes there will be unicorn companies, you know, companies that are valued at a billion dollars or more, where they only have one employee, right?
00:25:30.028 --> 00:25:34.795
There are people who believe we'll see that very soon.
00:25:34.795 --> 00:25:39.489
You know, that certainly falls in the phylum of superpowers.
00:25:39.489 --> 00:25:40.791
It absolutely does.
00:25:40.791 --> 00:25:47.811
Sure, you know, but I think that's just it.
00:25:47.991 --> 00:26:28.476
I can tell you, when I have a Zoom call or a Microsoft Teams call with a client and there's no sensitive information that we've talked about, it's just a state of the account, or it's a strategic ideation session, or maybe it's just a new business call, I take the transcripts from a system like Otter or Read AI, then go to Claude, which is a large language model, give Claude the transcripts, and say:
00:26:28.476 --> 00:26:29.319
You know this was a call with XYZ company.
00:26:29.319 --> 00:26:33.423
We are discussing a probable opportunity to help them with their data management.
00:26:33.423 --> 00:26:46.887
Can you summarize this in 200 words for me, give me five next steps, and do a draft email for me back to Greg, we'll just say. And ten nanoseconds later I have a draft of the email.
00:26:48.144 --> 00:26:54.102
The transcript was read, and, you know, if that's not superpowers, right?
00:26:54.102 --> 00:27:04.593
If that's not taking something that would have taken me at least an hour, to read the transcripts, to cut and paste, and then to wordsmith.
00:27:04.593 --> 00:27:07.403
You know, manually. I mean, of course,
00:27:07.403 --> 00:27:13.512
I took the draft from Claude and then I put my own voice into it and I changed quite a bit.
00:27:13.512 --> 00:27:36.272
But just that immediacy of being able to start something, you know, to get that initial direction going, I think we'll see more and more of, you know, an acceleration of the speed of business, an acceleration of decision making, as we start to see more applications of technology that are similar to that.
00:27:37.000 --> 00:27:37.421
I love that.
00:27:37.421 --> 00:27:39.365
Well, and look ahead five years.
00:27:39.365 --> 00:27:42.131
Give us your prediction.
00:27:42.131 --> 00:27:45.702
Five years.
00:27:45.702 --> 00:27:50.272
What is it going to be like for students graduating five years from now?
00:27:50.272 --> 00:27:53.609
Let's take it back to the university, to give you some sort of framework.
00:27:54.420 --> 00:28:00.951
Sure, sure. Well, I mean, I think there are things for students to think about, right?
00:28:00.951 --> 00:28:11.847
Five years from now, we won't hear anyone say "just Google that," right? Search is going to fundamentally change.
00:28:11.847 --> 00:28:14.365
Search as a human need or a human behavior will always be there.
00:28:16.871 --> 00:28:17.311
Right?
00:28:17.311 --> 00:28:21.876
I mean the internet is vast and it has answers.
00:28:21.876 --> 00:28:33.980
Now we'll have the means to get those answers and not just have sponsored lists of companies that we have to click on and then click through to the page we were looking for.
00:28:35.284 --> 00:28:42.246
So I think you'll think of things, if you just grasp that, to say: what does this mean?
00:28:42.246 --> 00:28:55.315
Right, in terms of how we go about marketing, how we go about communications, how we go about research, whatever the case may be. I think there's opportunities to consider that.
00:28:55.315 --> 00:29:05.567
I think there's also this opportunity to understand that AI is not just generative AI in the way we've been discussing it.
00:29:05.567 --> 00:29:21.316
I think we'll start to see what's called invisible experiences, where companies will have so much data on us, and we've opted in and consented to them managing our data, that they'll start to do things for us.
00:29:21.316 --> 00:29:25.171
They'll start to send us things without us asking for or ordering them.
00:29:25.380 --> 00:29:33.980
Right. And as I'm forecasting that, what I would tell students is: how would you show up different?
00:29:34.299 --> 00:29:49.817
You know, how would you uniquely approach the opportunity to be able to have a better finger on the pulse of what's going on in the world, what's going on with customer behavior, and be able to show up different?
00:29:51.060 --> 00:30:14.795
And that could be a supply chain type conversation, it could be a marketing or communications type conversation, and it could be something else, but I think the big thing that's going to happen in five years is just that: start thinking about what it is you can do uniquely as a human that you're not going to be able to do with AI.
00:30:14.795 --> 00:30:30.268
I think that's a list that's always going to be challenged, because there's two people in a coffee shop out there who are trying to automate things and trying to put together the slides to walk into a VC's office and raise a couple million dollars to build another bot.
00:30:30.268 --> 00:30:49.722
But you know, there will be robots, there will be self-driving cars, there will be doctors on call that have robot nurses, you know, and there are all these things that are going to happen, and a lot of it will happen, probably as soon as five years.
00:30:49.722 --> 00:30:59.200
So I want to stop short of being really prescriptive, because it's hard for anybody to say what the world's going to look like at that point.
00:30:59.580 --> 00:31:02.932
Sure, yeah. Well, how fun is that, Tim?
00:31:02.932 --> 00:31:07.846
Well, and I always like to ask my guests about their personal strengths.
00:31:07.846 --> 00:31:13.924
I'm a Gallup strengths coach, and you might not have done the Gallup CliftonStrengths assessment.
00:31:13.924 --> 00:31:16.640
But what are your personal strengths, do you think?
00:31:16.640 --> 00:31:17.883
What do you think of,
00:31:17.883 --> 00:31:22.012
or what do other people say about you?
00:31:22.012 --> 00:31:26.471
"Oh, Tim is really strong in this area."
00:31:28.580 --> 00:31:29.402
You know...
00:31:29.402 --> 00:31:35.933
I think it really comes down to some of the things we already discussed.
00:31:35.933 --> 00:31:40.670
You know I'm deep on relationships.
00:31:40.670 --> 00:31:56.035
Right, I'm short on the transactional and deeper on the relationship side of business, of leading a company, of supporting my alma mater.
00:31:56.035 --> 00:32:11.165
You know, of spending time with my family, right. I think people would say, you know, that Tim is more present than many folks.
00:32:11.165 --> 00:32:15.217
I'm not as present as I want to be.
00:32:15.217 --> 00:32:17.461
I tend to stare at my phone too much, right?
00:32:17.903 --> 00:32:57.066
Don't we all, you know? But I think people would talk about empathy, and would probably even say that there's some creative value that I bring, which I liken to being more strategic than just straight creative: to truly think beyond the peripheral, right, what people used to say about thinking outside the box, just thinking more holistically about business opportunities or challenges.
00:32:57.066 --> 00:33:18.455
I think that's one of the ways that I do excel professionally, and I think it's one of the reasons that clients retain us: for our ability to think holistically and maybe look a little further down the road, you know, in terms of what it is we're doing today and how it can be leveraged, amortized or applied, not just something that is temporary or immediate.
00:33:19.300 --> 00:33:22.049
I love that, and I love the focus on empathy as well.
00:33:22.049 --> 00:33:26.309
Tim, thank you so much for joining me today, so fun.
00:33:26.329 --> 00:33:26.770
Thank you, Judy.
00:33:27.019 --> 00:33:28.065
We'll have you back, for sure.
00:33:29.080 --> 00:33:29.423
Awesome.
00:33:29.423 --> 00:33:30.205
Thank you so much.
00:33:31.500 --> 00:33:34.790
Well, and thank you for listening to Stories of Change and Creativity.
00:33:34.790 --> 00:33:46.451
We'll put some information about Tim Hayden in the show notes with some of the references he made during our talk and, if you've got a story to share or know someone who does, reach out to me at judyoskam.com.
00:33:46.451 --> 00:33:48.221
Thanks for listening.