Oct. 6, 2025

Kevin Johnston: The Age of AI and Human Collaboration | EP 40

Kevin Johnston, CEO of CloudFactory, discusses the groundbreaking work of human-in-the-loop AI technology. Kevin shares insights into CloudFactory's role in enhancing AI accuracy and scalability, details his journey from a farm in Iowa to leading a technology company, and emphasizes the importance of intentionality and adaptability in achieving success. Discover how CloudFactory is transforming industries with AI solutions, the challenges faced in pivoting to a tech-focused platform, and Kevin's perspective on leadership, culture, and personal growth.

1
00:00:08,560 --> 00:00:10,600
Thank you all for tuning into
the Carolina Business Leaders

2
00:00:10,600 --> 00:00:13,360
podcast, where we highlight
local business leaders and

3
00:00:13,360 --> 00:00:16,239
impact makers who are deeply
rooted in the Carolinas.

4
00:00:16,480 --> 00:00:20,720
Today's guest is Kevin Johnston,
CEO of CloudFactory, a global

5
00:00:20,720 --> 00:00:23,600
leader in human in the loop
technology solutions that

6
00:00:23,600 --> 00:00:27,120
combine talented cloud workers
with AI to help companies scale

7
00:00:27,120 --> 00:00:29,600
with precision and purpose.
So, Kevin, thanks so much for

8
00:00:29,600 --> 00:00:31,160
joining us.
Thanks for being here.

9
00:00:31,160 --> 00:00:32,240
Very good.
It's great to be here.

10
00:00:32,360 --> 00:00:34,880
Yeah.
Took a little while, but we

11
00:00:35,000 --> 00:00:37,720
Finally got you in, so a lot to
look forward to here.

12
00:00:37,840 --> 00:00:40,720
It just took his daughter moving
to Greenville to get him here.

13
00:00:41,360 --> 00:00:44,000
My best friend motivation.
That's right.

14
00:00:44,040 --> 00:00:46,360
Well, just for the formalities,
can you give the audience a

15
00:00:46,360 --> 00:00:49,280
little introduction as to what
CloudFactory is and some of the

16
00:00:49,280 --> 00:00:50,560
problems y'all are helping
solve?

17
00:00:50,720 --> 00:00:53,080
Yeah.
So CloudFactory — I think of

18
00:00:53,080 --> 00:00:55,360
CloudFactory as an AI
enablement company.

19
00:00:55,400 --> 00:01:00,600
So in order to make AI work, you
need to feed AI data.

20
00:01:01,080 --> 00:01:04,599
The data needs to be quality and
usable and—

21
00:01:04,599 --> 00:01:06,720
That's the hard part.
That's right, That's right.

22
00:01:06,720 --> 00:01:09,360
Curated for a specific use or
purpose.

23
00:01:09,800 --> 00:01:13,400
And then also AI is a
prediction.

24
00:01:13,720 --> 00:01:16,880
So AI makes an inference.
An inference is basically

25
00:01:16,880 --> 00:01:19,960
a prediction.
And so predictions aren't always

26
00:01:19,960 --> 00:01:22,080
right.
And so we help both in the

27
00:01:22,080 --> 00:01:26,680
accuracy of creating accurate
models by collecting, curating,

28
00:01:26,920 --> 00:01:31,520
and annotating data, but we also
provide oversight to the inference,

29
00:01:31,520 --> 00:01:35,680
or to the predictions.
So if the model maybe is not

30
00:01:35,680 --> 00:01:40,760
exactly sure it's correct, then
we can use a combination of AI

31
00:01:40,760 --> 00:01:46,880
technology to evaluate the AI
plus humans to validate or take

32
00:01:46,880 --> 00:01:48,440
a next action, whatever that
might be.

33
00:01:48,440 --> 00:01:50,920
And so when I say AI enablement,
that's what I mean.
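(A minimal illustrative sketch of the human-in-the-loop idea described above: accept an AI inference automatically when its confidence clears a threshold, and hand it to a person otherwise. The Prediction class, route function, and 0.95 threshold are assumptions for illustration, not CloudFactory's actual API.)

```python
# Illustrative sketch only: confidence-threshold routing of AI inferences.
from dataclasses import dataclass


@dataclass
class Prediction:
    label: str          # the model's inference, e.g. "approve" or "deny"
    confidence: float   # model-reported probability for that label


def route(pred: Prediction, threshold: float = 0.95) -> tuple[str, str]:
    """Return ("auto", label) for confident inferences, ("human_review", label) otherwise."""
    if pred.confidence >= threshold:
        return ("auto", pred.label)          # machine handles it end to end
    return ("human_review", pred.label)      # a person validates or corrects it first


# A borderline decision gets escalated instead of acted on blindly.
print(route(Prediction(label="approve", confidence=0.82)))  # ('human_review', 'approve')
```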

34
00:01:51,320 --> 00:01:54,320
And so we do that.
We also help enterprises

35
00:01:56,760 --> 00:02:01,440
imagine, you know what problems
could AI solve and then build

36
00:02:02,040 --> 00:02:05,160
solutions.
You know what data with which

37
00:02:05,160 --> 00:02:08,199
models or agents could produce
some outcome.

38
00:02:08,199 --> 00:02:11,680
That creates a value, a
disruptive value.

39
00:02:11,680 --> 00:02:14,720
So that's your AI.
Are you in like a particular

40
00:02:14,720 --> 00:02:16,560
industry or?
Yeah.

41
00:02:16,560 --> 00:02:19,600
So we do.
So the industries that we focus

42
00:02:19,600 --> 00:02:23,240
on are largely industries where
there are large amounts of

43
00:02:23,320 --> 00:02:25,800
unstructured data, data that
needs

44
00:02:25,800 --> 00:02:30,160
work to make it usable.
And then also industries where

45
00:02:30,160 --> 00:02:35,520
there are use cases where the
inference error cost is high.

46
00:02:35,840 --> 00:02:43,840
So for example, if you were to
use AI to do a medical pre-

47
00:02:43,840 --> 00:02:47,800
authorization — as the
insurance company, should you

48
00:02:47,800 --> 00:02:51,760
approve the authorization for
the procedure?

49
00:02:51,760 --> 00:02:58,000
We were just talking about this.
So if AI is wrong, the cost

50
00:02:58,000 --> 00:02:59,880
is significant, could be very
high.

51
00:03:00,240 --> 00:03:03,320
And so when we think of error cost
from AI, we think of it in

52
00:03:03,320 --> 00:03:07,680
terms of human cost, financial
cost or legal or regulatory

53
00:03:07,680 --> 00:03:12,160
cost, compliance related cost.
And so we're we're focused

54
00:03:12,160 --> 00:03:15,680
mostly on industries where that
is the case where those use

55
00:03:15,680 --> 00:03:18,120
cases exist.
So that can be financial

56
00:03:18,120 --> 00:03:24,600
services related use cases or
businesses or healthcare and the

57
00:03:24,600 --> 00:03:28,520
provider, so hospitals as well
as insurance companies or

58
00:03:28,520 --> 00:03:31,840
payers.
And then and then there, there

59
00:03:31,840 --> 00:03:37,120
are other, I'll call them
physical AI use cases.

60
00:03:37,280 --> 00:03:41,160
So autonomous vehicles or drones
or robots.

61
00:03:41,400 --> 00:03:44,200
So these are all types of
companies and businesses that we

62
00:03:44,200 --> 00:03:48,640
do work for and and help them
make their AI work and scale

63
00:03:49,360 --> 00:03:53,200
confidently.
The AI has become huge recently.

64
00:03:53,640 --> 00:03:57,160
Over the last few years.
I was not even aware of it.

65
00:03:57,640 --> 00:04:01,240
Call it two years ago that Heath
being a cyber engineer made me

66
00:04:01,240 --> 00:04:04,560
aware of ChatGPT and I was still
late to the game.

67
00:04:04,560 --> 00:04:08,040
But with that being said, what
drew you to CloudFactory?

68
00:04:08,040 --> 00:04:11,360
Well, before AI was even a thing
that we were using, yeah.

69
00:04:11,680 --> 00:04:15,400
That's a great question.
So I wasn't necessarily

70
00:04:15,400 --> 00:04:20,839
looking for this specific role.
So some years prior, I had been

71
00:04:20,839 --> 00:04:26,400
a COO at a
software engineering business,

72
00:04:26,400 --> 00:04:32,560
and then almost 20 years
with Electronic Data Systems and

73
00:04:32,560 --> 00:04:34,840
Hewlett-Packard, doing a
whole bunch of different roles.

74
00:04:34,840 --> 00:04:38,880
And so I've been in large
technology, IT, or software

75
00:04:38,880 --> 00:04:40,680
engineering.
Did you start as an engineer?

76
00:04:40,840 --> 00:04:44,640
No.
In fact, I I started as a

77
00:04:44,640 --> 00:04:48,080
salesperson selling food
ingredients.

78
00:04:49,320 --> 00:04:50,800
So we're going to dive into
that, yeah.

79
00:04:51,320 --> 00:04:53,280
So this is a journey we will
dive into.

80
00:04:53,520 --> 00:04:56,120
We're in AI now.
Meandering path.

81
00:04:56,360 --> 00:04:59,360
I love it, I love it.
Which is —

82
00:04:59,360 --> 00:05:01,040
there's a lesson in that also.
Yeah.

83
00:05:01,920 --> 00:05:08,440
But anyway, I got called by
a recruiting firm, and I was

84
00:05:08,440 --> 00:05:12,200
a bit drawn to CloudFactory,
both in the topic of AI.

85
00:05:12,200 --> 00:05:18,800
And so I joined in October of
'23, and you know, ChatGPT and

86
00:05:18,800 --> 00:05:22,840
these transformer models were
all the rage and it seemed like,

87
00:05:22,840 --> 00:05:25,160
you know, everything is
possible — which, I would say,

88
00:05:26,120 --> 00:05:30,400
maybe.
But also, CloudFactory

89
00:05:30,560 --> 00:05:35,240
had a really interesting
founding story that also

90
00:05:35,240 --> 00:05:39,160
attracted me.
And so CloudFactory's founder

91
00:05:39,160 --> 00:05:43,400
and his wife had started the
business kind of centered

92
00:05:43,400 --> 00:05:47,280
around the purpose of connecting
one million talented people to

93
00:05:47,280 --> 00:05:51,320
meaningful work,
where together they can earn, learn,

94
00:05:51,320 --> 00:05:54,720
and serve their way to becoming
leaders worth following.

95
00:05:54,720 --> 00:05:57,560
A million people is quite a—
So that was, that was

96
00:05:57,560 --> 00:05:59,800
quite a vision.
So that was the aspiration.

97
00:05:59,920 --> 00:06:01,680
Love it.
It was a vision for the company.

98
00:06:02,560 --> 00:06:04,560
So we don't have a million
people today.

99
00:06:05,120 --> 00:06:09,280
You're getting there.
The aspiration — and

100
00:06:09,280 --> 00:06:11,160
I would say things have
evolved a lot.

101
00:06:11,840 --> 00:06:16,160
The company was founded 15 years
ago, and the

102
00:06:16,160 --> 00:06:20,200
business model has evolved a
little bit from where it began

103
00:06:20,200 --> 00:06:24,680
to where it is today.
But the founding purpose

104
00:06:24,680 --> 00:06:31,040
of the company is about
using technology — but

105
00:06:31,040 --> 00:06:39,080
connecting technology to humans
in a way that, you know, is

106
00:06:39,080 --> 00:06:41,120
about serving something greater
than ourselves.

107
00:06:41,600 --> 00:06:44,880
And so that was a
compelling mission.

108
00:06:44,880 --> 00:06:51,720
And Mark and Laurel Sears — Mark,
the founder of the company —

109
00:06:52,080 --> 00:06:55,320
just recently celebrated 15
years of CloudFactory.

110
00:06:55,320 --> 00:06:59,000
And that's great.
And so being connected

111
00:06:59,000 --> 00:07:04,560
to something like that, where
really cool technology —

112
00:07:04,960 --> 00:07:09,200
almost limitless, the possibilities
of what AI could be, we could imagine it,

113
00:07:09,600 --> 00:07:11,240
right?
There's no telling.

114
00:07:11,440 --> 00:07:15,640
And connecting, you know, humans
into that story in some

115
00:07:15,640 --> 00:07:17,560
constructive, positive way.
Yeah.

116
00:07:17,600 --> 00:07:22,680
It was, it was
motivating in kind of a

117
00:07:22,840 --> 00:07:24,960
magnetic way.
Absolutely, yeah.

118
00:07:25,080 --> 00:07:28,840
So it seems like nowadays like
every startup is focused on AI.

119
00:07:28,840 --> 00:07:31,520
Like you said, AI is like the
next up-and-coming thing.

120
00:07:31,520 --> 00:07:33,160
Everyone wants to get their
hands on it.

121
00:07:33,600 --> 00:07:37,320
I have — well, we'll come back.
I want to dive into your journey

122
00:07:37,320 --> 00:07:40,160
with CloudFactory and your
personal journey as well.

123
00:07:40,440 --> 00:07:42,800
But I've had some startup ideas
revolving around AI.

124
00:07:43,120 --> 00:07:47,080
But you talk specifically about
when there's significant damage

125
00:07:47,080 --> 00:07:49,200
from a wrong inference, like
those are some of the problems

126
00:07:49,200 --> 00:07:52,320
you're trying to solve.
So one thing I would love is

127
00:07:52,320 --> 00:07:57,280
that there is a like an AI, what
would you call an AI assistant

128
00:07:57,280 --> 00:08:00,320
that would take care of calling
AT&T and waiting on hold

129
00:08:00,320 --> 00:08:02,880
forever, calling your health
insurance, calling your doctor

130
00:08:02,880 --> 00:08:05,520
for the pre authorization.
But if you're wrong when it

131
00:08:05,520 --> 00:08:09,240
comes to health insurance pre
authorizations, it's such a huge

132
00:08:09,320 --> 00:08:11,800
like you can't sell that
essentially.

133
00:08:11,800 --> 00:08:14,560
There's so many different
regulations and whatnot.

134
00:08:14,760 --> 00:08:19,560
So I guess how are you going
after or how are you solving the

135
00:08:19,560 --> 00:08:22,560
problem of when you have a wrong
inference, when you have a

136
00:08:22,560 --> 00:08:24,840
significant consequence with a
wrong inference?

137
00:08:24,840 --> 00:08:27,200
Yeah.
Well, let me let me maybe set a

138
00:08:27,200 --> 00:08:30,720
context just on how we think
about the

139
00:08:30,720 --> 00:08:33,000
possibilities of AI.
Let

140
00:08:33,760 --> 00:08:37,440
me just put it in two big
categories of use for AI.

141
00:08:37,679 --> 00:08:43,039
So one is, hey, you as a person
or me as a person, I want to use

142
00:08:43,120 --> 00:08:46,760
AI to make me more productive.
And so just as you're describing

143
00:08:46,760 --> 00:08:50,040
maybe the idea of a personal
assistant and how could I use

144
00:08:50,040 --> 00:08:52,000
AI?
To make me more productive.

145
00:08:52,280 --> 00:08:59,080
Which for sure, even today and
I'm certain in the future, there

146
00:08:59,080 --> 00:09:03,520
will be more and more ways which
AI can help, you know, increase

147
00:09:03,520 --> 00:09:05,960
the productivity of us
individually.

148
00:09:05,960 --> 00:09:09,480
Oh, 100%.
Maybe possibly

149
00:09:09,480 --> 00:09:12,640
experientially as well.
Just because we're more

150
00:09:12,640 --> 00:09:16,000
productive doesn't mean we're
happier, but but maybe we could

151
00:09:16,000 --> 00:09:17,920
be also.
I don't know, we'll see.

152
00:09:19,160 --> 00:09:25,680
The other way to think about how
AI can create value is instead

153
00:09:25,680 --> 00:09:31,240
of thinking about how AI can
make humans better, think about

154
00:09:31,240 --> 00:09:36,800
how AI as a machine, how humans
can make AI better.

155
00:09:37,360 --> 00:09:41,120
And so if you think about most
processes, many processes are

156
00:09:41,200 --> 00:09:43,200
like in the financial services
business.

157
00:09:43,480 --> 00:09:44,880
Lots of paper moving
around.

158
00:09:45,680 --> 00:09:49,960
And lots of things that can be,
you know, a lot of work that

159
00:09:49,960 --> 00:09:54,880
people have to do to move paper
from this pile to that pile.

160
00:09:56,480 --> 00:10:00,680
And so those processes generally
are designed with the human at

161
00:10:00,680 --> 00:10:03,560
the center of the process.
So thinking about, you know,

162
00:10:03,560 --> 00:10:06,480
human centered design or, or
human centered processes.

163
00:10:06,960 --> 00:10:09,320
And so another way.
And I think that that the

164
00:10:09,320 --> 00:10:14,720
possibility of AI, particularly
the use of agents — so AIs that

165
00:10:14,720 --> 00:10:19,160
can reason and take action,
creates the possibility to design

166
00:10:19,240 --> 00:10:21,440
processes that are machine
based.

167
00:10:21,440 --> 00:10:25,960
And where the machine falls
short of making that right

168
00:10:25,960 --> 00:10:31,000
decision or being certain enough
that the inference or

169
00:10:31,000 --> 00:10:34,960
the prediction or the output is
correct or accurate — could be

170
00:10:34,960 --> 00:10:37,440
accurate from a security
standpoint or for a compliance

171
00:10:37,440 --> 00:10:40,480
standpoint, or just from: is
it right or is it wrong?

172
00:10:41,040 --> 00:10:44,880
And so using a human in the
instance where the model's not

173
00:10:44,880 --> 00:10:45,600
quite
sure.

174
00:10:45,600 --> 00:10:48,280
Maybe the model gets you 90% of
the way there and you just have

175
00:10:48,280 --> 00:10:51,320
that human right there, just
when previously the human was

176
00:10:51,320 --> 00:10:54,400
doing 100% of the work.
Going down to 10% is a huge, big

177
00:10:54,400 --> 00:10:56,080
deal.
Very.

178
00:10:56,120 --> 00:10:59,840
Big deal, very big deal.
And so we think that, you know,

179
00:10:59,920 --> 00:11:04,480
the possibility that that
processes where, hey, maybe I

180
00:11:04,480 --> 00:11:08,440
wouldn't trust the machine all
by itself, but I think I could

181
00:11:08,440 --> 00:11:11,840
trust the machine for most of
the cases — I'd say

182
00:11:12,160 --> 00:11:15,320
most of the occasions.
And.

183
00:11:15,320 --> 00:11:20,120
But if I can be smart enough
about identifying when

184
00:11:20,120 --> 00:11:23,760
the output is not sufficiently
certain it's correct.

185
00:11:24,200 --> 00:11:27,680
Now I can insert a human into
that step of the process and.

186
00:11:27,840 --> 00:11:29,680
That's fantastic.
Evaluate.

187
00:11:29,840 --> 00:11:34,120
Hey, is it possibly not correct?
If it is possibly not correct or

188
00:11:34,120 --> 00:11:37,480
maybe it's not confidently
correct, could be either.

189
00:11:37,640 --> 00:11:41,640
Either way, then I can use a
human to validate and then next

190
00:11:41,640 --> 00:11:45,200
action, you know, could be
automated or not, depending on

191
00:11:45,200 --> 00:11:49,600
the use case.
And so that technical evaluation

192
00:11:49,800 --> 00:11:53,960
plus human validation is kind of
the key, we think, for

193
00:11:54,360 --> 00:11:56,800
I love the—
making the confidence at the

194
00:11:56,800 --> 00:12:00,960
level required.
I heard this week that Meta is

195
00:12:00,960 --> 00:12:05,040
putting together these crazy
contracts to pay people to

196
00:12:05,080 --> 00:12:09,600
figure out this AI game, which
is way above my head.

197
00:12:09,600 --> 00:12:13,720
Even some of the terminology
y'all are using is unique to just

198
00:12:13,720 --> 00:12:15,480
being in that industry.
So it's cool to see how it

199
00:12:15,480 --> 00:12:17,720
evolves.
But as we mentioned earlier, we

200
00:12:17,720 --> 00:12:20,280
would love to hear about your
journey, your professional

201
00:12:20,320 --> 00:12:23,320
career journey.
So take us back to the start.

202
00:12:23,320 --> 00:12:27,200
You mentioned being in sales, so
let's hear where you got started

203
00:12:27,200 --> 00:12:30,320
and how you feel that that has
helped you get to leadership

204
00:12:30,320 --> 00:12:31,800
position right now.
Yeah.

205
00:12:31,880 --> 00:12:35,720
Well, first I'll say no one is
trying to recruit me for 200

206
00:12:35,720 --> 00:12:39,640
million.
I heard there's one contract

207
00:12:39,640 --> 00:12:42,800
going for $1 billion for one
person.

208
00:12:42,880 --> 00:12:46,120
So it's out of control.
Yes, yes, he was on track.

209
00:12:46,120 --> 00:12:48,480
There's a few out there that are
in the hundreds of millions,

210
00:12:48,480 --> 00:12:52,560
but I heard today 1 billion.
So, well, anyways, we'll put in

211
00:12:52,560 --> 00:12:58,600
a good word — if you
get a call, you deserve

212
00:12:58,600 --> 00:13:01,040
a lot.
I hope everyone makes a lot

213
00:13:01,040 --> 00:13:06,920
of money. But, you
know, as I was saying before, my

214
00:13:06,920 --> 00:13:10,760
path was a little bit — you know,
I'll say it meandered.

215
00:13:11,080 --> 00:13:18,120
So I grew up on a farm in Iowa.
I went to school at Iowa State

216
00:13:18,120 --> 00:13:20,840
University.
I studied agriculture business.

217
00:13:22,240 --> 00:13:24,760
I, I minored in rhetorical
communication.

218
00:13:25,760 --> 00:13:33,040
So, in short, you know,
not much, right?

219
00:13:33,080 --> 00:13:39,040
And so for me, my career path
has been more about trying to be

220
00:13:39,040 --> 00:13:42,880
self-aware and kind of
moving into action.

221
00:13:43,440 --> 00:13:48,280
And so as you know, I mean, I'm
sure God has had a hand in

222
00:13:48,280 --> 00:13:50,000
many of the steps that I've
taken along the way.

223
00:13:50,000 --> 00:13:54,360
I know that, but, but I think
also, you know, we're created

224
00:13:54,360 --> 00:13:57,880
and given gifts and talents and
then it's up to us to use those,

225
00:13:57,880 --> 00:14:00,280
you know, as we see fit and you
know, for good.

226
00:14:00,720 --> 00:14:05,920
And so I would just say that I'm
a curious person and I, you

227
00:14:05,920 --> 00:14:11,880
know, like to do new things.
And, and so, you know, I, I

228
00:14:11,880 --> 00:14:20,880
think the curiosity, with
maybe enough humility, gives

229
00:14:20,880 --> 00:14:23,680
me the possibility to, you
know, be motivated and pursue

230
00:14:23,680 --> 00:14:26,480
and do things that otherwise I
wouldn't have thought that would

231
00:14:26,480 --> 00:14:29,240
have been, you know, that you
just wouldn't have scripted.

232
00:14:29,640 --> 00:14:31,240
Would you say you're addicted to
learning?

233
00:14:31,680 --> 00:14:33,440
I do like to learn.
So I am.

234
00:14:33,560 --> 00:14:39,520
I mean, I do like to learn.
I'm not a big novel reader or

235
00:14:39,880 --> 00:14:43,720
anything like that, but I do, you know —
every Saturday morning, the

236
00:14:44,160 --> 00:14:46,520
people I work with, they'll joke
because they know I'll be up

237
00:14:46,520 --> 00:14:48,640
reading some white paper about
some topic.

238
00:14:50,000 --> 00:14:53,720
And I do enjoy that.
So that part of

239
00:14:53,720 --> 00:14:56,760
discovery is really important.
But I think

240
00:14:56,760 --> 00:15:00,840
the main thing, I would say, for me
has been a

241
00:15:00,840 --> 00:15:03,240
willingness to jump into things
that are not certain.

242
00:15:05,840 --> 00:15:08,480
But that takes courage.
Like AI?

243
00:15:09,600 --> 00:15:16,160
So, when I was a
senior in university, in

244
00:15:16,160 --> 00:15:21,600
college, I had my roommate
and high school friend.

245
00:15:22,960 --> 00:15:27,000
We roomed together in college.
And anyway, he had

246
00:15:27,360 --> 00:15:33,320
been the president of
the FFA, the Future Farmers of America.

247
00:15:33,760 --> 00:15:36,600
You can tell we're really from the suburbs.
That's an Iowa State thing right

248
00:15:36,720 --> 00:15:39,440
there.
And anyway, he was really good

249
00:15:39,480 --> 00:15:42,320
friends with the Dean of the Ag
college and I was kind of

250
00:15:42,360 --> 00:15:44,160
acquainted because he was
acquainted.

251
00:15:44,640 --> 00:15:50,680
Anyway, this was in 1989, and
the university decided to put

252
00:15:50,680 --> 00:15:53,800
together an exchange program in
the Soviet Union.

253
00:15:54,760 --> 00:15:58,280
And like, you know, hey, that
sounds.

254
00:15:58,280 --> 00:16:00,960
Interesting, this is like 10
years after the Cold War.

255
00:16:01,360 --> 00:16:04,680
Well, not exactly.
Right.

256
00:16:04,680 --> 00:16:09,920
So the Berlin Wall fell
in October of 89.

257
00:16:10,240 --> 00:16:14,040
Oh wow.
And so this was in the summer

258
00:16:14,280 --> 00:16:18,320
of '89, that year, that
we were there, that we

259
00:16:18,320 --> 00:16:19,120
did this trip.
Wow.

260
00:16:19,440 --> 00:16:23,920
So just being willing to like do
things.

261
00:16:23,920 --> 00:16:29,000
And I would say for me it was
more of ignorance is bliss.

262
00:16:29,680 --> 00:16:32,440
So I mean, I just didn't know
what I didn't know.

263
00:16:33,880 --> 00:16:35,000
So did you go to the Soviet
Union?

264
00:16:35,000 --> 00:16:36,320
Yeah.
How long were you there?

265
00:16:36,320 --> 00:16:40,360
Yeah, it was 2 1/2 months.
Was it like, what was the most

266
00:16:40,360 --> 00:16:41,720
shocking?
Was it culture?

267
00:16:41,720 --> 00:16:43,160
Shock? Or— Well, yeah, I would
say.

268
00:16:43,160 --> 00:16:46,000
What you expected?
Actually, I mean definitely was

269
00:16:46,000 --> 00:16:49,160
culture, all kinds of things.
You're like, oh, wow, this is

270
00:16:49,400 --> 00:16:51,760
different on a whole bunch of
different levels.

271
00:16:51,760 --> 00:16:56,720
But I'd say that one of the most
interesting things was is that,

272
00:16:57,360 --> 00:17:00,920
you know, because the Soviet
Union was enemy number one, you

273
00:17:00,920 --> 00:17:06,079
know, all of that.
And and it didn't take too long

274
00:17:06,079 --> 00:17:09,760
to realize that the government
and the people are not the same

275
00:17:09,760 --> 00:17:13,319
thing.
And so that was a, that was a

276
00:17:13,960 --> 00:17:18,160
transforming experience for me
just in terms of how I think

277
00:17:18,160 --> 00:17:22,319
about people in general and how
I think about, you know, the

278
00:17:22,319 --> 00:17:26,560
world in general in a place that
we all operate and, and live in.

279
00:17:26,880 --> 00:17:29,960
And so that was that was super
interesting experience.

280
00:17:30,680 --> 00:17:34,480
But I would say it was a little
bit of a stage setting for this,

281
00:17:34,840 --> 00:17:38,520
you know, poor farm kid from
Iowa who had never been anywhere

282
00:17:39,200 --> 00:17:42,680
to all right, you know what,
like the whole world is out

283
00:17:42,680 --> 00:17:46,760
there and there are things
to see and opportunities

284
00:17:46,760 --> 00:17:48,520
to explore and.
So that just sparked your

285
00:17:48,520 --> 00:17:51,240
curiosity.
Yeah, so you know, one thing

286
00:17:51,240 --> 00:17:56,000
leads to another, and anyway,
my first job, I worked for a

287
00:17:57,040 --> 00:18:02,560
small family owned ingredient
manufacturing company and we

288
00:18:02,560 --> 00:18:10,800
were — so I was married in '92; by
'95 we moved to Ireland, Northern

289
00:18:10,800 --> 00:18:14,400
Ireland — and Spain.
Marissa was born in Spain.

290
00:18:15,880 --> 00:18:17,320
Ireland and Spain.
It's for work.

291
00:18:17,320 --> 00:18:21,400
So we had sales in Europe,
but we had no physical presence

292
00:18:21,400 --> 00:18:24,520
in Europe.
I'd say Ireland was kind

293
00:18:24,520 --> 00:18:30,320
of launching the European
division. A 20-something, an

294
00:18:30,320 --> 00:18:33,360
accountant and manufacturing
leader.

295
00:18:33,360 --> 00:18:37,320
We all moved to Ireland
and began, you know,

296
00:18:37,320 --> 00:18:40,440
this journey of setting up
operations, manufacturing,

297
00:18:40,440 --> 00:18:44,680
distribution, sales, etcetera.
What an experience, yeah.

298
00:18:45,080 --> 00:18:50,520
I guess I was 28 right
when we moved.

299
00:18:50,600 --> 00:18:52,280
So what do you do when you get
over there?

300
00:18:52,280 --> 00:18:56,080
You're 28 and you need help with
something — who do you call, work-

301
00:18:56,080 --> 00:18:58,520
wise, when you're kind of like
the guy in charge

302
00:18:58,520 --> 00:19:00,960
there?
Yeah.
So you know, you just figure out

303
00:19:00,960 --> 00:19:03,360
how to be self-sufficient.
So your guys' question about, oh,

304
00:19:03,360 --> 00:19:07,000
well, maybe, you know, be fun to
be an entrepreneur or do

305
00:19:07,000 --> 00:19:09,960
something, you know, how can I
imagine what the future could be?

306
00:19:09,960 --> 00:19:13,240
Well, part of being an
entrepreneur is just kind of

307
00:19:13,240 --> 00:19:17,000
figuring it out, right? Right.
There's no manual.

308
00:19:18,400 --> 00:19:23,360
There's not always a manual.
So one of the things, so in

309
00:19:23,360 --> 00:19:29,200
those days we had, you know, an
assistant that would do typing

310
00:19:29,360 --> 00:19:33,880
for us or you'd make a dictation
and then they would type it up

311
00:19:33,880 --> 00:19:35,720
and then you would put it in a
fax machine.

312
00:19:36,440 --> 00:19:39,160
And, and so I didn't know how to
type.

313
00:19:40,280 --> 00:19:42,040
We changed schools when I was
young.

314
00:19:42,080 --> 00:19:44,640
The year that you were supposed
to learn at one school, I was

315
00:19:44,960 --> 00:19:48,080
not there.
So you know what?

316
00:19:48,080 --> 00:19:51,000
So, for example, I learned how
to type because I wasn't going

317
00:19:51,000 --> 00:19:53,800
to have someone do it, you know,
just silly things.

318
00:19:53,920 --> 00:19:55,720
Stuff I could just
figure out, yeah.

319
00:19:56,680 --> 00:19:59,880
I guess — welcome back — how
has that experience of moving

320
00:19:59,880 --> 00:20:02,680
around so much helped shape your
perspective when you're now

321
00:20:02,680 --> 00:20:07,200
leading a global company?
Yeah, well, I think I think the

322
00:20:07,320 --> 00:20:10,800
you know, when you kind of can
see all of the corners of the

323
00:20:10,840 --> 00:20:14,760
planet and all of God's
creation, all of the people and

324
00:20:14,960 --> 00:20:20,160
cultures and personalities —
it is evident that in

325
00:20:20,160 --> 00:20:22,800
some ways the world is so
different.

326
00:20:23,160 --> 00:20:26,840
But at the same time, you know,
people just want to live in

327
00:20:26,840 --> 00:20:29,320
peace.
They want to eat, they want to

328
00:20:29,320 --> 00:20:32,480
have shelter for themselves.
They want they want good for

329
00:20:32,480 --> 00:20:35,640
their family.
And so there's so much about

330
00:20:35,640 --> 00:20:39,680
humanity that has a core common
thread through it.

331
00:20:40,040 --> 00:20:43,840
And so when you, you know, when
you're leading teams or you're

332
00:20:43,840 --> 00:20:47,880
interacting with people in all
corners of the world, you know,

333
00:20:48,200 --> 00:20:50,640
you just know that in the end,
they're human.

334
00:20:50,640 --> 00:20:53,680
They have a mom and dad, they
may have children, they may have

335
00:20:53,680 --> 00:20:58,240
a spouse, kind of be empathetic
and care genuinely.

336
00:20:58,760 --> 00:21:04,960
And and you know, it's actually
the, you say, well, maybe that's

337
00:21:04,960 --> 00:21:07,240
what you have to do in order to
gain, in order to be a good

338
00:21:07,240 --> 00:21:09,720
leader or strong.
But the reality is that as you

339
00:21:09,720 --> 00:21:13,800
develop relationships and
connections with people, there's

340
00:21:13,800 --> 00:21:16,720
benefit, there's reward in that
for yourself personally.

341
00:21:16,720 --> 00:21:23,320
And so in many ways, like the
hardest thing about it is time

342
00:21:23,320 --> 00:21:29,480
zone and you know, a long
stretched out day, but you know,

343
00:21:29,800 --> 00:21:34,840
humanity has more
of a common thread to, you know,

344
00:21:34,840 --> 00:21:37,880
what matters than, you know,
people would normally think

345
00:21:37,880 --> 00:21:41,200
about.
So we said there's no manual as

346
00:21:41,200 --> 00:21:43,320
to how you get to where you are
today.

347
00:21:43,320 --> 00:21:47,280
So you're a CEO, but you've been
in other leadership roles such

348
00:21:47,280 --> 00:21:51,480
as COO and things of that nature
across different industries and

349
00:21:51,480 --> 00:21:55,200
sectors.
So which jobs in your journey do

350
00:21:55,200 --> 00:21:57,440
you feel like have helped you
the most with getting to where

351
00:21:57,440 --> 00:21:59,400
you are today?
Or maybe learning about

352
00:21:59,400 --> 00:22:01,680
leadership and how to run a
company.

353
00:22:02,000 --> 00:22:07,000
I think one of
the lessons that

354
00:22:07,000 --> 00:22:12,800
I would say is
important, and I've kind of

355
00:22:12,800 --> 00:22:16,480
internalized more
and more over the years, is that

356
00:22:18,520 --> 00:22:23,040
certainly there's a
difference in age and the amount

357
00:22:23,040 --> 00:22:25,680
of experience you have at one
point in your career than at a

358
00:22:25,680 --> 00:22:27,920
later point in your career.
But the reality is, to get

359
00:22:27,920 --> 00:22:31,320
more experience in your career,
you have to be willing to step

360
00:22:31,320 --> 00:22:34,480
into responsibilities that go
beyond your experience.

361
00:22:36,000 --> 00:22:40,040
A lot of people will say, well,
exactly.

362
00:22:40,320 --> 00:22:42,240
Well, part of it can be a
comfort zone.

363
00:22:42,240 --> 00:22:45,520
Part of it can just be and say,
you know what, when I look at

364
00:22:45,520 --> 00:22:50,080
the average degree of separation
between IQ, you know, on the

365
00:22:50,080 --> 00:22:55,120
left and IQ on the right, it's
actually not that far apart,

366
00:22:55,200 --> 00:22:57,680
the degree of separation in
intelligence.

367
00:22:57,680 --> 00:23:01,160
On one hand we say, wow, there are
super smart people

368
00:23:01,160 --> 00:23:04,800
that somebody's willing to pay a
billion dollars to come work

369
00:23:04,800 --> 00:23:09,640
for them and other people that
are, you know, cleaning the

370
00:23:10,080 --> 00:23:11,560
schools.
Right, right.

371
00:23:12,160 --> 00:23:15,720
And so, but at the same time in
the workforce, in the

372
00:23:15,720 --> 00:23:19,840
professional workforce, you
know, the degree of intelligence

373
00:23:19,840 --> 00:23:21,920
is not that great.
There is that learning

374
00:23:21,920 --> 00:23:25,840
curve — there is a learning curve —
but most of it is about

375
00:23:25,840 --> 00:23:30,160
courage, being willing to step
into something

376
00:23:30,240 --> 00:23:36,280
that is beyond your experience.
I remember — so I worked

377
00:23:36,280 --> 00:23:39,040
for this food ingredient company, I
was a salesperson and I, you

378
00:23:39,040 --> 00:23:42,160
know, I had to figure things out,
but I was a salesperson.

379
00:23:42,680 --> 00:23:44,880
We moved back to the
States.

380
00:23:45,520 --> 00:23:50,560
I ended up going to work for Cap
Gemini, a technology consulting

381
00:23:50,560 --> 00:23:55,080
company — was there for three

382
00:23:55,080 --> 00:23:58,240
years — and then ended up at EDS.
I remember one of the

383
00:23:58,400 --> 00:24:02,160
kind of biggest step-
change roles that I had.

384
00:24:02,480 --> 00:24:05,960
So I joined EDS as a
salesperson.

385
00:24:07,040 --> 00:24:10,760
After one year, I became a
delivery executive for one of

386
00:24:10,760 --> 00:24:14,200
our clients, State Farm
Insurance Company.

387
00:24:15,000 --> 00:24:19,840
And so we moved to Bloomington,
Illinois. And, you know, remember I

388
00:24:19,840 --> 00:24:26,920
had a business major, I got a C,
and I was not —

389
00:24:27,080 --> 00:24:29,560
I'm not technically trained
professionally, but I am a

390
00:24:29,560 --> 00:24:32,520
curious person.
And so they're like, hey, don't

391
00:24:32,560 --> 00:24:35,640
worry about not knowing much
about delivering technology.

392
00:24:35,840 --> 00:24:38,600
We just need somebody who's good
with the client, has a growth

393
00:24:38,600 --> 00:24:40,920
attitude.
And so I stepped into a job that

394
00:24:40,920 --> 00:24:44,800
I really probably had no
business doing, but, you know,

395
00:24:45,240 --> 00:24:48,880
figured it out.
And that was a $10 million

396
00:24:48,880 --> 00:24:50,120
contract.
Wow.

397
00:24:50,440 --> 00:24:53,320
Well, I don't know, it sounds.
Yeah, but you're the guy

398
00:24:53,320 --> 00:24:54,840
delivering and making the—
Making that?

399
00:24:54,840 --> 00:24:59,520
I had a good team. But then
the next one was 150 million.

400
00:24:59,840 --> 00:25:02,440
Yeah, that's a jump, from 10
to 150.

401
00:25:03,080 --> 00:25:06,800
So, you know, yeah, I could talk
pretty well, right?

402
00:25:07,280 --> 00:25:12,040
I could convince people and I
could generally figure out what

403
00:25:12,040 --> 00:25:14,120
were the important words and
what order they went in.

404
00:25:14,520 --> 00:25:18,000
So people would be maybe
reasonably confident that I

405
00:25:18,000 --> 00:25:20,680
could do something more than
what I was doing.

406
00:25:21,200 --> 00:25:26,840
And so, you know, one thing led
to another and now I'm managing

407
00:25:26,840 --> 00:25:31,280
this large client, you know, 10
times the size of the one that I

408
00:25:31,280 --> 00:25:32,960
had.
So that was an important jump.

409
00:25:33,520 --> 00:25:34,680
Leadership.
Significant.

410
00:25:34,680 --> 00:25:36,080
Yeah.
At a young age still, yeah.

411
00:25:36,120 --> 00:25:40,960
That's right.
So, well, I was maybe I turned

412
00:25:40,960 --> 00:25:45,920
40 somewhere, like 30s in there.
And then, yeah, I went from

413
00:25:46,000 --> 00:25:50,320
having a team that was
responsible for like 100 people,

414
00:25:50,440 --> 00:25:54,280
a hundred people or something,
to like 1,100 people.

415
00:25:55,000 --> 00:25:59,200
Another big jump.
So, but there, I didn't know

416
00:25:59,200 --> 00:26:03,480
something more, but I took a
minute and tried to understand the

417
00:26:03,480 --> 00:26:05,400
problem.
What about some of the tech

418
00:26:05,400 --> 00:26:09,000
challenges that you
faced, some of the, hey, this is

419
00:26:09,000 --> 00:26:11,240
a technical issue that I'm not
familiar with.

420
00:26:11,240 --> 00:26:13,320
How would you go about solving
those kinds of problems?

421
00:26:13,440 --> 00:26:16,680
So we had—
As a technologist yourself?

422
00:26:17,680 --> 00:26:19,840
This
is a really good question

423
00:26:20,320 --> 00:26:23,560
because ultimately, you
know, I can't make it up

424
00:26:23,560 --> 00:26:25,600
all the time.
Well, this is before — like,

425
00:26:25,840 --> 00:26:28,440
what year was this?
Was this before Google was

426
00:26:28,440 --> 00:26:31,040
big?
This was in the like, early

427
00:26:31,040 --> 00:26:32,640
2000s.
OK, gotcha.

428
00:26:32,640 --> 00:26:36,000
Yeah, 2002, 2003.
Yeah, plenty of resources

429
00:26:36,000 --> 00:26:38,200
available back then.
That's why I had, you

430
00:26:38,200 --> 00:26:40,440
know, a really strong technical
team around me.

431
00:26:40,640 --> 00:26:46,120
But what I'd do is, once —
So you were doing

432
00:26:46,120 --> 00:26:51,240
development?
Yes — we mostly maintained systems in production.

433
00:26:51,240 --> 00:26:55,320
OK, So when something breaks,
you know you got to figure out

434
00:26:55,320 --> 00:26:56,160
It's a
lot of debugging.

435
00:26:56,840 --> 00:27:02,680
Oh yeah, so I would get on the
outage calls where the system's

436
00:27:02,680 --> 00:27:04,480
down.
Dealing with an angry customer.

437
00:27:05,160 --> 00:27:10,240
You listen to the angry customer
and essentially, you know, you

438
00:27:10,480 --> 00:27:13,440
make a big rope around the
problem and then you just keep

439
00:27:13,440 --> 00:27:16,400
eliminating variables.
Well, after a few of those

440
00:27:16,400 --> 00:27:19,320
times, you start actually
understanding how this whole

441
00:27:19,320 --> 00:27:21,240
thing is wired up together and
works.

442
00:27:21,240 --> 00:27:24,280
And so you develop, you know you
don't have to code anything, but

443
00:27:24,280 --> 00:27:26,720
you can create a logical
map of the process.

444
00:27:26,760 --> 00:27:28,760
So you can start to figure stuff
out so.

445
00:27:28,920 --> 00:27:30,600
Learn as you go.
That's the moral.

446
00:27:30,600 --> 00:27:34,640
It sounds like. Well, we'll get
back to CloudFactory.

447
00:27:34,640 --> 00:27:39,240
So what was the first impression
of the company as you joined and

448
00:27:39,240 --> 00:27:42,800
how has it evolved since day one
to now, especially with all

449
00:27:42,800 --> 00:27:47,680
that's going on with AI?
Yeah. So when I joined, a

450
00:27:47,680 --> 00:27:52,560
lot of the revenue that made up
the profile of the company, the

451
00:27:52,560 --> 00:27:54,600
base of the company, was labor-
based.

452
00:27:55,040 --> 00:28:00,240
So humans in the loop doing
some sort of labeling and

453
00:28:00,240 --> 00:28:07,080
annotating, you know, 2D images
or PDF documents or

454
00:28:07,080 --> 00:28:11,440
something and with some
technology involved in the

455
00:28:11,440 --> 00:28:16,480
process, but not a lot of it.
What we've done kind of since —

456
00:28:16,480 --> 00:28:21,840
and again, I joined in October
of '23, so it's been

457
00:28:21,840 --> 00:28:28,320
almost two years, as of August '25.
So we've made a pretty

458
00:28:28,320 --> 00:28:31,000
hard pivot to a technology
platform.

459
00:28:31,000 --> 00:28:34,400
So taking parts and pieces of
what we had integrated together

460
00:28:34,400 --> 00:28:39,880
into a platform that allows us
to kind of orchestrate objects

461
00:28:39,880 --> 00:28:44,280
of service or service objects
into an orchestrated flow.

462
00:28:44,600 --> 00:28:48,920
Where we can optimize the steps
between what, you know,

463
00:28:48,960 --> 00:28:51,520
technology might be doing and
what a human is doing.
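(Again purely illustrative, assuming nothing about the real platform: one way to picture "service objects orchestrated into a flow" is an ordered list of steps, each handled by software or by a person, with each step's output feeding the next. The Step alias and run_flow helper below are hypothetical names.)

```python
# Illustrative sketch only: a flow of machine and human steps.
from typing import Any, Callable

Step = tuple[str, Callable[[Any], Any]]  # (executor: "machine" or "human", handler)


def run_flow(steps: list[Step], payload: Any) -> Any:
    """Run each step in order, passing one step's output into the next."""
    for executor, handler in steps:
        payload = handler(payload)
        print(f"{executor} step -> {payload}")
    return payload


# Hypothetical document flow: a model extracts fields, then a person verifies them.
flow: list[Step] = [
    ("machine", lambda doc: {"fields": f"extracted from {doc}"}),
    ("human",   lambda record: {**record, "verified": True}),
]
run_flow(flow, "invoice_123.pdf")
```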

464
00:28:52,160 --> 00:28:56,960
And being able to
shift the business from a

465
00:28:56,960 --> 00:29:00,840
largely labor-based revenue
business to a platform-based

466
00:29:00,840 --> 00:29:03,680
revenue business has been a big
part of the change.

467
00:29:04,240 --> 00:29:08,200
The benefit of that is
it allows us to create

468
00:29:08,440 --> 00:29:12,680
more value for customers in a way
that's kind of tied to the

469
00:29:12,680 --> 00:29:14,680
outcome or the value they're
trying to receive.

470
00:29:15,320 --> 00:29:20,400
And then effectively it makes
our role in creating that value

471
00:29:21,560 --> 00:29:27,840
more tangible or, you know, it
would be, it's more difficult

472
00:29:27,840 --> 00:29:30,640
for the customer to leave us and
go to someone else.

473
00:29:30,960 --> 00:29:34,280
Now ultimately clients can, you
know, pick and choose who they

474
00:29:34,360 --> 00:29:37,440
want to have as their providers.
And so it's not like we've made

475
00:29:37,440 --> 00:29:40,840
an immovable obstacle for the
customer.

476
00:29:40,960 --> 00:29:43,000
You get to some point where it's
easier for them to stay than to

477
00:29:43,000 --> 00:29:43,840
leave.
It's easier.

478
00:29:43,960 --> 00:29:48,040
Exactly, easier. But
because of the value, the added

479
00:29:48,160 --> 00:29:51,560
incremental value that we're
able to create and and in many

480
00:29:51,560 --> 00:29:55,480
cases it's creating value that
they would otherwise not be able

481
00:29:55,480 --> 00:29:59,640
to recreate because of what we
were able to bring through our

482
00:29:59,920 --> 00:30:02,040
platform.
That is the combination of

483
00:30:02,040 --> 00:30:04,800
technology plus.
So you've initiated this change

484
00:30:04,800 --> 00:30:07,600
in under two years?
Yeah, yeah.

485
00:30:07,680 --> 00:30:11,120
So we're about a really, I'd say
a year and a half into it.

486
00:30:11,360 --> 00:30:13,120
Yeah.
The first bit was, all right—

487
00:30:13,120 --> 00:30:15,000
what's — you know, where's the
toilet?

488
00:30:15,160 --> 00:30:16,920
Yeah.
Thank you.

489
00:30:17,760 --> 00:30:21,000
How are all the pieces tied in?
And then it's like, OK, now that

490
00:30:21,000 --> 00:30:22,960
you understand that, now you
know.

491
00:30:23,080 --> 00:30:25,320
How do we make this better?
How can we make it better?

492
00:30:25,320 --> 00:30:28,800
What was your biggest challenge
when trying to improve the

493
00:30:28,800 --> 00:30:31,880
system as a new CEO?
Yeah, I'd say, I mean

494
00:30:31,960 --> 00:30:35,880
culturally, organizationally.
So we had two kind of

495
00:30:35,880 --> 00:30:41,320
opposite things.
So on one hand, Mark, the founder

496
00:30:41,320 --> 00:30:46,080
had done a fantastic job
building a culture, really a

497
00:30:46,080 --> 00:30:48,600
group of people.
If you're at CloudFactory,

498
00:30:48,600 --> 00:30:50,960
you're a person that wants to be
there because you want to make a

499
00:30:50,960 --> 00:30:52,800
difference in the world in a
positive way.

500
00:30:53,240 --> 00:30:57,280
And so we have a fantastic
culture and a really great

501
00:30:57,280 --> 00:31:00,400
group of people that are about
serving something greater than

502
00:31:00,400 --> 00:31:02,400
themselves.
And these people are in Kenya,

503
00:31:02,400 --> 00:31:04,480
they're all over.
All over the

504
00:31:04,480 --> 00:31:07,960
World.
And so, yeah, the company was

505
00:31:07,960 --> 00:31:09,400
founded in Nepal.
And.

506
00:31:10,520 --> 00:31:17,640
And then also in, in Kenya, UK,
Germany, US, Philippines,

507
00:31:17,960 --> 00:31:23,240
Colombia.
And so, but I think

508
00:31:23,360 --> 00:31:26,560
the culture — that
really positive culture — which,

509
00:31:26,640 --> 00:31:28,880
you know, one thing I
promised Mark —

510
00:31:28,880 --> 00:31:32,640
and if Mark listens to this —
is: hey, whatever I do,

511
00:31:32,640 --> 00:31:35,000
I hope I don't — I don't
want to break that.

512
00:31:35,000 --> 00:31:37,920
So protect that.
Then at the same time say, hey

513
00:31:37,920 --> 00:31:42,440
team, we're going to shift what,
what we're, you know, our focus

514
00:31:42,440 --> 00:31:45,520
for what we're doing and, and
why we matter and the value we

515
00:31:45,520 --> 00:31:48,200
create for our customers and
those are big changes.

516
00:31:48,320 --> 00:31:51,480
Is it hard to share that vision
and kind of get everyone on

517
00:31:51,520 --> 00:31:52,800
board?
You know, I think the first, the

518
00:31:52,800 --> 00:31:55,040
first difficult thing is to come
up with a vision.

519
00:31:55,400 --> 00:31:59,520
So not being the founder of the
company, but still needing a vision that

520
00:31:59,520 --> 00:32:01,200
says, all right, here's what
we're going to do.

521
00:32:01,200 --> 00:32:02,960
Yeah.
And to be able to come up with

522
00:32:02,960 --> 00:32:07,840
that — and then, you know, one
that's credible, that people

523
00:32:07,840 --> 00:32:10,920
can
understand, and ultimately

524
00:32:10,920 --> 00:32:13,560
is inspirational and people are
willing to follow and get

525
00:32:13,560 --> 00:32:15,520
behind.
And so that took a little bit of

526
00:32:15,520 --> 00:32:19,200
time to get clear on that vision
and then be able to articulate

527
00:32:19,200 --> 00:32:23,320
it well enough that the, you
know, the team was on board, our

528
00:32:23,320 --> 00:32:27,000
board was on board and our
customers, you know, are

529
00:32:27,000 --> 00:32:31,000
signaling that, hey, we like we
like these ideas, you know, give

530
00:32:31,000 --> 00:32:32,720
us more.
So when you come in with this

531
00:32:32,720 --> 00:32:36,720
vision, you have to convince the
founders, the board members and

532
00:32:36,720 --> 00:32:39,680
your team that this is the right
way to go, and their—

533
00:32:39,800 --> 00:32:43,400
Customers.
The biggest one, customers. They—

534
00:32:44,800 --> 00:32:47,200
Everything is the privilege
of the revenue, yeah.

535
00:32:47,560 --> 00:32:48,680
Exactly.
Exactly.

536
00:32:49,000 --> 00:32:50,280
Yeah, but that's quite a
challenge.

537
00:32:50,280 --> 00:32:53,760
I mean, yeah, I guess that's
good in that it helps having all

538
00:32:53,760 --> 00:32:56,280
that positive feedback to know,
hey, this might not be the

539
00:32:56,280 --> 00:32:58,120
perfect vision, how can we make
it better?

540
00:32:58,120 --> 00:33:00,760
How can we get it to be that
perfect direction?

541
00:33:00,840 --> 00:33:02,560
So it's good to have that
feedback, I'm sure.

542
00:33:02,560 --> 00:33:04,840
Very intimate.
But it's hard, right?

543
00:33:04,840 --> 00:33:07,520
So not everybody was super
excited for that.

544
00:33:08,320 --> 00:33:11,760
We're super excited.
So, you know, some people chose

545
00:33:11,760 --> 00:33:14,480
not to stay.
Some, I mean, some people stayed.

546
00:33:15,320 --> 00:33:17,240
There are always changes, some
talent

547
00:33:17,240 --> 00:33:19,080
gaps you have to recruit some
in.

548
00:33:19,520 --> 00:33:24,200
And so doing all of that change
and still kind of holding the

549
00:33:24,200 --> 00:33:29,160
hearts and the minds of
the team more broadly, I

550
00:33:29,160 --> 00:33:31,880
would say it's probably the most
challenging thing, the most

551
00:33:31,880 --> 00:33:37,720
challenging: one, the original idea,
yeah, the vision; and two, keeping

552
00:33:37,720 --> 00:33:39,160
everybody coming along with
you.

553
00:33:39,640 --> 00:33:42,600
That's that's not easy.
Yeah, which I know that that was

554
00:33:42,600 --> 00:33:44,400
one of the questions that we
had.

555
00:33:44,400 --> 00:33:48,720
You know, how do you maintain a
culture that's so great and try

556
00:33:48,720 --> 00:33:50,280
to grow and implement new
things?

557
00:33:50,280 --> 00:33:53,920
So that's a big weight to pull.
Well, the thing

558
00:33:53,920 --> 00:33:58,360
that we have tried to do and I
would, I would say it feels like

559
00:33:58,360 --> 00:34:02,960
it's worked well, is to
be honest and transparent and

560
00:34:02,960 --> 00:34:05,280
empathetic with the, you know,
with the team.

561
00:34:05,840 --> 00:34:10,080
And so, you know, we tried
to do that in the

562
00:34:10,080 --> 00:34:14,520
initial period and we changed
some things and had really nice

563
00:34:15,159 --> 00:34:16,480
sequential
quarters.

564
00:34:16,480 --> 00:34:19,159
Take me into your mindset when
you implemented this vision.

565
00:34:19,239 --> 00:34:21,880
Were you stressing out then?
What was your I don't know

566
00:34:22,159 --> 00:34:23,960
mentality then?
Well.

567
00:34:24,320 --> 00:34:25,440
Were you already thinking I
mean?

568
00:34:27,679 --> 00:34:29,000
You know, I mean, it's hard
work.

569
00:34:29,360 --> 00:34:34,120
So I think that the thing though
is that I don't know, somebody

570
00:34:34,120 --> 00:34:37,520
said strong convictions, loosely
held.

571
00:34:37,920 --> 00:34:41,760
So, you know, we decided on what
the vision was.

572
00:34:41,760 --> 00:34:45,840
We, you know, spent a lot of
time developing some consensus

573
00:34:45,840 --> 00:34:51,320
around it and developing a plan
to execute against it.

574
00:34:51,840 --> 00:34:55,920
And then a way to kind of a
management system to set up the

575
00:34:55,920 --> 00:34:59,320
measures and to be able to
understand, hey, how is it, is

576
00:34:59,320 --> 00:35:02,160
it working?
Because we need to know as we go

577
00:35:02,600 --> 00:35:05,840
both kind of actual and leading
indicators, how well is it

578
00:35:05,840 --> 00:35:08,240
working.
And so, you know, we put the

579
00:35:08,400 --> 00:35:12,760
structure in place to do that.
And, you know, we were, we, we

580
00:35:12,920 --> 00:35:17,280
definitely made some adjustments
along the way as we went, you

581
00:35:17,280 --> 00:35:20,160
know, in hindsight, some things
we wish we'd have done sooner.

582
00:35:23,120 --> 00:35:27,520
But I think that the
transparency and empathy is, you

583
00:35:27,520 --> 00:35:31,280
know, we just said, Hey, there's
some hard parts to this and

584
00:35:31,280 --> 00:35:36,600
there's some good parts to this.
And they're all — everybody

585
00:35:36,600 --> 00:35:38,320
works at CloudFactory as an
adult.

586
00:35:39,560 --> 00:35:42,040
And so we just treat them
like

587
00:35:42,040 --> 00:35:45,560
adults.
And just say, Hey, here's, you

588
00:35:45,560 --> 00:35:49,760
know, because it, if we have a
plan, a strategy, a vision and a

589
00:35:49,760 --> 00:35:54,480
strategy and a plan that they
don't understand and they're

590
00:35:54,480 --> 00:35:58,240
educated adults, the problem may
not be the team.

591
00:35:58,240 --> 00:36:03,560
And so you know, my view is
it's.

592
00:36:03,640 --> 00:36:05,760
Like free feedback?
Is this is?

593
00:36:05,760 --> 00:36:10,600
If I can't, you know, explain it
in a clear way that is

594
00:36:10,600 --> 00:36:15,120
understood, then maybe it's not
such a great idea or maybe I

595
00:36:15,120 --> 00:36:20,600
can't explain things well.
Possible, sure. Both could be true.

596
00:36:20,600 --> 00:36:23,560
But so I would say, you know,
that that's been really

597
00:36:23,560 --> 00:36:27,400
important, the transparency, the
empathy, and then the

598
00:36:27,400 --> 00:36:30,480
willingness to listen, you know,
which is probably hard for me.

599
00:36:32,320 --> 00:36:35,280
Well, that's hard for all of us.
And those are great to hear some

600
00:36:35,280 --> 00:36:38,920
of the key things that help you
push forward some of these ideas

601
00:36:38,920 --> 00:36:40,720
with your team.
And we always say feedback is

602
00:36:40,720 --> 00:36:43,400
one of the best things.
And I know I always appreciate

603
00:36:43,400 --> 00:36:46,280
feedback in the workplace and
outside because it helps you

604
00:36:46,280 --> 00:36:48,800
grow and learn how you can do
things differently or better.

605
00:36:49,400 --> 00:36:52,440
But with all that being said,
the role that you're in now is a

606
00:36:52,640 --> 00:36:54,840
big role and super time
consuming.

607
00:36:54,840 --> 00:36:57,520
And as you know, we've had
leaders from all different

608
00:36:57,520 --> 00:37:00,440
industries on this podcast, so
one of the things we love to ask

609
00:37:00,440 --> 00:37:04,040
y'all is what are some of the
habits or maybe routines that

610
00:37:04,040 --> 00:37:06,600
you have in place to help you
stay focused and lead your

611
00:37:06,600 --> 00:37:10,040
company well?
Yeah, so I'm definitely a

612
00:37:10,040 --> 00:37:15,320
morning person.
So I, I'm about 7 hours of

613
00:37:15,320 --> 00:37:20,440
sleep.
OK, so I like to be, you know,

614
00:37:20,600 --> 00:37:24,800
well and take care of myself and
you know, sleep is important,

615
00:37:25,160 --> 00:37:28,640
but I, but I, you know, I like
to start my day early.

616
00:37:28,920 --> 00:37:33,200
How early are we talking?
Well, I mean, I start to wake up

617
00:37:33,400 --> 00:37:39,320
around 5:00, 5:30, depending on
what time zone I'm in. Yeah.

618
00:37:40,160 --> 00:37:44,640
Right, let's try and.
And then I try not to fill my

619
00:37:44,640 --> 00:37:51,080
day, stack it full.
So I think for me in my role,

620
00:37:51,080 --> 00:37:54,440
the most important thing I can
do is be really clear and

621
00:37:54,440 --> 00:37:58,040
intentional about what we're
doing and how to articulate it

622
00:37:58,040 --> 00:37:59,640
and keep everybody brought
along.

623
00:38:00,080 --> 00:38:05,080
And so, you know, I spent a lot
of time thinking and my morning

624
00:38:05,400 --> 00:38:08,080
window is my best thinking
window.

625
00:38:08,160 --> 00:38:11,880
And so when no one's awake or
the dog's not bothering me too

626
00:38:11,880 --> 00:38:14,080
much, you know, I'll, I'll do
that.

627
00:38:14,360 --> 00:38:15,440
A
nice cup of coffee?

628
00:38:15,840 --> 00:38:19,520
I'm definitely a coffee person.
Christina doesn't drink coffee.

629
00:38:19,560 --> 00:38:21,680
Yeah, I don't.
I don't. 3 or 4 cups a day,

630
00:38:21,840 --> 00:38:25,280
yeah.
I'm the only one in our family

631
00:38:25,320 --> 00:38:27,040
that drinks coffee.

632
00:38:27,400 --> 00:38:28,480
Yeah.
So it's been lonely.

633
00:38:29,680 --> 00:38:31,120
Well, you have a coffee friend
here.

634
00:38:31,120 --> 00:38:34,200
But it's funny, your routine
sounds so similar to my father,

635
00:38:34,200 --> 00:38:36,080
who I work with now in a family
business.

636
00:38:36,080 --> 00:38:41,600
And just, it's sweet to see how
important that morning time is

637
00:38:41,600 --> 00:38:45,440
and to read the white papers on
the weekend and just absorb as

638
00:38:45,440 --> 00:38:47,760
much knowledge and information
as you can.

639
00:38:47,760 --> 00:38:51,040
And how that helps you all
become these great leaders and

640
00:38:51,040 --> 00:38:53,800
something that helps us and
everyone listening that you

641
00:38:53,800 --> 00:38:57,280
know, that quiet time and you
know, use that time how you

642
00:38:57,280 --> 00:38:59,800
wish.
If you're coming from a faith

643
00:38:59,800 --> 00:39:01,800
perspective, I guess sometime
with the Lord.

644
00:39:01,800 --> 00:39:05,480
And I know that applies to
y'all, but then too, learning

645
00:39:05,480 --> 00:39:08,320
and growing and absorbing
knowledge and reading goes a

646
00:39:08,320 --> 00:39:10,920
long way.
And and so that's something we

647
00:39:10,920 --> 00:39:14,120
hear a lot on the podcast.
Yeah, I get up around

648
00:39:14,120 --> 00:39:17,320
the same time. It depends also
on like I try to get my sleep, I

649
00:39:17,320 --> 00:39:19,680
prioritize sleep.
So if I haven't gotten my sleep

650
00:39:19,680 --> 00:39:22,600
in a couple of days, I do not
get up; I prioritize that sleep.

651
00:39:22,880 --> 00:39:25,520
But I like to think too, after
doing that for so many years,

652
00:39:25,520 --> 00:39:29,360
how much extra time we've spent
learning or processing and just

653
00:39:29,360 --> 00:39:31,120
getting ahead like that time
adds up.

654
00:39:31,680 --> 00:39:33,840
And I think it's a common trait
of a lot of successful people,

655
00:39:34,760 --> 00:39:38,320
but.
I think also in that is just

656
00:39:38,440 --> 00:39:42,920
intentionality.
So just don't, don't go through

657
00:39:42,920 --> 00:39:46,960
life just letting it happen.
Like, decide what you want to

658
00:39:46,960 --> 00:39:51,400
do and then do it, and then be
willing to adjust, adapt to.

659
00:39:51,680 --> 00:39:53,880
That's the hard part of
adjusting and adapting.

660
00:39:53,960 --> 00:39:56,960
Yeah, because you know.
There's no manual for that.

661
00:39:57,040 --> 00:40:01,560
You just adapt and.
We never take decisions with a

662
00:40:01,560 --> 00:40:04,760
full set of information.
We don't know everything that's.

663
00:40:04,800 --> 00:40:07,560
Great.
That's a humble perspective.

664
00:40:07,800 --> 00:40:09,920
Right there.
Yeah, but how can you know more

665
00:40:09,920 --> 00:40:11,720
if you know everything?
Yeah, that's right.

666
00:40:12,360 --> 00:40:14,120
That's good.
That's not my job.

667
00:40:14,240 --> 00:40:16,280
But we still have a
few more questions.

668
00:40:18,200 --> 00:40:19,200
I
don't know it all.

669
00:40:19,200 --> 00:40:20,680
I can't.
Make sure you're aware of that.

670
00:40:20,680 --> 00:40:22,960
That's right.
But like I said, looking back, we've

671
00:40:22,960 --> 00:40:25,160
talked a little bit about your
career leading into that.

672
00:40:25,440 --> 00:40:27,160
We're obviously young
professionals.

673
00:40:27,160 --> 00:40:29,400
You have a lot of young
professionals that you're still

674
00:40:29,400 --> 00:40:32,120
parenting.
What advice would you give to

675
00:40:32,120 --> 00:40:35,320
those that are aspiring to maybe
follow your footsteps and become

676
00:40:35,320 --> 00:40:37,240
a business leader one day?
Yeah.

677
00:40:37,840 --> 00:40:42,880
So I mean, I, I think the, I
mean, one thing we say, and, and

678
00:40:43,400 --> 00:40:46,400
I'm sure your dad would say this
too, is that we, we can't

679
00:40:46,400 --> 00:40:50,560
predict all of the things that
are going to happen to us in

680
00:40:50,560 --> 00:40:53,520
life.
Some of the things, you know,

681
00:40:53,560 --> 00:40:56,360
we're, we're responsible for
some of the things we're just

682
00:40:56,360 --> 00:40:59,560
not responsible for.
They happen to us not because of

683
00:40:59,560 --> 00:41:00,720
something.
It's just life.

684
00:41:01,200 --> 00:41:05,680
And so I think one is not to be
too married to some idea about

685
00:41:05,680 --> 00:41:08,480
what outcome you think you're
going to achieve, but

686
00:41:08,480 --> 00:41:12,480
continually be, you know,
adapting and moving towards

687
00:41:12,480 --> 00:41:18,640
something intentional.
And, and I would say that, you

688
00:41:18,640 --> 00:41:24,120
know, if you're, you know,
you're earlier in your career

689
00:41:24,120 --> 00:41:26,800
and you're like, the, the most
important thing that's on your

690
00:41:26,800 --> 00:41:30,360
mind is some new, you know,
electric bicycle you want to

691
00:41:30,360 --> 00:41:33,000
buy.
Because e-bikes are cool.

692
00:41:33,520 --> 00:41:36,760
They're cool.
I mean no, I mean you can take.

693
00:41:37,000 --> 00:41:40,600
The whole swamp private.
But I mean.

694
00:41:40,920 --> 00:41:44,560
You know, thinking about one
year and five years and 10 years

695
00:41:44,560 --> 00:41:47,720
and you know, what do you want
to, what do you want to aspire

696
00:41:47,720 --> 00:41:49,920
to?
What do you want to make?

697
00:41:49,920 --> 00:41:51,920
How do you want to make a
difference in life?

698
00:41:51,920 --> 00:41:55,120
How do you want, who is going to
miss you if you're not here?

699
00:41:55,920 --> 00:42:01,080
And so some of that, you know,
we have responsibilities for

700
00:42:01,360 --> 00:42:04,680
loved ones around us.
You know, what do you, what do

701
00:42:04,680 --> 00:42:06,360
you, what is it that you're
aspiring to?

702
00:42:06,360 --> 00:42:08,720
And then just be super clear,
intentional about it.

703
00:42:10,040 --> 00:42:13,680
And, and if you're not sure,
take a minute and think about

704
00:42:13,680 --> 00:42:16,160
it.
And because you know, you, we

705
00:42:16,160 --> 00:42:20,560
all have, you know, 24 hours a
day and seven days a week.

706
00:42:20,560 --> 00:42:24,280
And however many years we have,
you know, we're all

707
00:42:24,280 --> 00:42:26,920
falling out of an airplane.
Yeah, exactly.

708
00:42:27,760 --> 00:42:31,360
But yeah, you might as well.
Might as well make the most of

709
00:42:31,360 --> 00:42:33,680
the gifts that you have and
then apply them in some way that

710
00:42:33,680 --> 00:42:38,560
not just feeding you, but
bringing others along. That's right,

711
00:42:38,680 --> 00:42:40,800
and being intentional about it I
think is key.

712
00:42:40,840 --> 00:42:43,840
Being intentional, yeah.
One thing I'd love to touch on

713
00:42:43,880 --> 00:42:47,280
before we do wrap it up is
you're leading a global company

714
00:42:47,280 --> 00:42:51,200
as a CEO, yet you still find time
to serve the church on Tuesday

715
00:42:51,200 --> 00:42:52,160
nights.
I do.

716
00:42:52,320 --> 00:42:54,480
How do you find time for that in
such a busy role?

717
00:42:54,480 --> 00:42:56,400
Most people would say hey, I, I
don't.

718
00:42:56,480 --> 00:42:59,920
I just can't make time for that.
Or I travel all the time, yeah.

719
00:43:00,000 --> 00:43:04,200
Here's what I'd say. So, pre-
COVID,

720
00:43:04,640 --> 00:43:10,200
I didn't make time.
Not, I mean, I did stuff, but

721
00:43:10,200 --> 00:43:14,520
not sufficiently.
So when COVID happened and, and

722
00:43:14,520 --> 00:43:20,560
I, I traveled way too much.
And so when COVID happened and I

723
00:43:20,560 --> 00:43:23,840
didn't travel anymore, it was
just a moment where I could, you

724
00:43:23,840 --> 00:43:26,760
know, step back and reflect on,
all right, what am I, what am I

725
00:43:26,800 --> 00:43:32,400
pouring myself into and for what
return am I, am I doing it?

726
00:43:32,800 --> 00:43:36,960
And so, so that was an important
time just for me in my own

727
00:43:36,960 --> 00:43:39,400
personal development.
I think I'm developing all the

728
00:43:39,400 --> 00:43:41,880
way till I die.
By the way, I don't think that

729
00:43:41,880 --> 00:43:45,320
ends for any of us.
But so that was a time where I'm

730
00:43:45,320 --> 00:43:48,240
saying, OK, well, you know, I'm,
I'm a person of faith.

731
00:43:48,520 --> 00:43:54,120
So I would say if God has
plans for me, good plans for me,

732
00:43:55,320 --> 00:43:59,720
do I think that he wouldn't give
me enough time to do his good

733
00:43:59,720 --> 00:44:03,920
plans for me?
I heard Rick Warren do this

734
00:44:03,920 --> 00:44:09,600
quote or, or make this comment
is like, well, I guess either he

735
00:44:09,600 --> 00:44:11,400
didn't give me enough time or I
don't trust him.

736
00:44:12,320 --> 00:44:15,840
And so I just said, all right,
there we go.

737
00:44:15,840 --> 00:44:20,080
My life is all, you know,
discombobulated because of COVID

738
00:44:20,080 --> 00:44:22,400
and my normal rituals are messed
up.

739
00:44:22,400 --> 00:44:24,200
And so what can I do
differently?

740
00:44:24,720 --> 00:44:27,920
And so I just said, hey, you
know, and when I am up early in

741
00:44:27,920 --> 00:44:33,680
the morning, I spend probably an
hour just, you know, reading,

742
00:44:33,880 --> 00:44:40,920
praying, reflecting, and, and
that is my best hour in the day

743
00:44:40,920 --> 00:44:44,160
for my brain to sort of, to
process problems.

744
00:44:45,360 --> 00:44:48,160
So as I said, for me, it was
like, all right, I'm going to

745
00:44:48,160 --> 00:44:52,320
give up my best hour.
That's in the morning time.

746
00:44:52,360 --> 00:44:55,480
The evening time is less so.
I mean it might sound like a lot

747
00:44:56,880 --> 00:45:03,640
but I super enjoy doing the
Tuesday evening volunteering, and I

748
00:45:03,640 --> 00:45:06,640
do a couple other things too but
so somehow.

749
00:45:08,240 --> 00:45:09,640
Priorities.
Well.

750
00:45:10,000 --> 00:45:11,880
I just trust that everything
else will work out.

751
00:45:11,920 --> 00:45:13,680
I.
Think you're a great example.

752
00:45:13,680 --> 00:45:15,560
I love that, it's really cool.
Appreciate it.

753
00:45:16,160 --> 00:45:18,080
We're going to wrap it up.
We've got a couple more

754
00:45:18,080 --> 00:45:21,720
questions and I know Connie's
got dinner waiting for you, so

755
00:45:21,720 --> 00:45:26,800
we'll wrap it up here soon.
So one of the things you touched

756
00:45:26,800 --> 00:45:30,040
on is what you're working on at
Cloud Factory, but could you

757
00:45:30,040 --> 00:45:33,160
give us some insight on what's
next for Cloud Factory?

758
00:45:33,160 --> 00:45:36,160
Any exciting developments or
initiatives that you are able to

759
00:45:36,160 --> 00:45:37,840
share about?
Yeah.

760
00:45:37,840 --> 00:45:43,160
So, so our, our fiscal
year ended in

761
00:45:43,160 --> 00:45:45,120
June.
So we're kind of pivoting into

762
00:45:45,120 --> 00:45:47,160
our new year.
We just hired a new Chief

763
00:45:47,160 --> 00:45:53,000
Product and Technology Officer,
who has a really interesting

764
00:45:53,000 --> 00:45:56,640
background in AI.
And so we're very excited about

765
00:45:56,640 --> 00:46:03,600
that.
I think that our fiscal '26

766
00:46:03,600 --> 00:46:07,400
will be a year where we make big
inroads into the enterprise and

767
00:46:07,400 --> 00:46:12,000
really unlocking these tough use
cases where like, hey, I'd love

768
00:46:12,000 --> 00:46:14,200
to use AI, but I'm not sure I
can trust it.

769
00:46:14,560 --> 00:46:17,400
And so I think this is going to
be a big year for Cloud

770
00:46:17,400 --> 00:46:20,280
Factory and really making
progress in the enterprise and

771
00:46:20,720 --> 00:46:25,520
unlocking disruptive potential
for the enterprise that you know

772
00:46:25,640 --> 00:46:30,280
up until now most CEOs have
been kind of cautious or afraid

773
00:46:30,280 --> 00:46:32,120
to lean too hard into it.
It's exciting.

774
00:46:32,240 --> 00:46:34,720
Yeah, we are excited.
That's awesome.

775
00:46:35,080 --> 00:46:38,600
Well, we will close it out, but
for the final, final bit, when,

776
00:46:38,600 --> 00:46:41,000
when your time with Cloud
Factory is all said and done,

777
00:46:41,000 --> 00:46:43,760
it's going to end at some point,
who knows when, but what's the

778
00:46:43,760 --> 00:46:45,960
legacy you're hoping to leave
behind with the people you're

779
00:46:45,960 --> 00:46:49,160
working with?
Well, I mean, you know, first I

780
00:46:49,160 --> 00:46:52,360
would say hopefully I can make
some positive impact on some

781
00:46:52,360 --> 00:46:57,240
people's lives along the way.
You know, I hope I don't break

782
00:46:57,280 --> 00:47:01,920
the good thing that Mark created
and the, the vision for Cloud

783
00:47:01,920 --> 00:47:04,120
Factory.
I I hope I leave that better,

784
00:47:04,320 --> 00:47:06,760
you know, in some way better
than I found it.

785
00:47:06,760 --> 00:47:10,440
Not that it was, it was bad when
I found it, but just to continue

786
00:47:10,440 --> 00:47:13,040
to build on the legacy that
he started.

787
00:47:13,600 --> 00:47:20,120
And, you know, I'd say just,
yeah, continue to find new ways

788
00:47:20,120 --> 00:47:21,680
to serve.
That's awesome.

789
00:47:22,200 --> 00:47:24,880
Well, Kevin, thank you so much.
Thanks for doing this, we

790
00:47:24,880 --> 00:47:25,640
appreciate it.