Leaders Shaping the Digital Landscape
June 20, 2024

A Culture of Quality

Join host Wade Erickson as he talks with Philip Daye, QA Team Lead at EMARKETER, about why building a community through collaboration leads to a shared vision of quality.

In this episode, host Wade Erickson chats with Philip Daye, QA Team Lead at EMARKETER, about the power of building a community through collaboration. Discover how fostering a collaborative culture drives a shared vision of quality.

Key Takeaways:

  • Building a collaborative community enhances communication and trust.
  • Shared goals and vision improve overall quality and team alignment.
  • Community-driven initiatives lead to continuous improvement.
Transcript

Wade Erickson (00:13):

Welcome, all, to another episode of Tech Leaders Unplugged. I'm Wade Erickson, the host, and today we're getting unplugged with Philip Daye, QA Team Lead at EMARKETER. Our topic today is a culture of quality, and we're going to get into how software development teams, and other areas of product development, bring quality in. And notice it's Quality with a capital Q. In the quality world we talk about Big Q and little Q, but this is the formal program for how we make sure that the products and services we deliver to our customer base, both internal and external, meet and exceed expectations. So thanks so much, Philip, for joining us today and bringing your expertise to this community. Go ahead and introduce yourself a little bit, introduce EMARKETER, and then we'll jump into the topic.

Philip Daye (01:16):

Sure thing. Thanks, Wade. Happy to be here with you today. As you said, my name is Philip Daye. I am a QA team lead at EMARKETER. My career, though, has seen me hold many different roles in the quality space, but I'm happy now to be leading a group of testers and trying to set a good, solid direction for the quality of the products at EMARKETER. EMARKETER is an online site for analysts. We prepare and deliver information on markets and market trends, especially around advertising and marketing. There are daily emails that go out, there are websites and charts, and we create the tools that help our analysts prepare this information as well as display it to our clients and to the public in general.

Wade Erickson (02:15):

Great. So a culture of quality is the topic today. What brought that topic to you, and what are your thoughts on it? We can jump right into that.

Philip Daye (02:30):

Sure. You know, if you'd asked me about it early in my career, I probably wouldn't have even thought about there being a culture to quality. I started out kind of as a lone wolf, and it was just something that emerged over time. As I worked in different organizations and interacted with other testers, I began to see that the way we interacted with each other could be different from how things were done throughout the company. And it really crystallized for me when I was at Ultimate Software. We had grown from an organization where the testers could meet in one room, have a small conference or whatever, and everyone could come together, so there was always an interchange of ideas, thoughts, and problems. But the organization continued to grow, continued to spread to multiple sites, and it was no longer possible to gather everyone together. And so, under the direction of Tariq King, who was the director of quality at that time, I formed a Quality Guild. We used the model you sometimes see at Spotify, where you divide the domain up into different groups, the groups are divided into different teams, and you also have people with similar roles working together. But the concept with the Guild was to bring together anyone who had an interest in quality. A culture is built through sharing ideas and sharing common ways of looking at things. And so, while we weren't trying to shape a new culture, what we were trying to do was preserve what had originally existed as the company grew. There were some missteps along the way; we had to learn how to make one of these guilds function. But over time we built a really healthy exchange of ideas. A tester could come in with a problem they were having, raise a question, and get an immediate response from others who were helping them. That then allowed us to step out a little further and begin to build working groups around different ideas, like test management tooling or metrics around defining what constitutes test coverage. And as we grew, we began to really crystallize the concepts of how testing and quality were going to be pursued at Ultimate Software.

Now, having left there and come to EMARKETER, I faced a slightly different problem. Here it wasn't a matter of preserving an existing culture, but of building one. The culture of quality was nascent. The director of engineering had solid concepts about what he wanted to see in testing and quality, but it became my mandate to begin to grow that. And so that's the process we're in the midst of right now: trying to build that shared understanding of what constitutes quality.

Wade Erickson (05:50):

Great. So the guild concept: you said it came from Spotify, and obviously a guild is a group of people, a community; it's a very historical term. Tell me a little bit about how you identified it. Was it a book, or something you came across in your work life? Tell me about this guild concept and how it shaped the way you structured your communication.

Philip Daye (06:23):

Sure. So as you said, first off, the concept of a guild is ages old; in European history it goes back to at least the Middle Ages. As someone who is a bit of a history buff, I've studied that, and I understand the concept: the idea of bringing someone into a guild, a shared concept, a shared workspace, and being able to take someone from, say, an intern up through a master craftsman. So that always had appeal. But as we were talking through what we wanted to do with this culture, I did come across some articles online about how Spotify organized theirs, and it made sense. While we didn't directly model what they were doing, it was close enough in the idea that we had different spaces with different teams. And the thing about the guild that I gravitated towards was that it was not just about the role of the individual. So it wasn't just "only testers can come," right? It was, do you have an interest in quality? Then let's get together and let's talk about it. Because we find that with a diversity of opinions, we can build better ideas. Some of the best ideas for testability have come out of developers when you've posed the problem: how are we going to test this? What can we do? Maybe they know something in the implementation that would allow them to expose a testability hook. So having anyone interested in quality participate was key to this.

Wade Erickson (08:11):

So tell me a little bit about the formation. Is this something like a center of excellence that's formalized with the stakeholders in the organization, where you have mission statements? Or is it more of an informal gathering of like-minded folks who want to accomplish something with a common strategy?

Philip Daye (08:37):

Right. It sits in the middle of those concepts, so it's not a true center of excellence. As I began to research this, what I found is that a center of excellence often tends to become its own little culture by itself, and the things that come out of it, especially in a larger organization, are not uniformly adopted. What ends up happening is you get these subcultures, pockets of culture, that develop, and they may develop their own solutions separate from what the center of excellence is proposing. So what we needed was something more cross-cutting, something that was bringing in ideas from throughout the organization. Now, it wasn't entirely informal. I did have a mandate to create this guild, so there was some direction behind it. It did have its own mission statement, and we did have a structure of sorts. When I joined Ultimate Software, the organization had a flat structure, so everyone in testing was a tester; there was no other differentiation. There were a few test architects, but everyone else was at the same level, and we had just begun to stratify the layers. So initially we only invited those who were considered test leads, a much smaller group of people. Over time we found that was a little too insular, so we began to open it up to more and more people. It was informal in that there was no formal membership, but it was formal in that it had backing from leadership and it had certain things it was trying to accomplish over time. The other thing we found was that initially I tried to solicit presentations, and people just weren't interested in preparing them, so it started to falter a little bit. So we adopted the concept of lean coffee: we crowdsourced our agenda for every meeting, and then the crowd voted on it. We could pick things off the list, start at the top, discuss a topic until we felt we had completed it, and then move on to the next one.
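For anyone who wants to try the lean-coffee format Philip describes, here is a minimal sketch in Python of ordering a crowdsourced agenda by votes. The topics, vote counts, and function name are hypothetical illustrations, not taken from the episode.

from collections import Counter

def build_agenda(proposals, votes):
    """Order proposed topics by vote count; unvoted topics fall to the bottom."""
    tally = Counter(votes)
    return sorted(proposals, key=lambda topic: tally[topic], reverse=True)

if __name__ == "__main__":
    # Hypothetical example: three proposed topics, a handful of votes cast.
    proposals = ["Test coverage metrics", "Flaky UI tests", "Test data management"]
    votes = ["Flaky UI tests", "Flaky UI tests", "Test coverage metrics"]
    for rank, topic in enumerate(build_agenda(proposals, votes), start=1):
        print(f"{rank}. {topic}")

In practice the group would then timebox each topic and vote on whether to keep discussing it, but the vote-ordered list above is the core of the crowdsourced agenda.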

Wade Erickson (11:08):

That's an interesting way to build an agenda, because oftentimes, like you said, agendas are dictated from the top, and people get bogged down with the thought, "Oh, I've got to create a presentation, and I'm going to be evaluated and assessed on it." There's anxiety around that, versus this coffee concept, which is really about, hey, what do we want to talk about? Those are often the best ways to exchange information, instead of the teacher-in-front-of-the-students model. So that's a great insight there. Now, QA is often heavily driven by the tools that are selected. Since these guilds are more of a collaborative gathering of people than a top-down dictate, tell me a little bit about how tool selection, and maybe even the processes and methodologies, come into what you evaluate and eventually adopt.

Philip Daye (12:19):

Sure. Let me start by taking a step back and telling a story from earlier in my career. I was hired into a company because of a particular tool I knew, and there weren't a lot of people who had much expertise in it. As I got into the company, I found out they had just sued another vendor to get out of a contract, because that tool had never worked for them. Then we started trying to work with the tool I'd been hired for, only to find that it would not work with their application either. This was not a web application; it was an ERP desktop application, a client-server model. So now they were going to have to get out of another contract and purchase another tool. Well, the decision was made by a manager who had a friend who was a salesman at a vendor, and you can guess which tool got selected. Again, the tool was good when used the way it was intended, but against the application we were testing, it didn't work. So here I am, tasked with leading a small team of automation testers, trying to automate with a tool that kept falling over, kept dying on us, because it just couldn't handle the application we were testing. Fast forward a little bit: that manager has departed, and my new manager is actually an architect, with less knowledge about testing, who relied on us for information about it. I asked him to allow us to do an experiment, and we built a small framework on top of this test tool that took away some of the constraints the tool itself had when interacting with the application. We were able to show a fairly significant bump in performance when the tests ran. And he said, that's great, now go find a tool that does that. So he freed us up to go look for a tool. We went through a selection process. We started by identifying the things that were important to us: which languages the tool would work with, whether it was extensible, whether we could add onto it, because we had some developers who were very creative with the custom components they created for the application. We created a matrix of the things that were important to us, and then we found tools that matched up against it. By doing that, we were able to select a tool we felt we could stand behind. In fact, I went into a meeting with one of my other automators, with the CEO, the CTO, and the CFO, and the question was: we've been through all these different tools, now you're telling us we need to go to one more tool; if this doesn't work, whose head's on the line? I'm getting ready to say, I'm the leader, I'll take the responsibility for it, and before I could even say anything, my second was saying, I'll take the responsibility for it. I eventually left that company; my second stayed and was there for quite a while. Mind you, in the span of less than five years they had gone through three tools already. That tool was still being used 10 years later. The point of the story is that if you let the people who are going to use the tool have a say in the tool, you have a much better chance of success. They know what they need. Obviously you're going to have to make it make financial sense, I understand that part of it, but the ones using it are the ones who probably have the better idea of what you need. And so we brought that concept into the guild as well.
As I mentioned, one of the things we worked on was selecting a test management tool. So what did we do? We formed a small working group. By talking to their colleagues, they helped identify what we needed in the tool and what kinds of things were important to us. One of them was Jira integration, right? An obvious thing. They took the requirements and began to investigate different tools, and as they narrowed down the choices, they set up a POC, inviting a broader group to try it out and see if it worked. That led to the selection of a tool that, as far as I know, is still in use at Ultimate, well, at UKG now; the company was acquired and merged with another company, so it has changed a little bit, but to the best of my knowledge that tool is still in use. It was a great success in that the process of selecting the tool took less time and got to the right tool, and then we were able to find other ways to extend it and make it more valuable to the organization.
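The criteria matrix Philip describes can be as simple as a weighted score per candidate tool. Here is a minimal sketch in Python; the criteria, weights, tool names, and scores are hypothetical stand-ins, not the ones the working group actually used.

CRITERIA = {
    # Weight reflects relative importance to the team (hypothetical values).
    "language_support": 3,
    "extensibility": 3,
    "jira_integration": 2,
    "license_cost_fit": 2,
}

# Each candidate tool is scored 1-5 against every criterion by the working group.
CANDIDATES = {
    "Tool A": {"language_support": 4, "extensibility": 5, "jira_integration": 3, "license_cost_fit": 4},
    "Tool B": {"language_support": 5, "extensibility": 2, "jira_integration": 5, "license_cost_fit": 3},
}

def weighted_score(scores):
    """Sum of each criterion score multiplied by that criterion's weight."""
    return sum(CRITERIA[criterion] * scores[criterion] for criterion in CRITERIA)

if __name__ == "__main__":
    ranked = sorted(CANDIDATES.items(), key=lambda item: weighted_score(item[1]), reverse=True)
    for name, scores in ranked:
        print(f"{name}: {weighted_score(scores)}")

The ranking is only an input; as Philip notes, the short-listed tools still went through a proof of concept against the real application before anything was adopted.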

Wade Erickson (17:19):

Yeah, I think that's key, the proof of concept. You want to actually try it. In the case of test automation, we do this all the time at LogiGear, because we have a product that's been on the market over 15 years, but we're largely a testing services company. Many vendors out there don't have a large services practice, so when a company is interested in a tool, building that small regression suite to try it out falls on you: learn the tool, get a trial version, hopefully they have trial versions, and then get it to work. That kind of delays things. How we do it is, if you're interested, we build out five or ten tests against your application, and we have a version that is free and stays free. It's just not the enterprise version, so you don't get all the reporting and so on, but those scripts run on what we call the team's version. We have companies with a couple hundred thousand scripts running on this, huge enterprises all the way down to small one- or two-person teams, and the team's version is a great way to get teams working with it, because they absolutely know it will work; it's hitting their application. Ours covers desktop, web, mobile, and API; we're one of the last few to still support desktop. I think there's only one other competitor that has that dialed in, because that's where we started out. But yeah, I think that's key to any tool evaluation: do your matrix, get the criteria together, do that make-or-buy kind of analysis (in this case it's all buy), and then definitely put it to the test so you don't have gotchas down the road, especially with the controls. There are so many different controls out there, between web and everything else, that you want to make sure the tool can see the controls you're using in your setup. So, training and growing testers. Like you said, within the guild there are different layers and levels of experience. How do you take some of the freshers who come in and bring them through? The guild, you said, is kind of that journeyman-to-master-craftsman model. Tell me a little bit about how you formulate those paths of learning.

Philip Daye (20:15):

Sure. So for myself, you know, I was self-trained. I really didn't have anyone to show me how to do testing, so I had to learn on my own, and I dug into that and did my best to educate myself. Then, as I found others I could learn from, I learned from them. Because of that, and I think just naturally, I like to share what I know. I have always been one who, when working within a team, even if I'm simply a peer and not the lead, if I know something and someone else doesn't, I'm going to show them how to do it. What was a real revelation to me was when I joined Ultimate Software: they had a program where every new tester who joined the organization had to go through a one-week black box testing course. It had been written by the test architects, and it didn't matter how much experience you had. I was already 15 years into my career at that point, and I had to go through it just like an intern coming in and hearing about testing for the first time. It was a way to level-set the way the organization approached quality. We get back to talking about culture in this, right? It helped inoculate everyone with the culture we were trying to establish. Fast forward a little bit: at about the same time I'm forming the Quality Guild, I've also been tapped to be one of the people teaching this black box testing class. From that we began to develop others; they developed a white box class, and I developed one on session-based test management. Part of it can be, depending on the structure of the organization, mandating it: you tell everyone they're going to take these courses. Part of it can be just making sure it's offered. The other thing we did, and that I've seen used throughout, is conferences: make conferences available to the testers, and find a way to finance their attendance. There's always a diversity of knowledge at these conferences; that's part of what's led me to speaking at some of them as well, and trying to share there. There are also books: can you make books available to your people? Last year I decided I wanted to take my current team through ISTQB, so I spoke to my manager and we bought everyone the books needed to study for the exam. Highlight things like podcasts, like Tech Leaders Unplugged; make sure I'm sharing this information and as many different resources as possible. By the same token, as a lead, essentially a manager to these people, I'm constantly having conversations with them about the direction of their career: where they want to go, what they want to learn, what kinds of resources I can find and share with them, what I can teach them myself. I think it's important for any leader, in any role, if you are responsible for the growth of your people, to be proactive about how you do that. Have the conversations, make the resources available. For testers in particular, we need to consider the things that are important to drive quality within our organization. Is it about black box testing? Is it about learning how to do API testing with a tool like Postman? Identify those weaknesses and then work toward addressing them.

Wade Erickson (24:15):

Alright, so we're getting near the end of the show. This is where I like to pivot a little bit toward you as a person and your career. From your profile I picked up that you have a degree in mathematics, but early on you jumped into software QA and you've stayed there for quite some time. Tell me a little bit about that story, about how you came to know software QA and built the passion to stick with it for so many years.

Philip Daye (24:48):

Sure. It was kind of a circuitous route that got me there. As you noted, I have a degree in mathematics. When I came out of college, I was looking to go into engineering, preferably some sort of aerodynamic engineering, and there just weren't jobs. There was nothing for me, not even an internship. So I ended up working in a completely different area: the legal side of real estate. I worked for an attorney and I worked for a title company, and I got tired of that. I wanted to get back toward technology, so I took a stab at it by taking a tech support role at a company that happened to be an aggregator of real estate data, something I had in my background. And they said, you know, we need someone to look at this data and make sure that what we're getting back out of our system is correct. That was my first introduction to testing. I had no idea what I was doing, and they didn't really know how someone should test, but it was a natural fit; I just seemed to gravitate toward it. So what did I do? As I said, I'm self-taught. I went to Barnes and Noble and bought the first book on the shelf I could find on software testing. It happened to be Cem Kaner's Testing Computer Software, which to me is still the gold standard, because I learned so much from it; it was so valuable. As I got into it, I found that my background in mathematics, that analytical way of thinking, really fit well with being a tester. Shortly after I started doing manual testing for the organization, we were ready to bring in automation, and then the technical side and the bit of programming background I had, everything just fit together so beautifully. I probably could have gone and been a developer, but I found I really gravitated toward the testing, toward taking things from the beginning: from literally writing up the requirements the salesman would bring me, to delivering them to the developer, to then testing it and making sure it met those requirements. It was a small company, so I wore all the hats. I was the guy who did the builds, I wrote the user documentation, and I went out and did the user training. As I stepped out from there and started to see the broader world of testing, it was just the right place for me to be. I've really enjoyed the journey, working with different companies and different people and learning along the way, and I'm still learning to this day.

Wade Erickson (27:48):

Yeah, definitely. And I like that point you brought up, that a lot of folks come into QA with a developer background, because that's the job that's available when you're a fresher, right? You come out of college coding, then you jump into a company and you're testing everybody else's code. It's kind of a rite of passage to go through QA on the way to becoming a developer. I actually think that's a great model, because you learn so much evaluating other people's code, and if you can get into some unit testing and see how others write code, that's exposure to the senior developers. I think it's a great path, but not a lot of people stay with it, you know? Like you did, you found these are the pieces of the puzzle you like to see, ensuring they're released in the best way the company can present. So hats off to you for the career path you've had, for leading others, and for continuing to support them over the years. Alright, let me wrap up by introducing next week's shows; we've got two. On Monday, June 24th, we have David Hood, CEO and Founder of 42 Robots AI. We're going to get quite deep into AI, and I'm not talking just generally about AI; we'll cover many, many areas of AI. I've spoken with David before, and he has a great understanding of where things are going and where they've been. The second is Tuesday, June 25th, both at 9:30 Pacific Time, with Letta Araman, DevOps Automation Senior Product Manager at IBM. So we're going to be talking DevOps and other automation concepts with a great talent over there at IBM. It'll be a great week next week, so join us for both shows. As always, they'll be available live, and the recordings will be available to watch immediately after. So thanks so much, Philip, for your time today, sharing your QA experiences and your background. Of course, with LogiGear, that is our history. Our founder wrote two books for Wiley Press on QA testing; the origin of this company was an education company that moved into consulting, and now we have hundreds and hundreds of testers offshore and provide testing services as well as a product. Normally we don't have guests who drop right into our service line, but it was a great joy to talk with you today about the thing we have in common. So again, I appreciate your time, and everybody else, next week we'll see you here on Tech Leaders Unplugged.

Philip Daye (30:50):

Thanks, Wade.

Wade Erickson (30:52):

Sure thing.

 


Philip Daye

QA Team Lead

A seasoned software quality professional with over 25 years of experience in the field, Philip is currently the QA Team Lead at EMARKETER. He has a diverse background as a tester, manager, architect, and leader, and has worked with companies of all sizes to ensure the delivery of high-quality software. Philip is deeply committed to staying current with advances in the field, and actively shares his knowledge and experience with others through speaking engagements at conferences and meetups, as well as by founding internal communities of practice.