Putting Impact Back in 360 Feedback Surveys

If you have been in HR long enough, you've probably run your share of 360 feedback surveys, 360 assessments, or 360 reviews in some shape or form. While designing the process and output takes much of our bandwidth, 7 out of 10 HR leaders say they cannot tangibly tie this massive exercise to talent development outcomes. It doesn't have to be this way.
In this episode:
  • How 360s become a mundane exercise
  • How to adjust or revamp the process in today’s work climate
  • What to do after 360s to deliver a tangible impact
  • Q&A with host and guest speakers




A big hello to all of our beloved people and culture practitioners. Delighted, as always, to have you join us today. I'm GC, your host and people science leader at Mesh.ai, and someone who considers themselves extremely blessed to have worked with and learned from hundreds of people and culture leaders all over the world over the last fifteen years.

When it comes to solving for business performance through their people, modern companies find themselves almost rebuilding the plane as they're flying it. With that said, the future is already here. It's just unequally distributed.

And that's where the performance puzzle comes in. A show where people leaders from around the world volunteer their experiences and playbooks to help you navigate the intersection of people's strategy and business success.

Today, we're gonna learn about how to bring the mojo back into that age-old program of three sixties.

To discuss the topic of putting impact back into three sixty feedback surveys, we have an absolutely mouth-watering duo of experts, if I can say so myself. We've literally pulled off a sizable coup by having these two gentlemen agree to talk to us about three sixties today. And I think that was quite evident in our warm-up session in the run-up to this one. In no particular order, allow me to first introduce Alan Church, an organizational psychologist and recognized thought leader in talent management with over thirty years of experience, including global corporate executive positions as well as external OD consulting.

Alan was formerly the SVP for global talent management at PepsiCo.

By his own admission, he's someone who's transformed from being purely a development guy to being a holistic talent management practitioner. And interestingly enough, he's successfully relaunched the three sixty program five times, for five different CEOs and, as you can imagine, five different business strategies.

Incidentally, side note: he's also authored two best-selling books on the subject. Thank you for joining us today, Alan. Thanks for having me.

Next up, we have Jack Zenger. Jack's a best selling author, speaker, and a national columnist for Forbes and Harvard Business Review.

With more than five decades of hands on experience, Jack is rightfully considered a world expert in the field of leadership development and organizational behavior.

He's the CEO of Zenger Folkman, a research and consulting outfit that works with some of the fastest-growing companies in the world and also has its own proprietary methodology and approach to three sixties. And as you can imagine, Jack himself is well known for his compelling research and inspiring stories. Welcome to the show, Jack. Such a pleasure to have you here with us today.

My great pleasure, GC.

Perfect. Folks in the audience, we're here for you as always. Now is a good time for you to check out three cool features on the nav bar at the bottom of your screen. First up, the chat section, where I highly encourage you to share your thoughts, opinions, and experiences as you listen in to our speakers. The second one, the poll section, where you'll be able to participate in and consume results from our live poll. And last but not the least, the live questions section, where I encourage you to post questions for our speakers as and when they occur to you. As the moderator, I'll try my best to weave these into the conversation.

As always, no pressure on me.

Now, given that both Jack and Alan are simply three sixties personified, I'm amply aware that I'll need to do a really, really good job as moderator to help unpack most of their super deep perspectives in the next fifty-odd minutes or so. But what should help us cover good ground is the good old agenda.

And true to the format of the show, we're gonna spend around ten minutes trying to discover why companies are solving for bringing impact back into three sixties today. The meat of the show will be spent basically learning from Alan and Jack what the healthy choices around the core design principles, or levers, of a three sixty program would look like today. And last but not the least, we're gonna lean on their experience to see how to get that lovely design out of the gates so that it's implemented the way it's desired.


What's the very purpose of 360s in today's day and age, in one line?


I'm gonna warm up our experts with a nice thirty thousand feet warm up question.

Let's start with you, Jack.

According to you, what's the very purpose of three sixties in today's day and age, in one line? Let me make that a little challenging.

I think it's to identify and develop better leaders.

Bottom line, that's what it's all about.

It's the most practical, well-proven workhorse tool that we have available.

Identify potential leaders and develop them into being leaders.

Alan, over to you.

Yeah. I love that. I was looking at the definition that we created in our 2020 Handbook of Strategic 360 Feedback. I looked at it, GC, and I was like, well, I can't read this thing, because there are four editors and we wrote it together and it's long.

Right? So long it would kill everybody. So let me try a shorter version. And this is what I came up with on my own.

It's not that far off, but: three sixty is a strategic, data-driven process for driving self-awareness, informing leadership development, and enhancing talent and organization decision making.

So it's data driven, I think it needs to be strategic and linked to the business, right, it drives self-awareness, drives individual development, and drives, basically, organization change and talent management. So when it's done well, it's doing all of the above. You can't always do all of the above easily, but when done well, it's firing on all those cylinders at the same time.

So if I oversimplify that: get the right data, share the right data with the right people, and then let them improve based on what they learn from the data.

And, you see, make sure that the organization is using the data in the right ways to inform decisions, or development decisions. Right?

Yeah. I agree with the first three. I think the fourth one, and this is where Jack and I will have some fun today, I think. Not that we disagree on this.

I don't think so at all. But just from the point of view of, as we were saying before we got on here, my orientation, from a build perspective, is an OD, an org development, perspective. Right? So I think you diagnose, you build the content, you make it adaptive to the strategy, based on good science, but you figure out how that's gonna work. And then when you use it, you figure out how to make the most use of it in the organization.

Sometimes it's only development.

In my experience more recently, it's mostly talent management as well. And that's where Jack and I, I think, will disagree.

To be very honest, Alan, the additional thing you added almost guards against the misinterpretation of reasonably high-quality data, which leads to misdirected objectives or poor outcomes per se. And that makes me think. Right? Because three sixties have been around a while.

And let me throw this to Jack. The very fact that Alan felt the need to clarify that fourth pillar means that there are certain common misconceptions that we're just unwilling to unlearn.

Have you seen certain common misconceptions across all of the clients that you've advised that are making us have this conversation about why three sixties have possibly lost their impact in the way they're applied today?


What are some common misconceptions organizations have when approaching 360s?

I think what's happened is that, in an effort to make the whole process more palatable...

Years ago, when it first started, I think there was a way of trying to allay people's fears and anxieties by telling them that this was for their development purposes and it would be confidential and privileged.

And from my experience, the great majority of companies that are doing three sixty today are using it that way. They're keeping the data very much centered around the individual about whom it was collected.

And, not wanting to have his or her feelings hurt, they have kinda protected that data.

I totally share much, I think, of what Alan believes and has been practicing, and that is: it's got far more potential than that. Its lack of impact, in terms of how much it impacts people individually and the frequency with which, you know, what percentage of our participants actually make changes in their leadership behavior, those two things could both be greatly increased if we changed our underlying philosophy about it.

Listening to you there, Jack, and trying to tie the threads between what Alan just said and what you mentioned: if I marry what Alan mentioned about self-awareness being one of the objectives of the process with your entire philosophy around helping those identified folks grow into the leadership potential that they have...

I guess an immediate and obvious design choice from that objective would be to not hold that data centrally, but to democratize and circulate that data where it can make that self-awareness impact per se. And that tempts me to ask you, Alan: just like Jack mentioned, the overall objective needs to be accurate identification of potential and then development into that particular potential. What would you say are some of the more contemporary and strategic objectives that one can aspire to deliver through three sixty programs, especially in today's context?

Yeah. That's a great question.


What are some of the more contemporary and strategic objectives that one can aspire to deliver through 360 programs?

So it's interesting. And, Jack, I don't know your benchmark, there are a lot of benchmarks out there, but one of the ones that we've done is with big companies that have, like, large talent management functions, right, which is not every company by any means.

In those companies, it's something like sixty-six percent using it for assessment for high potentials, and sixty percent for executives for succession.

So I think the more sophisticated organizations, with larger groups of people, do the work well. Right? I mean, to your point, they're using it for more than just development. But I think, GC, the best way to think about the apps you can use it for is, you know, set up the strategy first.

Right? So it's gonna have to be development no matter what. Like, I don't think you should ever do three sixties and never give the results to the person. I doubt many do that, but you never know.

Right? I mean, there are some external assessment houses that might do that, actually. It's possible. I think it's a very small percentage.

So: always development for the individual, but simultaneously value for the organization. I think that's where people fall down these days. Right? Three sixties have become, you know, commoditized.

And I worry about the democratization of three sixty too, because the more you give it to people to do on their own, the less likely they are to follow, or know, or even care about best practices and things we know from forty years of doing three sixty. So that makes me nervous. But I do think development is key. I think having information coming back that's valid and actionable for the organization is what gives the organization a vested interest in doing it.

So figuring out whether that's, you know, best in class in terms of assessing potential, which requires some extra work, right? You can't just do it out of the gate with any three sixty. Potential is one. Another one is succession planning: figuring out who really has the right characteristics to get there, or what gaps need to be closed, from a critical-experience point of view and a development point of view, to get them there. Right?

Using it to inform talent reviews, and not so much about knocking people off in a nine-box. Like, I'm not a fan of knocking people off or having, you know, one thing happen to them. But I am a fan of saying: listen, you have this sort of idea of internal potential, and managers have a point of view, and data is a different point of view. So let's talk about where they line up and where they don't.

So I think if you can use it to good effect for the organization, for leaders, they'll buy into it. And by them buying into it, individuals buy into it too, because it matters to them. Right? I can't tell you the number of times I've built three sixties.

Sometimes at Pepsi, but also consulting, and most recently I've got a couple of engagements going, where, with the existing three sixty, nobody cares.

I mean, the individual likes to get the feedback, sure, but they don't really do anything with it, because there's no accountability. There are often not a lot of resources for development, and managers don't really pay attention. So why would the individual really care unless they were super self-motivated? And as my mentor Warner Burke used to say, the only people who actually take value out of something like three sixty on their own are the people who are already good at it: already good at leadership and already good at self-development. Right? The ones who really need it need help.

The more I hear both of you set the tone on the objectives for three sixties, the more I realize the importance of the development data that you're generating through the three sixty program. And I'll build on, you know, the right framework: the way to collect it, the way to disseminate it, and, just like Jack mentioned, who it's for. Are they getting it? Are they understanding it well? So I'm now gonna segue into the meaty section of this conversation, which is designing the playbook. And I'm gonna start off with the underlying framework under which companies usually collect this data in three sixty instruments themselves. Right? The very rubric, and we've seen various shapes and forms of those.

Could be company values, could be leadership principles, could be done in house, could be kind of outsourced to external proprietary frameworks.

Is there a method to the madness, Jack? How should I arrive at the right rubric for me to base and build my three sixty instrument on?

How should I arrive at the right rubric for me to base and build my 360 instrument on?

Well, I'm not sure I'm able to answer what's the right one.

I can tell you that what we've done is we've said, let's organize around the concept of leadership capabilities, or competencies.

Let's empirically select items that we see differentiate high-performing from average or poor-performing leaders.

So if one of your objectives is to identify potential, and you really believe that there's a correlation between leadership behavior and business results, then you wanna identify people who have those positive capabilities and behaviors. We follow the competency model, but that term is obviously up for grabs.

We think that by doing this empirically, by looking at thousands of items and saying, okay, which of these really do differentiate high and low performers, you can efficiently create an instrument that captures the most important capabilities.

And so, you know, years ago, we had sixteen differentiating competencies that emerged.

We've, in the last three years, added three more, because the world evolves and cultures and organizations change.

But we think that if it's empirically done, you have a better outcome than the practice that, I think, often happens in organizations, where a group of senior executives sits around a table and sort of chooses their favorite questions, and that becomes the three sixty degree feedback instrument. That's probably the least effective way of getting one. No, absolutely. I actually agree.

Can I jump in, GC? Absolutely, Alan. You don't need to ask. Yeah, I would agree, Jack. And I think your approach is excellent when it's coming from an external perspective.

Right? When someone's going to a firm, looking to an external organization to do the work, I think having a deep research base where you've looked at and validated items and competencies is fantastic. My experience, of course, as you know, even though I'm consulting now, all I've ever done, ten years before Pepsi, twenty-one years at Pepsi, and now two years out, is build custom models.

Right? So that's my bank. But, you know, kinda to your point, and I would tell anybody this, I've talked about this elsewhere: leadership is leadership.

Right? So I would go after leadership competencies, or a leadership framework, or leadership capabilities, whatever you wanna call them. But basically, eighty-five percent of all models are roughly the same. And I'm sure, Jack, yours is too.

That eighty-five percent is what's your core. Right? I mean, you know, you have it all. But they're kinda the same.

And so from my perspective, especially when integrating it, either for culture change, which is what I did the first fifteen years of my career, or talent management, the second sort of fifteen...

It's getting the language that fits the organization. So that's where I kinda lean a little more: basing it on science, exactly like you have, Jack, but then tweaking it for the leaders so that they buy into it. Right?

So the same thing: you interview all the top leaders, you get their perspective. You say, look at this, this really fits what some models have.

I'll plug some things in, make it work. I have some good items that I know work, right, to your point, and then you kinda label it. And a great example is a client I'm working with now that had a three sixty that didn't have much of an impact.

They got a new CEO, and in interviewing folks, this concept of sense of urgency came up. Now, I've never seen sense of urgency as a competency. It's interesting. I get it, but it's usually not a competency you call out at, like, the level of the big seven.

Right? But that's what they need. And underneath it, it's kind of decision making and judgment. I think, you know, the good things are gonna be there.

But they need that label to drive their change and their leadership thinking around that. Another good example is inclusion. I'm sure, Jack, one of your competencies now has to do with inclusion. But when I was joining PepsiCo years ago, back in the Warner Burke days, that didn't exist, not in that same way.

You know what I mean? Not as a term. And so in one of the models at PepsiCo, we did a couple of different builds, and inclusion went from an idea, maybe an item, to one of seventeen sub-competencies, to one of the big seven, all the way up to being huge. Partly cultural relevance, but also the organization needed it.

So I'm a little more bespoke than you, but I think we're both on the same page: it must come from research, must come from good science, and then make it work. Right?

Well, and just to piggyback on your comments: despite the fact that we do have a proprietary kind of model, more than half of our implementations with organizations involve some degree of tailoring, using their language, some of their terminology that resonates internally. So I absolutely agree that you need to be responsive to the culture of the organization.

Yeah. And I would say, GC, and I bet Jack agrees with this completely, but I don't wanna speak for him: I think the idea of going to get an individual three sixty off the shelf, just self-contained, is fine for, like, coaching individuals and some leader programs, maybe. Some firms out there do a good job, CCL, you know, I mean, it's fine.

But that's not the same thing as taking it internally into a process like Jack's talking about, or I'd be talking about, where it's really strategic to the company. Right? The individual off-the-shelf is great for individual development and coaching. It's not what you should necessarily be doing, in my opinion, to drive a big agenda, right, a big strategic three sixty.

In fact, to that extent, the immediate question that pops up in my head is: you might have the most well-researched rubric or framework, which you then, taking Jack's starting point, marry with your internal organization's context, tweaking it. In fact, I've seen the extreme of that spectrum, where certain CEOs come in and say, hey, this year, this particular competency or leadership principle is all we will assess potential against, or identify people as high potential against, from that cultural-change perspective. But very often we kind of under-index on how well understood this framework is, or how it shows up in everyday behaviors, by the very participants of the three sixties themselves, whether it is the assessors or the assessed or the peers who are coming in.

So, Alan, let me begin with you. What have you seen as some of the more effective ways in which, once you've got the right rubric in place, you can really socialize it, so that it becomes part of, let's say, water-cooler talk, parlance that is well understood by the organization?


Once you've got the right rubric in place, what are some of the more effective ways in which you can really socialize this?

Yeah. It's funny because, I mean, I can't say I'm a marketer. Right? But I come from a company that has a big marketing orientation.

And in the early years of building, so I built the leadership model five times at PepsiCo, right, through different CEOs and CHROs and cultural agendas. And the first four times, I didn't get the name right, I have to say. I mean, it was, like, leadership effectiveness model, individual, you know, leadership competencies, and it wasn't until the last one...

That we really nailed it, GC. And so the little story goes, because it's all in branding, in my opinion. Right? It certainly was at PepsiCo.

And communications, and getting people to buy into it. Right? So, in the last model we did, several years ago now, when Ramon took over as CEO and Ronald Schellekens came in as CHRO, came back to PepsiCo, really...

We built a new model. Right? We said, alright. You know, Indra Nooyi's gone. She had her sophisticated model. It was great, but we're now moving to a new one.

Content underneath pretty similar, research very similar. We still did all the good stuff you're supposed to do, did internal, you know, validation work and prediction.

We came out with this model, and I was told: Alan, this is great, deep science, we love it, thank you. Six months later, I got a call: Alan, I can't remember your model.

And if I can't remember it, I can't use it.

So we went back, and, alright, a day of brainstorming, GC, and we came up with the Great Five. And there are some articles out there if anybody wants to see it. But the Great Five is real simple. Hey.

You can remember "great." Right? Good to Great. It's like a rip-off of nineteen eighties books.

Right? I mean, you know, way back when, right? But it's, you know, growth, relationships, agility, effectiveness, and thinking.

Right? And so, underneath those, there's actually science and competencies that are more like what we would normally expect, but everybody can remember the word "great." Five helps, because people can hold seven plus or minus two in their brain. So they can hold five, they can hold the word "great," and they can pretty much deal with those five words. And I remember going to our CEO and explaining it to him. And he's like, well, there's a lot of science, a lot of stuff here, Alan, and I'm not sure we can use this. And I said, well, just answer the questions.

Do you like people who are growth mindset oriented? Sure. Do they need to be good at relationships? Absolutely.

Right? Do they need to kinda execute efficiently? Of course. Right? Are they agile? Well, in this day and age, of course.

And do they think strategically, H1, H2, H3? Well, of course. Oh, I see. It works. Right?

So that sort of branding, for the first time, that model took off, GC, at Pepsi. Once we launched it, people loved it. They're walking around with it. They love it.

They've got all kinds of stuff, and it never happened before with the old models, because they were too IO-psychology specific, or too normal, competency-ish. Nobody could remember them. This thing has legs.

And it's going great guns, you know, two years after I'm gone. So I think it's being able to package it in a way that people understand: understand what it's used for, understand what's in it, and then being able to actually leverage it in conversations. You can have a Great Five discussion about somebody without knowing anything other than the five words.

Jack, I think when you listen to that, there's a part of you that's wondering: hey, with all of this marketing, are these instruments even, you know, valid? Has their face validity been checked? But once all of that is done, I'm pretty sure, in terms of your own clients where you run your own proprietary approach to three sixties, you've encountered the quality of feedback coming into that tool sometimes not being up to scratch.

Over and above over-communicating or simplifying the underlying rubric, just the way Alan mentioned, are there other interventions that you suggest to organizations, or have seen successful, in at least improving the quality of what's coming through the instrument itself?

Well, certainly having a simple and memorable mnemonic device, like the one Alan just repeated, is very useful. Nothing beats simplicity. And so I think the most effective instruments do have that quality.

You know, in our case, we've described the overall leadership capabilities with the metaphor of a tent.

With five poles holding up the tent: the middle pole being character, one pole being the ability to produce results, one pole being the personal characteristics and capabilities that the individual has, one being leading change, and the final one being interpersonal skills.

That simple model has been very helpful.

Having a memorable way to hold onto what can be a very fuzzy, very broad concept really does help.

Certainly, to implement this all more effectively in the organization, training the managers to understand it, to be able to coach to it, and giving them the confidence that they can really dive into it with their subordinates...

...is really helpful. And I think the big opportunities for making the three sixty degree feedback process more impactful...

...really are in how you train both the recipient and the person who's providing the coaching, and give them a greater skill set, because both of them have to work together in order for it to succeed.

And, you know, one of the things we've become very attentive to lately is: how coachable, how amenable is this individual to being given feedback? How much is he or she willing to pay attention to it, respond to it, really mull it over, and then finally act on it? We have focused almost all of our attention on the giver of feedback. What we haven't done is really focus on the person who is receiving it. What can you do to prepare the individual to be more receptive to useful feedback? Because fundamentally, the three sixty process is a feedback process, designed to be helpful.

And the fact that not all people take advantage of it says we haven't done an effective job of helping them see: hey, this is really important, and you need to hold yourself accountable, and hopefully your manager can help you be more accountable in making it happen.

Yeah, I love that, Jack. I think, you know, helping individuals interpret it and make sure they understand insights out of it is where I get into a little bit of AI concern, GC. So I know we'll get there later, but helping individuals make... Feel free to dive right in, Alan.

Well, first, I agree totally with Jack. Like, managers just don't naturally know how to use these tools, and don't know how to give feedback anyway, regardless of three sixty. Right? So they need capability, period.

Three sixty is just another layer. But the other group that also needs help, and I'm sure you do this, Jack, is HR. I hate to say it. But just because you're an HR generalist, and, I mean, obviously you can be very strong at that, you may not know these tools as well as you think.

Right? And you may not understand how they're supposed to work, how they're built. So that's another group of people that I've worked with over the years, both at PepsiCo and externally, building capability on how to interpret and give feedback. And then the fourth group, honestly, is the leaders in a room who are talking about people based on the data. Right?

So they're probably not giving feedback themselves, these senior leaders, right?

But they're listening to feedback being discussed, in some form or fashion, in a talent management context, and they make snap judgments based on what they see as well. They need a little bit of training too. You know, just because it's data doesn't mean it's perfect. Right? You need to think about not just one score, not boil it down to one score, but get some perspective on the different points of view in the three sixty.

Right? Are they equally consistent? Are there differences? And do you have somebody who's really good with managers and terrible with peers?

You know? So there's some training you have to do, I think, with everybody who touches the data, despite the fact that most people think it's intuitive. I don't think it is intuitive to get it right.

I love where this is going. So I'm gonna take a quick step back, because we've moved on in our conversation from the framework under which to get the best-quality data, and now we're almost solving for the garbage-in, garbage-out problem, if I can oversimplify it that way. And what I'm hearing both of you gentlemen mention is that there are a few personas who are active participants in this process.

They have a certain set of skills that they need to become better and better at if the outcomes of the three sixty are to be impactfully delivered, and then comes actually understanding these two pillars and solving for them. So if I quickly summarize, I think the biggest point that Jack mentioned was that it all starts with the individual: is the individual open enough to receive this authentic feedback? And that relates back to the objective that you're setting, and how clearly you put your money where your mouth is: this is development focused.

You commit that it won't be used behind closed doors for some sort of assessment or some sort of performance impact. But that's another conversation for another day. The point is, everything starts with the individual for whom the development journey needs to begin.

Jack then built on top of that, and Alan also chimed in: the next biggest party is the person on the other side of the report, trying to co-build that development journey, which is the manager, or let me go to the extent of calling them a coach. And then there are the leaders, or the talent council folks, who weigh in during the talent reviews, who need to understand how certain behaviors or traits show up in order to make mature judgments about them. And then you added the changing role of the talent management folks who are driving this.

I think one thing we've heard you mention right throughout the conversation is, number one: think about the design not as a set of questions, but as a rubric that is well understood and scaled.

And now we're talking about running this almost as an upskilling program for the participants of the three sixty. So, as always, I'm tempted to double-click, and I'll start with you, Alan: from your experience on the other side of the table, as the practitioner implementing this, let's say at Pepsi and before.

From one three sixty to another, how did you solve for improving these feedback-giving, feedback-receiving, or coaching skills?

Before I move on.

Yeah. I think the biggest thing for us in changing it, and this gets a little bit back to how you make it stick too, is helping the organization understand why it's changing. Because, to Jack's point, simplicity is good, and I would say entrenched things are sometimes better than new things, unless everybody hates the entrenched thing. So even if they don't use it very well, people like their existing one and they're resistant to learning something new. Typically. Not always, but often.

So, you know, (a), having the CEO behind the model, right, sponsoring the model. And I'm not just talking, hey, three sixty is good. I'm talking they want to see data on people because they believe in what you're asking. That gets you a little to the language piece, GC, that I was talking about.

Right? So make sure the CEO buys into, and/or has some of his or her language in, the content. But that aside, I think it's communicating why it's changing, being super transparent on what you're doing with it. Like, what's the purpose of this data?

Where is this process going? What's it based on? You know, is it based on science? People appreciate science, often, not always. It's based on science, it links to our culture, it has predictive power, whatever it is.

How are we gonna use it? Who's gonna see it? And it drives me nuts when people get up and talk about three sixty and say it's anonymous. It hasn't been anonymous since nineteen eighty-five, with paper.

It can't be anonymous unless it's paper. Anything online has gotta be called confidential at best, just because there's an online trail. Right?

I mean, let's face it. But that aside: who gets to see it? Who's delivering the feedback? And what goes with it?

You know, another example: another client I'm working with, and we used to get this at Pepsi too sometimes, but we built a new tool for the new CEO, and we're delivering the feedback one-on-one, and we're the facilitator. So we're helping make sure that they really understand the data. This one's new to them. And their question to us is, okay, great.

What's next?

What development resources are available to us? And that's the issue. I think a lot of people have three sixty in place and haven't really put the same energy into establishing the development resources, and the tracking and accountability, both for managers and the employee, to change.

And that lets it fall flat. So I think you have to have all of those things together, GC. It's like a package. Hey, here's the new three sixty.

Here's why we're doing it. Here's who's behind it. Here's what it's based on. Here's what it's used for.

Here's what you can expect. Here's who gets to go through. And here's why and how we expect to support you in this, and what we expect out of you when you go through it.

Communication before the three sixty, efficiency during the three sixty, and then closing the loop after the three sixty: those are my top three takeaways.

Jack, just coming back to the skill issue in terms of the participants of the three sixty. I'd like to lean on your significant experience of working with business leaders and leadership teams. In fact, you are a performance leadership coach yourself.

In those talent round tables, in those councils, you know, what's


What's the typical facilitation guide to help leaders leave bad habits and poor examples off the table and bring in that consistent yardstick?

What's the typical facilitation guide, or how do you help leaders leave bad habits and poor examples off the table and bring in that consistent yardstick, while they're going through an individual at a time?

Well, that's a wonderful question.

I would like to just add on to what Alan has said: as you collect all this data, one of the useful elements of a three sixty degree process is that it does allow you to aggregate data and to stand back and say, okay, collectively now, what are we seeing in terms of the common opportunities for improvement that seem to be really important?

And that then informs and guides the organization in making available various developmental activities and opportunities.

And so I think one of the things that organizations have done a very good job at is looking at aggregate data, by different departments, by different levels, by different age groups, by gender, and saying, you know, are there some things which are uniquely important to that group? And therefore, how can we prepare our managers to be more effective at helping them to develop?
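The aggregation Jack describes can be sketched in a few lines. This is a minimal illustration, not any vendor's actual pipeline, and it assumes each completed rating is a flat record of (group attribute, competency item, score on a 1-to-5 scale); the field names and data shape are hypothetical:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical record shape: (grouping attribute, competency item, 1-5 score)
ratings = [
    ("Sales", "coaching", 3), ("Sales", "coaching", 4),
    ("Finance", "coaching", 2), ("Finance", "coaching", 3),
]

def aggregate(ratings):
    """Mean item score per group, to surface group-level development themes."""
    buckets = defaultdict(list)
    for group, item, score in ratings:
        buckets[(group, item)].append(score)
    return {key: round(mean(scores), 2) for key, scores in buckets.items()}

print(aggregate(ratings))
# {('Sales', 'coaching'): 3.5, ('Finance', 'coaching'): 2.5}
```

The same grouping key could just as easily be level, age band, or gender, which is the slicing Jack mentions.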

I think the second thing that you can do, in terms of implementing this, is that I don't wanna sell short the combination of the three sixty degree feedback data with the other things that the leadership team can do for that person's development.

And I believe, as an outsider looking at PepsiCo, for instance, one of the things that they've done to a much greater degree than most organizations is to say, okay.

Developing self-awareness is important. Having good data, having good feedback is important. But it's also the opportunities that the organization gives you, in terms of assignments, projects, spot teams. The nature of the work that you give a person is itself a great developmental opportunity and tool, and PepsiCo's willingness to move people, to transfer people, to trade functions, it's that kind of attention to real development. So in the talent discussions that I think are the most effective ones, it is that conversation that says, okay: what are this person's strengths?

What maybe stands in their way?

And what are the experiences, the assignments, and the projects that we can make available to them, that would truly make a difference in their career.

And I think it's that attention to the combination. The feedback's one thing, but as Alan said, the developmental resources may be a formal program of some kind, or a course or a seminar that they can attend. But very often it's not that. Very often it's the other things that you can do that really help this person make progress on their leadership journey.

So it's building those discussions into the talent management reviews that I think is the real payoff.

Yeah. And if an organization does that successfully, Jack, I guess you'd agree that it starts to solve for how open an individual is to go through a three sixty and receive that feedback, because they see all of these as standard practices in their organization.

The second thing that jumps out at me, Jack, when you mention that is, I always learned that andragogy tells us seventy percent of learning is by doing.

So if you really want an effective development journey after a three sixty, true to the objective you set out at the top of the conversation, which is to develop more people into potential leaders.

Seventy percent of that development journey has to be those stretch goals, rotational opportunities, horizontal and vertical enrichment, you name it. And to Alan's point on that, if I'm a talent management practitioner designing my program, I'm making sure my leaders are walking into the conversation prepared with certain opportunities that they need to fill, ideally from internal talent.

As tactical or as strategic as you may find that. Staying with the garbage in, garbage out problem: we've spoken about skills, we've spoken about personas.

Two other schools of thought come in, in terms of the accuracy or the quality of the data. One is the network from which this data is crowdsourced. And the other is the frequency with which this data is crowdsourced. Let's pick up the simpler one, or the quicker one, to double-click on.

So, with annual performance reviews, or any surveys that involve periodic form filling, sixty percent of the data is marred by cognitive biases.


Do we need to be relooking at the frequency at which we implement our three sixty survey tool?

Do we need to be relooking at the frequency at which we implement our three sixty survey tool? Is once a year always going to need to be moderated for those cognitive biases, because humans don't have the memory of an elephant?

It's an interesting question. I think I will have to say it depends a little bit on what the three sixty is for, in part because it matters whether it's the only tool or it's in a suite of tools. Jack and I were talking earlier: some organizations, PepsiCo included, have multiple multi-rater tools, if you will. Right?

So there's a full three sixty based on the Grade five. There's a development check-in mini three sixty that happens twelve to eighteen months after you've gotten that data, if you've been assessed, to help on just a couple of things, right, not full blown. And then there's an annual upward feedback tool only, so just direct reports, that focuses today on the PepsiCo Way.

It used to be on manager quality, but it's basically the cultural items.

So it depends on the purpose. You know, I don't think three sixty by itself should be used for performance management. I think upward feedback is great for performance management on manager behaviors, but not the three sixty. Leadership is a different construct to me. So to me, no, it shouldn't be more frequent than once a year. I think people don't change enough, and they don't get enough time to internalize, to do it that frequently.

Other types of feedback are probably better for that. But I wouldn't go more than two years without giving somebody a formal three sixty assessment. Whether it's part of a culture process, as we talked about, or part of a talent assessment suite, part of a program, or just individual coaching, I do think two years is enough time that if you haven't seen some change, the person is either not working on it or not going to work on it.

Jack, what do you think?

I would totally agree with what has just been said.

Two years, that's the outside limit for a three sixty process. I will tell you that we recently, in our small organization, decided to do a three sixty again, because, you know, we say, hey, the more raters you have, the better it's gonna be. And so on average, you're doing about thirteen or fourteen people each rating every individual.

For a small organization, or for a large organization, this eats up an enormous amount of time, especially if the feedback is with your peers. It's one thing if it's upward feedback only; that means the subordinate only has one person he or she is gonna be rating.

But if you're doing all the peers and all the direct reports, boy, it becomes a real administrative nightmare at moments in time. So my view is very much like Alan's. Two years on the outside for the three sixty, but have other tools: a mini survey that focuses on, what were the two items that you were gonna work on? And let us give you a pulse check on how they're doing.

That's very useful, and a very efficient way, and it doesn't fatigue the organization in doing it.

Yeah. I love that, Jack. One thing struck me when I first got to PepsiCo. I mean, they're so process oriented.

Right? It's an execution machine. Right? I mean, think about all the chips and soda every day.

But when I got there, there was a process of five thousand executives going through three sixty every other year. Five thousand, from the top down, in one shot. So that's what I inherited when I walked in.

And it was so complicated because, to Jack's point, lower down the organization it's not a problem. But the top two hundred people are all rating each other. And there were some CEOs in the company at the time, like some sector business CEOs, like the head of international, who had forty-seven people to rate, because everybody wanted his perspective, you know.

And it was crazy. We got to the point where we had to create a thing with our vendor at the time called rater overload, and we would get a flag as internal people to say, over twenty, watch out, start dropping people that they'd asked for. Which is crazy, when you think about it. Right?
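The rater-overload flag Alan describes is easy to reproduce. This is a sketch, not the vendor's actual tool, assuming nominations arrive as simple (subject, rater) pairs and anything over a threshold of twenty is surfaced for trimming:

```python
from collections import Counter

OVERLOAD_THRESHOLD = 20  # the "over twenty, watch out" rule from the discussion

def overloaded_raters(nominations, threshold=OVERLOAD_THRESHOLD):
    """Count how many surveys each rater was nominated for; flag the overloaded."""
    load = Counter(rater for _subject, rater in nominations)
    return {rater: n for rater, n in load.items() if n > threshold}

# Toy data: one rater nominated twenty-one times, another only twice
nominations = [(f"subject_{i}", "head_of_international") for i in range(21)]
nominations += [("subject_0", "peer_a"), ("subject_1", "peer_a")]
print(overloaded_raters(nominations))  # {'head_of_international': 21}
```

In practice the output would feed a human decision about which nominations to drop, not an automatic cull.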

So the thing we evolved to relatively rapidly, in a couple of years, and it's still practiced there today, is much more waves of three sixty happening, tied to either assessments, or cohorts of people, or programs, or simply: here's the quarterly three sixty, here are the recommendations for the types of people who should go through, based on when they had it last, whether they're in a new job, how long in the new job, how long in the company. You know, not five minutes after a new job. Right? Six months in at least, that kind of stuff.

And it rolls. And although it means three sixty all the time, in a way, it also means less at any single moment, to Jack's point. And the senior execs don't go crazy.

No, in fact, I'm writing down the two things you mentioned. We're looking at this program as a big bang baseline, which, Alan, you mentioned should happen at least once in two years.

And beyond that, you're looking at the various parts of the development journey and almost pulse-checking the right people at the right time for whatever the short-term development milestones are. And, just the way Jack mentioned, most of that development journey might be stretch assignments, which you're working on with different leaders or different peer sets. Or, let's say, twenty percent might be cohort-based peer mentoring with various folks who are better at that than you. So that nicely segues to my next question, which I had alluded to earlier. And Alan, you mentioned the perils of getting it wrong: hey, I'd love some feedback from my CEO.

Not sure the last time I directly worked with him or her. But there are various ways in which organizations have been trying to answer the question of who is the right set of people in this individual's network to weigh in on this particular three sixty.

You know, we've gone from the days of HR practitioners deciding this top-down, centrally, based on a cookie-cutter set of rules, to, let's say, the other end of the spectrum, where organizations are equipped with org network analysis because they are a dynamic web, a team of teams per se. But


What are some of the healthy choices to make so that you get the right set of people to weigh in you know, during a 360?

what are some of the healthy choices to make so that you get the right set of people to weigh in during a three sixty? Let's start with you, Jack, in terms of what you typically prescribe.

Well, let me see. In practice, what we see is that most participants end up with about thirteen to fourteen respondents.

And those respondents are usually their immediate manager, their boss.

It's usually them selecting a handful of their peers.

Usually there's four or five peers.

And then all of their direct reports, the people who report to them. And then on occasion, we've said you can go to an outside supplier or an outside vendor, or someone that you work with who really knows you well and who you think would provide you with valuable feedback.

We had one client. I won't name the organization. It was a very prestigious university where the dean of the medical school wanted to get three sixty degree feedback. So he invited all four hundred and fifty members of the staff to respond to him.

Well, you can imagine the outcome of that. But anyway, I hope I've answered your question. Our experience has been that, obviously, the manager; obviously, the peers. If the individual has some voice in choosing them, I think it tends to lend credibility and believability to how they value the data.

And so we've made a practice of encouraging the participant to share in that selection process.

If you think that they're avoiding the right group of people who should be giving them feedback, yeah, sometimes you have to intervene a little bit, but generally speaking, people will choose wisely.

So, let people, and enable managers, to weigh in on the selection. Those are the two sets of parties, Jack, that you believe have the clearest visibility into who should weigh in?

Alan, from your experience, the dos and don'ts of the assessors to be selected?

Yeah. I mean, I agree. I think, you know, again, when it was development only, I hate to say this, but it honestly wasn't that important that you had a well-rounded set of people, because it's not going anywhere.

Right? It's just for the individual. If they choose to pick their friends and they get bad feedback, that's on them. Yeah.

Right? If, though, it's for talent management, and it's being used to calibrate talent for decisions, there needs to be some rigor to the process of selection.

And it may not be mandated control, but we experimented with every version of it you can imagine. Over twenty-one years of running three sixties, we tested everything.

At one point we even had manager approval, then re-approval: too much. But I will say, all the systems that we built after about two thousand six have this nomination process involved, where the HRIS feeds the direct reports.

Right? So it's part of the hierarchy. The manager or the participant can take some off or add some, based on what makes sense. Right?

And so the direct reports, for sure, and the manager are fed in.

You know, then there's peers. And what I would say is that there are sometimes two types of peers. A lot of the clients I work with, especially senior people, sometimes need two peer categories, or a peer in some sort of other category, because there are often functional peers and there are peers who sit on the same leadership team. So think of a CFO in Frito-Lay North America. He or she has other people on the peer team, on that Frito-Lay LT, who are different functions, but he or she also has four or five other finance leaders who are other CFOs.

Those are not the same types of people seeing the same behavior. So from a delivery-of-the-feedback point of view, it's really helpful to be able to separate them, if they think that would be helpful. Right? So to me, there's that nuance. We also have a lot of experience with adding dotted-line managers as well.

And today, again, senior leaders have more complex orgs. But dotted-line managers can be very powerful. If a manager and a dotted-line manager disagree on your strategic thinking, just imagine that.

Especially if your dotted-line manager is your functional leader, that is, your career leader, and your line manager is your business performance leader.

Those make for great development discussions. So I'd say, if you have it: manager and dotted-line separate, and if you have two sets of peers, great. And then the one last category you see is "other". Now, I know "other" is a wacky category, and some people use it.

Jack, I don't know if you guys have "other", but we do. We had "other" initially at Pepsi, and people just complained, because nobody knows who went in there.

I mean, you can't control who they are. You don't know who they are. So when the feedback comes back, you're like, well, what does this mean? So one thing we did that's turned out really useful, and I've used it externally since: we use "other" as the development-only category.

So it doesn't count. In other words, it doesn't roll up to algorithms. It doesn't roll up to predictive indices on hi-pos. But it's in there as a rating.

You can see it. And you can see the range, when the number of people is three plus. Right?

And the comments are in there, separated by "other". And that enables people to put in their skip-level managers, clients, executive committee members, who typically rate much harsher, or second-level direct reports, right, double-down dots. And so they can get feedback, as long as they know what that group is, and you have to help them understand that when they're picking raters. But that can be very meaningful.

So: here's how I show up as a leader for my directs' directs. And I'm not gonna get dinged on it, so it's okay to ask them. And I'm not gonna get dinged if I wanna ask the executive committee, even to find out if they know who I am. And it's telling if they can't even rate you on that.

But it doesn't count. And people feel really comfortable with the idea that that's development only. Right? And the rest of it does, you know, feed an algorithm.

So those are the categories, I would say. I mean, customers are great if you have them, internal clients are great. But to me, the key in the end, whatever you have, is making sure the individual can understand their behaviors in the context of the group they're getting feedback from. The worst thing is to get fifteen direct reports and say, I don't know.

I'm different with different people. I have two different types of directs. Well, why didn't you tell us that in the beginning? Right?

So it's really: can I understand my peers and what they're saying about me? Can I act on it? Right? So be very targeted about who's in there and who's not, you know?


I think, keeping an eye on the clock, I'd be remiss if I didn't ask you this particular question, even if it's the last one for this conversation. And I'd like to start with you, Jack. Now, we've spoken about: here's the framework to collect high quality data, here are five or six design parameters.

Here are the healthy choices for how you can get high quality data. But I'm starting with you, Jack, on this part, which is the application of the data. And the first tangible application of the data is usually the three sixty report that gets handed over to the coach or the manager, as well as the individual.

Now, obviously, I've seen reports that are forty pages long. I've seen reports that are almost like baseball cards. But my simple question to you, Jack, is


What are the four or five must haves that really simplify and increase the quality of that last mile coaching conversation?

what are the four or five must-haves that really simplify and increase the quality of that last mile coaching conversation?

You know, the four or five must-haves that you ought to have to enable that high quality conversation on a three sixty feedback report.

Yeah. So I would say that the important elements of a three sixty report, from my perspective, would, first of all, be that it be intelligible to a mere mortal. It doesn't require a PhD psychologist to interpret the data.

Secondly, it helps if it's visually compelling, so that they can see the big picture. They can see their fundamental strengths and where they're really good. It helps if it can focus primarily on the things that they're strong at, but also be very clear with them about any area that really is detracting from their performance. But I think it helps to have the big picture, and it helps to have the individual items so that they can see the dispersion of comments. So the more transparent you can be about the data, the more they can get a real visceral feel for, okay, how does this get compiled, and what does it really mean?

That's important.

I think it's also valuable if part of the report can help them see not only what am I good at, but, in the minds of other people, what's really important. What are the most important behaviors for me in my current role?

And so it's the integration of my capability in that arena, how important it is, and then how much passion I have about wanting to improve it. And I think you can help people put those together: it's a Venn diagram of importance, capability, and my personal passion. I think the instrument also has the capability of providing almost a mini employee survey, so that you can see how your current behavior as a manager impacts your subordinates right now. And so you can ask the subordinates a few questions which are unique to the subordinate.

And it's how he or she currently feels about, you know: is this organization gonna meet its strategic objectives? Is this a place I would recommend to my friends? Is this a good place to work?

So I think you can build into this instrument not only feedback about their current behavior, but also help them see what it's currently doing to the people who report to them. And then finally, I think the most valuable part of the instrument can be the written comments, if you can ask some open-ended questions that elicit helpful, valuable, practical, concrete, specific recommendations.

Asking for a list of all the places where they have any weakness isn't a very helpful question, as far as I'm concerned. But asking if there's anything that's really getting in their way or detracting from their performance, that can be useful.

So those, for me, are the major components of an effective three sixty instrument.

I'm gonna quickly summarize that before I throw it over to you, Alan.

Simple in parlance: it's meant to be read by a salesperson, a finance person, an engineer, not an I/O psychologist.

As visual as possible because that's easier to consume. I think those first two bullet points you shared, Jack, you almost sounded like a marketer yourself.

The third one is essentially: give them their hidden strengths, their blind spots and all of that, but cleanly and simply. Try and personalize or contextualize it to what's important to their current role and their journey in the organization. And last but not least, get them to buy into why they need to work on this by exhibiting the correlations with some impact, which could be growth in your team's performance, or growth in, let's say, the NPS of your team, and so on, as you've actioned some of it. Very, very useful to see a report tying that out to the rest of the journey.

Alan?

I agree with all that Jack said. I love the idea of tying in the engagement aspect, if you will, a bit of that survey side. Yep. And I think you mentioned this too, GC, in a way, and Jack, I'm sure you do this too. One of the things that I've found very impactful is to include two simple items that are not part of the leadership model, if you will. One is leadership effectiveness: simply rating, from all the perspectives, how effective this leader is. So it's an outcome measure.

Right? And the second one is: would you recommend this manager? And to Jack's point, you might only ask that of direct reports, but you get a manager quality item. It's an NPS score of sorts, and you get a leadership effectiveness outcome measure that the individual gets to see. So they get to see all their data, and they also get to see what people think of them, bottom line.

It's about the closest you can get to a one-item bottom line. Right? Either people like them or not.

Would they recommend them? And generally, are they seen as effective or not, by their manager, their dotted-line, their peers, their directs, etcetera? So I think that's really helpful. In terms of format: after years of always building custom formats, I now have the Alan format.

Believe it or not. And the vendor I've worked with over the years has it now, and they use it for me, and that's kinda fun. But really, it's: always have a one-page snapshot of the competencies, kinda how people did overall, high level, not necessarily by perspective yet, and then the top three, bottom three, or top five, bottom five, on the same page. Just one snapshot, period, right, with some norms. That one page can be pulled off and taken into people planning or talent reviews or discussions.

Right? So that can stand alone. And to Jack's point, you don't need an I/O degree to read it.

Then I would have the written comments, because people love those. And if you don't put them next, they're gonna go to the back of the report and read nothing else. You gotta put them next. Then you have all the item detail, to Jack's point.

Right? And I would always use range. Obviously, to protect confidentiality, you don't show range under three raters. But if there's a range, that's really important for people to see.

A three point five with some ones and some fives is different from a three point five with all threes and fours. So I would definitely include an indicator of range. And then at the back, GC, you and I have talked about this, but I probably believe in a few more lenses than Jack does, maybe because I like to deliver the feedback and I have the I/O perspective. Right?
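Alan's range point, that a 3.5 built from ones and fives means something very different from a 3.5 built from threes and fours, can be made concrete. This sketch also suppresses the range below three raters, as he suggests, to protect confidentiality; the summary shape and threshold name are assumptions for illustration:

```python
from statistics import mean

MIN_RATERS_FOR_RANGE = 3  # below this, showing the range could identify raters

def item_summary(scores):
    """Mean plus a (min, max) range, shown only when enough raters responded."""
    summary = {"n": len(scores), "mean": round(mean(scores), 2)}
    if len(scores) >= MIN_RATERS_FOR_RANGE:
        summary["range"] = (min(scores), max(scores))
    return summary

# Same mean, very different stories
print(item_summary([1, 5, 3, 5]))  # {'n': 4, 'mean': 3.5, 'range': (1, 5)}
print(item_summary([3, 4, 3, 4]))  # {'n': 4, 'mean': 3.5, 'range': (3, 4)}
print(item_summary([4, 5]))        # {'n': 2, 'mean': 4.5} -- range withheld
```

A polarized (1, 5) spread flags a leader who lands very differently with different raters, which is exactly the discussion the report should trigger.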

So, Jack, probably more complicated reports than you like, but I like to have a blind spot page. I like to have a page with the top five items by perspective in each column, so you can really see the differences, and then a bottom five, so you can really see which five the manager rated lowest, which five the highest, which five the directs did, etcetera. You can use those two pages from a TM point of view and from a development target point of view in a great way.

I think there's also a role for AI here a little bit, GC. This is probably the only place I'd use it: to give you some suggested areas based on a decent algorithm, probably one that uses key drivers, or outcome drivers, to Jack's point exactly earlier. Which of these are the most important for you from a potential point of view or a leadership outcome point of view, versus they're all just good to have because we believe in them? Right?

So maybe something like that too. I like different lenses because honestly, having delivered feedback as Jack has for many, many years, not every page works with everybody. Right?

Some formats just don't work. You need another lens that does get them to buy in. So I like a little bit of different lenses, to be honest. But I think we're on the same page.

It's gotta be easy and visually stimulating for people to see, interpretable at the summary level for some people, and then have depth for the real quality discussion with the person.
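The range indicator Alan describes, a spread shown alongside the mean, suppressed when the rater group is too small to protect confidentiality, can be sketched roughly like this. All function and field names here are illustrative assumptions, not any vendor's actual report logic:

```python
# Hypothetical sketch of a 360 item summary with a range indicator.
# The spread is hidden when fewer than three raters responded,
# mirroring the confidentiality rule Alan mentions.

def summarize_item(ratings, min_raters=3):
    """Return the mean plus an optional (min, max) range for one 360 item."""
    if not ratings:
        return {"mean": None, "range": None}
    mean = round(sum(ratings) / len(ratings), 2)
    # Only expose the spread when the rater group is large enough.
    spread = (min(ratings), max(ratings)) if len(ratings) >= min_raters else None
    return {"mean": mean, "range": spread}

# Alan's example: a 3.5 from polarized raters reads very differently
# than a 3.5 from consistent raters, even though the means match.
polarized = summarize_item([1, 5, 3, 5])   # mean 3.5, range (1, 5)
consistent = summarize_item([3, 4, 3, 4])  # mean 3.5, range (3, 4)
```

The point of the sketch is simply that two identical means can carry very different messages once the range is visible, which is why the indicator matters.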

That's super useful, Alan.

We've come up to the top of this discussion, but we started the conversation with me forcing both of you to share the very purpose of three sixties in one line. We're gonna conclude this conversation with me holding a proverbial gun to both of your heads again.

We've learned from both of you about how to design and get an impactful three sixty out of the gates. One year down the line, all of the talent management folks who've learned from you today have made all those healthy choices. What is that one quantifiable metric that I should use to measure the impact, and therefore the success, of my new three sixty?

One metric.

Jack, do you have a good one? The metric that measures the effectiveness of this whole process?


The impact of this process? I think the metric that I would probably lean on would be to look at the individual's prior data and their current data and see what the delta is. And if you color code that, or whatever you do, we do a color coding so that people can see: where am I the same? Where have I gotten better? Where have I improved?
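The prior-versus-current delta Jack describes can be sketched as a simple color-coding pass over two cycles of 360 scores. The thresholds, competency names, and color labels below are illustrative assumptions, not Jack's actual tooling:

```python
# Hypothetical sketch of color-coding the change between two 360 cycles:
# green = improved, red = declined, yellow = about the same.

def code_delta(prior, current, threshold=0.25):
    """Classify one competency's score change."""
    delta = current - prior
    if delta >= threshold:
        return "green"   # improved
    if delta <= -threshold:
        return "red"     # declined
    return "yellow"      # about the same

def compare_cycles(prior_scores, current_scores):
    """Color-code every competency present in both 360 cycles."""
    return {
        comp: code_delta(prior_scores[comp], current_scores[comp])
        for comp in prior_scores
        if comp in current_scores
    }

prior = {"listening": 3.2, "delegation": 4.1, "vision": 3.8}
current = {"listening": 3.9, "delegation": 3.5, "vision": 3.8}
colors = compare_cycles(prior, current)
# listening improved, delegation declined, vision stayed the same
```

The design choice worth noting is the dead band around zero: a tiny movement in either direction is noise in survey data, so it reads as "same" rather than as real change.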

I think at the end of the day, this whole process is about helping develop better leaders, and helping the organization support that process.

And so the ultimate measure is: have we really moved the needle on how effectively the leader is behaving?

That would be mine. Yeah, that's great. The reason I was torn, GC, is because... yeah.

It's the purpose, right, again, which, you know, I hate to come back to the same thing. But to Jack's point, change over time and improvement is what you want from a leadership development point of view. Right? Yeah.

I was thinking from a TM point of view, though, and this is the org side now, not the person's. Right? So the OD side is you want the person to get better. The TM side is you want that data to have ultimately predicted who gets promoted and who is successful.

Right? If they've worked on it. From the TM side, that's a lot harder, because it sometimes takes a lot longer to see that happen, and there are many more variables that come in the way. But I think, you know, individually you want them to get better.

And organizationally, you want them to become bigger leaders and drive the company forward. So I think those are the two sides of the coin, Jack. I don't know if that's fair. Yeah.

Yeah. To that last point, I guess: start small, and start tracking those individuals over a period of time. And as your program matures, push yourself to see two things.

Number one, possibly the number of leadership positions you're filling internally; and number two, were you able to predict the success of these individuals in those roles in the first place? I think that ties up the entire story really, really well. Thank you so much, both of you.

I know we could have gone on for the rest of the day, to be very honest, but I thoroughly enjoyed myself. The time flew by, and I'm pretty sure everyone who's joined us live, and everyone who's gonna consume this recording asynchronously, is gonna learn a hell of a lot about bringing impact into three sixties. Thank you so much, Jack and Alan, for joining us today, and I'm pretty sure we're gonna come knocking on your door again to build on some similar exciting topics in the future.

Sounds good. Thanks for having us. No, it's been my pleasure. Thank you. Nice to be with you.




About The Performance Puzzle by GC

"The Performance Puzzle" – your monthly ticket to a rollercoaster ride through the thrilling, perplexing, and downright fascinating world of performance management. Managing performance is like juggling flaming swords on a tightrope over a pit of hungry tigers (metaphorically speaking, of course).
It's a delicate dance of setting expectations, giving feedback, motivating the troops, and tracking progress, all while making sure everyone's happy, engaged, and doing their best work. Phew!
Each month, our show invites superstar guest speakers from the HR galaxy to share their wisdom, battle scars, and game-changing strategies for conquering the performance management maze. Whether it's tackling the challenges of remote work, mastering the art of goal setting, or decoding the mysteries of employee feedback, we've got it all covered.
Think of GC as your HR spirit guide, helping you navigate the treacherous waters of performance management with expert tips, inspiring stories, and a healthy dose of humor.
So, if you're an HR leader on a quest for answers, solutions, and maybe a few laughs along the way, "The Performance Puzzle" is your golden ticket. Tune in, get inspired, and let's crack the code to performance management together!
Stay tuned, folks, because the next episode is right around the corner. Don't miss it!
