The Microsoft Review Time Capsule from 1997

Since there are so many idiots out there bloviating about how Microsoft’s review system is somehow responsible for the company’s decline, I thought I’d share some review materials from when I was a manager in 1997, when the company was at the top of its game and had destroyed almost all of its competitors just prior to the DOJ trial.  These were our manager review directives from Brad Silverberg, who was also responsible for shipping the Windows 95 operating system.  As you can see, the stack rank and curve existed long before Steve Ballmer became CEO.  Obviously, as a manager I was reviewing my own folks and struggling with how to fit a team of extremely talented and competitive people into the required stack and curve their scores appropriately.  I wish I could show some illustrative reviews, but I’m afraid I can only present one of my own as an example of how a Microsoft review was presented in that era.  The key point is that reviews did not change substantially under Ballmer; stock grants replaced option grants due to changes in accounting practice, and the awards became much smaller during the recession.  The bitching about it sounds about the same to me today as well.  I’m sure it will come as a huge shock to learn that people were always pissed about their reviews, demoralized, engaging in sabotage and sulky-quitting, even in an era when the option grants added up to millions.

Attachment: Microsoft review 1-10-96.pdf (3.0 MiB)

This is a link to an earlier blog post I wrote on the subject of how Microsoft valued people, which includes a copy of their actual employee evaluation toolkit.

http://www.alexstjohn.com/WP/2013/05/03/ten-attributes-of-a-good-employee-by-bill-gates/

_____________________________________________
From: Brad Silverberg
Sent: Saturday, May 24, 1997 9:43 AM
To: Tad Coburn; Terry Crowley; Dean Hoshizaki; Rob Howe; Michael Mathieu; Rob Mauceri; Bob Pomeroy; Andy Schulert; Andy Seres; Robert Silver; Heinz Strobl; Andy Abbar; Jay Abbott (Exchange); Salim AbiEzzi; Ahmad Abu-Dayah; Robert Ackerman; Ravindra Agrawal; Imran Akhtar; Ayman Aldahleh; Daniel Allen; Jason Allen; Peter Allenspach; Nona Allison; Ebbe Altberg; Peter Amstein; Alicia Anderson; Marlee Anderson; Bob Anderson; Eric Andrews (Exchange); Peggy Angevine; Todd Apley; Jay Arnold; Katherine Arrington; Russ Arun; B. Ashok; Scott Austin; Jigish Avalani; Linda Averett; Girish Bablani; Mahesh Babu; Jeff Bailey (Exchange); John Bain; Brian Bakke; Bridget Bakken; Matt Bamberger; David Bangs; Susanne Banks; Dave Barnett; Rick Baxter; Arthur Becker; Jennifer Beers; Joe Belfiore; LaShaun Bellamy; Matthew Bellew; Robert Bennett; Laura Bentley; Sean Bentley; Brett Bentsen; David Berge (Exchange); Eric Berman (Exchange); John Beswetherick; Morris Beton; John Betz; Anil Bhansali; Anita Bhasin; Daniel Bien; Roger Bischoff; Brian Bishop; Dawn Bishop (Exchange); Rebecca Black; Ambrosio Blanco; Mike Blaszczak; Mike Blaylock; Bill Bliss (Exchange); Sue Bohn; Michael Bond; Nick Borelli; Randall Boseman; Adam Bosworth; Kathy Boullin; John Bowler; Adam Braden; Don Bradford; David Bradlee; Tony Brady; Aaron Bregel; Dave Brennan (Exchange); Laura Brenner; Terri Bronson; Kevin Brown; Mark Brown; Wayne Bryan; Chris Bryant; Dave Buchthal; Scott Bullock; Gary Burd; Laura Burke (Exchange); Bruce Burns; Chris Burroughs; Brendan Busch; Laura Butler; Brandon Bynum; Michael Byrd; Johnny Calcagno; Kelly Calvert; Colin Campbell; Tom Cane; Lani Ota Carpenter; Alan Carter; Liz Chalmers; Robert Chambers; Craig Chapman; Amit Chatterjee; Digvijay Chauhan; Rod Chavez; Chee Chew; George Chinn; Erik Christensen (VB); George Christensen; Martin Cibulka; Jennifer Cioffi; Charles Clark; Mike Colee; Dennis Comfort; Michael Connolly; Rob Copeland; Lyle Corbin; John Cordell; Michael Coulson; Richard Craddock; 
Robert Crissman; Joan Crites; Walt Criteser; Alex Crovetto; Ed D’Alessandro; Sadik Dakar; Kevin Dallas; John Dauphiny; Ruthanne Day; Mark Daymond; Arthur De Haan; Douglas Dedo; Amit Dekate; Peter Delaney; Kurt Delbene (Exchange); John Delo; Bill Demas; Adam Denning; Brendan Dixon; Scot Dormier; David Dow; Jerry Drain; Donald Drake; Tonya Dressel; Craig Ducharme; Dan Duffek; Carolyn Duffy; Rajeev Dujari; Jerry Dunietz; Cynthia Duppong; Cindy Durkin; Ken Dye; Rick Eames; David Ebbo; Kurt Eckhardt; Dean Edwards; Donalee Edwards; Ian Ellison-Taylor; Barbara Ellsworth; Annette Elsbree; John Elsbree; Steve Elston; Raymond Endres; Michael Eng; Larry Engel; Gary English; Peter Engrav; Eric Engstrom; Mark Enomoto; David Erb; Ben Errez; Tuna Ertemalp; Jude Eshleman; Brent Ethington; Jim Evans; Debra Falkin; Jan Falkin; Scott Fallon; Jonathan Fay; Lauren Feaux; Lonnie Ferguson (Exchange); Scott Ferguson; Justin Ferrari (Exchange); Tracy Ferrier; Dave Fester; John Fine; Richard Firth; Lon Fisher; Charles Fitzgerald; Ingrid Fitzgerald; Craig Fleischman; Drew Fletcher; Randy Flynn; Fabrice Fonck; Christian Fortini; Gregory Frederick; Steven Freedman; Patricia Friel; Makarand Gadre; Arne Gaenz; Don Gagne (Exchange); Ravi Gandham; Ananda Ganguly; Richard Gardner; Kurt Geisel; Scot Gellock; Anita George; Charles Gifford (Exchange); Gay Gilmore (Exchange); Robert Gilmore (Systems); Dale Gipson; Arye Gittelman; Danny Glasser; Kirk Glerum; Steve Goan; Peter Golde; Jay Goldstein; Debra Gonzalez; David Goodhand (Exchange); Cindy Goral; Richard Gorvetzian; Alexander Gounares; James Gower; Jennifer Grambihler; David Greenspoon (Exchange); Michael Grier; Jim Griesmer; Shane Groff; Ronald Grumbach (Exchange); Lisa Halston; Jeremy Gulley; Bob Gunderson; Vic Gundotra; Eric Gunnerson; Chris Guzak; David Habib; Dean Hachamovitch; Barbara Haerer; Mary Haggard; Heryun Hahn; Michael Halcoussis; Michael Hale; Don Hall; Mark Hall (Team C++); Patrick Halstead; Kathryn Hamilton-Cook; Cindy Hansen; 
Seetharaman Harikrishnan; Greg Harrelson; Ralf Harteneck; Roger Harui; Ted Hase; Brad Hastings; Kristine Haugseth; Eric Hawley; Sylvia Hayashi; Todd Heckel; Bob Heddle; Andy Held; Alfred Hellstern; Bartley Hendrix; Tonya Henry; Jeff Henshaw; David Herlihy; Mike Hewitt (Exchange); Matthew Hickman; David Hicks; Rodney Hill; Roz Ho; Randy Holbrook; Lisa Holcomb; David Holmes (Dev); Steve Holt; Destry Hood; Alex Hopmann; Mark Hopwood; Valerie Horvath; Joan Hoshino; PJ Hough; Derek Houseworth; Terrence Huang; Ross Hunter; Mark Igra; Blake Irving; Michael Jaber; Bruce Jackman; John Jacob; Nancy Jacobs (PUB); Mark Jaremko; Renan Jeffereis; Peter Jerkewitz; Julian Jiggins; Ajay Jindal; Jeff J. Johnson; Russell Johnson; Tom Johnston; Chris Jones; Dianne Juhl; Gary Kacmarcik (Exchange); Richard Kaethler; Heikki Kanerva; Jorge Kara; Aram Kargodorian; Marc Keller; Bruce Kelley (Exchange); Lisa Kelley; Mike Kelly (Office); Will Kennedy; Anders Kierulf; Jen Kilmer; Daniel Ko (Exchange); Reed Koch; Eliyezer Kohen; Rashmi Kohli; Srini Koppolu; Craig Kosak; Mike Koss (Exchange); Bill Koszewski; Rick Krause; Joseph Krawczak; Ivan Kreslin; Gwenn Krossa; Tony Krueger; Steve Kruy; Byron Krystad; Richard Kuhn; Peter Kukol; Frida Kumar (Ebbeson); Rajiv Kumar; David Kurlander; Andrew Kwatinetz; Kevin La Chapelle; Diane LaCaze; Lori Lamkin; Judith Laplante; Julie Larson; Kirstin Larson (Office); Denise La Rue (Exchange); Marc Lauinger; David Lazar; Gregory Leake; Tim Lebel; Antoine Leblond; Andy Lee; Bruce Lee; Gerry Lenocker; Eric LeVine; Millani Lew; Sin Lew; Tim Lewis; John Licata; Greg Lindhorst; Troy Link; Jack Litewka; Rob Little; Ted Liuson; Alex Loeb; Peter Loforte; Michael Longacre; Paul Lorimer (Exchange); 
Martyn Lovell; Brad Lovering; Stephen Lovett; Gwen Lowery; Steve Lucco; Michael Luce; Ivan Lumala; Donny Luu; Drew Lytle (Exchange); James Maccalman; Brian MacDonald (Exchange); Eric Maffei; Dhananjay Mahajan; Neelamadhaba Mahapatro; Neeraj Maithel; Jolie Maki-Chappina; Joe Maloney; Frank Mantek; Holly Marklyn; Sandra Martinez (Exchange); Peter Mau; David Maymudes; Pam Ho McBain (Exchange); Colin McCartney; Tim McKee; David McKinnis; Scott McMaster; Donald McNamara; Gwen McNicholas; Yusuf Mehdi; Anil Mehra; Diane Melde; David Meltzer (MKTG); George Meng; Joe Merritt; Colin Merry; Eric Michelman; Andrew Miller; Eric Miller; Rusty Miller; Patrick Minahan; John Misko; Rajeev Misra; Chuck Mitchell; Sean Mitchell; Robin Moeur; Alex Mogilevsky; Charles Moore; Dinarte Morais; Alex (Iskandar) Morcos; Ann Morris; Alessandro Motta; Brian Mueller (Exchange); Chad Mumford; Kent Murdoch; Jim Murphy; Jeffrey Murray; Alessandro Muti; Simon Muzio; Erin Myrin; Toshihiro Nagai; Vikram Nagaraj; Tony Nahra; Satoshi Nakajima; Yutaka Nakajima; Raman Narayanan; Rick Nasci; Lorraine Nay; Philip Nelson; Crystal Nemeth; Ilse Nethercutt; Anna Neumann; Duong Nguyen; Dyane Nguyen; Tina Nguyen; Tuan Nguyen; Marc Niaufre; Philippe Nicolle; Allison Nielsen; Anders Nilsson; Warren Nolder; Glenn Noyama; Mark O’Hara; Jim O’Neill; John O’Rourke; Bruce Oberg; Sean Oldridge; David Oliver; Diane L. Olsen; Kip Olson; Marc Olson; Mark Olson; Rick Olson; Ken Ong; Peter Oosterhof; Todd Ortega; Joe Oswald; Adam Overton; Scott Oveson; Heather Pacheco; Navneet Paliwal; Gurpreet Pall; Julia Pan; Jean Paoli; Susan Pappalardo; Ramesh Parameswaran; Won Joo Park (Exchange); Dave Parker; Dave Parker (IPU); Robert Parker; Kim Parris; Hadi Partovi; Yousef Tony Parvizi; Bakul Patel; Liam Patel; Peggy Peale; Chris Peltz; Charlie Peterson; Eric J. 
Peterson; Joe Peterson; Mark Peterson (Office); Susanne Peterson; Gloria Pfeif; Cristiano Pierry; Ron Pihlgren; Alfredo Pizzirani; Barton Place; Rich Pletcher; Steve Polsky; Dave Pond; Johann Posch; Rick Powell; Enrique Prado; Chris Pratley; Matthew Price; Rob Price (Exchange); Casey Quayle; Dennis Quimby (Exchange); Anand Ramakrishna; Suryanarayanan Raman (Exchange); Alec Ramsay; Mohammad Rashid; Adam Rauch; Steve Rayson; Thomas Reardon; Dale Rector; Vamshidhar Reddy; Christopher (Chip) Reeves; Mike Reilly; Cory Reina; Sung Rhee (Exchange); Doug Ricard; Tom Richardson (Exchange); Jacqueline Riddell; Silvana Rimoli (Exchange); Ed Ringness; Michael Risse; Juan Fernando Rivera; Jeff Robbins; William Rollison; Jon Roskill; Tara Roth; Eric Rothenberg; Hannes Ruescher; Diana Ryesky; Ivo Salmre; Wes Sanford; George Santino; Jim Sather; Richard Sauer; Daryl Savage (Exchange); Michael Savage; Cathy Saxton; Jean Saylor; John Scarrow; Mike Schackwitz; Gideon Schaller; Greg Schechter; John Schilling; Kirk Schemlein; Mike Schmidt; Gerhard Schobbe; Jim Schoeggl; Brian Scott; Richard Sechrest; Ajai Sehgal; Scott Seiber; Bertrand Sekour; Handan Selamoglu; Gopalakrishnan Seshadrinathan (Exchange); Charles Seybold; Chris Shaffer; Razi Sharir; Terri Sharkey; Tracy Sharpe; Gennie Shaw; Lin Shaw; Jeanne Sheldon; John Shewchuk; Geoff Shilling; Lora Shiner; Andrew Short; Michael Shulman; Andrew Shuman (Exchange); Will Sibbald; Jon Sigler; Steve Silverberg (Exchange); Morris Sim; Bowen Simmons; Bo Simmons; Karl Simonsen; Ryan Simpson; Joel Singer; Alan Skow; Mike Sliger; Curt Smith; Dean Smith; Harriet Smith; Jeff Smith; Neil Smith (SYS); Ross Smith; Steve Smith (Team C++); George Snelling; Ira Snyder; Beverly Sobelman (Exchange); Jason Soubier (Exchange); Ron Souza; Dan Spalding; Gary Spangler; Kory Srock; Alex St. 
John; Maria Staaf; Martin Staley (Exchange); Brian Staples; George Stathakopoulos; Curt Steeb; Adam Stein; Marty Steinberg; Joanne Steinhart; Victor Stone; Todd Stout; Valerie Stowell; Peter Sugarman; Michel Suignard; Kent Sullivan; Scott Swanson; Craig Symonds; Kathleen Tamanaha; Lawrence Tanimoto; Matthew Tebbs; Chia-Chi Teng; Dennis Tevlin; David Thacher; Jon Thomason; Allan Thorpe; Mike Tiano; Carl Tostevin; Michael Toutonghi; Naseem Tuffaha; Shusuke Uehara; Ron Ullmann; Craig Unger; Phani Kumar Vaddadi; John Vail; Ted Van Zwol; Christopher Vaughan; Jesse Vaught; David Veintimilla; Channing Verbeck; Laurent Vernhes; Rodney Vieira; Luann Vodder; Gary Voth; Rohit Wad; Mark Wagner; Robert Wahbe; Connie Waite; Janet Walker; Mark Walker (Word); Donna Wallace; Michael Wallent; Jim Walsh; Marc Wandschneider; Alice Wang; Andrew Warden; Paul Warrin; Jens Wazel; Brad Weed; Chris Weight; Ted Weinberg; Robert Welland; Steve Wells (IMPU) (Exchange); Michael Wenberg; Chris Wendt; Michael Werner; Sharon Wetherby; Nicole S. Whitten; Tanowo Wiggins; Paul Williams; Lydja Williams; Sara Williams (DRG); Cornelius Willis; Barbara Wilson; Peter Wilson (VS); Selena Wilson; Randy Winjum; Richard Wolf; Ross Wolf; Eric Wolz; Daryl Wray; Andrew Wright; Robbie Ray Wright; Susan Wright; Enwei Xie; Eunice Yan; Joe Yap; Tom Yaryan; Laura Yedwab; Kenny Young; Mitchell Young; John Zagula; John Zanni; Jason Zhu; Chris Zimmerman; Larry Zinkan; Gerard Zytnicki
Subject: August review

As part of the AICG management team, you play a critical role in the review process: from making sure people get open and honest feedback, to helping set the right objectives and priorities, to rewarding outstanding performance, to making tough decisions, and helping develop our employee talent.  This is a long mail full of info to help you manage the review process.  Please read it carefully and contact the “AICGHR” team if you have any questions.   If there are leads or managers on your team who should have access to this information but did not receive it – please forward.

Performance Philosophy:

The review process was established to give employees direct feedback on their performance, to set or reset individual and team goals, and to assist in compensating employees based on their contribution.  In addition to doing the regular things we do as part of review, we have the opportunity to step back and think about how individuals have responded to the changes and turbulence of the past year.  Who’s kept their momentum?  Who’s stalled?  This should be part of the feedback process as well.

My expectation is that you will set aside time to do a thorough job writing reviews and preparing for the review meeting with your employees.  The written reviews should be open, honest, complete and constructive.  Every manager has an obligation in the written review and in the review meeting to say concretely, with examples and remedies, what the person has done well and not so well; areas of strength as well as areas needing improvement.

I expect you as a manager to have goals on the process of management.  Managers who provide employees with clear personal and work goals, good planning, and follow-up/feedback have more effective, productive teams.  This is also the time to have discussions with employees about development opportunities, either within their current roles or as a career step.  Feedback from employees in surveys, exit interviews and conversations over the last 18 months has shown us that development and career concerns rank highly as an employee issue.

I want to especially emphasize the need for a good “spread” in rewards.  This is designed to ensure that we really give exceptional rewards to our exceptional performers.

Review Ratings and Distribution Curve:

A key part of performance management at Microsoft is the rating and ranking of employees.

The review definitions are not absolute; ratings are relative to others at that level in the organization.  Ratings are done on a curve to ensure that we are differentiating performance and that divisions apply performance ratings consistently. 

The performance ratings for each division should be:

Rating      Distribution
>3.5        35%
3.5         40%
<3.5        25%

I realize that in very small groups it may not be possible to hit the distributions, but as the models get consolidated, the distributions should be hit, and certainly all the models for my directs should hit the curve.
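As an illustration only (hypothetical helper names, assuming the >3.5 / 3.5 / <3.5 split of 35/40/25 above), the curve check a manager would run against a consolidated model can be sketched in a few lines:

```python
from collections import Counter

# Target distribution from the directive: >3.5 / 3.5 / <3.5 = 35/40/25.
TARGET = {"above": 0.35, "at": 0.40, "below": 0.25}

def bucket(rating):
    """Classify a review score relative to the 3.5 midpoint."""
    if rating > 3.5:
        return "above"
    if rating == 3.5:
        return "at"
    return "below"

def curve_fit(ratings, tolerance=0.05):
    """Return each bucket's actual share of the model and whether it
    falls within `tolerance` of the target curve (small groups may miss)."""
    counts = Counter(bucket(r) for r in ratings)
    n = len(ratings)
    report = {}
    for name, target in TARGET.items():
        share = counts.get(name, 0) / n
        report[name] = (share, abs(share - target) <= tolerance)
    return report

# A 20-person model: 7 above, 8 at, 5 below the midpoint.
model = [4.0] * 7 + [3.5] * 8 + [3.0] * 5
print(curve_fit(model))
```

For a team of 20 this lands exactly on 35/40/25; the tolerance parameter is there because, as the mail concedes, very small groups cannot hit the distribution precisely.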

Compensation Philosophy:

We pay for performance.  Our compensation philosophy balances base pay, bonus and stock to be extremely competitive in the industry.  This year, the merit budget is 6% which is slightly above what other high tech companies are paying in salary increases.  The merit budget will vary from year to year based on external business conditions, company performance and competitive market rates.

Once again this review, we want to be more explicit about what is part of merit pay and what adjustments we are making for key individuals.  We want to place well-performing product group employees in our core ladders comfortably within the range and leverage variable rewards such as bonus and stock toward our best performers.

Merit increases and Adjustments:

Merit increases reward sustained performance and should recognize the overall value Microsoft places on contribution.  The merit increase should move the employee through the salary range based on ongoing performance and contribution level.

With the goal of moving our core product employees close to or above the midpoint of the ranges, we will allow adjustments to be given on a case by case basis.  Adjustments should be targeted to technical employees who are our top performers in levels 11 and 12 who have a tenure of 4 years or less.  All adjustments will be approved by the Group VPs when the review models are submitted.  No adjustment should be communicated to employees until the final review models have been approved.

Promotions:

Promotions for the division should be kept to 22% for the year.  This includes all promotions for both the August ’97 and February ’98 reviews.  Promotions to level 11 or below should be granted only to employees who have achieved a 3.5 or higher over the last several review periods.  For level 12 promotions, the employee should have at least a 4.0 this period and a 3.5 the previous period (and I definitely prefer consecutive 4.0’s).  You should complete the promo justification form for any proposed promotion.

Bonus:

The bonus award is targeted at performance during the review period.  It covers results and overall contribution.  There has sometimes been a notion that the bonus is “50% results, 50% hard work”.  That’s not how we’ll be doing bonus. Bonus is for results and overall contribution.  Of course, some aspect of the bonus is for the intangible contributions a person makes that may not translate directly to the person’s own results but did make the group perform better.

As with merit and adjustments, the bonus should be leveraged to reward significant amounts to the top performing employees over this last review period.   I expect to see a good spread. 

The bonus budget will remain at 10% of eligible salaries with the actual bonus range from 0-30%. I really do want to see the best performers get the best rewards.

What this means is that a “normal” 3.5 should get a 7-8% bonus, with some receiving up to 10%.  Many 3.0 performers should not get a bonus at all.  This leaves “extra” bonus money that can be used to reward our top performers.

The bonus distribution should approximate the matrix below:

Rating      Bonus Range
4.5 – 5.0   15 – 30%
4.0         11 – 14%
3.5         7 – 10%
3.0         0 – 6% (many will not get a bonus)
2.5         0%
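A minimal sketch of how the matrix interacts with the 10% budget (hypothetical helper names and invented salary figures; only the percentages come from the mail):

```python
# Bonus bands (fraction of salary) keyed by review score, taken from
# the matrix above; everything else in this sketch is invented.
BONUS_RANGE = {
    5.0: (0.15, 0.30),
    4.5: (0.15, 0.30),
    4.0: (0.11, 0.14),
    3.5: (0.07, 0.10),
    3.0: (0.00, 0.06),
    2.5: (0.00, 0.00),
}

def check_model(awards, budget_rate=0.10):
    """awards: list of (rating, salary, bonus_fraction) tuples.
    Verify every award sits inside its rating's band and the total
    payout stays within the 10%-of-eligible-salaries budget."""
    for rating, salary, frac in awards:
        lo, hi = BONUS_RANGE[rating]
        if not (lo <= frac <= hi):
            return False
    total_salary = sum(salary for _, salary, _ in awards)
    payout = sum(salary * frac for _, salary, frac in awards)
    return payout <= budget_rate * total_salary

# The 3.0 getting nothing is what funds the oversized 4.5 award:
awards = [
    (4.5, 90_000, 0.16),
    (4.0, 80_000, 0.11),
    (3.5, 75_000, 0.075),
    (3.0, 70_000, 0.0),
]
print(check_model(awards))
```

The arithmetic makes the point in the mail concrete: with a 10% pool and 3.5s taking 7-10%, the only way to pay a 4.5 in the 15-30% band is for the 3.0s to get little or nothing.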

Stock:

Stock is intended to provide a sense of ownership in the company to key contributors who are identified as providing long-term potential contribution to Microsoft.  Current performance and skill set are not the sole factors to be considered, since they are assessed separately in the merit and bonus portions of the rewards system.  Consider overall performance, flexibility, and the impact on the company if the employee were to leave.

In the past, stock has been distributed in a way that dilutes the value to the top-ranked people.  Once again the theme of “good spread” comes up: we will see more people getting no stock, as well as more people in the top category getting more stock.  No stock should be given to those in the D category.

Category                                      Distribution
A – Absolute must keep                        15% of people
B – Hard to replace high level performer      25% of people
C – Potential for growth – solid              35% of people
D – Uncertain for long term potential         25% of people

Stack Ranking

Stack ranking will be done this August.  DAD will continue to use the 1-9 approach.  ICCD and Tools will use the 1-5 approach.  Rank by function (e.g. development, test, marketing, UE, management).  Using the 1 through N approach makes it hard to roll up at a division of 3,000+ people.

Training

There are several training opportunities available to you this review.  I encourage you to take advantage of these programs if you are new to the review process, or consider them a good refresher.  The courses are:

  • Setting Great Goals and Objectives
  • Giving Feedback and Recognition that Counts
  • Developing Careers with The MS Competency Tool Kit

I’ve attached a mail below with additional information about the training opportunities, and link to the enrollment tool.

Good reviews, good models

There are a number of ways we measure performance: stock, stack rank, merit salary increase, rating, and bonus.  I’ve listed them in order from “longest term” to “shortest term”.  That is, the stock assignment should reflect the person’s long-term contributions and value to the company in the future, independent of what the person has contributed in the past.  It is a forward-looking, long-term measurement.  Stack rank is mostly long-term based (75% “lifeboat drill”) but has a short-term component and will fluctuate somewhat from period to period.  Merit increase reflects short-term performance for the review period, but since salary is a (mostly) monotonically increasing function (i.e., very few people ever get pay cuts), it gets weighted with the person’s future potential.  So a person who has “peaked” in level will get smaller merit increases than an up-and-comer.  Rating and bonus are strictly short-term, backward-looking measurements; they cover only the period just completed.

Obviously, these measurements are interrelated.  Sustained strong short term performance translates to someone climbing the stack rank and thus getting promoted to higher levels and receiving more stock.

When I look at a model I look for correlations.  Stock and stack are very closely tied.  I don’t expect to see D’s at the top of the stack rank, nor A’s at the bottom.  The typical pattern is that as you go down the stack rank, you go from A’s to B’s to C’s to D’s.  It’s not a complete 100% correlation, so you’ll see some B’s sprinkled at the bottom of the A’s, etc., since stack has a shorter term component than stock.
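The idealized stock-to-stack correlation described above can be sketched as a percentile cut over the 15/25/35/25 category split; this is a hypothetical illustration, and as the mail notes the real correlation was never 100%:

```python
def stock_category(position, team_size):
    """Map a 1-based stack-rank position to a stock category using the
    A/B/C/D split from the directive (15/25/35/25).  Idealized pattern
    only; real models had B's sprinkled among the A's, etc."""
    percentile = position / team_size
    if percentile <= 0.15:
        return "A"
    if percentile <= 0.40:   # 15% + 25%
        return "B"
    if percentile <= 0.75:   # + 35%
        return "C"
    return "D"

# A 20-person stack: positions 1-3 map to A, 4-8 to B, 9-15 to C, 16-20 to D.
print([stock_category(p, 20) for p in range(1, 21)])
```

Walking down a 20-person stack this way yields 3 A’s, 5 B’s, 7 C’s, and 5 D’s, matching the stock distribution table above.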

Similarly, bonus and rating are very highly correlated.  A 3.0 should not get a 10% bonus, no matter how hard the person worked.  Bonus is for results, remember?  A 3.0 says, “did the job, no more, no less”.  Someone who worked super hard but got 3.0 results is in need of career counseling as the person is likely in the wrong job to work that hard but get mediocre results.  A 4.0 should not get an 8% bonus.  We want to really reward exceptional performance.   Shade towards the low end of the range for “soft” scores and at the high end for “almost but not quite the next higher rating”.

This attachment will be helpful to you as you work through the review process:

Attachment: manager review training.doc (22.5 KiB)

AICG Timetable

May 15:                        Review forms available

May 15 – June 13:          Write and submit reviews to managers

June 2:                         Review models distributed to managers

June 23 – June 30:         HR Managers working with Bradsi directs to scrub/finalize models

July 10 -11:                   Models reviewed by Paul Maritz

July 14 -Aug 1:              Conduct 1:1 review meetings

Aug 1:                          Pay changes effective

Aug 15:                         Payday

Aug 30:                         Hard copy review forms due to HR

Comments

  1. You did set us up in a review cage match that particular year. I’m still not sure who won.

    • He.. he.. hey Jason! Still at Microsoft? Good for you. I’ve got the last review I wrote for you right here. Just say the word if you want to share it with the world. 🙂

      I know it may not be appropriate for you, but I’m sure people would be interested in hearing a Microsoft insider’s view on how the review system changed from the Gates era to the Ballmer era.

  2. I started in 1994 and left in 2013. I think the review system is better today: rewards are more predictable as they are more consistently (and transparently) matched to the review score. But everybody wants the brass ring and people groused about reviews then and they grouse about them now. I groused too. Except for one, however, I think all my reviews were fair. When a company is perceived as doing well, it’s a case study in great management. When it is perceived as doing poorly, those same management practices, well…
