
Comparing observed behavior to project management actuals

Coming back from a vacation is never easy. I have been trying to finish this post for the past week, but time was in short supply. Anyhow, I have finally caught up and gotten back to normal. In the last post I decided to embark on an experiment to examine what differences, if any, there would be between what one observes people doing at work and what they report using our project management software, weekly timesheets, and status reports.

I sampled people in various departments at Tenrox twice a week, as discreetly as I could so as not to disturb them, attract attention, or change their behavior. I followed these rules during my observations:

  • Observe 2 different people in each department and record what they are doing at that moment; I called each of these observations a sample
  • Exclude all managers and executives from the experiment, except for R&D managers, because I wanted to measure how much time they actually spend on various tasks, especially meetings
  • Take 2 samples per day, twice per week, for 8 weeks
  • Take the samples at random times of day and of different people each time
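The sampling rules above can be sketched as a small script; this is only my illustration of the procedure, and the department rosters and parameters are made-up assumptions, not Tenrox data.

```python
# A minimal sketch of the sampling plan: 2 samples per day, two days per
# week, for 8 weeks, at random times, observing 2 people per department.
# Department rosters below are illustrative placeholders.
import random

departments = {
    "Account executives": ["AE1", "AE2", "AE3"],
    "Inside sales": ["IS1", "IS2", "IS3"],
}

def schedule_samples(weeks=8, days_per_week=2, samples_per_day=2, seed=None):
    rng = random.Random(seed)
    samples = []
    for week in range(weeks):
        for day in rng.sample(range(5), days_per_week):  # pick 2 weekdays
            for _ in range(samples_per_day):
                hour = rng.randint(9, 17)                # random time of day
                for dept, people in departments.items():
                    observed = rng.sample(people, 2)     # 2 people per dept
                    samples.append((week, day, hour, dept, observed))
    return samples

samples = schedule_samples(seed=42)
```

Each tuple in the result is one sample: which week, weekday, and hour it was taken, and which two people in which department were observed.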

After the sampling was complete I compared the approved timesheets and status reports to the observed activity. Here is a summary of my observations:

Table 1 – Percentage of Time Working (includes meetings)

                                     Based on Observation   Based on Timesheet
  Account executives                         88%                  100%
  Inside sales                               81%                  100%
  Marketing                                  88%                  100%
  R&D team members                           94%                  100%
  R&D management/project managers           100%                  100%
  Support                                    94%                  100%
  Professional Services                     100%                  100%


Table 2 – Percentage of Time in Meetings

                                     Based on Observation   Based on Timesheet
  Account executives                         13%                    0%
  Inside sales                                6%                    0%
  Marketing                                   6%                    2%
  R&D team members                           13%                    8%
  R&D management/project managers            63%                   45%
  Support                                     6%                    0%
  Professional Services                       6%                    5%

In my observations I considered a person to be “not working” when they were talking about non-work-related topics or simply taking a break (a smoking break, coffee break, etc.). Also, to be fair, for those individuals who took longer “breaks” during the day, I made a note and checked whether they stayed later that day to make up for it. If they did not, the “percentage worked” reflects the deficit.

Well! The experiment was much harder to run and took a lot more time to perform than I had expected. Here is what I learned and the action items:

1) Broken Breakdown Structure

In our timesheet system we have the concepts of a Task and a WorkType: Task = Project + WorkType

A WorkType defines the type of work one does, and it can apply to any project. For example, “Design” is a work type; “ABC Design” is the task you report time against when you are doing design work for project ABC.
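The Task = Project + WorkType idea can be sketched as a small data model. The class and field names below are my own illustration, not the actual Tenrox schema.

```python
# A minimal sketch of the Task = Project + WorkType composition:
# a Task is just a Project paired with a WorkType, and its label is
# the name you would report time against (e.g. "ABC Design").
from dataclasses import dataclass

@dataclass(frozen=True)
class WorkType:
    name: str  # e.g. "Design", "Meeting"

@dataclass(frozen=True)
class Project:
    name: str  # e.g. "ABC"

@dataclass(frozen=True)
class Task:
    project: Project
    work_type: WorkType

    @property
    def label(self) -> str:
        # The task name a person reports time against
        return f"{self.project.name} {self.work_type.name}"

task = Task(Project("ABC"), WorkType("Design"))
print(task.label)  # ABC Design
```

Because a WorkType is independent of any project, the same “Design” work type can be combined with every project in the system, which is exactly what makes duplicate work types (like the six “Meeting” variants below) multiply across projects.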

One of my first surprises was that we had more than 6 different “Meeting” work types in the system, such as “General Meeting”, “Meeting”, “PS Meeting”, “R&D Meeting”, and “Marketing Meetings”, each of them associated with one or more team-specific and customer-specific projects.

I had to generate time reports that included all of these work types to be able to measure how much time is being reported as time spent in meetings.
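As a rough illustration of that roll-up, here is a sketch of summing reported hours across all of the meeting work types; the entry format and the helper function are my assumptions, not the actual reporting tool.

```python
# A hedged sketch of rolling up reported hours across every "meeting"
# work type, grouped by department. Entries are assumed to be simple
# (department, work_type, hours) tuples.
from collections import defaultdict

MEETING_WORK_TYPES = {"General Meeting", "Meeting", "PS Meeting",
                      "R&D Meeting", "Marketing Meetings"}

def meeting_hours_by_department(entries):
    """Sum hours reported against any meeting work type, per department."""
    totals = defaultdict(float)
    for department, work_type, hours in entries:
        if work_type in MEETING_WORK_TYPES:
            totals[department] += hours
    return dict(totals)

entries = [
    ("R&D", "R&D Meeting", 3.0),
    ("R&D", "Design", 5.0),          # not a meeting; excluded
    ("Marketing", "Marketing Meetings", 1.5),
]
print(meeting_hours_by_department(entries))
# {'R&D': 3.0, 'Marketing': 1.5}
```

With a single consolidated “Meeting” work type, the `MEETING_WORK_TYPES` set collapses to one entry and this kind of report no longer needs a hand-maintained list.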

We definitely do not need 6 meeting work types. The first action item is to clean this up so that all teams use clear and consistent work types to report meeting time, for any project.

2) What’s a Meeting?

We have to make sure everyone has the same definition of a meeting. For example, R&D managers and the sales team have many one-on-one or small-group meetings in their offices. This time was not reported as meeting time, which is fine as long as we do so consistently. Looking at the samples versus what is reported in the timesheets, and after speaking with a few people, I can see that this is not the case right now. Also, some people recorded training sessions as meeting time, whereas others recorded that time against a training work type.

3) Now let’s look at Table 1 results and my analysis of the differences:

  • Our account executives do take time during the day to socialize with each other and with other team members. However, they work hard, and I know they are responsive to customers even when called upon outside business hours. So the observations, while valid, are not a cause for concern.
  • Our inside sales team also does some socializing and participates in meetings their timesheets do not report. However, they have a pretty tough job, making hundreds of calls per week to follow up on Web leads and other marketing activities. I hear them sometimes; they show remarkable patience and they genuinely care about their work, so the breaks are very much needed to let off some steam.
  • Marketing, which includes our Web team, definitely has room for improvement. The observations confirmed what I already suspected: this team can and should do better than it does now. Web team members have to bring more intensity and a sense of urgency to their day-to-day deliverables. The experiment has actually provided me with new insight into this team.
  • R&D team members require a lot of freedom so they can be creative, intense, and excited about what they do. Therefore, given the great success of the last two releases, the occasional walk, chat, or longer break is not a cause for concern.
  • I am willing to cut the Support team some slack too. As part of this experiment, I decided not to let any one number lead me to a conclusion. Our 90-days-plus accounts receivable (A/R) as a percentage of total receivables is near an all-time low, and our blocking support issues and SLA (service level agreement) compliance were well under control, so the team is more than justified in taking a break here and there. Of course, this would have been a red flag if the other metrics had come in on the critical side. I will definitely sample this group again if and when I see our A/R creeping up or an increase in SLA-compliance issues.

4) Let’s look at Table 2 results and my analysis of the differences:

  • The account executives report time at a very high level, simply reporting hours worked against a task called “Sales Activities”, so you cannot tell how much time was spent on calls, in meetings, or on anything else. I do not think we need more detailed time reporting for this team, especially since most of the team consists of veterans.
  • Our inside sales team also reports time at a high level. I think this should change. We need to cross reference their timesheets with phone logs, qualified lead counts and lead quality. Although I understand they need to take some breaks from time to time, I think more detailed time reporting will help us ensure this team stays on top of its game and reassess what they spend time on if and when our lead count or quality goes through a rough period.
  • Marketing, which includes our Web team, was inconsistent in its timesheet reports. Some people report time spent in meetings and some do not; some provide detailed time reporting against specific tasks and some do not. I will meet with the head of that team to discuss these inconsistencies and agree on what the reports should include. I think the marketing team, especially the Web team, should provide a more accurate picture of the tasks and projects they are working on (for example: search engine optimization, trade show preparation, Web site design, PR, etc.). Right now it is a mixed bag.
  • In comparing what was observed versus what was reported, R&D team members were remarkably accurate in reporting how much time they spent in meetings. The percentage spent on meetings was higher than I expected but this is probably because the R&D team is gearing up to work on the next major release so specification meetings and reviews are likely to be more frequent and longer.
  • R&D management/project managers are spending a lot of time in meetings. The discrepancies in the percentage spent in meetings come from higher-level executives who are not providing detailed reporting of what they spend their time on. If I take the higher-level managers out, then what was observed is in the range of what is being reported. Still, I think too much time may be spent in large group meetings by high-level managers. Even for a major release, I would prefer more of the smaller, shorter, targeted group meetings and fewer of the large meetings I observed.
  • The Support team was a mixed bag as well. Those on the on-demand side are not reporting at the project level or providing any detail about what they do. On the other hand, the core application support team, which spends its time with customers, provides very detailed time reporting. This should be addressed, as we need to track the costs of our on-demand initiatives and any time our on-demand team spends with customers.

Summary and Closing Remarks

This was quite a valuable experiment. Comparing observations to what is being reported, and both to actual results, has provided me with new actionable information that will help us improve how we run the various teams and our tracking systems.

The conclusion I draw from this exercise is that no single data point can help management run the business well. You gain insight by looking at all of them combined: project management reports, observing people, walking around and talking to people, customer surveys, and the company’s results. That is the only way to spot potential problems and prevent them, or to identify best practices and promote them to other parts of the company.

I would highly recommend this exercise to any high-level executive or line-of-business manager. These types of activities help us get closer to the action, get the real facts rather than seeing things through rose-colored glasses, and better understand the inner workings of our teams.
