Stay tuned for more on performance and capability, as well as other relevant topics related to work and human performance.
And for additional content, check out the Library on our website.
We are on Twitter (though we are still not sure why)
Chirp with us: @Prhconsulting
We've got articles, presentations, and project profiles on the website. Visit our online library for all kinds of materials on human performance, learning, and business.
|Discuss Amongst Yourselves...|
|Don't Forget the Blog!|
We hope the blog format has not been overshadowed by 140-character Twitter posts and similarly brief Facebook and LinkedIn updates.
There are a few new posts on our blog www.prhconsulting.com/blog.
We use the blog for short notes and commentary on business and human performance issues. For recent posts and additional topics, visit our blog.
Pass it On
It's easy to forward this newsletter to interested colleagues -- just click the "forward" link at the bottom!
|Check Out the October 2010 PI Journal|
I hope you've had a great first quarter and, if you made it to the ISPI or ASTD conference, I hope you found it a worthwhile investment!
When I was thinking about how to introduce myself and my company for my presentation at ISPI, it occurred to me that we do a pretty wide range of interventions. In fact, the main thing that is consistent is our process: once we have a defined need, we analyze (that is, take it apart so we can understand it) the work and the components of capability (that is, knowledge, skills, attitudes, information, tools, etc.) needed to perform it. Usually the next step is "chunking" and sequencing, either into a path (if it is a large performance domain) or into a deliverable. Of course, deliverables can be performance or knowledge tests, training, reference documents, job aids, or, most often, a combination of the above.
The advantage of our approach is that the solutions we design and develop are tailored to the business situation and need. Maybe it makes it harder to sell -- we aren't famous for building specific things -- but I like to think we listen well and apply a wide range of tools to get the job done...whatever it is.
In this issue, we are going to talk about how technology has spread into almost every area of business. (I would say every area but there has to be an exception...just because I can't think of it doesn't mean I'll assume there isn't one.) And, if you define "training" as transferring capability to perform, we believe training needs to stay on the radar and in the plans of any company that wants to be competitive over time.
I hope you enjoy this issue of the newsletter!
Peter R. Hybert, CPT
A quick shout-out to a client team we worked with to design a new work process for selling. (They know who they are...) The process changes emphasized the critical factors leading to more/bigger sales. The supporting tools and leadership enforcement ensured adoption. To make it happen, they did an outstanding job with the roll-out and introduction.
In fact, the project and the results exceeded their expectations and projections -- so they submitted it to their company's corporate-wide recognition program. After a number of months of reviews and short-listing, they managed to win one of only two global awards (and that is within a very large company with a number of divisions...and lots of Six Sigma Black Belts too). Way to go -- we were happy to have played a role in the project!
|Tech Know-How Needed Everywhere |
Is There Any Such Thing as "Non-Technical" Training?
When reflecting on the types of clients and projects we typically take on, it seems that we often gravitate toward work with some degree of technical content. We aren't the vendor chosen to build generic communication skills training, for example. We have done sales training and leadership development projects, but they always have some significant technical component: selling complex high-tech products and systems, leading in a regulated environment with complex work processes and equipment, or facilitating product development teams made up of people from multiple disciplines or even companies.
That's OK with us. We think we have a pretty good handle on how to analyze complex work and design effective solutions to build capability. We take a fair amount of pride in how we distill and streamline complexity without over-simplifying it, and we have gotten good feedback on how we package the information needed for performance into usable job aids and references.
We anticipate this is good news for our business. Lately, addressing technical content and complex capabilities seems less like a niche and more like the direction all training and performance interventions need to lean. In fact, the New York Times recently published an article proposing that the next wave of innovation will come from mining the vast amounts of data collected through internet transactions and surfing.
Basically, the digital movement has affected all areas of work, which means we will have to build our employees' capability to perform that work and leverage the related technology. Going forward, people will enter the job with an increasing level of technical skill, especially related to computers, as schools continuously improve their programs. It's a moving target.
To put it in context, back in the day, one of our clients released a PC-based product running on a Windows (NT) platform that did the same work (and then some) as one of their previous products which resided on a "mini" (which actually is a much bigger and less accessible computer). Old hat today. But in 1985 nobody knew anything about it. (For example, one challenge was coming up with terminology to describe clicking through a series of windows...how about "tunneling"? For that matter, what's a window? What's a mouse?)
Networking was pretty obscure and web/internet/http was unheard of. Now everybody knows about that stuff, and it seems that mobile and cloud computing and social networks (along with the related heightened focus on privacy and security) are the newer leading edge. Things won't be slowing down in the near future. And computing isn't the only discipline that is advancing: see also combinatorial chemistry, gene mapping and therapy, and the increased use of statistics in maintaining mechanical equipment (i.e., predictive maintenance) or in process control.
What does this have to do with performance? After all, just because there is content doesn't necessarily mean people need to be trained on it. People doing a job do need to know how to apply new technology, though. We need to explain relevant technological concepts to at least a basic awareness, or conversational, level for the affected employees, which is often most of them. Most employees will need to understand, at a general level, what the technical concept is, how it works, why it is important, and how it applies to their products, processes, and tasks. Others need much more depth. And new tools and processes are needed to capture this content in a way that makes it available and useful to others.
It doesn't always mean a class though. Job aids are still powerful. Today's digital job aids can be even more powerful. Remember when self-serve gas was a new thing? Nobody had to be trained on how to pump gas but we did need intuitive job aids on the pumps...well, some of us probably did. Today Google puts out lots of one- or two-minute videos to show "how-to use" various features of their products. Available on demand, simple, and targeted. (Sometimes, a paper job aid would have been better but let's not quibble.)
Where is this leading? Future challenges could well be technology-focused, that is, application-based "how-to" training for the concepts and tools needed to perform work. Lots of job aids, tools, and bite-size modules. What else? Maybe context. Learning lots of fragmented tasks and ideas begs the question "for what purpose?" What are the overall things we want people in this role to accomplish? We suspect that could get lost in the shuffle as companies overreach for the apparent cost savings of the mini-module approach. Mini-modules are good for answering "how do I ___ ?" but not so much for questions like "why?" or "what is important?"
We are excited about the possibilities for building capability effectively in the technological future. Is your company positioned for success? We'd love to discuss it with you.
Test What Matters
ISPI and ASTD International Conferences
ISPI: Orlando, April 10-13; ASTD: Orlando, May 23-25
We had a great time at the ISPI Spring Conference...even though Pete was only there for two days. His presentation was well attended and a lot of fun to deliver. It was great to see so much interest in the topic of testing. Participants had great comments and a real interest in pushing the envelope from knowledge testing to performance testing. Of course, we had our annual chance to meet up with a long-time client and her team to help them celebrate their current ISPI award. (I think they have won about ten...they told me they lost count...)
Pete also presented at the ASTD conference, again to a large, enthusiastic group. Good participation and input in spite of a warm, humid room. Testing is clearly a timely topic. And the more we see of existing commercial test development tools, the more we feel there are some real gaps...or, put another way, opportunities for savvy developers.
If you are interested in the presentation, you can find it in the library on our website. Or, give us a call or email and we can send it to you.
Testing and Liability
Don't Operate Outside of Design Parameters
On its face, using a test to qualify candidates for job openings seems fairly straightforward. You design a test, require all applicants to take it under the same conditions, and then pick the candidates who perform best. Employers like tests because they make hiring decisions easier by establishing an objective metric. But what happens when the test and the subsequent hiring policies produce racially disparate results?
Last week, a decision was issued in LEWIS v. CITY OF CHICAGO, a case involving testing and hiring practices in the Chicago Fire Department during the mid-1990s. The result will require the City of Chicago to hire 111 of the original 6,000 black candidates who scored "qualified" but were passed over, and will cost an additional $30 million in back pay for the remaining applicants who will not be hired.
The following excerpt is taken from Chief Judge Easterbrook's summary of the facts:
In 1995 the City of Chicago gave a written examination for positions in its Fire Department. Applicants who scored 89 and up were rated highly qualified, while those who scored 64 and below were rated not qualified. Those in between were rated qualified but were told in January 1996 that they were unlikely to be hired. (Applicants were also evaluated for physical skills, criminal records, and other attributes, but those do not matter to this litigation.) From May 1996 through November 2001, the City hired 11 groups of applicants from the well-qualified pool. Each time it chose at random from those who had scored 89 or better; it did not follow the common civil-service practice of hiring in rank order from a list.
By the way, what's important in this case is NOT whether the test was an accurate measure of firefighting aptitude (though many, including the plaintiffs' counsel, contend it was also a poor test) but how it was used in making the selection decision. The City of Chicago focused strictly on those scoring 89 or better, even though the qualified pool (those scoring above 64) was fully capable of performing the job. Additionally, the City randomly selected from the pool of candidates scoring 89 or better instead of following rank order (hiring all candidates who scored 100, then all who scored 99, etc.).
Actually, it seems they set a new bar for "qualified" -- they treated everyone scoring over 89 as "qualified" and those scoring below as "not qualified." Is raising the bar illegal? For one thing, you need to make sure your existing workforce can clear the new bar. But in this case, that alone WAS lawful. However, because it produced a disparate outcome, it became necessary for the employer to justify it based on business necessity.
The other thing the Fire Department did was deviate from the custom of hiring based on rank order of score and instead select randomly from a pool. To a performance technologist, this also seems like a fair way to select candidates who met the requirements. After all, is there really any significant difference between a candidate who scored 99 and one who scored 98? But the court decided otherwise -- by selecting randomly, the City drew from a large pool of candidates. If it had selected only from those who scored 100, it would have been picking from a smaller pool and it would have been more difficult to prove a disparate outcome. The legal concern was different from the performance/testing concern.
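To make the contrast between the two selection policies concrete, here is a minimal sketch. The names and scores below are hypothetical, purely for illustration -- they are not the actual exam data from the case:

```python
import random

# Hypothetical applicant scores (illustrative only, not the actual exam data).
scores = {"Avery": 99, "Blake": 98, "Casey": 91, "Drew": 89, "Ellis": 72}

def hire_by_rank_order(scores, openings):
    """Common civil-service practice: hire straight down the ranked list."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:openings]

def hire_randomly_above_cutoff(scores, cutoff, openings):
    """The approach at issue: treat everyone at or above the cutoff as
    interchangeable, then draw hires at random from that larger pool."""
    pool = [name for name, s in scores.items() if s >= cutoff]
    return random.sample(pool, min(openings, len(pool)))

print(hire_by_rank_order(scores, 2))              # the two top scorers
print(hire_randomly_above_cutoff(scores, 89, 2))  # any two from the 89+ pool
```

Note that with two openings, rank-order hiring touches only the two top scorers, while the cutoff policy draws from everyone at 89 or above -- a larger pool, which is exactly why the choice of policy mattered to the disparate-impact analysis.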
In general, we always have concerns about knowledge tests used for any selection or qualification decision. Performance tests directly tie to the job and require much less analysis and justification to prove the linkage.
Moral of the Story: If you design a test to do something, use it for that and only that. But also, check with your legal department -- they think about things differently.
Legal decisions are often based on points of law and precedents.
Rant: IT Gibberish
Speaking for the Layman
Jargon is one of those double-edged swords. Sure, it helps shorten the communication cycle -- instead of saying a bunch of words, you just use one specialized word. Faster, yes?
Sure, but sometimes it can also be used as a way of making something sound more official, professional, technical, innovative, appealing, and otherwise "don't-try-this-at-home-ish" than it really is.
Recently, we received a communication from Microsoft about a change to Microsoft Live Meeting. This is the web meeting application we use EVERY DAY...sometimes in place of regular phone calls, because it is as easy to use as dialing a phone (or are we saying "touching a phone" now?) but allows screen-sharing to enrich the conversation. Microsoft sent a 28-page document explaining the changes, and they are taking about a year to make the transition gradually so that everyone can adjust. So, they definitely got our attention when they said they were going to be changing it. A couple of us actually tried to read their document, and that is where it got crazy.
Basically, we couldn't make any sense out of it. As hard as we tried to dig into it, the document offered a wall of jargon that resisted our best efforts. We are accustomed to reading technical information. We've done projects where we've had to digest pharmaceutical procedures, material safety data sheets, control diagrams/schematics, specifications, contracts, even programming and networking. It can be a grueling but usually, if you just keep slogging you can get it.
Not this one. We are aware that smart people work at Microsoft...but that might be part of the problem. Or they might have a collective case of "insulated big company syndrome." They obviously spent so much time thinking of just the right words to explain the new features that they didn't notice that they had invented an entirely new language that nobody but them understands! Think we are exaggerating? Below is a section that came from a list that was intended to summarize what we end users are going to "need to do" to implement the changes. (I believe this was intended to "cut to the chase," "get right down to it," "bottom-line it.")
- Identity option?!? ADFS role?!? Location-based IP filtering?!? What?!? Are you saying it won't work unless everyone goes to Windows 7? Or are you saying that in certain situations you might NOT need everyone on Windows 7? And what are those situations?
- Whether "rich co-existence will be a requirement"?!? Who wouldn't want "rich co-existence"? Well, what is it? (Not explained earlier in the document, by the way.) Sure sounds like a good thing but you might not need or be able to get it...again though, how can I even decide whether to pay attention at all, much less figure it out, if I can't even tell what it is?
The best thing you can assume is that this would all make sense to an IT professional. As a small business, maybe we are not their target audience. But still, most companies eventually require a regular business leader to sign off on purchases and major changes...why not just explain it so everyone can understand it? Just say what it is, why it is important, and how to decide which parts we need. As far as getting your audience excited about entering a world of possibilities...forget it. We just want to assemble a workable set of tools to get our jobs done.
The worst thing you can assume is that a bunch of over-vocabularized uber-nerds took over the marketing department and are trying to write a manifesto instead of just telling us what we need to know. Or, worse yet, they are trying to distract us from what they are really doing...but what could that be? Is the new thing going to cost more? Will it drive customers to purchase more licenses? Will it make it harder for Microsoft competitors to sell related services by somehow locking you in their platform? Can't tell unless you can understand it.
At this point, we still don't know what is happening. On the one hand, some information indicates that it might be nothing more than re-branding, and the core functionality will be unchanged. Then, we will need to do absolutely nothing to adapt to the change. If that is the case, why all the hoopla?
But on the other hand, some of the information indicates that it may change to an instant-messenger-like application, requiring everyone who is participating in your meeting to also be running that program (and have a license for it). If that is the change, just tell us now so we can look for a different solution because that one clearly won't work.
If you know, and can summarize it simply and practically, please give us a call. Otherwise, our next meeting with you might have to be in-person...which is actually fine with us.
Thank you for your interest in PRH Consulting! For more about our company, approach, and experience, please visit our website at www.prhconsulting.com
We hope you think of us the next time you need help improving or supporting performance.
Pete Hybert, CPT
PRH Consulting Inc.
www.prhconsulting.com
All content is copyrighted by PRH Consulting Inc. (2010). Any re-use must include this notice.