Training should always be given a strategic role within organisations
Most organizations do a poor job of preparing for the future and of understanding which trends and forces matter most to the way they do business. Despite planning and forecasting efforts, a significant number of highly successful companies have proved unprepared for the future. Over the last few decades, organizational life has become even more turbulent as changes within industries have come at a faster pace and on a bigger scale. Organizational stability has diminished as the rate of change has accelerated and the ways we structure and define our work have changed. Training and HRD professionals have, for the most part, failed to anticipate evolving needs and major changes. At an individual level, this has resulted in decreased job security and missed professional opportunities for many trainers. At an organizational level, it has meant a diminished or less strategic role for the training function.
It is not particularly surprising that most trainers and HRD professionals haven’t been especially good at identifying emerging issues. For a start, HRD professionals are often not very strategically focused and are often poorly positioned to drive strategic issues within organizations. The combination of increasing work demands, staff shortages, and the degree of change within organizations and industries has often left trainers running faster just to keep up. Who has time to plan and think about the future in such a world? In addition, most people (and organizations, for that matter) do a poor job of forecasting; history is replete with examples of smart people, organizations, and governments that failed to perceive the warning signs of new developments. (Pugh, 2000).
It is essential that trainers and HRD professionals focus on the future. Only by understanding where organizations are headed can they play a strategic role and proactively prepare for what is to come. A better grasp of developments to come allows trainers to do their jobs better, have more impact in the work they do, and reduce the personal and organizational trauma of significant change.
Time pressure and the increased speed of work will continue to exert pressure on trainers. (Some firms have responded by scheduling training at night, on weekends, or in small blocks to reduce conflicts around billable hours.) Increased competition in all industries has also led to a demand for quicker responsiveness and demonstrated effectiveness. Thus, trainers face greater pressure to act quickly and measure results, especially with regard to business objectives or return on investment.
An issue that is especially relevant to HRD professionals is knowledge management, a field that continues to develop and will remain important to organizations of the future. Most knowledge management efforts are driven by information technology (IT) departments, so it is not surprising that, in many firms, knowledge management has become synonymous with software. HRD must play a much more active and strategic role in this key trend, from facilitating the culture change necessary to produce real knowledge management to helping to develop corporate initiatives. Beyond the change in training topics, how trainers train will also change dramatically. Although some form of traditional, stand-up classroom training will continue, a much greater percentage of training will be delivered through non-traditional means. (Pugh, 2000).
Different learning styles, especially those driven by cultural and generational diversity, will force trainers to adapt their designs even further. Learning initiatives that capture knowledge (such as online classes that are easily archived) to enhance corporate memory will become increasingly important. Time pressure will force even more compression of content, especially in organizations that focus on billable hours. Communities of practice will become a larger form of “training,” and the role of the trainer in such formats will be more that of community recruiter and facilitator than content provider.
Factors influencing the amount and type of training undertaken in organisations
Information processing style describes the information to which a person pays attention and how he or she processes it. Induction and deduction are examples of information processing styles: induction involves observing a phenomenon and creating a theory to explain one’s observations, while deduction involves starting from a theory and searching for data to confirm or disconfirm it. An example of a tool that illustrates the influence of information processing style on learning is the Learning Styles Inventory.
Personality and information processing style both influence the behaviours that an individual uses in order to learn: whether the individual prefers to learn by attending a course, asking a colleague for advice, or going online. Tools such as learning tactics inventories illustrate the variety of learning behaviours that a person might use.
A trainer can use Curry’s framework to select tools designed to measure “learning style.” Tools that measure personality give a person information about preferences, along with the knowledge that these preferences are very difficult to change. Tools that measure information processing style help a person understand how he or she thinks and may induce him or her to consider a broader cognitive repertoire. Tools that measure learning behaviours give a person information about how he or she goes about the business of learning and introduce the person to some new strategies. (Pugh, 2000).
The authors distinguish between factors internal to the individual and factors external to the individual, such as teaching methods, media, and curriculum. They define the internal factors as “the manner in which, and the conditions under which, learners most efficiently and most effectively perceive, store, and recall what they are attempting to learn”. They sort these internal factors into three categories: cognitive style (the learner’s typical mode of perceiving, thinking, problem solving, and remembering); affective style (the learner’s typical mode of arousing, directing, and sustaining behaviour); and physiological style, such as one’s health, nutrition, and circadian rhythm.
The relationship between ‘systems’ theory and variations in employee development practices
Many organizations are experimenting with new technologies and assumptions. They dabble with ways to make learning and change more user-centred, customized, quicker, and more streamlined. They try to shift the emphasis to performance and results, hoping that by sheer force of analysis and intent, HRD will finally arrive as a clear contributor to saving shrinking profits and dissolving operating fat. They tout the financial, customer, and employee benefits of learning, culture change, and performance consulting.
Just as they think they are making progress, however, many HRD professionals find themselves facing a cutthroat world of reengineering and downsizing. Investors intensify their demands for high returns from companies not prepared to compete. Anything that isn’t mainstream business or marketing (such as HRD) is often bounced from leadership agendas.
The extent to which organisations link training to business strategy
The competitive key is contact. When this contact reaches beyond the day-to-day environment to engage people in knowledge sharing, problem solving, and creative thought, phenomenal synergies are possible. These synergies are good for the business, and they are stimulating to individuals. They can help establish communities of practice and interest. They can help people develop skills of dialogue and inquiry, of meeting management and creative exploration. They can suggest and reinforce organic gatherings—electronically or in other ways. They can publicize and help create identities for knowledge groups. (Holton, 1996).
Whenever possible, the individual or the team should play an active and participative role. The manager’s job is to provide context, be a partner, and hold people to high standards of practice. Individuals who understand the strategy and business context can set and negotiate many of their own goals. They can monitor customer feedback and, if they receive timely and useful performance status information, can easily self-correct. Teams can and should help select their colleagues and even their leaders.
The difference between actual and ideal practices may seem subtle. But HR practices that are open and that expect individuals to be key players in making them work reduce sabotage and demoralization. They make it easier for leaders to play the coaching and support role. And putting more responsibility and tools in the hands of individuals and teams frees HRD people to spend more time on strategy implementation, consulting, and design.
Some organizational structures make knowledge sharing difficult, for example. Others allocate work in ways that overload or demotivate and bore people. Some attempts to streamline the organization remove so much muscle and capacity that creativity, planning, and reflection are impossible. There is often not enough slack left for risk taking and crisis response. The HRD role is to be a voice for optimal work designs—or for compensating strategies to offset negative effects (for example, frequent breaks for people doing highly concentrated but repetitive work, such as air traffic controllers). (Holton, 1996).
Business processes and systems are also important for the HRD professional to watch and influence. Many big information systems are being installed today, at great cost.
Vertical and horizontal integration of personnel and developmental functions
In the early days of disk-based computer-based training (CBT), there was a rush to pour content into electronic tutorials. At the time, computers could only display black and white text. Audio, video and graphics were years away. A few innovative designers made the most of the limited media and created engaging simulations, quizzes, and even games. But this was the exception to the rule, and most early programs were nothing more than books on a computer. Learners were forced to read the text on the screen passively, often clicking the “Enter” button or space bar to move on. Eyestrain and boredom, rather than improved learning and performance, often were the end results. These types of programs, derisively known as page-turners, tainted the image of CBT for many years.
The role of training policies within organisations
Performance analysis is the initial step, but it’s not the only analysis that should occur. Think of analysis as happening continuously and in waves. The first wave is performance analysis. Engineering companies call it “scoping.” Why do they do performance analysis? Because excitement about training must be tempered with the recognition that its key messages do not readily or automatically transfer to the workplace. A broad and systemic approach to training that involves key players and policies is critical for improved performance.
Performance analysis is required because there are numerous ways to achieve higher performance, not just training. In another example, the Engineering Company wanted to figure out why its recruiters were not meeting their goals and what to do about it. A revision to the recruiter training curriculum seemed like an obvious solution, but performance analysis revealed a vast sea of other significant recruiting issues, such as personnel selection procedures, recruiting office locations, and medical entry standards for recruits. In this situation, revised training for recruiters would be a worthy but not sufficient solution. The Engineering Company realized that it had to bring numerous players, not just training professionals, to the table to address the problem. As you can see, performance analysis not only defines what the need or opportunity is but also helps to identify how to move the people and the organization in the right direction. It allows people up and down the organization, and outside it, such as suppliers and customers, to have input. The result is a better and more palatable effort.
To ensure consistency and replication of studies, operating standards must be developed and applied as the process model is utilized to develop ROI studies. It is extremely important for the results of a study to stand alone and not vary depending on the individual conducting the study. The operating standards detail how each step and issue of the process will be handled. (Holton, 1996).
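As a purely illustrative sketch (the text does not specify the formula behind such studies), training ROI is often expressed as the ratio of net programme benefits to programme costs; the short Python example below, using hypothetical figures and function names, shows the arithmetic that an operating standard might fix in place.

```python
# Illustrative sketch only: a commonly used way of expressing training ROI.
# The figures and function names are hypothetical, not drawn from the text.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """Benefit-cost ratio: programme benefits divided by programme costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """ROI (%): net programme benefits divided by programme costs, times 100."""
    return (benefits - costs) / costs * 100

# Example: a programme costing 80,000 that yields 120,000 in measured benefits.
print(benefit_cost_ratio(120_000, 80_000))  # 1.5
print(roi_percent(120_000, 80_000))         # 50.0
```

Fixing conventions such as which costs are fully loaded and how benefits are isolated from other influences is exactly the kind of operating standard that keeps results comparable regardless of who conducts the study.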
The traditional ‘training’ roles associated with the management of training in organisations and their needs
The world of learning is changing drastically, resulting in major implications for the HRD profession. In light of these changes, we must ask ourselves: Where are we going? What are we all about? What roles will add value for the enterprises and people we serve? To help answer these questions, this chapter takes a look at some of the projected contributions HRD practitioners will make, the roles they will assume, and the competencies that will be necessary to perform these roles. (Solomon, 1993).
People have the tools on their desks, at their homes, on the plane, and in their pockets to access information as events and thoughts occur. They don’t need the old intermediaries (the layers of management, the teachers, even the mailperson or the phone operator) to connect to each other and to filter information. At the same time, the facts are in, reported in every business and economic journal: organizations where people are informed, where they participate, and where they act quickly and without fear are more profitable. They attract the best talent. And that talent stays! Customers of these more participative places are more satisfied. They come back, again and again. Yet the traditions of authoritarianism and risk-averse bureaucracy continue to dog our institutions. Many people in leadership roles are unprepared to support a high-involvement, high-performance environment. They are unconscious of their own use of power and unaware of the impact of their rank. For their part, the people they lead often defer to authority, expect to be taken care of, and wallow in dependency and blaming.
The emerging roles of ‘change agent’ and ‘manager of learning resources’
Traditionally, learners have had everything done for them by instructional designers, who analyzed performance problems, assessed training needs, examined work settings and learner characteristics, formulated instructional objectives, prepared test items and other metrics by which to measure learning upon completion of instruction, sourced instructional materials to meet the instructional objectives, selected the media by which instruction would be delivered, organized and delivered the content, and evaluated the results. But the role for instructional designers is changing.
Traditional definitions of instructional design have emphasized what is done to build instruction. For instance, it has been said that the design of consumer goods provides an appropriate, if surprising, model for the design of training. Like an automobile, a typewriter, or a television set, a training program is something to be planned, engineered, developed, tried out, sent back to the drawing board, tried out again, and delivered in final form.
More recent definitions suggest that the goal of instructional design, and by implication of instructional designers, is to achieve results. In that sense, “the field of instructional design is associated with analyzing human performance problems systematically, identifying the root causes of those problems, considering various solutions to address the root causes, and implementing the solutions in ways designed to minimize the unintended consequences of corrective action”. The instructional designer is thus interested in achieving improved outputs or outcomes (performance), rather than merely creating a commodity (training) or carrying out an activity (the design of training).
Increasingly, however, instructional designers realize that instruction and results are driven by the internalized mental processes of the learners themselves. Moreover, instructional design itself is misnamed if the goal is to achieve results, considering that only 10 percent of all performance problems are attributable to deficiencies of individual knowledge, skill, or attitude, while 90 percent are traceable to organizational or management deficiencies.
Most organizations implement some sort of formal performance appraisal and review progress at least annually. Performance reviews are intended primarily to produce ratings (judgments) of an employee’s performance to aid in decisions about whether to retain, promote, or place the employee on the “fast track.” Employee development and performance review have fundamentally different purposes. The purpose of performance review is to render summative judgments about employee performance that will guide future employment status. The purpose of employee development is to guide future learning to provide the greatest possible opportunity for employees to succeed. In preparing for a performance review, employees will adopt a “selling” mentality, presenting their capabilities in the most positive, even exaggerated, manner. In a development review, the purpose is to win learning resources; thus, the proper stance is to understate one’s capabilities. (Solomon, 1993).
The main training and development model used for implementing training policies
To ensure success, it is also important to anticipate potential pitfalls. One pitfall is “failing to see the forest for the trees.” Given the practical constraints that normally accompany any project, it is important to focus on essential details only. In fact, useful adaptations of the methods are likely to occur in the iterative process of comparing the scope and depth of detail with the overall goals as the project unfolds. It is worth repeating that developing an initial prototype of the application early on will contribute to a clear sense of which details are essential and which are distractions. As will be discussed next, understanding how to apply these methods is important for both effectiveness and efficiency.
Another sign of a well-done cognitive task analysis is that the content obtained is sensible and suited to the application. For training applications, this probably entails information about decisions, situation assessment, adaptation of standards to contexts, management of transitions, and teamwork. Often, job incumbents and instructors will say, “I knew that,” because the content identified is a necessary and important part of their job; yet this content was not in training materials, technical objectives, or other job documents. In this sense, an important contribution of cognitive task analysis is to provide descriptive expressions for workers’ mental experience. They tacitly and regularly employ cognitive strategies, mental models, cue recognition, decision making, and the like, but without specific methods they do not have an effective means of communicating or discussing these phenomena. (Solomon, 1993).
Leaders of learning organization development and research all refer to one or more cycles of learning, each with specific stages. When operational definitions are applied to the specific parts of a learning cycle, there is the potential for measurable evidence that learning happened and the knowledge gained was used. Also, like the progression of data to information to knowledge, each stage adds value. The cyclical aspect of learning is simply that the last step feeds back information that influences the next iteration; this is the continuous part of continuous learning. Each step of the learning cycle must be supported or the full value will be lost.
Continuous learning and systems for knowledge use usually involve support from many strategies, which may include people, processes, technology, or all three. For example, one high-growth consulting firm has “people managers” who help individual consultants with personal development plans and resources and who capture reusable contributions to business products and processes. Content assets from the process of meeting particular client needs are managed and retrieved by corporate “cybrarians.” The firm also uses brown bag sessions with staff to convey lessons learned and discuss applications; over time, these sessions progressed from the goal of equipping participants to discuss the material fluently to the goal of preparing them actually to demonstrate the new behaviours after the session. Similarly, one of the top three automakers in the United States uses a knowledge management system (KMS), a searchable database, to manage intellectual capital. The company captures lessons learned in an electronically accessible form.
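Purely as an illustration of the idea of a searchable lessons-learned store (the text does not describe how the automaker’s KMS is actually built, and all class, field, and example names below are hypothetical), such a system can be sketched as a small keyword-searchable collection:

```python
# Hypothetical sketch of a searchable lessons-learned store; not a description
# of any particular company's knowledge management system.
from dataclasses import dataclass, field

@dataclass
class Lesson:
    title: str
    body: str
    tags: set = field(default_factory=set)

class LessonStore:
    def __init__(self):
        self._lessons = []

    def add(self, lesson):
        """Capture a lesson learned so that it is electronically accessible."""
        self._lessons.append(lesson)

    def search(self, keyword):
        """Simple keyword match over title, body, and tags."""
        kw = keyword.lower()
        return [
            lesson for lesson in self._lessons
            if kw in lesson.title.lower()
            or kw in lesson.body.lower()
            or any(kw in tag.lower() for tag in lesson.tags)
        ]

# Example usage
store = LessonStore()
store.add(Lesson("Supplier lead times", "Order long-lead parts twelve weeks ahead.", {"procurement"}))
print([lesson.title for lesson in store.search("supplier")])
```

In practice the retrieval layer would be far richer (full-text indexing, access control, links back to client engagements), but the point the text makes holds at any scale: the value lies in capturing lessons in a form that can be searched and reused.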
Although generally not openly discussed, yet another reason for not conducting more rigorous evaluations is that the training function may have everything to lose and nothing to gain from the data. If there is an ongoing commitment to employee training and development by the organization, no news may be good news when it comes to documenting training impact. Decisions to implement new or maintain existing training programs may be characterized as “nature versus nurture” decisions. “Nature” decisions are made out of habit or by following popular fads; they require no data or input to make. Despite well-established instructional systems models, “nurture” decisions driven by needs assessment or evaluation of training effectiveness may be the exception, not the norm. Given the cost and perceived difficulty of conducting training evaluation, training departments may understandably be prone not to evaluate existing programs. The problem with this logic, however, is that it addresses only one of three purposes for evaluation. Although an absence of data may prolong course offerings, it may also mean there is a lack of information to use for revising courses, retaining trainers, preparing trainees, and selling the training to new constituents. In the end, failure to evaluate may result in ineffective courses or a failure to open new markets for good courses, and each of these results puts more stress on the training function.
The impact of rapid technological change and the growth of e-learning on training policies within organisations
Another common performance goal is improved cycle time. For example, a large office technology company with a core value of organizational reflection considers time to market the critical factor for its business success; leaders say a new technology is only successful if they get there first. To compete on a global level, they frequently find that they must look for alliances with competitors. Scanning for these external connections has paid off. Alliances make it possible to provide the customer solutions required by a rapidly changing market.
A prestigious business research firm offers another example. Its leaders recognize that the speed of business is driven by technology. New technologies are changing customer expectations about how fast they can get answers to research questions. Understandably, cycle time reduction is a competitive advantage. Accordingly, the company has invested in processes to learn from customer relationships (through client books that record reflections on what works and what does not) and then design the business process for speed. In fact, it has used anticipatory learning to deliver not only what customers asked for but also things they did not know they needed. Through anticipation and planning for technological changes, the group builds capacity to deliver in parallel with actual information technology releases. Investing in a learning organization includes a focus on information technologies and the customer relationship; both are external connections that are vital to the business. Learning is intertwined with the business process and structured so that the whole team is well informed. Knowledge is obtained from client engagements, recorded in client books, and used to improve individual projects. The result is that cycle time for research production has decreased dramatically. The company now handles more business with the same number of staff. (Solomon, 1993).
Links between performance management processes and evaluation
As the concepts of, practices in, and research on learning organizations have matured, it has become clear that leaders and facilitators of learning organization development benefit from viewing training and learning as interventions that are part of the larger systems that influence training transfer. In fact, training transfer is only one influence on the typical learning organization goals of increased business capacity and performance. For example, a learning organization intervention would include both learning and nonlearning components (environmental, motivational, and performance supports). Also, systems are defined from multiple perspectives, both individual and organizational.
Systems thinking may be defined in part by the integration of its tangible products. For example, in a global pharmaceutical company, the brand and product management department measures systems thinking by the quality of the brand plan, the extent to which it addresses the value proposition (exactly what is offered to the customer) of each pharmaceutical product relative to the diseases that it addresses, and its position against competitive drugs.
Based on the purpose of any coaching undertaken, organisations should also identify who can most appropriately evaluate progress against the desired outcomes.
References
Corrall, Sheila (2002). Skills for the future. Paper presented to the LAI/CILIP NI annual conference, Belfast.
Crawford, John (2000). Evaluation of library and information services. 2nd edition. London.
Gallacher, Cathryn (1999). Managing change in libraries and information services. London.
Holton, E. F., III (1996). The flawed four-level evaluation model. Human Resource Development Quarterly, 7, 5–21.
Marrett, Paul (2002). Information law in practice. 2nd edition. Aldershot: Ashgate.
Orna, Elizabeth (1999). Practical information policies. 2nd edition. Aldershot: Gower.
Parker, Sandra and Crawford, Marshall (2001). Performance measurement and metrics. Chapter 6 in Scammell, A., ed. Handbook of Information Management. 8th edition. London: Aslib.
Pugh, Lyndon (2000). Change management in information services. Aldershot: Ashgate.
Schlicke, Priscilla (2002). Born- and re-born-digital. Information Management Report, pp. 1–4.
Solomon, C. M. (1993). Simulation training builds teams through experience. Personnel Journal, 72, 100–108.
Thebridge, Stella and Matthews, Graham (2000). Skills and knowledge required for preservation management. Personnel, Training and Education, pp. 6–9.
Thornton, Steve (2001). Information audits. Chapter 5 in Scammell, A., ed. Handbook of Information Management. 8th edition. London: Aslib.
Webb, Sylvia P. (1997). Thoroughbred or hybrid: the information manager in the 1990s? Information Management Report, pp. 1–4.
Webb, Trevor J. (1998). Researching for business: avoiding the “nice to know” trap. London: Aslib.