Defining Virtual Environments
Virtual Environments (VE) can mean very different things to different people, depending on their backgrounds (see link). Since this page focuses on Training in Virtual Environments, we will limit the term to mean software-generated environments that are useful for technology-based training and instruction. Examples include:
- Serious Games (training/learning occurs in a game-based virtual environment),
- Simulations (training/learning occurs in a non-game virtual environment),
- Virtual Reality (requiring goggles or a headset to properly view the content, e.g., Oculus Rift, C.A.V.E.),
- Virtual 360º Spheres (a subset of virtual reality for immersive experience or tele-presence – e.g., Virtual Museums/Tours),
- Augmented Reality (requiring some kind of interface/goggles to combine virtual content with the physical world, e.g., HoloLens),
- Tangible Media (remote projections of physical self and objects),
- Holograms (virtual agents capable of human-like or A.I. conversations – e.g., Virtual Einstein),
- MUD, MOO, Virtual worlds (somewhat archaic virtual space for social interaction and collaboration — many of them are glorified chat rooms),
- Holodeck (a fictitious virtual-environment training facility in the Star Trek universe – further explanation).
Simulators (such as flight simulators and human manikins) are a specific case of instructor-led virtual training. Instead of putting the learner inside a virtual environment, the setup is a simulated (virtual) training environment that allows an instructor to create challenges (on the fly) for the learner to overcome. The student interacts with a virtual object (i.e., a simulator, such as a flight dashboard or a manikin) in front of them while the instructor manipulates the behavior of the virtual object via a computer/console. This type of virtual training is very common in flight schools (commercial and military) and medical schools (surgery and nursing).
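The instructor-in-the-loop setup above can be sketched as a small event-injection loop. This is a generic illustration of the idea, not the design of any particular simulator; the `FlightDashboard` class, its state fields, and the injected faults are all hypothetical.

```python
import queue

# Sketch of instructor-led simulation (assumed design, not a specific
# product): the instructor's console pushes fault events into a queue,
# and the simulation loop applies them to the virtual object each tick.
faults = queue.Queue()

class FlightDashboard:
    """Hypothetical virtual object the learner interacts with."""
    def __init__(self):
        self.state = {"engine": "ok", "hydraulics": "ok"}

    def apply(self, fault):
        system, status = fault
        self.state[system] = status

def instructor_inject(system, status):
    # Called from the instructor's console, on the fly.
    faults.put((system, status))

def simulation_tick(dashboard):
    # Drain pending instructor events before the next frame is shown
    # to the learner.
    while not faults.empty():
        dashboard.apply(faults.get())

dashboard = FlightDashboard()
instructor_inject("engine", "fire")   # instructor creates a challenge
simulation_tick(dashboard)
print(dashboard.state["engine"])      # -> fire
```

Decoupling the console from the simulation loop through a queue is what lets the instructor create challenges mid-scenario without pausing the learner's session.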
At the Virtual Environment Lab (V-Lab), we have thus far focused on Serious Games assessment and analytics. As the virtual reality market continues to mature, we will also be looking into Virtual Reality training assessment.
The Scope of Training & Learning with VE
Different procedures (training scenarios, serious games) require different amounts of conscious vs. implicit cognitive effort. Hence, skill acquisition is highly dependent on the procedure and is not easily transferable. For transfer of learning to occur, the two procedures need to be (seen as) generalizable by the individual. That is, if an individual can recognize (a cognitive process) common links between the two procedures, it may be possible to transfer the learning/doing (a motor process).
In a climate where many (aging) experts and masters are retiring, leaving the workplace to a largely unskilled workforce, knowing how and what to train can reduce training costs tremendously. Moreover, since the cost of setting up the virtual environment has already been incurred, it makes sense to reuse the setup for in-depth training research and development as a means to recoup the investment.
Once we have determined the scope of the training (taking a learner from Novice to Proficient, through the Competent level), we can go about setting up the assessments needed to measure the skill(s) acquired. We already know that sufficient training will be required to propel an individual from the lower left (Novice) upward to the right (increasing skill). Such training involves repetition (with changing conditions) over time. The graph on the right uses log time as the x-axis and log practices as the y-axis to better illustrate the 10-year training period and/or the 10,000 practice sessions (i.e., deliberate practice) that are often attributed to the creation of a master.
Drills, Practice, and Proficiency
Some pundits love to bash drills as “drill-and-kill.” There are two ways to drill: (a) boring, mindless drill that is pure rote, without knowing why you do it, and (b) deliberate practice, the correct way of “drill and practice” that gets you to the next level of expertise. Without repetition and practice for performance improvement, there is no way to improve a skill.
But even deliberate practice can get boring if you are just repeating the same motion a thousand times. To relieve the boredom of drills, variation is the spice that increases engagement and broadens the learning exposure. For example, do not simply repeat the addition of 50+50 up to a hundred times (this type of drill is boring). Instead, think of how many ways there are to add to 100! This is what shu-HA-ri means by “break from the way.”
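One simple reading of that exercise can be checked in a couple of lines. The interpretation below (ordered pairs of non-negative whole numbers) is our assumption; the text leaves "ways to add to 100" deliberately open-ended:

```python
# How many ordered pairs of non-negative integers (a, b) satisfy a + b = 100?
# (One narrow interpretation of the drill -- allowing three addends,
# negatives, or fractions opens the exercise up much further.)
pairs = [(a, 100 - a) for a in range(101)]
print(len(pairs))  # -> 101, from (0, 100) through (100, 0)
```

Even this narrowest reading yields 101 distinct drills in place of one, which is the point: the variation, not the answer, is what keeps the practice deliberate.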
Novice learners who have picked up the basic moves need to gradually improve their performance through (self-)correction to lower both the error rates (improve accuracy) and time requirement (improve proficiency) for completing the task or procedure.
Strategy vs. Speed
It is important to note that proficiency is not determined by speed alone; in some conditions, better planning can be more important than being the first to finish. Loh and Sheng (2013) showed that time and strategy are both important. In some cases, strategy may be more important than time, because time constraints can also promote undesirable effects such as unnecessary risk taking or ‘gambling.’
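Comparing a learner's in-game behavior against an expert's can be sketched generically as a similarity score over action traces. To be clear, this is not the Maximum Similarity Index from the papers below; it is a stand-in using Python's `difflib` sequence matching, and the action traces are invented for illustration:

```python
from difflib import SequenceMatcher

# Illustrative sketch only -- NOT the MSI metric from Loh & Sheng.
# Scores how closely a novice's action trace tracks an expert's trace,
# so that strategy (the order of actions) matters, not just finish time.
expert = ["scan", "plan", "move", "collect", "exit"]          # hypothetical
novice = ["move", "scan", "move", "collect", "move", "exit"]  # hypothetical

def trace_similarity(a, b):
    """Similarity in [0, 1]; 1.0 means identical action sequences."""
    return SequenceMatcher(None, a, b).ratio()

print(f"novice vs. expert: {trace_similarity(expert, novice):.2f}")
print(f"expert vs. itself: {trace_similarity(expert, expert):.2f}")  # 1.00
```

A score like this captures what a raw completion time cannot: a learner who rushes to the exit may post a fast time while scoring low on similarity to expert play.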
- Loh, C. S., & Sheng, Y. (2015). Measuring the (dis-)similarity between expert and novice behaviors as Serious Games Analytics. Education and Information Technologies, 20(1), 5-19. DOI: 10.1007/s10639-013-9263-y
- Loh, C. S., & Sheng, Y. (2014). Maximum Similarity Index (MSI): A metric to differentiate the performance of novices vs. multiple-experts in serious games. Computers in Human Behavior, 39, 322-330. DOI: 10.1016/j.chb.2014.07.022
- Loh, C. S., & Sheng, Y. (2013, July). Performance metrics for serious games: Will the (real) expert please step forward? Proceedings of the 18th International Conference on Computer Games: AI, Animation, Mobile, Interactive Multimedia, Educational & Serious Games (CGAMES 2013). Louisville, KY. DOI: 10.1109/CGames.2013.6632633