A downloadable game

Ashley Collison

Lochie Frawley

Joe Mallick

Mitch Clifford


Introduction / Overview (CRA: criteria #1 - 5%)

Our crime scene investigation application leverages virtual reality (VR) technology to provide immersive training and testing for crime scene investigator recruits. With pre-built crime scenes, realistic gear handling, evidence identification, and interactive tools, it offers a hands-on learning experience that surpasses traditional methods. Our goal is to equip future investigators with the practical skills, attention to detail, and critical thinking abilities needed for successful crime scene analysis, ultimately enhancing their preparedness and effectiveness in the field. The application lets trainees hone their skills in a safe and controlled virtual environment, eliminating the risk of contaminating or mishandling real-world crime scenes and avoiding the expense and time of building physical mock scenes.

While the bulk of the project resembles what was planned in Assignment 4, some features have been altered. The UV light is now attached to the camera rather than being a separate object as initially planned, and two versions of the simulation were added: one for training and one for testing. This split creates a smoother learning experience, letting users become familiar with the simulation before they are tested.

We decided to no longer pursue interacting with digital devices for digital evidence at this stage of the application. This feature could still appear in a future iteration but was not suited to our current scope.

A bug was encountered in the final Android build of the application where the body of the victim was no longer present. It was handled by simply removing the body, as this does not hinder the experience of the simulation. Because this was a late change, the body is still visible in the provided footage of the application.

Technical Development (CRA: criteria #2 - 25%)

Inventory System:

The inventory system is used to switch between investigation tools in the simulation. The user presses a button on the Oculus controller, which cycles through the inventory, provided the user has picked up the tools in the starting area. The interactWithDoors script handles the user casting rays at the equipment, which then triggers the InventoryController script.

The InventoryController script utilises a cyclic linked list data structure to store each inventory node. The AddItem function is called, which checks which item has been interacted with and adds it to the inventory. The SwitchInventoryItem script handles the user pressing the A button on the Oculus controller, which cycles through the inventory, allowing the user to seamlessly switch between tools during the investigation.
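A cyclic list of this kind could look like the minimal sketch below (member names are illustrative; the actual scripts also handle hand models and sounds, which are omitted here):

```csharp
using UnityEngine;

// Minimal sketch of a cyclic-linked-list inventory (illustrative names).
public class InventoryController : MonoBehaviour
{
    // Each node points to the next; the last node points back to the first.
    private class InventoryNode
    {
        public GameObject item;
        public InventoryNode next;
    }

    private InventoryNode current;                  // the currently equipped tool
    public GameObject CurrentItem => current?.item;

    // Called by the raycast interaction when a tool is picked up.
    public void AddItem(GameObject item)
    {
        var node = new InventoryNode { item = item };
        if (current == null)
        {
            node.next = node;                       // a single node loops to itself
            current = node;
        }
        else
        {
            node.next = current.next;               // splice the new node in after current
            current.next = node;
        }
    }

    // Called when the A button is pressed; cycling never falls off the end.
    public void SwitchItem()
    {
        if (current == null) return;
        current.item.SetActive(false);              // hide the old tool (illustrative)
        current = current.next;
        current.item.SetActive(true);               // show the newly selected tool
    }
}
```

Because the list is circular, switching tools needs no bounds checks or wrap-around logic: advancing past the last tool simply lands back on the first.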

interactWithDoors: 

interactWithDoors is responsible for all interactions that the user performs with the left controller, including opening and closing doors, picking up equipment and activating buttons. The class defines five GameObjects named openDoorRayOrigin0 through openDoorRayOrigin4: empty objects positioned around the hand that serve as origin points for raycasts. The script also defines the raycast distance as a float and holds a reference to the InventoryController.

On Start, interactWithDoors uses GetComponent to obtain the InventoryController.

On Update, interactWithDoors checks whether OVRInput.Button.PrimaryIndexTrigger (the left trigger) is in the down position. If so, it sets the boolean “triggered” to false and shoots a ray from each of the five openDoorRayOrigins. Using five raycasts demands less accuracy from the user when interacting with objects, resulting in a smoother experience. If a raycast detects a collision with an object tagged “Door” and it is the first raycast to trigger (indicated by the triggered boolean being false), the script uses SendMessage to run the “OpenUp” function on the hit door object. Every door and drawer has an OpenUp function that plays the opening animation supplied with the apartment asset pack; this function was slightly amended to also raise the reportDoorCheck event, which notifies the gameManager to update the currentOpenedCount. Within this condition, “triggered” is also set to true, stopping other raycasts from activating off the same object.

If a raycast detects a collision with an object tagged “Equipment” and it is the first raycast to trigger, the script adds the item to the InventoryController and sets the object to inactive, simulating the piece of investigation equipment being picked up. Again, “triggered” is set to true to stop other raycasts from activating off the same object.

If a raycast detects a collision with an object tagged “Button” and it is the first raycast to trigger, the script sets the object to inactive. These buttons turning inactive triggers other behaviour in the program, such as teleporting the player to the start or end location, or enabling training or test mode. Again, “triggered” is set to true.
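Put together, the Update logic might look roughly like the sketch below (a simplification: the real script uses five individually named origin fields rather than an array, and the OpenUp animation and event plumbing live in the door scripts):

```csharp
using UnityEngine;

// Sketch of the five-origin raycast interaction on the left controller.
public class interactWithDoors : MonoBehaviour
{
    public Transform[] rayOrigins;            // the five empty objects around the hand
    public float rayDistance = 2f;
    public InventoryController inventoryController;

    void Update()
    {
        if (!OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger)) return;

        bool triggered = false;               // only the first hit acts on an object
        foreach (Transform origin in rayOrigins)
        {
            if (triggered) break;
            if (!Physics.Raycast(origin.position, origin.forward, out RaycastHit hit, rayDistance))
                continue;

            if (hit.collider.CompareTag("Door"))
            {
                hit.collider.gameObject.SendMessage("OpenUp"); // plays the open animation
                triggered = true;
            }
            else if (hit.collider.CompareTag("Equipment"))
            {
                inventoryController.AddItem(hit.collider.gameObject);
                hit.collider.gameObject.SetActive(false);      // "picked up"
                triggered = true;
            }
            else if (hit.collider.CompareTag("Button"))
            {
                hit.collider.gameObject.SetActive(false);      // other scripts react to this
                triggered = true;
            }
        }
    }
}
```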

Taking Photos and Bagging Evidence:

The user is required to take pictures of the evidence before bagging it up. This is achieved in the application by using raycasting, similar to the way that doors and drawers are interacted with in the simulation.

The approaches taken to photographing and bagging evidence are very similar. In both cases a script is run: BagEvidence for bagging and TakePicture for photographs. In both scripts, rays are shot out from either the camera or the evidence bag, and on hitting the corresponding evidence item they perform the action of taking an image or bagging the evidence. Each piece of evidence has two invisible box colliders: a larger one for taking photographs that is only interactable when the camera is in the user's hand, and another for bagging the evidence that activates when the evidence bag is in the user's hand. The box collider for taking a picture is tagged “EvidenceBox”, while the actual item is tagged “Evidence”. The photograph collider disappears once a photo is taken, preventing multiple photographs of the same evidence item, and the item itself disappears once the evidence is bagged. For both actions the count in the game manager is updated to track progress, and sound effects were added to both actions to increase immersion and action feedback.
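A sketch of what the photograph half could look like (assumed field names and trigger button; BagEvidence would mirror this with the “Evidence” tag and the bag in hand):

```csharp
using UnityEngine;

// Sketch of TakePicture: rays from the camera model hit the larger
// "EvidenceBox" collider around each evidence item (assumed names).
public class TakePicture : MonoBehaviour
{
    public Transform[] rayOrigins;    // raycast origins on the camera model
    public float rayDistance = 3f;
    public AudioSource shutterSound;  // feedback when a photo is taken

    void Update()
    {
        // Assumed trigger mapping for taking the photo.
        if (!OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger)) return;

        foreach (Transform origin in rayOrigins)
        {
            if (Physics.Raycast(origin.position, origin.forward, out RaycastHit hit, rayDistance)
                && hit.collider.CompareTag("EvidenceBox"))
            {
                shutterSound.Play();
                // Disable the photo collider so the same item can't be photographed
                // twice; a reportPictureTaken event would also be raised here for
                // the gameManager.
                hit.collider.gameObject.SetActive(false);
                break;
            }
        }
    }
}
```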

Starting Zone and Ending Zone

In order to provide essential information on how to interact with the program, the player begins in the “Starting Zone”. The zone contains an introduction to the program along with instructions on controls and what to do. It also contains two buttons, one for starting a training run of the program and one for a testing run. Both buttons teleport the player to the entry room of the crime scene.

As a conclusion to the program, when the player presses the “Finish Button”, they are teleported to the “Ending Zone”. Here they can view their final score, which is displayed on the wall via a textbox whose content is set by the displayScore function in the gameManager script.

Train and Test

The two modes available to the user are training and testing. The purpose of these modes is to first introduce the user to the environments they will be working in, then to test the user on their skills.

Training provides indicators for each piece of evidence, along with notes on each piece’s significance, in the form of textboxes. The purpose of these textboxes is to guide the user to the location of each piece of evidence along with an indication of what kinds of evidence should be sought out.

Testing removes the indicators attached to evidence so the user must find them on their own. 

Buttons

In order for the player to select which mode they wish to use (train or test), we provided buttons for each, along with a button to end the session.

The script PlayerStart detects when a button becomes inactive (via the interactWithDoors script). Both the train and test buttons teleport the player to the entry room where they can begin the session; additionally, the test button removes the textboxes attached to each piece of evidence. The finish button teleports the player to the end zone.

The buttons teleport the player by setting the PlayerController's position to that of empty objects placed at each destination.
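In code, that teleport can be as simple as the sketch below (assumed names; the real PlayerStart also hides the evidence textboxes when the test button is used):

```csharp
using UnityEngine;

// Sketch of teleport-on-button-press: interactWithDoors deactivates a
// button, and this script reacts by moving the player rig.
public class PlayerStart : MonoBehaviour
{
    public GameObject trainButton, testButton, finishButton;
    public Transform playerController;        // the OVR player rig
    public Transform entryMarker, endMarker;  // empty objects at each destination

    private bool started, finished;           // teleport only once per button

    void Update()
    {
        if (!started && (!trainButton.activeSelf || !testButton.activeSelf))
        {
            playerController.position = entryMarker.position;
            started = true;
        }
        if (!finished && !finishButton.activeSelf)
        {
            playerController.position = endMarker.position;
            finished = true;
        }
    }
}
```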

Entry Zone

When the player picks a mode, they are teleported to the “Entry Zone”. This zone contains the equipment they require to enter the scene, along with an equipment checklist. The checklist is monitored by the displayScore function in the gameManager script, and if any of the three equipment items are not checked off, the player's score will be zero. The entry zone also contains the “Finish Button”.

Tick Script

The Tick script manages the checklist for the equipment items required by the user. It detects when the corresponding item of equipment is deactivated (via the raycast interaction) and ticks off the linked item in the checklist.

Scene Picker

A potential goal of our project is to create multiple training scenes with multiple variations in each. While the current build contains only the one scene, the apartment, the ScenePicker script adds some variation to the environment. There are five different types of evidence in the scene: a knife, a phone, used wine glasses, a pair of toothbrushes, and bloodstains on the floor. Three of those, the knife, the phone and the glasses, have three alternate locations each, and the ScenePicker script determines where each shall be placed.

The script first deactivates each piece of evidence, then randomly generates a number between 1 and 3 for each movable item. Each alternate spawn location has a corresponding number between 1 and 3, so the generated numbers determine the placements and create variation in the scene.
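A sketch of that randomisation, assuming each item's three candidate placements are duplicate objects of which exactly one gets enabled (the sketch picks by array index rather than a 1-to-3 number, which is equivalent):

```csharp
using UnityEngine;

// Sketch of ScenePicker: each movable evidence item has three candidate
// spawn objects; one is enabled at random on scene load.
public class ScenePicker : MonoBehaviour
{
    // Each array holds the three alternate placements of one evidence item.
    public GameObject[] knifeSpawns, phoneSpawns, glassesSpawns;

    void Start()
    {
        PickOne(knifeSpawns);
        PickOne(phoneSpawns);
        PickOne(glassesSpawns);
    }

    static void PickOne(GameObject[] spawns)
    {
        foreach (GameObject s in spawns) s.SetActive(false); // deactivate all copies
        int choice = Random.Range(0, spawns.Length);         // pick one location
        spawns[choice].SetActive(true);
    }
}
```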

Interior Design:

The layout of the apartment needed different objects, furniture and rooms to simulate a realistic living space. This not only served to immerse users in the environment but also played a practical role in providing hiding spots for evidence and clues. Attention to small details, such as wall art, decorations, personal items and dishware, enriched the immersive experience and ensured the environment felt natural to the user rather than fabricated or videogame-like.

gameManager:

The gameManager script is responsible for tracking the user's progress within the simulation. At the head of the class are float variables that track the total number of pictures, bagged evidence items and doors in the scene, as well as the current number of each that the player has completed. The three GameObject variables and three booleans related to the blood tiles ensure that users cannot take multiple pictures of the same bloody floorboard and receive multiple points: each bloody floorboard identified with the UV function and photographed is worth only one point.

The gameManager script subscribes to events from 12 different scripts when it is enabled. All of these events report when a door or drawer has been opened, a piece of evidence has been photographed or a piece of evidence has been bagged. These events are unsubscribed when the gameManager script is disabled. 
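The subscribe/unsubscribe pattern presumably looks something like this sketch (hypothetical event wiring for a single reporting script; the real gameManager repeats it for all twelve):

```csharp
using System;
using UnityEngine;

// Illustrative reporting script: a door exposes an event the gameManager hears.
public class DoorScript : MonoBehaviour
{
    public event Action<bool> reportDoorCheck;

    public void OpenUp()
    {
        // ...play the opening animation...
        reportDoorCheck?.Invoke(false); // false = not counted before
    }
}

public class gameManager : MonoBehaviour
{
    public DoorScript door;
    private float currentOpenedCount;

    void OnEnable()  { door.reportDoorCheck += addToCount; } // start listening
    void OnDisable() { door.reportDoorCheck -= addToCount; } // avoid stale handlers

    // One of the handler functions described below.
    void addToCount(bool alreadyCounted)
    {
        if (!alreadyCounted) currentOpenedCount += 1;
    }
}
```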

The addToCount function is called when any of the reportDoorCheck events are triggered. It increments the currentOpenedCount by 1, recording that the player has opened a door or drawer. The function accepts a boolean which tracks whether the door has already been counted.

The addToPictureCount function is called when any of the reportPictureTaken events are triggered. It increments the currentPictureCount by 1, recording that the player has photographed a piece of evidence. The function accepts a boolean which tracks whether the picture has already been counted.

The addToPictureCountUV function is called when any of the reportUVPictureTaken events are triggered. It also increments the currentPictureCount by 1; the primary difference from addToPictureCount is additional logic to make sure that each piece of UV evidence can only be photographed once.

The addToEvidenceBagCount function is called when any of the reportBaggedEvidence events are triggered. It increments the currentBaggedEvidenceCount by 1, recording that the player has bagged a piece of evidence. The function accepts a boolean which tracks whether the evidence has already been counted.

The displayScore function is called on Update. It takes all of the floats that record the player's progress and converts them into final percentages. If the user remembered to collect all the essential equipment before entering the room, their results are displayed when they finish their investigation; if the user forgot or neglected to equip these items, their final score will be 0, regardless of how thorough their investigation was.
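The core of that score conversion could be sketched as follows (assumed field and UI names; tickCount is the equipment checklist counter described next):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of displayScore: progress counts become percentages, and a
// missed equipment check zeroes the result (assumed names).
public class ScoreDisplay : MonoBehaviour
{
    public Text scoreTextbox;          // the textbox on the Ending Zone wall
    public float totalDoorCount, totalPictureCount, totalEvidenceCount;
    public float currentOpenedCount, currentPictureCount, currentBaggedEvidenceCount;
    public int tickCount;              // hazmat suit, camera, evidence bags

    void Update()
    {
        // All three equipment checks are mandatory; missing any zeroes the score.
        float scale = (tickCount >= 3) ? 100f : 0f;
        scoreTextbox.text =
            $"Doors opened: {currentOpenedCount / totalDoorCount * scale:F0}%\n" +
            $"Evidence photographed: {currentPictureCount / totalPictureCount * scale:F0}%\n" +
            $"Evidence bagged: {currentBaggedEvidenceCount / totalEvidenceCount * scale:F0}%";
    }
}
```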

The incrementTick function is called when any of the reportTick events are triggered. It increments the tickCount by 1, recording that the player has completed one of the three requirements: putting on the hazmat suit, equipping the camera and equipping the evidence bags.

debugScript: 

The debugScript script serves no active purpose in the final application; it was used for testing throughout the early development stage. Primarily, it allowed all doors and drawers within the scene to be opened by pressing the “A” button on the Oculus controller, letting developers check that all of the “OpenUp” functions were working as intended.

UvLight: 

The UvLight script controls the secondary function of the camera equipment: a UV light mode that can be used to detect hidden blood patches. This script operates in a similar fashion to the interactWithDoors script, with the following differences:

An if statement that tests on Update whether the currently selected item in the inventory is the camera and whether the “B” button has been pressed.

If the above is true and the UVON boolean is false (the UV light isn't on), the script activates the spotlight, sets UVON to true and plays the sound effect of the UV light turning on.

If the above is true and UVON is true (the UV light is on), the script deactivates the spotlight and sets UVON to false.

An additional if statement disables the UV light if the currently selected item in the inventory isn't the camera.

If the UV light is on, rays are created from the camera's five raycast origins.

If one of the raycasts detects a collision with an object tagged “UV” and it is the first raycast to trigger (indicated by the triggered boolean being false), the script uses SendMessage to run the “ToggleVisibility” function on the hit floor object.
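Assembled, the script's Update could look roughly like this sketch (assumed names; CurrentItem mirrors the inventory sketch earlier, and cameraObject is a hypothetical reference to the camera tool):

```csharp
using UnityEngine;

// Sketch of the UV light mode attached to the camera tool.
public class UvLight : MonoBehaviour
{
    public Light uvSpotlight;             // spotlight child of the camera model
    public AudioSource uvOnSound;
    public InventoryController inventory;
    public GameObject cameraObject;       // hypothetical reference to the camera tool
    public Transform[] rayOrigins;        // the camera's five raycast origins
    public float rayDistance = 3f;

    private bool UVON;

    void Update()
    {
        bool cameraHeld = inventory.CurrentItem == cameraObject;

        // B toggles the light, but only while the camera is equipped.
        if (cameraHeld && OVRInput.GetDown(OVRInput.Button.Two))
        {
            UVON = !UVON;
            uvSpotlight.enabled = UVON;
            if (UVON) uvOnSound.Play();
        }

        // Swapping away from the camera switches the light off.
        if (!cameraHeld && UVON)
        {
            UVON = false;
            uvSpotlight.enabled = false;
        }

        if (!UVON) return;

        bool triggered = false;           // only the first hit reveals a tile
        foreach (Transform origin in rayOrigins)
        {
            if (triggered) break;
            if (Physics.Raycast(origin.position, origin.forward, out RaycastHit hit, rayDistance)
                && hit.collider.CompareTag("UV"))
            {
                hit.collider.gameObject.SendMessage("ToggleVisibility"); // reveal blood tile
                triggered = true;
            }
        }
    }
}
```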

ToggleVisible: 

The ToggleVisible script is used to swap a piece of the apartment's flooring from one without a blood texture to one with a blood texture. The class defines one GameObject holding the bloody tile (BloodTile), one with no blood on it (NotBloodTile) and a boolean (isBloodTileActive) to track whether the switch has already occurred. The pair of tiles occupy the exact same location, with the bloody one disabled and the non-bloody one enabled. When the UV light hits the non-bloody tile, which has the tag “UV”, the ToggleVisibility function is called. It checks whether isBloodTileActive is false, indicating the swap has not yet occurred; if so, it sets NotBloodTile to inactive, BloodTile to active and isBloodTileActive to true. The bloody flooring is thus revealed to the user and can be photographed like any other piece of evidence.
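That swap is small enough to sketch in full, using the names from the description above:

```csharp
using UnityEngine;

// Sketch of the one-way tile swap revealed by the UV light.
public class ToggleVisible : MonoBehaviour
{
    public GameObject BloodTile;     // disabled until revealed
    public GameObject NotBloodTile;  // the clean tile at the same position
    private bool isBloodTileActive;

    // Invoked via SendMessage from the UvLight script.
    void ToggleVisibility()
    {
        if (isBloodTileActive) return; // the swap happens only once
        NotBloodTile.SetActive(false);
        BloodTile.SetActive(true);
        isBloodTileActive = true;
    }
}
```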

3D Content (CRA: criteria #3 – 20%)

Apartment:

The apartment served as a crucial building block for our application. We utilised this kit to create an immersive and detailed crime scene environment, complete with furnished rooms, interactive objects, and detailed textures. By incorporating the apartment kit, we aimed to provide users with a sense of presence and engagement within the VR experience.

Camera:

The camera served multiple functions in our CSI application. It allowed users to document the crime scene and any potential evidence, mirroring the role of crime scene photographers in real investigations, and emphasised the significance of accurate and comprehensive documentation for future analysis. When a photo of evidence was taken, the application played a sound to signify that evidence was found. A UV flashlight was also built into the camera; when switched on, it revealed any traces of blood left on the floor and other materials.

Evidence Bag:

By incorporating the evidence bag, we emphasised the need to maintain a “chain of custody” and ensure the integrity of the evidence throughout the investigative process. Users were challenged to use the evidence bag effectively to collect and protect key clues, reinforcing the critical practices involved in real crime scene investigations.

Hazmat Suit:

We chose to include a hazmat suit to maintain the integrity and sanitary condition of the evidence, particularly when dealing with bloodstains. Hazmat suits are specifically designed to protect individuals from hazardous materials or biohazards, so incorporating one added an extra layer of realism and emphasised the importance of forensic protocols in preserving the evidentiary value of crime scenes.

Knife:

The inclusion of the knife within our CSI application served as a great addition to the narrative, both as a piece of evidence and as a key point of interest to the investigator. The knife represented a potential murder weapon, and its placement within the apartment added a layer of complexity to the crime scene. This challenged users to gather clues and analyse the rest of the apartment for further evidence.

Phone:

The phone is another crucial element of the storyline, potentially containing information or clues pivotal to solving the crime.

Body:

The inclusion of the deceased body was a pivotal element that underscored the gravity of the crime and the urgency to investigate. It set the stage for a compelling and challenging investigative experience; in real-life training, the body would be replaced with a dummy.

Blood Texture:

The blood textures used in the apartment contributed to the overall realism of the crime scene environment. 

Usability Testing (CRA: criteria #4 - 40%)

The primary focus of this testing is to evaluate the application's ease of use, functionality and overall user satisfaction, with a scope centred on key features representative of the final product. We sought to test how well participants learned the controls with minimal external input, how the environment affected them, how intuitive the interactions are, how well they could move around the scene, whether they felt they had enough information to work with, and whether they felt this application would be an effective tool for crime scene investigation training. The usability testing will provide valuable insights that guide further development and refinement, ensuring the CSI application aligns with user needs and expectations.

Tasks have been thoughtfully crafted to mimic real-world crime scene investigation scenarios, encompassing investigation skills, evidence collection and problem-solving. These tasks include identifying evidence, using a camera to document the findings, correctly bagging evidence, and employing UV lighting to identify blood trails. 

Our methods for data collection primarily involved a post-test questionnaire, with questions regarding ease of movement; how easy they found the primary interactions (opening doors/drawers, taking photos, bagging evidence, and using tools and controls); and how well informed they felt about what to do. We also monitored how long it took participants to figure out what to do and how to do it.

The ideal participants for this test process would be students studying to become crime scene investigators, experienced crime scene investigation teachers, and real-world crime scene investigators.

Participant recruitment targeted individuals with varying levels of forensic experience or study in a related field. Some users did not match these criteria and were asked to role-play as if they were crime scene investigators.

After the user evaluates and tests the application, post-test questionnaires are provided to collect feedback on user satisfaction. The findings will be reported and analysed to determine the application’s strengths and weaknesses, determining areas requiring enhancement. This structured approach ensures usability testing provides valuable insights for refining the CSI application to meet the needs of its intended users effectively.

The users were asked a series of questions, each with a range of responses from Strongly Disagree to Strongly Agree, plus three questions at the end where they could provide their own answers in text form. The full questionnaire is available here:

https://docs.google.com/forms/d/e/1FAIpQLSdAsAKt2vz47Pc1LmwGoYZUm0pfSY2-b2PVUhg7nTTGldUpcA/viewform

The format of these questions was chosen as it allows for the users to rate their satisfaction efficiently and also provides an opportunity for open ended questions at the end to help find any improvements that have not been thought of previously.

The user feedback for our crime scene investigation application is largely positive. All users found the major interactions of the program easy to use, including interacting with doors/drawers, switching between tools, bagging and photographing evidence, and searching for blood stains. All users reported feeling well informed and found the controls intuitive and easy to use. Feedback on moving around in VR clustered around neutral. None reported feeling overwhelmed or confused, and all users agreed that the application could be helpful for training CSI investigators.

Regarding the UI elements, opinions were mixed. Some suggested a slightly more informative UI to clarify button usage, while others felt it was already straightforward and required no significant changes. Some also suggested the presentation of the UI could be optimised.

In terms of improvements, users generally believed the application could be optimised to reduce lag, and that some improvements to navigation would be helpful. Some suggested making it slightly more challenging. Overall, users appear to appreciate the application's user-friendliness but see room for refinement, and they unanimously agreed the CSI application demonstrates its potential as an effective and engaging tool for training and simulating CSI investigations.

Overall, feedback on the program was highly positive. Against the key points defined for testing, feedback for all areas except movement in VR was positive, and the difficulties with movement were attributed to joystick locomotion, which is external to this application. All users believed the program would be an effective substitute for real-world mock crime scenes for training purposes, and while some issues were identified, they were largely satisfied with the application.

Addressing the Results of the Usability Testing (CRA: criteria #5 - 10%)

Regarding feedback on movement within virtual reality (VR), some optimisations could improve the character model and how it interacts with the scene, e.g. reducing the PlayerController's height to allow better access to some of the lower drawers. Most of the reported difficulty moving around within VR was due to the built-in joystick movement system; providing a large enough physical space to accommodate real-world movement could assist with this problem. Feedback on potential improvements mainly stated that the application could benefit from less lag. A large portion of the lag could be due to the complexity of the code and the engine running it; to address this, some sections of the scripts could be altered, e.g. hard-coding certain elements in an array instead of using the FindGameObjectsWithTag function.

References

Apartment Kit, Unity Asset Store. Available at: https://assetstore.unity.com/packages/3d/environments/apartment-kit-124055.

Bloodstain pattern, HiClipart. Available at: https://www.hiclipart.com/free-transparent-background-png-clipart-iizsi.

Check mark tick, pixabay. Available at: https://pixabay.com/vectors/check-mark-tick-mark-check-correct-1292787.

DSLR Camera, Unity Asset Store. Available at: https://assetstore.unity.com/packages/3d/props/electronics/next-gen-camera-37365. 

Smartphone, Unity Asset Store. Available at: https://assetstore.unity.com/packages/3d/props/free-phone-181455. 

Pool drop blood splatter, Vexels. Available at: https://www.vexels.com/png-svg/preview/159508/pool-drop-blood-splatter. 

Hazmat suit (version 2) by RackRibs, Sketchfab. Available at: https://sketchfab.com/3d-models/hazmat-suit-version-2-27099d5eb2da457cba98c234b31379d5.

Kitchen knife by sergeilihandristovpro, Sketchfab. Available at: https://sketchfab.com/3d-models/kitchen-knife-a27d59fa297249ecb99220e75eb67eb1.

Plastic evidence bag by Sousinho, Sketchfab. Available at: https://sketchfab.com/3d-models/plastic-evidence-bag-deee24b1f0cc43728dbde3e137e4bfba.

Blood stain #1, Pngtree. Available at: https://pngtree.com/so/blood-stains.

Woman body model, TurboSquid. Available at: https://www.turbosquid.com/3d-models/3d-camiliarealistic-women-model-2103051.

Plastic Bag (Foley) A1 sound effect, Fesliyan Studios. Available at: https://www.fesliyanstudios.com/royalty-free-sound-effects-download/plastic-bag-227.

Camera shutter sound effect, Myinstants. Available at: https://www.myinstants.com/en/instant/camera-shutter-181/.

Night vision sound effect, Myinstants. Available at: https://www.myinstants.com/en/instant/night-vision/.

Kleygrewe, L., Hutter, R. I. V., Koedijk, M., & Oudejans, R. R. D. (2023). Virtual reality training for police officers: A comparison of training responses in VR and real-life training. Police Practice and Research (https://www.tandfonline.com/doi/full/10.1080/15614263.2023.2176307#)

Reno, J. et al. (2000). Crime scene investigation: A guide for law enforcement. Washington, DC: U.S. Department of Justice, Office of Justice Programs.
