Unreal Engine

Getting Started with Unreal


Below are two tutorials on how to set up a basic scene in Unreal. Beginners are encouraged to follow these guides to get familiar with the engine. After following the first tutorial, you'll notice your sky is black (and eerie), whereas previous Unreal guides featured a beautiful sky. The second guide shows how to add a sky using the Sky Atmosphere Component.

 

Level Designer Quick Start in Unreal Engine | Unreal Engine 5.4 Documentation

Sky Atmosphere Component

 

Where do I go from here?

This is a guide to get familiar with using Unreal, HOWEVER, it is NOT VR FRIENDLY! In VR projects, for example, the sky creation is usually replaced with a sphere mesh and a panoramic sky image, and real-time lighting is used with careful consideration for performance.

Building for VR? Start with a new project and begin some of the other guides listed here. SCiL staff/workstudies are available for assistance or clarification.

VR Settings for Unreal Engine 4/5

The following are from pages 121-137 of the book "Unreal Engine 4 Virtual Reality Projects..."

Applying these settings requires a restart, which can take some time!

For PC VR:
Additionally, standalone VR:

XR Rig Setup

Getting Started

Start by launching Unreal Engine. 

Select the Games option on the left.

Select the Blank template. Project Defaults should be set to Blueprint, and Raytracing should be disabled.

Choose your Target Platform based on your target device.

Quality Preset for standalone requires Scalable. Desktop can be either Maximum or Scalable, but often Scalable is the best choice.

image (1).png

Select project location, name your project, and click Create. After some time the new project appears.

Select Plugins from the Edit menu. 

Select Virtual Reality in the left menu, under Built-In.

Check OpenXR.

A box outlined in yellow should appear at the bottom. Click Restart Now.

image.png

Once restarted, go to File > New Level

Select Basic Level

Go to File > Save Current Level and give your level a meaningful name.

Now is a good moment to APPLY VR SETTINGS

 

Create the XR Rig

Right click in the Content Drawer (at the bottom of the screen) and choose Blueprint Class

When prompted, choose Pawn. Give it a meaningful name, such as VRPawn_BP.

Double click the new pawn to open and edit it. 

A new window should appear: this is the Pawn Object Editor.


In the upper left corner of the Pawn Object Editor under Components, click the Add button (marked by a green +). 

image.png

Choose Scene and rename it to CameraOffset.

With CameraOffset selected, press Add again and add a camera. Leave the name as "camera". 

Select CameraOffset once again, press Add, and add a motion controller. Name it "Motion_Controller_L". 

Repeat the previous action, this time naming it "Motion_Controller_R". 

Select "Motion_Controller_L". With this component selected, add a component by either clicking the +Add button or right-clicking and add the component XR Device Visualization. This should now be a child of Motion_Controller_L. Rename the component to something like "XRDeviceVisualization_L". In the Details panel, under Collision, Change Can Character Step Up On to No, and change the Collision Presets from BlockAllDynamic to NoCollision. Finally, move to the Visualization section of the Details panel and check Is Visualization Active.

Repeat with "Motion_Controller_R". 

Make sure the Motion Source is set to Right under the Motion Controller section (directly under Visualization) for "Motion_Controller_R". Do not select Display Device Model (it is deprecated and replaced with a component as mentioned above).


Finally, switch to the Event Graph tab and find the Event BeginPlay node. Right-click and add a Set Tracking Origin node, and under Origin verify that "Floor Level" is selected. From the BeginPlay node, drag the white execution pin to the newly added node.


Press Compile at the top left of the screen.

Press save and return to the level view. 

Add your pawn to the level

Add your newly created player pawn to the scene. Select your pawn and in the details panel, under Pawn, change Auto Possess Player from Disabled to Player 0. You can also change these values within the pawn blueprint itself, especially if you are using it in multiple maps.


In the same panel as the play button, click the three dots and select VR Preview (if you don't see VR Preview, leave Unreal, run the Meta Link software, confirm VR is working through the headset, and restart the Unreal editor).

Test the scene in your headset. (Adding an object and/or lighting point may be helpful for testing).

Where to go from here?

The wiki is full of guides, but it is important to learn basic teleport as it covers additional basics for both Unreal and VR.

Setting up basic teleport Pt.1

This content is from the following book: Unreal Engine 4 Virtual Reality Projects

It has been modified to work more independently from other systems that are introduced in the book.

Organization is key. I recommend creating a folder in your project, possibly named Player, to store your pawn along with any additional assets the player needs.

 

Overview

Let’s break down the items we are going to need to solve teleporting. Some items will get refined as we go:

Prerequisites:

https://docs.unrealengine.com/5.2/en-US/quick-start-guide-for-blueprints-visual-scripting-in-unreal-engine/


Get our motion controller and send out a line trace

Let's begin with the first thing we need to do to get our teleport running—figuring out where the player wants to go:


1. Open up your VRPawn_BP Blueprint (created in the XR Rig guide), and open My Blueprint | Graphs | EventGraph, if it isn't already open.

The motion controller setup from the XR Rig guide does a great job for the visuals. We need to add another motion controller, but this one will not be for visuals. Instead, it will use the controller's "aim" configuration.

Add a motion controller component, name it MotionController_Teleport_R. In the Details panel, change its Motion Source to RightAim.

We should still see the BeginPlay event in our Event Graph where we set our tracking origin. Now, we're going to add some code to our Event Tick.

The Tick event is called every time the engine updates the frame. Be careful about putting too much work into your Tick events, as they can eat performance.


2. If you don't already see an Event Tick node in your Event Graph, right-click anywhere in the graph, type tick in the search box, and select Add Event | Event Tick. If you already have a Tick event defined, this won't add a new one—it'll just take you to that node in the event graph. If you don't, this will create one now.


3. Right-click to the right of Event Tick and add a Line Trace By Channel node. Connect the execution pin from Event Tick to the Line Trace By Channel node.
When you perform a line trace, you supply a start point and an end point, and tell it what collision channel you're looking for. If an actor with a collision set to the supplied collision channel intersects the line between the start and end points, the trace will return true, and will return information about what it hit. We're going to use this behavior to find our teleport destination.


Let's start our trace at the location of the right motion controller:

1. From your components list, grab MotionController_Teleport_R, and drag it into your event graph.

2. We want to start our trace at the motion controller's location, so let's drag a connector out from the MotionController_Teleport_R return value and release.

3. In the dialog that appears, type getworld into the search bar and select GetWorldLocation.


4. Drag the result of GetWorldLocation into the Line Trace node's Start input pin.

Now, let's set the trace end point. We're going to end our trace at a point 10,000 units away from our start location, in the direction the controller is facing. Let's do a bit of simple math to figure out where that point is.

5. From the MotionController_Teleport_R output, create a Get Forward Vector node. This will return a vector with a length of 1 that aims in the direction the controller is facing. We said we wanted our end point to be 10,000 units from the start, so let's multiply our Forward vector by that value.

6. Drag the Get Forward Vector return value out and type * into the search bar. Right-click the bottom input and convert the pin to a float (single precision). 

Now, drag a connector out from the float input to the multiply action, and select Promote to Variable:

screenshot-2022-09-30-145714.jpg


7. Name the new variable TeleportTraceLength, compile the Blueprint, and set the variable's value to 10000.

We now have a vector that's 10,000 units long, aiming in the controller's forward direction, but right now, it would be running 10,000 units from the world's center, rather than from the controller, as we intend. Let's add the controller's location to this vector to fix that:

1. Drag another connector from the controller's GetWorldLocation call, and type + in the search bar. Select vector + vector.

2. Drag the output from our forward vector multiplication into the other input.

3. Connect the output of this addition to the End argument of LineTraceByChannel:

teleport-1.png
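The end point these nodes compute is plain vector math: a unit forward vector scaled by the trace length, then offset by the controller's world location. Here is a minimal C++ sketch of that arithmetic (Vec3 and ComputeTraceEnd are illustrative stand-ins, not engine types):

```cpp
// Minimal stand-in for Unreal's FVector (illustration only).
struct Vec3 {
    float X, Y, Z;
};

Vec3 operator*(const Vec3& V, float S) { return {V.X * S, V.Y * S, V.Z * S}; }
Vec3 operator+(const Vec3& A, const Vec3& B) { return {A.X + B.X, A.Y + B.Y, A.Z + B.Z}; }

// End = controller location + (unit forward vector * trace length),
// which is what the GetWorldLocation / GetForwardVector / multiply / add nodes compute.
Vec3 ComputeTraceEnd(const Vec3& ControllerLocation, const Vec3& Forward, float TraceLength) {
    return ControllerLocation + Forward * TraceLength;
}
```

For example, a controller at (100, 0, 150) aiming straight along +X with TeleportTraceLength = 10000 produces an end point of (10100, 0, 150).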

Before we move on, let's set up some debug drawing to see whether everything is behaving as we expect so far.


4. Hold down the B key and click on the open space to the right of the Line Trace node to create a Branch node. (You can also right-click and create a Branch node the way you usually do, but this is a useful shortcut.) Connect the execution pin from the LineTraceByChannel node to the Branch node.

5. Drag a connector from the Line Trace node's Boolean Return Value to this branch's Condition.
The trace operation will return True if it hits something, and False if it doesn't. We're only interested in debug drawing the result if it hits something, so we're just going to use the True output from our branch. If we did hit something, we need to know where the hit occurred.

6. Drag a connector from Out Hit and select Break Hit Result to see the members of the hit result struct.

Now, let's draw a debug line representing our trace:

1. Drag an execution line from our Branch node's True output, and create a Draw Debug Line action.

2. Drag the Location from the Hit Result struct into the Line End input on the Debug Line call.

3. Drag the hit result's Trace Start to the Line Start.

4. Set the line's thickness to 2, and set its color to anything you like.

While we're at it, let's draw a debug sphere at the hit location:

1. Create a Draw Debug Sphere node.

2. Connect its execution input to the debug line's output.

3. Set its Center to the hit result's Location:

screenshot-2022-10-03-101045.jpg

Be aware that Draw Debug calls only work in development builds. They're useful for understanding what's going on, but they're just debugging tools and need to be replaced with real visualizations for your actual software. We'll do that shortly.

4. Let's test it.

Good. So far, it's doing what we expect—casting a ray from the controller, and showing us where it hits a surface. The problem, though, is that it's just as happy to hit a wall as a floor. We need to restrict it to valid teleport destinations. Let's do that.

IF YOU HAVE CONTROLLER JITTER WHEN AIMING: This is due to a bug with 5.3. You can switch to FXAA anti-aliasing in Project Settings/Rendering/Default Settings/Anti-Aliasing. Be sure to switch back to MSAA once you no longer need the debug draw nodes. The debug lines are only for testing purposes and are usually replaced with splines and visual meshes.


Improving our Trace Hit Result

We need a blueprint to represent our teleport area so that we can narrow down what our Line Trace hits.

1. In your content drawer, create a new folder at the root called Interactions. This will hold blueprints for player interactions in our map.

2. In this new folder, right-click and create a new Actor blueprint. Name it something like TeleportArea_BP. Open it up.

3. Add a new component of type StaticMesh. In the Details panel to the right, change the Static Mesh field from None to Plane. Set the scale to (5, 5, 5).

4. Compile, save, and close for now. You can edit the size and visuals later! Add this blueprint to your scene.

We can now check if our line trace is hitting that teleport area only.

1. Make some space between our Branch node and our two Debug Draw nodes. From our Break Hit Result, drag out from Hit Actor and type Cast To TeleportArea_BP, then choose the node. This checks whether the actor our trace hit is of the type TeleportArea_BP. Connect the execution pins from the Branch node through the cast to the Debug Draw nodes.

teleport-2.png

2. Test and verify that everything is working. Although you can't teleport yet, we have a visual and a way to interact with our teleport point. Feel free to edit the size of the teleport area and maybe "hide" the visual of the teleport area.

teleport-3.png

Setting up basic teleport Pt.2

PREREQUISITE: Setting up basic teleport Pt. 1

The following has been taken and slightly modified from pages 196-201 of the book "Unreal Engine 4 Virtual Reality Projects..."

Teleporting the player


The first thing we need to do in this instance is give the player a way to tell the system when they intend to teleport.

 

Creating Input Mappings

We're going to use our Engine Input Mappings to set up a new named input. Let's get started:

1. Open your Player folder (where your pawn is kept) and create a new folder inside called Input. This folder can get quite heavy later! Right-click and choose Input-> Input Mapping Context. Name it something like PlayerInput_IMC.

2. In the same folder, do the same as above but choose to create an Input Action asset. Name it TeleportRight_IA.

Input Actions default to a Boolean (true/false) value, which is handy for buttons on controllers or keyboards. Other value types are available for other kinds of actions.

3. Open your PlayerInput_IMC file. There are many drop-downs and sub-drop-downs here. Click the (+) icon next to Mappings and add your TeleportRight_IA. From here, you can add support for many different controllers! Let's add Oculus Touch (R) Trigger by selecting the None dropdown and choosing Oculus Touch -> Oculus Touch (R) Trigger.

Our teleport input needs simple Pressed and Released behavior. Luckily, it's easy to add: under your controller input (Oculus Touch (R) Trigger), add two elements to the Triggers (+) section and change them from None to Pressed and Released, respectively.

Your PlayerInput_IMC file should look like this when finished:

teleport-4 (1).png
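Conceptually, Pressed and Released are edge detectors on the trigger's per-frame state. Here is an illustrative C++ sketch of that idea (all names here are made up; this is not Enhanced Input's actual implementation):

```cpp
// Possible per-frame outcomes for a button-style input.
enum class TriggerEvent { None, Pressed, Released };

struct TriggerEdgeDetector {
    bool bWasDown = false;

    // Feed the raw per-frame trigger state; fires Pressed on the down edge,
    // Released on the up edge, and None while the state is unchanged.
    TriggerEvent Sample(bool bIsDown) {
        TriggerEvent Event = TriggerEvent::None;
        if (bIsDown && !bWasDown)  Event = TriggerEvent::Pressed;
        if (!bIsDown && bWasDown)  Event = TriggerEvent::Released;
        bWasDown = bIsDown;
        return Event;
    }
};
```

Holding the trigger down for several frames fires Pressed once, then None until release; that is why a single press drives a single teleport rather than one per frame.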

 

Add Input to Pawn

These mappings are now set up and can be edited later. They can also be reconfigured at runtime for players who wish to change their input mappings. Let's tell our player pawn to use these mappings.

1. Open your player pawn and find the Event BeginPlay node. This is where you added the Set Tracking Origin node in a previous tutorial. Right-click and type Get Controller; a LOT of items will appear! We are looking for the Get Controller node under the Pawn section of the list.

2. From the blue Return Value pin on Get Controller, drag out and type Cast To PlayerController. Select that node and connect its execution pin from Set Tracking Origin. From the As Player Controller blue pin, drag out and type Get EnhancedInputLocalPlayerSubsystem.

3. From that node, drag out and type Add Mapping Context. Connect the two execution pins from Player Controller to this new node.

4. In the mapping context node, you will see a drop down that says Select Asset. Choose your PlayerInput_IMC asset.

When you are finished, it should look like the following:

teleport-5.png


Caching our teleport destination

Now, before we do anything with this event, we need to store the location we found in our
trace method previously so that we can use it here when the player tries to teleport:

1. Under My Blueprint | Variables, hit the + sign to create a new variable.

2. Set its type to Vector, and name it TeleportDest.

3. Add another variable called bHasValidTeleportDest, and make sure it's set to Boolean (red).
Variable names are important. They tell the reader (who might be another developer maintaining your code, or might be yourself in the future) what a variable represents. Your variable names should accurately reflect what they contain. In the case of True/False Boolean variables, make sure the name describes what question it's actually answering. For instance, Teleport would be a poor choice of name here, as it doesn't indicate whether the variable's value means that the player can teleport, is teleporting, has recently teleported, or just enjoys daydreaming about teleporting. Be clear about these things. bHasValidTeleportDest clearly indicates what it means. Prefixing Boolean variable names with b is a practice mandated by Epic's coding style guide for C++, and it's a good idea to follow it in Blueprint development as well. If you plan on developing in C++, you should know and follow the Unreal style guide, which can be found at

https://docs.unrealengine.com/5.0/en-US/epic-cplusplus-coding-standard-for-unreal-engine/


Let's populate these variables. The location we care about is the Location from our line trace's hit result, and we want to store whether we found a valid teleport spot. You'll probably want to drag the Draw Debug Sphere node a bit to the right to give yourself some room, since we're about to add a few nodes before we call it:


Let's set our TeleportDest to the hit location when the cast to TeleportArea_BP succeeds:
1. Drag our bHasValidTeleportDest variable onto the event graph and choose to set it. Do it again (you should have two).

2. Place the first after the Cast To TeleportArea_BP node, and the other below for Cast Failed. Check the box for the top. The idea here is that if the cast is successful, this variable is set to true (checked). If it fails, it's set to false (unchecked).

3. Add your TeleportDest variable (set) to the graph twice as well, and place them after the two nodes we just made. The bottom one can simply be fed an execution pin and left at its default (0, 0, 0). Into the other, feed the Location from the Break Hit Result.

Your graph should now look like this:

teleport-6.png

The yellow dots you see are called Reroute Nodes. You can double click on a noodle to add them. Feel free to rearrange so that the graph looks nice.


Now, on every tick, we have either a true or a false value in bHasValidTeleportDest, and if it's true, we have the location to which we could teleport!
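The per-tick caching logic can be summarized in plain C++ (V3, TeleportState, and UpdateTeleportState are hypothetical names for illustration; they are not engine or Blueprint API):

```cpp
// Minimal stand-in for a world-space location (illustration only).
struct V3 { float X, Y, Z; };

// The two Blueprint variables we created.
struct TeleportState {
    bool bHasValidTeleportDest = false;
    V3 TeleportDest{0.0f, 0.0f, 0.0f};
};

// Mirrors the tick logic: if the trace hit a TeleportArea_BP, cache the hit
// location and mark it valid; on a failed cast, reset both variables.
void UpdateTeleportState(TeleportState& State, bool bHitTeleportArea, const V3& HitLocation) {
    if (bHitTeleportArea) {            // Cast To TeleportArea_BP succeeded
        State.bHasValidTeleportDest = true;
        State.TeleportDest = HitLocation;
    } else {                           // Cast Failed
        State.bHasValidTeleportDest = false;
        State.TeleportDest = {0.0f, 0.0f, 0.0f};
    }
}
```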

 

Executing the teleport

Let's use the value we've just stored in the bHasValidTeleportDest flag to see whether we have a valid destination, and teleport the player pawn to the TeleportDest if we do:

1. Make some space below our Event Tick logic, right-click, search for TeleportRight_IA, and choose the one under Enhanced Action Events.

2. From the TeleportRight_IA input action we created a moment ago, we'll connect an execution line from its Started output into a Branch node.
Remember that you can hold down B and click to create a Branch node. Take a look at the other shortcuts found on Epic's Blueprint Editor Cheat Sheet here: https://docs.unrealengine.com/en-us/Engine/Blueprints/UserGuide/CheatSheet. They'll save you a lot of time.


3. Grab your bHasValidTeleportDest variable and drag it onto the Branch node's Condition input.

4. From the True execution output, create a SetActorLocation action, and drag your TeleportDest variable onto its New Location input:

teleport-7.png
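A minimal C++ sketch of this input branch (P3 and TryTeleport are hypothetical names; SetActorLocation is reduced to a plain assignment):

```cpp
// Minimal stand-in for a world-space location (illustration only).
struct P3 { float X, Y, Z; };

// Mirrors the Branch on bHasValidTeleportDest: teleport only when a valid
// destination was cached. Returns true if the pawn location was changed.
bool TryTeleport(bool bHasValidTeleportDest, const P3& TeleportDest, P3& PawnLocation) {
    if (!bHasValidTeleportDest) {
        return false;                // Branch: False path does nothing
    }
    PawnLocation = TeleportDest;     // SetActorLocation(TeleportDest)
    return true;
}
```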

Launch it into a VR preview and give it a shot. You should now be able to teleport around the map. It's nice to be able to explore, right?

Picking up objects (Basic)

This page's original source is here: https://docs.unrealengine.com/4.27/en-US/SharingAndReleasing/XRDevelopment/VR/VRHowTos/UsingTouchControllers/

Prerequisites

Set up an XR Rig

Basic Teleport (covers input)

Adding Motion Controller support to your UE4 project can add a real sense of immersion and realism. One way to take this immersion to the next level is to add the ability to pick up objects placed in the world with your Motion Controllers. In the following document, we will take a look at how you can add Motion Controller support to your UE4 VR project.

Motion Controller Setup

In the following sections, we will go over how to set up your Motion Controllers so that they can pick up and drop items placed in the level. For this guide, we will focus on using just the LEFT controller.

Component, Variable, & Event Setup

Before we can start adding nodes to the Event Graph, we first need to create and set up a few Components and Variables. Add these variables to your variable list:

 

Component / Variable Type | Name | Value
Primitive Component | HitComponent | (none)
Boolean | IsHoldingObject | false
Actor | PickedUpActor | (none)


You will also need to create two Custom Events in your Event Graph and name them the following:

 

Node Name | Value
PickUpObject | N/A
DropObject | N/A

 

The names of these do not differentiate between Left or Right (and that is on purpose). At the end of the tutorial, I recommend adding "Inputs" to these events and feeding your controllers in. But don't do that now.

 

Finally, if you haven't already, as found in the teleport tutorial, you need to create two additional Input Actions for Left and Right grip actions. Be sure to assign these actions to your Input Mappings. Here is a link if you forgot how to set up input:

https://scil-wiki.su.edu/books/unreal-engine/page/setting-up-basic-teleport-pt2

Holding and Dropping of Items

The Holding and Dropping of Items section calls the Custom Events that handle picking up, holding, and dropping objects. The user initiates this by pressing or releasing the Left Motion Controller Trigger. Note that we use a Branch statement here to ensure the user can only pick up one object at a time.

Screenshot 2024-09-04 125041.png

Pick Up Object Event

The Pick-Up Object Event section handles all of the logic for finding objects that can be picked up and picking them up. While this is one function, it does have two parts that it can be broken down into. In the following section, we will take a look at these two parts and what they do.

The first part of the Pick-Up Object Event deals with finding the object in the world that meets the requirements for being picked up. To do this, a ray is cast 1,000 cm into the world along the forward-facing direction of the user's Left Hand Motion Controller. This ray is also told to ignore any object that does not have an Object Type of Physics Body.

For testing purposes, the Draw Debug Type has been set to Duration, allowing us to see the ray that was cast into the world. When you are ready to use this in production, make sure to set the Draw Debug Type to None.

The second part of the Pick-Up Object Event deals with what happens once we find an object we can pick up. When an object is found, the Break Hit Result node is used to get more information about what we hit. In this case, we are using it to find out which Actor and Component were hit. Physics is then disabled on that object, and the object is attached to our Motion Controller. Finally, we set the IsHoldingObject variable to true so that we cannot pick up anything else while holding it.

Screenshot 2024-09-04 130107.png

It may be difficult to see from the image above, but Attach Actor to Component has Location Rule and Rotation Rule set to Snap to Target and Scale Rule set to Keep World.

Drop Object Event

The Drop Object Event section handles all the logic for dropping objects that have been picked up, and also makes sure that everything is reset so we are ready to pick something else up. The first thing the Drop Event does is check that the user is holding something that can be dropped. If they are, that object is detached from their Motion Controller and has its physics re-enabled, allowing it to fall to the ground. Finally, the HitComponent and PickedUpActor variables are cleared of any old data, making sure they are ready to store the next picked-up object.

With the Blueprint now complete, we just need to add some objects to the world to pick up. Since the objects we are looking to pick up must have a physics body, you can just add a few Static Mesh actors, set their Mobility to Movable, and make sure to enable Simulate Physics.
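The two events and the IsHoldingObject guard form a small state machine. As a plain C++ sketch (all names are hypothetical stand-ins for the Blueprint variables; attachment and physics are reduced to flags):

```cpp
#include <cstddef>

// Stand-in for an actor we can grab (illustration only).
struct GrabbableActor {
    bool bSimulatePhysics = true;
    bool bAttached = false;
};

struct GrabState {
    bool IsHoldingObject = false;            // Blueprint Boolean variable
    GrabbableActor* PickedUpActor = nullptr; // Blueprint Actor variable

    // PickUpObject: only grabs when empty-handed (the Branch in the Blueprint).
    void PickUpObject(GrabbableActor* Hit) {
        if (IsHoldingObject || Hit == nullptr) return;
        Hit->bSimulatePhysics = false;       // disable physics on the hit object
        Hit->bAttached = true;               // Attach Actor To Component
        PickedUpActor = Hit;
        IsHoldingObject = true;
    }

    // DropObject: detach, re-enable physics, and clear the cached references.
    void DropObject() {
        if (!IsHoldingObject) return;
        PickedUpActor->bAttached = false;
        PickedUpActor->bSimulatePhysics = true;
        PickedUpActor = nullptr;
        IsHoldingObject = false;
    }
};
```

Trying to grab a second object while one is held does nothing, which is exactly the behavior the Branch statement enforces.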

Where do we go from here?

You can add input parameters to the Inputs section of the Pickup and Drop events and feed your motion controller into them. The parameters can then be used within the event itself in place of the hard-coded motion controller reference. If you are stuck, see Wes in SCiL.

This covers basic grab interactions. For a more complete interaction system, consider using the VR Expansion Plugin: VR Expansion Plugin

Input Mappings in Detail for Unreal

How To Use The New Enhanced Input Action Mappings In Unreal Engine 5.1 (Tutorial)

https://www.youtube.com/watch?v=nXJuXUxQfa8

 

Unreal Documentation

https://dev.epicgames.com/documentation/en-us/unreal-engine/enhanced-input-in-unreal-engine

Worldspace UI in Unreal

Prerequisites

If you have not done the previous guides, setting up an XR Rig or setting up teleport, you must complete those guides first! One important note: the tutorial linked below uses the OLD input system, whereas newer versions of Unreal have the Enhanced Input System, which is covered in the Teleport series here in the wiki. You should be able to substitute the new input system for the old one when following along below.

Create and Interact with UI for Virtual Reality

Follow the tutorial here

Something is wrong! In YouTube tutorials, comments are your friend! Other users tend to update the "changes" in the comments since the video was created.

 

Where do I go from here?

The tutorial listed above uses a "debug" line where most developers use a spline and spline mesh components as well as creating a reticle to act as a cursor against the UI. 

Create a Skybox

About

This guide will show you how to create a "sky" by building a 3D model, importing the model into Unreal, and setting its material properties to accept a 360 photo (or video).

Although this guide makes use of Blender and Unreal Engine, the concepts are the same for any modelling software or other game engine.

Getting Started

For this guide I will be using Blender. Open Blender and go to File/New/General. Delete the Cube, Camera, and Light. Add a UV Sphere mesh by going Add/Mesh/UV Sphere

Screenshot 2024-07-16 103236.png

Modelling: Adjust smoothness

Currently, there are hard edges along the sphere; we will use smooth shading to fix them. Select the object, press [TAB] to enter Edit Mode (or use the drop-down in the top left), press [A] to select all the faces, and using the menus at the top, select Face/Shade Smooth.

Screenshot 2024-07-16 104659.png

Modelling: Invert normals and scale

If you are not still in Edit mode, switch to Edit Mode [TAB]. Select all the faces [A] and using the menus at the top, go to Mesh/Normals/Flip.

You can check the face orientation (normals): YouTube - Display Face Orientation

Now return to Object Mode [TAB], select the object, press [S] to scale, and type 500. Zoom out using the scroll wheel and notice the sphere is now so large that the viewport starts to clip. Using the menus, go to Object/Apply/All Transforms.

Optional: You can select the Red/Pink Material button in the right column, select [New] to create a new material, and rename it to something like Skybox_M.

Screenshot 2024-07-16 105720.png

 

You can now export as an FBX. Go to File/Export/FBX (.fbx). Under the Geometry drop-down, make sure Smoothing is set to Face.

Screenshot 2024-07-16 105917.png

 

Unreal Engine

I am assuming you are using an EMPTY level, or a level without all the sky/lighting actors. I am also assuming you have imported the sky model (drag and drop) into your well-organized, stress-free, and cohesive folder structure. If you did not create a material on your model in the previous steps, create a new material now and assign it to the mesh. Next, open that material.

Drag and drop your 360 photo or whichever texture you are using into your project and drag this texture as a node into the material editor. Connect its RGB output pin to the Emissive Color pin.

Screenshot 2024-07-16 125434.png

 

What about an HDR Sky?

Let's say you are using an HDRI sky (.hdr file extension). You need an additional node called Absolute World Position. Connect its XYZ output to the UVs input of your texture.

Screenshot 2024-07-16 125851.png

Finishing Touches

Open up your scene in Unreal Engine. Add a SkyLight actor to the scene with the default settings if you do not already have one.

Now, drag the Sky mesh into the scene that you created in Blender.

Save your work. Bake the lighting if needed (probably). This concludes the guide.

VR Expansion Plugin

VR Expansion Plugin

What is the VR Expansion Plugin?

VR Expansion Plugin

This page assumes you have gone through the basics, such as setting up a VR rig, basic teleport, and picking up objects. If you have not, it is highly recommended you do so before proceeding with this plugin!

This plugin adds a variety of features for quickly building for VR and is highly customizable. It is different from other toolkits, such as Unity's XR Interaction Toolkit, because there are few constraints on how developers can create interactions.

Some notable example features:

Unity developers may notice this plugin does not come with continuous movement. Most locomotion systems are already built into Unreal by default and the VR Expansion Plugin simply builds off of these feature sets.

An Important Consideration

I have noticed when creating interactions that the plugin's author sets default values when components/actors are added. Sometimes these values have collision turned off, or physics enabled, in one or the other. There is a detailed explanation for this, but the big lesson is that it is important to always be testing!

From the website

The VR Expansion Plugin (VRE) was created to help facilitate advanced Virtual Reality interactions and gameplay elements in UE4/UE5. It is an MIT-licensed and open source overhaul of many of the engine's elements to better accommodate VR.

VR Expansion Plugin

Installation

Download the pre-built plugin here: https://vreue4.com/binaries

Unzip it, then copy the unzipped folder.

Open your project's folder. In the "Plugins" folder (which you may need to create), paste the unzipped plugin folder.


Full installation documentation: https://vreue4.com/documentation?section=installation

VR Expansion Plugin

Getting Started

Prerequisites

It is important you have gone through the basics, such as setting up an XR Rig, teleport locomotion, and basic grab. These blueprints will be replaced with VRE blueprints. One important reason for going through the previous guides here in the wiki is that they cover the new Enhanced Input System, whereas the guides below cover the OLD input system!

You must take into account new engine features and enhancements and not simply "blindly" follow a tutorial.

Tutorial Videos

1. Can be watched for a lot of conceptual info on the plugin. You shouldn't need to reproduce anything here if you already installed the plugin.

2. Skip (outdated)

3. This is the core, bread and butter of getting started!

4. Movement modes: the author of the YouTube channel covers his implementation of Smooth and Teleport locomotion. It's worth a look to see how other developers accomplish this task; however, SCiL generally uses implementations from other resources. That said, the crouching and climbing have worked well for students who followed those guides in the past.

5. If you followed item 3 above, you should be set to start creating interactions and these guides go deeper into how to use these components.

Something is wrong! In YouTube tutorials, comments are your friend! Other users tend to update the "changes" in the comments since the video was created.

Where do I go from here?

The YouTuber, VR Playground, is one of my favorite Unreal channels. He dives into a lot of interesting VR interactions, even beyond the expansion plugin!

https://www.youtube.com/watch?v=Jj1wNxeAWRM

https://www.youtube.com/@VRPlayground/videos

Packaging for PC

If you need them, VR project settings can be found here.

Getting Started

The documentation listed below is extremely helpful for configuring your project for packaging. It does not cover optimizations, advanced packaging, or the build tools needed to compile the project (those are listed below).

In this field, you will find many synonyms for workflows, tools, etc. A package in Unreal is also referred to as a Build, an EXE (Windows), an APK (Android), an App or Application, or a Distributable.

Packaging Tools Needed

Visual Studio 2022 will need to be installed, along with the following components/workloads from the Visual Studio Installer. Generally, these packages are already installed on SCiL workstations.

Unreal Engine Documentation:

https://dev.epicgames.com/documentation/en-us/unreal-engine/packaging-unreal-engine-projects?application_version=5.4

Menu to package a project:

Screenshot 2024-07-10 165308.png

For 5.3 users, you may get a lower-resolution build. This can be solved by going to Project Settings/Rendering/Default Screen Percentage and changing the option to Manual and the percentage to 100.0.

Unreal Engine for Unity Developers

https://docs.unrealengine.com/5.0/en-US/unreal-engine-for-unity-developers/

Packaging for Standalone

Set up your project according to other pages in the WIKI, including VR Settings for Unreal Engine 4/5.


Ensure the Android SDK for Unreal is set up according to this documentation [Unreal 5.4.x]:

https://dev.epicgames.com/documentation/en-us/unreal-engine/set-up-android-sdk-ndk-and-android-studio-using-turnkey-for-unreal-engine


Additionally, under Project Settings/Android, check the following:

More optimizations: https://www.youtube.com/watch?v=y3xFZF9Nyt4

Additional settings may be required depending on the standalone/deployment platform (such as AppLab or Oculus store)

Useful Tutorials and Guides

AR in Unreal

THIS PAGE IS A WORK IN PROGRESS!

Setting up a new AR project

https://dev.epicgames.com/documentation/en-us/unreal-engine/setting-up-a-new-ar-project-in-unreal-engine

Packaging for iOS

https://dev.epicgames.com/documentation/en-us/unreal-engine/setting-up-an-unreal-engine-project-for-ios

Packaging for Android

https://dev.epicgames.com/documentation/en-us/unreal-engine/set-up-android-sdk-ndk-and-android-studio-using-turnkey-for-unreal-engine