As an artist, I found that VR art created an incredible opportunity for designing my artwork within a 3D space. Whilst Maya is more of a modelling software, Openbrush is a collaborative tool designed with both 2D and 3D art in mind.
I experimented with Openbrush by creating a small diorama, using different paint brushes and techniques to create a 2.5D experience. Due to technical difficulties regarding the VR screenshots, I wasn’t able to document my process in a traditional sense. Therefore, I’ll describe my workflow.
To start off with, I used different thick and oil brushes to map out the whirlpool at the bottom of the diorama, using dark and light blues to add shadows and highlights. I made a tornado-esque spiral pattern to help map out the area I wanted to work in. Despite this area being small, it means that I could contain this diorama in a small section and could expand on its design and world by adding more in the future.
For now, I continued this spiral visual motif, moving up to the top. The process for this was inspired by spruce trees, with the lines serving as the main focus of the diorama. From a user's perspective, they would start at the bottom of the diorama and work their way to the top, making for a more linear approach when handling the VR space.
I continued to add darker shades of colour to add depth to the piece, as well as adding highlights of complementary hues such as purple or navy. The bright colours contrasting with the dark background help amplify these cooler but vibrant colour schemes.
As shown in the screenshots, the main theme of this diorama is the ocean. But I also wanted to add further colour contrast to the piece. Therefore I added jellyfish into the scene to further guide the user through the diorama.
One thing that I found when creating these jellyfish was a lack of depth – they were often as flat as paper. I was able to combat this issue, however, by adding movement to each of the jellyfish, making them curve around the diorama using the highlighter tool.
Once I finally finished adding in the jellyfish, I experimented with extra effects and brushes such as the flower brush and the star effects to further entice the user around the diorama, giving the environment some light whilst also working with 3D Meshes.
In conclusion, Openbrush was my favourite software to experiment with, not only for its artistic capabilities but also for working within a 3D VR space for the first time. It allowed me to naturally produce art whilst being able to adjust it in Maya later on. It also allowed me to freely explore my artistic capabilities without being restricted to a singular canvas, and the effects only enhance that experience, letting me create beautiful pieces.
In terms of concept art and storyboarding, Openbrush is the best option for me to use purely because I’m able to work in a 3D artistic space, making sure that the environments I create are user friendly and can be accessible to a wide range of playtesters.
As previously mentioned, storyboarding in Openbrush is also beneficial due to its lack of restrictions, especially since I initially jumped from a 2D canvas.
This also means it's easier to create 360 experiences when working within the player's perspective. Therefore, for my larger project (especially as a manga artist), I'd like to use this alongside MASH networks to create scripted events – mixing these two mediums together would allow me to make a 2D and 3D world, the main hook for my project's storytelling and environment.
Whilst Augmented Reality may not directly contribute to my larger VR project, the medium was still interesting to explore, as I've mainly experienced the format through mobile games such as FNAF AR or Pokemon GO. These games utilise real-world environments to create interesting set-ups and mechanics that make their characters interact with the real world around them.
During my research into the capabilities of AR, I also noticed that AR could be used to advertise products or contact details, such as through QR codes. QR codes have become essential in our modern society due to their accessible nature, especially for users with visual impairments. With AR, by scanning a QR code, users could be greeted with a small animation alongside someone's contact details, adding uniqueness to any contact card or promotion.
In terms of games, as mentioned previously, AR can also be used to virtually interact with a real life environment. For instance, bringing art layers to life or having characters or different animations interact with the world around the user. AR is used primarily as an artistic medium in this case.
Creating AR using Zapworks and Unity
During this experimentation, I worked with Unity to create a small AR product that would allow users to scan a QR code so that, when they moved their camera over the image, it would pop up within that Unity environment.
I started off by setting up the Zapworks assets and plugins in Unity, adding in an image scanner, a rear-facing camera and an image tracking target. The image tracking target helps Zapworks identify the object when scanned, allowing the image to rotate or move alongside the user's camera movements.
Once I set up the environment, I adjusted the AR camera so that it would be able to scan and identify the image during the training process, and added an example image into the scene.
I also adjusted the workspace so the image scanner would be able to easily identify the AR trigger alongside adding the image as a tracking target.
Here is a video of the first part of the AR experimentation. This was just a test to help understand and experiment with Unity and Zapworks's functions and capabilities. However, I wanted to take the experimental AR project further by adding a 3D object that could interact alongside the image target.
So I created a separate object using the plane mesh and added an extra drawing as a material. Along with this, I added some triggers to the 3D object so that the mesh would pop up alongside the image.
Here, I added runtime events that would allow the art to only be visible if the user’s camera was hovering over the AR trigger.
Despite multiple attempts, the 3D popup for the AR experiment was unfortunately unsuccessful, but I'd like to continue exploring and improving this concept in the near future.
Test attempt at getting the 3D pop-up to appear; unfortunately, this did not work.
With this medium, I could definitely experiment further to develop my art, similar to my prior research. I could also potentially use this as a way to showcase different art pieces for my portfolio or for any commission work with employers in the near future, or even create some art and showcase it using the animation format. Either way, this medium within emerging technology could help users experience art through a 3D virtual lens, and in the future, I'd like to experiment with it by mixing different 2D and 3D elements, such as in this exploration piece, in order to present my artwork in a visually engaging manner.
For the second half of this exercise, I experimented with MASH networks to create abstract visuals that I could utilise in my future VR project.
To start off with, I used a range of MASH network nodes to understand the different shapes and patterns that each node could produce.
Spherical
I then started to work with different colours, using the colour node to adjust the colours to random hues and saturations. I also used seeds for each node to create different patterns, with the most interesting patterns coming from the random node to simulate explosions. I also found that the random node would create unique distributions in rotation, which allowed colours and shapes to overlap with one another.
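To make the seeding behaviour concrete, here is a minimal plain-Python sketch (not Maya's actual MASH implementation; the function name and fields are hypothetical) of how a seed makes a "random" pattern repeatable:

```python
import random

def random_node(count, seed, rotation_range=180.0):
    """Sketch of a seeded 'random node': the same seed always
    yields the same per-instance rotations and hues."""
    rng = random.Random(seed)
    instances = []
    for _ in range(count):
        instances.append({
            "rotation": rng.uniform(-rotation_range, rotation_range),
            "hue": rng.random(),               # hue in the 0..1 range
            "saturation": rng.uniform(0.5, 1.0),
        })
    return instances

# The same seed reproduces the same pattern; a new seed gives a new one.
a = random_node(5, seed=42)
b = random_node(5, seed=42)
c = random_node(5, seed=7)
```

This is why changing the seed per node produced visibly different explosion patterns while keeping each one reproducible.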
These visuals made the patterns gravity-defying and aesthetically appealing to view. However, from a VR perspective, these shapes and vibrant colours could potentially cause motion sickness, especially in a bright environment.
Here, I continued to work with different colours, shapes and compositions, now going for a more linear approach within my work whilst still experimenting with different hues; again, wanting to experiment with more abstract imagery.
Here, I worked with perspective, creating illusions that can only be seen from certain angles – for example, the everlasting winding staircase. Such an illusion could further impact VR immersion, potentially being used as a way to question the experience's true reality, much like the Alice In Wonderland VR experience.
However, one consideration to keep in mind when creating these illusions is the user experience. Users, especially newcomers to VR, will find illusions that involve questioning reality disorientating, particularly since VR is essentially meant to transport you into a new reality. So, if I were to use optical illusions within my project, it's best to keep them to a minimum to avoid disorientating perspectives.
I also experimented with animation for the MASH nodes. Here, for example, I used the random node to increase the strength of the explosion as the animation progresses, adding at least three keyframes to add anticipation to the sequence.
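As a rough plain-Python sketch of that keyframing idea (hypothetical names, not Maya's actual animation curves): holding the strength low across the first two keyframes and only ramping it up at the last one is what creates the anticipation before the explosion.

```python
def strength_at(frame, keys):
    """Linearly interpolate a strength value between keyframes.
    keys: a sorted list of (frame, value) pairs."""
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# Three keyframes: hold low for anticipation, then ramp up sharply.
keys = [(0, 0.0), (20, 0.1), (30, 1.0)]
strength_at(10, keys)   # slow build-up
strength_at(25, keys)   # rapid increase toward the explosion
```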
This will be incredibly useful later on, especially when making various action sequences for my VR project, due to its unique animation possibilities.
Wall Destruction Experiment
I continued to experiment with the MASH function, starting off with making a wall destruction sequence in order to further explore different possibilities for scripted action sequences.
Created using Replicator MASH and Grid Distribution as well as Transform MASH
To start off with, I created a single block, softening the edges to turn the block into a brick. Afterwards, I used the Replicator MASH and Grid Distribution nodes in order to shape the bricks into a simple wall structure with different patterns. I also added a colour node, using different shades and hues of pink and purple to make the bricks aesthetically pleasing.
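The grid-distribution step can be pictured with a small plain-Python sketch (hypothetical names, not the actual MASH nodes): each replicated brick gets a grid position, with alternate rows offset by half a brick to stagger the pattern.

```python
def brick_wall(rows, cols, brick_w=2.0, brick_h=1.0):
    """Grid-distribute brick positions; odd rows are offset by half
    a brick width to create the staggered brickwork pattern."""
    positions = []
    for r in range(rows):
        offset = brick_w / 2 if r % 2 else 0.0
        for c in range(cols):
            positions.append((c * brick_w + offset, r * brick_h, 0.0))
    return positions

wall = brick_wall(rows=4, cols=6)   # 24 brick positions in total
```

In Maya, the replicator and grid distribution nodes handle all of this automatically; the sketch just shows why a single brick is enough input to build the whole wall.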
Created a small box and attached a camera – Also added a Colour node to MASH network
Once I set up the brick wall, I created a small box cart using the extrude option as well as adding in a first person camera and parenting it to the cart. This is to test out the perspective once the box collides with the wall itself.
Camera work, especially in VR, needs to be tested regularly in order to immerse the player in the sequence. Therefore, I also added a border to keep the perspective centred on the main focus, as well as adjusting the focal length to make the wall appear closer than it actually is.
First Person Camera View
After adding in the camera, I created a curve in order to make the cart move in a linear direction, creating constraints between the curve and the cart in order to create this simple animation.
Top Down View of the distance between the Wall and the box cart – Added a custom curve.
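The motion-path idea behind the curve constraint can be sketched in plain Python (hypothetical names; Maya evaluates the curve itself): the cart's position is just the curve sampled at a parameter that advances each frame.

```python
def point_on_path(p0, p1, t):
    """Lerp the cart's position along a straight curve; t in [0, 1]
    plays the role of the motion-path parameter."""
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

start, wall = (0.0, 0.0, -10.0), (0.0, 0.0, 0.0)
frames = 24
path = [point_on_path(start, wall, f / frames) for f in range(frames + 1)]
```

Because the first-person camera is parented to the cart, it inherits every one of these positions, which is what makes the collision feel immediate from the player's perspective.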
Once I adjusted the camera, I added a signal node, parenting it to the cart itself. This node allows the cart to interact with the wall in the collision sequence, making it so that when the collision sphere is near the wall, the bricks gravitate away from the signal sphere.
As shown in the MASH signal settings, I customised the way the bricks will react to the collision sphere, mainly adjusting the rotation and the position of each brick to create that warping wall effect.
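The falloff behaviour I set up can be sketched like this in plain Python (a simplification with hypothetical names, not the actual MASH signal node): bricks inside the sphere's radius are pushed away from its centre, with the push strongest near the centre and fading to nothing at the edge.

```python
import math

def displace_bricks(bricks, sphere_center, radius, strength=1.0):
    """Push each brick away from a collision sphere with a linear
    falloff: bricks inside the radius move outward, others stay put."""
    out = []
    for b in bricks:
        d = math.dist(b, sphere_center)
        if d >= radius or d == 0.0:
            out.append(b)                     # outside the sphere's range
            continue
        falloff = (radius - d) / radius       # 1 at the centre, 0 at the edge
        direction = tuple((p - c) / d for p, c in zip(b, sphere_center))
        out.append(tuple(p + strength * falloff * u
                         for p, u in zip(b, direction)))
    return out

bricks = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
result = displace_bricks(bricks, sphere_center=(1.0, 0.0, 0.0), radius=3.0)
```

This also explains the early-collision issue mentioned below: a large radius means bricks start reacting well before the cart physically reaches the wall.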
Here are a couple of shots of this action sequence from different perspectives. As shown, in the first person perspective, there’s the illusion of the cart immediately hitting the wall.
However, in retrospect, whilst the cart does collide with the wall, the collision sphere's range means the cart collides with the bricks too early. In addition, from a VR perspective, with the bricks directly hitting the camera, I can imagine this could briefly disorient and startle users during the sequence.
Nevertheless, this experimental task was incredibly beneficial for learning how to create scripted events, showing how objects and structures can react to different node collisions. Not only this, but since I'll be working on a city environment, learning the distribution and replicator nodes gave me a better understanding of how to create an efficient and consistent blockout for taller, repeating structures when I started work on my VR project.
Finally, here is an outside perspective of the animation, with the sequence varying based on the size of the signal collider, once again creating unique and bouncy visuals. These perspectives are also dependent on how much the user can see from the first-person camera.
For example, there is more warping within this animation; however, the wall reacts before the cart can properly collide with it. This could be used as an opening portal to another section of an environment, and is also beneficial for 360 perspectives. The other example is more suited to a linear VR experience, with the cart realistically crashing into the wall; however, there's less of a unique impact when colliding with the wall.
Experimentation 3 – Portal and Audio MASH
For this experiment, I was interested in creating potential VR assets for my larger project. Inspired by the visuals of Across the Spider-Verse, I decided to recreate the portal effect seen in the movie to hopefully add to my environments later on in development.
Gvozden, D. (2023). The Definitive List of 'Spider-Man: Across the Spider-Verse' Easter Eggs. [online] The Hollywood Reporter. Available at: https://www.hollywoodreporter.com/news/general-news/spider-man-across-the-spider-verse-easter-eggs-list-1235506838/ Accessed on: [25/10/2023].
I started off by creating a simple shape for the portal, adding in coloured textures to each part of the model (including lighting and shadowed textures inside the hexagon) in order to emphasise a 3D effect that can be seen once the portal is animated.
Once I finished the model, I made a CV curve (much like the linear curve made in the wall destruction experiment) and added a curve node onto the MASH network for the hexagon. Once I attached the curve to the node, the vertical pattern was automatically created, moving downwards and creating an infinite loop based on the directional axis of the curve line itself.
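The infinite-loop behaviour can be sketched in plain Python (hypothetical names, not the MASH curve node itself): each hexagon instance sits at a parameter along the curve, and wrapping that parameter with modulo is what makes instances that pass the end reappear at the start.

```python
def loop_along_curve(count, t_offset, length=10.0):
    """Place `count` hexagon instances along a straight 'curve' and
    loop them: the parameter wraps with modulo so instances that
    pass the end reappear at the start, giving an endless tunnel."""
    positions = []
    for i in range(count):
        t = (i / count + t_offset) % 1.0          # wrap around [0, 1)
        positions.append((0.0, -t * length, 0.0))  # move downwards
    return positions

# Animating t_offset scrolls every instance down the curve forever.
frame_a = loop_along_curve(8, t_offset=0.0)
frame_b = loop_along_curve(8, t_offset=0.5)
```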
After creating this sequence, I started experimenting with the random node in order to make the patterns feel more realistic – as if you’re going through a portal that’s constantly changing. I slightly adjusted the rotation values based on randomness and this was the result:
Next, with the same mesh, I wanted to explore the audio nodes that MASH had to offer, wanting to create a unique sequence that makes it feel as if the world exists around the player.
So, using a song from the Spiderverse soundtrack, I used the spherical distribution node alongside a random node to once again adjust the meshes into different patterns, and then added the audio node in.
Usually, the audio node is mainly used on simpler shapes such as spheres or squares, so, at first, whenever the hexagon moved along with the song's waveform, it would distort and clip through the portal effect itself. I managed to fix this issue, however, by lowering the strength of the song's waveform.
After adjusting the audio MASH network, I added a simple black background to further sell the illusion, clipping the black image plane into the portal effect in order to make the loop appear more immersive when it started to move.
Then I added a simple camera set-up, adjusting the focal length to focus on the portal's warping effect as each hexagon passed by.
Camera and Scene set up
The portal effect worked in creating the illusion of infinitely gliding through it. However, as shown with the audio network, because the network is static, it breaks the illusion fairly quickly.
So, similar to my process for the portal effect, I created a CV curve in reference to the spherical distribution, allowing the portal effect to take place once again but this time with the audio synced up alongside the sequence.
This was the final result of the experiment, with different portals gliding the user through and around them. On one hand, the colours and composition help create the illusion of user movement; however, in a 360 environment, I'll need to consider how the user will handle the claustrophobic space that the portal effect creates.
For my larger project, I would want to work more with MASH networks such as these – they have helped me gain a solid understanding of how to approach the VR project, and I want to work with more illusions such as portals or warped effects. MASH networks also allow me to experiment with world building and action sequences.
With UX design especially, I want the user to be able to experience these effects, but possibly at a smaller and less space-intrusive scale for user accessibility. So, for instance, if I were to add the portal effect into my project, I'll need to keep the sequence short or adjust the camera to prevent motion sickness, which usually comes from experiencing VR in a claustrophobic sequence or with constantly moving objects.
New technology being introduced every year comes with new explorations and discoveries to be made, especially when it comes to new heights in player experience. VR has been a medium I've always been interested in but never fully experienced, nor was it a subject I was incredibly knowledgeable on, besides the Oculus Rift, a VR headset released in 2013.
Regardless, in these experimental blog posts, I'll be exploring different techniques and practices in order to gain not only a better understanding of the potential possibilities of creating different environments, but also to learn technical aspects such as UX design and player accessibility.
VR 360 Camera
To begin exploring VR, I started off by using Maya to understand the basics of experience creation. For this, I created a basic environment with different structural shapes in order to simulate the player’s height within the world.
I also added some basic sky dome lighting to the experience, giving the area more depth, especially when rendering the scene. The simple composition of these structures helped create a domain that towers over the user without being too overwhelming on the senses.
The VR camera helps add that sense of immersion by allowing users to feel as if they're in a much smaller position. Whilst this was difficult to create the first time due to proportion considerations, this exercise helped shape my basic understanding of environment building and led me to begin researching and applying one of the fundamentals of UX design in VR: immersion.
Since the VR video won't work in the browser, please download the video for the full experience.
WebVR – Exploring FrameVR
FrameVR is a web-based VR creator that allows users to create their own experiences and lets other members see their spaces on both PC and VR. The website also allows other users to collaborate on your space. These spaces could facilitate anything from work spaces to personal projects, or even blockouts for bigger VR projects, allowing users to test out the space in the earlier stages of the production timeline from both a VR and PC perspective.
This is incredibly helpful because these spaces are accessible on multiple device types, arguably reaching a wider market of playtesters, especially within the blockout phase of any project.
For FrameVR, I was mainly experimenting with the tools that the software provided by creating my own personal space that players could relax in within both PC and VR.
For this small project, I wanted to draw players' immediate attention to the table, using eye-catching objects such as cakes and food models to urge them to inspect the area as soon as they came into the space.
My main theme for this space was to serve as a 'shrine', so adding food, plants and candles onto the table helped contribute to the theme, alongside a large angel statue to amplify the calm and warm aesthetic of the space. Whilst a casual environment, this project was mainly to experiment with the different tools FrameVR had to offer, such as its ability to import different models and images into the environment, and even features such as the drawing whiteboard, text signs and screen-sharing board.
Whilst it was interesting to explore FrameVR and its capabilities, I won't be using it for my blockouts during the production of my VR project, mainly due to my familiarity with Maya as a software, as well as Frame's limitations, such as its limited polycount and my lack of experience with Frame's modelling tool.
Nevertheless, this website was interesting to experiment with, especially for understanding how visual aesthetics can capture the user's attention without making the area overcrowded.
Illumix (2019). Five Nights At Freddy's AR: Special Delivery [Video game]. Illumix: USA. Available online: https://fnafar.com/ Accessed on: [18/10/2023].
Niantic, Inc. (2016). Pokemon Go [Video game]. Niantic, Inc: San Francisco, USA. Available online: https://pokemongolive.com/?hl=en Accessed on: [18/10/2023].
Spider-Man: Across the Spider-Verse (2023). Directed by Justin K. Thompson, Joaquim Dos Santos, Kemp Powers. [Film]. California, United States: Sony Pictures Animation.
JoJo's Bizarre Adventure: Part 2 – Battle Tendency (2014). Directed by Toshiyuki Katō. Written by Hirohiko Araki. [TV programme]. Netflix.
Fujimoto, T. (2018). Chainsaw Man. [Manga] Tokyo: Weekly Shōnen Jump.
Chainsaw Man: Season 1 (2022). Directed by Ryū Nakayama. Written by Tatsuki Fujimoto. [TV programme]. Crunchyroll.