Raybeem is available on Steam for Oculus Rift & HTC Vive.
Raybeem is the first VR project from Sokay. It’s not 2D. It’s not a game. Yet, it’s as close to my heart as any of the previously released Sokay products. I spent 2017 working full-time on Raybeem. I can describe it as a whirlwind, which was hard to make sense of while I was in the middle of it. I’m writing this mess to dump what’s been going on in my head into a huge blog post.
Raybeem is a virtual reality project that started out as a desire to listen to my favorite Drum & Bass station, Bassdrive.com, in Virtual Reality. As an adolescent, I spent dozens of hours of my life gazing at Winamp visualizers. Switching through visualizations to find my favorite, tweaking settings, and downloading new ones. This was the prime way of listening to music for Bryson in the year 2001.
This was my dream for 2017, listening to Drum & Bass music in VR. In a world of my creation.
Raybeem’s been a large project for me so I’ve had a hard time figuring out what to say about it. At its core, I consider Raybeem to be a VR music visualizer. That was the starting point for the concept, anyway. I just wanted to create an application for listening to music in VR. The different visualizations in Raybeem are thought of as “Themes.” The way I imagined it, a Raybeem theme is any environment that reacts to music in some way. So a theme could be anything – realistic or abstract, interactive or non-interactive.
When I started working on Raybeem, I had a strong idea of what I was trying to build. As it turned out, even with all the notes, sketches and prototypes, I fooled myself into thinking it would be less work than it was. Here’s a brain-dump of a lot of the things I figured out on my Road to Raybeem’s release.
This was a demo to figure out how to access the audio spectrum data in Unity.
The heart of Raybeem is a component that analyzes the audio spectrum. This is accomplished in Unity by using the AudioSource.GetSpectrumData() method. This gives me the audio data for the current frame of audio being played through Raybeem. It doesn’t care what the source of the audio is, as it can come from local MP3s, web streams, microphone input, etc. I divided this data into bands and averaged each to come up with LOW, MID, and HIGH range values between 0.0 and 1.0. This essentially gives me a percentage for each of those ranges. I still consider it a naive approach but it worked GE (Good Enough).
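The band-splitting idea can be sketched like this. This is my own illustration, not Raybeem’s actual code, and the band boundaries (`low_end`, `mid_end`) are made-up values – in Unity the input would be the float array filled by `AudioSource.GetSpectrumData()`.

```python
def band_levels(spectrum, low_end=0.25, mid_end=0.5):
    """Average spectrum magnitudes into LOW / MID / HIGH levels.

    `spectrum` is a list of non-negative floats, one per frequency bin.
    The band split points are illustrative guesses, not Raybeem's values.
    """
    n = len(spectrum)
    low = spectrum[: int(n * low_end)]
    mid = spectrum[int(n * low_end) : int(n * mid_end)]
    high = spectrum[int(n * mid_end) :]

    def avg(bins):
        return sum(bins) / len(bins) if bins else 0.0

    # Clamp to 1.0 so downstream components can treat these as percentages.
    return tuple(min(avg(b), 1.0) for b in (low, mid, high))
```

Averaging whole bands like this throws away a lot of nuance (which is why I call the real thing a naive approach too), but it gives each theme component a simple 0.0–1.0 number to react to.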
I used that audio value to determine a variety of different things. For the most part, these components control things like object scale and color. Another example is the particles that boost their speed above a certain audio threshold in the “Blue” theme. These reactions are defined simply through Unity components that I can drop on objects in themes.
An example of a Raybeem audio react component within Unity.
Since then, I’ve learned a few more things about working with audio. At some point I’d like to revamp the audio analysis aspect of Raybeem to allow for more nuance in theme reaction. I’ll get to that sooner or later!
The only thing that Raybeem’s themes have in common is that they react to music in some way, or provide a great experience for listening to music. In most cases, I’ve created components that listen to an audio spectrum range and react based on its percentage of activity. For example, in the “Blue” theme, I have particles that float around and react to the MID range of the audio spectrum. When the MID range is at 0%, the particle is yellow, and when it’s at 100%, it’ll be completely red. If the MID range is at 50%, it’ll be exactly halfway between yellow and red, so some orange-y color. I’ve created different components in the themes which apply the same logic to properties like ‘scale,’ or more complex color shifting for the windows in hundreds of instances of buildings in the “Highway” theme.
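The yellow-to-red reaction above is just a linear interpolation. Here’s a hypothetical sketch of it (in Unity this is essentially `Color.Lerp(yellow, red, midLevel)`; the code below is my own reconstruction, not Raybeem’s):

```python
YELLOW = (255, 255, 0)
RED = (255, 0, 0)

def lerp_color(a, b, t):
    """Linearly interpolate each RGB channel between colors a and b.

    t = 0.0 returns a, t = 1.0 returns b, t = 0.5 lands exactly halfway.
    """
    return tuple(round(ca + (cb - ca) * t) for ca, cb in zip(a, b))
```

Feed the MID band level in as `t` and the particle sweeps from yellow through orange to red as that range of the spectrum gets busier.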
That was a mundane explanation, but essentially I made a bunch of these components and, playing around with them, I stumbled upon these themes.
The initial theme for Raybeem is this particle system called “Blue.” The algorithm for the particle system is actually based on a tutorial I read by Robert Hodgin. I believe Apple bought the iTunes visualizer from him as it was originally a plug-in he developed for iTunes. It’s quite a shame that they took out all the bells and whistles from it, but that’s another story…
Anyway, the particles were initially going to be something like faeries or butterflies in this “Zen” theme I was developing when I first started working on Raybeem. When I started testing them as I was blocking out an environment for VR, I felt that it was mesmerizing to just sit and watch what the little particles floating around were doing. This was before I even had music in the demo.
From there, I was unclear on the exact visual direction for this “Zen” theme so I dropped the particles into a new theme test with a black backdrop. The black backdrop was too harsh on my eyes – solid black with white particles on it – so I tinted the backdrop Sokay blue. Then I noticed that having a solid color as a backdrop didn’t feel right, so I added a gradient so that you can at least get a sense of which direction is up. I then added an icosahedron wireframe dome to give some sense of a horizon. Then I dropped in faded building planes in the distance to give a better sense of depth. Finally, I made additional tweaks to the particle shaders for a fog-like effect to show even more depth.
This was the testbed for figuring out if there was a visual connection to the music. Initially it was on Gear VR so I could easily show it around to people to gauge interest in it. I felt that most people had a very positive reaction, even if some couldn’t see past the initial prototype. I found out that a lot of people couldn’t get past the abstract nature of this theme, so I wanted to show a variety of themes so people wouldn’t get things twisted.
Once I hit the ground running, I had this concept for a “Highway” theme for Raybeem. I had gone to Tokyo, Japan three years in a row and the vision of a huge never-ending city has been forever seared into my mind. I’m from LA, which is a big city, but it’s all spread out. Not so dense. I wanted to try to make an endless procedural theme where you could “drive” on a traffic-less highway forever. The LA dream, a city with no traffic (as my homie Chris pointed out to me).
Don’t ask me why, but I decided on modeling this thing myself. I’ve got an art background so I’m not shy to throw down when it becomes necessary. But it’s also necessary to flex once in a while to maintain said skillz. I was aiming for a lo-fi Star Fox kinda look since I’m still nostalgic for that era of 3D graphics (shout out to Virtua Fighter and Virtua Racing!). I was experimenting with that look with my WebGL Rush-D demo a few years ago, so I started right where I left off.
A city block laid out within Unity.
The city didn’t end up being procedural, but it’s a bit randomized. I set up about 10 different arrangements of city blocks. For each block, I meticulously placed buildings by hand and adjusted them based on how they looked as you passed by in VR. The order in which the blocks come by is random. I used the Unity asset Curved World to randomly bend the environment to give a sensation like you’re turning around corners while on this endless road.
Initially the plan was to take you through different areas of the city – tunnels with trains and underground structures, coastlines, and endless bridges. I scaled back my vision because I had to actually make all of that stuff! It became time consuming because I ended up needing to add more details to the theme like benches and trees.
I sketched out concepts for city block types.
I cut corners wherever I could. Since I was going for a flat-shaded look, I decided to use the COLR Unity asset. I figured the only way I was going to finish Raybeem was by setting limits and accepting what I could get within whatever constraints. COLR saved my ass because it shades polygons based on which direction they face (essentially North, South, East, and West). I had to combine this shader with the Curved World shader and it was a fairly easy integration.
Where things got tricky was trying to update so many instances of materials. For this theme, I isolated the window materials for buildings and adjusted those dynamically. Since I’ve got dozens of instances of each particular material, I decided to just modify the sharedMaterial for each one. You’re generally not supposed to do that because it has the unintended consequence of modifying the original asset, but since these materials would only ever be updated by code, I went that route anyway. At first it didn’t work and I tried some other things – it’s all cloudy now, and I ended up building a convoluted method of finding a reference to the material – but none of that matters anymore. Let’s put the past behind us, shall we? It’s working after all! 😉
Oh yeah, the reason the material wasn’t updating after I changed the colors dynamically was the way the COLR asset was set up. It used Unity Editor scripts to update the material when you used the color picker in the Unity Editor, so I had to make something to force those updates to happen from within Raybeem at runtime. It was a nightmare to figure out, but in the end it was an easy fix.
I’m still a raver somewhere deep down in my cold heart. Since I started Raybeem I wanted to have a “purely raver” theme but didn’t know which direction to go. With perfect timing, just as I was trying to figure out a direction for a new theme, I found this awesome post on Reddit. Yes, if you can’t tell, I shamelessly ripped off the feel of this theme. It barely took any time to figure out how to make a particle system to create this warm glowiness in the distance that just felt great in VR. I combined it with a Kaleidoscope asset that I’d been looking for an excuse to use.
I wanted people to be able to give VR lightshows with this theme. I’m still working on the hand functionality, but right now you can leave light trails and shoot butterfly graphics. Hopefully later I’ll add other graphics like unicorns and other raver kinda iconography.
So I’ve never been to Burning Man, but I’ve heard stories from homies. This theme is directly based on my experiences at a rave in the middle of a desert near Lancaster about 10 years ago. I cracked my windshield driving off-road in a Toyota Camry chasing glowstick waypoints in the darkness. It was worth it to experience firedancing first-hand, in the bitter cold of the California desert. Yolo.
For this theme, I actually got the fire from one of Unity’s tutorial projects. It’s on the official Unity YouTube somewhere if you’re interested.
This is the first theme that I released in an update after Raybeem’s release. With this theme, I set out to give more control to the user and give them a world that they can have more interaction with. Much of this interaction comes in the form of an optional mini-game.
When starting the theme, you appear in a space-like environment with star bits and nebula-ish clouds flying by you, giving you a sensation of movement. You can use the control sticks to move in the direction you’re facing, backwards, or side-to-side. You have a mini-map in the HUD locked to your view; it appears in the bottom left corner. Using that, you can find a “Sokay coin,” which appears as a beacon. Upon collecting the coin, a mini-game is triggered and you must defeat all of the spawning robots before the time is up.
I kitbashed some Unity assets for the robots and the guns you use. I might eventually create new artwork for this theme, and revamp the gameplay, but I’m planning to focus on new themes at this point. Here’s a video of it in action if you want to take a look.
Getting this thing working for both of these controllers wasn’t that fun.
Initially Raybeem was a prototype for Gear VR and the Oculus DK2, which relied solely on gaze for UI selection since VR motion controllers weren’t readily available at the time. As I began serious development of Raybeem, I had to rework the UI input system to be more flexible, allowing it to work with any number of selection inputs. In addition to the new UI functionality, I had to figure out a system for dealing with VR input from both the Oculus Touch and the HTC Vive controllers.
For the gaze-based UI, I ran an update loop that raycasted from the camera and checked for collisions with any object containing an interactive UI component. If that UI component wasn’t already selected, it became the selected object. From there, any selection button press would trigger that interactive UI component. To make it work with controllers, I instead made a list of different sets of raycasts that interact with interactive UI components. So now I had:
- Camera Raycast
- Left Controller Raycast
- Right Controller Raycast
- … and as many more might be needed
But the problem with that was that I didn’t want the gaze to be the selector – although I might want the gaze to be a selector if VR motion controls weren’t available. So I set up a data type that sends along whether the raycast is a “gaze” or “selector” input. This way I can switch the gaze to be a selector if necessary, and I can also have the gaze still interact with UI elements. Currently, the VR menu for Raybeem will automatically fade out and move away from you as you look away from it, making use of the “gaze” type of raycast. This system still only allowed for one “selected” interactive UI element, which created a problem: you could try to highlight different buttons with different hands. Because of that, I followed the lead of other VR software, limited selection to the “active” hand, and hid the other hand’s input. Just act like the problem doesn’t exist in the first place!
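The gaze/selector split can be sketched roughly like this. This is my own reconstruction of the idea, not Raybeem’s code, and the names (`RaycastSource`, `pick_selected`, `gaze_target`) are invented for illustration:

```python
GAZE, SELECTOR = "gaze", "selector"

class RaycastSource:
    """One raycast origin: the camera gaze or a motion controller."""
    def __init__(self, name, kind):
        self.name = name   # e.g. "camera", "left", "right"
        self.kind = kind   # GAZE or SELECTOR
        self.hit = None    # UI element this ray is currently pointing at

def pick_selected(sources, active_hand):
    """Only the active hand's selector ray can drive UI selection."""
    for src in sources:
        if src.kind == SELECTOR and src.name == active_hand and src.hit:
            return src.hit
    return None

def gaze_target(sources):
    """Gaze hits still feed passive behavior, like the menu fade-out."""
    for src in sources:
        if src.kind == GAZE and src.hit:
            return src.hit
    return None
```

If motion controllers aren’t available, you’d just tag the camera ray as `SELECTOR` instead and the same selection path works off gaze alone.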
Once I got the VR motion controllers working for the UI, I had to figure out what the hands were actually going to do in the themes. Up to that point, I hadn’t thought about it much because I was focused on many other buggy aspects of the project. I felt like I could just whip up anything quickly for the user’s hands if I had to. A challenge in that process was figuring out how to handle VR controller input for multiple types of controllers, as well as button schemes for future themes that might be nothing like the themes I launched with.
I ended up breaking down the input something like this:
- Menu button – brings up the VR menu
- Stick (analog) – sends Vector2(x,y) value
- Trigger (analog) – sends a value between 0.0 – 1.0 based on how hard it’s pressed
- Grip (digital) – pressing activates a theme effect
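That breakdown amounts to normalizing each SDK’s raw controller state into one shape that every theme consumes. Here’s a hypothetical sketch of that mapping – the `raw` dictionary keys are made up for illustration, and each VR SDK would need its own translation into this form:

```python
def normalize_input(raw):
    """Map a raw controller state dict into Raybeem-style abstract input.

    The input keys here are invented examples; the point is that themes
    only ever see the normalized output shape, never the SDK specifics.
    """
    return {
        "menu": bool(raw.get("menu_button")),                    # digital
        "stick": (raw.get("stick_x", 0.0),
                  raw.get("stick_y", 0.0)),                      # Vector2(x, y)
        "trigger": max(0.0, min(raw.get("trigger", 0.0), 1.0)),  # clamped 0.0-1.0
        "grip": bool(raw.get("grip_button")),                    # digital
    }
```

A theme that ignores, say, the grip just never reads that key, which is how a theme like “FireDancer” can leave buttons unused without breaking anything.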
Aside from the menu, the controls between different Raybeem themes are inconsistent. Some people didn’t like this because “What do the buttons do?” In some cases, like the “FireDancer” theme, the buttons do nothing. This is partly because I’m still figuring it out and was short on time, and I don’t want to force functionality in if it doesn’t make sense to me at the time. My explanation for this is that all themes are self-contained and follow their own rules. Some themes might have no interaction whatsoever, some might be full-blown games with complex logic for different situations. These differences most likely need to be presented to the user in some way (so far no tutorials within Raybeem ;p ). As Raybeem continues to materialize, I hope to find a way to ease people into things. Defining the input for these interactions has given me a starting point that I hope to polish up over time.
In retrospect, a lot of these problems probably could’ve been alleviated if I’d used VRTK, but when I first came across it I wasn’t sure how legit it was. Besides, sometimes it’s worth doing things the “rough” way to understand what’s really going on.
VR Camera Rigs
One of the most challenging aspects of this project was making it compatible with both the Oculus Rift and the HTC Vive, as well as having it function perfectly fine without VR. I wasn’t even sure if that was possible when I first took on this task. Although I had both VR HMDs, I led development with the Oculus Rift. Since I started with Oculus’ Gear VR SDK, I didn’t have to change anything for Oculus Rift compatibility. Oculus is a lot easier to develop for simply because there’s API documentation, which I believe doesn’t exist for the HTC Vive. For Vive development, you’re expected to pick apart demos and trawl their development forums, which is super unproductive. Once you get past that, development is pretty similar for both of them.
To handle multiple VR SDKs, I created a bootleg “VR Detection” system. When you boot Raybeem, and if the “VR Active” setting is set to ‘ON’, Raybeem will attempt to activate a VR mode. It checks to see if VR is available, then loads an appropriate camera rig based on whether you’re using an HTC Vive, an Oculus Rift, or no VR. These camera rigs are based on the rigs delivered in each respective VR SDK. For the most part, I modify these rigs by just dropping on components that allow the cameras to integrate into Raybeem. There are components for UI raycasting, post-processing, and camera fade effects.
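The boot-time decision boils down to a small branch. This is an assumed sketch of the flow, not the actual implementation – the rig names and the `detected_device` values are stand-ins for what the SDKs would report at runtime:

```python
def choose_camera_rig(vr_active_setting, detected_device):
    """Pick which camera rig to load at boot.

    `vr_active_setting` mirrors the "VR Active" ON/OFF setting;
    `detected_device` is a stand-in string: "oculus", "vive", or None.
    """
    if vr_active_setting and detected_device == "oculus":
        return "OculusCameraRig"   # rig based on the Oculus SDK's prefab
    if vr_active_setting and detected_device == "vive":
        return "ViveCameraRig"     # rig based on the Vive SDK's prefab
    return "AttractCamera"         # non-VR fallback, described below
```

Keeping the choice in one place means each rig can stay close to its SDK’s stock prefab, with Raybeem-specific components just dropped on top.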
It is possible to simply use a Vive rig since it’s “OpenVR” and Oculus is supported, but from my understanding, that approach doesn’t take advantage of Oculus’ drivers, which I believe are the only ones that support ASW (Asynchronous Spacewarp) at the moment. So for the best Oculus experience, you really want to use Oculus’ SDK.
If no VR camera is loaded, the VR detection system loads an “Attract Camera” instead. The Attract Camera displays the scene from different perspectives while a user is not in VR. The intention is to always show some action on the screen regardless of whether someone is interacting with Raybeem. The attract camera also displays when you unmount the Oculus HMD; I believe this functionality is disabled on the Vive because it was causing crashes (but don’t quote me on that).
Right now that Attract Camera has 3 modes, set by the current theme.
- Orbit: Essentially a turntable that rotates the camera around a point and randomly switches zoom and changes speed & direction.
- Teleport: Camera jumps around to different defined locations in the scene.
- Combo: Camera randomly switches between “Orbit” and “Teleport” modes.
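The “Orbit” mode’s core math is just a point on a circle around a focus target. This is my own illustration of that idea, not Raybeem’s code – the random speed/direction flips would simply drive how `angle_deg` changes per frame:

```python
import math

def orbit_position(center, radius, angle_deg, height=0.0):
    """Camera position on a circle of `radius` around `center`.

    `center` is an (x, y, z) tuple; `height` offsets the camera
    vertically, and zoom changes would just vary `radius` over time.
    """
    rad = math.radians(angle_deg)
    cx, cy, cz = center
    return (cx + radius * math.cos(rad),
            cy + height,
            cz + radius * math.sin(rad))
```

Each frame you advance (or randomly flip the sign of) the angle increment, point the camera back at `center`, and you get the turntable effect for free.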
The Attract Camera system was one of my original goals but ended up being a pain to implement because I was already neck-deep in tasks when I started getting it in. It was important to bringing Raybeem to life, though, and I was able to capture a lot of great footage for social media because of it. I hope to keep layering in new Attract Camera modes as new themes bring new requirements to the table. Check our Instagram for footage of this in action.
Raybeem Live at a big trance show at Garage Gallery LA, in Downtown Los Angeles.
Sokay did its first live events during the Downtown Los Angeles Art Walk back in 2011. These were called “Sokay Play.” At the time, I was knee-deep in the development of our Flash game, DONUT GET! Initially I was ashamed of showing our older games like Thugjacker and LUV Tank at these shows, but I quickly realized that it didn’t matter to people how old our games were because they had never seen them before. Our old games were “good as new” to the artwalkers.
With Raybeem I was trying to improve on the user feedback loop, compared to how I handled DONUT GET! I made the mistake of not testing DONUT GET! seriously until the end of the project. This was partially because it took so long to bring all of the parts together for a cohesive experience. In retrospect, I should’ve got it in people’s hands sooner so that I could actually take feedback into consideration before becoming burned out on the project.
Raybeem Live at VRAR, Long Beach Main Library in Downtown Long Beach. Reppin’ 562!
The first show I did for Raybeem was at the event VRAR at the Long Beach Public Library main branch in January 2017. I was lucky that my homie Gabotron helped organize this event at his job. I wasn’t ready, but it was a golden opportunity. This was the first month that I started full-time development of Raybeem. It was still basically a prototype at that point. Raybeem was only the “Blue” theme, with some enhancements and optimizations to take advantage of the power of PC. I added in the ability to cycle through different palettes, but it didn’t have any motion controls or interactivity at that point.
It was an exciting experience to show these strangers a glimpse of my new project. That event was the first VR experience for a lot of people, and everyone that tried Raybeem walked away with a smile. Even the old dude that wanted me to play some heavy metal! That was extremely encouraging so early on. I started to take in the comments from everyone trying it out. I don’t remember many major revelations, but this is where I started my “listening process” with users. I listen and try to understand the core of what they like and what they don’t like, and try to imagine ways to address their concerns. Little did I know at the time, I was going to be in for a lot more live events for Raybeem!
Our first major show was the art show Super Future, from Futra.
It turns out that music is sorta universal. I got lucky to be accepted and invited to several shows leading up to the release of Raybeem. I came across Futra‘s call for submissions for their summer show, Super Future. It was this hybrid art, music, and performance event hosted at Lot 613 in Downtown LA. We set up a proper booth and I sweated while hoping everything would go smoothly.
Super Future turned out well thanks to all the Sokay Family that came out to help and show support.
In the end it was all good. We had a long ass line of people for most of the night and we got a lot of great feedback. I was relieved that I didn’t have to man the booth since I had Sokay Family helping out. It was a strange experience to see so many people coming up to me like I was some kind of expert on virtual reality stuff.
That event went well but it took a lot out of me. I ended up doing several more shows before actually releasing. We did a few events at Upload LA, a few with a dope LA rave crew. Around this time I finally felt like I was onto something, because when Raybeem’s on the dance floor, next to a big ass speaker, reacting to what a DJ’s playing – that’s the recipe right there. That moment is what it was all about. I envisioned Raybeem as something for me to use to smoke weed, kick back, and engulf myself in my tunes after a hard day of traffic and diving in code all day. This personal home experience. Events is where it’s at, though!
I’m still trying to figure this out. Doing events takes a lot out of me and it can be a pain to lug a bunch of fragile equipment around. I’ve gotta turn down opportunities for shows sometimes. Gotta keep on pushin!
Follow Sokay on Instagram at @sokaylife.
When I started Raybeem, I decided to take social media a bit more seriously and got a Sokay Instagram setup. We have a Sokay Facebook page with some likes, but it’s been pretty inactive. The plan was to get some help with that so that I could focus on development and whatever else I needed to do for Raybeem, while someone took care of the social media dealings. In the end, that didn’t work out and I had to handle all this social stuff while I was doing development.
For Instagram, I kept it pretty basic and tried to post a picture a day. I started with old Sokay content, to give me time as I prepped Raybeem to a point that I felt good showing it off. I don’t know what else to say. I made sure to take a lot of photos and videos of people using Raybeem just in case it might be useful on social media. First impressions are some rare moments you can’t recreate.
The Flyer and The Teaser
I printed out and hand assembled the Raybeem flyers. By this point, I think I’ve handed out over 200 flyers at shows, conferences, and various events.
I wanted to have something a bit different to post on social media so I brought on the homie Ramiro to do a comic for Raybeem. He’s been killing it with his horror comics and I always wanted an excuse to work with him. One of the challenges I faced with promoting Raybeem was that for a long time I just had a single theme and it looked really “samey” in screenshots and footage. I was thinking that having a comic based on the idea of Raybeem would at least give me some content to promote Raybeem with on social media. In addition to that, we designed it as a flyer to be handed out at our booth when doing live events.
Ramiro’s rough layout sketch of the Raybeem comic.
My favorite part about collaborating with someone is that they often take a vague or bad idea I have and turn it into something great. From what I remember, I told Ramiro some vague concept about the VR application “Raybeem” being created by this mad scientist type of guy named “Dr. Raybeem.” This doctor set out to resolve all forms of psychological damage with a treatment in the form of this virtual reality software, dubbed “Raybeem.” As the story goes, Raybeem worked, but it worked a bit too well and became an epidemic. People sought it out over drugs & alcohol because it was a place they never wanted to leave. Nobody has seen Dr. Raybeem in years and some say that he lurks somewhere within the code.
Read the comic for yourself at http://www.liveinyourmusic.com
That’s not all! I really had to see Ramiro’s artwork in motion, so I decided to animate a motion comic as well.
I’ve got no reason to be animating in After Effects but I really enjoy this kinda stuff from time to time.
Putting together a trailer for the release of Raybeem was something that I wanted to get some help with. Fortunately, the homegirl Jennifer Estrada was down to help me out. What made this a challenge was that I was still figuring out exactly what the “package” of Raybeem was. In other words, I had figured out most of the functionality but I wasn’t sure how to message it. Additionally, I wasn’t sure exactly what I was going to ship with the release. The features and themes were changing up to the last second. Fortunately, she was patient with me and probably used to it, because she makes trailers and such for a living!
Here’s the result:
Steam & The Release
While my head was spinning with bug fixes and testing on multiple VR headsets on multiple computers, I realized I had to actually release this thing on Steam. I got a feel for the general process of releasing games on Steam once I got the store “coming soon” page up.
None of this felt real until I finally saw the Raybeem listing on Steam. Something something something, I eventually figured out how to release builds and whatnot but that was a headache of a process when I just wanted to get it over with.
I’m still tripping that I finally made it to Steam. My first experience with Steam was downloading whatever activation there was for my Best Buy bought Half Life 2 on release day on my crappy connection to my neighbor’s wifi because I didn’t even have internet at the time. Now I’ve got some Sokay stuff on Steam! (and a legitimate internet connection)
Keep on Raybeem’n!
I just released an update to Raybeem with the new StarStream theme. I noticed a pretty bad bug in it just as I was prepping for release, while the homie came thru and tested it a bit. Oh well, there’s more where that came from! Haha!
If you lasted this far in my essay, much love! I think even my mom gave up a few chapters ago. If you’ve got some questions or feedback, feel free to hit me up at sonofbryce [at] sokay.net.
One of our next shows is coming up! VRAR2 at Long Beach Main Library, March 3rd 2018. Stay tuned for more details!
- Join the Sokay Newsletter!
- Raybeem on Steam
- Raybeem on Viveport
- Raybeem coming soon to Oculus Store (hopefully!)
- Sokay YouTube, for Raybeem development videos
- Sokay Instagram
- Sokay Facebook
- Bryson on Twitter
- Oculus’ Unity documentation