{"id":44,"date":"2016-09-05T15:11:00","date_gmt":"2016-09-05T15:11:00","guid":{"rendered":"http:\/\/blog.livierickson.com\/?p=44"},"modified":"2023-08-31T21:44:30","modified_gmt":"2023-08-31T21:44:30","slug":"writing-a-talk-about-webvr-using-webvr","status":"publish","type":"post","link":"https:\/\/liverickson.com\/blog\/?p=44","title":{"rendered":"Writing a talk about WebVR, Using WebVR"},"content":{"rendered":"\n<p>This past Thursday, I had the privilege of speaking about the VR Web, a topic near and dear to my heart, at Coldfront, a front-end focused, single-track conference in Copenhagen, Denmark. Despite battling off the remnants of a particularly nasty cold, my talk at Coldfront was one of my favorites that I\u2019ve ever given. Why? The audience was great, the organizers were wonderful \u2013 but what really excited me about giving this talk was that I wrote the thing using A-Frame.<\/p>\n\n\n\n<p>We\u2019re at an exciting time with the VR Web. WebVR 1.0 is slated to hit Chrome 55 in December, and is in Nightly builds of Firefox. The <em>specs<\/em> of dust (see what I did there? No?) are clearing and the browser-based virtual reality API is being considered as a W3C standard for the internet of the future. I figured that now was as good a time as any to see if it could hold up to a test: a 45-minute presentation on stage in front of 300 or so people. Was my laptop up for the challenge?<a href=\"http:\/\/web.archive.org\/web\/20200923151142\/https:\/\/livierickson.com\/blog\/wp-content\/uploads\/2016\/09\/pic1.png\"><\/a><\/p>\n\n\n\n<p id=\"caption-attachment-8971\">As a matter of fact, it was. I know for a fact that I\u2019m not the first person to attempt this (one such pioneer was sitting in the audience, in fact!) but it was <em>so much fun<\/em> to give a talk about virtual reality using a 3D environment in my presentation. 
I quite literally walked through my presentation, then pulled up the code and showed how the magic happens.<\/p>\n\n\n\n<p>Building a rough (and it is like, pre-Nintendo era graphics rough) A-Frame application for presenting VR was an eye-opening experience.<\/p>\n\n\n\n<p><strong>Planning &amp; Project Structure<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>I started with a basic index.html file that brought in the minified A-Frame library. I used the built-in primitives to create a rough outline of what I wanted my experience to look like.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>I separated out the content that I wanted to cover into a few different buckets \u2013 these would have been slide groups or sections in a traditional presentation, but I cut down on displayed content quite a bit for the purposes of performance. I don\u2019t think anyone missed it.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Each content bucket became an &lt;a-entity&gt; element that contained all of the related information. As an example: my first content area was an introduction, which contained a photo of myself, the title of the talk, and my name. These were all A-Frame objects that were children of the parent &lt;a-entity id=\"introduction\"&gt; element.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>I created a PageComponents.js script that called AFRAME.registerComponent to define behaviors for the objects in my scene that I wanted to be able to interact with. 
There were a few functions that I wrote, many of which hid or showed the next content batch, and most of which looked fairly similar to the \u2018play-video\u2019 component.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>I used the A-Frame text-component library that <a href=\"http:\/\/web.archive.org\/web\/20200923151142\/https:\/\/github.com\/ngokevin\/aframe-text-component\">Kevin Ngo wrote<\/a> to add \u201ctext boxes\u201d to help define different sections of the content<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>I used a few different skyboxes to demonstrate 360 photos, and a video texture to play a clip of a different WebVR application that I had written using the .NET framework<\/li>\n<\/ul>\n\n\n\n<p><strong>Presenting on Stage<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>I had a backup slide deck ready to go, which I had loaded onto my iPad to use as project notes. It was weird doing a presentation without defined presenter notes, but having a secondary device to scroll through for more context helped me stick to my points without needing to have them all written out on screen.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>I used the Edge browser to navigate through my experience. There are a few things about this that I want to share, besides the disclaimer that I do, remember, work for Microsoft:\n<ul class=\"wp-block-list\">\n<li>I do <strong>not<\/strong> rely on conference WiFi, ever. I don\u2019t care who is hosting the event: every single one of the examples that I plan to use will be located locally on my machine. If I want to demonstrate something that requires live endpoint access, it\u2019s probably going to be a video. I also avoid running locally hosted web servers. 
This means that I don\u2019t use Chrome to present most of my applications: up until about a week ago, I had never been able to get a web server* other than IIS running for my Unity WebGL builds, and Chrome gets angry about locally-hosted resources because of CORS. That\u2019s a long tangent to get to what I\u2019m saying: between Firefox and Edge, Edge handled displaying WebGL content for 45 minutes better than Firefox did. My metric is how loud my laptop fan was running, so YMMV. I know, very specific performance testing there.<\/li>\n\n\n\n<li>I keep hoping that if I pretend Edge supports the WebVR API, it will one day magically support the WebVR API. (Hi Edge team! My alias is Livieric if you want to chat!)<\/li>\n\n\n\n<li>Until I get one of those shiny new Nvidia laptops with a desktop-ready graphics card, I can\u2019t present with a desktop VR headset anyway, so having the experience render stereoscopically is a moot point. Also, on stage, people don\u2019t care if it\u2019s stereoscopic \u2013 they\u2019re seeing it on a giant projector anyway.<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>While not related at all to WebVR specifically, I wore my GearVR on my head the entire 45 minutes. This served several purposes: when, as a speaker, I show that I don\u2019t take myself too seriously, it helps the audience feel at ease too, and the talk is a lot of fun. I was basically playing my talk like a video game, so the accessory helped set the tone. I also had a convenient place to store it while I wasn\u2019t talking about mobile VR headsets. The really interesting part was afterwards \u2013 someone in the audience mentioned that it helped break the stigma of wearing a VR headset for them, which I thought was pretty cool.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>After I had finished walking through the app, I went into the source code and showed the audience how I had written the application. 
It ended up being about 300 lines of code total for the whole thing. Pretty snazzy!<\/li>\n<\/ul>\n\n\n\n<p>I tend to alternate between Unity and WebVR technologies, but I stay really passionate about both of them, even when I\u2019m trading off. There are a lot of benefits to each approach, and I\u2019ve been spending a lot of time in Unity recently for HoloLens development, so it was great to get back into the web ecosystem to build out my talk. I was impressed with how A-Frame has been maturing, and with the tools evolving around it \u2013 shout-out to Kevin Ngo for the text-component addition; that library is incredibly helpful.<\/p>\n\n\n\n<p>It\u2019s always a good exercise to switch things up, in my opinion. I am constantly humbled when I attempt to write vanilla JS code and can\u2019t figure out how to iterate through an array or compare strings. It\u2019s also incredibly rewarding to struggle through something and see it work. It gives you the opportunity to work on pushing boundaries.<\/p>\n\n\n\n<p id=\"caption-attachment-8991\">By default, today\u2019s Electron library doesn\u2019t support WebVR because of the version of the Chromium browser used. When support for the WebVR API lands later this year, I hope that Electron may show some promise for JS desktop VR apps.<\/p>\n\n\n\n<p>The app that I built is something that will evolve over time. I\u2019ve got a week to polish it up and present at Full Stack Fest in Barcelona this Friday, and as the web tools for VR evolve, so will my experiments. After a number of failed past attempts, I finally got a Node environment set up so that I could wrap my A-Frame site in an Electron package \u2013 something that worked absolutely beautifully and has motivated me to go deeper into open source. 
When the WebVR spec gets pulled into Chrome\/Chromium later this year, Electron has the potential to be a really fascinating way to build desktop VR applications in JS.<\/p>\n\n\n\n<p>Pairing a library like Electron with Servo, Mozilla\u2019s experimental multi-threaded, GPU-first browser engine, would be a really interesting way to start seeing highly performant desktop JS application development. Web beacons as a delivery and discovery mechanism for immersive web experiences are going to be an entirely new way to showcase location-relevant experiences. I\u2019ll be the first to say that I have no freaking idea how it all ends up being technically feasible, but I can say that I\u2019ve never been more excited about the potential of browser-based immersive technologies and the world that is evolving around them.<\/p>\n\n\n\n<p>The future is fantastic.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This past Thursday, I had the privilege of speaking about the VR Web, a topic near and dear to my heart, at Coldfront, a front-end-focused, single-track conference in Copenhagen, Denmark. Despite fighting off the remnants of a particularly nasty cold, my talk at Coldfront was one of my favorites that I\u2019ve ever given. Why? 
The audience was great, the organizers were wonderful \u2013 but what really excited me about giving this talk was that<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"activitypub_content_warning":"","activitypub_content_visibility":"","activitypub_max_image_attachments":0,"activitypub_interaction_policy_quote":"","activitypub_status":"","footnotes":""},"categories":[3,6],"tags":[],"class_list":["post-44","post","type-post","status-publish","format-standard","hentry","category-development","category-spatial-computing"],"_links":{"self":[{"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=\/wp\/v2\/posts\/44","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=44"}],"version-history":[{"count":1,"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=\/wp\/v2\/posts\/44\/revisions"}],"predecessor-version":[{"id":289,"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=\/wp\/v2\/posts\/44\/revisions\/289"}],"wp:attachment":[{"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=44"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=44"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/liverickson.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=44"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}