27th July 2015

Quick thoughts on GPUs for VR

I see lots of speculation about what graphics card you need for good VR experiences. At this point we are at least a few months away from the availability of commercial VR systems for PCs, so the advice I can give right now is pretty easy- wait if you can! If you absolutely positively need to buy a new graphics card now, go ahead. But if you can wait until you are actually receiving your VR headset, things will be cheaper and there will be more information about upcoming cards, their relative performance in VR scenarios, and the needs of various titles.

In the end the answer won’t be straightforward because it all depends on content. Can you run VR with an HTC Vive on an older graphics card? Probably, if all you want are simpler experiences. Heck, you could almost get the required 90fps on an integrated GPU if your scene is just a simple cube (but maybe not quite). But the key thing is that you REALLY NEED to make 90fps. All the time. Techniques like async timewarp are interesting, but they have really visible artifacts when used with positional tracking, and if you have tracked controllers they will be VERY noticeable (on every timewarped frame your controllers will glitch in position). Good experiences in VR are about consistent, smooth motion that 100% matches what you expect, even more than about realistic rendering. For example, one of the demos we show is Job Simulator, where everything is a cartoon representation of objects. It doesn’t look like reality, but the Owlchemy guys did such a good job with it that it FEELS like reality.

So what am I going to do for my personal rig? First of all, wait as long as possible. Second, I’m not going to cut corners. Like most tech, VR equipment will get much less expensive in the future. But this is the first generation of real products, and it’s not going to be inexpensive for a quality setup. In the past I have never bought a GPU for much over $300, but I’m expecting to spend in the $650 range for either a 980 Ti (or equivalent) or maybe dual slightly smaller GPUs, since stereo rendering can make such good use of dual-GPU setups. I know it’s a lot, but then again 15 years ago I used to have to spend $4000 for a high-end PC, and the whole system with an amazing GPU should be less than $2000 now. Until now the difference between that $300 GPU and something better was really hard for most people to notice, but with these VR experiences it becomes worth it.

posted in Graphics, Technology, Virtualization | 0 Comments

27th July 2015

VR Growing Pains

One of the things we are very passionate about at Valve is helping the industry create a whole new set of VR experiences that people love. We want to make sure that the exposure the world gets to VR is really great and that it doesn’t make people sick/sore/unhappy.

Surprisingly, this is actually somewhat controversial. While many people get sick easily in VR experiences that move the world around them without the player actually moving, some people are (mostly) immune to it. They can play some of the FPS games adapted for VR, or just “tough it out” to try experiences. I’m the last person who is going to tell them they can’t do that to themselves, but I do get concerned when these people are excited about sharing those experiences with their friends, and I certainly advise software developers working on new VR titles to make experiences that will be enjoyed by a wide audience, not just that resistant 20%.

Another way to think about it is you don’t want the reviews of your game to read like these (actual reviews)-
“New X Title ‘Y’ Is Beautiful and Engaging and It Makes My Neck Hurt”
“With a little luck, (and dramamine for motion sickness) the VR mode will make it to the final release”
“I intentionally limited the amount of barrel rolls and hard turns as to avoid the dreaded ‘hot sweats’”
“In the demo I played stick-yaw was the method of turning, which made me a little motion sick by the demo’s end”
“X will test your skills… and potentially your stomach”

The bottom line is that the real audience isn’t going to tough it out, train themselves to avoid sim sickness, or anything like that. If your game doesn’t get this right, your audience will be extremely limited. There are already some fairly well-known techniques for doing this right, but we are all learning more- again, the key is honest testing.

posted in Technology, VR | 0 Comments

7th June 2015

Macs and VR

The funny thing is that Macs should be great for VR. On average they tend to have slightly better graphics than the typical PC because Apple controls the ecosystem and has always needed good baseline graphics performance to support the visuals in their OS.

HOWEVER, quality VR is really hard to run on middle-of-the-road GPUs. And to be clear, middle of the road in this case is not 50% of the performance of the high end (what you would see from a 970 or 980M)- it’s 10%. According to the Steam Hardware Survey, 70% of Mac gamers are running on laptops. And really, no Macs as configured by Apple- short of the $3999 Mac Pro models- have enough GPU to make 90 frames per second with an HMD for any reasonably complex content. The new Retina 5K iMac can be had with an AMD Radeon R9 M295X that should be good enough, but that is over $2500 and doesn’t even have an HDMI output, so hooking up an HMD would need an adapter that might cause some interesting issues.

By comparison it’s pretty easy to build out a PC for about $1500 that has a top-of-the-line Core i7 and a GeForce GTX 980, which should be equivalent to what we are using for demos. I should say that most of the PC OEMs don’t make this easy at the moment- good luck, for example, finding a system on the Dell site with reasonable specs without getting a very expensive Alienware. Some of the OEMs that let you customize a bunch more are a more reasonable bet for now. Hopefully as VR becomes more of a thing in the marketplace, both Apple and the PC OEMs will catch on and offer some great configurations optimized for these scenarios (which also means they will be good general gaming PCs).

posted in Apple, Technology, VR | 0 Comments

6th June 2015

VR- Getting Sick

The topic of getting sick in VR (often called “simulator sickness”) is a complex one. The simplified core of the issue is that when your visual perception doesn’t match what your body (and especially your inner ear) feels, many people feel sick. This is obviously not exclusive to VR- it can manifest in cars, airplanes, spaceflight, amusement park rides, and more. Note that whatever the mechanism for this is in the brain, it isn’t 100% tuned to all exact physical experiences. Many people get sick on a roller coaster even though their motion perfectly matches what they see- the brain just didn’t evolve to be prepared for tight accelerating turns and changes in gravity, so in those cases the mismatches between reality and the brain’s model of reality are enough to cause the problem.

Some people even get sick playing first-person video games without VR. The types of motions in a typical FPS, with lots of dashing around, quick turns, and strafing movements, can be especially difficult for people who experience these issues.

In VR I think of two main sources of mismatch between what your body feels and the photons entering your retina. Both can be equally important for VR experience designers to consider and both can be very complicated to get right. System-mismatch is the mismatch caused by the core VR technology. At some basic level a VR system is measuring your head’s current position, velocity and acceleration and rendering a frame. Ideally that frame will be positioned so that it exactly corresponds to your head position at the moment in time the photons hit your retina. The other type is content-mismatch. This is by-design movement of the camera/head controlled by the experience that doesn’t exactly match the player’s physical movement.

For the fundamental technology we are working on at Valve, we spend a lot of effort on system-mismatch. There are a number of techniques to address this: accurate tracking of head position, prediction of future head position, reducing latency in the rendering pipeline, deferring locking in head position in the rendering pipeline, adjusting rendering at the last minute, making sure rendering actually makes framerate (since there is no worse added latency than missing a frame entirely), and finally reducing latency between your GPU and the actual photons being emitted (HDMI has a fair amount of latency, which relates to why wireless displays are very difficult).
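
To make the prediction step concrete, here is a minimal sketch under simple assumptions- constant acceleration over the prediction interval, and position only (real tracking systems also predict orientation and fuse multiple sensors, so treat this as an illustration of the idea, not anyone’s actual implementation):

// Hypothetical head-position prediction, not Valve's actual tracking code.
// Extrapolate the last tracked state forward to the time the photons are
// expected to hit the retina: p' = p + v*dt + 0.5*a*dt^2.
struct Vec3
{
    public double X, Y, Z;
    public Vec3(double x, double y, double z) { X = x; Y = y; Z = z; }
    public static Vec3 operator +(Vec3 a, Vec3 b)
    {
        return new Vec3(a.X + b.X, a.Y + b.Y, a.Z + b.Z);
    }
    public static Vec3 operator *(Vec3 a, double s)
    {
        return new Vec3(a.X * s, a.Y * s, a.Z * s);
    }
}

static class HeadPrediction
{
    public static Vec3 PredictPosition(Vec3 position, Vec3 velocity,
                                       Vec3 acceleration, double secondsUntilPhotons)
    {
        double dt = secondsUntilPhotons;
        return position + velocity * dt + acceleration * (0.5 * dt * dt);
    }
}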

As I said, most of these are issues for the core VR system. As a content/experience/game developer there isn’t that much you can do to change the tracking accuracy of whatever VR hardware you are using. Your main challenge is to never miss framerate. On the HTC Vive we run at 90Hz, so overall there are about 11ms per frame. But there is plenty of overhead between various system things, the compositor, GPU scheduling, etc., so realistically you only have about 8ms to render your frame. It is critically important to design your content so that it fits within budget 100% (or at least 99.9%) of the time. Ideally game engines will help, since it’s really hard to tune content to be consistent- most environments are simple most of the time, punctuated by some really complicated stuff occasionally (when all the explosions happen or something). If your game engine can automatically adjust and reduce render target sizes, AA, texture mips, etc., that can help a ton.
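
As a rough illustration of that last idea, here is a minimal sketch of automatic quality scaling. This is not any particular engine’s API, and the thresholds and step sizes are made-up values you would tune per title- the point is just to back off render target size quickly when GPU frame times approach the budget, and creep back up when there is headroom:

using System;

// Hypothetical dynamic-quality controller; the caller is assumed to resize
// its render targets by the returned scale each frame.
class DynamicQuality
{
    const double BudgetMs = 8.0;  // of the ~11ms at 90Hz, after system overhead
    double renderScale = 1.0;     // fraction of full render target resolution

    public double Update(double lastGpuFrameMs)
    {
        if (lastGpuFrameMs > BudgetMs * 0.9)
        {
            // Getting close to missing a frame: back off quickly.
            renderScale = Math.Max(0.65, renderScale - 0.05);
        }
        else if (lastGpuFrameMs < BudgetMs * 0.7)
        {
            // Comfortable headroom: creep back up slowly.
            renderScale = Math.Min(1.0, renderScale + 0.01);
        }
        return renderScale;
    }
}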

Content-mismatch is, however, a more complicated issue. In general I’m a fan of never making your players sick. It’s true that there are some people who are less susceptible, but it’s not clear to me that they are actually comfortable after spending a bunch of time in an experience that is questionable. I’d love to do some research where we wire some of them up to heart-rate, breathing, and other monitors and have them play an experience that pushes the boundaries for a full hour. Personally I’m middle of the road- I’m far from immune to getting sick, but not nearly as bad as some. Part of what confuses the debate so far is that so few people have extended time in VR. When I’m playing an experience that moves the camera, I’m usually fine for 2 or 3 minutes (typical demo length), feel just a little “off” at 5 minutes, and it doesn’t get bad until a bit longer. But VR is still not going to be successful if 80% of people are feeling even a little uncomfortable after 5 minutes. They might not even realize they feel sick, but they certainly won’t enjoy the experience the way they would in well-designed experiences that don’t have these issues.

With all of the above as preface, it’s useful to discuss what you can actually do. Most of the demos that Valve showed at GDC and since then had almost no forced camera movement. We wanted to be especially careful to make sure people had great experiences while we are still learning what works. At this point I don’t think we can definitively say that we know what works, but there are a bunch of techniques that look promising.

In general it seems fine to have the player in a slow-moving elevator with good visual references. CloudHead Games does this well in The Gallery demo. Cockpits seem to help, so driving or flying airplanes looks like it might work when done carefully (with the caveat that many people get sick in the real thing in these scenarios). The agency of directing the vehicle (car/airplane) yourself seems to help some. I have also seen good results with teleport locomotion- I often build prototypes that let you use one of the controllers to point to where you want to be and press a button to teleport yourself there, to move over distances greater than the physical space you have available.
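
For anyone curious, those teleport prototypes boil down to something like the hypothetical sketch below- the controller ray and button callback are stand-ins rather than a real SteamVR API. On a button press it intersects the pointing ray with the floor plane and snaps the play-space origin to the hit point:

// Hypothetical teleport-locomotion sketch; assumes a flat floor at y = 0
// and that something else feeds in the controller's pointing ray.
class TeleportLocomotion
{
    public double OriginX, OriginZ; // play-space origin on the floor plane

    public void OnTeleportButtonPressed(
        double rayOriginX, double rayOriginY, double rayOriginZ,
        double rayDirX, double rayDirY, double rayDirZ)
    {
        if (rayDirY >= 0.0)
            return; // pointing at or above the horizon- no floor hit

        // Solve rayOriginY + t * rayDirY == 0 for the floor intersection.
        double t = -rayOriginY / rayDirY;

        // Snap instantly- smoothly gliding the camera to the target is
        // exactly the kind of forced motion that makes people sick.
        OriginX = rayOriginX + t * rayDirX;
        OriginZ = rayOriginZ + t * rayDirZ;
    }
}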

RoadToVR has some interesting stuff about experimenting with locomotion at the Austin VR Jam here.

posted in Technology, VR | 0 Comments

5th June 2015

VR Topics and Performance

The really exciting thing about working on VR at the moment is just how little we actually know. It really feels like the good old days of first working with a mouse- trying to figure out this brand new thing and having to learn how users interact with it. Some of my colleagues have been working on this stuff for years and have certainly learned a lot, but since real mass-audience commercial products are not on the market yet, it’s early to think anyone actually has the answers.

There are tons of controversial topics at the moment- performance requirements, issues around making people sick, input, interactions, APIs, support for multiple hardware platforms, and more. I’ll try to touch on a couple of these things, but keep in mind that while it’s a subtle distinction these comments are a bit more “a few things we have learned” rather than “what we know”. And even more importantly I’m reluctant to say “you need to do it this way” because people are still inventing great ways to make stuff work.

Having said that, I’ll touch on some performance stuff. A bunch of people at Valve spent years developing experimental hardware to try to understand the basic characteristics of what a VR system needs to really create a solid, comfortable sense of presence. Of course there are lots of caveats, and I’ll go into the details of what I think comfortable means, but for me it means you need a display that updates at 90 frames per second with really solid tracking, so that as you move your head the photons that hit your eyes really correspond to what would normally be happening as you move your head in the real world.

Hitting that 90fps can be quite a performance challenge. When I’m playing games on my normal monitor I tend to prioritize beautiful rendering over framerate. I play on my 30″ monitor and crank up the quality settings, and if the framerate ranges between 25fps and 50fps, fine. VR is a different beast- if you don’t hit 90fps consistently, your users will feel sick. As they move, the world won’t keep up with what the brain expects to see, and for most people this causes motion sickness, which really ruins the experience.

So it’s critical that you hit 90fps. There are a bunch of techniques that game engines need to adopt to accomplish this, since they traditionally have been designed to run at a fixed quality level with variable framerate. But there is also the question of what hardware you need for this, and it’s difficult to provide clear answers. Our demos at GDC all ran on a powerful gaming PC (although one that used normal parts) with an NVIDIA GTX 980 GPU. We wanted to show some high-end experiences, and preparing demos for an event like that doesn’t give you much time to optimize performance. But you could certainly run a bunch of great VR experiences on a machine with lower-end parts. I guess the difficult part is that it’s so hard to provide clear guidelines on what hardware you need, given all the variables of the different kinds of content, how much optimization will happen before release, and the need to hit that framerate with a lot of pixels consistently.
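
As a trivial example of the kind of honest measurement that helps here, this little sketch (plain .NET- the once-per-second reporting policy is just made up) times each frame against the 90fps budget and reports how often you miss:

using System;
using System.Diagnostics;

// Count frames that blow the ~11.1ms budget at 90Hz. Call FrameCompleted()
// once per presented frame from wherever your frame loop ends.
class FrameBudgetMonitor
{
    const double BudgetMs = 1000.0 / 90.0; // ~11.1ms per frame
    readonly Stopwatch clock = Stopwatch.StartNew();
    double lastMs;
    long frames, missed;

    public void FrameCompleted()
    {
        double now = clock.Elapsed.TotalMilliseconds;
        double frameMs = now - lastMs;
        lastMs = now;

        frames++;
        if (frameMs > BudgetMs)
            missed++;

        // Report roughly once per second; even 0.1% misses is worth chasing.
        if (frames % 90 == 0)
            Console.WriteLine("missed " + missed + "/" + frames + " frames");
    }
}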

posted in Gaming, Technology, VR | 0 Comments

4th June 2015

15 Years of the Web

(I wrote this back in January 2009 but didn’t hit publish at the time… at this point it’s past 20 years…)

15 years ago, in January 1994, was the first time I saw the web. Other folks can use whatever dates they want to mark the history of the web- it certainly started earlier, back in 1991, or in 1993 when NCSA Mosaic came out, or any of several other dates.

The timing of the web is more personal for me in two ways. First of all, it’s been a pretty central part of my professional life continuously since 1994. With a couple of small exceptions, just about every project I’ve been involved in for the past 15 years has involved the web in some way. But beyond that, the web has fundamentally changed day-to-day life in a way that few other technologies have.

There are many technologies that I just didn’t “get” for a long time (IM and blogs, to pick on two of the more embarrassing examples), but the web is something that seemed important instantly. Our experiences with the web are dramatically different than they were back in 1994, but the basics- HTML, HTTP, server-side applications, browsers, client-side UI projected from the server- haven’t fundamentally changed since then. What was obvious about this set of technologies was that this was an amazingly malleable platform: one consistent set of parts that could serve as the basis for all of these different kinds of things. Sometimes it can be frustrating that these core pieces are so flexible- it means that we have continued to evolve them, sometimes painfully, rather than being able to jump to a new “better” way to do things. But in the end, these same pieces have been able to pull it all off, and their global reach sets a really high bar for any alternative to surpass.

Looking back on the past 15 years, I have had a sense of amazement at some of the progress. I remember just being blown away the first time I saw URLs on the side of buses and in TV ads. There have been tons of similar “wow, it’s really happening” moments along the way, but the funny thing is that most of the things people have done with the web haven’t been surprising at all- collaborative workplaces, online communities, entertainment, games- they were all pretty obvious from the very beginning (at a high level, anyway- not all the smart things that people figured out to make them real).

posted in Technology | 0 Comments

4th June 2015

What I’m up to lately

It’s been a while since I’ve done anything on this blog. I’m not entirely certain that people read blogs anymore here in mid-2015. But I’ve been doing some cool stuff lately, and I thought I’d push myself to take some time to write about it periodically.

I’ve been at Valve for 15 months now, and it’s been a pretty exciting place to work. It’s the sort of place where you can pretty easily move around between projects to take care of the stuff that is important, with the nice side effect that most of the time it stays pretty fresh. I’ve done a bunch of stuff so far, but the two main things are the Steam Inventory Service and VR (SteamVR, OpenVR, the HTC Vive).

Most of the fun stuff to share involves VR, but today I’ll just mention a couple of things about the Inventory Service. The PC games industry has been more vibrant than ever the past few years, and we have been looking at ways to help developers deliver even better experiences- hopefully focusing mostly on the gameplay and less on the extra stuff outside the game itself. Dealing with servers, databases, scalability, and all those fun things can be challenging for all but the biggest game developers. With the Inventory Service, a game developer can add persistent items to a game without having to run any servers themselves. Further, if their game is successful, we are already set up to handle massive scale on this service, and it’s especially efficient for us to run it across the multiple games using it on Steam.

Diving into game development has involved learning about some new types of problems. For example, it’s a pretty good assumption that any game will have its client code hacked at some point. So if you design something like the Inventory Service where the client grants items (“you just killed an Orc so here is a magic ring”), people will be able to cheat. Some games might not really care about this, but for many, much of the value in the items is their rarity. Players feel really special having gotten that extra-rare item, and if everyone can get it, normal players will be unhappy.

So to combat this, there are a couple of modes in which you can use the Inventory Service. If you are running your own trusted servers, they can talk to our service to drop specific items. But in the case where you aren’t running servers at all, the main safe way to drop items is via timed drops. Steam measures how much time you spend playing the game, and after a specific amount of time you become eligible for a random item. If the game wants some semantics associated with that, there can be different buckets of random items, but the key is that nothing the client can do can force a specific item.
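
To make that concrete, here is a hypothetical server-side sketch of the timed-drop idea. This is not the actual Steam Inventory Service API- just an illustration of why a hacked client can’t force a rare drop: the playtime accounting and the random pick both happen on trusted servers, and the client never gets a say.

using System;
using System.Collections.Generic;

// Hypothetical timed-drop logic, run server-side only. The interval and
// bucket are illustrative; a real service would persist this state.
class TimedDrops
{
    static readonly TimeSpan DropInterval = TimeSpan.FromMinutes(40); // made-up value
    readonly Dictionary<ulong, TimeSpan> playtimeSinceDrop = new Dictionary<ulong, TimeSpan>();
    readonly Random rng = new Random();

    // Called periodically by trusted code with measured playtime.
    // Returns the granted item, or null if no drop is due yet.
    public string AccruePlaytime(ulong steamId, TimeSpan played, List<string> dropBucket)
    {
        TimeSpan total;
        playtimeSinceDrop.TryGetValue(steamId, out total);
        total += played;

        if (total >= DropInterval)
        {
            playtimeSinceDrop[steamId] = total - DropInterval;
            // Server-side randomness: nothing the client sends influences this.
            return dropBucket[rng.Next(dropBucket.Count)];
        }

        playtimeSinceDrop[steamId] = total;
        return null;
    }
}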

posted in Developers, Games, Gaming, Technology | 1 Comment

14th January 2013

Sous Vide Buffalo Wings

Last Sunday I wanted to make some great wings for a marathon football day. I’ve been a big fan of wings for a long time, but it’s one of those foods that can be done so poorly the result is horrifying. The ideal wing for me needs to be really crispy on the outside so that it stays crispy when drenched in nuclear sauce, but still have nice juicy (but not undercooked) meat. I can also be a bit of a purist when it comes to the hot sauce, although I have to say the Pok Pok wings are amazing and live up to the hype (but have no relationship to Buffalo wings).

So, I decided to try to sous vide the wings. It’s one of those perfect situations for sous vide: you want to cook the wing in oil that’s as hot as possible so it gets really crispy on the outside, but you don’t want to overcook the inside or leave it in the oil so long it gets greasy. Many traditional recipes accomplish this by double-frying, but sous vide ends up being much easier and probably healthier.

I basically followed this recipe from The Zen Kitchen. It was pretty straightforward, but I did want to add a couple of notes. First of all, I asked the butcher to cut the wings in half but forgot to get the “tips” cut off too, and didn’t remember until after I did the sous vide step. The tips were still pretty easy to cut then, but they came off with some of the meat since the already-cooked wings were quite tender.

Next time I’d also be more careful to get the wings into a single layer in the vacuum bag, and after I took them out I’d probably open the bag and drain them before refrigerating. Because I left them in the bags and refrigerated them overnight, I had a ton of chicken fat and gelatin to scrape off before frying. The volume of the stuff was amazing- certainly part of what probably makes this approach somewhat healthier, given that 4 hours at 170F is going to render a ton of the fat. It would probably make a great base for a soup, but we didn’t have anything we wanted to do with it.

The frying went off without a hitch, and then we tossed the wings in store-bought Frank’s RedHot sauce. The resulting wings were so crispy they could be drenched in the sauce and still stay crispy. As you bit in, they were still amazingly juicy too. Our friends had some nuclear hot sauce you could apply for the extra-hot ones, and the result was just about perfect!

Again, this is a perfect example of one of the misconceptions about sous vide. If you have decent equipment (it doesn’t need to be thousands of dollars- a nice home vacuum sealer and temperature regulator will do), sous vide can actually be quite a bit easier than other approaches and can have amazing results. The only extra complexity is that you often need to plan ahead a bit (I had to sous vide the night before), but if you can manage that, it’s really not at all difficult.

posted in Cooking, Food | 1 Comment

27th November 2012

Remote Light Sensor with XBee

Ironically my first project hasn’t used a Netduino yet. I’ve started playing with using XBee units to transmit a remote value of a light sensor, and it turns out the XBee units are based on their own microcontroller that has a reasonable amount of built-in functionality so you sometimes don’t need an additional fully programmable controller. Even more wacky, you configure the things with a variation of the good old Hayes AT command set, something that I’m more than familiar with from my modem days, but that frankly I was hoping to never see again.

So this project has two parts- a remote unit that just uses an XBee transmitter to read the value from a photoresistor, and a second XBee unit that receives the data and interfaces with a PC via USB. For this first attempt I’m using two XBee Series 1 units, two Uartsbee interface units, and a breadboard.

The Uartsbee units are pretty convenient for getting going at first with the USB interface, but their design has one big flaw. The XBee units themselves aren’t compatible with breadboards since their pins have 2mm spacing instead of the 0.1″/2.54mm spacing that breadboards use. The Uartsbee, however, has its two rows of pins so far apart that if you put it in a breadboard it fills the entire width of the breadboard, giving you no room to attach anything to them.

My project was modeled after one in chapter 4 of Building Wireless Sensor Networks: with ZigBee, XBee, Arduino, and Processing. For this project I needed to wire the photoresistor to the analog input, with connections to ground and +3.3v. However, the necessary pins were on opposite sides of the board and I couldn’t plug them into my breadboard. My hack was to configure IO pins 1 & 2 as digital outputs and set one low and one high. That gave me the ground and +3.3v signals I needed. I just used the USB connector for power, and that way didn’t have to attach anything to the other side of the board. I did, however, have to hook up one additional pin that the book doesn’t mention- it doesn’t show wiring to vREF, and I couldn’t get anything to work until I saw a reference to that in the XBee documentation and attached +3.3v to vREF (pin 14).

Next step was to configure the XBee units. I hooked them both up to USB and ran the XBee X-CTU utility twice. It showed me COM3 and COM4 and I used the terminal mode to hook up to each unit on a distinct port. The only configuration that was really required was on the “remote” unit- I hit it with-
ATD02 (analog input)
ATD15 (digital output high)
ATD24 (digital output low)
ATIR 3e8 (sample rate every 1000ms)
ATWR (write configuration)

After the ATIR command data started showing up on the “PC” side XBee- success!

The last step is to write a program to read the data on the PC side. I was really happy to see that the drivers just make the output from the XBee show up as a serial port, so I could use System.IO.Ports.SerialPort to read it. The documentation on the packet formats was a bit confusing, so I spent a little time decoding them- I still get packets that are a different size than I expected, but I got the whole thing working. The code is below.

using System;
using System.IO.Ports;

namespace XBeeBase
{
    class Program
    {
        static void Main(string[] args)
        {
            SerialPort port = new SerialPort("COM3");
            port.BaudRate = 9600;
            port.Parity = Parity.None;
            port.DataBits = 8;
            port.StopBits = StopBits.One;
            port.Handshake = Handshake.None;

            port.Open();

            try
            {
                while (true)
                {
                    byte[] buffer = ReadPacket(port);

                    switch (buffer[0])
                    {
                        case 0x83:
                            // 0x83 is the 16-bit-address I/O sample frame.
                            // buffer[1-2] hold the source address; the fields
                            // we care about follow it.
                            int sigStr = buffer[3];   // RSSI
                            int options = buffer[4];
                            int numSamples = buffer[5];
                            int chanInd = buffer[6] * 256 + buffer[7]; // active channel bitmask
                            // The analog sample is the last 16-bit value before
                            // the trailing checksum byte.
                            int value = buffer[buffer.Length - 3] * 256 + buffer[buffer.Length - 2];
                            if (numSamples == 1 && buffer.Length == 13)
                            {
                                System.Console.WriteLine("Read str= " + sigStr + " opt=" + options + " len= " + (buffer.Length-1) + " num= " + numSamples + " chan=" + chanInd + " value=" + value);
                            }

                            break;
                        default:
                            System.Console.WriteLine("Invalid input " + buffer[0]);
                            break;
                    }
                }
            }
            finally
            {
                port.Close();
            }
        }

        static byte[] ReadPacket(SerialPort port)
        {
            while (true)
            {
                // Scan for the 0x7e start delimiter that begins every API frame.
                int b = port.ReadByte();
                if (b != 0x7e)
                    continue;

                // Two-byte big-endian length of the frame data.
                int sizeH = port.ReadByte();
                int sizeL = port.ReadByte();
                int size = sizeH * 256 + sizeL;

                // Read the frame data plus the trailing checksum byte.
                byte[] buffer = new byte[size + 1];
                int pos = 0;
                while (pos < size + 1)
                {
                    pos += port.Read(buffer, pos, size + 1 - pos);
                }

                // Ignoring the checksum. Ideally we would validate it here.
                return buffer;
            }
        }
    }
}

References-
Building Wireless Sensor Networks: with ZigBee, XBee, Arduino, and Processing
XBee documentation from Digi
XBee Adapter Kit - I didn't use this one yet, but it looks better than the one I did use for the breadboard unit.
XBee Series 2 - I used Series 1 for this project, but the Series 2 units are more flexible.

posted in Hardware, Microcontrollers, Networking, Technology | 0 Comments

7th November 2012

Posts about Netduino

I bought a Netduino a couple of weeks ago and am planning on posting about my experimentation with it (and other related stuff). I’ve played with electronics since I was in high school, but it’s amazing how recent breakthroughs have made this stuff accessible. Back then I would pore over the Apple ][ schematic trying to learn how that stuff worked, and hook up simple circuits with TTL gates. In college I built a simple 6502 computer by wiring up the CPU and some SRAM, with manual switches to input stuff. I made a very simple boot loader, but it was such a pain to get going that I never made it do anything interesting.

Now these new platforms are amazing. You can buy the Netduino, an inexpensive 32-bit ARM microcontroller running the .NET Micro Framework, fire up Visual Studio, plug in the controller with a USB cable, hit F5, and it’s doing stuff. It has a bunch of I/O breakouts, so you can build cool circuits from there. So far I’ve made some simple light-flashing things and an RGB LED controller that cycles the colors of an RGB LED and changes the cycle speed based on the setting of a pot. The next step is experimenting with some XBee controllers to do some simple mesh networking. Part of the cool thing about those is that it seems like simple functions can be wired directly to the XBee controller without needing a distinct microcontroller in each location.
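
To give a flavor of how little code “doing stuff” takes, here is roughly the classic first Netduino program- blinking the onboard LED, with the delay read from a pot on analog pin 0. The SecretLabs pin names are what the Netduino SDK ships with, but treat the exact constants and the 0-1023 read range as assumptions to check against your board.

using System.Threading;
using Microsoft.SPOT.Hardware;
using SecretLabs.NETMF.Hardware;
using SecretLabs.NETMF.Hardware.Netduino;

public class BlinkWithPot
{
    public static void Main()
    {
        OutputPort led = new OutputPort(Pins.ONBOARD_LED, false);
        AnalogInput pot = new AnalogInput(Pins.GPIO_PIN_A0);

        while (true)
        {
            // Read() returns roughly 0-1023 by default; use it directly
            // as extra milliseconds of delay on top of a 10ms floor.
            int delayMs = 10 + pot.Read();

            led.Write(true);
            Thread.Sleep(delayMs);
            led.Write(false);
            Thread.Sleep(delayMs);
        }
    }
}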

posted in Hardware, Microcontrollers, Technology | 0 Comments