Guy’s GDC Roundup

Today I’m sitting in Santa Cruz, having spent an action-packed week in downtown San Francisco at Game Developers Conference 2014. GDC is an event I’ve wanted to get along to for many years, and I have followed it closely (albeit vicariously) through Twitter and the blogosphere for the past few years. I really wasn’t prepared for what a massive event it is: in every time-slot there were at least three sessions I really wanted to see, and usually a couple more I’d have been interested in. Luckily, my excellent friend and boss Danu got me an “All Access” pass, which includes 12 months’ access to the “GDC Vault”, which has recordings of [pretty much] all the sessions. I’ll summarise a few of the main things I took away from the conference.

High-Quality Mobile Graphics

I went to several talks about high-quality graphics, from Imagination Technologies (ImgTec), ARM, Intel, Unity and Epic. They all had different names for what I’m simply calling high-quality graphics; some talked about “console quality”, some about “AAA quality”, some about “PC quality”, but the long and the short of it is that mobile SoCs now have enough compute, and enough memory bandwidth, to do graphics at least as good as the last generation of consoles (Xbox 360 and PS3), if not better. I learned that you have to be a lot more cunning than on PC, though, because mobile GPU implementations differ quite wildly, so you have to be prepared to have multiple render paths in your engine, and you have to think carefully about what your lowest common denominator is.

Things to watch: ARM announced some GLES 3 extensions which both they and ImgTec support, and which should make deferred shading a great deal easier on mobile: one is the ability to sample the backbuffer in a shader, and the other is called “Pixel Local Storage”, which allows you to store extra data alongside the colour/depth/stencil data in the backbuffer. Essentially it is global data, in the same block of memory as the backbuffer, that persists from draw call to draw call; you could thus use the backbuffer as your G-Buffer for deferred shading. In addition to this, ImgTec announced that they are releasing a new GPU based on their Series6 design which has ray-tracing hardware in it. The idea would be to use rays for things like shadows and environment maps. It was quite convincing, but if only they have it, it goes against the whole “lowest common denominator” idea. I really hope other vendors pick this up.
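
Since support for these extensions will vary from GPU to GPU for a while, the render-path decision I mentioned above has to be made at runtime. Here is a minimal sketch of what that check might look like (plain C++ against a GLES 3.0 context; the extension string is real, but the function and enum names are my own):

#include <GLES3/gl3.h>
#include <cstring>

// Returns true if the current context exposes the named extension.
bool has_extension(const char* name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(
            glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i)));
        if (ext && std::strcmp(ext, name) == 0)
            return true;
    }
    return false;
}

enum RenderPath { ForwardPath, DeferredPixelLocalStoragePath };

RenderPath choose_render_path()
{
    return has_extension("GL_EXT_shader_pixel_local_storage")
         ? DeferredPixelLocalStoragePath
         : ForwardPath;  // fall back to the lowest common denominator
}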

Remember that the GPU is Asynchronous

I attended a talk on how to find, fix and avoid “GPU sync points”, which are places where you make your application code wait for the GPU to do something, causing an enormous pipeline bubble. I also attended a talk on how the Nitrous engine from Oxide Games works with AMD’s Mantle, as well as two talks on DirectX 12. Both Mantle and DirectX 12 bring the application a giant step closer to the GPU, throwing away things like state-tracking in the driver and the multi-layer App/Client/Server/Driver architectures that we have today. The biggest takeaway from all of these talks was that the GPU is quite far away from the CPU (not quite so far for Intel, but miles away for a discrete GPU), and as such you should treat it as an asynchronous resource. Queue work up, and let it go.
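
To make “queue work up, and let it go” concrete, here is a minimal sketch in plain C++, with std::async standing in for the GPU (no real graphics API here; all the names are mine):

#include <future>

// Pretend this hands a query to the GPU and returns a fence on its result.
std::future<int> submit_query(int frame)
{
    return std::async(std::launch::async, [frame] { return frame * 2; });
}

// The anti-pattern: demand the result immediately, stalling the CPU while
// the GPU drains its whole pipeline.
void frame_with_sync_point(int frame)
{
    int result = submit_query(frame).get();  // enormous pipeline bubble here
    (void)result;
}

// Better: keep a few frames in flight, and only read back results that are
// old enough to almost certainly be finished.
const int kFramesInFlight = 3;
std::future<int> g_inFlight[kFramesInFlight];

void frame_async(int frame)
{
    int slot = frame % kFramesInFlight;
    if (g_inFlight[slot].valid()) {
        int old = g_inFlight[slot].get();  // the result from 3 frames ago
        (void)old;
    }
    g_inFlight[slot] = submit_query(frame);  // queue it up, let it go
}

int main()
{
    for (int frame = 0; frame < 10; ++frame)
        frame_async(frame);
    return 0;
}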

Things to watch: Both Mantle and DirectX 12 allow the application to create resources and queue work on the GPU from whatever thread it wants. I had some conversations (with some people who will remain nameless) that led me to believe OpenGL (at least OpenGL ES) will head in this direction in the near future. If you are building engines, you’d better be building them to be task-based. If you’re using other people’s engines, you’d better demand that they are built task-based, and that they scale well with multiple cores. Dan Baker from Oxide Games mentioned in two separate talks that their task-based architecture, when unleashed with Mantle or DX12, took them from being CPU bound in DX11 to [sometimes] being GPU bound. The powers-that-be are about to give us the ability to actually use our GPUs; make sure you’re ready to do so.
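
The essence of “task-based” is that command recording must not funnel through a single render thread. A tiny sketch of the shape of it (plain C++; the Command type is a stand-in for a real command list):

#include <thread>
#include <vector>

struct Command { int drawCallId; };
typedef std::vector<Command> CommandList;

int main()
{
    const int kWorkers = 4;
    std::vector<CommandList> lists(kWorkers);
    std::vector<std::thread> workers;

    // Each worker records its own command list, in parallel.
    for (int w = 0; w < kWorkers; ++w) {
        workers.emplace_back([w, &lists] {
            for (int i = 0; i < 100; ++i)
                lists[w].push_back(Command{ w * 100 + i });
        });
    }
    for (auto& t : workers)
        t.join();

    // Under Mantle or DX12 each list could also be submitted from its own
    // thread; here we just pretend to submit them in order.
    for (const auto& list : lists)
        (void)list;  // queue to the device

    return 0;
}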

We’ve got to fix this monetization thing

For all this tech talk, I felt like there was a bit of a looming shadow at the conference: despite the ease with which you can distribute a good game, it is really hard to get your good game noticed, because the market is awash with games that have been ruined by designers building the whole thing around fleecing the customers. Something like 60% of the show floor was advertising and payment-processing providers. There was a talk on “how to monetize teens”. I attended the rant session, where there was a slide with a countdown, and an “IAP” that could get us to the next slide; eventually two guys got up and paid some money to the speaker, and he asked the crowd to “give the Whales a round of applause” – all of this, obviously, to make a point. Later in that session, we were reminded that the terms “Minnow” and “Whale” come from the slot-machine industry, and we were challenged about the morality of viewing our audience the same way. Then I found out that Firaxis/2K/Take-Two brought XCOM: Enemy Unknown to mobile for $25 (NZD), and that it is selling like wildfire and driving console and PC sales! I reminisced with others about the good old days of Doom, and Wolfenstein, and Commander Keen, and other games which we all got the first few levels of as shareware, and then bought the full version of later. By the end of the week I came to this conclusion: we (the industry) need to stop making slot machines for children, and start making fun games that people want to play, and want to play again, and then work out how to avoid giving it all away for free. Concentrate on making awesome games; stop working out how to fleece your players.

Things to watch: The rant session was held in a massive, packed room; I can only hope a bunch of the people there took on board the idea of making good games, rather than designing mechanics to fleece their customers.

Conclusion

GDC was awesome. Mobile tech has got awesome. We need to make awesome games.

My Ultimate Cygwin Setup

I sat down today to do some programming, and I got a little bit distracted improving my environment, but I think that where I’ve got to is quite good, so I’ll share it with you. So, what are my requirements?

  • a unix-like console with
  • an easy mechanism for installing packages
  • git, python, and other useful things
  • summary information about my git repos, at the command prompt
  • auto-completion for git
  • shortcuts to frequently used folders, and some other conveniences

A unix-like console on Windows

Have you heard of Cygwin? Cygwin is great. Cygwin is unix on Windows. Sometimes I think Microsoft should just buy Cygwin, and make it a first-class citizen of Windows. Installing Cygwin is easy, but the base install doesn’t come with many packages; you can re-run the installer to add more, but if you’re like me and you don’t save your downloads to disk, it can be a bit of a pain. Get the installer here. Don’t finish the installation just yet because…

An easy mechanism for installing packages

You’ll want to install a couple of extra packages that apt-cyg needs in order to run. I learned how to install it here. Install Cygwin with the following extra packages, to make sure apt-cyg will work:

  • wget
  • tar
  • bzip2
  • subversion
  • vim

Once Cygwin is up and running do the following:

$ svn --force export http://apt-cyg.googlecode.com/svn/trunk/ /bin/
$ chmod +x /bin/apt-cyg

If you’re running the x86_64 version of Cygwin (which I recommend if you’re on 64-bit Windows), then you’ll also want to open up /bin/apt-cyg in a text editor:

$ vim /bin/apt-cyg

and change the following two lines:

line 98:  change  wget -N $mirror/setup.bz2  to  wget -N $mirror/x86_64/setup.bz2
line 105: change  wget -N $mirror/setup.ini  to  wget -N $mirror/x86_64/setup.ini

Now you’re ready to install some more useful tools!

Git, Python and other useful things

  • bash-completion – auto-completion for bash
  • ca-certificates – allows your Cygwin environment to validate SSL certificates
  • curl – a command-line tool for fetching URLs; similar to wget, but more powerful
  • git – a distributed version control tool
  • git-svn – lets git play nice with SVN
  • python – an interpreted programming language; I like to write shell scripts with it
  • python-setuptools – commonly used for distributing Python packages
  • rsync – a handy tool for synchronizing files from one place to another (especially over the internet)

You should be able to install all of these with apt-cyg; it’ll handle all the dependencies:

$ apt-cyg install bash-completion
$ apt-cyg install ca-certificates
$ apt-cyg install curl
$ apt-cyg install git
$ apt-cyg install git-svn
$ apt-cyg install python
$ apt-cyg install python-setuptools
$ apt-cyg install rsync

[Update]
I forgot to mention ncurses, a library for writing text-based interfaces. I use it to create a cls alias (see my update further down).

$ apt-cyg install ncurses

Summary information about my git repos, at the command prompt

I have been using posh-git inside Windows PowerShell. The thing I like about posh-git is that it gives you an overview of your repository status at your command prompt, like so:

D:\Users\Guy\Programming\git\bitlyWin8 [master +5 ~9 -0 | +0 ~0 -6]>

This tells you which branch you are on, and gives you a summary of both tracked and untracked changes. However, it can make the prompt quite slow in large repositories, because the script that generates it takes quite a while to run. I still wanted something like it in Cygwin. What I have found is less detailed than posh-git, but it is nice and quick, and it gives me a simple indicator of what I need to do with my repository. There is a script called git-prompt.sh which can do some nice things with your prompt, so let’s go ahead and get that. We’ll be adding a few shell scripts that get run when you open a Cygwin terminal window, and I like to keep these in a bin folder inside my home folder. We’ll download the shell scripts to there:

$ cd ~
$ mkdir bin
$ cd bin
$ wget https://raw.github.com/git/git/master/contrib/completion/git-prompt.sh

Now edit your .bash_profile to add git command prompting to your bash sessions. The top of the git-prompt.sh file has a good explanation of the options, which are controlled by environment variables.

source ~/bin/git-prompt.sh

# Show if there are unstaged (*) and/or staged (+) changes
export GIT_PS1_SHOWDIRTYSTATE=1

# Show if there is anything stashed ($)
export GIT_PS1_SHOWSTASHSTATE=1

# Show if there are untracked files (%)
export GIT_PS1_SHOWUNTRACKEDFILES=1

# Show how we're tracking relative to upstream
export GIT_PS1_SHOWUPSTREAM="verbose"

# Save the old command prompt, and set the new one to
# <number of commands issued> user:directory (branch <status symbols>)$
# eg: 167 Guy:Arduino (master u=)
PS1_OLD=${PS1}
export PS1='\[\033[1;34m\]\!\[\033[0m\] \[\033[1;35m\]\u\[\033[0m\]:\[\033[1;35m\]\W\[\033[0m\] \[\033[1;92m\]$(__git_ps1 "(%s)")\[\033[0m\]\$ '

Auto completion for git

We already installed bash-completion, and now all we need to do is add a script that supplies completion functions for git. To download the script:

$ cd ~/bin
$ curl https://github.com/git/git/raw/master/contrib/completion/git-completion.bash -OL

And then add it to your .bash_profile:

source ~/bin/git-completion.bash

Shortcuts to frequently used folders, and some other conveniences

Cygwin maps your Windows filesystem to /cygdrive/<drive letter>/, but this can be a bit tedious to get to, so I’ve created some shortcuts. You might like to break these out into a separate file if you end up with heaps of them, but I’ve only got a few for now. Open up your .bash_profile and add the following:

alias cdp='cd /cygdrive/d/Users/Guy/Programming'
alias cda='cd /cygdrive/d/Users/Guy/Documents/Arduino'
alias cdc='cd /cygdrive/c'
alias cdd='cd /cygdrive/d'

[Update]
Cygwin doesn’t have cls out of the box, but with ncurses installed you can use the command tput clear to do the same thing. I aliased it to cls: (thanks mikyra)

alias cls='tput clear'

[Update 2]
There’s a terminal command open in OS X that is quite nice; it essentially does a shell execute on whatever you pass it. Cygwin has something similar called cygstart, but that’s not a nice name. (thanks erichui)

alias open='cygstart'

I’ve also added an alias to reload my bash profile (so that later on I can edit it, and then see my changes easily).

alias reload='source ~/.bash_profile'

Finally, I like to add that bin folder I created to the path:

export PATH="${HOME}/bin:${PATH}"

Acknowledgements

  1. Taylor McGann’s blog was useful in showing me how to do a bunch of the git prompt, git completion, and fancy bash profile stuff.
  2. This StackOverflow post on apt-cyg.
  3. This StackOverflow post on clearing the screen.
  4. This StackOverflow post on cygstart.

ASP.NET MVC4 Binding to a List of Complex Types

UPDATE: I have an idea how they might be doing it: Expression Trees!

One of my colleagues and I ran into a little trouble with model binding in ASP.NET MVC 4 yesterday. We wanted a form to post back an IEnumerable of a complex type – call it IEnumerable<T>. Doing this naively didn’t work. After a bit of googling and experimentation, we found out how to get what we wanted, but the answer was a bit quirky: you must use an array/collection index operator inside your Html.Whatever() lambda, otherwise the Html helper doesn’t know that it should put an array index at the front of the field names in your form. If it doesn’t put this index (in the form of “[0].”, “[1].”) there, then the MVC model binder can’t work out how to group your fields into instances of a complex type.


Porting Open Asset Import Library (Assimp) to WinRT (4)

So, I changed my mind about the whole zlib vs System.IO.Compression question, partly because the latter is not already a Windows Runtime accessible type (afaik), and partly because the code would not be at all portable, and I’d like to be able to contribute my work back to the community if I can. I’ve run into a couple of other problems though: one to do with Microsoft’s war on buffer overrun exploits (fair enough, Windows XP was a real problem that way), and the other to do with the security constraints that come with Windows Store apps.

Buffer Overrun / Overflow

First, the whole buffer overrun thing: it seems that Microsoft have deprecated large portions of the C Runtime on Windows (fopen, strcpy, pretty much anything that takes a potentially unsanitary char* as an argument and writes the data contained therein to another unsanitary char* argument) in favour of the ‘safe’ variants of these functions (which only exist on Windows). The gist is that you have to tell each function how long the destination buffer is, so that it cannot overrun it. I can understand why they did this, after the whole security shake-up they had during the Vista development cycle (rumoured to be the reason it took so long). Incidentally, we might see Apple making similar moves after this little hiring.
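
For example (a trivial sketch of my own; strcpy_s is the Microsoft-specific ‘safe’ variant, strcpy the standard one):

#include <string.h>

int main()
{
    const char* src = "hello";
    char dst[64];
#ifdef _MSC_VER
    strcpy_s(dst, sizeof(dst), src);  /* must be told the destination size */
#else
    strcpy(dst, src);                 /* classic C: trusts the caller entirely */
#endif
    return 0;
}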

Apparently warnings about these functions have existed since Visual Studio 2005, but with Windows Store apps the warnings have become errors. You can turn them off by defining _CRT_SECURE_NO_WARNINGS, but I’m not sure I want to. On the one hand, leaving them on concretely closes a potential attack vector in any code I write. On the other hand, Microsoft’s series of _s functions (fopen_s, strcpy_s, wcstombs_s, etc) are not part of standard C, and are thus not portable either. Some people have suggested that if one is re-writing C code to use these functions, one should just re-write it in C++ and use the iostream classes instead, which is great, except most OSS newbies like me don’t have the clout to get a C++ re-write of zlib accepted. So, I’m left with two options: write code that Microsoft have deemed they don’t want running on their OS, or write ugly, macro-heavy wrappers so that the code uses fopen_s on Windows, and fopen elsewhere. I’m going to try the second option for now, and see how far it gets me. Don’t be surprised if I blog in the future about how the Assimp core team won’t let me commit my code.
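
To give you an idea, the kind of ugly, macro-heavy wrapper I have in mind looks something like this (a sketch only – the ai_fopen name is my own invention, not anything from the Assimp codebase):

#include <stdio.h>

#ifdef _MSC_VER
/* fopen_s returns an error code and writes the handle through a pointer */
#define ai_fopen(fp, name, mode) fopen_s(&(fp), (name), (mode))
#else
/* everywhere else, plain old fopen; mimic fopen_s's 0-on-success return */
#define ai_fopen(fp, name, mode) (((fp) = fopen((name), (mode))) ? 0 : 1)
#endif

int main()
{
    FILE* f = NULL;
    if (ai_fopen(f, "model.obj", "rb") == 0 && f) {
        /* ... parse the file ... */
        fclose(f);
    }
    return 0;
}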

Windows Store App Security Model

The next issue that I’ve encountered is more interesting, and it arises out of the unique space that Windows RT fills in the market: more than a tablet, but a little less than a full-blown Windows PC. If we look at the iPad/iPhone, and Windows Phones as well (I’m not even going to discuss Android, it’s just wrong to put Linux on a phone), each app is a completely walled garden, except for a few APIs which give access to things like Pictures, Emails, and Contacts – your personal data that apps can add value to. The thing about accessing this data is, as I mentioned before, that it is done through an API; you don’t just go poking around the file system opening files as you please. Furthermore, when these apps do try to access the file system using direct, fairly low-level calls like fopen, they can only see within their own little sandbox.

This is fine on a device whose paradigm is that apps only work on their own data, but Windows is a bit different. Windows is traditionally a desktop OS: I have a Documents Library, and a Pictures Library, and a Downloads Library, and my files are a thing in themselves which transcend my current set of apps. So, you can access the file system, but it has to be through an API (StorageFile) if it isn’t within your sandbox, and you can’t even ask for files outside your sandbox unless the paths have come from a FilePicker. I think this model is sound – an app can’t touch anything it didn’t create without your very explicit permission.

My goal, however, is to write a Windows Store app version of Assimp Viewer, so I need to use fopen on files outside my sandbox, which was a bit of a conundrum. Luckily, I’m not blazing a trail here: I found this post, which details a neat strategy that lets you use the StorageFile API to bring the file into (and, I presume, out of) your sandbox, so that you can use your low-level C code on it. This shouldn’t be too hard for me to get into the app, because Assimp Viewer is already a Windows-only application.
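
The shape of that strategy, as I understand it, is roughly the following (a C++/CX sketch written from memory, not the exact code from that post; error handling and the user-cancelled-the-picker case are omitted):

#include <ppltasks.h>

using namespace concurrency;
using namespace Windows::Storage;
using namespace Windows::Storage::Pickers;

void OpenModelFile()
{
    auto picker = ref new FileOpenPicker();
    picker->FileTypeFilter->Append(L".obj");

    create_task(picker->PickSingleFileAsync())
    .then([](StorageFile^ file)
    {
        // Copy the user's file into our sandbox (the temp folder), where
        // plain fopen is allowed to see it.
        return create_task(file->CopyAsync(
            ApplicationData::Current->TemporaryFolder,
            file->Name, NameCollisionOption::ReplaceExisting));
    })
    .then([](StorageFile^ localCopy)
    {
        // localCopy->Path is now inside the sandbox; hand it to the
        // existing low-level C code.
    });
}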

So, my current status is that I’m testing out my build of zlib for a Windows Store App, and it seems to work thus far. I think building the assimp library itself might not be so complicated, because, today’s points aside, any ANSI C code should “just work” when compiled as a Windows Store App static (or dynamic) library.

Porting Open Asset Import Library (Assimp) to WinRT (3)

So I’ve been doing a bit of digging about in the Assimp source, and it seems that, for the core library at least, the only dependency is zlib – which it builds as part of your solution on Windows. My next possible step, then, could be to move the code from this zlib project over to a Windows Runtime Component project. However, I’m a little reluctant to do this, since the System.IO.Compression namespace can do all the compression stuff; to an extent it depends on whether it is easier to port zlib, or to modify Assimp so that it doesn’t use zlib. My hope is that I can keep my changes to Assimp isolated to a fairly low level in the codebase, so that it is reasonably easy to keep in harmony with their continuing efforts, and porting zlib seems more likely to achieve that goal.

Digging further, I have found that at present, the zlib functionality is only called by the ‘unzip’ library that also comes with Assimp, which itself is currently only used by the Q3BSPZipArchive class. So, I think I might put in an alternate code-path in either the unzip code, or perhaps even the Q3BSPZipArchive to use the Windows Runtime code. I suspect that modifying unzip.c would be the best approach, because then anyone who writes a custom importer using the Q3BSP one as a template will be able to use my functionality.

It’s settled then, I’ll make an alternate unzip.c (or pre-compiler path therein) which uses the native Windows Runtime API to handle the unzipping. Not tonight though.
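
Something like this, perhaps – a sketch of the shape only, where ASSIMP_WINRT is a hypothetical build flag and winrt_fopen a hypothetical helper that would route through the Windows Runtime file API:

#include <stdio.h>

/* Hypothetical helper, implemented elsewhere against the WinRT API. */
FILE* winrt_fopen(const char* path, const char* mode);

static FILE* unz_open_file(const char* path)
{
#if defined(ASSIMP_WINRT)
    return winrt_fopen(path, "rb");   /* the alternate code-path */
#else
    return fopen(path, "rb");         /* everyone else carries on as before */
#endif
}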


Porting Open Asset Import Library (Assimp) to WinRT (2)

Last time I got to the point of finding that I can’t build the code on Windows 8, because DirectX 11.1 has deprecated a bunch of DirectX 9 functionality, so I installed the June 2010 DirectX SDK to get up and running. My plan is to get it building as a Windows 8 Desktop library, and then rip out the code that won’t be compatible with WinRT.

The next thing I discovered was that the function object classes in the STL (std::plus, std::minus, etc.) were causing a bunch of errors in Vertex.h. The fix was to find AssimpPCH.h in the “Header Files” folder, and add an “#include <functional>” to the list of STL includes.

Rebuilding from there gives me a dirty ole’ LNK2019 in the assimp_viewer project. I think the CMake files aren’t adding a directive to link against d3d9, so I added an entry to “Additional Dependencies” under the link settings, pointing at “C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\Lib\x86\d3d9.lib”.
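
As an aside, the same dependency can be expressed in the source itself rather than in project settings (MSVC-specific, and it assumes the SDK’s lib folder is already on the linker search path):

#pragma comment(lib, "d3d9.lib")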

Make sure you set a value for CMAKE_INSTALL_PREFIX, and build the “INSTALL” project. You’ll then need to add the CMAKE_INSTALL_PREFIX\lib folder to your path so that you can run the assimp_viewD.exe file.

So far, so good. My next step is to rip out the D3D9 stuff and upgrade it with D3D11 stuff, so that I can build it as a Desktop / Windows Pro assembly. After that I can work out how to make it into a Windows Store app/library, which will be more challenging because of the requirements of the WinRT platform (eg file access must be asynchronous).

See you next time.

Porting Open Asset Import Library (Assimp) to WinRT

With the advent of Microsoft’s Windows 8, and its ARM-compatible Windows RT variant, Microsoft have really jumped headlong into the mobile device arena. I think the fact that developers will now be able to develop a single code-base that works on phones, tablets, laptops and desktop computers will be quite a big deal for their platform, but it comes with one particularly difficult challenge: the code must only be compiled against what is known as the WinRT stack. This is a subset of the full Windows 8 SDK which is common to all hardware platforms, and it presents a challenge because none of the libraries we’re used to using right now (libpng, libjpeg, etc., name your library here) are available. So, I’ve decided, mainly for interest’s sake, to try and port the wonderful Open Asset Import Library, commonly referred to as Assimp, to the WinRT platform. Now, I would probably use this in a toolchain more than in a runtime engine, but it is still a good test case. I started on this project today, and I can see a few troubles ahead of me:

  1. They don’t currently support VS2012, but CMake should fix this.
  2. I have to do a “Boost free” build, so I’ll have to see how that goes.
  3. The library uses D3DX, which is deprecated as of Windows 8, so I’ll have to build it against the old DirectX SDK first, and start from there.

Anyhow, I’m downloading the DirectX SDK now, so I’ll have to wait. More as it happens.